K-lite about and battery usage of different decoder

3 posters



Post by punsher2011 Sat Jun 07, 2014 8:00 am

Hi.
1 - A suggestion!
I have been using K-Lite for more than 4 years, and one thing that bothers me is that it shows no information about the installed K-Lite version,
and no hint on how to reach this site. I think it deserves a nice, decent GUI that shows whether a newer version is available, plus some features like auto-update (not necessary, I know how much servers cost) and access to the setup options preferred by the admin. (I don't think just a setup program is enough, considering the time you spend supporting and updating the codecs and customizing how they are used.)

2 - A question?
Which one consumes more battery power?
- Hardware decoding or software decoding?
 -- Software decoding means using the CPU (which is hardware in the end), but is there any difference between using Intel's QuickSync and "normal" software decoding?
 -- What about using the GPU (in my case, CUDA) versus the CPU?
     --- I have read that on the new Haswell chips QuickSync has improved in quality (it used to be all about speed) and that it consumes less power.
          Is it better to use QuickSync over CUDA? (I am asking because I believe my machine can play any kind of encode smoothly (High 10 and so on), but I need it to consume less battery.)

Thanks.
punsher2011

Posts : 4
Join date : 2012-01-27



Post by Admin Sun Jun 08, 2014 7:01 pm

Whether hardware acceleration uses less battery power depends on the hardware, but on most laptops it does.

CUVID uses MORE battery power than the other methods, because the NVIDIA driver puts the GPU into high-performance mode while it is used. DXVA2 or QuickSync is better for battery life.

Hi10P (10-bit H.264) is always decoded in software. Hardware acceleration only supports regular 8-bit content.
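The battery question above comes down to simple arithmetic: playback time is roughly battery capacity (Wh) divided by average platform power draw (W). A minimal sketch of that estimate; the battery capacity and every wattage figure below are made-up placeholder values for illustration, not measurements of any real laptop or decoder:

```python
# Rough battery-life arithmetic for different decode paths.
# All numbers below are HYPOTHETICAL placeholders, not measurements.

BATTERY_WH = 52.5  # assumed laptop battery capacity in watt-hours

# Assumed average platform power draw while playing a 1080p H.264 video:
power_draw_w = {
    "software (CPU)": 18.0,  # CPU cores doing all the work
    "QuickSync":      11.0,  # fixed-function decode block on the iGPU
    "CUVID":          16.0,  # dGPU forced into high-performance mode
}

def playback_hours(capacity_wh: float, watts: float) -> float:
    """Estimated continuous playback time in hours."""
    return capacity_wh / watts

for method, watts in power_draw_w.items():
    print(f"{method:15s} ~{playback_hours(BATTERY_WH, watts):.1f} h")
```

Even a few watts of difference between decode paths translates into a noticeable gap in runtime, which is why the choice of decoder matters on battery.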

Admin

Posts : 7632
Join date : 2011-06-17

https://codecs.forumotion.net



Post by punsher2011 Mon Jun 09, 2014 3:58 pm

Thanks!
punsher2011

Posts : 4
Join date : 2012-01-27



Post by olee22 Sun Nov 22, 2020 7:32 pm

I was searching for an answer to this question and came across this post. I found a good article with measurements comparing CPU-only decoding, Intel Quick Sync, and NVIDIA CUVID. They also tested Microsoft's DXVA.

It looks like the best option for battery life is Intel QuickSync hardware acceleration:
CPU-only decoding uses a lot of power, and the NVIDIA path also drains more battery.

https://forums.lenovo.com/topic/view/2255/4200222


Results for h.264 video:

Before I proceed to analysing the results, it is necessary to explain the inclusion of DXVA2. DXVA2 is the second iteration of DirectX Video Acceleration (DXVA), Microsoft's API for offloading video processing from the CPU to the GPU on PCs running Windows and on Xbox consoles. DXVA2 will utilise whatever hardware acceleration is available on the given hardware.

As we can see, the integrated Intel GPU (HD 630 in this case) is utilised regardless of which device decodes the video. This is because in the Optimus configuration the integrated video adapter is always the last device in the chain before the output is displayed on the screen, even when the dedicated GPU renders the actual image. Both the CPUs found in the Legion Y530 (8300H and 8750H) and the dedicated GPUs (NVIDIA 1050 and 1050 Ti) can consume much more power than the integrated Intel GPU. That means that whenever CPU and dedicated-GPU utilisation decreases, power consumption drops dramatically. For a laptop running on battery power this is crucial.
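The reasoning in the paragraph above can be sketched as a toy power model: each component has an idle floor plus a utilisation-weighted active cost, and total platform power is their sum. All per-component wattages and utilisation figures here are invented for illustration only; the point is the structural effect (waking the dedicated GPU is expensive, offloading to the iGPU is cheap), not the specific numbers:

```python
# Toy model: why shifting decode work to the iGPU lowers total power.
# Component power figures are HYPOTHETICAL, for illustration only.

IDLE_W   = {"cpu": 2.0,  "igpu": 0.5, "dgpu": 1.0}   # idle floor per component
ACTIVE_W = {"cpu": 20.0, "igpu": 4.0, "dgpu": 30.0}  # extra watts at 100% load

def platform_power(util: dict) -> float:
    """Total power = idle floor + utilisation-weighted active power per component."""
    return sum(IDLE_W[c] + util.get(c, 0.0) * ACTIVE_W[c] for c in IDLE_W)

# Assumed utilisation per decode path (iGPU always handles display output):
software  = {"cpu": 0.35, "igpu": 0.10}               # CPU decodes
quicksync = {"cpu": 0.05, "igpu": 0.25}               # iGPU decodes and displays
cuvid     = {"cpu": 0.05, "igpu": 0.10, "dgpu": 0.30} # dGPU wakes up to decode

for name, util in [("software", software), ("QuickSync", quicksync), ("CUVID", cuvid)]:
    print(f"{name:10s} ~{platform_power(util):.1f} W")
```

With these assumed figures the model reproduces the article's ordering: QuickSync draws the least, software decoding more, and the CUVID path the most, because the dedicated GPU's active power dwarfs its savings on the CPU side.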

In the case of the h.264-encoded video we can observe a 3- to 7-fold drop in CPU utilisation when a hardware decoder is used. Each time, the video played smoothly without any hiccups or stuttering.

However, any time CPU utilisation increases or the NVIDIA GPU is engaged, overall power consumption rises significantly.



Results for HEVC/h.265 video:

For the HEVC video we can see that even the hexa-core 8750H struggles to decode the video. Surprisingly, the 1050 Ti showed slightly lower utilisation when decoding the HEVC video than the h.264 video. Because the current version of the LAV video decoder does not implement HEVC decoding via QSV, neither the DXVA2 (native) nor the QuickSync option worked properly with this video, even though QuickSync itself has supported HEVC decoding since the Skylake microarchitecture. Strangely enough, in copy-back mode there was no issue at all, and again GPU utilisation went down compared to the h.264 video. For further reading on DXVA2 modes, please refer to the link below.

https://en.wikipedia.org/wiki/DirectX_Video_Acceleration

VLC Player natively doesn't give as much flexibility in choosing the hardware decoder, and video playback did not work for the HEVC video using DXVA decoding. Due to those issues the test was ultimately not performed with VLC.

However, with all settings set to automatic, both videos played without issues and utilised QSV properly.

MPC-HC download link:
https://github.com/clsid2/mpc-hc/releases

olee22

Posts : 1
Join date : 2020-11-22


