Hybrid decoders that use GPU shaders are somewhat rare; HW decoding pretty much always means "ASIC". And ASIC power draw for decoders is typically in the <1W range.
For dav1d, even YouTube-tier 1080p SW decoding is using +4-5W on my laptop, and 4k60 is +15-20W.
> ASIC power draw for decoders is typically in the <1W range.
Many times even "standalone" HW decoders use or share GPU components (e.g., almost always the memory). Just bumping up the GPU's memory controller clock already consumes >10W on my system.
it's hard for me to imagine that video decoding would need a significant bandwidth boost like this to run. that seems like either a driver or hardware issue, and one that ought to be solvable. 4k60 is roughly 12Gbps of raw pixel traffic. even inflating that number a bunch, it's hard to imagine most discrete graphics cards' memory needing more than its base clock to serve this.
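rough numbers behind that, in case it helps (python; 8-bit 4:2:0 output and roughly one reference read per output pixel are both my assumptions):

    # back-of-the-envelope memory traffic for 4k60 decode
    width, height, fps = 3840, 2160, 60
    bytes_per_pixel = 1.5                      # 8-bit 4:2:0: luma + quarter-res chroma
    write_gbps = width * height * bytes_per_pixel * fps * 8 / 1e9
    read_gbps = write_gbps                     # motion-comp reference fetches, very roughly
    print(f"writes ~{write_gbps:.1f} Gbit/s, reads ~{read_gbps:.1f} Gbit/s, "
          f"total ~{write_gbps + read_gbps:.1f} Gbit/s")
    # -> ~6 + ~6 = ~12 Gbit/s, i.e. a rounding error next to GDDR bandwidth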
on mobile at least, where graphics are integrated & using main memory, there ought to be little/no difference in memory throughput use.
last, some new GPUs like AMD's Navi (RX6xxx) have on-package caches, "Infinity Cache", between i think 64-128MB. i want to think this could be used like Intel's Crystal Well L4 eDRAM, to keep from needing to go to main memory at all. how much of a win that is, if any, & whether it would even be possible, i'm not sure.
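quick sanity check on whether a decode working set would even fit in that cache (again python; 8-bit 4:2:0 and the AV1 maximum of 8 reference frames are my assumptions):

    frame_mb = 3840 * 2160 * 1.5 / 2**20       # one 8-bit 4:2:0 4k frame, ~11.9 MB
    refs = 8                                   # AV1 keeps up to 8 reference frames
    print(f"one frame ~{frame_mb:.1f} MB, full reference set ~{refs * frame_mb:.1f} MB")
    # ~95 MB: 128MB of Infinity Cache could hold it in theory, 64MB could not;
    # whether the hardware actually uses the cache that way is a separate question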
i'm somewhat skeptical that there really is a problem here. if there is, i suspect it's somewhat rare & probably a bit of an oversight. i should test though. i would love to get a wider picture of what the real impacts of video decoding are.
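for anyone wanting to reproduce that kind of measurement on Linux, something like this over the RAPL sysfs counters is what i'd reach for (a sketch only; the exact path and permissions vary by system, it may need root, and it ignores counter wraparound):

    import time

    RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # package-0 energy, microjoules

    def read_uj():
        with open(RAPL) as f:
            return int(f.read())

    start, t0 = read_uj(), time.time()
    time.sleep(30)                     # play the test clip during this window
    end, t1 = read_uj(), time.time()
    print(f"avg package power: {(end - start) / 1e6 / (t1 - t0):.1f} W")

run it once idle and once while playing the same clip with SW vs HW decode, and the delta is the number we're arguing about.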
Indeed. For example, hardware decoding is the difference between choppy video and smooth video on the PinePhone because the CPU isn't powerful enough and the GPU is useless for decoding.
(And to fguerraz's edit that their comment doesn't apply to mobile phones "where manufacturers control hardware and software end to end", the manufacturer does not control the software on the PinePhone.)
To be exact, it depends on the generation of hardware. At least for Intel and AMD, the first versions tend to rely more on shaders, then they switch to ASICs. Intel actually open-sourced the shaders that they use.
So was I? Which phone can even achieve a 20W power draw...
The only hybrid VP9 decoders were AMD's, which only supported Windows and which they stopped shipping years ago (any current/Linux AMD drivers that support VP9 decoding do so via an ASIC), and Intel's, which was only supported on three GPU generations (Gen7.5, Gen8, and Gen9) and was obsoleted by an ASIC in Gen9.5.