> with GPU you may not always get the same kernels available

No kernels are available _out of the box_. You write the pixel shader yourself, which lets you implement any kernel, or any resizing method that isn't kernel-based at all: https://stackoverflow.com/a/42179924/126995
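For instance, here's a minimal sketch of that idea written as a CUDA kernel instead of a Direct3D pixel shader (the flat 8-bit grayscale layout and all names are my assumptions, not from the linked answer):

    // Minimal bilinear resize of an 8-bit grayscale image on the GPU.
    // Row-major layout and all names are illustrative assumptions.
    __global__ void resizeBilinear(const unsigned char* src, int srcW, int srcH,
                                   unsigned char* dst, int dstW, int dstH)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= dstW || y >= dstH) return;

        // Map the destination pixel back into source coordinates.
        float sx = fmaxf(0.0f, (x + 0.5f) * srcW / dstW - 0.5f);
        float sy = fmaxf(0.0f, (y + 0.5f) * srcH / dstH - 0.5f);
        int x0 = (int)sx, y0 = (int)sy;
        int x1 = min(srcW - 1, x0 + 1), y1 = min(srcH - 1, y0 + 1);
        float fx = sx - x0, fy = sy - y0;

        // Weighted average of the 4 neighbours; any other weighting
        // scheme (i.e. any other kernel) drops in here instead.
        float top = src[y0 * srcW + x0] * (1 - fx) + src[y0 * srcW + x1] * fx;
        float bot = src[y1 * srcW + x0] * (1 - fx) + src[y1 * srcW + x1] * fx;
        dst[y * dstW + x] = (unsigned char)(top * (1 - fy) + bot * fy + 0.5f);
    }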

> that only handles resizing, not compressing/decompressing

In my previous comment there’s a link to a commercially available JPEG codec, 100% compliant with JPEG Baseline Standard, that does both compression and decompression.




Yikes. If we had had to write our own image resizing kernel, this would have taken much longer. And OK, it can do JPEG, but what about PNG, GIF, and WEBP?


> If we had had to write our own image resizing kernel, this would have taken much longer

I don’t disagree, but this is very subjective.

You don’t need to invent anything; you only need to carefully implement a well-known approach, e.g. this one: https://developer.nvidia.com/gpugems/GPUGems/gpugems_ch24.ht...
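The general shape of such an approach: gather nearby source pixels and weight each one by a filter function. Here's a sketch using a Catmull-Rom cubic as the weight; the names and the particular choice of kernel are mine, not necessarily the chapter's:

    // Catmull-Rom cubic filter weight (a = -0.5), a standard kernel.
    __device__ float cubicWeight(float t)
    {
        const float a = -0.5f;
        t = fabsf(t);
        if (t < 1.0f) return ((a + 2.0f) * t - (a + 3.0f)) * t * t + 1.0f;
        if (t < 2.0f) return a * (((t - 5.0f) * t + 8.0f) * t - 4.0f);
        return 0.0f;
    }

    // Gather the 4x4 source neighbourhood around each destination pixel.
    // For strong downscaling you'd also widen the footprint by the scale
    // factor; omitted here to keep the sketch short.
    __global__ void resizeBicubic(const unsigned char* src, int srcW, int srcH,
                                  unsigned char* dst, int dstW, int dstH)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= dstW || y >= dstH) return;

        float sx = (x + 0.5f) * srcW / dstW - 0.5f;
        float sy = (y + 0.5f) * srcH / dstH - 0.5f;
        int ix = (int)floorf(sx), iy = (int)floorf(sy);

        float sum = 0.0f, wsum = 0.0f;
        for (int dy = -1; dy <= 2; dy++)
            for (int dx = -1; dx <= 2; dx++)
            {
                int px = min(max(ix + dx, 0), srcW - 1);   // clamp at edges
                int py = min(max(iy + dy, 0), srcH - 1);
                float w = cubicWeight(sx - (ix + dx)) * cubicWeight(sy - (iy + dy));
                sum  += w * src[py * srcW + px];
                wsum += w;
            }
        float v = sum / wsum + 0.5f;
        dst[y * dstW + x] = (unsigned char)fminf(fmaxf(v, 0.0f), 255.0f);
    }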

Also, there are third-party libraries for that; e.g., here’s one from the same company that makes the JPEG codec: http://fastcompression.com/products/resizer/gpu-resizer.htm

> what about PNG, GIF, and WEBP?

As far as I understand, your goal was to cut server costs, right?

I assume the majority of pictures on the Internet are JPEGs. If you process those on the GPU, the 16 virtual CPUs you’ve already paid for are left sitting idle, waiting for the GPU to finish the job; use them for the remaining formats. There’s no need to do everything on the GPU.
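The host-side dispatch can be that simple, e.g. (plain C++ in the same .cu file; the two process functions are hypothetical stand-ins, not any real library's API):

    #include <vector>

    static bool isJpeg(const std::vector<unsigned char>& bytes)
    {
        // Every JPEG stream starts with the SOI marker FF D8.
        return bytes.size() >= 2 && bytes[0] == 0xFF && bytes[1] == 0xD8;
    }

    static void processOnGpu(const std::vector<unsigned char>& jpeg)
    {
        // Hypothetical: GPU JPEG decode -> resize kernel -> GPU JPEG encode.
        // The calling CPU thread mostly sleeps while the GPU works.
    }

    static void processOnCpu(const std::vector<unsigned char>& image)
    {
        // Hypothetical: PNG/GIF/WEBP handled by the usual CPU libraries,
        // running on the cores freed up by the GPU path.
    }

    void handleImage(const std::vector<unsigned char>& bytes)
    {
        if (isJpeg(bytes))
            processOnGpu(bytes);   // the common case, per the traffic assumption
        else
            processOnCpu(bytes);   // the minority of images stays on the CPU
    }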

P.S. Some other people have already implemented what I’m describing: http://code.flickr.net/2015/06/25/real-time-resizing-of-flic...



