I sense another flame war. Unless there is new data from a new set of image quality/compression tests, I would urge people to just look up the old threads instead of rehashing the arguments again.
Note: Although YouTube announced a switch, Google+ has long been using WebP to serve up images in the stream with substantial savings, so the usage of it by Google on their services is not new.
Things to consider: WebP browser support is currently terrible (http://caniuse.com/webp), and it seems the only way to make WebP images display in unsupported browsers is a rather large JS file (http://webpjs.appspot.com/).
Actually, it's Google PageSpeed where the magic happens. You serve jpg, gif, png, pcx^H^H^H and then your nginx/Apache converts them to WebP for you, on the fly, cached. If the person requesting your page runs Chrome, they get WebP.
So that means everyone else gets a slow image format? Yes, but you have saved bandwidth and your pages have loaded quickly for ~50% of your users.
I don't think WebP is really a source image format; it's more something that is browser- and PageSpeed-specific. Try it, you will be impressed.
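For the curious, here is a rough sketch of what that on-the-fly conversion amounts to, in Python with Pillow. PageSpeed does this inside the web server itself; the cache directory, quality setting, and function name below are just illustrative assumptions:

    import os
    from PIL import Image  # assumes Pillow built with WebP support

    CACHE_DIR = "webp_cache"  # hypothetical on-disk cache

    def serve_image(path, accept_header):
        # Everyone else keeps getting the original JPEG.
        if "image/webp" not in accept_header:
            return path, "image/jpeg"
        # Chrome advertises WebP support; convert once, reuse thereafter.
        cached = os.path.join(CACHE_DIR, os.path.basename(path) + ".webp")
        if not os.path.exists(cached):
            os.makedirs(CACHE_DIR, exist_ok=True)
            Image.open(path).save(cached, "WEBP", quality=80)
        return cached, "image/webp"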
There is a big problem with auto-converting images, and it's with HTTP caching.
Consider this: I have page.html with an <img src=logo.jpg> tag. If I'm using Chrome and I request logo.jpg, PageSpeed/the web server detects that I support WebP via the Accept header, and sends a WebP image for the URL logo.jpg.
And that's a problem, because the content for logo.jpg is no longer determined entirely by the URL. It's the URL plus the values of the User-Agent or Accept headers that determine the response.
This is called content negotiation and it utterly nullifies shared HTTP caching. This is bad because shared HTTP caching can be a huge performance win.
The right way to serve WebP images is to change the URL. The application logic that generates the HTML should detect that the browser supports WebP and rewrite the image tag accordingly. In other words, page.html should include an <img src=logo.webp> tag.
This means that browsers that support WebP get WebP, browsers that only support JPEG get JPEG, and shared HTTP caches can be used without polluting the cache.
Sadly, PageSpeed and other technologies like it aren't currently smart enough to rewrite the URL in the base HTML page. As always, technology that "automatically fixes problem X" is rarely the best solution.
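A sketch of the URL-rewriting approach, assuming the HTML is generated dynamically anyway; the helper name and sample values are made up:

    def img_tag(name, accept_header):
        # Decide at HTML-generation time, so each URL maps to exactly
        # one representation and shared caches stay clean.
        ext = "webp" if "image/webp" in accept_header else "jpg"
        return '<img src="{}.{}">'.format(name, ext)

    # img_tag("logo", chrome_accept)  -> '<img src="logo.webp">'
    # img_tag("logo", other_accept)   -> '<img src="logo.jpg">'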
Isn't the caching issue trivially fixable with a "Vary: Accept" header? (No idea whether PageSpeed adds that or not.) Content negotiation shouldn't interfere with caching if both are done properly.
Also, if you respond with different HTML, you still get the caching issue, because you may mistakenly serve cached HTML linking to WebP images to a non-capable client. So the cache should still consider the Vary and Pragma headers (or it's broken).
The only thing I don't really fancy about "/cute/kitten.jpg" serving a WebP image is the ".jpg" part. If the content type may vary, the "extension" part of the "filename" should be generic ("/cute/kitten.image") or missing ("/cute/kitten"). That matters when saving pics, when URLs transform into filenames.
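For reference, "done properly" on the server side just means emitting the header with every negotiated response; a minimal WSGI sketch, with hypothetical file names:

    def app(environ, start_response):
        accept = environ.get("HTTP_ACCEPT", "")
        if "image/webp" in accept:
            path, ctype = "logo.webp", "image/webp"
        else:
            path, ctype = "logo.jpg", "image/jpeg"
        with open(path, "rb") as f:
            body = f.read()
        # "Vary: Accept" tells downstream caches to key on the Accept header.
        start_response("200 OK", [("Content-Type", ctype), ("Vary", "Accept")])
        return [body]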
First, "Vary:Accept" and Vary:User-Agent effectively nullifies shared caches. Because instead of saving 1 copy of a response for a URL, and serving it to all requests, a shared cache has to save 1 copy per unique combination of URL and the Accept request header and/or the User-Agent header sent by the browser.
Browsers send wildly different Accept header values, not only across browser vendors but across versions as well. And the User-Agent string has insane variation since OS, CPU, language, and more can be included.
The net result is that all these unique request header values fragment the shared cache so much that you don't get any cache hits. The shared cache has been nullified.
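To make the fragmentation concrete, here is a toy cache keyed the way Vary: User-Agent forces it to be; the UA strings are illustrative, but the effect is real:

    cache = {}

    user_agents = [
        "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 Chrome/23.0",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8) AppleWebKit/537.36 Chrome/23.0",
        "Mozilla/5.0 (X11; Linux x86_64; rv:17.0) Gecko/17.0 Firefox/17.0",
    ]

    # One URL, but the cache must store one entry per unique header value.
    for ua in user_agents:
        cache[("/logo.jpg", ua)] = "<image bytes>"

    print(len(cache))  # 3 entries for a single resource; each new UA is a miss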
Second, if the HTML is a static file, yes, you have the same problem. However, in the modern age of web publishing, the HTML is almost always dynamically generated by some application logic/CMS/something, which means it is rarely if ever cached or marked as cacheable by shared caches.
> Content negotiation shouldn't interfere with caching, if both are done properly.
Unfortunately, “done properly” excludes things like nginx's cache module, a number of CDNs, and even certain browser caches. Some of the bugs are obvious – e.g. returning gzipped data to a client which didn't request it – but others will simply require monitoring to realize that your cache isn't caching anything or has a dismal cache hit rate because every permutation of request headers and the values referenced in Vary are being treated independently. Worse, all of these can change unexpectedly due to updates in “stable” software so you need deep monitoring checks to ensure that everything is still working the way it was when you set it up.
The more I've used content negotiation, the less I'm convinced that it's a desirable feature. Unless you tightly control the client, server and all intermediaries, you'll spend a ton of time on operational overhead. The alternative, simply using unique URLs, always works correctly after the initial implementation, and it's easier too.
That does not solve the problem either: then you cannot cache the served page, as it will have either logo.webp or logo.jpg in it (though if it is dynamically generated per user this might not be an issue).
The correct answer is something like the video tag with a choice of sources, polyfills, or headers that are useful enough to do a small-scale Vary on. None of which are very good solutions.
I'm confused about what you are doing. You are taking an existing lossy format, e.g. JPEG, and recoding it in another lossy format. So yes, you may save an insignificant amount of bandwidth, but at the expense of introducing potentially significant artefacts in your images. What serious web developer is going to want to do this?
Both the original article and the Wikipedia page state that WebP supports lossy and lossless compression. I guess using lossless WebP would not make sense for JPEG files, as the result would probably be bigger than the original JPEG, but it might work well with PNG.
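As a sketch of what that would look like with Pillow (assuming a build with WebP support; file names are hypothetical):

    from PIL import Image

    # PNG -> lossless WebP: pixels are preserved exactly, and the
    # result is often smaller than the PNG.
    Image.open("sprite.png").save("sprite.webp", "WEBP", lossless=True)

    # JPEG -> lossless WebP would faithfully preserve the JPEG's own
    # artifacts, which is why it tends to come out larger than the JPEG.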
Perhaps 50% of your users aren't, but services exist for which that is the case. It doesn't really change the point being made however: would you really have preferred the statement "...your pages have loaded quickly for 38.271% of your users"? That's why the "~" was included to indicate approximation.
Just to follow up on your comment: for those less willing to use the Apache module, there is always the PageSpeed web service, where you can change DNS to get this accomplished (and utilize Google's CDN as well).
Google thought of this. Their Chrome browser advertises WebP support to servers through content negotiation. So you can still serve JPEGs to older browsers, while modern ones can get WebP images.
You missed my point. Technology is not the reason WebP is probably going nowhere.
Apple, Microsoft and Firefox, for whatever reason (politics/principle), either won't support de facto standards or have to be dragged kicking and screaming into doing so (see SPDY). And without those three you are never going to get above 45%.
On top of that, SPDY is an experimental protocol that may eventually (after enough iteration) make it into a standard (HTTP 2.0). It's not a standard yet, and definitely not a de facto standard.
Interestingly enough, WebP is a great format to use if you don't want users saving your images.
More than once I've right-clicked an image to save it to disk, only later to discover I have no program on my computer which can open or view it, except for Chrome! No thumbnails, no previews, no Photoshop, nothing. (I know there are plug-ins, if I really wanted to.)
From the cwebp documentation:

    -alpha_q int
        Specify the compression factor for alpha compression
        between 0 and 100. Lossless compression of alpha is
        achieved using a value of 100, while lower values
        result in a lossy compression. The default is 100.
http://caniuse.com/webp
WebP support in browsers other than Chrome is virtually nonexistent. However, if a big chunk of your audience browses with Chrome, you could serve WebP today. It can be done in a browser-neutral manner by looking at the Accept header.
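The detection itself is a one-liner; the Accept values below are only illustrative of what browsers of this era send:

    def supports_webp(accept_header):
        # Only clients that explicitly advertise WebP get WebP.
        return "image/webp" in accept_header

    assert supports_webp("image/webp,*/*;q=0.8")                   # Chrome-style
    assert not supports_webp("image/png,image/*;q=0.8,*/*;q=0.5")  # Firefox-style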
How is that prescient? Standards can get replaced when something better comes along. The question is whether WebP will be compelling enough to overtake what came before. It looks promising at first glance.
If anyone at a browser development company is reading this: vector image formats are more in need of disruption than raster image formats. SVG is versatile, but it's bulky; a more compact vector image format would be welcome, especially in the age of responsive design.
That said, some technology standards can stick around far beyond what they should. MP3 is a classic example: there are far better music formats than MP3 (you'd be hard pressed to find a worse one that's still supported), but because of the sheer volume of music in that format, its 'good enough' status, and the limited abilities of some hardware players to handle better formats, we're stuck with it. This isn't something we should seek to encourage, but it does happen.
> a more compact vector image format would be welcome
SWF. It's more than just a vector image format, but it is far better than SVG in terms of both file size and resources required to render. Too bad for the "Flash hater's stigma" and the direction Adobe took their player after the acquisition... although there are now a few other alternatives, including Mozilla's Shumway.
SWF as an image format also requires a client side runtime be installed. A runtime with spotty cross platform performance and a poor security track record at that. The distaste for Flash is pretty well justified, I'll take the larger and slower SVGs over Flash any day.
Also, since Adobe removed the restrictions on creating programs to display Flash content a few years ago, there's nothing stopping others from writing a minimal, image-only SWF plugin.
It's not just the filesize, it's also the rendering speed. I'm not saying SVG is terrible, I'm just pointing out there's probably room for a smaller vector format that's more focused on the needs of modern websites.
Astonishingly concise? SVG is based on XML, I don't think I've ever seen XML and 'astonishingly concise' in the same sentence before, congratulations for being the first. Besides, I've already pointed to another vector format that is far more concise, just by way of example. SVG is versatile, a jack of all trades if you will, but concise it is not.
>SVG is based on XML, I don't think I've ever seen XML and 'astonishingly concise' in the same sentence before, congratulations for being the first.
The rudiments of the vector description are extremely concise. That it is XML is superficially irrelevant, and if you really think the text markers are so important, consider that gzip makes it a highly compressible format anyway. Ultimately it doesn't matter.
I've never seen any metric that holds SWF as being more efficient than real-world SVG.