The title is a bit misleading: it sounds like a general improvement, when actually it's a patch for the Intel driver, so it only affects those who use Intel cards and this driver. Or did I miss something?
This is great. Now if only the newer Intel drivers didn't make X segfault!
This seems to be a common problem. I had to downgrade to 2.14.0; some bug introduced in 2.14.903 and still present in 2.15.0 breaks it thoroughly. Fast is nice, but stable is better.
I agree. The graphics system is basically the only area where I'm envious of Windows users. As far as I know, since Vista the graphics driver can crash without tearing down the rest of the graphical system, i.e. no data loss.
Modern distributions are unusable for me because of bugs in the Intel 855GM graphics driver. The latest release I can use is Ubuntu 10.10 with special patches from glasen-hardt.de; even Fedora 15 and openSUSE 11.4 give me trouble with flashing windows, missing content, etc.
Yeah, I have an old lightweight Sony with an 852/855, and it ran better with Gentoo five years ago than with Maverick today. To get it to work, Ubuntu turns off acceleration, so it feels damn slow. Crossing my fingers.
Debian testing would make a much better choice on the desktop for anyone not actively trying to help Debian development by trying bleeding-edge packages and filing bugs. Run Debian sid if you don't mind encountering breakage and filing bugs, for the benefit of the users of testing who then won't have to deal with the buggy packages you did. :)
These bugs range in severity from "inkscape becoming the default PDF viewer" to "file conflict prevents installing a package" to "libc6 lacks the /lib64 symlink and now no dynamically linked programs on the system can run", and everything in between. Fortunately, the "break the whole system" bugs don't happen often, but still: don't run sid unless you want to help filter out breakage for others by hitting it yourself.
I know the plural of anecdote is not data, but I currently run Sid and have been for several years. It hasn't been problem-free, but I don't recall a system-killing update. The most common problem is a missing or broken dependency forcing me to hold off on installing a package or updating my system for a few days.
Meanwhile, back when I lived with someone who used Ubuntu as his main system, and when I ran Ubuntu on my laptop, I had to fix system-crippling bugs at least three times.
So, anecdotally, I've had fewer system-killing problems with Sid than with Ubuntu's stable releases.
When it comes to system-killing bugs, it also matters whether you update daily or less often; system-killing bugs often get reported right after a mirror pulse and fixed one or two mirror pulses later (less than a day).
Not everyone cares about having the latest bits. Sometimes, you want a system which you don't have to upgrade or fiddle with frequently. And yes, that can include desktop systems, particularly desktop systems other than your own. :)
I run Debian Squeeze on the desktop because I have jobs I want that desktop to do for me, without my having to worry about it unexpectedly not being able to do them.
However, I treat Squeeze more as a base to build on than as a complete system. For instance, I work with Ruby via rvm, not native, and use a Firefox Nightly build. In that sense it makes a rock-solid foundation: anything that I'm not responsible for, works. Anything I am responsible for is my call, and can't break the underlying system unless I try quite hard.
FWIU, this new architecture routes most 2D acceleration through the 3D pipeline, saving a lot of the overhead of switching between the two current "rings" (BLT for 2D and RENDER for 3D).
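To see why that switching matters, here's a toy cost model (not actual driver code; the costs and the switch penalty are made-up illustrative numbers): every change of ring forces a flush/serialization point, while routing all work through one pipeline lets operations batch freely.

```python
# Toy model of ring-switching overhead. SWITCH_COST and OP_COST are
# hypothetical units, chosen only to illustrate the effect.
SWITCH_COST = 10  # assumed penalty per BLT<->RENDER ring switch (flush)
OP_COST = 1       # assumed cost of a single rendering operation

def total_cost(ops):
    """Sum per-op costs, adding a switch penalty whenever the ring changes."""
    cost, prev = 0, None
    for ring in ops:
        if prev is not None and ring != prev:
            cost += SWITCH_COST
        cost += OP_COST
        prev = ring
    return cost

mixed = ["BLT", "RENDER"] * 50   # alternating 2D/3D work: 99 ring switches
unified = ["RENDER"] * 100       # everything via the 3D pipeline: 0 switches

print(total_cost(mixed))    # 100 ops + 99 switches * 10 = 1090
print(total_cost(unified))  # 100 ops, no switches = 100
```

Even with identical work, the mixed workload pays an order of magnitude more in this sketch, which is the overhead the unified approach avoids.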
It's nice to see Intel cares about their X support now.
I suspect some of this is a side effect of the popularity of GPGPU computing. If they don't want their lunch eaten by NVidia and ATI/AMD on certain computational benchmarks, they're going to have to provide a professional implementation of OpenCL on Linux. I suspect (am hoping) that the OpenGL acceleration and plain X11 improvements come as part of the package.
People don't use Windows-only hardware in their supercomputers. NVidia was the first to realize this and this is one reason they have much of the GPGPU mindshare.
I am skeptical of this, for two reasons. First, the GPGPU trend really hasn't affected Intel, which still sells more units than either AMD or NVidia and will probably continue to have a stranglehold on its current markets. GPGPU really doesn't matter to most people.
Second, the guy who wrote this code, Chris Wilson, is one of the big names behind Cairo and ran an experiment, cairo-drm, where he plugged Cairo directly into an Intel graphics chipset by talking directly to the kernel and bypassing the Intel X drivers. His results were probably the foundation of this new work, as far as I can guess.
Intel has been trying to get into the GPGPU business for years. The problem for them, of course, is you have to have a credible GPU before you can GP with it. Here's where they tried to glue together a bunch of existing cores (yes P54C Pentiums) and call it a GPU: http://en.wikipedia.org/wiki/Larrabee_%28microarchitecture%2... Like early experiments in aviation, it didn't fly.
I didn't know about Wilson. I'd somehow gotten the impression that this work had been supported by Intel. Oh well. If they didn't, they shoulda. :-)
"...we can take advantage of additional efficiencies, such as relative relocations, that have been incorporated into recent hardware advances. However, even older hardware performs better from avoiding the implicit context switches and from the batching efficiency of the 3D pipeline..."
My current laptop is a Dell C510 running Arch Linux, rocking a massive 384MB of RAM and LXDE. It's relatively (surprisingly) snappy as is, but any improvement is going to make a huge difference. Great work.
The arguments against large commits don't hold in this case. If you look at it, the patch consists of a few changes to Makefile.am, configure.ac, man/intel.man, src/Makefile.am, and src/intel_module.c (none of which contain significant logic), plus completely new files under src/sna/ and test/. It's one new feature that is practically self-contained, so I don't see any sense in splitting it into smaller commits.
For non-X people (IOW, everybody except me and Josh):
This commit speeds up general desktop rendering, in both general and pathological cases, for nearly all Intel graphics chipsets, causing everything to feel snappier and more responsive. It's a universally Good Thing.