Funnily enough, in Linux we've had kind of the opposite problem, where dmcrypt was at some point changed to use high-priority, multi-threaded workqueues, creating a priority inversion situation.
Regular userspace processes, even those with niced/lowered priority, could generate overwhelming amounts of CPU load just by creating a bunch of IO burden on a dmcrypt-backed mountpoint. Far more than their fair share in terms of their own process scheduling.
The effects of this were especially visible when listening to mp3 files on something like a core2duo (or older) machine. Audio buffer underruns galore, just because you had a big git checkout or rsync running.
It took me years to convince upstream to revert the responsible dmcrypt workqueue commit and fix this.
> It took me years to convince upstream to revert the responsible dmcrypt workqueue commit and fix this.
As a regular user who would never track an issue like this down, I just wanna say thank you!
Yeah, the dmcrypt queuing is also a nightmare for getting low latency IO out of your SSDs. IIRC dmcrypt ops go through like 2-4 different queues before hitting disk in the default configuration[1].
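For anyone hitting this today: recent kernels let you opt out of the dm-crypt workqueues per device. A rough sketch, with placeholder device/mapping names (needs kernel 5.9+ and cryptsetup 2.3.4+):

    # see whether an existing mapping routes IO through the workqueues
    sudo dmsetup table cryptroot

    # open the device with the workqueues disabled; --persistent stores
    # the flags in the LUKS2 header so future opens pick them up too
    sudo cryptsetup open /dev/sdX2 cryptroot \
        --perf-no_read_workqueue --perf-no_write_workqueue --persistent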
AFAIK, this was also due to the disk IO scheduler used in Linux.
Back in the late 00's, audio or video playback would skip on Ubuntu if you had a torrent download on the same HDD, an issue not present under Windows or MacOS.
AFAIK, desktop Linux distros now have different schedulers to mitigate this issue, but back then, that made it impossible for me to use it.
Back in the early days that was a problem as well, but things were better for quite some time even with dmcrypt in play.
Then someone at Google landed a change prioritizing dmcrypt throughput for their use case, and many on desktop linux using dmcrypt suffered for years because of it.
Ah, the "Linux is for servers" problem. Linus thinks the desktop is important, but most contributors work for companies that do server stuff or have mostly server paying customers...
Linux has better RT performance than windows or macOS (or almost anything else that isn't actually an RT OS). That took a lot of work and it didn't come from the server crowd.
IMHO, the low-priority IO throttle has been just about horrible in every way since it was introduced.
You can tell when it was introduced if you google for things like "time machine got super slow for no reason" and see when they start occurring en masse.
It has a strong habit, even on 100% completely idle machines, of slowing down throughput by 10x.
Looks like a thorough analysis, but I question the use of "bs=1m". The author says they verified that "dd was correctly only using a one meg buffer", but "bs" is more than just a buffer size: it's the count that each read() is issued with, as in "read(fd, buf, 1048576)" at the syscall level.
I don't know about MacOS, but in the distant past I found that using larger "bs" values tended to cause my Linux machines to behave sluggishly, particularly when interacting with very slow media like USB sticks or SD cards. I tend to use "bs=32k" at the largest because it offers good performance while also limiting sluggishness. Using too small a value limits throughput because you spend so much time issuing read() calls and looping.
I'm sure the author isn't going to set up that environment again just to run a test, but it would be interesting to know if a smaller bs value would have had any change in this environment.
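If anyone wants to try it themselves, the comparison is cheap to run. Something like this (disk number is hypothetical), reading the same 1 GiB total with two different read() sizes:

    sudo dd if=/dev/rdisk2 of=/dev/null bs=1m count=1024     # 1 MiB reads
    sudo dd if=/dev/rdisk2 of=/dev/null bs=32k count=32768   # 32 KiB reads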
> a character device (e.g., /dev/rdisk0) ..., however, bypass the buffer cache and spec_strategy entirely. ... This is pretty much exactly what we want
This actually bit me earlier this week when I had to write a Windows 11 ISO to a USB stick. Using dd on macOS, everything seemed to write OK. Checksums matched. All of that. Yet when I booted the target machine with the USB stick, the Windows installer kept asking for drivers. Back and forth with the motherboard manufacturer and debugging, and I narrowed it down to this bug in dd skipping a few bits (or writing them incorrectly).
Needless to say, I used a friend's Windows machine to make an install stick using Microsoft's media creation tool, and that worked out of the box. Strange. The only other explanation I have is that their direct-download ISO was corrupt, or that this dd bug prevented it from writing correctly.
> The only other explanation I have is that their direct download iso was corrupt
I definitely got this impression, after a 2 or 4 gig boundary was crossed. I think I tried a Raspberry Pi to rule out a Mac issue. I had success with https://github.com/WoeUSB/WoeUSB-ng .
Sorry, I didn't mean "the iso was mangled in transit," I meant "they appear to not understand iso."
The iso checksum verified without issue. Looking into it a bit, this might have something to do with UEFI firmware that only had FAT32 drivers. It started the install fine, but it broke when it hit a 4+ gig file, which would probably look truncated.
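For what it's worth, the usual workaround when the firmware only reads FAT32 is to split install.wim into sub-4 GiB chunks; the Windows installer knows how to reassemble .swm pieces. A sketch assuming wimlib is installed (volume names are just examples):

    # split the >4 GiB install.wim into ~3.8 GiB .swm chunks on the stick
    wimlib-imagex split /Volumes/WIN11/sources/install.wim \
        /Volumes/WINUSB/sources/install.swm 3800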
> These people either have simply more CPU cycles to burn, or no life.
Or they're just faster at learning stuff than you, or smarter :) Sounds like a pretty interesting life.
Hinting that the jailbreaking/security community has no life, on a site called "Hacker News", where most of us have some sort of hacker spirit, is misguided at best.
Any particular reason you're not using the rdiskN (raw) device family, even if you're reading from the disk? I've done plenty of HDD dumps/restores on OS X and macOS over the years, with USB-to-SATA adapters, and so far I haven't had any problems with either reliability or transfer speeds, always maxing out USB2 at roughly 35 MB/sec.
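For anyone unfamiliar, the difference is just which device node you hand to dd; the raw variant skips the buffer cache (disk numbers hypothetical):

    # buffered block device: goes through the buffer cache (and the throttle)
    sudo dd if=/dev/disk2 of=backup.img bs=1m

    # raw character device: bypasses the buffer cache entirely
    sudo dd if=/dev/rdisk2 of=backup.img bs=1m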
> when they're being artificially delayed to slow down their I/O operations, with the ultimate goal of optimizing the performance of higher-priority I/O
Could this be another example of "the road to hell is paved with good intentions"? While the intent might be good, adding complexity with non-obvious side-effects is not.
HOLY COW. I believe I have been running into this weird bug for years now, even through getting a new Mac (iMac 27" 2015 -> 2020). I could never figure out _why_ things would slow down like this. I would just reboot and everything would be all better. At first I thought it was my iMac or the memory, but after a new computer that couldn't be the case! I have external drives, and often use them, so that's probably why this seems to happen. But I will confirm next time it does.
There also seems to be a bug that eats up free disk space, until restart. I've experienced it on my Intel MBP, which slowly goes from 20+ GB available to zilch. I assumed it was related to swap performance, but my wife is now seeing the same issue on her M1 MBP (on which she doesn't even use all of her 16 GB of physical RAM). On her machine, it eats up nearly 100 GB of space before forcing a restart. Has anyone else had this issue, or figured it out?
That sounds like something getting written to a temporary directory (one that's filesystem-backed rather than memory-backed). Have you checked all the system logs and such? Maybe take a look at running processes and sort by disk activity as well.
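On macOS, the quick-and-dirty version of that would be something like:

    # watch filesystem activity system-wide and eyeball who's writing
    sudo fs_usage -w -f filesys

    # or find where the space actually went, biggest entries last
    sudo du -xh /private/var 2>/dev/null | sort -h | tail -20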
Will give this a try! The thing that surprises me is that it's happening to both my wife and me, since our list of overlapping applications is very small. I assumed it was a system-level issue and will have to be fixed with a software update. In the meantime, I'm restarting my machine like it's 199x...
I have this. Do you put the laptop to sleep a lot? My thought has always been that it'll write the RAM to disk when suspending but doesn't clear this on waking, accumulating slowly on the disk until reboot. Also, updates are semi-silently downloaded (and installed) by default. On iOS there was a similar bug, first with ghost photos: deleted photos were never really deleted (nor deletable), slowly filling up the device.
I probably sleep the laptop several times a day. This could explain the issue, because when I’ve run Disk Inventory X it shows lots of large files that I can’t figure out.
I have my Mac set to never pre-download software updates though.
If writing RAM to disk is the issue, what’s the solution?
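If the sleepimage theory is right, it's at least checkable, and hibernation can be turned off entirely, at the cost of losing RAM contents if the battery dies during sleep. A sketch:

    # see how big the hibernation file currently is
    ls -lh /private/var/vm/sleepimage

    # 0 = sleep to RAM only, never dump RAM to disk (laptops default to 3)
    sudo pmset -a hibernatemode 0
    sudo rm /private/var/vm/sleepimage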
TIL about "spindump". These kinds of triage reports are really useful for learning more about macOS. Think about it: when is the last time any of you read a user manual for an OS? macOS doesn't even have one. You're just expected to figure out how to use the GUI, and unless you pay for a class, your manual for command-line debugging is Stack Overflow. Thanks OP for writing this up.
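For anyone else discovering it: capturing a system-wide report is a one-liner, something like this (check the man page for the exact flags on your macOS version):

    # sample all processes for 10 seconds at 10 ms intervals and
    # write a plain-text report you can dig through afterwards
    sudo spindump -notarget 10 10 -file /tmp/spindump.txt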
I should have been more specific. Yes, you are correct, an Apple guide that explains how to use the red/yellow/green buttons, and what windows and double-clicking are, certainly is the user manual I asked for. I meant something more in depth, like in the old days: I have two Apple //e manuals that have annotated ROM listings, and System V had a manual that was about 6 feet long on its side.
As someone who has spent many years supporting users including desktop backup systems, backup programs that are even slightly 'visible' to them in terms of slowing down the machine are deeply reviled.
Apple's target market is creatives. Lots of audio and video folks. Folks who really, really do not want their computer screwing around making a backup when they're in the middle of trying to get things done.
The web page there describes this feature working correctly, not the bug this article is about. Low priority Time Machine I/O is throttled to allow regular I/O to run at full speed.
This feature is an antifeature which is doing harm by interfering with system functionality.
If you have a system with millions of files, it is the poor performance introduced by this antipattern that breaks backups, copies, or work on huge codebases; I could go on. It's defective by design...
But as has been said, this isn't a problem Photoshop users hit so will never be addressed.
If you can't see the equivalence here, you aren't understanding the problem and probably shouldn't be commenting so strongly on it.
If I were being rude I'd just be rude. My comment is fair, and the downvotes reflect how you need to learn more about this area before sharing comments like "green apples aren't red apples" when we're talking about oranges.
Either you're upset that the community agreed the comment was irrelevant, or that people have been deactivating this broken feature for ~10 years now.
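For context, the knob in question is a sysctl, and disabling it (until the next reboot) is a one-liner:

    # 1 = low-priority IO throttling enabled (the default)
    sysctl debug.lowpri_throttle_enabled

    # turn it off until the next reboot
    sudo sysctl debug.lowpri_throttle_enabled=0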
To those who use macOS in frustration, it's not a real surprise when an IO problem on macOS is related to this scoring mechanism; it only ends up hitting experts trying to use their device for anything other than realtime video editing.
Unfortunately, having a system without a properly working IO stack will lead to eventual data loss. That's just a matter of life, be it through "I wish I had a backup of that" or "I couldn't copy the data off before it died"...
This is like saying it's worth keeping around a dodgy SSD that fails to read/write quickly enough on a bad day because it's full of family photos. Anti-features and anti-patterns lead to bad behaviour and should be stopped.
This can't be the first time the author has come up against this; however, now that he knows it's there, he'll probably leave it turned off, which is for the best.
About halfway through I was guessing the problem was that the device node lived on the root filesystem, and therefore everything written to a device looks like a write to that filesystem, which may be another way to manifest a similar bug.
> Amusingly, even spindump's symbolication step—which relies on an external daemon—suffers from this kind of problem—one which I diagnosed, of course, another well-timed spindump.
Yeah, this can get real annoying when trying to diagnose hangs…
I really feel this way. The two Apple products I own are an iPhone, and an M1 Mac Mini. The hardware is good. The software often thwarts things I'm trying to do and pisses me off greatly.
At least on the Mac Mini though, I can disable SIP and go crazy, or install Asahi Linux. iPhone is gimped really hard and even a $100/yr developer license doesn't really help you that much; you're still not getting basic WebM in Safari or alternate browser engines.
I know it's popular to not like Android for being janky, not as privacy-respecting, fragmented, or simply because of the fact that it's Google, but I don't really mind Android that much, even given all of its faults. It's mildly annoying but at the end of the day it feels like there's very little that I simply can't do. I mostly use Google devices, so I can always root and flash other ROMs.
Hey, FYI: WebM support is native in iPhone Safari now. Noticed it a few days ago when I was surfing an image board and suddenly I could click and play them.
Thanks for letting me know. Unfortunately, I suspect that the website you were viewing was either transparently decoding the webm in JS or transcoding it on the server; I can’t find any info suggesting MobileSafari supports WebM in any version, and I just updated my iPhone but I still get a negative result over here:
At least on Gnome, animation speeds aren't dependent on the aspect ratio of your monitor. macOS is an embarrassment of a system to actually use day-to-day.
> software better than the low low bar set by everyone else
as a former Mac user, I think Mac software is kinda great in ways other systems can't keep up with: constant upgrades, function over form without sacrificing form, and when an old/surpassed API is deprecated, the community reacts very fast.
Apple used to be an oddity; a hardware company that was also good at software. If they are now reverting, it’s at least reassuring to see that the conventional wisdom still holds; i.e. that companies can be good at software or hardware, but never both.
The way I put it is that with hardware, Apple had their "come to Jesus" moment a couple of years ago. With software, they're still Johnny Ive^H^H^HSwift: just as cocksure about how perfect they are as they are actually bad.
It used to not be that way at all. They had OK-ish hardware that was basically a dongle for the awesome software.
Apple: Great hardware. Great software. Terrible combination.
I like Windows on Bootcamp. I like hackintoshes and my 2015 MacBook Pro. It's trying to bend macOS to my will on recent systems that causes wailing and gnashing of teeth.
I wish there were some kind of mechanism to maintain current levels of system security while allowing previous levels of freedom. Maybe a way to sign a Yubikey and the Recovery partition with an Apple Developer account, after which booting with the Yubikey on that system would allow you to, e.g., install an older version of macOS, or make changes to /System.
I was in this boat too, building hackintoshes, etc., until the M1 came out. I refused to ever buy an actual Apple machine, especially since work gives me one to use for work/coding. I'm sold on the M1 though; miraculous little device, love my 14" powerhouse. M1 Max, 64GB RAM.
And I have PORTS again. I used the HDMI port the first day I got the machine.
Expensive af but it'll hopefully last me 5+ years.
Better software than windows, though. Every time I go back to a Windows machine I'm traumatized by something that should be trivial within like the first hour.
The most recent example was navigating control panel menus while trying to figure out why a microphone wasn't picking up my voice. Somehow every tiny management task like that is an absolute disaster on Windows.
OS X disables volume control if you're playing on an HDMI monitor, for no particular reason. I don't think it's clear that Windows is worse in this area.
modern macOS can also semi-silently deny an application access to the mic. It's not a stupid thing to do, but it certainly can confuse the crap out of users who missed the "allow X to use mic" popup in the past.
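If you suspect that's what happened, resetting the relevant TCC entry makes the prompt show up again; the bundle ID below is just an example:

    # re-prompt every app on its next mic access
    tccutil reset Microphone

    # or reset just one app
    tccutil reset Microphone com.example.SomeApp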
Another thing that's similarly confusing to me is some applications requesting access to folders.
I swear I've had situations where I download some software and then try to open some file in my Downloads folder with it. OSX then shows a popup requesting access. I remember either just ignoring that popup or clicking cancel, and the software still somehow loaded the file! That seemed like a serious security issue, and I don't really trust those filesystem access protections on OSX anymore.
macOS will allow access to single files to software without any permissions if the action to open the file was deliberately done by the user (e.g. the file was selected in an OS open file dialog, the file was double-clicked, or dragged onto the app icon, etc). The app gets a runtime-limited lock on that one file and still can't willy-nilly list or read other files.
IIRC this is also the only run-mode for sandboxed App Store apps, where all file opens must be user-initiated (and some utility apps work around this by asking you to open the disk root manually)
The hardware looks nice but it's not for normal people getting normal work done. Things have gotten marginally better since Ive left, but they still have a long way to go.
I disagree; having migrated multiple family members away from shitty plastic laptops full of bloatware and ads, and shitty Android phones full of the same, they are much more able to use their devices.
I also find as a dev that iPhones and MacBooks “just work” and let me get my stuff done, much much more so than a Linux desktop environment or windows laptop.
I guess we may be talking about different things, though, if you're saying the Apple ecosystem isn't living up to its full potential, but IMO the rest of the phone/computer ecosystem is full of cheap shitty plastic crap with Windows installs that OEMs have had free rein to load up with whatever they want (and even without that, why tf do I get ads for "Candy Crush" on my Start menu when I install Windows 10?!)
This is absolutely not the case; such comparisons always leave something out that isn’t important to the person doing the comparison but would in fact bloat the price on the PC side if it were truly equivalent.
Wait, what? I think I'm missing something because the price difference is still there. This argument is old as the hills but has generally held up.
The 2020 MBA (to avoid arguments comparing ARM and x64 CPUs) launched at $1000 with a 256GB disk, a 1.1GHz two-core i3, and 8GB of RAM.
A quick search will find the 2020 Acer Swift 3, $700, with a 512GB disk, 1.8GHz eight-core Ryzen 7, and 8GB of RAM. It looks pretty sleek, metal, and thin too.
I do mean this sincerely, am I missing something? Because I don't know of any perspective where the price-spec gap disappears with Apple laptops. When people talk about buying Apple "for the hardware", I really don't think the argument is about the specs.
I’ve done this exercise, and the non-Apple laptop always has a worse screen, or way slower SSD, or slower RAM, or a combination. Just looking at the top-level numbers isn’t a great comparison.
I looked at the 2020 Acer Swift 3 you mentioned and it's the same story: a 1080p, 250-nit display, and IT Pro's review says it's "mediocre in several areas – the screen, build quality and design are all underwhelming, and the keyboard and battery are middling."
The impression I get is that is all already "known".
Over the past 10 years, I think the assumption is that Apple laptops cost more for equivalent computational specs, but are prettier* and built better (controversies aside), and have MacOS (which is a plus for many.) (I did pick this laptop for comparison because its RAM and SSD sounded good above its base numbers.)
I guess my main argument is that there's not enough here to "debunk" this trend, even with deeper insights into hardware. This might be different with the M1, which I have no experience with.
(*This is a quality that I don't mean to dismiss. A better screen is important for a device that might see >10h of use in a day.)
I think the issue is that if one was to buy a typical laptop I'm sure one would be perfectly happy with it, and (that specific laptop aside) there are definitely PC laptops that are pretty decent. The Razer Blade has been reviewed well, there are some good ThinkPads, etc.
But if you've ever actually owned an Apple laptop and then use ANY of these other ones, it's completely obvious right away what you're paying a little more for. Apple has issues too (I had a previous model with the shitty keyboard, my current Touch Bar is dying, and it was a horrible idea in the first place), but even if I didn't care about macOS I'd still never even consider getting a different brand because those shortcomings are so stark.
The screens are great, they don't flex or creak, the touchpad is amazing, it's not covered in plastic, and it doesn't feel like you're paying for maybe a great processor and SSD but everything else is crap.
Those differences don't show up on a spec sheet though, so for many people it makes no sense why people pay a little more for Apple, and that "little more" really isn't a whole lot when you are actually trying to compare equivalents. When you get a similar-specced PC laptop and the MacBook is maybe $100 more, that $100 is actually going to better components in general. E.g. both have a trackpad but Apple's is significantly better.
Another example is monitors: why pay $1700 for Apple's Studio Display? Well, because literally no one else sells a 27" 5k display except for that one from LG which was widely known to have tons of reliability problems. Why do you need a 5k instead of 4k? Retina text. If you haven't lived with it this doesn't matter, but if you have then moving back to 4k kinda sucks.
Ubiquitous 5K displays have been on my wishlist since 2009.
Thankfully, DisplayPort 2.0-compliant GPUs are starting to surface (both the upcoming AMD and NVIDIA GPUs will support it), so we may finally see the start of their ubiquity in the coming years.
Been doing multi 1440p (or 2560x1600) since 2005, because I refuse to give up logical area by going to 4K. 3x5K will be happy days for me.
I really hope so... my iMac is starting to age and depending on what products get announced in the next year or so I may end up replacing it with a Mac Studio. If there are other displays to compete with the Studio Display I'd happily look at them to save some cash.
I guess what I meant to express is that, I think most people are roughly aware of these things as explaining the spec-cost gap. I thought there might have been something more? (My own experience is lacking-- I haven't used Apple laptops for more than a few weeks at a time, and I haven't used any of the recent M1 laptops.)
Based on experience, I would assume all of the components in the Acer laptop are lower quality than the MacBook Air.
Once I learned about “binning” years and years ago, I stopped paying attention to specs and instead just pay attention to commercial/business line branding. Or Apple in this case, which does not have a business line, but has earned a reputation for not using bottom dollar components.
I think you mean a lot more plastic crap, right? For a PC laptop of the same price (e.g. Lenovo X1), I find that they are a lot more flimsy than an aluminum unibody construction. In fact, I mostly moved to Apple to get away from plastic.
If I'm wrong and you have specific examples, let me know. But the last time I checked, PC quality, price for price, was still pretty bad compared to Apple.
Comparing Apple's hardware to shitty plastic laptops is a false equivalence though. Once you spend Apple money on hardware, you get much better stuff in return.
Apple is perceived to be high quality because they only serve the high end. When you compare ecosystems, you should at least compare the same market segment. There are also market segments that Apple has no products in (convertibles, for example, or sub-$600 laptops).
Candy Crush hasn't been installed by default for years now. There's plenty wrong with Windows 11, but the spam has been severely reduced and the user experience has improved in many respects (and gotten worse in others, e.g. not being able to dock the taskbar to the side, though you can't do that on macOS either).
I would not qualify entry level MacBook Airs as “high end”, yet they are unmatched for the tasks that 80% of people require from their laptops. And it has been this way for 10 years.
> I would not qualify entry level MacBook Airs as “high end”
They aren't high end, but if we're comparing to other laptops, $1000 is far from entry level. Entry level is $200-250, and mid-level, probably good machines tend to start around $500; for $1000, you usually get either good specs or nice aesthetics and sometimes both.
Of course, for some users, the $250 laptop has more than enough computing power for their needs (as long as it has a half-decent SSD, because Windows 10, and I assume 11, can't stop thrashing a spinning drive, and there goes your perceived performance; I've never dug into it like in this article though, swapping in an SSD is good enough).
I have never seen a laptop for sale for $250. I have probably seen laptops for sale for $500, years ago. Which I assume came with malware, a 30-minute battery, and the worst-quality components, leading to significant odds of failure within a few years.
There is somewhat of a correlation between price and quality of product. Below a certain price, you start getting into the "it's more expensive to be poor" scenarios, where the amortized cost of the product over its lifetime ends up higher than for the ones that cost more upfront.
I still remember the standard advice of buying a Windows consumer laptop was to reinstall the OS after buying it. In what world should that be acceptable? In my accounting, that time and effort spent installing an OS gets added to the price.
Have you been shopping lately? For $250, you can get something from almost everyone. It's likely to have an Atom processor, or something anemic from AMD, probably only 4GB ram, but most will have a (small) SSD these days. It may or may not come with preinstalled garbage, but you can usually uninstall that lately. Or just live with it. If these computers fit your needs, the junkware isn't going to impact you that much anyway. Some models even are upgradable at these prices, but soldered parts do save costs, and you have to accept cost savings if you're buying at the bottom of the market.
Sure, there's some correlation between price and quality, but if you're worried about longevity, 3-4 laptops of questionable quality are likely to last longer than a $1000 laptop anyway. And, screen aside, the 3rd and 4th cheap laptop might end up with better specs than the single quality laptop.

If screens are important to you, then that's not going to work, and that's valid; but a lot of people get a fancy hi-res screen only to run it in 2x mode and push 4x the pixels for a small difference in experience. It's certainly worthwhile for some people, but it doesn't make a big difference to me and many others.

In an ideal world, you could pick between screens on a laptop; there's a huge spectrum of screens that meet different needs and wants, but most manufacturers aren't giving options beyond glossy (eww) or matte in a normalish resolution, and on higher-priced machines maybe one higher-res option with no choice in finish. Sometimes, business-oriented laptops will have a couple of adjacent sizes available with the same bottom half of the chassis, but that doesn't happen for consumer laptops.
> I still remember the standard advice of buying a Windows consumer laptop was to reinstall the OS after buying it. In what world should that be acceptable? In my accounting, that time and effort spent installing an OS gets added to the price.
A lot of people say a lot of things. Windows works fine out of the box, most of the time. If you want something that values your time, Chrome OS devices are better: they work out of the box; cold boot in a couple of seconds; no junk (other than Google login, but you can run in guest mode if you don't need persistence); updates are done in the background, reboot whenever, none of those long waits at startup to finish stuff like on macOS and Windows. Plus, they start at even lower prices: usually something for $100, something with a mainstream x86 processor around $200.
I used an x86 Chromebook from Acer running Ubuntu for years at work, as a light meeting and trip machine. Still holding up really well 8 years later. Just put a bigger ssd in it. C720.
I have honestly, earnestly tried. I have tried to find a laptop that is for my needs and purposes truly equivalent to an Apple Macbook Pro. Or better. I always end up going with an Apple machine. There has always been a significant part of the assembly that just isn’t as good. It’s often disk speed or display quality. Build quality too.
I want to underline that this is true for what I need out of a laptop.
The opposite has been true in desktops. I have a Ryzen 3900X box and there still isn’t anything from Apple that I would replace it with. Not even to run macOS on it (which I do on the AMD box, using GPU passthrough).
My MacBook Pro 14" is pretty much on-par with my Ryzen 5900X CPU-wise (the M1 Ultra would surpass it by a wide margin). The GPU of the M1 Max is nothing to sneeze at, but I'd love it if they'd bring eGPU support to the M1 line. (And if one can dream, if NVIDIA would also make CUDA available.)
Comparing e.g. geekbench results for the M1 Ultra https://browser.geekbench.com/v5/cpu/14664498 and a decent 5950X (PBO) result https://browser.geekbench.com/v5/cpu/14665931 it looks like a lot of Apple's multicore score is attributable to AES (which uses special fixed-function HW). And with other tests, different CPUs win different benchmarks. 5950X notably wins in clang :)
> bring eGPU support to the M1 line
Probably not M1.
marcan tweeted recently that Apple's PCIe integration is broken in the same way as Broadcom's RPi4 SoC: mapping PCIe BARs as normal memory (which allows unaligned access) doesn't work. I can't find the tweet (deleted??) but here's the RPi thread: https://github.com/geerlingguy/raspberry-pi-pcie-devices/iss... Basically there's no quick kernel-level workaround for this; if you really want a GPU to work on such a broken platform, you need to patch every single thing in userspace to avoid unaligned access.
We'll see soon if that's fixed in M2, but I suspect they don't care…
No contest! The thing is mostly that I can mess with the Ryzen box, run a bunch of different OSes at full speed, and use various add-in hardware like video cards.
Apple has a lot of other bugs in their code, too. Did you ever try to delete a Boot Camp partition and merge it into your main disk when using their new filesystem? It corrupts the disk, and in order to reallocate the disk space you need to completely wipe the Mac and start fresh.