Maximizing your 24″ 2006 iMac with an eGPU, Yosemite, and a replacement screen (lowendmac.com)
51 points by jandeboevrie on Oct 18, 2023 | 34 comments



> Here at Low End Mac, we believe in the long term value of Apple hardware.

> The primary issue with the 2006 24″ iMac is its GPU, which tends to fail

So much long-term value that the author has to frankenstein new hardware into it. Just buy things that can be easily repaired, people.


Devil's advocate here: from 2006 to 2023, the GPU is the hardware component that's seen the most development.

I'm not running heavy workloads, honest question:

If someone's trying to build a top-of-the-line, bottom-of-budget system, is the best GPU they can afford attached to whatever 2006 iMac they have lying around honestly a bad approach?


Sigh. I have a 2011 iMac downstairs rotting away unplugged and thought about doing to it what I just did to the Mac Mini from the same era -- toss in some more RAM and an SSD, and put Linux on it.

Except once I started watching videos of doing work on the iMac, I was instantly turned off. It's like brain surgery. They made that machine really tough to work on, and accommodating a 2.5" SSD means putting it in the optical drive slot, etc. etc.

I would love to be able to replace the mainboard entirely with something else. Doesn't look super feasible though.


If you don't mind parting with it, you could sell it on eBay. There are plenty of people who are interested in restoring older Macs and using them for fun and learning. It'd be better than just sitting idle, although it could also gain value in the future, who knows.


We stopped using this one because it would occasionally freeze up / halt with garbage on screen; I think something was wrong with the GPU, not sure.


I vaguely remember disabling the GPU on mine by removing a kext. Obviously no good for games but it wasn't great for that anyway.


I have a 2013 MacBook Pro I got off eBay that has issues with the GPU, but otherwise is a perfectly usable machine. One NVRAM flag to disable the GPU later and it works most of the time. If it goes to sleep, you have to reboot it but it usually comes back on properly if that happens.
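For anyone wanting to try the same trick: the incantation that gets passed around (it comes from the MacRumors dGPU-disable threads for the 2011 models; I'm assuming it applies to the 2013 as well, so verify before running it) is

    sudo nvram fa4ce28d-b62f-4c99-9cc3-6815686e30f9:gpu-power-prefs=%01%00%00%00

followed by a reboot. Running nvram -d on the same variable undoes it.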


This makes me wonder: how long does it make sense to keep running older, less energy-efficient computers? There will be a point where the energy wasted on less efficient components is higher than the energy used to build a modern computer.

The CPU in this iMac has less than a quarter of an iPhone's speed. If it weren't for macOS, this could probably run on a Raspberry Pi.


I just looked it up – if you're interested in comparing iMacs (2006) to iMacs (2021), it seems the power draw comparison is ~160W vs ~80W.

All of this is moot, though, if it's just a proof of concept/I did it to see whether it was possible!

That being said, it's a good question: I always wish there was a tool where you could calculate all of the tradeoffs involved - TCO for you (both purchase and operating costs), TCO for the environment, etc.
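As a sketch of what the "TCO for you" half of that tool could look like (a minimal back-of-envelope script; every number in it is a placeholder I made up, and the wattages are just the rough figures quoted above):

    # Rough personal TCO: keep the old machine vs. buy a newer one.
    # All inputs are illustrative assumptions, not measurements.
    def tco(purchase_eur, watts, hours_per_day, years, eur_per_kwh=0.30):
        """Purchase price plus electricity cost over the machine's lifetime."""
        kwh = watts / 1000 * hours_per_day * 365 * years
        return purchase_eur + kwh * eur_per_kwh

    keep_old = tco(purchase_eur=0,   watts=160, hours_per_day=6, years=3)
    buy_new  = tco(purchase_eur=350, watts=80,  hours_per_day=6, years=3)
    print(f"keep old: ~EUR {keep_old:.0f}, buy new: ~EUR {buy_new:.0f}")

Under those made-up numbers, keeping the old machine still wins (~EUR 315 vs ~EUR 508); the environmental half would need an embodied-energy figure for manufacturing the new machine on top of the operating energy.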


Also, power is not energy. If it uses twice the power, but takes 4x longer, that's 8x more energy.
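Concretely, with the ~160W vs ~80W figures from above and task times invented for illustration:

    energy_old = 160 * 4.0   # 160 W for 4 hours = 640 Wh
    energy_new =  80 * 1.0   #  80 W for 1 hour  =  80 Wh -> 8x less energy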


Oh my god, you're right – I can see my high school physics teacher shaking his head at me!


This is one of the reasons datacenters turn over equipment every 4 years.


A single solar panel on the roof will run a lot of computer, old as well as new. Just add a panel and you can keep on running that older hardware.


Probably not that big of a deal


Is it possible for a non-Apple-silicon Mac with an eGPU to be faster than a maxed-out M2?

When working with LLMs, inference speed is crucial, and the M2 Ultra is not there yet.


You can put more powerful / faster GPUs in an Intel Mac Pro than the M2 Ultra's GPU, for sure.

I do recall that a while back Nvidia released updated macOS drivers, but I'm not sure of the state of those on modern macOS.

But it certainly was possible to run an NVIDIA eGPU at some point.


It depends, but no: only AMD cards are supported (in macOS) for eGPU, and they're all slow as hell too (for inference).


I happen to have a photo from 2006 of mine driving an external 23" Cinema HD display in addition to its built-in display, with apple.com, yahoo.com, cnn.com, and slashdot.org up:

https://imgur.com/a/ulfPrvf

Unfortunately the focus is soft. I wish I had a higher resolution version with sharp focus.


If you bought a PC instead of a Mac, you could upgrade any component at any time.


I'm not one to defend Apple, but that's not really true in a practical sense.

You can swap GPUs, sure. But eventually the CPU will be too slow relatively speaking, so you'll need a faster CPU.

And if you need a new CPU, most sockets don't support future CPUs (or only 1-2 generations ahead), so you'll need a new motherboard.

The power draw of CPUs and GPUs has increased, so you'll likely need a new PSU if you were only just barely getting by before.

Because of the new motherboard, you'll need new RAM. You can't socket DDR3 into a DDR5 slot.

A new CPU socket may require a new CPU cooler too, depending on the brand of cooler and how good it is at maintaining compatibility. Even if it has a compatible mount, you may need a new one just because power draw has increased so much that an older cooler may not be sufficient anymore.

If you're just swapping the GPU, that's easy and will most likely work for a few generations. But sooner or later, the whole computer will need to be replaced, except maybe the case.


Practically, I think you end up with a couple of genuine breaks:

- CPU/Motherboard/Memory

- GPU

- Storage

- Monitor

with the PSU sometimes happening alongside one of those, and the case being an optional anytime upgrade. Over enough time you'll eventually need to replace everything, but the nice thing is you don't have to replace it all the moment the first component gets outdated.

The ultimate killer of this value is that most people just don't upgrade components; it's too in the weeds for the average user.


That's fair. Those are all mostly mutually exclusive.

Personally I wait until several generations have gone by (at minimum 4 Intel, 2 Nvidia), so it's best to just replace almost everything. It's a lot of fun assembling a new machine.


I have reused PSUs for about 3 or 4 builds. It is always a good idea to overinvest in a good quality PSU, with some room to spare.

CPU+board+RAM are usually replaced together, as a combo. So, not ideal, but not that bad either.

Storage is also independent and easily upgradeable; I've had hard disks last several builds, just like PSUs. I think my last desktop had four disks inside.


And if that isn’t satisfying enough, you can spend your time judging people who have possession of a nearly two decade old Mac and who might be interested in doing something with it?


Keep in mind that if you bought the PC at the same time as this Mac, you'd likely be looking for an AGP graphics card. While there's new old stock available, even if you can find a 2007 HD3850 it's not really much of an upgrade.


Whilst I think such projects are a nice way to prolong obsolete hardware's lifespan, I don't understand who needs macOS here, especially an obsolete version of it. What is the use case? In what way is that better than Linux or any other free OS?

If you need some proprietary software, like the Adobe suite or the like, e.g. for work, why not buy a much better/newer machine? The machine here is clearly a home computer. I have an old MacBook that serves as a home laptop as well.

A modern, updated OS is better than an outdated one. If the hardware is too slow, the free OS world has you covered; the number of DEs (desktop environments) is plenty for every user. Even behemoths like KDE/GNOME work very well on 4 GB of RAM and some 'dumb GPU' machine. So why bother with macOS-supported hardware?


Someone might want to run old Adobe stuff, prior to its exponential bloat and rentification, on an era-appropriate OS like Snow Leopard (which is widely thought to be one of the best versions of Mac OS) that's familiar. It wouldn't be smart to take such a machine online, but there's no reason why it couldn't serve as someone's comfy offline "zen mode" creative machine.

That said if I were to do this I’d probably build a hackintosh running Snow Leopard or Mavericks or at least procure an old cheesegrater PowerMac/Mac Pro tower for the extra flexibility offered rather than using an old iMac.


I use old PCs and Macs because they continue to work. I also have an old MacBook (2008) that I wish I could repair, so I could continue using it. While the old MacBook doesn't run modern software, it ran Linux well, and it would've been fine running Home Assistant, web servers, a NAS, etc. I'd hate to see old computers go to e-waste if I could just fix them cheaply. I also might've run some classic software that's no longer supported on modern systems, like old Adobe Photoshop that ran without a subscription, and classic games as well.


Why can’t you repair your 2008 MacBook? Is it the aluminum model A1278 or the polycarbonate model A1181?


The main issue for most people is legacy peripheral support. There was never a good reason for the iMac to be glued together rather than just using screws.


Legacy peripheral support is why I keep an old Mac mini in my music-making setup. I have a Tascam mixer/audio interface that the whole thing is built around, but Tascam never released a 64-bit OS X driver. They released a 64-bit Windows driver but stopped support just after Windows 7, so it's still a nightmare to get it to work on any Windows version post-7, even with the 64-bit driver. There are workarounds to get it working in Windows 10/11, and there are even ways to get it working in modern macOS by getting it running on a Windows machine and doing a network MIDI bridge, although you lose the audio interface part.

All that's a headache, so I just keep my old Mac running old software. It's annoying not to have the latest features (an ancient version of Logic that runs like shit, no support for new plugins, etc.), but it still works.

Maybe someday I'll upgrade, but it would be pretty costly to do so, and this works despite the headaches. I really wish manufacturers were required to at least release documentation for hardware like this when they drop support; it was quite expensive back in 2005 or whenever I purchased it, and now it's basically useless outside of this niche scenario. Frustrating, and one of the many projects I want to someday get to, although I'm probably too stupid to reverse-engineer a FireWire bus device and write a driver, ha.


I wonder what the consequences are of blocking the GPU's fan intake with a sellotaped ribbon cable...


Seems like a lot of effort for a really old computer. Was it worth it? Also, the 2006 iMac comes with a mini-DVI connector; the article didn't seem to discuss whether that was usable.

Edit: Okay, I missed the part where the GPU is failing. The same thing happened to my 2010 MBP; it's currently collecting dust.


Honestly, at this point it's easier to just tape an M1 Mac mini off eBay to a monitor of your liking. You won't have a fancy logo, of course.



