Mac Mini Pays for Itself After 2 Years (cjgill.wordpress.com)
78 points by sahaj on Oct 26, 2009 | 72 comments



A few people are debating the numbers, but I think there's a much more important message here. Consumers rarely factor in the total cost of ownership (TCO) of a computer, and only look at the retail price.

I used to think this way, particularly because I didn't have much money: I needed to buy the cheapest computer I could at any given time, which meant paying more over the lifetime of the machine (a lifetime that might not be very long).

It was only once I began working with data centers that I realized how huge the power and cooling bills could be. In a datacenter, the power and cooling requirements of a high-end system can easily cost more than buying the system in the first place. A great example is Western Digital's new "Green" 2TB drives. The drive costs more per GB upfront than two 1TB drives, even when you factor in the cost of housing the drives; but once you look at the power the drives draw (and in aggregate, hard drives really do pull a lot of energy), the 2TB drive can come out cheaper. The same analysis applies to SSDs.
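
To make that concrete, here's a back-of-the-envelope sketch; every number in it (prices, wattages, the cooling overhead, the drive-slot cost) is an illustrative assumption, not a vendor spec:

    # Rough TCO: one 2TB "green" drive vs. two 1TB drives over 3 years.
    KWH_PRICE = 0.12   # $/kWh, a common US rate
    PUE = 2.0          # assume ~1W of cooling per 1W of IT load in the datacenter
    YEARS = 3          # assumed service life
    SLOT_COST = 50     # assumed cost of housing one drive (bay, port, cabling)

    def tco(purchase_usd, watts, drives=1):
        kwh = drives * watts * 24 * 365 * YEARS / 1000.0
        return drives * (purchase_usd + SLOT_COST) + kwh * PUE * KWH_PRICE

    print("one 2TB green drive: $%.2f" % tco(250, 6))           # ~$338
    print("two 1TB drives:      $%.2f" % tco(90, 8, drives=2))  # ~$381

With these made-up numbers the 2TB drive costs more per GB at purchase ($300 vs $280 including slots) but wins on total cost; shift the assumptions and the answer can flip, which is exactly why you have to run the TCO numbers instead of comparing sticker prices.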

If you want to be smart with your money, and have more of it in the long run, don't forget to account for ALL the costs you'll incur with each option before comparing them. This applies to just about everything, not just computers.


Saving on power in the datacenter is one of the worst ways to try to save money, in my experience.

It varies greatly, but $10 million worth of servers, network gear, and storage might cost $25k/mo in power. Shaving 5% or 10% off that bill is practically meaningless. You can save a lot more money in other ways.

Obviously at massive scale (think Google) it's a different story, but for the average small-medium company it's not even worth thinking about until you've gone after everything else.


While I agree that at those numbers a 5% to 10% savings isn't worth the effort, that isn't a good reason to dismiss reducing data center power use altogether. Nissan recently started rolling out virtualization in some of their datacenters, resulting in 34% energy savings. For the kind of numbers you list, that would come to over $100k of savings a year.


I found the press release you were referencing: http://www.informationweek.com/news/software/server_virtuali...

It proves exactly my point: focusing on power usage is backwards. You should worry about other things, like using fewer servers.

The guy who worries about using fewer servers consolidates 159 servers down to 28. He saves $5 million and $10k/mo. The guy who worries about power usage itself just buys 159 slightly less power-hungry servers. He saves $2k/mo.
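
Putting rough numbers on it (the per-server wattage and the cooling overhead are assumptions; the server counts are from the press release):

    # Consolidation vs. mere efficiency, at $0.12/kWh.
    KWH_PRICE = 0.12
    PUE = 2.0   # assume ~1W of cooling per watt of IT load

    def monthly_cost(watts):
        return watts * 24 * 30 / 1000.0 * PUE * KWH_PRICE

    # Assumed: a typical server draws ~400W.
    print((159 - 28) * monthly_cost(400))                 # consolidation: ~$9.1k/mo saved
    print(159 * (monthly_cost(400) - monthly_cost(320)))  # 20% more efficient boxes: ~$2.2k/mo

And the consolidation case also avoids buying ~131 servers in the first place, which is where the $5 million comes from.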


Ah, now I understand the distinction you are making. And I agree; get fewer servers, not the same number of slightly more efficient ones, and then you can talk about saving money on power.


Saving on power in the datacenter is one of the worst ways to try to save money, in my experience.

You know what's a worse way to save money? Running out of power/cooling/floor loading and having nowhere to go...


Our colo was built back when a watt of computing took up a lot more floor space... meaning we had two racks of machines in a giant, otherwise empty room and had already blown our power budget.

Power is often the constraining factor.


Oh come on...

Computers get more efficient all the time. My last power-hungry desktop was replaced with a notebook built around a mobile ultra-low-voltage CPU. Guess it paid for itself, too. But it wasn't from Apple, so I didn't make a fuss about it.

You know what, since you are so good at maths, I could sell you my old car for 10000 bucks. That way you would save 20000 bucks because you would not have to buy a more expensive car instead.


Whenever Apple takes a crap, fanboys flock to the stall to comment about how smart the smell is, and how stylish the toilet paper is. How only Apple could produce turds of such quality because they really, you know, think about what goes into the process and their specially formulated fiber diet is a totally unique combination of oats and bran that you simply can't replicate in the Windows or Linux world. So you have to pay 3x the price for that formulation.

And then blogs are filled with how it's all just so totally brilliant.


fanman?


He might have saved even more electricity had he gotten a laptop with the same specs as the mini. My Dell Inspiron 1525 has a 65W power supply, vs the similarly-equipped Mac Mini's 110W, and that 65W includes the screen. Adding an LCD to the Mac Mini adds another 30-50W.


Your power supply's rating only documents an upper bound. It does not tell you how much power your device actually uses.

Oddly, on laptops the power supply may not even be capable of satisfying the peak power consumption. There are laptops that can only go to full peak usage if they have their battery in and charged. The charger just covers average consumption plus charging current.


Is that the Mini's power supply max rating, or its average power draw?

I'm guessing the mini has a bit of extra power for USB devices and when the processors really kick it up. Would be fairly easy to test... where's my multimeter..


That is the max; it actually slurps about 35W under load.

edit: and then you need a display of some kind ;)


Your energy-star rated 20"+ screen (with an LCD backlight) typically consumes 20W when on and 0W when powered off by the computer.

So I believe the Mini would consume, on average under use, about 55W, which is a rather commendable figure for a proper desktop rather than a nettop. Adding peripherals will obviously affect this; a portable hard drive can pull up to 18W (maxing out the USB power supply). I'd definitely say opting for a bigger HDD at purchase is better than getting an external drive.

I do have a question for Mac aficionados: is OS X Server akin to Windows Server 2008, or to the Ultimate editions of Windows? I was wondering if the Mini 1TB with OS X Server is as usable and accessible to a new user as any standard OS X. (I'm rarely a Mac user, but my next purchase is likely to be a Mac, so I don't want to get my ass bitten, either by losing out or by screwing myself over through lack of knowledge when making a purchase.)


We run a few Mac OS X Servers at work. Unfortunately, they are a few versions old (two are Panther, one Tiger). They are probably closer to Windows Server 2008 than to the Ultimate editions of Windows, but neither is a great comparison.

The core system is your standard Mac OS X. I feel odd when I log onto our servers and see an iTunes icon. So you won't miss anything by getting Mac OS X Server.

The "server" part is more like having all of the capabilities of a Linux server, with a nice easy to use GUI to configure them. The newer versions include some Mac specific services, but for the most part, it's like having a Linux server and a Mac Desktop in one package.

I think the main benefit is licensing... with the "normal" version I think you are restricted as to how many people can connect (10?). But the server version is unlimited. This probably doesn't matter to you, but the Mini server would be nice for a department file server, or something like that.

If I were getting a Mini, I'd get the Server one just for the hard drives. Don't let the "server" part discourage you... it's still a Mac.


From my superficial experience with OSX Server, it's like Windows Server in the sense that it is OSX Plus. All the normal stuff is there, then server specific stuff is there. That includes both nice GUIs on top of open source stuff (Samba, etc), and nice GUIs on top of totally custom stuff (Calendar sharing, Wikis, etc).

If your needs are ordinary web serving with open source tools, regular OS X will be just fine. Be sure you really need Server before buying; regular OS X can do any server-type thing that Linux can.


I was wondering if the Mini 1TB with OS X Server is as usable and accessible to a new user as any standard OS X

Yes, but:

- The Mini has 2.5" drives, so 1TB is not possible yet
- Why would you use OS X Server? There's nothing in it that you need


I'm confused - or you are.

A new option is a Mac Mini with 2x 500GB drives (without an internal optical drive), running OS X Server, for $999.

http://www.apple.com/macmini/server/specs.html


Fair enough, but if you still want to use it as a server (btw, why?) you'll probably make a RAID 1 from them, so it'll still be 500GB max.


I suppose. I use our house's Mini as a fileserver most days. No screen needed. The 6+ hard drives connected via USB and FireWire probably eat more than 35W, however. Other days it's a media center and we're using the projector for a screen (which certainly eats up a great deal of power).


While your suggestion is along the right lines, keep in mind that computer power supplies are not like lightbulbs. They are spec'd for the maximum power they can deliver to the device (and thereby draw). So your 110W power supply is not drawing a constant 110W the way a 110W light bulb always draws twice the power of a 55W bulb (all else being equal).

That being said, it is reasonable to assume that a PC with a 65W supply will generally use less power than one with a 110W supply, but much depends on the use.


Quite so. My Studio 1555 has a 65W unit and my LCD monitor is aging. Even with my 60W Acer AL1916W drawing power, I use less than the Mac Mini would. And when I don't need it, I turn the LCD off.

Doing a little math based on my electric bill, my laptop and LCD together cost me US$112.57/yr. At roughly $240/yr for the Mac Mini alone... I'd say that my Dell laptop wins out.


http://www.goodcleantech.com/2009/03/its_official_apple_mac_...

"The mini uses only 15W while idling in our tests, and a low 34W while running the CineBench benchmark test. This is the second lowest active score, after the 18W observed on the Asus Eee Box nettop."

Moral of the story: Don't judge power consumption by power supply capacity.


When running off battery power, the battery widget in KDE gives an estimate of current power consumption. With the screen at full brightness, my 17" notebook usually stays around 27W doing non-intensive tasks like surfing the web.


They don't make laptops with 42" screens, which is what he has attached to the mini.


You could just as easily connect a 42" screen to the laptop, as I sometimes do using my laptop's HDMI out. It will still end up using less power than the Mac Mini.


Rule of thumb: for each watt your always-on device consumes, you will pay about $1/year in electricity (at $0.12/kWh, a common US price).

You folks in Silicon Valley are paying 50% more than that for your electricity, so adjust accordingly.

His savings are high: $233/year, which says to me he saved about 150 watts. That seems a touch on the high side, but if his Pavilion did not sleep well (common in PCs) and his Mini does, it would be right on the button.
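
For anyone checking the arithmetic:

    # The $1/W/yr rule of thumb, and backing out the wattage from his savings.
    HOURS_PER_YEAR = 24 * 365                         # 8,760
    print(HOURS_PER_YEAR / 1000.0 * 0.12)             # 1W always-on -> ~$1.05/yr
    SV_RATE = 0.18                                    # ~50% above $0.12/kWh
    print(233 / (HOURS_PER_YEAR / 1000.0 * SV_RATE))  # $233/yr -> ~148W, i.e. ~150W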


An always on torrent server, as described in the article, doesn't sleep.


Possibly a title reworking is in order so it looks less like spam...

edit: exactly, thanks.


[deleted]


It certainly bears a high resemblance to the ubiquitous banner ads advertising "Free* <Insert Miscellaneous Personal Electronics Here> *after you sign up for 'steen credit card offers or what-have-you and get five formerly-friends to do likewise."


[deleted]


I think everyone who's read the article understands that it's a genuine article not out there to scam people, but ptomato suggested that the title be changed (perhaps to something along the lines of "Mac mini Pays for Itself After 2 Years").


Possibly a title reworking is in order so it looks less like spam...

I guess I misunderstood. It looks like he was saying that I reworked the title to make it look less like spam, but this is not spam at all.


ptomato isn't saying this is spam, he's saying the title looks like spam. Good article, bad title.


Side benefit of low power: low noise.

That's actually the first thing I check when buying a new computer. How silent is it?


The Mac Mini is almost completely silent unless the CPU is being maxed for long enough (a few minutes), causing the fan to kick in.


No it isn't. I tried to use my PowerPC Mini as a home server, but my gf complained about the constant noise, so I had to switch it off. The fan spins continuously.


Wasn't the PPC notorious for running hot? My Intel Mac runs completely silent unless I'm building something from source.

(I have no experience with PPC Macs. Comment solely based on anecdotal evidence. Correct me if I'm wrong.)


G4 PowerBooks (PPC) were known to cause severe burns. So much so that Apple never called them laptops, because you couldn't use them on your lap. (Granted, those people were wearing shorts and had the computer on bare skin, and the heat built up slowly, so you didn't notice it.)

Point being, the PPC Macs ran damn hot. My old G4 PowerBook ran hotter than my current MacBook Pro. IIRC, notebook cooling and power requirements were the main reasons for the switch to Intel chips.


It is not unreasonable to expect the CPU to be maxed out while the computer is switched on. If you don't use it, why keep it on at all?

It was a big disappointment to find that my new unibody MacBook Pro whines like a rack server even under a slight load, well below 100% on one core. Meanwhile my old MBP was barely noticeable at 100% on one core, at least until the fans got a bit worn.


BeagleBoard + SSD = really silent.


Remember to check the LCD monitor for high-frequency noise, too. Recently I came across several Samsung monitors with this issue :|


It seems to be a quality control issue. We had 100s of the same model monitor where I used to work. Some of them had the high-frequency whine, some of them didn't.


There are too many other factors to consider when looking at the final bill. Next time, use an in-line power use monitor and tell us about it.


auto sleep your computer after x minutes of idle. done.


There are many things besides auto-sleep that can save you power. In fact, an LCD monitor consumes about as much power as an idling Mac Mini (roughly 20W each). Simply setting your monitor to auto-off when not in use can save a vast amount of energy; a typical LCD monitor consumes no power while switched off by the computer. A CRT (for those who still use them) saves phenomenally when turned off, dropping from ~70-80W down to ~2W, which essentially means that leaving a CRT off but plugged in is less harmful to the environment than working an extra hour at it. These are all Energy Star monitor figures, too. Screen size and backlight also vary the consumption: some 20"+ LCD monitors can easily hit 100W, while Dell's current line hits only 20W.
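
To put yearly numbers on it (assuming 8 hours of actual use per day; the wattages are the rough figures above):

    # Yearly monitor energy: always-on vs. auto-off (wattages as above, assumed).
    def yearly_kwh(on_watts, off_watts, hours_on_per_day):
        return (on_watts * hours_on_per_day + off_watts * (24 - hours_on_per_day)) * 365 / 1000.0

    print(yearly_kwh(75, 75, 24))  # CRT left on all day: ~657 kWh/yr
    print(yearly_kwh(75, 2, 8))    # CRT with auto-off:   ~231 kWh/yr
    print(yearly_kwh(20, 0, 8))    # LCD with auto-off:    ~58 kWh/yr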

My Core Duo laptop with a 17" screen consumes a max of 90W with a CCFL backlight; a Mac Mini can draw up to 110W, plus an additional 20W for a display. So I'm still not sold on the Mac Mini as an environmentally friendly desktop; there are better solutions out there, and my old laptop still beats the Mini in efficiency terms.

I'd also add that anyone using an SSD should set a very short delay (maybe 10 minutes) between sleep kicking in and hibernate kicking in. The fast start-up an SSD allows can maximize power savings while minimizing inconvenience, which is arguably the best way to aid the fight against global warming.


So I'm still not sold on the Mac Mini as an environmentally friendly desktop; there are better solutions out there, and my old laptop still beats the Mini in efficiency terms.

That, and the lack of a built-in UPS, is why I never found the Mac Mini a reasonable buy.


Can you wake it up with an IR remote? If not, then auto-sleep is a non-starter for a set-top box. Just for example.


The Media Center remotes will wake/sleep your system. Windows Media Center does a fantastic job with sleep, actually. It'll wake up to record, then go back to sleep. If you hit "sleep" while it's recording, it will shut off the video output, then sleep when it's done.


You can generally wake up most PCs via a USB human interface device (HID), and there certainly are IR receivers that act as HIDs. Not that I've tried it.


Many new motherboards do come with wake-on-USB, and it works great.


Where do you live? If it's somewhere warm I'd say not having fans might be part of your cost savings.

It would be much more compelling if you were giving us an apples-to-apples comparison with regards to the dates. My electricity bills in the winter are 1/6th what they are in the summer (although that's due to air conditioning).

[Edit: whoops, I misread the examples in the OP's link; for some reason I thought the lower bills were in the cooler months, and that is not the case. Still, my question stands with regard to wanting an apples-to-apples comparison]


Mine is the opposite, since I have no AC. Winter bills are much higher.

The month of the change is probably a very important factor. If you make the change between January and February, or June and July, the weather is probably a minimal factor if at all.


I have a similar experience.

I got used to having a linux server on 24/7 when I was in college (no electricity bill in the dorms).

I moved to California when I graduated, and was spending over $40/month on electricity. I borrowed a friend's Kill A Watt and measured my server at 180-220W depending on the load!

I put together a machine based around the Via C7 and measured it as using 18W at the outlet. The whole thing cost a bit over $300. Also, no fan, which is nice as the server is now in my living room. As soon as SSDs get cheap enough, I'll have a no-moving-parts server.
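
The payback math, assuming a $0.15/kWh rate (California rates vary; plug in your own):

    # How fast the ~$300 Via C7 box pays for itself vs. the old 180-220W server.
    KWH_PRICE = 0.15                  # assumed; adjust for your utility
    saved_watts = 200 - 18            # midpoint of the old draw, minus the new box
    yearly = saved_watts * 24 * 365 / 1000.0 * KWH_PRICE  # ~$239/yr
    print(300 / (yearly / 12))        # ~15 months to break even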


I've got an Atom-powered rackmount firewall (http://www.netgate.com/product_info.php?cPath=60_107&pro...), a fanless rackmount HP ProCurve GigE switch, and a 2009 series Mac Mini running the home/business network. The rest is done with notebooks.


I'm not sure how they justify the claim that the Mac Mini is the most power-efficient desktop. According to another post in this discussion, its average draw under load is approximately 35W. The Lenovo M58 and M58p, which have been out since December 2008, also draw an average of 35W under load, and are likewise EPEAT Gold and Energy Star 5.0 rated. I believe HP has had a similar product for a while too; probably every PC maker does.


Where does the wasted energy go? If it is lost as heat and you have electric heating in your house, would that lead to a correspondingly lower heating bill?


Yes, but you really don't want to heat your house or your hot water with straight electric radiators or water heaters. Use a heat pump: it needs only 30-40% of the electricity that resistive heating does.

I have a heat pump that takes heat from the ground (two boreholes), and for every 1 kWh of electricity I put into it I get the equivalent of 2.5-3 kWh of heat in my house. This is essentially stored solar heat in the ground, which the heat pump transfers into the house. The electricity comes from regional hydropower. With roughly 50,000 kWh of electricity needed for heating without the heat pump, at a price of about 15 US cents per kWh, this knocks the heating electricity down to about 18,000 kWh, a significant piece of the bill, and gives a lower environmental impact. http://en.wikipedia.org/wiki/Heat_pump#Ground_source_heat_pu...
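
Spelling out that arithmetic (the COP of 2.8 is a rough midpoint of the 2.5-3 range):

    # Ground-source heat pump: electricity needed for 50,000 kWh of heat.
    HEAT_DEMAND_KWH = 50000           # resistive-heating equivalent, from above
    COP = 2.8                         # ~2.5-3 kWh of heat per kWh of electricity
    PRICE = 0.15                      # $/kWh
    electricity = HEAT_DEMAND_KWH / COP             # ~17,900 kWh ("about 18,000")
    print((HEAT_DEMAND_KWH - electricity) * PRICE)  # ~$4,800/yr saved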

Conversely, if you are in a warm climate, your aircon will have to take all that extra heat from your computer and push it out of the house, with a corresponding loss in efficiency.


I was born and raised in Las Vegas, NV. So was my father. His father started a water well drilling business in 1953; my father joined him in that business (post-college) in 1957.

In 1980, my father decided to build a house. We (meaning I) were sent to the property to drill two 800-foot-deep 'wells', which were subsequently cased with 8" (ID) steel pipe, but not perforated. (There was a 1/2" plate on the bottom of the casing.)

After this, we filled the casing with water. (A fine test of one's skill at welding.) Then we added a 2" galvanized 'loop' inside the casing, which was hooked into a heat pump.

It was the single-most efficient "single family residence" HVAC system in the state.

We subsequently added hot-water recapture to the indoor AC units, in order to make the most of the heat "pulled" out of the house.

Dad also built the house with R-44 insulation in the walls, R-60 or so in the attic, and a white 'Spanish tile' roof.

The result: a power bill of around $100/mo for a 5,000 sq ft home in the desert that was numbingly cold in the summer and pleasantly warm in the winter. (It does get cold in LV.)

It helped that 40% of the square footage was basement (something else nobody in Las Vegas has, because the ground turns hard about 4 feet down).


On the other hand, if you are running air conditioning, you'll have to pay once to generate the heat (the computer), and pay again to extract the heat from the house (air conditioning).


Short answer: yes!

Long answer: in certain limited cases it can differ. Maybe you use your PC mostly during the day, and your house is badly insulated, so most of the heat from the PC is lost. At night, you turn on the heating. In that case, using the PC during the day has no influence on the amount of heating needed.


Would love to know what his kWh consumption is per month. His electricity bill is half of mine. Is electricity generally cheap in North America?


I pay $0.085 per kWh. That's lower than most of the US, though. I can power my 1,400 sq ft house, with a MacBook Pro running 24x7, for about $90/month.


You live in a wonderful country!


I can tell you that in Europe, €0.20-0.25/kWh (about US$ 0.30-0.37, ZAR 2.20-2.80) is typical for residential customers.


OK, that is fairly expensive. I pay ZAR 0.74 (~USD 0.097) per kWh.

Though a recent proposal for a 135% increase over the next three years is under discussion.


I'm not sure, but NPower in the UK appears to be 8 pence per kWh (about 12 cents). That does seem low!


My figures include taxes and average amortization of any fixed costs. That said, UK energy prices do seem to be lower than in central Europe; I'm not sure why.


My guess is something Atom-based, like the EEE Box, would save even more in electricity.


Correct, they use about 50% less power, plus they cost less to purchase in the first place.


Best home computing device I own!

It's connected to my LCD TV and acts as my source of TV, movies, and other entertainment, along with net surfing and conversing!



