I'm typing this on a Windows PC with 192GB of RAM.
64GB RAM LIMIT? WHY OH WHY?
It is 2013. 64GB is not enough for many serious tasks.
(repost from a similar thread earlier today)
The RAM limits on laptops are nearly understandable. But on a machine like this, it seems like obvious greed on the part of Apple to give this machine a maximum life span of 3-4 years.
I guess the industries that benefit from "the more RAM the better, with no limit" would be 3D rendering, video editing, simulations, and large-format photography. I am doing the latter, joining thousands or tens of thousands of images into a single seamless photo.
I am aware that in many applications the GPU has taken over some of the "heavy lifting", but that is not true in all cases. Given that there are basically no real power or thermal considerations limiting the RAM for a computer like this, I would really like to know whether this 64GB limit is real, or whether it is just a matter of what RAM modules are available on the market now. Is it a limit in the OS? Will the Mac Pro be able to take 256GB of RAM in the future, or not?
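To put rough numbers on my use case (all sizes here are hypothetical, just to show the arithmetic):

    # Rough memory math for a stitched panorama (hypothetical sizes).
    # Even a modest 2-gigapixel output at 16 bits per RGB channel needs:
    pixels = 2_000_000_000             # 2 gigapixels
    bytes_per_pixel = 3 * 2            # RGB, 16 bits per channel
    print(pixels * bytes_per_pixel / 2**30)   # ~11.2 GiB for ONE layer
    # Stitching software also holds source tiles, masks, and blending
    # pyramids in memory at once, so peak usage is several times that.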
> It is 2013. 64GB is not enough for many serious tasks.
You present a few valid examples. But I can't see how that represents "many" tasks; it's a very, very small subset of the fields in which people use computers for work.
If there were no hardware out there that supported the RAM requirements you're talking about, that'd be a great area for development. But if you're suggesting everything -- or even most things -- should support what to most of us is an over-the-top amount of memory, perhaps your niche is not quite as major as you think.
I agree that it's a niche application, but imho it's one of the target demographics of a 4k+ USD workstation. I'd like to know the reason (software/hardware) behind the 64 GB limitation as well.
It's a $4000 workstation. You buy it to run big jobs. Your department buys 10 or 20. In 3 or 4 years, the amount of "big data" grows and now you need more RAM. Do you simply spend another $100k to buy all new machines? You're thinking only about today's needs, but a few years out, memory needs always grow.
Not disagreeing with the memory point... just pointing out that a 12-core processor would, most likely, not be up to the task in 3 or 4 years if your memory needs grew that much. So wouldn't you buy new machines every 4 years anyway if your department were doing that sort of work???
The Mac Pro itself is a niche machine and priced as such, though. I mean, it uses a Xeon workstation/server-class CPU - how many computer users need Xeon and wouldn't be just as happy with a much cheaper high-end desktop CPU?
It's not unheard of for Macs to be limited in how much RAM they'll recognize, but it's not common. The maximum is typically limited by the number of RAM slots and the largest modules on the market. The Mac Pro has 4 slots and 16GB modules are the biggest out there at the moment. I would put good odds on it being able to take e.g. 128GB when 32GB modules become available. But there's no guarantee.
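A minimal sketch of that arithmetic (slot count per Apple's specs; module sizes are just what the market offers):

    # Maximum RAM = slots x largest available module, not an OS constant.
    slots = 4
    largest_dimm_gb = 16            # biggest ECC DIMM on the market today
    print(slots * largest_dimm_gb)  # 64 GB now; 128 GB if 32GB DIMMs ship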
Is it an absolute requirement that the machine on your desk is the powerful one? For example, lots of scientists and engineers rely on massive computing power to do their work, but that power is in a server room somewhere. They often have modest laptops on their desk.
I doubt most people doing bioinformatics are doing their number crunching on the machine in their office. They're probably using a cluster that's maintained in a server room.
Keep in mind that these are workstations, not servers. They are intended to be used by an individual who, for whatever reason, has interactive applications with large compute requirements.
Many scientific computing tasks are memory intensive - operations on large matrices, heavy simulation, and so on - and R and Python both tend to keep their data in memory rather than writing it to disk.
I've slammed a cluster node with 1 TB of RAM pretty hard, I can easily see exceeding the 64 GB the Mac Pro comes with. Though my solution to this is admittedly "Toss it on the cluster".
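To make "memory intensive" concrete, a quick back-of-the-envelope in Python (the matrix size is just an example):

    # A single dense double-precision matrix can blow the 64 GB budget:
    n = 100_000                  # a 100,000 x 100,000 float64 matrix
    bytes_needed = n * n * 8     # 8 bytes per element
    print(bytes_needed / 2**30)  # ~74.5 GiB -- before any temporary copies
    # Most matrix operations allocate at least one result or workspace
    # copy, so peak usage is a multiple of this.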
Large image manipulation has been done for years now, no? If we managed it ten years ago on comparably puny machines, what has changed? Bigger images, I guess, are one thing, but does that keep pace with the advances in RAM?
(That said, I am amazed at how my desktop slows down with 4GB of RAM these days.)
Back in 1995, Toy Story was rendered with a server farm of 87 dual-processor 100MHz SPARCstations w/ 192MB of RAM and 30 quad-processor 100MHz SPARCstations w/ 384MB RAM.
I used to do professional digital video editing on Media 100s and Avids with 80-96 MB of RAM back in the mid-90s.
It's amazing how quickly top-of-the-line becomes crap.
> The RAM limits on laptops are nearly understandable. But on a machine like this, it seems like obvious greed on the part of Apple to give this machine a maximum life span of 3-4 years.
Or, you know, basic market research that shows you're some freaky outlier, and that not even 0.1% of their customers will ever attempt to add that much.
Not to mention, adding tons of RAM (for multiple thousands of dollars) while complaining that the machine has a "maximum life span of 3-4 years"?
Why would a pro who has an actual need for a Mac Pro saddle himself with it for 3-4 years? 3-4 years is like forever in tech. There'd be new memory speeds, new CPUs, new GPUs, new SSDs, new buses and such. If he's doing the stuff the Mac Pro demographic does (video, audio, etc.), he'll want to upgrade anyway...
We used to run print production for a major advertising agency, mailing millions of pieces per week, using 8GB of RAM with dual-core G5s. This machine is aimed at designers/agencies with fat budgets that might need to do video editing / light 3D work / After Effects every now and then. That was always the bread and butter of the previous Mac Pros; it's also why they tried to make it look nice on your desktop (though I think they failed).
Honestly, Apple is not worried about the segment of the market that needs 192GB of RAM - I'm sure it's orders of magnitude smaller than the one that needs 64GB.
Supporting more memory can carry real performance penalties: higher-density DIMMs often have to run at lower speeds. Until someone makes jumbo DIMMs that are as fast as smaller ones, perhaps it's not a good idea to offer the option.
I still find it incredible that people accept these as enterprise/workstation/pro machines:
- No ability to have more than one hard drive (No RAID possibilities)
- 64GB limit on RAM (as mentioned)
- Single power supply
- "Your Mac Pro comes with 90 days of complimentary telephone technical support and a one-year limited warranty" My toaster comes with a better warranty than that...
Compare this to Dell workstations with 3 years of next-day onsite & 24/7 support, 4 hard drives, and dual power supplies...
> No ability to have more than one hard drive (No RAID possibilities)
The video production houses where I've worked had no local storage. Everything was over Fibre Channel.
> 64GB limit on RAM (as mentioned)
Well, there is "theoretical" and there is "supported". My office Dell only supports 12GB; it currently has 16GB installed. My MacBook only supports 8GB; it has 16GB installed. 16GB ECC DIMMs aren't exactly in high demand.
> Single power supply
I haven't found a "workstation" that does redundant power supplies. No, the 2U Supermicro under a desk is not a workstation.
> 3 years of next-day onsite & 24/7 support
$249 for AppleCare. It's cheaper than what I'm paying for my servers but more than I'd like for a desktop.
> Is it just me or something gives me the creeps about running an OS off an external drive...
The parent was talking about keeping your assets on externals if you have high storage requirements; the OS and any number of apps can live perfectly fine on the SSD.
> Since when did the requirements of a workstation class Xeon desktop become "Super-compact" and "beautiful"?!
We weren't always so utilitarian; Silicon Graphics did gangbusters business for a long time selling style and compactness in the workstation class. Yes, SGI eventually crashed and burned, but Apple is in a far better position to experiment with this, given that they have about $147 billion of cash on hand.
Good point. The best answer there is: that's what Time Machine backups are for. Time Machine backups lag your actual storage by 10 minutes to an hour, but it's okay for most uses.
If you're worried about important data, you'd want to use external storage arrays for your pictures/videos/genome data. Then your redundancy is handled in the individual storage chassis configurations. Promise and LaCie have popular faster-than-a-single-SSD Thunderbolt arrays.
Apple should have dropped the price tag a bit (2-3k USD, maybe?) and targeted the entry-level workstation demographic, or bumped up the customization limits (additional RAM, etc.), and most definitely included a two-year minimum warranty on the Mac Pro.
> incredibly surprised?
Apple thinks their 27-inch Thunderbolt Displays are still good enough, and they aren't really that bad.
The Mac Pro wasn't really cutting edge for years and the new one, while being very capable, isn't the best workstation you can buy either.
They show it off with 4K monitors, just not their own. For a machine that's specifically slated to use them (6 Thunderbolt ports, two GPUs), it's extremely strange behaviour.
Are they "good enough"? I don't think there's a compelling reason to invest in one for my Late 2013 rMBP. And less to do so with the Mac Pro - I get a USB2 hub, no USB3. I have to use an adapter to use my Magsafe adapter. And if I have a Mac Pro, I get TB1, not 2.
It depends on what "Retina" means. It's resolution is close to the MacBook Pro, arguably it'll look just as good since it's further from your face. Or not.
No, it’s not, not really, at any rate. (The way Apple defines it, a retina resolution is simply a resolution where individual pixels are not visible to someone with normal sight at a typical working distance. I think it’s a quite useful definition and I really want to use it outside the context of Apple products. I want to steal this trademark from Apple by turning it into a generic term. For example, at typical viewing distances 1080p TVs are retina screens. That means going to 4k for TVs isn’t really worth it unless you plan on getting a much bigger TV or sitting closer to your TV. But the following text will focus more on the technical aspects of actually making retina happen with OS X than this nifty definition of retina.)
Apple quadruples the size of its UI (on its MacBooks and iOS devices) when switching from non-retina to retina: twice the pixels in each dimension, four times the total.
Consequently, even 4K is not really enough if they want to replace their 27" display. That 27" has a higher resolution (physical and logical) than 1080p, and quadrupling turns a 4K display into merely (about) a 1080p display in terms of logical resolution.
You gain all that resolution but lose all that space. That's in many ways an icky tradeoff. Yes, Apple has a hacky way of increasing the logical resolution, but it's also an icky tradeoff: the UI is rendered at a higher resolution (quadrupled in size and all) than the display can show and then downsampled to display size – but that costs performance and also leads to slight blurriness.
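Concretely (panel and mode sizes here are illustrative):

    # Plain 2x ("retina") mode: logical resolution = physical / 2 per axis.
    panel = (3840, 2160)                      # a 4K UHD panel
    logical = (panel[0] // 2, panel[1] // 2)  # (1920, 1080) of workspace --
                                              # less than a 27"'s 2560x1440
    # Scaled mode: render at 2x the *requested* logical size, downsample:
    requested = (2560, 1440)
    offscreen = (requested[0] * 2, requested[1] * 2)  # 5120x2880 buffer
    scale = panel[0] / offscreen[0]           # 0.75 -- a non-integer resize
    # That non-integer downsample is where the performance cost and the
    # slight blurriness come from.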
Neither is a very big issue (the Mac Pro should have enough performance to handle it, and this is really more of an issue on portable devices – though maybe you could cause trouble by connecting multiple 4K displays; and the resolution of those displays is so high that the slight blurriness is not very visible), but neither is optimal.
OS X simply is not dynamically scalable. It’s quadrupling or nothing. And that works very well, but it’s not as flexible as dynamic scaling.
Realistically, Apple could go for a 4K 24" screen (where merely about 1080p of logical resolution would be acceptable), but I'm not sure they want to.
But going even higher than 4K? To 5120×2880 (that would replicate the logical resolution of their current 27" at retina resolution)? I'm not sure whether that's realistic or even possible on this machine.
As can be seen on the 13" retina MacBook Pro, Apple is willing to make icky tradeoffs in favour of less logical resolution (despite being their top of the line 13" it has a lower logical resolution than the non-retina 13" MacBook Air).
But it just seems that when it comes to the Mac Pro they currently don’t want to play ball at all. It will be interesting to see how they handle this.
> As can be seen on the 13" retina MacBook Pro, Apple is willing to make icky tradeoffs in favour of less logical resolution (despite being their top of the line 13" it has a lower logical resolution than the non-retina 13" MacBook Air).
The 13" MacBook has always 1280x800, so it wasn't really a tradeoff. The Air is the higher DPI version and has been all along.
Well, yeah, but the 13" resolution has been an embarrassment for a long, long time. It just made no sense, especially compared to the Air.
I think it’s a quite obvious tradeoff, Apple just decided on it early on so it’s not as noticeable. (They must have known for quite some time that retina screens were coming so they kept the 13" Pro at a level where they could feasibly quadruple the pixels in the short run.)
That’s my little conspiracy theory for the day, and anyway, the logical resolution of the 13" Pro is an embarrassment. That’s just how it is. It’s a very real tradeoff (you get higher res for less logical resolution than is usual at that size).
Yeah, I thought it would be CHEAPER than their laptops, at least base price. I could be wrong, but isn't that how most desktop PCs are priced? I mean of course the specs are much, much better, but I thought you pay a premium for the laptop body, and so the desktop is naturally cheaper.
Yeah, it's probably not bad value for money if you need exactly one of the combinations of components they offer, with nothing more and nothing less. Which basically means someone who needs an expensive Xeon workstation-class CPU, dual AMD workstation class GPUs, and a fast SSD for the OS, but only 12-16GB of RAM and no RAID array or expansion cards.
If you want anything more or less than that, you'll end up overpaying or just plain not being able to get it at all. For instance, it's overpriced if you don't actually need two workstation GPUs, or if you need more RAM than the standard amount, or if you don't actually need a Xeon CPU, or if you have different storage requirements. Expansion cards? Pay through the nose for new Thunderbolt replacements or forget it.
I agree, and that makes complaints about lack of options completely legitimate.
But that's a completely different complaint from saying it's overly expensive for what's actually included. That's what was being said in the comment I replied to.
"Too expensive for my needs" is not the same as "too expensive for what's included".
I'm having trouble finding anything "official", I think I may have just seen them in comments here and elsewhere. It's all a bit imprecise as well, since not all of the components are completely clear (e.g. the GPUs are some custom Apple-specific model, which appears to have a generic PC equivalent, but it's not completely solid).
My vague recollection is that the GPUs alone are something like $700 each, and you get two, and the rest goes from there. Depending on the assumptions you make, you can get a $3,000 Mac Pro costing more than $3,000 for an equivalent PC, or less. Just beware of people comparing the Mac Pro with builds using non-Xeon CPUs, gaming GPUs, etc. Not that there's anything wrong with that sort of configuration, and personally I wish Apple offered some middle ground with high-end consumer components, but it doesn't give a good guide to what this actual machine costs to put together.
I don't have the need for mobility anymore; I work from home. I can build myself a beast of a machine for 60% of the cost of this desktop or an iMac. Don't get me wrong, I love my iMac, but it beachballs way too often - enough for me to notice - and it's an iMac from Feb 2013. To me that's sort of unacceptable given the price I paid for it.
I then built myself a desktop for $1000 and it's a beast: full SSD, 16GB RAM, the works. It never stalls and always keeps up with my train of thought. You would think it wouldn't make a difference, but it does, and it's noticeable.
That's a great reason. I did the same back in 2009 when an iMac couldn't support the displays I wanted along with my CPU/RAM requirements. I find my computing drifts when I do that — I end up with a lot more FLAC music that doesn't port back to iTunes easily.
But these days, I can grab a rMBP, hook up three 27" external monitors (plus the built in 15" HiDPI display, plus the rMBP can use an AppleTV as an external monitor, not just a mirror now), have a 1 TB internal PCIe flash setup, have 16 GB RAM, and four 2.6GHz (~8 with HT) cores to do my bidding.
Plus, the Linux box was about two feet tall and acted as a noisy space heater even in the summer. :-\ (But I had it crammed full of 2 TB drives; the rMBP has USB 3 and Thunderbolt for easy expansion. It's more difficult for me to justify large computers for non-compute-intensive tasks these days.)
But the rMBP is smaller and outputs less heat overall (though the keyboard is often too hot to touch if you rest your hands on the keys or—god forbid—if you accidentally touch the metal between keys), and it sometimes stalls all display output for 30 to 60 seconds at a time while running all cores at 100% for no discernible reason, anywhere from once a week to 15 times a day depending on its mood.
Let's see your tower do that.
Stockholm Computing Syndrome?
As far as price: company bought it. Always change jobs right before an Apple hardware refresh event. If it wasn't essentially "free" from my end, I'd have gone with a 27" iMac.
Agreed, but: I'm weary of Windows. I've run every version except Vista and 8 on my home machine. I've been a total fan of Windows. I'm just really weary of using it. I use Cygwin for scripting, but some things are really hard to do even in Cygwin. I'm seriously considering the new Pro, but I have issues (can't RAID1 the OS drive, what are those external enclosures really like? Noisy??) so I'm not sure.
Ironically, I would probably have upgraded if they had not changed the form factor, since I know I'd be able to do what I want with it.
You don't have to run Windows; there's always Linux. Why are you weary of Windows?
I have used Windows 7 and 8 and use virtual machines with Debian for development. You could probably even install an OS X virtual machine if you wanted to.
Just keep your files organized, don't install too much crap, and it'll run beautifully.
I keep my files so that I can easily format the computer and start over fresh in a couple hours.
I also run a software firewall so that I can keep track of all connections in and out of my computer. Microsoft Security Essentials is sufficient antivirus for most Windows users; just be smart about what you install, or if you want to install lots of stuff, run it in a virtual machine. I use Oracle VirtualBox; it's lightweight and fast.
Lightroom and some other programs. LR runs on both Windows and Mac, so I'm tied to these.
For why I'm weary, see Cygwin comments in OP. And, this[1] (while I don't run Windows 8 at home, I do at work). Microsoft could have bought Cygwin before Red Hat. They could have embraced the differences between UNIX and Windows and made me love Windows as a better UNIX. PowerShell? No thanks. Even if I didn't hate it, it's Windows only. Give me Bash.
Windows is an inferior OS. When I start a Cygwin Bash process, there is a non-zero chance that I won't get some memory I need. The fix is to shut down all Cygwin processes and rebase ALL of the DLLs used by Cygwin to different addresses. Seriously, true virtual memory has been around for what, more than 30 years? Still, today you can run/load a device driver that puts crap in the middle of every program's address space. How is this tolerated??
I have two CentOS boxes at home, where I do lots of my work. Every night they are updated with the latest RPMs to fix problems. I only have to reboot when I install a new kernel. Windows, on the other hand, needs to reboot for almost every single Windows Update.
These are the reasons I'm weary of using this shit OS.
What a strange time to release something...a week before Christmas, after Black Friday/Cyber Monday, and after the "deadline" of December 15 by which most retailers guarantee delivery before Christmas.
It's targeted as a work machine, which isn't really subject to seasonal Thanksgiving/Xmas buying, not much at least. I'm guessing the demographic of people dropping $3-6K on shiny Xeon-shaped garbage cans as Christmas gifts is pretty small.
That said, I suspect they wanted to release this one earlier but hit some speed bumps.
Apple did say "December". They didn't say you would GET it in December, only that it would be available to PURCHASE in December... Same with iPhones these days... order online, wait 3-4 weeks before you get it...