I still don't understand the justification for this machine. I waited years for a Mac Pro, but when it became clear that Apple actually considered it a dead product and was building the iMac Pro behind the scenes to fill the niche, I switched to Windows and now run a dual 1080 Ti workstation (it uses CUDA for path-traced 3D rendering) for around half the price of an iMac Pro.
Apple bloggers said time and time again that the reason Apple wasn't making a tower was that there isn't really a market for one anymore, yet when they finally made one, they decided to build something that only really serves the highest of high-end video editors.
Completely ignoring 3D artists, mid-range video editors, developers who need high core counts + ECC, deep learning, etc.
Before they made it, we kept being told, "There aren't enough of you to justify them making it."
They finally make it, and the narrative turns into "It's not for you, it's for people who edit Marvel movies."
Apple didn't say those things. Like you said, it was bloggers, critics, people whose interest is in making you read about Apple and keep clicking on articles; they made up all these different reasons why Apple would or wouldn't make a tower.
I don’t really think Apple cares about that drama. I think they know they have customers who will pay the premium. I think the machine is squarely aimed at businesses making the purchase, not consumers.
Is it really a problem that the Mac Pro is only for ultra-high-end users? Apple hasn't made a mid-range consumer tower in over a decade now. If you walk into Best Buy, how many towers do you think they're selling compared to laptops?
That’s the market Apple sells in, not the low-margin custom built PC parts market.
Apple doesn’t cover every use case of the personal computer. They are just one OEM. Unfortunately if you like macOS they are the only OEM.
As far as the mid-range video market that you talk about, what about the iMac Pro (has ECC memory) or a high spec iMac is insufficient for that task? Sure, it’s not as nice as your dual 1080Ti setup, but also, NVidia isn’t actually a viable option for Apple anymore thanks to their disastrous support for the platform in the past. If you made a Hackintosh system with NVidia you’d still be SOL. You aren’t getting CUDA on Mac no matter what hardware configuration Apple comes out with. Is Metal supposed to cover that use case and compete with CUDA?
Try to find a workstation that:
- Ships with anything other than Windows (Linux, OSX, BSD, etc)
- Has a corporate warranty
You'll quickly find that Apple has this market cornered. To many people, you're not paying the extra money for the goodness of Apple. You're paying to avoid the badness of Windows. There's also some software that works on Macs, but not Linux machines that may be necessary for the job.
> - Ships with anything other than Windows (Linux, OSX, BSD, etc)
> - Has a corporate warranty
Support for Mac-only software notwithstanding, Dell's workstations officially support RHEL, have Nvidia GPUs for CUDA workloads, and come with up to five years of warranty with on-site service. You can probably find comparable HP Z-series workstations too.
Has Apple finally started to offer a warranty comparable to the Big Three's? (SLAs, on-site service with guaranteed response times and hardware replacements, and so on)
Serious question - this was actually a big argument against Pro Apple workstations in the past.
> - Ships with anything other than Windows
Well, you won't get macOS of course, but all big workstation manufacturers sell workstations with Linux preinstalled. It's really nothing unusual and hasn't been for quite some time.
Apple’s warranty service is terrible compared to the alternatives like Dell. A couple years ago I had a failed key on a laptop with a service warranty. Dell had a tech in my living room with the repair parts in less than 24 hours.
Apple could not possibly do that today. Maybe you don't need it, but if you care about warranty service, Apple is not the answer.
> You're not paying for the goodness of Apple
> You’re paying to avoid the badness of Windows
Are you telling me there is a market for an intermediate OS? Because I would be ready to pay about $250/year/user for a Linux that is as good as Mac, but with cheaper and more maintainable gear than the iMac. It's almost as if Apple were trying to tell us there's an intermediate market up for grabs, but they're still too close to it for any incumbent to try their luck. Canonical was close to it, but stuck to the wrong business model and decided to switch to Unity in 2013 instead of stabilizing Ubuntu. Product roadmaps are hard. Jony Ive is available, just saying ;)
Apple service is atrocious. Don't they make you book an appointment and turn up at their store just to get someone to look at it? I hear of people who spend more time taking their laptops to the local Apple store than most people with major illnesses spend going to the doctor...
I bought a Dell XPS 13 laptop recently which unfortunately had a non-functioning motherboard. I contacted Dell and a technician came to my house first thing the next day and replaced it, no questions asked. Totally hassle free. I'd take that any day over having to book an appointment to see a 'Genius'.
> Apple service is atrocious. Don't they make you book an appointment and turn up at their store just to get someone to look at it?
I had a macbook pro with a logic board that died. I phoned Apple, they couriered me a replacement device next day, and that courier picked up my old device. Literally couldn't ask for better service.
Seriously? My friend was forced to drag his 27 inch iMac into the Woodfield shopping mall location, which if you've ever been there, is a quarter mile minimum walk from the parking lot to the store.
Pedantry. They didn't give him an option besides physically bringing it into the store.
Good question about accessibility. I have no idea. But it's not as though our society is a perfect utopia for the disabled. I can only imagine it would have gone far worse.
Reading the comment that you replied to, I was just thinking that this goes one of two ways: some people seem to get wonderful service, others get crap for a relatively high price.
To be fair, we experimented with a Dell XPS 13 laptop that had a succession of problems, and the service was the worst I have ever encountered in IT, taking several months of elapsed time before we finally got an on-site visit from someone who knew what they were doing (who then fixed the laptop in under an hour). That was what "next day" level support actually looked like in our case.
We were attracted to these as there was a Linux version that potentially offered a good alternative to Windows 10 for some of our people, but the experience was so bad that instead we immediately ruled Dell out as a supplier for any serious equipment for the foreseeable future.
I think we just got stuck in endless loops of tier 1 support people trying to run through checklists over the phone and then again via email. It was essentially the business class version of "Have you tried switching it off and on again?" repeated seemingly endlessly, bouncing from one support tech to another. Clearly several of the techs didn't even understand that they also sold these laptops with Linux on them and that's what we had.
Eventually, literally months later, someone finally seemed to escalate it to a person with the authority to send out a technician, who as mentioned before then fixed the actual problem in barely any time at all. We were on the point of just writing off the machine by then, as the amount of time we were wasting dealing with Dell was in danger of costing more than just buying a new box, and at least then we would have been reasonably confident of having a working system the next day!
> Apple service is atrocious. Don't they make you book an appointment and turn up at their store just to get someone to look at it?
I can't speak for anyone else, but the last time I had trouble with my MacBook (five years or more ago now, the touchpad had cracked, IIRC), I made an appointment in the morning to come in in the afternoon. Walked up, explained the problem, gave them the machine, and they called me back to pick it up a few hours later. That's not bad, IMO.
Home service is awesome, but unless things have changed recently, I don't think that's common. It also may be different for wear-and-tear fixes vs. DOA replacements; Dell has a strong interest in fixing the latter as quickly as possible to protect their reputation. Most companies, most of the time, expect you to come to them to get service.
They'll also ship you a box and then ship the machine to a service center; I've been shocked at the turnaround time on that, even with major service. From me getting the box to getting the laptop back was just two days (I ship on day one, they get it overnight on day two, they ship it back out with a new logic board and cables, and I get it on day three).
> - Ships with anything other than Windows (Linux, OSX, BSD, etc)
It doesn't exactly sound like it would be any kind of a problem for you to install your own OS if you're willing to accept a manufacturer supplied BSD.
Hmm, well I'd honestly do just about anything I could to avoid macOS. If only I could compile iOS apps without it, I wouldn't have to deal with their backwards and incapable UI.
> You'll quickly find that Apple has this market cornered.
No they don't. Not the market for people who just need a corporate Unix box. At my consultancy we have a mix of machines with many people running a System 76 tower or laptop and you'll find plenty of folks here on HN who will name Dell, Lenovo or HP as their supplier.
Perhaps you're thinking of just the market for people who do the absolute highest end video work for the film industry? Other than that, I don't see it.
And Windows has been rock solid for a massive amount of users and developers of various types since Windows 2000.
>> I wouldn't have to deal with their backwards and incapable UI.
Well, that’s, like, your opinion, man...
No, seriously, I feel the same way about Windows’ UI. I mean, did you ever use Windows 8? And Windows 10 has built-in ads by default in the UI/UX?
Windows XP SP3 was peak Windows, IMHO. The awful part about the bastardized mess they made of the UI/UX in 8 was that there are millions of non-techie people who literally know how to follow only one sequence of steps on their computer, and that usually starts with 'press Start'.
Windows 8's awfulness is probably what drove a lot of those people to iPads. If you're learning a new user interface idiom anyway, and even Microsoft Office is on the iPad, why stick with Windows?
I don't know anybody that does serious work on iPads. Every quarter there are only ~9 million iPads sold for every ~34 million Lenovo/HP/Dell laptops and half of those iPads have got to be for kids from what I can see.
Sure, an iPad is fine for consuming documents and doing light markup. However, if you're going to create things, you're going to need to multi-task, and - hey, I have multiple iPads and I use them all the time - I'd love to have a contest between what I can get done with Windows/Linux and what you can do on an iPad, because there's just no comparison as far as I can see.
I'll add macOS to that too. It's not even in the same class as Windows/Linux. I watch people using macOS daily and I swear, they are constantly swiping to find that full-screen app they lost because of the complete lack of window management in macOS. They'll put Chrome into full screen and then struggle to get the detached devtools window back up. They'll have to install things like iTerm, with its own tiling manager, to manage 3 terminal windows. Apple just doesn't care about practical things; they are constantly focusing on how things look, how thin or light they are, or how they can make the most money by removing options and claiming everything is always better that way, when really it just serves to reduce the amount of work they have to do to support things like, you know, physical buttons, headphone jacks, options/modes in software, and so forth.
Anyway, the UI in Windows 8 and 10 was also completely configurable to make it more like the original Windows UI. If you don't like the default configuration you can change it, or install one program (7+ Taskbar Tweaker) to make it just about perfect. What I really, really like is being able to do things the way I want to do them and not the way some godless corporation has decided they should be done. Apple just gives you nearly zero choice compared to Windows and, most obviously, Linux. They're just on the wrong end of the spectrum for how I like to do things.
And I never got any ads on Windows - just pre-installed apps like Candy Crush and Skype. I'm assuming they installed Candy Crush because it's a lot, lot more popular than Solitaire or Minesweeper with today's crowd. This is no different than Apple pre-installing things on macOS/iOS. And before anyone says anything about Apple not pre-installing 3rd party software...I think that's incorrect. If you want to use any of the Unix aspects of macOS, you have to start off with Apple's lame and old versions of even basic Unix utilities and programming environments until you go and install some other 3rd party things to fix the situation. That's way worse than having to right-click a Candy Crush icon to remove it once IMO.
Pressing Start is one popular way of starting a program on Windows, so I don't understand that line of argumentation.
The only tangible market for a Mac Pro is professional Final Cut users in a professional setting, no?
If this is hogwash, tell me so, but it just seems that any other realistic scenario that requires this level of hardware (like research, rendering and AI) would be significantly cheaper and better supported outside of Apple's ecosystem.
Short of having a pretty device to sit in a studio, what other reason is there for this to exist? (And how much of that audience is more likely to just buy iMac Pros).
As regards development: most development tasks won't significantly benefit from the performance offered here, and anyone who needs that performance is likely going to buy something that's significantly better value for money (as regards tech specs) than a Mac Pro.
I'm not aware of any significant tools for 3d modelling or video editing (besides FCP) that are OSX-exclusive, and that audience is surely better served by a much cheaper Windows/Linux machine.
There aren't any serious edit houses that use FCP X outside of gimmick advertising deals like some late night shows. Everyone is using Avid or increasingly Premiere anyway.
In my experience, the past five years have seen a dramatic shift to Windows in professional facilities (Oscar-winning editors).
I do know one very high end editor who cuts on a Mac mini. The old school guys are used to proxy workflows and you don't need lots of power for that anyway.
Unfortunately they will probably point to the poor sales of this model as justification that this pro tower market really is dead. "Sorry guys, for some reason only Marvel editors bought these so we're canning the line."
I kind of feel that Apple needs its own pragmatic Satya Nadella to regain relevance in many niche applications. Sure, Apple is mainly a consumer company nowadays, but what is the problem with having a competitive professional line as well?
Were people editing Marvel movies really waiting a decade for this though? Pretty confident they've moved on to PCs years ago. I struggle to find the target audience for this, now that they are alienating home users.
>Were people editing Marvel movies really waiting a decade for this though?
Given the number of posts telling people in Hollywood not to restart their trashcan Mac Pros because of the Google Raven screw-up, I would say that yes, in fact, plenty of people in Hollywood are using Mac Pros, and they likely will buy this new one.
> I struggle to find the target audience for this, now that they are alienating home users
This machine has NOTHING, and I mean nothing to do with home users.
Next you'll tell me that Tesla have alienated "normal" car buyers because they make a $100k P100D rocket ship. Tesla also make a $35k regular sedan. Apple also make much cheaper iMacs, MacBooks, and MacBook Airs for home buyers.
I don't understand why people time and time again bash Apple for making something that isn't in any way designed for "home users", while they still make plenty of things that are.
The base price has more than doubled from the original cheese grater we all fell in love with. I'd be willing to bet the home-user Mac Pro community is much larger than Hollywood. People have been clamoring for an updated cheese grater for the better part of a decade. They are STILL actively developing hacks to keep their 2008/2009 models running. Shit, I just retired mine this year in favor of a Hackintosh, because I'm not dumping $10,000 into a modular Mac.
I think this new MacPro is going to be a huge failure. Professionals are running PCs now, and home users won't spend the money.
Many of the home users who were buying $3000 base model cheese grater Mac Pros could get by now with a 6-core i7 Mac Mini.
Price would be around $1300 for the computer, $300 for an eGPU enclosure, and $700ish for a Radeon R7, plus aftermarket RAM. AMD's not in a great spot for high end GPUs right now, but when the Navi 23 cards land next year it will be looking better.
This doesn't scale as well for multi GPU machine learning workloads and Apple needs to get over their shit with Nvidia, but as a lower end "modular" Mac than the $6000 cheese grater 2019, it's an option.
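For concreteness, the back-of-the-envelope total for that route, using the figures above (the RAM figure is my own ballpark, not from the comment):

    # Rough arithmetic for the mini + eGPU route, list prices circa 2019.
    mac_mini_6core  = 1300   # 6-core i7 Mac mini
    egpu_enclosure  = 300    # Thunderbolt eGPU box
    gpu             = 700    # the card (corrected to a Radeon VII further down)
    aftermarket_ram = 200    # assumption: ballpark for a third-party RAM upgrade

    total = mac_mini_6core + egpu_enclosure + gpu + aftermarket_ram
    print(total)             # 2500, versus the $6000 cheese grater 2019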
> Price would be around $1300 for the computer, $300 for an eGPU enclosure, and $700ish for a Radeon R7, plus aftermarket RAM. AMD's not in a great spot for high end GPUs right now, but when the Navi 23 cards land next year it will be looking better.
I used to want an eGPU. Then I learned that you need to disable SIP in order to do so...
> This doesn't scale as well for multi GPU machine learning workloads and Apple needs to get over their shit with Nvidia, but as a lower end "modular" Mac than the $6000 cheese grater 2019, it's an option.
No one is going to wait for that hypothetical future where MacOS supports Nvidia GPUs.
> No one is going to wait for that hypothetical future where MacOS supports Nvidia GPUs.
True, but their Navi cards are at least competitive in the price ranges where they exist. Hopefully the high end ones next year continue that. If you’re looking at Titan or whatever the current ML thing is in the $1000+ range, then you might be stuck with Nvidia.
> True, but their Navi cards are at least competitive in the price ranges where they exist. Hopefully the high end ones next year continue that. If you’re looking at Titan or whatever the current ML thing is in the $1000+ range, then you might be stuck with Nvidia.
I am rooting for AMD's Navi cards too. It's just unfortunate that CUDA seems to be more supported than OpenCL.
Agreed on that. AMD put a bunch of work into the Cycles rendering engine (for blender) to get their OpenCL support up to par with CUDA, and now it's completely disabled on the Mac version thanks to Apple deprecating it. Disappointing.
My Windows machine is admittedly more modular than a mini+eGPU would be. I can pull the CPU out and put in a new one whenever I want! But over the course of 12 years and 3 computer builds, I've never done that once. By the time there's a noticeable CPU upgrade available I'd need a new motherboard to go with it.
So I think there's a big segment of the "modular" market that only really cares about having GPU options and upgradeable RAM.
It's not for everyone, but the people in between the high end Mac Mini (6-core i7 + thunderbolt GPU) and the low end Mac Pro (8-core Xeon W and internal expansion slots) are a small enough slice that Apple doesn't care.
Missed the edit window but for GPU I meant the Radeon VII not R7. Double checking benchmarks, the RX 5700 XT compares pretty well to that, but 2nd gen Navi will have a new higher end card.
Yea, I remember when PowerMacs could be had at a wide variety of price points. That was nice.
If Apple doesn't want to serve the "I need a decently powerful machine but don't want to waste money on what is basically a status symbol" market, maybe they should license MacOS to someone who does. Something like that might actually bring me back to the platform. As it is, I stick to PCs running Linux.
Yeah, I've waited years for this machine, was excited when it came out, and after an epiphany last night, I will probably pass on it and opt for a beefed-up MacBook Pro instead. It will be cheaper and fill my needs.
I'm a programmer. If I want to prototype a multi-component stack, I've got a Raspberry Pi cluster running k8s for that. If I want to play around with deep learning, I'll just spin up EC2 instances. Games? Let's be honest. That war was lost decades ago. I've got a good PC rig for that now. The days of a general machine that does everything are gone.
The only issue is the storage space, which I can get from a multi-bay drive enclosure.
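For what it's worth, the "spin up EC2 instances" part of that workflow really is only a few lines with boto3. A minimal sketch; the AMI ID, instance type, and key pair are placeholders I picked for illustration, not anything from the comment above.

    # Launch a single GPU instance for a training run, then terminate it when done.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder, e.g. a deep learning AMI
        InstanceType="p3.2xlarge",        # one V100 GPU
        MinCount=1,
        MaxCount=1,
        KeyName="my-keypair",             # placeholder SSH key pair
    )
    instance_id = resp["Instances"][0]["InstanceId"]
    print("started", instance_id)

    # ... ssh in, run the training job, copy the results out ...

    ec2.terminate_instances(InstanceIds=[instance_id])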
They used to do this and it almost led to the death of the company. I’m sure they’re quite hesitant to try again. Instead, you can look into the hackintosh project
They’re making money, now - they have no reason to license MacOS, and all the more reason not to.
Apple’s hardware/software combination is the reason a lot of us have stuck with them for so long. It’s quality control. No drivers to fuss with; just plug in and go.
That was back when Apple was primarily a desktop computer manufacturer. In any case, they could place restrictions on licensed clone makers to avoid cannibalizing sales of their more popular products (e.g., no laptops).
I didn't do it lightly; I hadn't owned a Windows machine in 17 years.
But as I said, a dual-GPU machine for half the price of an iMac Pro that can run Octane Render[0], which requires Nvidia cards with CUDA, left me asking why I didn't do this years ago. After using it for 15 minutes I couldn't go back to how I did my work on my old Mac.
If Apple wanted my business I would have given it to them but they were pretty insistent that they had no interest in my money.
I agree! I find Windows quite cumbersome and unpleasant to use and would rather pay more for macOS; not because it is a status symbol, but because it is simply a better product for me.
I need a Mac where I can replace the hard drive when it fails after 18 months like Mac drives always do. I'd also like a bay to put a decent optical drive in (not the low-quality Apple SuperDrive, which they finally gave up on), and to add a card so I can have more than 1 USB port, which is ridiculous.
And it should cost around $500, not $12,000.
A $500 machine with what I request is certainly available on the PC side of things; I have one, run Linux on it, and have transitioned everything I can off Apple because they can't produce the desktop that I need. I've put the Mac versions of my products on bug-fix-only status, and when customers ask me when the new version comes out, I tell them never and advise them to try Linux.
Doesn't the new top tier Mac Mini fill a lot of the role the trash can Mac Pro did? You can hook up GPUs to it with the thunderbolt ports. I think the top model is faster than the trash can Mac Pro.
Sure, Windows machines have always been cheaper for similar specs and performance. The Mac Mini and GPU setup is way cheaper than the new Mac Pro and hits a bunch of the use cases (development, machine learning, gaming) that people wanted a Mac Pro for.
I think they should include Apple Care for their professional equipment. This won't put any more money in Apple's pocket, but I could see it being another differentiator and a good bit of PR, especially against all of those price arguments. Plus, it's only $300, and I'm sure Apple could suffer that "loss" on a $6,000-$30,000 purchase.
I've always argued that, for the prices Apple charges, Apple Care should be included in all its products.
The first Apple computers I bought lasted for years and I never thought Apple Care was necessary, but since the fiasco with the 2011 MBP and the butterfly keyboards I'm not buying another of its products without Apple Care. Yes, Apple ended up doing the right thing in both cases but it took years after the problems started and a couple of class action lawsuits.
It presumably is in some areas. In New Zealand we have The Consumer Guarantees Act. Things are expected to have a reasonable life span. What this means is not specified but is generally accepted to be relative to the price paid (eg if it’s expensive, it should last). It’s a fantastic piece of legislation. I’ve had a 2.5 year old iPhone covered for example.
I believe the parent meant that Apple charges enough (in the US) that it should include the extended guarantee there. In NZ, things cost extra due to that law, correct? The difference is that everybody is paying for that extra protection; there's no choice.
If you have this protection, it's like insurance. You, and everybody else, pay an insignificant amount of money, and you get much more expansive treatment in case of failure.
Sure you have a choice: back the politicians who are against this.
I don't get why freedom-loving people would rather have a corporation take away their freedoms than a democratic, self-governed body.
Both are concentrations of power, but with one of them (democratic government) you have moral ground to criticize it when it acts against the populace's interests. Abuses of power by corporations, on the other hand, can't be argued against morally: you don't own them, they're playing by the rules, they manage to make a buck, and everything is above board, while in actuality masses of people might be hurt by those abuses of power.
Freedom is extremely important, but it sometimes conflicts with other values. A reasonable discussion of the conflict and the tradeoffs is a much better way to improve life for people than fanatically protecting the one single value you hold most dear.
Aside from the additional taxation and regulations in NZ, it does have some pretty unavoidable issues which drive up the prices of consumer goods. It’s a very small market, and very far away. Getting things there is just going to cost more anyway, and the market isn’t big enough to invite significant competition, so there’s usually a large (or at least larger than a lot of other places) markup on imported goods.
It’s not just far, it’s far away from everything, including all the major trade routes. In Europe and Asia, a lot more goods are produced close by, and if you ship a product to any country in Europe or Asia, there’s a lot more people to buy it, and a lot more markets close by.
And there are lots of regulations too. For example, I sold an oil burner light on eBay to someone in NZ (I didn't actually know at the time, as there was a UK address that forwarded the package) and I included some oils for free...
Big mistake. The package was returned and I had to remove them.
My only point being that it's a difficult port of entry, in part because of its delicate ecosystem, which necessitates very strict regulations.
Add a layover and you could easily be at 20+ hours (which is common since NZ is small so there aren’t that many flights). Europe is well over 24 hours even to the major hubs.
A TV set you get on Black Friday works until Xmas. It was a bargain, what did you expect?
That discount you got on your car? You think it was because of your outstanding negotiation skills? Think again: the seller swapped in a cheaper bifurcator, meaning your mileage will suffer. But is it wrong? You got a discount.
If you sell it, it needs to be good. If it is not good, you cannot sell it. The price isn't in question.
In Norway, it's not based on the price but on the product type. TVs in general are expected to last at least 5 years, so if you sell a cheap TV that breaks after 4 years, you have to replace it.
This of course means you don't get as much ultra-cheap stuff sold, but at the same time most of that stuff is crap, and it's better for the environment to build stuff that lasts.
I don't think Apple should necessarily include coverage for user-caused damage as standard, but having only a single-year warranty on manufacturer defects is kinda shameful for a premium product. I expect premium products across most product categories to carry a three or five year warranty as standard.
A single-year warranty is in general an indicator that a) the manufacturer doesn't have faith in reliability past the first year and b) the manufacturer is begrudgingly just meeting the minimum standard for warranties in most jurisdictions. Even if neither of these are the case, the optics aren't good.
>A single-year warranty is in general an indicator that a) the manufacturer doesn't have faith in reliability past the first year and b) the manufacturer is begrudgingly just meeting the minimum standard for warranties in most jurisdictions. Even if neither of these are the case, the optics aren't good.
Dell's Precision workstations come with a 3-year warranty, upgradable to 5 years of coverage with on-site service. There is no reason Apple can't match that.
I live in Indonesia where there is no Apple store and where time for repair of an issue such as keyboard replacement is six weeks.
This wasn't an issue years ago because reliability wasn't an issue. My ~2005 30" Apple Cinema Display is still going strong connected to my 2010 Mac mini. My 2017 MBP, however, has been the most unreliable hardware I've ever had (my first computer was a 286). Six weeks to repair a pro device with a hefty premium price.
In Europe a lot of products have to include 2 years warranty by default, which makes Apple Care a lot less attractive. You're basically paying the same price for just one more year instead of two.
Not every EU country has such a law. In the UK 1-year long warranties are standard, but some other states mandate 2 years on all electronic devices.
Unless of course you are talking about the EU-wide consumer protection rights, which apply for 2-6 years after purchase but which people very mistakenly call a guarantee. The problem with this protection is that it protects you only against manufacturing defects, and for anything that happens after the initial 6 months, it's up to you to prove that the defect existed at the time of purchase. So no, if your MacBook suddenly dies 23 months after purchase, Apple doesn't have to repair it unless you can demonstrate to them that it died because of a manufacturing defect. In comparison, AppleCare would get your laptop replaced under the same circumstances.
We have CRA 2015 here in UK. 6 years on everything. Apple ask me what I want to claim under when I take my stuff in there. I only need AppleCare for when I break something :)
2 years for people, 1 year for businesses. That includes self employed people if they want to enter it as an expense in the accounting books (I'm not sure about the terms in English, I hope you understand what I mean.)
I'm wary about AppleCare BECAUSE of the MBP keyboard incident. They refused to replace mine or my wife's. I have purchased well into the 6 figures of Apple devices over the decades.
p.s. Switched to Windows when I bought my laptop for that reason.
I hear you. My current strategy, unfortunately, is to buy a new MBP, sell it at exactly 23 months so that I can advertise "over 1 year left of Apple Care".
Insurance is often subject to self-selection bias: people conscious about damaging their property are the ones more likely to take care of it. AppleCare for everyone can dramatically shift its profitability.
But then they would have to either extend that to all MacBook Pros, iPad Pros, and iPhone 11 Pros for consistency, or admit that these devices only got the "Pro" slapped onto the name for marketing.
> I think they should include Apple Care for their professional equipment. This won't put any more money in Apple's pocket, but I could see it being another differentiator and a good bit of PR, especially against all of those price arguments.
Do Apple hardware failures happen often?
In ~20 years of assembling PCs with consumer grade parts I've only had 2 legit non-keyboard/mouse hardware failures (3 if you count lightning destroying a power supply before I knew about surge protectors). For context, I've had 0 hardware failures in the last 10 years. I keep my computers on 24/7 too.
Funny, in about the same period (~27 years or so), I've seen enough PSU failures alone that I always use a UPS. I've seen RAM fail a couple of times and quite a few motherboards die in the early '00s, plus a motherboard that was DOA and a video card that died after a year. Oh, and a first-gen Intel SSD that suddenly thought it was an 8MB drive.
I still prefer to DIY for value... my 4790K build lasted 5 years, and I just put together a new one with a placeholder CPU while waiting for the 3950X. It will probably, again, last 5 years or so.
What's really funny is in the late 1990s and early 2000s I always overclocked my systems because it gave a noticeable FPS boost in the games I played. Dual Celerons (physical chips) running 366@550 (a classic) and then a Pentium III 733@900 which was a huge upgrade, especially for a consistent 125 FPS in Quake 2 / 3.
My 2 legit failures were an old IDE HDD that audibly clicked like crazy for months before failing completely. Since it gave such a long warning I was able to back everything up thankfully. The 2nd failure was a different power supply that just stopped working without any notice or grand events. I never had any RAM or video cards go bad, but after assembling a new machine I always run them through rigorous automated tests to help detect faults. I've had 1 stick of RAM be DOA but I don't count that as a failure.
My current system is an i5 4460 (3.2GHz) with a bunch of other components. It's been going 5ish years now without issues, and I work this thing pretty hard: full-time development, video recording and editing, gaming, etc.
Oh yeah... I've also had a few HDDs fail too... Two that were in Raid-1 within a week or so of each other (before the RMA for the first drive came back).
As an aside, my kernel (5.3) and drivers (Mesa 19.3) are all updated, so I can finally drop in the RX 5700 XT card I'd been sitting on and play with it. It's amazing how many things work without issue via Steam's Proton (Wine, DXVK, etc.).
A great machine, to be sure, for high-end content creation, but Apple is not chasing the deep learning dev market because of the lack of CUDA. That market is better served by Linux boxes with appropriate GPUs and support for TensorFlow, PyTorch, etc.
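To make the CUDA point concrete: the mainstream frameworks find an Nvidia GPU through CUDA, and on a Mac that check simply fails, so everything falls back to the CPU. A minimal sketch using the standard PyTorch device check (nothing Apple-specific in the code itself):

    # On a Linux box with an Nvidia GPU this prints "cuda"; on a Mac it prints
    # "cpu", because there is no CUDA backend for PyTorch to find.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    print(device)

    # The rest of a training script is written against that device handle:
    model = torch.nn.Linear(128, 10).to(device)
    x = torch.randn(32, 128, device=device)
    print(model(x).shape)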
Nvidia really has the GPU market under their thumb. It would be one thing if their pitch was "We're not as good as AMD for games, but we've got CUDA for professionals and researchers!". At least that way there could be some sort of segmentation and I could tell people shopping for GPUs to "get the right one for what you want to do!"
But that's not the world we live in. Nvidia's pitch is "We make the only hardware that runs the framework used by almost all deep learning research and media creation software, and we're also the only folks operating at our level when it comes to high-end video game graphics, and if our high-end cards are too expensive for you, we have cheaper ones. And when our competitors start to think they can catch up, we'll drop RTX and Tensor Cores on their face."
AMD seems stuck on "We have great value middle-to-high-tier video game graphics cards" at best. I have no idea how they can get out from under that rock. They've been turning the CPU market upside down and smashing Intel's once proud "near monopoly" status. Nvidia seems like exactly the kind of prideful company that would be poised to fall, but I have no idea how AMD could make it happen.
I'd guess that within 2 years AMD will be competitive with Nvidia at the high end. Their strategy is a bit different: they're delivering good value about 2/3 of the way up the performance graph... not as big a markup as a $1200 RTX 2080 Ti, but the RX 5700 XT is decent, especially for the price.
Most people aren't spending over $500 on a GPU, so they get the volume sales. The better aftermarket cards have been selling out pretty consistently. And the longer term strategies are similar to how they approached Ryzen. So, I'd think that Navi can definitely succeed in that light.
The real lock-in for Nvidia is all the existing CUDA development. Intel and AMD will have a lot of work to do; oneAPI may help there, so long as Intel and AMD can work together without Intel's frequent and weird lock-in attempts.
Yes, because nobody replicated the pragmatism and power of CUDA. OpenCL is much uglier and lower level. So AMD decided to do something about that... and invented ROCm, which is somehow even uglier and more low-level! A reference FFT implementation in CUDA is about 150 lines of code... and it is almost 10 times more in ROCm.
It was always about software. Granted, CUDA is not the best and most elegant platform in the world... but AMD seems not to be able to reach even that level.
Isn’t that only if you ignore price? If you only have a limited budget, then AMD will get you better performance. For absolute maximum performance at any cost AMD has nothing that can beat Nvidia though.
He's saying that the performance gain may not be worth the huge price tag. It's similar to Backblaze, which decided to use consumer-grade hard drives for their storage solution rather than pro-grade drives because the cost increase wasn't worth it.
This will probably only get more difficult as a bunch of AMD GPU engineers defected to Intel.
I'd still likely only go AMD GPUs in the near future, just because I don't have particularly demanding GPU requirements and they have better Linux drivers.
Is anyone privy to what holds Apple back from supporting Nvidia? The only info I've read put it down to some earlier bad blood, but I wonder why they'd hurt their users just because of some old scuffles...
A reference FFT implementation in CUDA: about 150 lines of code. A reference FFT implementation in ROCm: about 1200 lines of much uglier code using advanced C++ idioms.
So no, it is not a replacement, again. Elegance matters.
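And the hand-written kernel comparison is only the bottom layer; in practice most people never write the FFT at all, because the CUDA stack has mature libraries and bindings sitting on top of it. A hedged illustration, assuming an Nvidia GPU and CuPy (which wraps cuFFT); it obviously won't run on a Mac:

    # A GPU FFT without writing a kernel at all: CuPy dispatches to cuFFT.
    import cupy as cp

    x = cp.random.random(1 << 20).astype(cp.complex64)
    X = cp.fft.fft(x)          # runs on the GPU via cuFFT
    print(cp.abs(X[:4]))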
I'm struggling to think of a super high-end content studio that hasn't moved on by this point. All these tools are available on Windows, and no one still in business was waiting 12 years for a worthy upgrade.
Slightly off-topic, but who here uses a HEDT (high-end desktop) or workstation computer for software development? Does it make a significant difference in comparison with a standard business laptop?
Yes. I work on a Linux distro, and the amount of time my 1st gen 32 core Threadripper has saved me is truly mind blowing. It made it possible to do changes I wouldn't have dreamed of.
As an example, I worked a bunch on our PostgreSQL infrastructure, but we support 5 years of versions. So you have to CI/integrate your changes and test them across all 5 versions, every time you test. My machine could do this on the order of 2-3 minutes -- recompiling everything from scratch every time, and running all integration tests on all versions. There's no CI system or cost-effective cloud hardware that would have given me turnaround like that. In the end I was doing hundreds of builds over the course of just a few days.
In contrast, at $WORK we have a C codebase that takes < 1 minute to compile on my 4 core work laptop. YMMV.
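For anyone curious what that PostgreSQL fan-out looks like in practice, here is a minimal sketch of the pattern; the version list, directories, and make targets are placeholders I made up, not the actual infrastructure described above:

    # Build and test against several PostgreSQL versions in parallel, one worker
    # process per version. On a 32-core box each worker can still hand its own
    # `make -j` a few cores.
    import subprocess
    from concurrent.futures import ProcessPoolExecutor

    PG_VERSIONS = ["9.5", "9.6", "10", "11", "12"]   # placeholder "5 years of versions"

    def build_and_test(version):
        workdir = f"build-pg-{version}"              # placeholder directory layout
        steps = [
            ["make", "-C", workdir, "-j4", f"PG_VERSION={version}"],
            ["make", "-C", workdir, f"PG_VERSION={version}", "check"],
        ]
        for cmd in steps:
            result = subprocess.run(cmd)
            if result.returncode != 0:
                return version, result.returncode
        return version, 0

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=len(PG_VERSIONS)) as pool:
            for version, rc in pool.map(build_and_test, PG_VERSIONS):
                status = "ok" if rc == 0 else f"failed with exit code {rc}"
                print(f"PostgreSQL {version}: {status}")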
Because then you need to commit your changes, push to a remote Git repo, wait for the CI server to check for changes, wait for the CI server to finish previously scheduled builds, wait for the CI server to start building, ...
Also, all that needs to be set up. If you need to change some little thing, you're much faster doing it locally, rather than trying to wrap your head around the build server configuration and figure out how to just run this one test different but only for this branch and not for the normal builds ...
How are kernel support and VM support (such as QEMU) these days? I was thinking of buying an i7 for embedded OS dev, but that sounds very tempting. I heard AMD CPUs had some breaking issues last year.
There was a breaking issue, but it didn't manifest automatically.
It happened when the 2nd gen TR arrived. It used the same mainboards, so all the manufacturers issued BIOS updates.
Unfortunately, these updates claimed to support SEV (Secure Encrypted Virtualization). Linux of course tried to initialize it at boot/module load time and the whole thing hung, because TR CPUs do not support SEV; only EPYCs do.
So there were the following fixes:
1) downgrade BIOS back to pre-TR2 version,
2) blacklist the ccp module, which would make kvm_amd non-functional,
3) wait for a fix in Linux kernel, which initializes SEV with a timeout.
So it wasn't that tragic an issue if you had a first-gen TR.
I haven't had any issues with my 1st gen Threadripper for about a year now. I'm running Unraid with QEMU from that box and couldn't be happier with performance and reliability whilst running 2-5 VMs at once along with 20+ Docker containers. No issues with Windows 10 and *nix VMs; I haven't been brave enough to attempt a hackintosh yet though.
When it was bad, it was so bad. I never wanted to know as much as I do now about IOMMU groups and PCIe lanes.
I use an 8-core iMac Pro with 64GB RAM; it’s wonderful.
When I bought it I was doing some embedded device work involving a Windows VM, as well as heavy web dev on a frontend JS app and a backend app with tons of Chrome tabs open. All these things crave memory, and my 16GB MacBook Pro was swapping itself to death. This was pre-32GB MacBook Pros, so I bit the bullet and couldn’t be happier with the setup. It’s dead silent basically no matter what I do and doesn’t get thermally throttled during heavy workloads. Having the extra RAM also makes a huge difference.
Now that I’m doing more regular web dev on an Elixir & React app, I benefit from the 16x parallel compilation & test suite, as well as the ability to keep basically anything open without resource issues.
I have a similarly specced iMac Pro and I use it for very similar tasks. Could not be happier with the machine. I think the RAM + the very speedy storage make this a great choice for the kind of dev work I do.
I do similar work (arguably a bit more intensive) on my 2016 MBP with the lowest specs possible (8GB RAM, 2GHz CPU) and it's surprisingly silent most if not all of the time.
I have around 5-6 windows with 10-15 tabs each, as well as some SSH tunnels and some other applications open (I do heavy front-end/back-end work with Docker builds through SSH, as well as ML work through SSH).
Not sure what's going on, but my lowest-end Mac is running everything I need like butter.
Your Lenovo laptop has an 8 core 3.2GHz Xeon with 4.2GHz Turbo Boost, 19MB Cache, 64GB DDR4, silent cooling even under heavy load (which it can sustain), and an SSD that does 1700MB/s with full disk encryption?
Actually, the P53 and P73 can be specced with all of those, and more: up to 128GB RAM, Quadro graphics, 3 drive slots/bays (2 NVMe, 1 2.5")... And the chassis is not compromised for thinness, so cooling is adequate.
I'm thinking of getting either a P1 or a P53 for my next work upgrade. The P53 will likely have better thermals, but the P1 has an option to remove the discrete GPU, and just use the integrated GPU. As I have no need for a fancy GPU, I figure it would reduce unnecessary power draw.
Hah, I agree... But, on the other hand, you can carry it when needed, and I do regret not buying one of those for work instead of a Razer Blade, "because it's thin and light" - it gets overly hot and noisy when doing heavier work and is definitely not up to par with these...
I think Lenovo (on top of other questionable actions that alienate their user base) removed the option to choose Linux for workstations, so they come only with Windows 10. But ThinkPads are traditionally very well supported on various Linux distributions.
I'm not at my laptop until Monday so I'm not sure about all the details, but almost that, yes. It's a top-specced P1. Xeon, but 6 cores I think. 64GB DDR4. I think it was 1500MB/s read on the SSD.
Wow, that’s a beast of a laptop. I just configured one and the cost isn’t all that different from my 18mo old iMac Pro :) I have to imagine that the battery life isn’t great given the non-mobile parts, and that it must weigh a ton?
I’m enjoying the combo of a powerful workstation where I work most of the time, and then a thin & light portable device for when I need portability. For me that combo works well and hits the right trade offs, especially the silence I get to enjoy.
I’m also remembering that Apple ships the same SSD setup in the newer MacBook Pro machines, which weren’t yet out when I bought this iMac Pro. Definitely nothing unique or unusual about that speed anymore!
Yeah, it's not cheap, but my company let me choose between that and a top-of-the-line MacBook, and since my previous MacBook was a catastrophe I switched setups entirely.
The weight isn't too bad; I have it in my backpack commuting by bike in hilly Norway, though it's maybe too heavy for a shoulder bag. The battery life is useless, yes, so that's the big tradeoff. But since I bike instead of commuting by train, I never used my previous laptop unplugged anyway.
I just upgraded from a 2700X to a 3900X. Maybe it's not quite HEDT but it's closer, and my god, it's literally mind-blowing. I measured a 47% wall-clock improvement on a test compilation. If you compile lots of C++ you are in for some serious surprises. Especially so if you are doing things that cross-cut hundreds of C++ packages at once; it basically enables things that aren't possible on, say, a mid-range laptop.
That said, if you are just running simple Go or Rust compilations, or small single-package C++ compilations, caching and a decent processor should be good enough, and you probably won’t benefit much from a lot of threads. (You may want it for your Webpack builds still ;)
One tip: scale up your RAM, pay attention to clocks and latency relative to your processor (especially with AMD where it really matters.) 16 GB is easy to kill when you are running 24 instances of GCC and a VM or two.
Rust actually benefits a lot from more threads once you have a couple of dependencies. A personal project using amethyst/nalgebra dropped its compile time for a fresh release build from 20 minutes to 2 when I upgraded from a i5-4670k to a r9 3900x.
This is true; I've never worked on a huge Rust project, only fairly large C++ projects (at home, anyway). Rust compilations always felt fast enough not to matter, similar to Go, although maybe not quite that fast. (With Go, it never felt too slow to, say, build Kubernetes from scratch; it's just fast.)
Regarding webpack builds... absolutely go NVMe. When I went from a SATA SSD to NVMe, it was the Node/webpack build times where I really noticed the performance difference.
I would think the extra CPU would have even more of an impact with Rust than with C++... I wouldn't know; I'm running a 3600 as a placeholder until the 3950X comes out. I couldn't handle the 4790K anymore; going from 32GB to 64GB and the couple of extra cores made a huge difference for my Docker/VM workflows. Can't wait for the 3950X. I'm sure 3rd gen Threadripper will be similarly impressive with the changes to IO.
Oh yeah, NVMe is an absolute given. It only took one NVMe drive experience and I have never had a desktop or laptop since without a large NVMe SSD as the boot and primary disk. It is in many cases a substantial boost and you can benefit across more things than a bigger CPU since many things these days are IO bound to begin with!
I’m rocking a Samsung 970 Pro 512 GB on my desktop. I never thought I’d need more space than that, since I can always use my NAS or the spare spinning disk I have installed. But, the more CPU power you have, the more you can feed it... I find myself building entire fragments of Nixpkgs now and it takes substantial disk space to do it.
I don't have an HEDT but an outdated Haswell i5 laptop (not an Ultra Low Voltage model at least, so it's quite fast). Last year I spotted a bug in Firefox and I thought it was time to put a line in the contribution section of my resume.
The contribution experience was a nightmare because a full build of Firefox took 3 hrs and running the entire testing framework took 4 hrs, though it turned out that I needed to run only a part of the testing framework. Changing a single line and building it again still took more than 30 minutes.
That was the first moment that I wanted a HEDT in my life. It feels like devs who work with big C++ projects would want a bigger workstation because of the significant build time.
Maybe spin up an instance on GCE or EC2 and remote into it? If you live anywhere near a data center it should be under 50ms delay. Your time is certainly worth more than whatever the cost of the instance.
Probably. But I always try to build Linux programs natively or in an officially supported environment on the first attempt, because I'm not sure what could potentially cause problems when I build in an unsupported environment. On top of that, running the testing framework needed an X11 environment. You know, it sometimes also takes significant time to figure out how to set things up in a different environment. I might try EC2 if I become a regular Firefox contributor.
My RTT from the midwestern US to AWS/Azure/GCP in Germany is less than that. I think your ISP has more blame than your distance to the nearest zone in NA.
It's usually more, actually. We have seen 100+ GB build directories when building the "all" target, and a non-GOMA build easily takes multiple hours on a desktop i7.
Hmm... Like when Oracle created their "bare metal" product. There just wasn't much left to add to the database engine to eliminate the need for a general-purpose operating system. I suspect graphics, audio, and pointing input may be significant barriers to doing the same thing here.
I remember building Mozilla took around 6 hours on my PC back around 2003. Though granted, that was a Pentium II PC with only 128MB of RAM, and most of those hours were probably spent with the OS (Linux) swapping stuff :-P.
Yes. As a CI pipeline, the Firefox team runs its own compile farm called TryServer [1], which builds various different versions (possibly all of the accepted versions) at a time.
Level 1 contributors (contributors who are acknowledged by one of the Mozilla devs) are granted access to TryServer.
I do, and it makes a significant difference in most of my workloads. A lot of my work requires testing bleeding edge software, for example, applying a small patch to the latest version of kubernetes then running a small virtualized cluster locally to make sure it works.
Even if not for that, I tend to cross-compile a lot of software, since all the engineers at my company use Macs for software development but we deploy to Linux servers, so I often end up building Rust binaries for Linux, and it's fairly computationally intensive.
As an anecdote, on my work laptop (i9/32GB/512GB SSD 2018 15" MBP) I can compile the dev environment from scratch in 40 minutes, whereas it takes 4 hours on the company-standard dual-core/8GB/256GB 2015 13" MacBook.
I do, although not the kind of software development most people do.
I'm a robotics controls engineer. The code I write is pretty short compared to most of yours, and it doesn't take long to compile. But to test it I have to run very complicated dynamic simulations. These never run in realtime. It's not uncommon for a thirty-second simulation to take five, ten, fifteen minutes to run; sometimes I have to run many iterations of the same sim to collect valid statistics. So both CPU speed and the number of cores translate directly to time spent waiting for the sim run to finish.
I have a 2013 trashcan Mac Pro with 12 cores. It’s still a good machine.
Oh yes, it makes a _huge_ difference. Most modern laptops use constrained, low power CPUs and GPUs. They are dog slow. Even my 5 year old desktop is substantially faster (like, builds take half the time kind of difference) compared to my <1 year old max spec'd thinkpad.
Possibly the only laptops that could compare to desktops are gaming laptops, and I know developers who buy things like the Razer gaming laptop to try and get desktop-like performance.
For me the problem with gaming laptops for work is their GPUs. I honestly don't need anything beyond the built-in Intel GPUs, but I would love a powerful CPU + a lot of RAM. High-end GPUs drive laptop prices very high, so essentially I'm paying a lot for hardware that I would never use. That's why I prefer desktops: there's no problem with having a Threadripper and 128GB of RAM in a case with some basic $30 used GPU.
I think good GPUs are necessary to support external monitors. My thinkpad really struggles to power two external normal DPI monitors. If I had high DPI I don't think it would even be possible. I suspect the gaming laptops with their full powered GPUs, usually Nvidia or AMD, would have no problem powering several high DPI external monitors.
That's true, but still, you don't need a Titan to power two 4K monitors; a good old 750 Ti or even an older GPU will probably be enough. For me it's even easier because my sweet spot is one 1440p display (that's where I work most efficiently), so anything will be good enough for me.
I migrated from a MBP to a 2018 Mac mini with 64GB, and the ability to run say 10 VMs at once, without having to limit them to half a GB RAM each is amazing.
I'm not in the market for when this thing (the Mac Pro) is actually released, but in a year or two I may well be migrating to the latest of the Pro line - I don't care about the GPU (much - it just needs to drive some displays) but I do care about CPU speed + core count, memory, and fast storage.
I do, I really like it. Current uptime, ~90 days, got everything the way I like it with i3... I can have hundreds of tabs open in various browsers, I've got 9 workspaces with various projects and things in them and a 4k display.
Actually my main rig is the last Mac Pro, the 2013 one - I just don't use Apple's OS. I really like the rig and I expect to use it until at least 2023 (10 years of use).
I use a dual 10C Xeon (40 threads) with 128gb of RAM as my work desktop. I do builds development for C++, Java, SystemVerilog, and Python codebases. Jetbrains IDE products like IntelliJ, PyCharm, and CLion can use extra cores and ram for indexing our large codebases. The high core count is also useful when compiling C++ from scratch (common for me because I'm often doing weird things that defeat the bazel caches) and running large test suites or medium simulations.
The latest MacBook pros (8 core, 16 threads, 32gb ram, 2tb flash disk) are finally fast enough with enough ram to both run CLion with clangd indexing and do C++ compilation/testing. Full builds are still not advisable (not that a full build is particularly useful since our prod is only Linux and we have CI/desktops to avoid cross-compilation).
I use a desktop, not sure I would call it high end (32 GiB, 6 cores, nvme). I do lots of large builds of things like linux kernels. I doubt any laptop would be able to keep up simply due to thermals.
Yeah, what's considered high-end does shift, although that still sounds higher end than most laptops. I think a typical "high-end" desktop is one that has more cores, memory channels, and PCIe lanes than a typical "mainstream" desktop (whatever that happens to be at the time). Support for ECC memory is also sometimes a requirement.
I use a 3 year-old laptop with 32GB of RAM, an SSD, and a dual-core i7 (U-series) processor. It's reasonably efficient. I think the amount of RAM and the IO speed of my SSD remove most of the hardware bottlenecks I face.
However, I would like compile times to be faster, I do sometimes notice applications hanging, and IntelliJ (we use Java at work) seems limited on how many projects I can run efficiently in a single workspace. I'm just wondering whether a workstation-grade laptop (Dell Precision or Thinkpad P-series) would be a sufficient upgrade, or investing in a desktop would be worthwhile at some point.
> I'm just wondering whether a workstation-grade laptop (Dell Precision or Thinkpad P-series) would be a sufficient upgrade, or investing in a desktop would be worthwhile at some point.
Yes, either 17" machine offers an 8-core CPU and plenty of thermal capacity (i.e. a larger chassis than their 15" equivalents), along with more than enough memory (128GB), and in Dell's case, a boatload of HD slots (4, run in whatever RAID config you want). These are desktop replacements, heavy (as in 6.5+ lbs), with 240W power bricks, so you'll probably want them to stay on the desk most of the time.
I'm personally looking to replace my old Dell Precision with a 15" 5540, maxed out. For Scala work the extra cores help reduce build times, probably the same for Java.
My company provides a pretty standard laptop, but they rarely have more than 2 cores.
In native app development, Android Studio and Xcode are severely bottlenecked by 2 cores. The difference between my laptop and my Mac Mini (6 cores) is astounding.
The difference between a build taking 6x minutes and x minutes is more than just a factor of six, because it goes from "Well, guess I'll check email/messenger/HN real quick." to taking a sip of coffee.
It also gets real bad when your IDE is forced to fight with your browser, email client, etc. Then you're forced to do nothing, lest your casual actions further slow down your local build.
Usually the "breaking point" is when builds take more than 5 minutes, and especially when they take 30+ minutes.
Slow builds can definitely break concentration. I'm on a dual core as well, although most new business laptops come with 4 cores.
My build times aren't as long, because I'm working with Java-based microservices, so each service takes a minute or so to build (with tests) but even that delay can break concentration. Turning off tests helps, but then you don't always have the immediate feedback of the test results (and don't worry, I always run the tests before committing).
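For anyone wondering what "turning off tests" looks like in practice, a rough sketch assuming a Maven build (your build tool may differ, and the test class name here is invented):

    mvn -T 1C clean package              # one build thread per CPU core
    mvn clean package -DskipTests        # compile + package, skip the test run
    mvn test -Dtest=OrderServiceTest     # run only the class you're touching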
I worked on a codebase a few years ago that took over 44 minutes to compile... a change you would think should take 10 minutes took 3 days. RSS was still a big thing at the time, and I did a LOT of reading while waiting.
I used to be happy with Java and .NET builds, until I got to deal with Android builds.
Android is probably the only platform where almost every conference has at least one talk about how to improve build speed, thanks to Gradle + Android Studio.
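To be fair, a handful of gradle.properties switches do claw back some of that time. A minimal sketch - the heap size is a guess you would tune for your machine:

    # append to gradle.properties in the project root
    printf '%s\n' \
      'org.gradle.daemon=true' \
      'org.gradle.parallel=true' \
      'org.gradle.caching=true' \
      'org.gradle.jvmargs=-Xmx4g' >> gradle.properties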
A little off topic, but what does your microservice build with tests look like? Trying to figure out what I need to do if I want to try microservices correctly...
The 2018, 6 core Mac Mini is underrated for use as a HEDT in my opinion. It's surprisingly powerful and silent. I really have to thrash something to even get the fans to spin up.
When I started working in my current company, the dual core, 16GB MBP that was given to me was enough since I was working on a single project.
But now that I am working across projects that run 2-3 repositories at a time with a JetBrains IDE, the computer can barely keep up. I get times where typing will actually lag.
I am contemplating asking them for a Mac Mini instead since I rarely ever venture out from my desk.
I bought a used Dell workstation with an 8c16t Xeon E5-2670 and 32GB of ECC DDR3 RAM a few years ago. It's a bit outdated now, but it's still faster than most laptops (for multi-core workloads). At the time I had a MacBook Air, so this completely blew it out of the water. I remember one time, while trying to debug an issue, I had 10 docker instances of our CI running (Rails + Cucumber + Firefox) and there was absolutely no performance impact on the desktop experience.
Last year I bought a T470s with 20GB DDR4 and an i7 (U series) as I was mostly working away from home. It's good enough for most of the work I do, but it can be a bit slow at times. The processor just isn't as fast and the integrated graphics struggle with a 4K desktop (I think that's mainly Linux/GNOME being unoptimised though). I haven't noticed it throttling, but my workload usually isn't that CPU intensive.
If you mainly work from home I'd definitely suggest building a desktop machine for your needs.
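Re the ten parallel CI instances mentioned above: these days docker-compose can fan that out in one line. A sketch only, where "ci" is a made-up service name from a hypothetical docker-compose.yml:

    docker-compose up -d --scale ci=10   # ten copies of the "ci" service
    docker stats                         # watch CPU/RAM while they run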
To add to the chorus, Threadripper has been incredible for deep learning related workloads, partly for the high core count for prepping batches, but also because it can support multiple GPUs and gobs of RAM. If you have a lot of big compile jobs, it shines there too.
Not a desktop, but... My employer typically gives people a MacBook Pro, but when a colleague left a couple of months ago I snarfed his laptop - a ThinkPad with 24GB RAM and an i7. I put Slackware on it (for I am Old Skool) and it's really handling many many tabs, plus our many-docker local dev environment far better than everyone else's Mac.
Of course, it's far bigger and heavier than the Mac, but also of course (for I am Old Skool) I'm using it with a decent keyboard, a decent-sized monitor, a decent mouse, and a USB-C charger. It does mean I can't go to meetings, but that's ok with me...
Does wanting one count? I don't have a need right now, but some years ago I was doing iOS development; compile times started to get to a point where a more powerful system would've come in handy.
Mind you, my workflow could probably do with adjusting as well, I tended to work like I do with webapps and just continually rebuild/restart and review my changes on the device itself (or an emulator).
Part of me also wishes we had a big setup with multiple docker images running in parallel but ATM we have the luxury of working on 'just' a website (react, some lambdas, back-end are all third party services, we connect to a staging environment during development) so it's not too bad.
But I'd still like a permanent machine, there's something about (and this is me idealizing) having a fixed workspace you don't have to pack up every day. I mean sure disconnecting a laptop and yeeting it into a backpack isn't that much effort but it's the little things.
Yes. I have a company issued 2018 MBP 2.6 GHz i7 & 16GB RAM but I prefer to do most of the dev work on my HEDT running Linux (Ubuntu 16.04), 4.0 GHz i7 & 32GB RAM.
It's way better at running Docker instances, a bunch of electron apps, and tons of Chrome tabs simultaneously without a hitch.
I do all my work on desktop workstations; I have a little burner laptop I lug around to meetings, but all it really does is browsing and outlook.
Having a real keyboard, mouse, and four 27" monitors is something I will never leave behind at this point. All that screen real estate to spread out over helps enormously. I can have a browser with our application pulled up, Visual Studio, another browser with doc, Webstorm with the front-end code, SSMS connected to the database, an email client, Notepad++ with logfiles, all the different chat clients I have to use, and more if I really need to, all on screen and available at the same time. I don't have to alt-tab around, I just look and it's there.
I do, but mostly because there are a few projects where running multiple virtual machines with multiple cores and lots of RAM is useful. Also because I run a VM for CAD work for my 3D printer on the same machine.
Yeah, I sometimes will be running a dozen or so docker containers, although the CPU utilization on those tends not to be very high - RAM is the main bottleneck for me. However, I also don't typically leave those running. I just spin them up when I need to for testing. I'm not sure how fast my other applications would run if I left those up.
My HEDT at home serves as my VR development machine and doubles as a gaming rig. I generally save basic research, design, and sandboxed coding for my MacBook Pro at a coffee shop.
If I had a similar performing laptop that could replace my desktop in terms of GPU performance and compatibility with the devices I use, my HEDT would start collecting dust.
Not a HEDT here, just a regular desktop, but it is significantly more responsive than a business laptop. Also much cheaper than a laptop comparable in performance. Plus it's upgradeable so more future-proof.
I still have an old laptop that I use to ssh into my workstation occasionally.
Depends what you're doing. I'm a low-level developer and my build times are less than a minute, so splashing out on a huge beefy PC would barely make a difference to my workflow.
The standard business laptop maxes out at 16GB, which is good enough if you don’t run all the secondary stuff like databases locally. But 32GB would be great.
Reading the replies to this makes me very grateful that I just run a couple of Docker containers for local Postgres/Redis along with some non-compiled node.js
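That whole stack really is only a few commands to stand up. A sketch - the image tags and password are just placeholders:

    docker run -d --name pg    -p 5432:5432 -e POSTGRES_PASSWORD=dev postgres:11
    docker run -d --name cache -p 6379:6379 redis:5
    npm start                  # the non-compiled node.js bit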
Web dev. Going from a 2015 MBP to the new touchbar model, my cold build times were cut in half. Other than saving 15 seconds once per day, I noticed nothing.
People here seem to think this is just a random set of components that's not as good as X+Y+Z. This machine was designed and specced based on talking to the people who will be buying them, not the general public. Complaining it's not X+Y+Z is like complaining that IBM didn't design a mainframe for your needs. Not everything has to be built for everyone.
330 comments, and no one asked why this was submitted.
We had discussions on HN [1] when it was announced. The Tech Spec page has been there since day 1; I looked hard and don't see any significant changes, if any changes at all.
That was not the point though. Its release was widely known and this spec page isn't news. The general practice on HN is that unless the last submission had little to no discussion, or there is something new related to the topic, resubmissions get marked as duplicates.
The starting 256 GB SSD sticks out like a sore thumb on a machine as expensive as this. The barebones Mac Mini comes with 128 GB and the 6-core Mini with 256 GB. And upgrading it is gonna cost an arm and a leg.
Depends entirely on workflow. A lot of shops will have all of their stuff on a 10G NAS share, so you’re actually forbidden from copying to and from your computer because it’s just stupid and sometimes slower to do so.
Yeah, that’s what they give on the iMac Pro as base, so it’s not like they’d never do it. But it’s clear they’re deliberately under configuring the base to force complete customisation, which is what I assume the kinds of people buying this will want.
I’d go so far as to say they should sell it without a default hard disk if they could - let the buyers make the choice. I’m guessing it’s defaulting to the smallest disk just for optics.
This is clearly a start with the skeleton and build it yourself kind of machine, which is as it should be. It just looks odd when the selector defaults to the lowest available option on every selection.
It is still not shipping so there's not really a reason not to use a Threadripper for lower price AND higher performance. Or even Epyc chips. We know the Titan Ridge Thunderbolt controllers work on AMD motherboards. Alpine Ridge had firmware problems but Titan Ridge is fine.
You can likely Hackintosh it with a 64 core Threadripper, WRX80 8-channel board, 4TB ECC LRDIMM and run circles around Mac Pro for half the price. Ryzen seems to work quite well with the latest macOS.
Well, AMD really had nothing to offer, especially in the low-power space where Apple mostly plays, but Sharkstooth and Rome so thoroughly trounce Intel's high end it's not even funny. (Yes, Sharkstooth is not out yet, but already the Ryzen 3000 chips make a joke of the lower end of Intel's Xeons, and leaked benchmarks paint an interesting picture of the upcoming Threadripper 3000s.)
Motherboard design takes time — and Apple would also need AMD firmware and software support. If Apple want to go AMD it would be next year's model at the very earliest.
Eh, no. Google had been testing Rome for a long time before the official announcement; this Mac Pro could easily have been the launch customer for Threadripper 3000.
I wish I could afford these, but I'm really curious to test a Hackintosh with Threadripper 3, following what they did on Linus Tech Tips with a virtualized PC and Mac running at the same time on the same monitor: https://youtu.be/EozeSDeV3Vo
Hackintosh allows you to have OS X on better hardware but these days it's really unsafe, due to the possibility of Apple going full ARM in the near future, dropping all the support for x86.
How is it different than buying a Mac now? It also won't upgrade itself to an ARM processor and they'd need to support existing macOS installations for a while anyway.
Not a chance, the Mac Pro isn't even out yet and it's Xeon, so x86 support will be in Mac OS for at least the next 10 years. There's just nothing from ARM that can compete with Xeon at any price. I wouldn't be surprised to see it in the low-end laptops within 5 years, but x86 is still safe as houses.
I'd argue that the software people run on a Mac Pro is a strict superset of the software people run on a MacBook. Combine that with Apple's philosophy of hiding technical details from users (e.g. with fat binaries in Darwin), and if Apple _does_ switch to ARM for the MacBooks that's not a problem: Microsoft Office for Mac, Chrome for Mac, and the rest (if they aren't just Electron apps like everything is these days) will have to be x64/ARM fat binaries for the Apple App Store - and no one will complain about specialty pro-only software distributed outside of the store only being available as x64.
It seems like a Hackintosh would actually be the better choice if you're worried about Apple moving to ARM, since you could run Windows or Linux on it without relying on Apple for driver support.
Given that the latest Mac still uses an Intel CPU, I’d say the range is [5,7] years at minimum, based on Apple’s current vintage/obsolete policy, before x86 support is dropped from OSX.
Considering they just spent possibly 6 years designing this crazy expensive x86 beast, I don't think support is going to drop that hard that fast to warrant avoiding x86.
You'd be surprised how married effects houses can be to their chip architectures. Attempts to switch can sink you when you realize that some technical artist wrote some extremely critical plugin or extension that's super fast only because it uses Intel only or AMD only instructions.
My own theory, Apple has probably already spoken with clients like effects houses long ago. Since these are the clients that will generate most of the revenue with this product, I'd wager they've put it together almost especially for them. If anyone else wants to buy one, gravy, but I'd bet Apple's money comes mainly from clients in those very specialized segments.
My guess is, they won't be doing anything those clients don't want them doing with that platform. Apple made that mistake once already, and it sounds like they are done chasing the larger market with their pro platform. Basically, they're going the HP workstation route and just becoming an Intel distributor. Only they're trying to distribute components in a bit more aesthetically pleasing package.
The current 2990WX is typically slower. For example, rendering is a strong point of Zen: https://www.anandtech.com/show/13748/the-intel-xeon-w-3175x-... but even there it falls behind. The Mac Pro's Xeon W would be slower than that tested Xeon W in multithreaded workloads but faster in lightly threaded ones.
But AMD said new workstation chips are coming in November. This could be up to 64-core Threadripper chips with PCIe4 support. Though we have no idea on the clock rates so it is possible these Xeon W CPUs may still come out ahead in lightly threaded performance, especially with AVX-512.
Different market segment. The Mac Pro has 64 PCIe lanes and eight PCIe slots, along with support for 1.5TB of memory. The 3900X, for example, only has 24 PCIe lanes and 128GB memory support.
Edit: My bad... I misread Threadripper as Ryzen for some reason
Presumably that's with the dual-GPU, 28-core processor model under full load (and the MPX module drawing its full 500W, among other peripherals). An office with this kind of workload is going to require robust wiring regardless of which particular workstations are installed.
And robust air conditioning, too. Source: worked in an office full of dual-GPU deep learning workstations. We had to bring in those gigantic rental portable conditioners in the summer, and re-do pretty much all of the existing wiring in the office, because electricians who installed the first iteration could not conceive of having two dozen ~900W workstations cranking away 24x7x365 in a single open office plan.
> Why do you have the GPUs inside the office with the people rather than in a server room under temperature control?
Because Nvidia, the 800 lb / 400 kg gorilla of the GPU ML/AI space, has decreed that only "data centre class" GPUs can be used in a server room, and those are much more expensive than the GeForce and Titan cards:
There's _also_ a sizable server room with temperature control (which was also woefully underpowered when we moved into the office). It's just more convenient for researchers to have a couple of GPUs locally.
You can remote GPUs pretty far these days - it’s possible to have a local system and have the cards still connected with PCI but on the other side of a wall.
You'd need 100GbE NIC and network fabric for that to not be a waste of time though. That costs more than renting an industrial portable air conditioner.
That'll be a peak amperage; it won't draw 12 amps by itself.
Even if it did, most displays pull less than 1A, so theoretically you have room for 3 more displays on your standard 15A North American residential circuit. Any office should be on 20A circuits, anyways.
The AMD Vega II appears to be a higher-end Radeon VII, except with 32GB of HBM2 (instead of 16GB like the Radeon VII). So not really. The AMD Vega II Duo (the dual Vega II) is going to be far more powerful than anything AMD releases in the near future. 64GB of HBM2, 28+ TFLOPS single precision, etc. etc.
CPU-wise, probably. Xeon W 28-core is still quite good from Intel however, with AVX 512 support and such.
Not just Threadrippers, but better EPYCs as well. More PCIe lanes, more cores, cheaper (hence higher profit margin), and the only thing you really lose is AVX512 which hardly any software besides Intel MKL makes use of anyway.
Think about what you're asking: this thing uses a high-end server board with a bunch of exotic specs. Those things take time to design, validate, and test — don't forget that reliability under sustained load is a hard requirement in that market — so adding extra chips isn't done on a whim. Then you have to ask how many people are using this for something where the performance hit of using WiFi is acceptable — it has two 10Gb ethernet ports so you can get an idea for the kind of usage they're targeting.
I'm bummed out that there's no sign of Navi cards as an option. The 5700 series seems right at home in this machine.
I'm still holding out for a Navi product from Apple. There's some mention of support in the Kexts but alas my 5700 XT does not work in Mojave or Catalina (Hackintosh).
Why do the different processor quantities have different base frequencies (higher for lower CPU count) but the same "Turbo Boost" frequency of 4.4 GHz?
If you ran Folding@Home on one of these, would you get sustained 4.4GHz clock speeds?
Thermal limits. Lots of cores in a tiny space will generate lots of heat under a full workload. They can do full speed in short bursts (or when only one core is working hard) but can’t sustain full performance indefinitely without damaging themselves. The more cores in a small space, the lower the clock speed they can sustain for long periods of time.
Intel's Turbo Boost specification only allows for a few seconds of boosting. Of course you can overclock (letting it run longer than stock settings) if you have better cooling or if you won the silicon lottery.
Furthermore, I think the advertised Turbo Boost 3.0 frequency is single-core only. It will not increase all core frequencies to that level unless there is sufficient thermal headroom.
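If you want to see what your own box actually sustains rather than trusting the spec sheet, macOS ships a sampler for this. A rough invocation; the exact output fields vary by CPU generation:

    # five 1-second samples of CPU frequency and package power
    sudo powermetrics --samplers cpu_power -i 1000 -n 5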
Very little. Yes, they finally made a Mac Pro that is extensible and up to date - basically everything the 2008 Mac Pro already was. On the downside, they doubled its price, so for the non-Hollywood customer there still isn't a desktop machine. Even 3k is a lot for a desktop machine, but I had set some money aside to get a Mac Pro if it had started around its predecessor's price. There was even a time in the past when a Mac Pro would start below 2k and consequently was very popular.
> On the downside, they doubled its price, so for the non-Hollywood customer there still isn't a desktop machine. Even 3k is a lot for a desktop machine
I think there were more benefits to an ordinary user in getting a tower in the past than there are now.
In the past even a hobby or prosumer photographer would see a big benefit from getting a Mac Pro. Nowadays an iMac or Macbook Pro with very normal specs can edit large RAW files without breaking a sweat.
The extra HDD bays on a Mac Pro were great because you didn't have to mess around with USB2 (cheap but slow) or FW (fast but expensive). Now you have USB3 (cheap and fast) or TB (very fast but expensive).
I guess that leaves upgradeable graphics cards, at this point it is easier to just get a PC or try a hackintosh build if you want a beast GPU for the latest games.
> There was even a time in the past, when a Mac Pro would start below 2k and consequently was very popular.
2006:
Mac Pro base model:
$2,199 ($2,800 in 2019 dollars)
Internal storage is a huge thing. I have external storage attached to my 27" iMac, but it is not completely reliable (disks get ejected occasionally) and it completely defeats the purpose of an elegant desktop machine. So I really would like a machine with several drive bays, especially if I can access them. I could have lived reasonably well by upgrading the internal storage of my iMac, if there were any way for me to access it.
Graphics cards are another thing, but so is the plain ability to clean the fans when they start to clog up. The limitations of the iMac are amplified by Apple making the interior inaccessible.
Finally, while the screen of the iMac is great, I would like to have a larger screen.
So there are plenty of reasons still to have a bit more than the iMac can deliver.
I've had BlackMagic 1U SSD rack with a few drives connected to my 2013 iMac 27'' via Thunderbolt since like... 2013, and not a single time did they disconnect. It just works.
The rack is under the table so it doesn't "defeat the purpose of an elegant desktop machine" either.
Lucky for you that you had no disconnects. But those disconnects limit which files you can put on the external disk if you need them available all the time. E.g. when my external disk gets disconnected, EyeTV stops working as its work directory no longer exists.
Also, I don't see how having an additional large box (which by itself costs as much as many PCs) doesn't defeat the purpose of an all-in-one machine.
I am not arguing that the iMac shouldn't exist. I just listed a few points which can be better addressed with a proper desktop machine, and why I would be willing to spend quite a bit of money for that convenience.
Maybe check your power supply? I've supported Macs for a couple of decades and normally people have storage mounted for years without this kind of thing happening in the absence of some sort of hardware or environmental problem.
I wouldn't know what to check my power supply for, or how I would go about that in an iMac. And this phenomenon isn't limited to the iMac; my Mac Mini had the same problem, across different disks and interfaces. The disconnects are not constant, but an irregular thing. A colleague occasionally has the same problem; he even managed to get a local disk image unmounted when waking from sleep. So it might just be a macOS problem.
With Thunderbolt 3 there are more solutions for external storage, but I still find it odd, considering the price, that you don't get a desktop Mac where you can plug in some NVMe SSDs, or at least get a few 2.5 inch bays.
Gonna need a citation on the Xeon-W board with PCIe 4.0 support. Apple aren’t using Ryzen and definitely not Ryzen 3. If they were they’d be waiting for Threadripper - even current TR doesn’t support PCIe 4.
No it's not. I like MacOS and use their laptops, but Apple's proprietary approach does not work for me when it comes to the desktop. Their development lifecycle must be very long if they missed the advancements in the CPU space. Ryzen to the masses.
Surprised no one(?) commented on the case yet - looks like they came to their senses and went back to the "old" case design.
I've been seriously considering a second-hand, early Mac Pro - just for the case. Not sure if they fit regular/modern PSUs, but those cases were very nice.
Good riddance to the trash can design...
Now, if these wouldn't be priced like a car, and came with some proper AMD cpus ...
Honestly I feel like reading that just leaves me confused. There have been some rough spots, but it seems like it's much less contentious than the relationship between Apple and Samsung, and Samsung still makes iPhone displays.
Hell, they dropped Nvidia support a while ago, and Nvidia stopped making usable drivers earlier this year. I had a 2013 model with the Nvidia 780M chip that I was holding onto since I bought it, in hopes of a new Nvidia-based system, but alas that did not come. Then suddenly early this year my GitHub script reported no compatible drivers after an OS update, and by June it was quite evident we were not going to get them,
so I bought a refurb BTO to replace the 2013 model. I'm stuck with ATI, but at least the screen appearance is a big jump for me; my model was the last of the non-Retina models.
There is OpenCL now, which Blender and other programs have started using. Hopefully the lack of Nvidia cards in these Mac Pros means proprietary tech like CUDA gets left behind.
It’s seems like they went all in on AMD a few years ago, so I assume they have a bunch of long term agreements that lower the price?
It’s more irksome to me that they don’t have a high spec non-pro desktop.
Eg I want high end consumer desktop GPUs, not high end mobile GPUs (afaict that’s essentially what iMacs use). But my alternative appears to be “pro” GPUs with their absurd - for a consumer at least - markups and Xeons, which no one really wants ;)
eGPUs have been supported for a couple years now. Grab a Mac Mini, an eGPU, max out the RAM, add an external hard drive or two, boom, your ‘pro consumer’ Mac for about $2k.
Sure, and that must be frustrating to an extent. My 2018 MacBook Pro’s GPU does well enough for my purposes with my HTC Vive, so I guess I’m not in the market.
I'm close to going dedicated windows/ubuntu computer with an nvidia card for ML & gaming and a mac mini hooked into a KVM on the same screen. But not having allowed nvidia drivers in 10.14+ is a disappointment.
I’m currently running a VM with a GPU passed through and moonlight on my Mac for gaming. It’s really nice, my old MPB isn’t doing any real work so it easily lasts for hours if on battery, and the VM doesn’t eat up all the resources on my server so I’m free to do whatever I want to with everything else.
My only real issue is I occasionally get slowdowns while playing, which cause things to stutter. This is most likely partly to blame on using WiFi for the Mac and partly on using only a single-port NIC for the server.
Please forgive my ignorance but the specs on this are just average in my opinion. Why would I choose this over a custom build pc with almost the same, if not better, components?
My experience with all apple products at work has been quite bad so I might be a little biased.
Try to build a custom PC with equivalent specs, paying attention to the expansion options and I/O characteristics, and you'll end up with similar pricing. People who need professional workstations are in a different market than gamers or almost all software developers — the question is kind of like asking why anyone buys a semi-truck when your pickup can beat it off the light.
If you don't get why you'd want to buy this, then it's not aimed at you.
"Why would I pay AWS to host a Virtual Machine for me when I can get a physical host on Hetzner for the same price, if not less, that's 5x better?" -- the point has truly been missed.
And what is the point then? I'm serious, since I keep reading this "if you do not understand why then it isn't for you" line without any explanation of what it is that isn't understood.
Think about other products that exist in the wild. Have you ever looked at something on a shelf, or in someone's hand, come to an understanding of what it is and what it does, and then thought, "That's stupid. I don't need that. There are better options!" The thing is, the person using it loves it, and it's solving a very specific problem for them, a problem you don't have.
It's the same thing here: you don't respect the product so you don't understand who would use it, so you don't buy it. That's fair enough.
But the MacPro is designed for a niche audience who are happy with the hardware, implementation, support, and so on. The things they're doing with this product are not things you're doing, so you don't understand their pain points.
This doesn't answer the question though. The question was "why would you buy this over a pc with the same specs?" and you answered essentially with "if you do not already know this then you cannot know this".
How would one gain that knowledge in the first place? We aren't born with the knowledge of why one would buy this machine over another.
We gain that knowledge based on specifics of the two machines. So the question is simple: what specifically makes the new Mac Pro a more desirable purchase option for someone than a PC with similar specifications?
I have zero interest in building my own PC, or dealing with fragmented and terrible support when issues crop up, or using anything other than macOS. And definitely not to save a few bucks. You clearly are different in your values and priorities. That’s what “if you don’t understand, you’re not the target market” means.
The parent didn't mention anything about saving money, or building a machine from parts. You can order a custom build from any number of OEMs and have a same-day on-site service warranty. The point about building a technically superior machine stands. I use Apple for their laptops because the OS is superior on a touch pad. But really all their desktop machines are joke hardware. The Mac Pro is just a bigger more expensive joke than the others.
Intel is old news on HEDT ever since Threadripper became a thing. The final nail in that coffin will be the November release of the third-gen Ryzen Threadripper making that 28 core Xeon obsolete. I'm curious if the new Mac Pro will even be available by then.
AMD builds admirable video cards, but the Radeon VII is inferior for professional work compared to Nvidia cards and CUDA.
MacOS is superior to Windows but inferior to Linux. A lot of people with HEDT builds are in Linux (myself included). I appreciate being able to do some of my favorite Unixy stuff in MacOS at a fraction of the speed, but for high end use it's still a joke.
I was very skeptical of this machine and the whole Mac Pro as concept car idea [1], but the Afterburner accelerator card has me wondering if I was wrong. Especially if Apple finds a way to support more workflows and applications with it. That could be a real advantage moving forward, and it's not hard to understand how it could trickle down to other products.
The only shocking thing here is the lack of ports. Only 2 USB-A and 2 USB-C ports. It's basically required to use a USB hub with this device, since once you plug in a proper mouse and keyboard you only have the USB-C ports left.
I thought that was really odd, but it appears to have two USB-C (Thunderbolt 3) ports on top under "Additional connections", and the graphics cards each seem to have 4 Thunderbolt 3 ports on them. So 8 USB-C unless I misunderstand.
I don't think they're expecting you to plug in a "proper" keyboard and mouse since it comes with a Bluetooth keyboard and mouse. Plus it appears you get another 2 Thunderbolt 3 ports on the front as well.
Which is insane.. do people really use the wireless capability of their keyboard or mouse? Especially when you can't use your freaking mouse when it's charging? Serious insanity.
Edit to add: of course I know people use those keyboards... I meant are people actually moving with them in ways that makes the wireless capability useful? Do most users not stay in a pretty close proximity the whole time?
The downside is that when your peripherals' batteries go flat, or the Bluetooth stack goes tits-up, you've got no keyboard or mouse.
This happens with some regularity on the iMac I'm typing this on presently. The system has a wired (non-issue) keyboard, but the wireless Bluetooth "Magic Mouse". Every so often the mouse (or more likely, Mac) decides it wants to wedge Bluetooth and forget the mouse pairing / configuration. Powering down the mouse doesn't fix this, nor does restarting the computer itself.
If it wasn't for the old legacy wired mouse+keyboard combo from the prior iMac this largely replaces, the system would be dead in the water when this happened.
(Subsequent OS updates seem to have made mouseless operation slightly more viable, at least for recovery purposes. It's possible to log in and fire up System Preferences, which is most of the recovery process.)
Also: removing additional steps in the keyboard/mouse stack results in crisper response.
The problem is you forget every time so once a month you are in the middle of doing something and you have to stop, turn your mouse upside down and then wait.
The only way you can run into that is if you have a three-day meth burner (so, no breaks where you can charge the mouse), ignore the three days of low battery warnings, and then use it right up until failure yet be outraged that you don't have 5min to spare (meth will do that).
My mouse needs new batteries maybe twice a year. The keyboard I’m using lasts considerably longer (maybe once every two years? I don’t keep track, but I’ve done it maybe once or twice, if ever)
The convenience of being able to use it anywhere is minor but I don’t mind the hassle.
Personally I used to use a Logitech wireless keyboard and mouse and stopped doing so.
The reason was my mouse cursor intermittently freezing - and there being no easy way to tell the difference between lag due to high workload, the signal being blocked by obstacles (like the computer's own case), intermittent radio interference, other USB devices spamming noise onto the 5v USB supply, and flat batteries.
It's possible Apple has solved all these problems - none of them seem intractable.
They work just fine with rechargeables, 4xAAA good quality NiMH cells with a charger costs only $35, and advertise 2000+ recharge cycles. The batteries in my mouse are probably a decade old and still last months between charges.
To me, yes. Having two less cables running around is nice and less hassle. To me that is easily worth spending a few dollars twice a year for batteries.
When price and space are not an issue with this device, you would think they would just throw everything on it so it suits everyone, since there is no need to compromise here.
The monitor that you’ll use with this machine is likely to have a built in USB hub (see Apple’s new display or the LG display they were previously selling).
Still it seems like quite little. Motherboards used to have 8 to 10 backside USB ports, most now only come with 6 (usually 4x USB 3.0, 2x USB 2.0), which at least for me means I have to use an external hub already.
Looks like the aim is that you’d use Bluetooth for those. I know that isn’t what everyone wants and there is that option for corded, but the mindset seems to be for wireless.
I’m not sure how many ports are needed, though I’m sure more couldn’t hurt. This seems less about expansion externally than doing so internally. My MBP with 4 USB-C ports has one that is power + hub (HengeDock Stone) with one of the USB-A ports on it chaining to a USB-A (powered) hub. One other port is used for eGPU and the other two are untouched. Works for me, but I can completely see where this Mac Pro could end up needing a hub attachment for a lot of people.
They always catch me by surprise when they go flat. Also about twice this year my mouse has just not connected for some reason and I have to use the laptop trackpad to fix it. This would be a big issue on a desktop. And if I have to have a cable available to charge it I might as well just have a corded mouse.
The only time I find a wireless mouse to be better is for use with my laptop, because it's easier to bring to meeting rooms. At home I use wired, since it seems all gaming mice have awful battery life. This Logitech one I use at work gets 70 days, which works for me, but every few months I have to run around and find a cable, which isn't nice to use since USB cables tend to be a lot stiffer than mouse cables.
I've only had one bluetooth device, a Fitbit, that actually worked with any kind of reliability. From time to time I still have to bang it on the side of my phone to get it to pair back up.
I’m a massive fan of Fitbit devices, they work great. But the same can be said of my Plantronics BT headphones, which I’ve used daily for almost three years, and the same for Magic keyboards, mice and touchpads of different generations. Truth be told, any cheap device that wouldn’t work reliably (I can think of a few BT headphones in the past) I just tossed or returned when possible.
Think about this. This is a "Pro" level machine and Apple's like "Because aesthetics...". Think about a video workflow using external drive arrays. Many who are working with these sorts of peripherals will want them plugged in all the time. So you need a docking station for your Pro-level non-laptop workstation. Apple has gone way beyond thinking about their end user experience and seems to only care about non-functional looks.
I like it. You pay the Apple tax for sure, but the design makes sense and is nice. I wonder though when they are going to announce the new 16” MacBook Pro. It has been rumored a lot.
Bolivia's capital La Paz is at more than 3500m; El Alto, in the same metropolitan area, is at more than 4000m.
Not saying this is where they were tested (I have no idea), just pointing out that these altitudes are not necessarily far from major metropolitan areas and international airports.
Among other use-cases, high-elevation astronomical research can strain hardware.
ESO, the European Southern Observatory, has its Llano de Chajnantor observatory at 5,100m (16,700 ft), in the Atacama Desert. Low humidity is a major concern for electronics due to static buildup and discharges.
This page would look better if they used high-quality 3d animated renders instead of the simplified 2D-graphics. Someone on the design team thought this was a great idea? It looks unpolished and out of place (in comparison to other well-done Apple product pages).
There are already two other pages about the Mac Pro (linked at the bottom) with the more luxurious consumer feel to them. I think the functional look of the flat art is totally appropriate for such a text-heavy page—and the illustrations still animate seamlessly as the user scrolls.
Honestly if they're chained to this godawful scroll-based animation garbage they keep pumping out, I'd much rather the understated 2D graphics they're using here. The 3D looks incredibly amateurish to me.
FWIW, by default desktops are configured to just sleep (hibernatemode=0, i.e. suspend to RAM).
Laptops default to persisting memory and sleeping (hibernatemode=3, a sleep which falls back to hibernation in case of complete power loss, e.g. running out of battery). Both can be configured to regular hibernation (hibernatemode=25, persist memory then shut down).
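For reference, this is all visible and adjustable from the terminal - a quick sketch (-a applies the setting to all power sources):

    pmset -g | grep hibernatemode     # check the current mode
    sudo pmset -a hibernatemode 25    # persist RAM to disk, then power off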
So apart from the pleasure of being able to use MacOS, I presume it will be cheaper to just put these parts together myself, yes?
If the form factor is just a tower, well, I can buy a tower myself. What is the advantage of paying Apple a (presumably) large amount of money to build it for you?
If you don't want the hassle of building a PC (which is fair) there are an endless supply of places that will put PCs together for you, with (I'm presuming) a much smaller cut than Apple.
The amount of time I’ve had to fiddle with my audio/video editing beast of a hackintosh tower is quite insane really.
Both necessary and well worth it, as Apple simply wasn’t selling anything appropriately powerful for like five years (also why X99 was so tricky until more recently - no equivalent hardware of theirs). But by any other calculation, if you actually value your own time anywhere close to a professional rate (or see it as a hobby), it will very likely end up costing way more to do it yourself.
I could understand that you would want a proper Mac for work. But I think that time is well worth spending on a personal machine, especially when it is easier nowadays.
Actually those server boards and CPUs from Intel are pretty darn expensive, as well as the fancy RAM etc. For its specs, I think it is technically an OK price, but obviously for almost all other purposes you can custom build a PC that's still really powerful for a fraction of the price. If you know how to Hackintosh, you can run Mac OS too.
But yeah it's to get Mac OS, that's how it's always worked, and the Apple workstations have always been at the high price segment, from the early G3, G4 days too.
I am working on a project right now where I am rendering atmospheric effects (lots of clouds). Even on a 12-core workstation, it takes a very long time to render a single frame to even be able to get a good idea as to what things will look like, so I'm doing a lot of changing something, waiting… changing something, waiting.
Most of the final rendering can be offloaded to a farm, but being able to quickly and interactively get a good idea of what something will look like is extremely helpful and time-saving. Right now, I'm doing some of the work on an EC2 instance with 48 cores (96 vCPU) to help with this—cost-effective for occasional use, but I can certainly see the appeal of a machine like this. And for tasks like compositing, where you need access to a bunch of huge source files, or where maybe you have a specialized video I/O device you need to use, running in the cloud might not be practical at all.
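For the curious, spinning one of those up on demand is a single CLI call. A sketch only - the AMI, key name, and even the instance type are placeholders (c5.24xlarge happens to be one of the 96-vCPU options):

    aws ec2 run-instances \
      --image-id ami-0123456789abcdef0 \
      --instance-type c5.24xlarge \
      --key-name render-box \
      --count 1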
Not for the renderer I am using (Terragen), which has specialized features for rendering skies. I do normally use GPU rendering when it is an option though.
Handling the burden of hardware fragmentation will reduce their quality and velocity. It's not in their DNA and definitely not in their strategy to make this happen.
The Mac Pro obviously shows that Apple isn't planning to completely drop Intel (as in x86) anytime soon. At minimum, new OS releases are going to support x86 for the next 10 years.
At the base configuration (single 8-core Xeon W) with 256GB SSD and 32GB RAM for $6000, it's about twice what it should be. I could build a faster dual Xeon Silver series with more cores, Nvidia graphics, much larger SSD and 32GB RAM in a nicer case for around $3000. So, that Apple logo and macOS are the other $3000.
I love how any post I make that is critical of something Apple always gets me some negative points here on Hacker News - I actually made a point of showing my tech class this social phenomenon a few weeks back and we had a good discussion that explored possible reasons why it happens.
You're getting downvoted for being incorrect, and doing so in a way which has long been cliched. If you actually did your homework and made a spec sheet for a truly equivalent build, you'd learn why so many people are tired of hearing lazy attacks repeated again and again.
The biggest new feature I'm looking for is the removal of the touchbar.
Until then, I’m going to just buy upgraded but used 2012-2013 macbook pros.
It’s one of those things that sounds good on paper but is useless in real life. Not only that, it’s a hindrance for coders like me who need to be able to hit the escape key and the other top-row keys quickly, with tactile feedback.
This is the frustrating part. A huge amount of innovation has been pushed forward in desktop computing in the past 20 years, and every once in a while they get some things wrong and people just can't believe it. Manufacturing and component commitments make "Oh, just remove the touchbar" comments lacking in substance. Same with the headphone jack conversation: no one wants to talk about the trade-offs of making a phone waterproof, or whether they would trade a headphone jack for a waterproof phone. Most people would, probably, which is why it is gone.
Sorry, I'm new here, and half the comments I read are bickering about some feature that someone doesn't think is optimal for how they use their device.
> Same with removing the headphone jack conversation–no one wants to talk about the trade-offs in making a phone waterproof whether they trade a headphone jack for a waterproof phone.
This is not true at all. There are multiple Android phones that are waterproof and still have headphone jacks.
They replaced the headphone jack with a barometric vent which other phone manufacturers just put in a different location. Apple didn't have to get rid of the port. They decided to get rid of the port because they knew that their users would buy the phone anyway and then have more of a reason to buy AirPods. There is plenty of space in that corner though. Some guy wanted to see if he could add the headphone jack back and was pretty successful for a hand crafted solution.
That is nonsense. He butchered that phone and definitely, 100% ruined the waterproofing function of the phone. Additionally, I'm curious which manufacturers you're speaking of because, the last time this type of comment came around, there was a distinct response of "most other phones don't include a barometric vent".
All waterproof phones with barometers would include similar technology. Phones like the Galaxy S7 which was also waterproof and which also had a headphone jack. The video is an individual messing around with a dremel and soldering gun. Of course it's not waterproof at that point, that was never his goal. It's ludicrous to believe that Apple had to remove the headphone jack when other companies showed it can be done just fine without and others have shown just how much space is still available in the phone. Your assertion that it had to be done isn't supported by evidence.
Surely if esc is important to you, you shouldn't be using the esc key as esc in the first place? Caps is both my esc and my ctrl and I wouldn't have it any other way.
If the F keys are important, just configure fn as a hyper key so hyper+1 does F1, etc.
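On macOS specifically, the usual incantation for the caps-as-esc half is hidutil (I believe it needs 10.12 or later); it resets at reboot unless you wrap it in a LaunchAgent:

    # 0x700000039 = Caps Lock, 0x700000029 = Escape (HID usage codes)
    hidutil property --set '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,"HIDKeyboardModifierMappingDst":0x700000029}]}'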
If you're logged in, all you need to do is click on the "9 hours ago" field of the comment. That takes you to the individual comment, fully visible. So it's only one extra click to read.
This unfortunately doesn't work if you're not logged in.