
Funny enough, I predicted this in a previous HN comment 3 months ago when they cut their server systems business.

> "My guess is that they're cutting the portfolio down to feature only chipsets and add-in cards. Just CPUs, GPUs, FPGAs, ASICs, networking. It makes sense if anything - it really focuses on their core business. No more side-projects with questionable profitability. In the last 2 years since Pat Gelsinger took over, they've cut RealSense, Movidius, Optane Memory, IPO'd MobileEye, and now exited the Server Systems business. The only odd-ball left are their NUC business."

If any hedge funds are looking for analysts, I'm always open to offers...

It's sad to see NUCs go, but it was inevitable. They made products for customers, while simultaneously competing with said customers. It's hard to build any decent partnerships on that premise. I'm currently typing this on a Serpent Canyon NUC though, so it's certainly bittersweet.




Sad news. NUCs could have been an excellent platform if they had tried to work a bit more on cooling.

On third-party cases, NUCs were great. But on stock cases, some models were too noisy. Still, they provided great value.

Nonetheless, I agree it's better for Intel to focus on their core business and leave this market to others. Some niche PC makers such as Cirrus7 offer great NUC-like systems.


NUCs, depending on the model, had decent cooling. I've owned a few of them over the years. My Skull Canyon has been my longest-serving desktop computer ever and always had great Linux support. Funny enough, the fan on that unit was recalled and Intel happily replaced it well out of warranty. But with the advent of Minisforum, Beelink and all the other random options on AliExpress, I would gather it's been getting harder and harder to command a premium price because it says Intel on the box. In fact I just ordered a Topton SFF that STH recommended; I was waiting until the i3-N305 was available for a new OPNsense build.

I wonder how this impacts emerging markets for Intel. If they've got no outlets to test their own product uptake and get people excited about the brand, I feel like that's a missed opportunity and the cost of doing business. I saw a lot of NUC devices in data centers in the VMware heyday because they were so widely popular on TinkerTry and Virtually Ghetto (i.e. William Lam) at the time.

I saw security vendors copy that play, making SFF firewalls as incentives for stakeholders to run at home, which turned into 7-figure deals over the long term.

I don't think Gelsinger is doing a lot of good for Intel. You didn't need him to cut costs. He's had plenty of time behind the wheel at this point; Intel should be on a more interesting trajectory by now.


> But with the advent of Minisforum, Beelink and all the other random options on AliExpress I would gather it's been getting harder and harder to command a premium price because it says Intel on the box.

It was my impression the NUC sold by Intel was always meant to be a proof of concept for the NUC-class device to inspire third parties to make them. Intel getting out of the market now that it's proven sort of makes sense.


The Intel NUC was too successful and none of the big commercial players (Lenovo, Dell, HP, etc) ever embraced it as they basically repurpose their Laptop boards and engineering into their "tiny" and "nano" 1L platforms which function like a headless laptop.

None of these players are interested in budget systems either, which is where my attraction to the NUC platform was: Celeron. None of the big players touch Celeron; they only make i3, i5, and i7 systems, all in excess of $600 each.

I can get a Celeron Intel NUC, fully equipped with a Windows license, for less than half that, and it is perfectly suited for line-of-business applications, kiosk machines, or other low-I/O, low-memory, single-application workloads.

Things that could run on a Pi or other ARM SBC if not for the application stack's requirement of the Win32 API and the management integration of Windows (Active Directory, Intune, ConfigMgr), etc.

I see a lot of comments on here about how great the NUC Linux compatibility is. Ironically, the main reason I use the Intel NUC platform in a commercial setting is its Windows compatibility, something a lot of the low-cost, tiny-form-factor SBC platforms (like Odroid, RPi, etc.) lack. I require Windows for my environment, unfortunately.


NUC was always a platform that exposed the latest and greatest in a small form factor. Honestly, I don't believe the form factor itself was the driver, but rather the ability to show off new hardware capabilities in a cheap, small package. People were using these things in the DC not because of the size (in fact that was often an impediment) but because of the capabilities in a small footprint.

Losing this outlet is a loss for Intel's ability to bring new considerations to their platform beyond just chips. Again, if Intel was looking at NUC as a profit center, surely it was smart to shut it down. But I highly doubt the NUC line was a significant revenue generator to begin with, because that wasn't the real value add for Intel. It got people talking about Intel and using it. That is one of the greatest forms of marketing for a chip manufacturer.


Yeah, I thought that was Intel's MO. Intel quit making RealSense only after a competitor (OAK-D from Luxonis) became viable. They just want to make chips. It's my impression that they only build "lower" devices as a proof-of-concept for how their chips can be used.


> if they had tried to work a bit more on cooling

If Intel had 'worked more on [thermals]' we probably wouldn't have the M2 chip.

At nearly every phase of Intel's history they've had the second-worst chips for instructions per watt, occasionally taking over first place.

In a world where a large and ever-growing percentage of all CPUs are not fronted by giant cooling fans... It's like Intel decided what they wanted on their tombstone in 1995 and haven't felt the need to change it since.


Things like the Dell OptiPlex Micro and Lenovo ThinkCentre Tiny machines can provide a similar experience. Generally I have found that they haven't had any major noise issues, but part of that was being very conservative with clock rates.

NUCs aren't exactly the same as those; they are designed to be much more user-upgradable, and the price was very competitive. But even nowadays it is surprising to see them in the wild because they just aren't that common.


This is why I'm so sad about this news: reviews showed the latest generation had much quieter, more pleasant fan noise.


>cooling

I bought the Intel Euclid (basically a Realsense with integrated compute) and out of the box it had a fan curve that would cause it to thermal throttle within 5 minutes, which would also cause the wifi to drop out.

On a regular consumer device you pick a tradeoff that will make it quieter, but really the only thing anyone used the Euclid for was robotics, where if you're using the CPU at all it's to run SLAM or object recognition nets, in which case you need 100% of the performance 100% of the time...


I've been running a NUC7 for several years and the onboard fan has failed four times now.

Replacing it is a 15 minute job, but the timing is always terrible.

Now I have an alert set for when the CPU is heavily thermal throttling or when the temp sensors hit certain thresholds, and I keep a spare fan on hand at all times.
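For anyone wanting to replicate that kind of alert, here's a minimal sketch for Linux, assuming the standard /sys/class/thermal sysfs interface; the 85°C threshold and the logger call are placeholders for whatever alerting you actually use.

    #!/usr/bin/env python3
    # Minimal sketch: warn when any thermal zone exceeds a threshold (values are placeholders).
    import glob, subprocess, time

    THRESHOLD_C = 85  # pick whatever margin you trust for your fan

    def max_zone_temp_c():
        temps = []
        for path in glob.glob("/sys/class/thermal/thermal_zone*/temp"):
            with open(path) as f:
                temps.append(int(f.read().strip()) / 1000.0)  # sysfs reports millidegrees
        return max(temps) if temps else 0.0

    while True:
        t = max_zone_temp_c()
        if t >= THRESHOLD_C:
            # swap in mail/ntfy/whatever alerting you actually use
            subprocess.run(["logger", f"CPU hot: {t:.1f}C - check the NUC fan"])
        time.sleep(60)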


Apple worked hard to make cooling work in the Mac Mini, and now the M series of CPUs.

Intel just slapped a chip in a generic case.


> Intel just slapped a chip in a generic case.

The "NUC" was actually an Intel developed form factor, smaller than Micro ATX. They had to engineer motherboards, power supplies, and cooling for the extremely small footprint. They were available as barebones kits (without a CPU), so literally the exact opposite of what you suggest.


An alternative take is that the NUC is made of laptop parts, with a footprint, thermal capacity and level of bespoke-ness similar to a typical laptop.


My 2018 i7 begs to differ. The cooling was never adequate for that chip, even at 65W.

Now it’s comically overpowered. For 25-35W TDPs the chassis is fine, but Intel couldn’t deliver performance there. Apple should have retooled the case, not slapped it in there anyways. At one point, when I had to use that i7 every day, I debated whether it would be worth it to hack a proper PC tower cooler onto it and operate it outside the case.


I always saw the NUCs as proof of concept for small form factor PCs and to that end I think they have succeeded. There are a ton of different form factors now.


Didn’t Intel use to be pretty up front that the NUC was a way to push computer manufacturers to be more innovative? Sort of like a widely available reference computer.

By making them without RAM and SSD they were never going to be that profitable. It was nice being able to buy boxes without wasted parts you knew you didn’t want.


The problem is that the AMD APU is just better and more balanced now. You get lower TDP plus better graphics. So people would just buy the AMD counterpart.


> It's sad to see NUCs go, but it was inevitable.

Nooooo! This is a tragedy. I love my NUC. It's totally handy to have a little computer on my desk to load up with whatever Linux distro I please and play around. The NUC is way better than a Raspberry Pi.


Isn’t this basically the innovator’s dilemma? You can call it focusing the core business, but you can also call it dropping investment in long term cash flows.


That's not the innovator's dilemma.

The dilemma is when you have tech A: Profitable and large. Tech B: Unprofitable and small, but the tech itself improves at a faster rate than tech A.

GPUs were their true dilemma. GPUs were only good for gamers, who are picky and extremely value sensitive, whereas CPUs could be sold to datacenters and enterprises at very high profit margins. Hence CPUs were core at Intel; no one had the balls to bet on GPUs and go all in.

But CPUs ran out of performance improvements, while GPUs continue to scale up because of parallelism. Then suddenly, new valuable applications started to be based on GPUs, first crypto, now AI. Now CPUs are completely commoditized, Nvidia dominates the money-printing GPU market, and it's worth 8x Intel.

NUCs are not some rapidly improving tech, they are just some minor market that is profitable but never that large.


> NUCs are not some rapidly improving tech, they are just some minor market that is profitable but never that large.

Not everything has to be.


The NUC was never going to be a mass market product.

Sure, Apple sells the Mac Mini. But the entire Mac line is only 10% of Apple’s revenue and most of that is laptops. But Apple has to support its ecosystem as its only supplier. Intel doesn’t.

And besides, a NUC based on x86 chips is by definition going to suck at either performance or heat.


> Sure Apple sells the Mac Mini. But the entire Mac line is only 10% of Apple’s revenue and most of that is laptops

The 10% figure is spot-on and that really threw me - I was expecting something like 25-30%: https://www.apple.com/newsroom/pdfs/FY22_Q4_Consolidated_Fin...

It’s unnerving that Apple is effectively dependent on a single product line to subsidise everything: Apple Silicon must have cost tens (hundreds?) of billions of dollars to get to where the M2 chip is today (including acquisitions) - but it only exists today because of the A-series SoC in the iPhone: without the iPhone, Apple would likely still be dependent on Intel for Mac chips.

The iPhone isn’t going away any time soon - but when/if that day comes, Apple is going to be faced with having to maintain all those iPhone-originated projects with no easy way to walk them down (e.g. switching from Apple Silicon back to Intel - or stock ARM - is going to damage Apple’s credibility for years). I’m concerned Apple might be painting themselves into a corner by continuing to shed businesses it isn’t interested in that might be a useful lifeline in a post-iPhone world, like their XServe hardware - or even their MacPro workstations: Apple has seemingly intentionally priced themselves out of reach of smaller and indie creative-types - while at the high-end (movies, etc) the great migration from FCP-on-Mac to Avid-on-Windows took place almost a decade ago - and the lack of PCI-Ex GPUs in the 2023 M2 MacPro is further evidence, to me at-least, that Apple is increasingly uninterested in the Workstation market - which leaves them with less access to trend-setting professionals - which erodes their ability to compete, long-term, imo.

Intel killing off the NUC is unfortunate, but ultimately the NUC is not Intel’s core business; chipmaking is, and that remains their priority. I don’t blame Intel for doing that; it would be the same thing if Apple killed off their Mac Studio line (they even look alike): it isn’t their (current) core business - but as Apple sheds - or neglects - its old core businesses (i.e. computers) it leaves them with fewer contingency options.


> Apple has seemingly intentionally priced themselves out of reach of smaller and indie creative-types

I don't think this take makes any sense. Sure, you can spec out an insanely expensive Mac today. That's always been the case.

But with Apple Silicon, there's actually never been a better price to performance ratio at the bottom end of their lineup, perhaps in Apple's entire history. An entry-level Mac Mini or MacBook Air can give an indie creator tremendous power and performance per dollar.

I think you're right that Apple has lost a lot of creative professionals over the years who migrated to Windows, and the lack of plug and play GPUs may well keep some of those professionals away.

But I think it's actually precisely the indies that represent Apple's best chance at making inroads into the space again over the long run. If you're a kid trying to bootstrap a new YouTube channel or experimenting with filmography and building a portfolio, it's hard to recommend a better combo than FCP and a Mac Mini for performance and cost.


Except that the bottom end is severely hamstrung for "creators", with just 8GB of RAM and 256GB of storage at half the SSD performance. Fix both and you are at $1000+ for the Mac mini; even worse in non-$ currencies, e.g. 1150€.

The performance of that one is good and great for stuff like video editing, but it is also double the price.

Side note: the Minisforum UM790 with a current AMD laptop chip seems to be competitive with an M2 Pro (similar power and efficiency under load, somewhat worse idle) and you can even get 64GB of RAM while still being much cheaper than the Mac mini. Q: What are the cheapest Macs with 32GB of RAM? With 64GB?


The half rate SSD at the entry point is unfortunate, but it takes the SSD performance from great to just fine. I wouldn't call it a deal breaker, especially at the price-point. If you can't splurge a little higher, you'll be ok.

Really, getting the insanely good media encode engines built into the silicon is the point for a small indie creative shop (which is what we're talking about here). The rest of the M1/2 chip is just gravy for them.


That still leaves the RAM upgrade, and thus 929€ at the current Apple Store price for 16GB RAM, 256GB storage, 1 Gbit Ethernet. Which, again, is an okay deal for a large-sized NUC, but at ~$1000 that is about par for the course.


Again, we're talking about entry level creators. I have been talking purely about the entry level models this whole time. I know what they come with. I know the specs, the upsides, and the downsides.

The entry-level models are great. The best entry-level machines Apple has ever offered. They would slot into any indie creator's workflow very well.

Would I recommend some upgrades if they can afford them? Yes. Some more RAM and more SSD is always a good idea. If they can't afford it? That's fine, the base models are fantastic.


> If they can't afford it? That's fine

Agree to disagree. Even for "entry level creators" the 8GB of RAM is a dealbreaker in my view. That ups the effective base price to 900€. Again, a very good machine, but no surprise that 900€ gets you a good device.


The experience of a low-memory Mac configuration is mostly OK, but lots of people are used to comparing storage and memory directly and have the impression you get 'less for more' when shopping Apple.

In a sense, that's true when you're using badly written non-native software (Slack, Teams, anything Java based ;) ) and you really should try out whether your apps fit or upgrade right away. But even an 8GB Mac will happily run iMovie, GarageBand etc. for a student project or hobby use.


> lots of people are used to comparing storage and memory directly and have the impression you get 'less for more' when shopping Apple.

Yeah, I feel like most people in these conversations are sort of in "grocery store mode" -- sitting in an aisle comparing the ingredients of one jar of mayo versus another jar of mayo from a competing brand.

It's just not that simple anymore with Apple's new SoCs. What Apple is doing with its combo of core performance, unified memory setup, specialized compute blocks on the SoC, and the overall thermal efficiency which allows all of this to just run and run without throttling all that much--it adds up to way more than the sum of its parts. You really have to use it in your daily life to believe it and feel the difference.

I still have a mix of Intel machines and Apple Silicon machines for my work and personal life, and it's just so immediately apparent the latency difference in usage. Apple Silicon feels and runs so much better.

Sure, would I recommend more than 8GB of RAM? Yes. More SSD is better? Of course. But Apple Silicon in even its thinnest entry-level configuration is an auto-recommend for me. And the prices at that entry-level are so, so reasonable for what you're getting.


> What Apple is doing with its combo of core performance, unified memory setup, specialized compute blocks on the SoC, and the overall thermal efficiency which allows all of this to just run and run without throttling all that much--it adds up to way more than the sum of its parts. You really have to use it in your daily life to believe it and feel the difference.

The future that fusion-HSA promised is finally here. Everyone drools over the possibility of PCs getting console-like zero-copy shared memory between the various accelerators, and people want analogous features to be ported onto the current dGPU/CPU paradigm (like directstorage). Fusion-HSA never got there itself, otherwise iGPUs would be able to do zero-copy already, but Apple has done it and the silence is thunderous.

Nobody cares, everyone is just waiting for AMD to implement something competitive and then it'll be cool. Strix Point/Strix Halo I guess.

PC enthusiasts are gonna hate this but I don't think you'll ever be able to really get to a high-performance APU without some kind of soldered memory. Consoles use soldered memory too. GDDR6 vs stacked LPDDR5X is a design call but both are clearly superior to socketed memory, you'd need an Epyc-sized socket to get an equivalent amount of memory bandwidth into an APU, and it'd pull an enormous amount of power for PHYs too. You'd probably end up with like 30w of idle power lol, meanwhile Apple is doing 25W for the whole chip. Crazy stuff.

The practical way that x86 is going to get there is what AMD is doing with Strix/Strix Halo and Intel is doing with Adamantine - you have to go to stacked cache to make up for the lack of memory bandwidth (and perhaps even still go to quad-channel like Strix Halo), and it's still going to use a lot more silicon (expensive!) and use a lot more power than just stacking some LPDDR5X on there and calling it a day. You can't get to 6-8 channels worth of bandwidth from 2 actual channels without some kind of a hack, either you run the pins super fast (GDDR6) or you move the channels on top of the package (LPDDR5X), and both of those need to be soldered. And that's a large part of why consoles and Apple Silicon can deliver a relatively large punch (3060 perf with a good CPU at 25W package power is nothing to sneeze at!) at consumer-friendly prices.
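Some back-of-the-envelope peak-bandwidth arithmetic behind that point (nominal transfer rates only; the configurations are illustrative, not exact product specs):

    # Peak bandwidth ~= transfer rate (MT/s) x bus width (bits) / 8, in GB/s.
    def peak_gb_s(mt_per_s, bus_bits):
        return mt_per_s * bus_bits / 8 / 1000

    print(peak_gb_s(5600, 128))   # 2-channel socketed DDR5-5600:    ~89.6 GB/s
    print(peak_gb_s(6400, 256))   # 256-bit LPDDR5-6400 on package:  ~204.8 GB/s (roughly M2 Pro class)
    print(peak_gb_s(6400, 512))   # 512-bit LPDDR5-6400 on package:  ~409.6 GB/s (roughly M2 Max class)
    print(peak_gb_s(14000, 256))  # 256-bit GDDR6 at 14 Gbps:        ~448 GB/s (roughly PS5 class)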

Low-end dGPUs are done for until stacking gets common, but APUs and low-end dGPUs are such an amazing impedance match for stacking, and people can't see it because it says Apple on the box instead of AMD. It grinds me when people ignore or shit-talk really cool advancements in tech just because it doesn't fit their mold or their brand; this is what everyone has been waiting 10 years for.

> I still have a mix of Intel machines and Apple Silicon machines for my work and personal life, and it's just so immediately apparent the latency difference in usage. Apple Silicon feels and runs so much better.

To be fair, some of this is the fact that it's *nix. If you run Linux on your Intel machines it'll be snappier than windows too, my 5700G SFF build is a crazy machine for linux. But honestly having a well-supported *nix ecosystem with first-class vendor support is a good value offering, I think that's why Apple is having a surprising renaissance with computer-touchers right now. Non-techies get the happy bubble OS, techies get something they can drill down to the terminal and do their thing with dotfiles/zshrc. And everyone likes the fact that they're a well-built laptop with incredible battery life and good performance while mobile.

> Sure, would I recommend more than 8GB of RAM? Yes. More SSD is better? Of course. But Apple Silicon in even its thinnest entry-level configuration is an auto-recommend for me. And the prices at that entry-level are so, so reasonable for what you're getting.

I agree with both you and the commenters you're responding to. If the entry-level models will work for your needs, they're value champs. You will not find a similar value offering to a mac mini at the $400 edu pricing for example, if your use-case fits into 8GB/256GB it blows away anything else at that price point. Often there are some deals at bigbox stores or electronics retailers (B+H, Best Buy, Costco, etc) that offer some decent prices on the higher range stuff as well. A loaded-out M1 Max 16" (MK233LL/A) is $3300 on B+H right now (and it was $3200 a couple weeks ago) and you can find 32GB/1TB manufacturer refurb (applecare-eligible) M1 MBPs on Woot pretty regularly. The refurb store also allows you a lot of the flexibility of custom configuration but especially when combined with the edu/veteran discounts it gets you close to the level of that bigbox or refurb pricing. M1 Max 10C/24C with 64GB/1TB is about $2550 for example, that's a pretty nice machine too.

I would really say that if you're a developer you probably do want at least 1TB storage. A lot of things rely on being able to store docker images/pipenvs/node packages/etc in the expected place, and while I'm sure you can configure them to run on an external, it's a pain, and they aren't configured to "float away to the cloud" like apple does with their first-party apps. A 256gb spec is a thin client/cloud terminal only - used 16GB/256GB MBAs are very very cheap compared to the higher models and I have to think some of it is because people try it and learn the hard way 256GB isn't enough for them. This includes me, and while I could never have gotten to "yes" on a loaded MBP or even a 512GB or 1TB upgrade on the MBA, 16/256 definitely did not work the way I'd hoped even as a homebody with a NAS. They really are aimed at momputers and people deeply into the icloud ecosystem where everything can be silently swapped out on-demand.

That said I definitely do feel the complaint others are making that Apple basically does "product tiering by RAM/storage envy", as I once heard someone call it. Really the difference between a MBA and a MBP or between the different chassis sizes is pretty minimal when you equalize for RAM/storage, a decently loaded 15" MBA is at least $1600 and probably closer to $2k, and that also gets you an entry-level MBP or a refurb loaded (32GB/1TB) MBP. $2k for a laptop is a lot but nobody else has the kind of laptop performance and battery life that Apple does right now, so it's kind of a question of whether you're just looking for the cheapest thing that checks the boxes or if you're looking at the offering holistically. Nothing wrong with an XPS or a Latitude or Thinkpad or whatever either, but if you're spending "premium ultrabook/business laptop/gaming laptop with dGPU money" you can definitely get a real nice macbook too and it both has unique selling points and targets a different set of needs.

And Apple has perfected the art of stacking their tiers perfectly so that you can talk yourself into getting the next higher model. That's why they're the most valuable company on the planet, lol.


> But with Apple Silicon, there's actually never been a better price to performance ratio at the bottom end of their lineup, perhaps in Apple's entire history. An entry-level Mac Mini or MacBook Air can give an indie creator tremendous power and performance per dollar.

The frustrating thing is just how quickly value decays with Apple's pricing strategy.

When it comes to the Mac Mini/Studio, outside of some extremely niche applications, it really only makes sense to get base models. It is infuriating how much Apple charges for additional RAM.


Yeah, I can agree with you that Apple's general pricing ladder is annoying.

But I'll also say as a software engineer, the MacBook Pro 16 with an M1 Max is the best tool I've ever used for my work.

I dock into a thunderbolt setup with a couple high resolution monitors. I have docker running. Often multiple IDE instances with large codebases. Multiple Chrome windows with dozens if not hundreds of tabs. Zoom calls running screen shares. The machine barely gets warm and I never hear the fans. My prior work laptops would have been howling and begging for mercy.

I often forget that I leave applications running, only to remember later "maybe you should kill those processes". With past work machines, I'd routinely have to hunt for processes to kill to claw back CPU cycles and quiet down the spinning fans.

And then I unplug it from the dock and do all of this at the airport for a few hours. Still silent, cool, and performant. I've never seen anything like it before. And while it's not "thin and light" it's also not the heaviest workstation quality laptop I've ever used.

Not sure how this fits into your personal value propositions, but I can tell you I'd pay far more than I did for this quality of a machine.


> but I can tell you I'd pay far more than I did for this quality of a machine.

I'd buy a MacBook Pro myself, sure - except I. need. a. forward-delete. key.

Sorry, but Fn+Backspace just isn't acceptable to me.



> I don't think this take makes any sense. Sure, you can spec out an insanely expensive Mac today. That's always been the case.

Yes, of course - but historically Apple's MacPros (and the G3, G4, and G5 PowerMacs before them) had fairly reasonable entry prices - but in recent years (especially since 2013) the entry price of Apple's workstation-tier machines has risen sharply above inflation:

In 2005, Apple advertised[1] the PowerMac G5 for sale at $1,999 - adjusted for inflation that's about $3,100 today, yet today's equivalent, the Mac Pro, sells at an eyewatering $6,999[2]. In 2013, that $1,999 adjusted for inflation would be $2,400, but the 2013 "trashcan" Mac Pro started at $3,000. Today that would be $3,900 - which itself is roughly half the current $6,999.
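For reference, the back-of-the-envelope CPI adjustment behind those figures looks roughly like this (the multipliers are approximate):

    # Approximate CPI multipliers: 2005->2023 ~1.55, 2005->2013 ~1.20, 2013->2023 ~1.30.
    entry_2005 = 1999   # PowerMac G5 entry price
    entry_2013 = 2999   # "trashcan" Mac Pro entry price

    print(round(entry_2005 * 1.55))  # ~3100 in today's dollars
    print(round(entry_2005 * 1.20))  # ~2400 in 2013 dollars
    print(round(entry_2013 * 1.30))  # ~3900 in today's dollars, vs. the current $6,999 Mac Pro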

As with the Vision Pro, Apple's pricing is intended to limit demand, which accommodates Apple's low-volume, US-domestic manufacturing of the Mac Pro (and everyone else can just get an iMac Pro or Mac Studio) - but this risks making Apple's workstation-class machines so inaccessible they never develop an audience, and the companies that write software for workstation scenarios (oil-and-gas? AutoDesk? etc.) will likely avoid the hassle of porting Win32 or *nix number-crunching software to Apple's hardware. Overall, it feels like Apple is trying to contrive and stage-manage their own workstation swansong - it will be beautiful, but is it wise?

[1]: https://web.archive.org/web/20050228225922/http://store.appl...

[2]: https://www.apple.com/shop/buy-mac/mac-pro



> today's equivalent, the Mac Pro, sells at an eyewatering $6,999[2].

Others have already called you out on this, but are you being intentionally misleading to bolster your point, or just unaware of the Mac Studio's configuration options versus the Mac Pro?

The Mac Studio is effectively the new Pro. It has literally the same chip (Ultra) as the Pro for thousands of dollars less, and also offers another still very performant configuration (Max) for even less money. The Pro now only serves a very small niche that needs PCIe.

If you were to use the Mac Studio in your price comparisons, your point basically evaporates.


That’s because most people who bought these machines 20 years ago now just buy laptops which are perfectly sufficient for their use cases.

Even if there was a $3,000-$4,000 desktop Mac Pro, I doubt many people would buy it instead of an MBP.

If you need a laptop anyway, what’s the point of getting another machine which is just marginally faster? That wasn’t really the case 10-20 years ago, when laptops weren’t really an option as a primary machine for demanding use cases.


> If you need a laptop anyway, what’s the point of getting another machine which is just marginally faster? That wasn’t really the case 10-20 years ago, when laptops weren’t really an option as a primary machine for demanding use cases.

Laptops, even those marketed as "mobile workstations", really can't compete with a proper desktop experience - yes, a docking station goes a long way toward replicating connectivity options, but (in my life, at least) there are far too many qualitative and quantitative benefits to having a "proper" desktop for dev.

----

A major point for me is that I treat my laptops and portables as though they could/will go missing the next day - which means I'm careful to avoid putting irreplaceable data on my laptop (and have BitLocker enabled, which does noticeably impact disk IO, even today) - whereas my desktop is a different, and more trusted, environment. I'm not going to compromise my daily computing experience by using a throwaway-ish environment.


> Laptops, even those marketed as "mobile workstations", really can't compete with a proper desktop-experience

But that's what I've found remarkable about the MacBook Pro 16 with the M1 Max. It absolutely does replicate a desktop computing experience with smooth performance... and it does it even on battery power. As I said in my other post I've just never seen anything like it. It's beyond benchmarks to have an experience that just never hiccups or stalls on a laptop.

> A major point for me is that I treat my laptops and portables as though they could/will go missing the next day

You might need to recognize that you're the outlier in these conversations, then. The world has largely moved to laptops. Only those that truly need a desktop are issued one these days (and again, with the Max / Pro chips, even the need should be called into question for most use cases). Especially with hybrid work policies, I can't really imagine any modern corporate office issuing you a big old tower on your first day.


You really think most developers in 2023 are using desktops?


Honestly, I don't know - but I'm not willing to bet either way: don't forget there's a _huge_ contingent of people at companies of all sizes who figured out VBA in Office by themselves and write software internally at work, who don't identify with us, the HN fringe, and who only code in the office, on a company-provided desktop.

I've only ever seen (and experienced first-hand) software companies and startups issuing laptops as standard instead of desktops twice in my whole career; all other companies I've either worked for, or worked with, preferred desktops.

At my current company we interface with a lot of independent contractors and I have noticed that exclusive laptop use is far more common there - but it's still at-odds with my own personal experience.


I work with a lot of enterprisey companies and state and local governments (cloud consulting department at BigTech) - everyone has a laptop.

If you don’t recall, there was a worldwide pandemic a couple of years ago and a lot of people started working remotely.


I would say the entry level workstation line today is Mac Studio (starting at $1999), not Mac Pro, which is strictly top end. The $6999 model of today is not equivalent to the old entry models in any way, unless you just go by bulkiness.


Why "not equivalent"? Both are top-tier workstation options of their times.

But... the UX of the PowerMac G5 is a lot more pleasant and everything feels a lot more responsive than on modern Pros. Probably because of the lack of signature verification, SIP, and RPCs to Apple servers. But still, those machines _feel_ better.


> Why "not equivalent"?

The Mac Pro is just a Mac Studio with a "PCI Express expansion chassis" bolted on, which is of no use to most people. The price says Apple doesn't want to sell a lot of those machines; they probably only fitted an M2 CPU in an old Mac Pro chassis to tick off the 'entire range migrated to Apple silicon' promise.


> But... the UX of the PowerMac G5 is a lot more pleasant and everything feels a lot more responsive

Heh, that reminds me of when a friend of mine invited me over to show off his hand-restored Mac (early-1990s-ish - I think it was a Performa 520 or 575?) and despite the lack of double-buffered graphics there were hardly any painting artifacts, but the most striking thing was just how smooth and responsive everything felt - obviously the fact it's a CRT helps a lot.

When the world moved away from computer CRTs to TFTs we also went from 70-85Hz to 60Hz everywhere - and I swear I definitely can "feel" 60Hz vs. 85Hz - so I'm looking forward to monitors gradually shifting towards 120Hz or 144Hz (or higher?) because that definitely helps with responsiveness and snappiness, even with double-buffering and desktop composition.
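To put rough numbers on it, the frame-to-frame period alone drops quite a bit as refresh rates climb (input latency and composition add on top of this):

    # Frame period in milliseconds for common refresh rates.
    for hz in (60, 85, 120, 144):
        print(hz, round(1000 / hz, 1), "ms")  # 60 -> 16.7, 85 -> 11.8, 120 -> 8.3, 144 -> 6.9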


An entry-level cheese grater or trash can definitely wasn’t top-end for its time. Mid-range parts in a top-end case, sure.


> indie creator tremendous power and performance per dollar.

Not with 8GB of RAM, and Intel and AMD have caught up on performance per dollar. Apple is mainly competing on performance per thickness/battery life/fan volume, which are still huge selling points.


> It’s unnerving that Apple is effectively dependent on a single product line to subsidise everything

From 1977-1988 Apple was totally dependent on the Apple II line (the Mac was losing money).

From 1992-2003 Apple was totally dependent on the Mac (the iPod only became a meaningful contributor to revenue when iTunes came to Windows).

When Apple depended on the Mac for revenue, they consistently failed to keep up with Intel - first with the 68K and then with PowerPC.

If the iPhone fails, the Mac on its own would never keep up, no matter what.


The one major flaw I would like to point out here: don't look at development on Apple Silicon as only development on Apple Silicon. They base the architecture on their A-series processors in their phones, but scaled up. So all the research money for Apple Silicon is also research money for their in-house A-series processors for their phones.


Yes, that’s my very point.


> which leaves them with less access to trend-setting professionals - which erodes their ability to compete, long-term, imo

Those people all have MacBook Pros or Mac Studios. The Mac Pro has been a niche of a niche item for a decade.


And if Apple “fails” with “trendsetting professionals” which is only a tiny market and the Mac itself is only 10% of Apple’s revenue, does it matter?

The entire global PC market is around 286 million:

https://www.gartner.com/en/newsroom/press-releases/2023-01-1....

Compared to 1.5 billion+ phones

https://www.sellcell.com/how-many-mobile-phones-are-sold-eac...

“Trendsetting professionals who buy PCs” don’t even move the needle.


> And if Apple “fails” with “trendsetting professionals” which is only a tiny market and the Mac itself is only 10% of Apple’s revenue, does it matter?

Only in that Apple would need a platform to develop for iOS. But at that future Mac failure point that dev platform would just be iPadOS or something. I would be disappointed if Macs went away - Windows is garbage and no Linux desktop experience is as good (for me) as the Mac's - but Apple wouldn't really notice in terms of revenue.

However I don't think there's any foreseeable reason for the Mac to go away. Most of Apple's development effort is shared between macOS and iOS, with even more iOS things being ported to the Mac side. Even hardware development is shared between the platforms and will be shared with the Vision platform. Even with 10% of the company's revenue, the Mac is nowhere near 10% of Apple's OpEx, so it definitely makes more money than it costs to develop and maintain.


> Windows is garbage

I accept that as an ex-MSFTie I am biased - but I'm honestly curious what Windows did to you that put you off it. Can you share?

(Though I agree Windows 11 is ... not good, so I'm sticking with Windows 10 - though I am starting to experiment with a Slackware desktop too).


As an ex-Appler I'm a bit biased as well. I've used every version of Windows since 3.1 and Windows 8 was the last straw for me.

The Windows 8 UI was schizophrenic, and a lot of effort was made to force you to use the newest, most broken UI. I even got a touch-capable laptop and could rarely use just Metro/Modern apps because they were crippled, but the classic UI is just unusable with touch. I could only comfortably use Windows 8 with the third-party Classic Shell, avoiding any Metro UI app.

Windows 10 walked some of those missteps back but then became incredibly infuriating with its automatic updates. I tend to leave machines running for weeks or months, sometimes asleep and other times not. Despite me telling Windows not to do updates automatically, I'll come back to a machine sitting at the user login window. All my context now gone.

Or, more fun, Windows has updated in the background and I shut down or reboot without realizing it. Depending on the updates, I have to wait some unbounded length of time just to use the system again.

Windows sleep support is also still atrocious on most machines I've used. My MacBook I can put to sleep and remove the charger and leave it for a week and it'll have battery left to do work. My Windows work laptop I left asleep but unplugged and the battery was dead after a couple hours.

That's all on top of UI/UX issues I don't like because I've been using OS X for over twenty years. Mac keyboard shortcuts and trackpad gestures are second nature to me now and using Windows is jarring. Ctrl as a modifier key is unergonomic vs. the Command key. Windows' trackpad gestures, especially app switching, are uncomfortable. All of the UI fades and pop-up previews are too distracting and there's rarely a safe place to park your cursor to read something. Everything wants to fade in some tooltip or context toolbar under the cursor.

I'm fine if people like Windows but for me it's just frustrating to use. Unfortunately my work laptop is Windows now and I'm constantly annoyed with its UX problems.


In a way it would be logical for Apple to get out of the computer business and scrap all MacBooks and Mac Minis. That they only have around 10% of the market means that most people use iPhones with non-Apple computers anyway. They could focus on making that experience even better.


> It's sad to see NUCs go, but it was inevitable. They made products for customers, while simultaneously competing with said customers.

That is true, but they're already a decade into shipping NUCs. Maybe it wasn't a problem after all. It could also just be a move to show the stock market that they're focused on getting leaner.

RIP NUC :'(


You can then argue that they've spent the last decade slipping into irrelevance, so perhaps the NUC focus was a distraction from their core issues. They're at an inflection point in the company where either they turn the ship around and successfully become the second largest foundry, closing their technology gap with TSMC, or they continue losing market share. They can't fund new fabs and new nodes without a massive reduction in their cost center.


> It's sad to see NUCs go, but it was inevitable.

No business decision is inevitable and we should stop acting otherwise.


> If any hedge funds are looking for analysts, I'm always open to offers...

How would you invest based on this prediction?


Depends on the portfolio and strategy, but personally I think Intel is a Buy. They're burning money building new fabs with little indication that they can execute on their node or product roadmaps, but they've really taken a chunk out of their cost centre, to the point where even during one of their worst quarters in the last 30+ years in terms of sales, they're still turning a profit. The Sapphire Rapids Xeons are finally out and making money after 3+ years of delays. The client products are the best they've looked in 15 years. They've got some big customers lined up for IFS. Considering the market was happy to price them at ~$64 just two years ago without all that information, I believe $33 is a poor reflection of where they're at. That's near enough their asset price.


A bit off the mark on networking, looking at Tofino there. Could have maybe left Barefoot alone, but no: buy it, hype it, actually tape out and start delivering the next gen, then cancel before people rack them.


I was thinking more of their Ethernet and wireless product portfolio than the Tofino line: https://www.intel.com/content/www/us/en/products/overview.ht...

With regards to Tofino, I think they've looked at the portfolio and realized they can fill that niche with their FPGA products. Looks like they've stripped out all the useful IP from Barefoot and repackaged it to be used on Stratix/Agilex chips, or as tiles on their other devices.


I don't want to be a pain, but I'm still waiting on a 200G (QSFP56) or 400G (QSFP-DD or QSFP112) Ethernet NIC with some kind of user-programmable packet pipeline (and no, an FPGA isn't a panacea here, for many reasons). ConnectX (from 5 onward) is king there, and of course it's all closed. You also get Broadcom stuff there. But no Intel.

Oh, you can build crazy 800G (or up to 1200G) stuff (that doesn't fit in PCIe bandwidth) but the entry price is quite steep, see https://www.reflexces.com/pcie-boards/intel-stratix-10-fpga/...

The really interesting part of the Tofino line was the advent of P4, for smart network switches and eventually some in-network processing, scatter-gather support, or all-reduce. Some competition for NVSwitches, but on standard Ethernet, for example.


> then cancel before people rack them

It's worse than that. We built Tofino powered boxes. We have multiple paying customers who bought and deployed them. Intel then cancelled it, screwing us and our customers.


As an everyday user of an Intel Compute Stick I can empathize with this.

I think you're wrong about the Intel portfolio though; Arc GPUs are another line that is going to be unceremoniously cut in a few years.

I mean I could be wrong, but Intel has gotta be looking at Nvidia's Hopper or AMD's MI300 data-oriented architectures and wondering why the Arc team is trying to make $750 gaming computers. Xe GPUs are going a different direction from consumer Arc GPUs already anyways.


> I mean I could be wrong, but Intel has gotta be looking at Nvidia's Hopper or AMD's MI300 data-oriented architectures and wondering why the Arc team is trying to make $750 gaming computers.

That's the same strategy AMD has tried with ROCm (de-facto pro-only compute ecosystem) and it hasn't worked. Nobody is going to buy a $5000 accelerator card (or even rent AWS time) just to tinker with your ecosystem unless it's already known to be the bees' knees. NVIDIA built their success by making sure their cards were the first thing people reached for when they wanted to build a high-performance application on a compute accelerator.

Further, there's a lot of redundant work and overlapping lines of business here. You have to develop game drivers for the iGPU for anyone to take you seriously, so at that point why not make the dGPU cards and increase the ROI multiplier of your work? Or are you planning on outsourcing the whole shebang and just licensing a hardware SIP core and a driver stack? There's really only two other names with a credible Windows/Linux driver stack... NVIDIA and AMD. Imagination/PowerVR don't have any Windows presence, and Intel already tried this back in the early Atom era and it fucking sucked.

Like yea you've actually pointed out the exact reason they won't cut it: Hopper or MI300 is where the HPC and performance-compute segments are going and you can't be credible in that space without an internal solution for the accelerator half. It doesn't have to be GPUs, or they don't need to have graphics pipelines, but once you are doing all the work to develop the GPGPU side, you might as well make a variant with a pixel pipeline and display outputs and sell it to enthusiasts too.

The semiconductor space has a lot of these overlaps in product verticals, and if you choose not to be in them, you're leaving revenue on the table that is relatively "cheap" to access. The same CPU uarchs that work in enterprise work in consumer, and in the grand scheme of things re-using the same uarch for a low-margin product is fine. You'd never pay to develop the product from scratch just for the consumer market, but once you've done the work, you might as well sell it to consumers too. If you're doing consumer, you probably want to be doing laptop too, but those need iGPUs. Which need drivers, and if you're doing drivers you might as well do dGPUs too.

Intel doesn't need to be exactly AMD, but AMD has gone through this and already cut to the bone on everything they didn't need. You can draw a circle around the product verticals they're in and they've pretty much identified the core business requirements for the CPU/GPU markets.

You're right though that Gelsinger is obviously stripping the business down to his own vision of those essentials. I just think right now the evidence indicates that GPUs are still a part of that vision. You can't be competitive in HPC without GPGPU, you can't do laptop without iGPU, they might as well do gaming GPUs too, especially since a lot of the GPGPU research will overlap. The drivers don't, but you have to do them for iGPU anyway for laptops.

The fabs are the rough one though. They're expensive as shit, but all Intel's legacy IP uses them, and the only thing the fabs run is Intel's IP. I think he's serious about building the wall between IP and fab at Intel, and about bringing in external business, but right now there is no hope of even a GloFo-style spinoff working. It would absolutely take down both halves of the company to even try.

He's definitely got a tough road, Intel's pain is only just beginning and it's going to be a long time to profitability.


> If any hedge funds are looking for analysts, I'm always open to offers...

$INTC has done a whole lot of nothing for the past 9 months; I don’t think any hedge funds care, short or long.


I just said that as a joke; you couldn't get me into finance unless you had a gun to my head. I do think that, long term, Intel makes for an interesting investment. It sits at about the 70th percentile for stock volatility for the year, which for a name like Intel is quite surprising. Analysts put it somewhere between -50% and +100% by this time next year. I've read of hedge funds making big bets on way smaller spreads than that. They're up 25% YTD, so it's by no means "nothing".


> They made products for customers, while simultaneously competing with said customers. It's hard to build any decent partnerships on that premise.

I once talked to an AMD engineer and asked why they didn't just build a barebones NAS chassis with some of their Ryzen Embedded stuff, since Ryzen is quite popular but there are very few Ryzen products in that segment. Obviously QNAP and Synology have some decent demand. That's basically what they said: they didn't want to compete with customers.

GPUs are sort of the opposite example where I don't really feel any particular attachment to MSI, Gigabyte, Asus, PowerColor, etc. EVGA and Sapphire are the only ones that have managed to claw together some consumer mindshare through their warranties. But the rest are essentially customizing an AMD/NVIDIA reference design PCB and a cooler that's perfectly forgettable and interchangeable with any other offering in their price class. They're not even allowed to do double VRAM configs anymore because that would cut into Radeon Pro and Quadro revenue etc. In that scenario I don't see a lot of value to having another middleman taking a 10% cut, and to me the value of products being available at MSRP, with no diffusion of blame through the supply chain, would be worth losing the partners over.

And while systems integration (NUCs, servers, etc) are clearly something where there's a lot of value from this kind of diversity, motherboards really are not. AMD and Intel have both killed off third-party chipsets like nForce, Abit, etc, and locked everything down to their one ecosystem they control, much like GPUs. It's not quite as pronounced as GPUs, and previously there was a lot of diversity in features, but PCIe 5.0 motherboards tend to reverse this, almost every board has exactly 2 PCIe slots and more or less the exact same featureset elsewhere. And the trend is towards more and more being onboard the CPU itself anyway (AMD chips the chipset is literally just an IO expander and is completely uninvolved in management tasks) and once onboard voltage regulators (FIVR, DLVR, etc) really start to take off in the next 10-15 years the motherboard is only going to become dumber and dumber. And in that world there's less and less of a need for a partner anyway - the CPU is all self-contained and locked down anyway, partners can't experiment and do cool things, they are a dumb pipe that pumps in 2V for the DLVR to step down at point-of-use. So why pay a 10% margin for their "value-add"?

It slays me that AMD is talking a big game about "open platform" and everyone is still locking down third-party chipsets and even the Platform Lock. If you want to lock the chip to the board for security, fine, but locking it to the brand doesn't do anything except ruin the secondhand market. Oh no I have to swap it out with another lenovo chip if I steal it, what exactly does that accomplish? And if you accept the conceit of the PSP there's no reason it has to be permanent anyway, you can allow the PSP to unlock it instead of permanently blowing e-fuses. And again, third-party chipsets are where a ton of innovation happened, it's better for the market if third parties can make those double-VRAM models and undercut AMD/NVIDIA's ridiculous margins on those workstation products, or shanghai a Celeron 300A into being a dual-socket system like Abit. What we have right now is tivoization in support of product segmentation, branded as a "security feature".


Also, to be clear, the Management Engine is far, far worse. The chipset boots the chip for Intel; Ryzen is an SoC, always. For Intel, it's super intimately involved with processor bringup in ways that can't be exposed to third parties anymore. They literally can't open anything while the ME is on the chipset, and they're flailing at homeostasis, let alone big rearchitectures of their processor's brainstem in service of third-party control of the bringup.

The chipset is a pure IO expander for AMD. AMD still is doing way better at that, it's just things like X300 ("the chipset is no chipset") being restricted to industrial/embedded, or board partners not being allowed to pursue things they want that break AMD's segmentation. PCIe 4 enablement (including opt-in) on select X370/X470 boards was something partners wanted for example. And there was no technical reason for X399/TRX40 to be segmented and even WRX80 could have been shoehorned on with "everything works just not optimally" level compatibility backwards and forwards. Partners could have done that if it wasn't denied/locked out. They did it on the Socket SP3 flavor.

Partners should ideally just get the freedom to play, and if they can make something work, cool. Let's have more Asrock/Asrock Rack and ICY DOCK design shenanigans again. Clamshell VRAM cards should be sold relatively close to actual cost rather than being gated by both AMD and NVIDIA. Etc. Partners should have the ability to configure the product in any way the product could reasonably be engineered to work. If features are being explicitly segmented by product tier it should be enforced by e-fuse feature-fusing at launch and that's the deal, no taking AVX-512 out after it launched.



