
The market will remain irrational far longer than even the most pessimistic takes.


Absolutely! A few times I have seen the full grandeur of this on the west coast of Tasmania. Highly recommend. Also neat seeing all those satellites flying about after sunset.


It is the closest we have to a freedom-respecting device. The baseband processor is still a closed blob, but it is a lot better than pretty much everything else out there. Maybe the Pinephone will eventually get there as well.


There was that moment when a friend had a Dual G5 PowerMac only to have it trampled in performance by the new Core Duo Mac Mini. That was when we knew Apple had made the right move. I called the G5 tower "Steve's shame"; you could just tell, every time those fans kicked into top gear, that there was a sense of shame that the thing even shipped.

The M1 felt like that all over again. I didn't really get that feeling during the 68K to PPC era but that also wasn't handled as gracefully.


Didn't Apple start shipping water-cooled, overclocked CPUs just to get some sort of a speed bump when the CPUs topped out? Was that the G5 you mentioned?


Yes, Apple shipped the top-tier PowerMac G5s with water cooling, including the top-of-the-line quad-core (2 x dual-core) models. I don't think they were overclocked; they just ran extremely hot at stock speeds.


These topped out at 2.7 GHz. The water coolers ended up developing leaks, for which Apple was sued.


My memories of the 68k to PPC transition are mostly watching the new machines reboot, reboot, & reboot some more.


Your memories are spot on. ;)


And the M1 MacBook was quieter, lighter and I could actually sit it on my lap without worrying whether it would prevent me from having little Scarface74’s.

Apple didn’t come out with high-end ARM Mac laptops until later. Yes, from a laptop - a portable computer - I loved being able to plug it in overnight and use it all day at a customer’s site without worrying about battery life.


> And the M1 MacBook was quieter, lighter and I could actually sit it on my lap

Yeah this is key. In a laptop that’s meant to be used as a laptop, designing for maximum power at all costs isn’t the winning move. M1 was nice in that it’s plenty powerful while also not making the laptop’s fans scream and being able to survive away from an outlet for several hours despite doing “real work”.

In short, balance, which is still surprisingly hard to come by in x86 laptops. If you want a 10 lb monster desktop replacement, a gaming laptop that does the 2016 Apple thing and crams too many watts into too thin a chassis, or a really weak ultraportable with mediocre battery life, you’ve got plenty of options. If you want reasonable performance with great battery life and silence, however, you’re restricted to a tiny handful of options.


On the other hand, if you want balls-to-the-wall performance on a desktop computer, Apple has nothing.


I feel like most people with those needs, aside from a handful of corporate purchasers, are going to be building their own towers anyway, which Apple is going to have a hard time competing with even if they fixed the current issues with the Mac Pro towers.

It would be interesting if Apple took a crack at an M-series HEDT variant that’s allowed to chug electricity freely though, even if that’d be extremely niche.


Apple had an answer for that market with the last-generation x86 Mac Pro.


I don't disagree with anything you said. If you note the comment to which I was replying, I am addressing the specific point that the M1 was a revolutionary boost in performance, which it was not. Apple's Pro stuff (Intel) at the time was a year behind everyone else on specs. When their Pro stuff came out - sorry, but even the M2 isn't "Pro" anything.

What I am doing is comparing top of the line from Apple to top of the line, period. And consistently, Apple is always behind on workhorses - machines to get work done. They are better in your "light quiet slightly longer battery" category. At 50% more cost. The free laptop from work - I want the most powerful portable thing available. For personal life, no way I'm paying the Apple tax. So - no use case in my life. I do always get my wife iPhones and MacBooks though. She teaches languages to little kids and I don't want to spend my time helping her with tech stuff, so the Apple tax is worth it. But never for the hardware - just for the walled garden and Fisher-Price UI for kids.

The Precision is not light, but that's because it's thick metal that you can run over with a car, and extremely durable. No one complained about a ThinkPad being built sturdy. For the use case you describe, an XPS with an i7 at the time was super light, got over 10 hours of battery life if you just did regular office work (no compiling or large data processing), and had similar specs. I used to take that with me when I'd go internationally. It absolutely did not get hot - the key was to turn off Turbo Boost when on battery by setting the maximum CPU state to 99%.
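
For anyone who wants to script that cap instead of clicking through the power plan UI, here is a rough sketch of the idea (assuming Windows and the documented powercfg aliases; the Python wrapper and helper name are just for illustration):

    import subprocess

    def cap_cpu_on_battery(percent: int = 99) -> None:
        # Cap the "Maximum processor state" setting on battery (DC) power.
        # Anything below 100% keeps Turbo Boost from kicking in.
        # Assumes Windows with the standard powercfg aliases; run elevated.
        subprocess.run(
            ["powercfg", "/setdcvalueindex", "scheme_current",
             "sub_processor", "PROCTHROTTLEMAX", str(percent)],
            check=True,
        )
        # Re-apply the current scheme so the change takes effect.
        subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

    if __name__ == "__main__":
        cap_cpu_on_battery(99)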

The thing is, if we compare price points, you could have a Precision for the cost of an M1, and now you have desktop power on the go. It's not super-light, but it's still light and thin enough to put in a shoulder bag and not get neck pain.


> They are better in your "light quiet slightly longer battery" category. At 50% more cost.

The cost is/was comparable to higher-end XPS and ThinkPad models.

> you could have a Precision for the cost of an M1,

It's a perfectly valid choice, but that doesn't mean other people can't/don't have different preferences, which are just as valid as yours.


No one complained pre-2020 about a laptop that had bad battery life and was loud and hot, because both Macs and Windows machines were x86 laptops.

Just like no one complained about the clunky Nomad before the iPod was introduced. Then only CmdrTaco complained (deep cut - it’s a 20+ year old reference).


> an 8-core Xeon
> and the battery went about 8-10 hours

I find that very hard to believe. Also, how much did this "laptop" weigh and how thick was it? Presumably it was in an entirely different market segment and not competing with the MacBook Pro. It's almost like comparing the M1 to a desktop...

The XPS was the equivalent Dell product and I don't recall it being particularly better than a Mac back in 2018-2019.


You can't criticize either direction too heavily. Do you want a fresh technology base or legacy support? There is no wrong answer.


Also they’re in very different positions.

Apple made perfectly good choices but history did not come out in their favor. So they had to keep switching and 3rd parties had to follow (if they wanted to stay on the Mac).

Microsoft ended up on what became the dominant CPU that just happened to continue to rocket up in performance for decades. So although Windows is capable of running on other processors, it was never much of a market and most of the software never came along.


I wouldn't say it just happened to be. There's been an orgy of synergies, mostly but far from exclusively with Windows, that has put and kept x86 at a crossroads of cost, performance, platform openness, and compatibility across vendors and iterations.

I have a hard time imagining what it would look like if MS put all of their weight into an arch transition, but I have a hunch it would come with the most ridiculous compatibility layer we've seen so far.


You’re right. Intel had the money to push the performance of their chips because of all the sales to DOS/Windows users. Still, neither company knew which platform was going to be dominant when they started.

We don’t know if Motorola could have done what Intel did given similar resources. Maybe they could have.

I fear MS will continue to struggle with ARM. The problem with Microsoft is they just don’t control enough big applications. There are just way too many little programs out there that people depend on.

Each time Apple moved the performance difference was enough to cover the cost. Plus it was clear you HAD to switch if you wanted to stick with the Mac.

PC users aren’t going to lose Intel; it will still be a choice. No one can beat Intel’s best on desktop. Unless they can convince Nvidia and AMD to come along with native drivers, games will suck. Anything else that doesn’t move will depend on a very high-speed emulation layer.

Unless battery life can win the day, it’s gonna be a tough fight. And you know Intel is not going to go down easy.


> Still, neither company knew which platform was going to be dominant when they started.

I figure they usually weren't thinking about it that way yet. A lot of what came later was unprecedented.

> There are just way too many little programs out there that people depend on.

Hence the outrageous compatibility layer. I wouldn't put it past Microsoft to make a try at driver compatibility.


Get on it sooner rather than later. Many CRTs were thrown out when they were considered obsolete, and the ones remaining are dying from age.


The most surprising thing, to me, was that for once I got the timing spot on! Usually when I predict things like this I am way off on the timing, but this is right on pace. In that sense it is like an economist predicting a crash: one in twenty will be right.

The question now is how many years until they start moving features behind the paywall, like limits on how many people can be in a message chat, or view limits (à la Twitter/X)? I suspect 3-5 years.


It is hard to tell what we will leave. Depending on how our civilization declines - if it is just the typical path of resource overshoot and decline - I do wonder how much of what we are creating today will last. Digital technology is efficient but lacks resilience. Even our printed materials now are on high-acid paper that essentially turns into sawdust after less than a hundred years.

The things that make it through these periods are the ones seen as useful to the folks in between. This is why we get a lot of religion, the odd bits of science, a lot on growing food, and snippets of history if we're lucky. Heck, for all the might of the Roman Empire, we only have 25 seconds of sheet music remaining. It is also funny how little we know about some of the emperors. Things like: we know they had children, but not their names or whether they survived childhood. The gaps are huge.

The things that survive are the things others think are worth preserving. Hygiene practices, yes! TikTok... no.


> Even our printed materials now are on high-acid paper that essentially turns into sawdust after less than a hundred years.

I've heard this a lot and don't find it credible. I have many books that are from 1 to 150 years old and none have shown signs of turning into sawdust. I even have 40-year-old computer mags that are like new.


At the bare minimum, we have graveyards all over the place with our language, names, number system, religious symbology, etc. carved into stone.


That is a very fair call! The overall champion of physical storage is still stone.


Seeing what folks in the demoscene can do nowadays with such limited hardware makes modern software feel all the more puzzling. I mean, yes, demoscene stuff isn't concerned with ease of development, security, or integration. But it does leave you yearning: think about the possibilities of modern hardware if treated with care.


This is the precise reason I prefer embedded development. The challenge of fitting my entire application into a handful of kilobytes with just a KB or two of RAM is a lot of fun. I get to build a program that runs really fast on a very slow system.

It's a point of personal pride for me to really understand what the machine is and what it's doing. Programming this way is working with the machine rather than trying to beat it into submission like you do with high level languages.

It seems a lot of programmers just see the CPU as a black box, if they even think about it at all. I don't expect more than a couple percent of programmers would truly grok the modern x86 architecture, but if you stop to consider how the CPU actually executes your code, you might make better decisions.

In the same vein, very high-level languages are a big part of the problem. They're so far abstracted from the hardware that you can't reason about how your code will actually behave on any real machine. And then you also have an invisible iceberg of layer upon layer upon layer of abstraction and indirection and unknowable, unreadable code, such that there's no reasonable way to know that your line of code does what you think and nothing else.

Modern software practices are bad and we should all throw away our computers and go back to the 8086. Just throw away the entire field of programming and start again.


I love embedded as a hobby, but God is it a silly statement to imply we should go back to low-level asm/C bs for everything; we would get so little done. Oh, but it would run fast at least.

The problem isn't high-level dev, it's companies skimping on the optimisation process.


That's sort of the opposite of treating the hardware with care. It's all done with no allowances for varying hardware at all. This is like pining for a 70s text editor while refusing to admit the world has moved beyond 7-bit ASCII, and that stuff like Unicode support isn't "optional".


People can use whatever tools they want; but all my code, blog posts and personal notes would work with 7-bit ASCII and a 70s text editor.

The editors I use support unicode and use UTF-8, but if they didn't I'd hardly notice.


After some editing, aye. But you use emojis on your website and you can't even type your first name with 7-bit ASCII. Ö isn't ASCII.


Ah yeah fuck everyone that doesn't use the Latin ascii character set amirite!!!


Great leap of logic.

1. Write a piece of content in ASCII.

2. ???

3. Fuck everyone that doesn't use the Latin ASCII character set.

Not sure what you put in number two in your mind, but I'd be interested to see it.


It'd all blow up the moment you tried to use someone else's code though


Treated with care and with 1000x the development time, budget, etc.

Things are slow because we prefer ease of development for these extraordinarily large and complex projects we call video games.

I think the smart thing really is to work it all out at a high level and then target the slow, considered and expensive fixes to where they're really needed.

I'm not excusing obviously lazy development, but I do think we need to remember how crazy the scope of newer games can be. Imagine showing MSFS2020 to someone from 10-15 years ago: much of the Earth scanned in and available for you to fly over. Of course there are perf hiccups.


Usually that hesitance is because a lot of amateur stuff has been made using Unity. In the right hands it can feel like its own thing, but it is easy to be hesitant after shoveling through so much trash.

