Hacker News | Tsarbomb's comments

Are you suggesting a general solution to the halting problem?


The usual solution, including in safety critical systems, is to give the judge of the halting problem a stopwatch and a gun.

For example, in an embedded system: a watchdog timer that you stop servicing during the execution of some context. If you fail to complete your task within the time limit, the kernel or entire system is rebooted.
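For illustration, here's a toy version of that pattern in plain Rust (not any real RTOS or hardware watchdog API): a monitor waits for the worker to check in, and declares failure if the deadline passes first.

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Toy sketch of the watchdog pattern: the worker "services" the watchdog
// by sending on a channel when its task completes; the monitor declares
// failure if the deadline expires first. A real system would reboot here.
fn watchdog_fires(task_takes: Duration, deadline: Duration) -> bool {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        thread::sleep(task_takes); // the task executing in some context
        let _ = tx.send(());       // service the watchdog on completion
    });
    // true => the task missed its deadline
    rx.recv_timeout(deadline).is_err()
}

fn main() {
    // Task finishes well inside the deadline: watchdog stays quiet.
    assert!(!watchdog_fires(Duration::from_millis(10), Duration::from_millis(500)));
    // Task overruns: watchdog fires.
    assert!(watchdog_fires(Duration::from_millis(500), Duration::from_millis(10)));
}
```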

For example, in a VM-like system, you give the code some amount of "fuel" or "budget"; if it exceeds that budget, the process/tasklet/whatever is terminated by the VM.
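The fuel approach can be sketched in a few lines (this is a toy evaluator, not any particular VM's API): charge one unit of fuel per step and bail out when it runs dry.

```rust
// Minimal fuel-metering sketch: a loop whose runtime we can't predict
// (Collatz iteration), bounded by an explicit step budget.
#[derive(Debug, PartialEq)]
enum Outcome {
    Done(u64),
    OutOfFuel,
}

fn run_with_fuel(mut n: u64, mut fuel: u64) -> Outcome {
    while n != 1 {
        if fuel == 0 {
            return Outcome::OutOfFuel; // budget exhausted: terminate the task
        }
        fuel -= 1;
        n = if n % 2 == 0 { n / 2 } else { 3 * n + 1 };
    }
    Outcome::Done(n)
}

fn main() {
    // Starting at 27 takes 111 steps to reach 1, so a budget of 10 is cut off...
    assert_eq!(run_with_fuel(27, 10), Outcome::OutOfFuel);
    // ...while a budget of 200 lets it run to completion.
    assert_eq!(run_with_fuel(27, 200), Outcome::Done(1));
}
```

The point being: the metered program never gets a chance to run forever, regardless of whether anyone could prove it halts.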

It's not a general solution to the halting problem, but a practical one.


Funnily enough, ChatGPT can examine code and say whether it will halt or not, in some small cases.


Hard disagree. While they had lost a ton of ground, people were still sporting BlackBerrys in large numbers. The absolutely boneheaded move they made was continuing to promote and sell their legacy models front and centre, cannibalizing their own future growth.

There were other issues that were much smaller in the bigger picture, like Android app support on BB10 coming a little late, and the devices in general having a somewhat underpowered SoC. All of these contributed to slow adoption, but the fact that they were promoting new models of the Bold while their warehouses were full of the Z10 was really what did them in.


> The absolute bone headed move they did is they continued to promote and sell their legacy models front and centre cannibalizing their own future growth.

That's kind of exactly what I meant, though.

Nokia made an almost identical mistake, before the Microsoft acquisition, by not providing a timely migration path from Symbian to MeeGo. The old business was a cash cow, so they kept it going, not taking the new thing seriously.


> promoting new models of the Bold while their warehouses were full of the Z10

Not being familiar with the BB lineage, I checked the wiki for the release dates and ... wow.

Not quite off topic: people bash me when I point out that the Burning Platform memo was the result of Nokia's previous shenanigans, so blaming it all on Elop is like blaming the fire for charring the steak you forgot on the BBQ.


The government giving money to corporations without taking ownership is communism now? I think you may be thinking of nationalization and some other things adjacent to that...


Check out Bevy. https://bevyengine.org/ It has the best ECS design I've ever seen in a game engine. Super fun to use. There's currently no editor and you need to wrap your head around ECS, but it will definitely push you to improve your Rust skills.

Edit: Also it can be compiled for WASM.
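For anyone who hasn't met ECS yet, the core idea fits in a few lines. This is not Bevy's actual API, just a hand-rolled toy: entities are plain ids, components live in separate storages, and a "system" is a function that runs over every entity that has the components it needs.

```rust
use std::collections::HashMap;

type Entity = u32;

// Each component kind gets its own storage, keyed by entity id.
#[derive(Default)]
struct World {
    position: HashMap<Entity, f32>, // 1-D position component
    velocity: HashMap<Entity, f32>, // velocity component
}

// A system: for every entity with BOTH components, integrate position.
// Entities lacking a velocity component are simply skipped.
fn movement_system(world: &mut World, dt: f32) {
    for (entity, vel) in &world.velocity {
        if let Some(pos) = world.position.get_mut(entity) {
            *pos += *vel * dt;
        }
    }
}

fn main() {
    let mut world = World::default();
    world.position.insert(0, 0.0);
    world.velocity.insert(0, 2.0);
    world.position.insert(1, 5.0); // no velocity: untouched by the system
    movement_system(&mut world, 0.5);
    assert_eq!(world.position[&0], 1.0);
    assert_eq!(world.position[&1], 5.0);
}
```

Bevy layers scheduling, queries, and archetype storage on top of this idea, but the mental model is the same: data in components, behavior in systems.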


What? DOTS is great and extremely performant. CS2 is completely the fault of the developers.

The real issue with Unity is that they don't double down on DOTS + HDRP/URP. I haven't opened the editor for a few months, but the last time I worked in Unity the legacy render pipeline was still the default, and documentation around DOTS and HDRP was somewhat lacking, which is a shame, because once you get it working, it works super well.


DOTS isn't just ECS. DOTS was supposed to be a Data-Oriented Tech Stack. Currently it only has ECS with some extra bits. Ergo, DOTS is an unfinished mess. And that unfinished mess -- along with a mandate to release for Game Pass -- culminated in CS2 suffering from utterly ridiculous issues.


Westinghouse is owned by Brookfield Renewable Partners and Cameco.

Cameco is the second-largest uranium producer in the world. Brookfield Renewable Partners owns and operates a lot of power generation. Both are Canadian, and Canada has a huge pool of both power generation and specifically nuclear power generation talent and experience. As an example, until 2016 Canada had the largest operating nuclear power facility in the world, several provinces are currently in the planning phases of SMRs, and Canada will be financing CANDU reactors in Romania.

It's safe to say they see some sort of advantageous vertical integration of the supply chain, as well as believing the regulatory and economic outlooks are good.


CANDU reactors are pretty spiffy. Can run on unenriched natural uranium, or even uranium mixed with thorium or plutonium. Design also comes with a lot of passive safety features.


I hard disagree. The chassis and cooler designs of the old Intel-based Macs sandbagged the performance a great deal. They were already building a narrative for their investors and consumers that a jump to in-house chip design was necessary. You can see this sandbagging in the old-chassis Apple Silicon MBPs, where performance is markedly worse than in the newer chassis.


That doesn’t make sense: everyone else got hit by Intel’s failure to deliver, too. Even if you assume Apple had some 4-D chess plan where making their own products worse was needed to justify a huge gamble, it’s not like Dell or HP were in on it. Slapping a monster heat sink and fan on can help with performance but then you’re paying with weight, battery life, and purchase price.

I think a more parsimonious explanation is the accepted one: Intel was floundering for ages, Apple's phone CPUs were booming, and a company which had suffered a lot due to supplier issues in the PowerPC era decided that they couldn't afford to let another company have that much control over their product line. It wasn't just things like the CPUs falling further behind but also the various chipset restrictions and inability to customize things. Apple puts a ton of hardware in to support things like security or various popular tasks (image & video processing, ML, etc.) and now that's an internal conversation, and the net result is cheaper, cooler, and a unique selling point for them.


> net result is cheaper, cooler, and a unique selling point for them

That, and they are not paying for Intel's profit margins either. Apple is the quintessential example of vertical integration: they own their entire stack.


I was thinking of that as cheaper but there’s also a strategic aspect: Apple is comfortable making challenging long-term plans, and if one of those required them to run the Mac division at low profitability for a couple of years they’d do it far more readily than even a core supplier like Intel.


Apple doesn't manufacture their own chips or assemble their own devices. They are certainly paying the profit margins of TSMC, Foxconn, and many other suppliers.


That seems a bit pedantic; practically every HN reader will know that Apple doesn't literally mine every chunk of silicon and aluminum out of the ground themselves, so by default they, or the end customer, are paying the profit margins of thousands of intermediary companies.


I doubt it was intentional, but you're very right that the old laptops had terrible thermal design.

Under load, my M1 laptop can pull similar wattage to my old Intel MacBook Pro while staying virtually silent. Meanwhile the old Intel MacBook Pro sounds like a jet engine.


The M1/M2 chips are generally stupidly efficient compared to Intel chips (or even AMD/ARM/etc.)... Are you sure the power draw is comparable? Apple is quite well known for kneecapping hardware with terrible thermal solutions, and I don't think there are any breakthroughs in the modern chassis.

I couldn't find good data on the older MBPs, but the M1 Max MBP used 1/3 the power of an 11th-gen Intel laptop to get almost identical scores in Cinebench R23.

https://www.anandtech.com/show/17024/apple-m1-max-performanc...


> Apple is quite well known for kneecapping hardware with terrible thermal solutions

But that was my entire point (root thread comment.)

It's not that Apple was taking existing Intel CPUs and designing bad thermal solutions around them. It's that Apple was designing hardware first, three years in advance of production; showing that hardware design and its thermal envelope to Intel; and then asking Intel to align their own mobile CPU roadmap, to produce mobile chips for Apple that would work well within said thermal envelope.

And then Intel was coming back 2.5 years later, at hardware integration time, with... basically their desktop chips but with more sleep states. No efficiency cores, no lower base-clocks, no power-draw-lowering IP cores (e.g. acceleration of video-codecs), no anything that we today would expect "a good mobile CPU" to be based around. Not even in the Atom.

Apple already knew exactly what they wanted in a mobile CPU — they built them themselves, for their phones. They likely tried to tell Intel at various points exactly what features of their iPhone SoCs they wanted Intel to "borrow" into the mobile chips they were making. But Intel just couldn't do it — at least, not at the time. (It took Intel until Alder Lake, at the end of 2021, to put out mainstream CPUs with E-cores.)


The whole premise of this thread is that this reputation isn't fully justified, and that's one I agree with.

Intel for the last 10 years has been saying "if your CPU isn't at 100°C then there's performance left on the table".

They also drastically underplayed TDP compared to, say, AMD, by quoting an average TDP with frequency scaling taken into consideration.

I can easily see Intel marketing to Apple that their CPUs would be fine with 10W of cooling, with Intel knowing that they wouldn't perform as well, and Apple thinking that there would be a generational improvement in thermal efficiency.


>Under load, my M1 laptop can pull similar wattage to my old Intel MacBook Pro while staying virtually silent. Meanwhile the old Intel MacBook Pro sounds like a jet engine.

On a 15/16" Intel MBP, the CPU alone can draw up to 100W. No Apple Silicon chip except an M-series Ultra can draw that much power.

There is no chance your M1 laptop can draw even close to that. The M1 maxes out at around 10W; the M1 Max maxes out at around 40W.


Where do you get the info about power draw?

Intel doesn't publish anything except TDP.

Being generous and taking TDP as actual consumption: most Intel Macs shipped with "configurable TDP-down" specced chips, ranging from 23W (like the i5-5257U) to 47W (like the i7-4870HQ). (Note: newer chips like the i9-9980HK actually have a lower TDP, at 45W.)

Of course TDP isn't actually a measure of power consumption, but the M2 Max has a TDP of 79W, which is considerably more than the "high end" Intel CPUs, at least in terms of what Intel markets.


Check here: https://www.anandtech.com/show/17024/apple-m1-max-performanc...

Keep in mind that Intel might ship a 23W chip, but laptop makers can choose to boost it to whatever they want. For example, a 23W Intel chip is often boosted to 35W+ because laptop makers want to win benchmarks. In addition, Intel's TDP is quite useless because they added PL1 and PL2 boost limits.


Apple always shipped their chips with "configurable TDP-down" when it was available, which isn't the case on higher-specced chips like the i7/i9 - though they didn't disable boost clocks as far as I know.

The major pain for Apple was when the thermal situation was so bad that CPUs were performing below base clock -- at that point i7s were outperforming i9s because the i9s were underclocking themselves due to thermal exhaustion, which feels too weird to be true.


That's not Apple. That's Intel. Intel's 14nm chips were so hot and bad that they had to be underclocked. Every laptop maker had to underclock Intel laptop chips - even today. The chips can only maintain peak performance for seconds.


Can you elaborate?

My 2019 MacBook Pro 15 with the i9-9880H can maintain the stock 2.3GHz clock on all cores indefinitely, even with the iGPU active.


My 2019 MBP literally burned my fingertips if I used it while doing software development in the summer.


Back in the Dell/2019MBP era every day was summer for me.


> You can see this sandbagging in the old intel chassis Apple Silicon MBP where their performance is markedly worse than the ones in the newer chassis.

And you can compare both of those, and Intel's newer chips, to Apple's ARM offerings.


If I was a laptop manufacturer who wanted to make money selling laptops I would not intentionally make my laptops worse.


No, it's not normal; they are talking a bit out of their ass. Based on their dates, I'm significantly younger than them, and yet I was able to afford a house in one of the more desirable neighbourhoods with me being the only one working in tech.

I'm not targeting the person you are replying to with any malice, but since almost all of the major financial and business institutions in Canada are headquartered here, there is an overabundance of people who would claim they work in "tech" when in reality they are making respectable but decidedly non-tech salaries at places like TD Bank or Thomson Reuters.

The range of possible salaries for devs in Toronto is quite large.

Also as an additional anecdote, every single one of my classmates who went to the USA and decided they would like to start a family, came back to Canada to start that family.

That is not to say it is all rosy here. There is an overabundance of poor or terrible talent that's been shipped in to cover the exodus of Canadian-educated people chasing better salaries in the USA, while business leaders and purse-string holders are content to celebrate that mediocrity and then wonder why productivity is so low.


Wow, you're classy... and yeah, I feel targeted a bit ... hah

Our old house @ Oakwood & Vaughan was bought for $285,000 in 2005. It's likely "worth" north of $1M now, not even 20 years later. My senior software engineer salary in that period was between $75k and $100k CAD. Are you saying that a SWE salary in Toronto is over $300k now?

I know it isn't. Though there are plenty making more than that at Google Canada, that is a huge anomaly compared to the rest of the market.

The distortion in housing prices and the continued upward growth has a negative effect on the ability of young people to prosper. It might help me retire, sure, but it isn't going to do much good for my kids.

BTW, I worked as a SWE at Google for 10 years. And my wife was at Apple before that. Does that count as "tech?" Just checking.


I work as a software engineer at a company whose only products are software. Are you saying I shouldn't claim to work in tech because I don't make FAANG money?


I'm 99% sure the person you are replying to was saying that tongue in cheek / sarcastically.


Our entire economic model globally is built around scarcity, either real or artificial. You remove that and we are in one hell of an upheaval.


Agree, and as a sibling suggested, I think we're already experiencing the beginnings of it.

But there's no reason a global economic model can't be built around perpetual surplus.

(And it strikes me as a much easier proposition.)

