Microsoft has been consistently trying to make ARM happen since the Surface RT. Consumers are not going to bite; marginal power savings are not meaningful.
The first iteration of Windows on ARM didn't have any x86 emulation layer, so that one was doomed from the start. The second iteration did, but it initially couldn't run 64-bit apps and the performance was poor. They do have 64-bit support now, and it sounds like the emulation performance has come a long way.
Here is my question though, comparing this to how it works on the Mac.
Will Windows have the opposite: ARM apps emulated on x86?
I continue to wonder how Microsoft expects this to work long term. Are they expecting that every developer will just maintain x86 and ARM builds perpetually, or that users running ARM will be stuck using that emulation layer forever?
Microsoft won't be able to transition 100% to ARM like the Mac did. At some point all Intel Macs will be old enough that they no longer get the latest macOS, developers will stop targeting them, and Intel support will be dropped.
I just don't see many developers bothering with an ARM-native Windows version when doing so means they have to support both architectures or risk annoying customers later.
> I just don't see many developers bothering with an ARM-native Windows version when doing so means they have to support both architectures or risk annoying customers later.
The market dictates what developers do. If Windows on ARM is the new shiny thing and it hits the three key laptop parameters of no fan noise, long battery life, and a cool case, then people will buy it and developers will build for it.
I think the official line from Microsoft would be that most software should be using .NET anyway, and in that case the same binary should Just Work on either architecture. In reality there is still a lot of native software though, so who knows how that will play out. Games in particular will always be native.
You have to understand that Windows comes from a separate division than .NET and they have no overlap. Microsoft isn't a cohesive company. .NET comes from the developer division (DevDiv) and UWP comes from the Windows division (now Server & Cloud). The Windows folks always hated .NET and the developer division has been lukewarm about UWP.
It's actually kinda annoying once I started paying attention: many software vendors just detect "Windows" and give you an x86/x64 installer, even when the company offers an ARM64 build that would presumably be faster or more energy-efficient. I installed a bunch of stuff that turned out to be Intel binaries without even knowing I wasn't running native. But I haven't noticed any performance issues, and yeah, everything just works.
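To make the failure mode concrete, here is a minimal sketch of how a vendor's download page or updater could pick the right build instead of defaulting to x86. All installer file names here are made up, and the function is purely illustrative:

```typescript
// Hypothetical sketch: map a detected CPU architecture to the right installer.
// All installer file names are invented for this example.
function pickInstaller(arch: string): string {
  switch (arch) {
    case "arm64":
      return "app-setup-arm64.exe"; // native build: presumably faster, better battery
    case "x64":
      return "app-setup-x64.exe";
    case "ia32":
      return "app-setup-x86.exe";
    default:
      // Unknown architecture: fall back to x86, which Windows on ARM emulates.
      return "app-setup-x86.exe";
  }
}

// Caveat: an x64 binary running under emulation still reports "x64" (e.g.
// Node's process.arch), which is exactly how vendors end up serving Intel
// builds to ARM users without anyone noticing.
console.log(pickInstaller("arm64")); // → app-setup-arm64.exe
```

The subtle part is the caveat: detection code running inside an emulated process sees the emulated architecture, so vendors who only check "Windows + x64" never learn their users are on ARM.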
In 2018 that lockdown situation morphed into "S Mode", which you can turn off in the control panel; the only trick is that you can't turn it back on. It's just that the ecosystem isn't there, both in terms of developers and performant devices.
Hopefully today's announcement is a turning point for that, but at the moment Windows on ARM is about on the same tier as a pre-CarPlay infotainment system.
I think the idea is for all apps and developers to gradually transition to ARM support; after all, even mobile devices will be running on ARM sooner or later, so future apps and games will be developed with ARM in mind anyway. x86 apps will still be supported, perhaps with paid support, for example.
But it all depends on ARM's market share at some point. You can still run DOS apps, so with an emulation layer and the increasing performance of ARM, old apps will be able to run on ARM one way or another, for those who need them.
Unlike Mac, Microsoft just can't drop past generations and call it a day.
> But it all depends on ARM's market share at some point.
Right, that's kinda my point: unless I have missed it, I have yet to see any real talk about ARM in custom-built machines, and I doubt gamers are going to give that up anytime soon.
Apple was able to force the transition to happen. I highly doubt Microsoft is going to risk actually dropping x86 from Windows on any reasonable timescale, so there has to be some story for running ARM apps on x86 as well.
Unlike when Apple announced that all of Mac was transitioning, there isn't a reason for a developer to think that anytime soon they can drop x86, so why complicate what they have now by adding ARM?
> Right, that's kinda my point: unless I have missed it, I have yet to see any real talk about ARM in custom-built machines, and I doubt gamers are going to give that up anytime soon.
A lot of gaming these days is happening on mobile phones, and portable PCs (and now laptops) will very likely move to ARM sooner or later. Add an eGPU with an Nvidia card and you get a monster.
Intel is in deep trouble.
>Unlike when Apple announced that all of Mac was transitioning, there isn't a reason for a developer to think that anytime soon they can drop x86, so why complicate what they have now by adding ARM?
ARM is the future, given the desire for long battery life and increased performance. Microsoft does have an x86 emulation layer now, and app support is already much better than it was in the RT era, when there was no emulator at all.
Devs are developing apps across all devices, and ARM-based Macs already require you to develop ARM-compatible apps.
>I have yet to see any real talk about ARM in custom-built machines, and I doubt gamers are going to give that up anytime soon.
The vast majority of gamers game on smartphones and tablets with ARM processors.
Some of the biggest gaming hits recently have also been cross-architecture and cross-platform, namely Genshin Impact and Honkai: Star Rail: native ARM and x86 releases, running on Windows, Android, and iOS. There are also gaming hits like Fate/Grand Order that don't have an x86/Windows release at all due to not even considering desktops/laptops.
> The vast majority of gamers game on smartphones and tablets with ARM processors.
Those are clearly not the gamers I am talking about. The gamers I am referring to are not switching to playing on mobile phones. If they are switching to handheld devices, they are going with x86 devices like the Steam Deck.
There is a massive market out there of games that do not support those platforms, one that is only just now scratching the surface with releases like Death Stranding on iPhone and Mac.
Except for Nintendo, the two main AAA consoles are x86-based, and I have seen no rumors of that changing.
So great, there are large mobile games, but let's not pretend there isn't a huge market for which that future is not already here, one that shows very few signs of actually changing anytime soon.
https://steamcharts.com/ is what I am talking about. Unless I am mistaken, the only game on that top list that actually runs on mobile is PUBG.
> There are also gaming hits like Fate/Grand Order that don't have an x86/Windows release at all due to not even considering desktops/laptops.
That is nothing new; Pokemon GO came out in 2016. That isn't a sign that gaming is changing but that gaming is expanding to include new types of players. But the "hardcore" AAA gaming market still very much exists, and is firmly on x86 right now.
Porting a game from x86 Windows to ARM Windows may take some effort, but for most games, nowhere near as much as porting to a different operating system. There just isn’t that much assembly code or even SIMD intrinsic use in your average game. And thanks to Microsoft’s Arm64EC ABI, the conversion from x86 to ARM can be done piecemeal. If, say, the game depends on some proprietary third-party library that isn’t willing to offer an ARM version, that library can be run in emulation while the rest of the game is compiled natively for ARM.
The AAA game world is very conservative, so I can’t guarantee that PC game developers will port their codebases to ARM. It really depends on the size of the audience and how well the x86 emulator works as a substitute. Even if ARM takes over on Windows laptops, I’m not sure laptops are enough, when laptop users are already accustomed to not being able to run AAA games well.
But if the audience gets large enough, it’s hard to believe that developers won’t try recompiling. It’s just not the same level of effort as a port to Mac or Linux.
> The AAA game world is very conservative, so I can’t guarantee that PC game developers will port their codebases to ARM.
Unreal, Unity, CryEngine and Godot all support ARM, so - testing and third-party binary libraries aside - there shouldn't be any reason to not have an ARM port.
> Unless I am mistaken, the only game on that top list that actually runs on mobile is PUBG.
Even in that case it's "kind of but not really". PUBG Mobile is a distinct game from regular PUBG; they have similar core gameplay, but they are developed independently of each other.
Fortnite is the outlier there, being the exact same game across every platform. COD Mobile and Apex Mobile are/were also officially sanctioned clones of the original game, similar to PUBG Mobile.
>Those are clearly not the gamers I am talking about.
You said gamers; you should have explicitly said PC gamers if that's who you were referring to.
Note that PC gamers are, as much as they deny it, a minority of all gamers. The vast majority of gamers play on mobile or consoles, and of those, mobile far outnumbers consoles.
Consoles can also switch processor architectures between generations; they don't have to support backwards compatibility the way x86 and Windows do. If Windows ends up becoming more ARM-dominant than x86, consoles will likely follow suit to make subsequent Windows ports (and then also mobile ports?) easier.
Going on a tangent, I find it very annoying that PC gamers, despite being the minority, somehow want to claim other gamers aren't gamers. PC Master Race is a meme, not reality.
>Unless I am mistaken, the only game on that top list that actually runs on mobile is PUBG.
Stardew Valley at #10 also has mobile ports.[1][2]
>But the "hardcore" AAA gaming market still very much exists, and is firmly on x86 right now.
The games I cited are AAA games, for some value of "AAA": they are developed and/or published by big, established studios and publishers. Frankly, I find the AAA moniker worthless these days, but I digress.
That didn't sound correct to me, and I found an article [0] that says the numbers are pretty similar.
I play some games on PC+PS5 and some on mobile, so I'd probably count as both a mobile and "legacy" gamer, but if I had to choose one gaming market to immediately disappear from the face of the earth, it would be mobile gaming for me, absolutely.
The graph is misleading because it groups PC and consoles together against mobile. That implies mobile would slaughter each of those segments individually.
It's also missing some very important markets like Japan and South Korea, presumably included in "48 markets multi-market average" but not explicitly shown individually. Makes one wonder, eh? :V
I'm a Windows ARM user (Surface Pro X). For me the benefits (fanless operation, the battery not running down randomly in your backpack, phone-charger compatibility, integrated LTE, 16 GB of RAM in that envelope) are worthwhile.
No one cares about power saving. Turn it into higher performance at the same power usage and people will bite. Of course, it has to be an actual upgrade, like the Apple Silicon chips were.
Non-experts have to rule on expert subjects all the time. Sometimes this goes hilariously wrong (like the internet being a series of tubes), but usually the non-expert relies on the testimony of experts to make their judgment.
Politicians aren't expected to be experts, given the immense breadth of subjects they need to consider; they're expected to consult experts. Whether an individual politician is an expert[1] is pretty irrelevant.
All of these statements are about our general expectations of politicians; whether you think politicians live up to them, or have comments on specific politicians, is beyond the scope of my comment. As a less controversial example, consider how judges operate: they are expected to provide well-reasoned judgments on subjects they know nothing about.
1. Sometimes those former expert politicians are the worst of all since they _think_ they know the way things are and won't listen to actual experts but they've been out of the industry so long that they've lost their familiarity with the subject.
>sometimes this goes hilariously wrong (like the internet being a series of tubes)
That didn't go hilariously wrong, though: the internet is a series of tubes. Not physically (copper cables aren't tubes), but he obviously wasn't talking about specifics; it was a broad-strokes analogy (his exact line was "It's not a big truck. It's a series of tubes."), and his description was basically accurate.
They don't actually need a quick turnaround on the Quest Pro 2, as the Apple headset is launching next year. Apple has given them plenty of warning; they have runway to respond comprehensively. Such a premature announcement is reminiscent of the Osborne effect: I think it'll negatively impact sales, and it'll muddy the ultimate value proposition of their first product. Apple is really cashing in its brand value here...
With HTTP/3, modern frameworks have become unnecessary deviations from web standards that result in high costs and shorter app lifetimes. Do standard server-side work, don't use Node as a server, sprinkle a bit of TypeScript where needed, and keep things lean.
As a recovering front-end developer of 20 years, someone who has played with every damn framework and used most of them professionally: just stick to web standards and ignore the frameworks unless you specifically need them.
For example: we are using HN. It is simple. It works. It does not need to change. It does not need to be Reddit or Twitter or whatever to be engaging. It's good because it's focused, simple, and well maintained. Almost every app I enjoy that has become over-complicated or lost touch with its user base has done so while adopting these frameworks. They are business death. See Facebook (React destroyed FB), Google (Angular destroyed AdWords and G Suite), and so on. You lose analytics, accessibility, and end-user performance; fewer server-side bugs is not a win when clients are the ones running the bugs; and most importantly, you lose track of your customers' needs.
If you want your shit to work, keep it close to the metal, conform to standards, scale only where necessary, and listen to your customers.
Your example does not align with your closing statement. Using Arc is certainly not "sticking to web standards" in the way you're proposing.
HN is simple and works, from a user perspective. I don't have my hands on the codebase, but I imagine it's kind of fun to work on precisely because it isn't "close to the metal". DX is just as important as UX is, you have to strike a balance.
Not sure there's much correlation between frameworks and apps becoming complicated or losing touch with their userbase. Using AdWords as an example, I'm like ~75% sure they went from one framework (Google Web Toolkit) to another (AngularJS).
GWT and Facebook's PHP stack were both effectively server-side technologies. GWT was pretty awful, and Angular is probably the worst of the bunch among front-end frameworks too. What I'm saying is that you should run your app on the server and use HTML as a presentation layer. This is standard; it gives you visibility into bugs, it simplifies the end-user payload, and it works quickly and reliably without huge refactoring projects and complex abstractions that your business does not need.
You probably don’t need a front end stack. You should just use the web standards as designed first, and then add improvements that customers like.
> For example: we are using HN. It is simple. It works. It does not need to change.
There are lots of basic issues with HN. For example, if you vote on a comment while writing a reply, the page reloads and you lose your comment. A web component for voting would do wonders here.
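A sketch of what that enhancement could look like. The endpoint shape and the `votearrow` selector are assumptions for illustration; HN's real vote links also carry a per-session auth token that would need to be preserved:

```typescript
// Hypothetical progressive enhancement: vote via fetch instead of navigating.
// Endpoint shape and CSS selector are assumptions, not HN's actual markup.
function voteUrl(itemId: number, how: "up" | "down"): string {
  return `/vote?id=${itemId}&how=${how}`;
}

// Only wire up handlers when a DOM exists, so the plain links keep working
// (with a full reload) when JavaScript is unavailable.
if (typeof document !== "undefined") {
  document.querySelectorAll<HTMLAnchorElement>("a.votearrow").forEach((arrow) => {
    arrow.addEventListener("click", (event) => {
      event.preventDefault(); // no navigation, so a half-written reply survives
      void fetch(arrow.href, { credentials: "same-origin" }); // fire the vote
      arrow.classList.add("voted"); // optimistic UI update
    });
  });
}
```

Because the default link behavior is only intercepted when the script runs, this degrades gracefully: no JS means the old reload-based voting, which is exactly the "sprinkle where needed" approach the thread advocates.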
You've been here since 2014. How much value have you derived from this code? How long would it take you to write it?
https://news.ycombinator.com/hn.js
I don't think there's a person I've asked who prefers the modern web over the way it was 10 years ago. All we've done is create a bunch of busywork that results in a cushion class of yes-men who are paid to protect the oligarchs from the peons they control. The specific kind of user you're talking about unfortunately includes these people.
If we stop navel-gazing at our salaries for a second, we should take a look at what developers who make a real difference do. I'm not talking about people who build stuff that leeches off consumer credit cards. I'm talking about TimBL, Matt Mullenweg, Bjarne Stroustrup, John Resig, Greg Kroah-Hartman, Linus Torvalds. They do things on principle, on values, and are self-confident enough not to care about chasing status. They're people who look at a world full of problems and want to actually help by sharing their ideas.
Those people generally build open, self-hosted solutions. They want to empower people, freeing us from the modern versions of AT&T and IBM. They make stuff that can run on a Pi or an old Celeron. Their stuff is standards-compliant, open, clean, simple, and user-auditable, so that it can be trusted and contributed to.
React was developed by Facebook, an app that steals your own family members' valuable time. Angular was developed by Google, an app that drains your own wallet. If you want to emulate their outcomes, by all means use their technologies. If you want to make a difference in the world, observe those who are making a difference and emulate them.
> React was developed by Facebook, an app that steals your own family members' valuable time. Angular was developed by Google, an app that drains your own wallet. If you want to emulate their outcomes, by all means use their technologies. If you want to make a difference in the world, observe those who are making a difference and emulate them.
They're unrelated. The tech and the outcomes are unrelated. You can make great things for the world with React, and using Arc isn't gonna make Facebook a better company.
You're welcome to that opinion, but I disagree. In my experience, Conway's Law applies directly to code, not just to networks and systems.
It is also my opinion that at all levels, motives and intent are driving dynamic forces in life. Ignoring them is a great way to waste your time, money and efforts. I can understand that not everyone feels this way, but I do. So perhaps that's where our interpretations differ.
I postulate that robots need at least a single manipulator in the physical realm: Mechanical arm assembling car doors = robot. CNC machine that follows a path = robot. Mechs with chicken legs = robot. Brain in a vat = not a robot... but can be embedded in a robot.
It's a nice idea, but it totally ignores literally decades of existing use of the word "robot" (or its abbreviation "bot") to describe pure software that accesses internet services. e.g. web crawlers (googlebot), chat bots, automated clickers, etc...
Lexicography tends to be descriptive rather than prescriptive. If enough people use a word to mean a thing, that word means that thing. At least in some contexts. See also "gay", "hacker", etc...
Note that it is possible for a word's meaning to be "reclaimed", but it generally doesn't get that way by some small group of people just shouting "You're doing it wrong!"
Hmm, "robot" in its spelled-out form sounds weird to me for this use ("bot" is more frequent). Wikipedia redirects people looking for software agents to a separate page from the article about the beep-boop ones: https://en.wikipedia.org/wiki/Robot
"Full self driving" is an even more incorrect term, then, if you want to be pedantic. If the car manufacturer takes zero liability/accountability, then it is zero self-driving.
You can, in fact, "recall" software. This is a semantically accurate description of what is happening.
It's a Full Self Driving "beta", though; it's literally in the name that it isn't final and doesn't have the bugs worked out. You also have to opt in to it.
Wait is your argument adjectives can only be applied to Tesla marketing terms, and not to other English words?
What do you think an "FSD recall" or "software recall" means?
Tesla released software that can kill people. It must do an FSD recall.
This is not the same as Apple doing an OS update, which is not mandated by a regulatory body because it could kill someone.
I was hit by a driver not paying attention. I had to get two surgeries, was not allowed to stand for 9 months, and spent another year just rehabbing my nervous system and learning to walk. I still can't jump or run.
Tesla released software it's marketing as "Full Self Driving" that is not even a Level 2 alpha! "Full Self Driving" implies a Level 5 system.
Tesla’s software has disclaimers like “may not behave as expected in sunlight”. Even its AutoPilot has these disclaimers.
Tesla constantly claims FSD makes drivers safer. Yet it is not transparent with its data, and the data and comparisons it does release are misleading to the point of fraud, comparing apples to oranges. The cars it has released to untrained drivers on public roads run through intersections (it's all over YouTube) and fail all kinds of basic driving tests. Tesla accepts ZERO responsibility if someone is killed while FSD is activated; that's how little confidence it has in its own product.
This is a product that can maim and kill humans and ruin people's entire lives... and your response is incoherent mumbling about adjectives?
Any person who is majorly confused by what a "software recall" is, or can't figure out what this FSD recall means for them, shouldn't be beta testing a 2-ton machine on public roads. They shouldn't be driving, period.
I disagree; you are imposing one interpretation of what "Full Self Driving Beta" means on everyone, as if everyone using the beta (who has to opt in, purchase it, and have a good driving record, indicating they understand the rules of the road) treats it like a headline they don't read past. You imagine everyone is dumb and that you are the only smart person who looks beyond the name of something. You are the one caught up in the marketing, while the people actually deciding to spend their money on this and opt in are the ones actually looking into it.

Your claim that they take no responsibility is incorrect, as they offer their own insurance if so chosen. If insurance companies didn't want to cover it, they could easily just not cover Teslas, but that isn't happening. Insurance companies still cover it because Tesla is safer than other vehicles. "Full Self Driving" doesn't mean perfect driving; it means as good as or better than the average human, and that bar is not that hard to pass.
Vultr is amazing on price/performance. I use it as a lab cloud, and for my more fun (or Windows) workloads. You slightly sacrifice reliability, but you get interesting options including odd OSes, and per-hour dedicated hardware. Definitely an innovative bunch.
DigitalOcean offers solid uptime, a more professional operation, and typically better performance than Linode in benchmarks (https://www.vpsbenchmarks.com/screener). It has confusing marketing, but a good account dashboard and good docs/service. They're more in line with Linode, and will probably be the big winners if Linode/Akamai continue losing steam.
I wouldn't go with OVH, and I would use caution with cheaper providers, unless you're comfortable with the risk of companies that would willingly build wooden data centres with no fire safety. There is such a thing as "too cheap". (https://www.datacenterdynamics.com/en/news/ovhcloud-fire-rep...)
> The Bas-Rhin fire service says that the SBG2 data center had no automatic fire extinguishing system and no general electrical cut-off switch. The facility also had a wooden ceiling rated to resist fire for only one hour, and a free-cooling design that created "chimneys" that increased the fire's ferocity.
I recall seeing headlines here about the fires, but I didn't know their construction was this bad.
I second Vultr. They also have more advanced networking features, mainly BGP peering, that are uncommon with larger VPS providers. Their network seems super reliable.
You're not wrong. Look at the like/dislike ratio on their AMA video about this topic: 29 likes, 733 dislikes just in the "Return YouTube Dislike" browser extension. What the heck, guys.
Office Open XML (docx), to be fair, is a non-proprietary standard. Microsoft has been forced into that by the EU; and they keep messing with it, but it is actually technically "open".
It is open only on paper: at least half of it is MSO-specific compatibility cruft, and real documents often contain binary blobs that aren't described in the standard. The name "open" is a lie in this case. Moreover, it made it into ISO only because of the corruption of these standardization bodies. It wasn't even properly reviewed!
Due to decades of work reverse-engineering the behavior of MS Office (and not only after the introduction of DOCX; many of the behaviors specified in DOCX reference behaviors of older versions of Office).
It would be a monumental effort to create another implementation, and completely impossible without referring to MS Office as a reference implementation.
It's completely unrelated. LibreOffice can open most fully binary DOC or XLS files as well; they were just reverse-engineered. It has nothing to do with openness.
Correlation does not imply causation. The fact that Libreoffice can open binary proprietary formats does not automatically imply that all formats it uses are binary and proprietary. DOCX is an open standard.
DOCX is not an open standard; there are many articles[1][2] on why it isn't, even in the Wikipedia article[3]. The ISO standards committee was just a rubber stamp, Microsoft puppets ensuring vendor lock-in for decades.
I stand corrected, sorry. Emulating some other proprietary software really shouldn't be a paragraph in something we call an "open standard". Thank you for the links.