A few years ago, I had two computers on my desk: my beefy dev machine with dual screens and good specs for the time, and my test machine, which was the standard issued to every non-dev, with a 1024x768 screen.
I couldn't tell the boss the code was ready until I had tested it on that machine, which was sometimes eye-opening, and showed why a 2MB HTML page wasn't a good idea.
I think for this plan to work you’d have to force the developers of Xcode to work on the 4GB machines first. If they could do that, the rest would follow naturally.
Hah. When I worked for a very big Just Print Money bank circa 2008, they gave me, an SDE, a Lenovo ThinkPad running Windows with 4GB of RAM, with Lotus Notes for email as a bonus. The thing was slower than molasses. And because we had an offshore team in India, every engineer would begin the day by syncing the Subversion repo. My team was in the central US, but we had to connect to a proxy in NYC for network traffic inspection, which made the sync take over 45 minutes. Repeat that for every SDE, on both sides of the world, and you can guess how much time was wasted.
I don’t think I would want to work in that environment anymore.
Similar story: I had a customer who wanted me to change the entire UI of a legacy application, because some information would not fit on the ancient 1024x768 15" desktop monitor of one employee, meaning he had to scroll horizontally constantly.
I recommended giving this employee a larger monitor: not only would that be much cheaper than having me rebuild the entire UI, it would also boost the employee's productivity. Not to mention that swapping a monitor takes 10 minutes, while changing a UI takes weeks.
The customer insisted on changing the UI, because "if we give him a new monitor, everyone in the office will want one". I nearly got fired for responding with "Great! Then everyone can benefit from more productivity!".
In the end we did change the UI; I believe the total cost was around 30k. The customer had maybe 15 employees, so new monitors would still have been much cheaper.
A few months later their offices were remodelled with expensive designer furniture, wooden floors and custom artwork on the walls. Must have cost a fortune. In the end, the employees still worked on ancient computers with 15" monitors, because new computers didn't fit the budget.
Sometimes I get the feeling AAA titles can be better optimized than Unity-based indies.
It's probably a bit better than when Unity was new. I do remember the first XCOM remake in 2012 lasting longer on battery than $random_unity_indie.
Sure and all game devs should be forced to do their work on 80s NES dev kits or whatever. /eyeroll
This line of thought is ridiculous Luddism. Artists and craftsmen deserve to work with SOTA tools; you can only benefit from having better, more accessible, more performant tools.
That's dumb. You can hardly even buy a machine with 4GB of memory on sale, at any price.
If you are making products that depend on people spending money on them, you generally don't have to care about broke people with 15 year old computers.
I must say, the irony of this comment in a thread about Apple moving down-market without losing quality is … well, it burns. Along with the arrogance: “Anyone who can’t afford 8GB isn’t worthy of being my customer,” is literally the opposite of what Steve Jobs always said.
I was stuck once in a cabin in the woods with an old Android phone. I’m glad it still worked, and that people curating software experiences for it had more empathy — and more business sense — than this comment displays.
Didn’t Steve Jobs basically say Apple didn’t know how to make a good computer for $500 and used that as a justification to not sell any products to the lowest priced area of the market?
There’s no irony here. The plain fact is that 8GB of RAM has not been an especially exotic amount, even on cheap laptops and desktops, for about a decade if not longer.
$450 in 2015 would have bought you a Dell laptop with 6GB of upgradable memory.
The point was that Apple has been completely uninterested in the bottom of the laptop market from 1976 to 2026, so there is no irony in my statement that many businesses, Apple included, will purposefully ignore customers who don't have enough money to buy their stuff.
From the first comment I responded to:
> “Anyone who can’t afford 8GB isn’t worthy of being my customer,” is literally the opposite of what Steve Jobs always said.
This commenter is wrong. The idea that the bottom of the market is beneath Apple is almost exactly what the quote from the earnings call said. Jobs effectively said “we only make mid to high end computers; someone else can serve the budget customers.”
This is why I pointed out that most people employed making commercial software don’t have to concern themselves with the needs and desires of users on desperately outdated hardware, since those users can’t afford your product anyway.
Of course, while Jobs was alive that RAM number was below 8GB, but the specific number isn't relevant beyond being a general example of the standard of the day from around 10 years ago.
I brought up a bunch of computing examples from the mid-2010s after Jobs’ death because they are about the oldest reasonable hardware you’d find around today, proof that even buyers of low-end hardware 10+ years ago were regularly getting more than 4GB of RAM.
Apple’s base model MacBook Air in 2017 had 8GB of RAM. The 2015 model started with 4GB configurable to 8GB. The 12” MacBook from 2016 had 8GB RAM.
So you literally have to go back a decade to find anything sold by Apple where getting less than 8GB was an option on the lowest possible configuration, never mind PC manufacturers who generally gave better specs per dollar and included socketed memory.
But hey, Apple shills will shout from the rooftops that a 2026 laptop with 8GB of RAM is a good deal just because it’s $500 if you lie about your status as a student and pinky promise with Apple that you’ll never use the computer for commercial usage.
No, Steve Jobs said exactly what he said. The technology wasn’t at the point where they could offer products that weren't junk. An unsubsidized $120 Android phone is “junk”. A $99 iPod Shuffle or a $300 low end iPad isn’t.
The netbooks available in 2010 were junk even by that day's standards.
The MacBook Neo, which is fast enough, with a better display than low end PCs and a good trackpad, is not junk. It does what most low end consumers care about well.
At least in the US, even during the SJ era you could get a “free” iPhone with a contract that anyone could afford - it was last year's phone.
Well, here’s the thing…and, I apologize, this is a bit of a shift from what we were talking about.
The MacBook Neo is getting so much hype for being better than a low end PC, before it’s been put through its paces over the long term.
I had the same initial reaction. Wow, a Mac for $500, how incredible, how disruptive.
But then this morning I decided to look at the actual street pricing of laptops at my local Best Buy.
And here’s the thing: now that Apple has this machine with no haptic trackpad, no backlit keyboard, the worst screen available on any Apple product, very small keyboard, and very basic non-upgradable specs including a generations-old efficiency processor, I think the actual story here is that Apple has changed their mind and is willing to make a product that they would have previously called “junk.”
I’ll list off a couple of systems that I would absolutely buy as better machines over the MacBook Neo:
HP OmniBook X Flip, 16” 2K touch screen, Intel Core Ultra 5 226V, 16GB memory, 512GB storage, $699.
For the same price as the top model Neo you get double the RAM and a bigger, probably better screen, which is convertible and touch enabled. It's not some bargain basement SKU, either; it's a legitimately well-reviewed laptop.
Right there in the pricing sweet spot you get more memory and basically all the benefits of an ARM architecture in another laptop that is well-regarded. You also get a number pad on the keyboard.
All these laptops have been getting well over 4.5 star reviews, like this one:
> This little guy has been amazing this semester plenty of power while being light and getting good battery life the quick charging feature is particularly impressive from almost dead to full in around half an hour all and all this laptop has met or exceeded all of my school life needs
Finally, this is probably my choice if I was in this segment:
Another great example of a laptop that costs less than the Neo’s top model before education discount, has better specs, and is again a legitimately good laptop solidly in the mid-range of its lineup, not a bargain basement SKU. I would actually be surprised if the Neo kept up with this particular model in terms of build quality, keyboard, etc.
The Neo’s main advantage is that it’s got a chassis made of aluminum, and that’s really its only differentiator. And I’d say that’s an overrated differentiator (e.g., plastic is lighter and isn’t automatically weaker/worse for long-term ownership).
Just looking at the first one: the screen is worse, it’s heavier, and the processor is slower. Of course PC mags always grade crappy Intel-based PCs on a curve. Actually, all of the screens are worse.
The Ryzen AI 340 isn't a bad match against the A18 Pro. It's actually ahead of the A18 Pro on multicore performance, and only 20% behind on single core benchmarks, not enough for anyone to notice. Yeah, it's true you're losing a lot of integrated GPU performance. Integrated GPUs do need more RAM, though, and I doubt the Neo is going to handle a lot in the realm of "high school kids who want to game on the side" between that and the software compatibility situation.
My main point isn’t about caring or not; the point is how outdated 4GB of RAM in a laptop/desktop is, and how incredibly rare it has become.
The PS4 came out in 2013 and has 8GB of RAM. In case you need help counting, that’s 13 years ago.
And that’s an optimized game console with no general purpose operating system and limited multitasking capability.
10 years ago, Samsung phones were shipping with 6GB of RAM. Not many phones even physically last that long.
My uncle bought a $350 trash Windows PC a couple years ago, literally the cheapest thing I could find on sale at Staples, and it came with 12GB of RAM.
The price of memory is insane, so if anyone wants to increase performance per dollar, they're likely going to have to do it in software. I suspect 4GB computers are going to come back if the hungry AI beast doesn't cool it soon.
How much memory do your parents' and grandparents' computers have? There are a lot of people out there with older computers, probably even some that you know :)
I am still sad that they stopped putting it into the iPhone. I think the tech is great, and the watch really proves what can be done with it when it's a fundamental part of the hardware and the OS can be built around it. But we never had a situation where every compatible iPhone had Force Touch, so everything that could be done with it also had to work some other way.
I think the iPad made that even more complicated since I doubt we would have ever gotten it on a screen that large, if it would have even worked.
As for it being on the trackpad, it's honestly pretty wild when you realize it. It does an incredible job of faking the feel of actually moving. It was similar with the fake home button that some iPhones had for a little while.
I remember being totally flummoxed when I was trying to figure out why my trackpad wasn't clicking when the machine was off. I had no idea it wasn't a mechanical lever anymore!
You know who also loves to use the term "opportunity cost"?
The entertainment industry. They still tell you about how much money they're leaving on the table because people pirate stuff.
What would happen in reality for entertainment is people would "consume" far less "content".
And what would happen in reality for Anthropic is people would start asking themselves if the unpredictability is worth the price. Or at best switch to pay as you go and use the API far less.
They will be able to do banking at least once the legislators tear down the walled gardens in a sensible way. Are the security benefits from the App Store/Play Store real, or security theatre?
I'm pretty sure that, if there are security benefits, they have been artificially tied to the use of the company's distribution method, that coincidentally really needs to be sending usage statistics, monitoring, etc. Surely there exist no conflicts of interest to be found.
Fifteen years ago I used to do mobile pentests for banks, and when we couldn't find anything significant for the reports, we could always count on “lack of rooting detection” and pin the risk on some vague mobile banking malware threat pushed by marketing. I am sorry I contributed to this nonsense.
It's understandable; I would maybe expect to undergo an extra step in verification for a sensitive app like, "we noticed this is the first time you are using this system that is not locked down; please type in the token we have mailed you".
But locking users out (which may not directly be the bank's fault for relying on OS's security APIs) seems anti-competitive.
Ha! Well, not right now! Before the last year or so, this wouldn't have escalated to the current situation, where we actively have to be wary of fending off Big Brother or blatant power grabs.
However, given that we're talking about a European phone, I'm willing to bet that this type of effort goes hand in hand with decoupling from American-backed services (at least for those who've seen the writing on the wall and understand the risk to their sovereignty if they put all their eggs in an American basket).
I just switched to a Fairphone 5 with e/OS, which is a de-googled Android (it uses microG), and am pleasantly surprised how well everything works. My banking apps work, contacts and calendar lived on nextcloud already, the learning apps I use work.
The two things I have to get used to are not having Google Maps, though the map app on there has worked fine for me so far, and casting to a Chromecast, which doesn't really work for me, but I can live without that.
If you want, we can ritually bury your Chromecast? I'll bring the marshmallows, spiders, and the Necronomicon. Oh, and two of my old Chromecasts, rotting in a drawer.
There's now someone who wants to build an open attestation system which should comply with both banking requirements for security and allow 3rd party OSes to thrive: https://uattest.net/
They are European, certainly the Euros could come up with some regulation to force banks etc to support a Euro phone. I’d actually welcome this as more competition is better and we can’t seem to kill the duopoly here in the US.
The vast majority of people on this site can afford a couple hundred dollars for a basic Android phone that's used only for tasks like that, and as a bonus it's safer than having banking apps on your main phone. Anyone who isn't willing to spend a couple hundred bucks on the freedom to run whatever software they want on their phone probably doesn't consider software freedom a priority anyway.
Using a Sailfish phone comes from the desire not to use Google or Apple. So wanting to pay using those companies seems an odd requirement. Luckily for paying there are still widely accepted alternatives.
I wouldn't say it is a requirement. The thing is, if you want to use bank apps in NL, they have all dropped native NFC. It is either Google Pay or Apple Pay >:(
It would not, in principle; those rely on hardware-backed keys with Google's latest iteration of Google Play Integrity. The only success people have had is by using leaked vendor keys and spoofing device fingerprints for old A11-era devices which did not have the hardware backing. In time even this avenue will no longer work. People have been trying to get around it for a while [1], but afaik the concept is cryptographically airtight.
My banking app works fine on a rooted phone that I don't bother faking a proper Play Integrity signature for. Except for a warning about the phone being rooted when setting it up, of course. I'm not 100% sure what happens when you have integrity and lose it by rooting your phone, but I imagine the bank app will log you out.
Bank apps only stop working because banks decided they know better than you.
Unfortunately my bank also switched to Google Pay which does require Play Integrity, so contactless payments are out of the question on that phone now. Maybe if Wero compatible terminals extend support for QR payments I could use my bank app again on that phone.
Maybe I'm out of the loop, but what is everyone doing with banking apps on their phones that's so essential? I see this argument all the time, but it's baffling to me.
I have several hockey leagues and pick-up sessions that only take payment via Venmo, prior to the league or session starting. It makes it way easier than going around the locker room trying to round up cash from everybody.
I also use it for a few vendors for some small payments I make every month for my studio.
I don't use them a lot, and, like you, I find it crazy that some people I know use them for 90-95% of what they do. I try to limit my use of the apps as much as possible. Whatever works for people, I guess.
If we were rational creatures we might choose to do such things while seated at home in front of a comfortably sized screen, rather than squinting at a pocket gadget on a street corner.
IMO the word "need" is doing all the work here. I have a fairly complex financial situation and yet - with a bit of intentionality and organisation - I get by fine without any banking apps installed on mobile.
I'm never clear whether this Ask HN is for posting about what you're messing with, or for promoting organized projects that chase GitHub stars or are commercial.
But anyway, I've started to learn Go, by doing a vertical scrolling shooter with Ebiten. Kinda like fitting a square peg into a round hole. No, it's not public and will probably never be.
I'm studying how to do a memory pool for actors, since it doesn't look like garbage collection and hundreds of short-lived bullet objects will mix well.
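For what it's worth, a minimal free-list pool in Go might look like the sketch below. The `Bullet` struct and `BulletPool` names are just illustrative, not anything from Ebiten; the idea is to recycle dead bullets instead of allocating new ones every frame.

```go
package main

import "fmt"

// Bullet is a hypothetical actor; field names are illustrative.
type Bullet struct {
	X, Y   float64
	VX, VY float64
	Alive  bool
}

// BulletPool keeps dead bullets on a free list so spawning a
// bullet usually reuses memory instead of allocating.
type BulletPool struct {
	free []*Bullet
}

// Get returns a recycled Bullet if one is available, else allocates.
func (p *BulletPool) Get() *Bullet {
	if n := len(p.free); n > 0 {
		b := p.free[n-1]
		p.free = p.free[:n-1]
		*b = Bullet{} // zero the fields before reuse
		return b
	}
	return &Bullet{}
}

// Put returns a dead bullet to the pool for later reuse.
func (p *BulletPool) Put(b *Bullet) {
	b.Alive = false
	p.free = append(p.free, b)
}

func main() {
	pool := &BulletPool{}
	b := pool.Get()
	b.X, b.Y, b.Alive = 10, 20, true
	pool.Put(b)       // bullet left the screen
	b2 := pool.Get()  // same struct, fields zeroed
	fmt.Println(b2.X, b2.Alive) // prints: 0 false
}
```

The standard library's `sync.Pool` is an alternative, but it may drop pooled objects at any GC cycle, so for a steady population of bullets a plain slice-backed free list like this keeps allocation behavior more predictable.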
This way you'll be able to run more than one "web app" at the same time on your devices.