Hacker News
AMD doubles the number of CPU cores it offers to Chromebooks (arstechnica.com)
128 points by scrummy on May 7, 2022 | 125 comments



Hopefully this will put some pressure on manufacturers to bump specs on their lowest-end offerings. Dell's current $300 Windows laptop offering, for instance, is built around a dual-core Celeron, which is hard to excuse when most phones and tablets at lower prices have at least three or four cores.

Once the bottom-end baseline is finally moved up to quad- or hex-core, there should be a stronger push for software aimed at ordinary users, not just those with heavier workloads, to be properly multithreaded.


It seems the "core myth" has replaced the "MHz myth" among computer buyers. Those low-end quad-core phones and tablets are most likely using "little" ARM cores like the Cortex-A53 or A55. These cores are very small in terms of die area, which is why you can get four of them very cheaply.

Meanwhile that dual core Celeron is using Intel's performance cores which are much larger and several times faster than those ARM cores. Even for multithreaded workloads the Celeron will run circles around cheap quad core ARM SoCs.


Not sure that's the case; Celerons are just plain bad…

e.g. https://www.cpu-monkey.com/en/compare_cpu-intel_celeron_n450...

Single-thread performance is kind of close, though the 860 is faster. And let's not even look at the MT benchmarks. And it seems that most medium/high-end Chromebooks with ARM CPUs are generally able to outperform x86 ones.


You're comparing an Atom based Celeron, which is the bottom of the barrel for Intel CPUs and not something which often shows up in Windows laptops, to a top-of-the-line Snapdragon SoC. The wholesale price of the 860 is probably 2-3x that of the N4500.


https://www.cpu-monkey.com/en/compare_cpu-intel_celeron_n402...

The N4020 is the model of Celeron used in the 2021 Dell Chromebook. I'm seeing N3060, 2955U, etc.

The Pentium Silver N5000, which isn't badged as a Celeron, is still much slower on multi-threaded and single-threaded perf: https://www.cpu-monkey.com/en/compare_cpu-intel_pentium_silv...

The entire Snapdragon 865 package cost manufacturers $150-$160. The Pentium Silver N5000 retails for $90-100. The 860 would presumably be even cheaper than the 865.


That's Dell basically ripping consumers off.

https://www.notebookcheck.net/Intel-Celeron-7300-Processor-B...

That's the latest Celeron, which should be more or less equal to a 6th-gen i7 while using a lot less power. They are selling a three-year-old Celeron in a 2021/2022 Chromebook, which is inexcusable.

Acer sells a chromebook with an i5 that's routinely $430 refurbished (yes it's a refurb but it's basically always available) https://acerrecertified.com/chromebooks/?_bc_fsnf=1&Processo...


>The entire Snapdragon 865 package cost manufacturers $150-$160. The Pentium Silver N5000 retails for $90-100. The 860 would presumably be even cheaper than the 865.

Sigh. I would be surprised if Dell is buying the Pentium N5000 along with its chipset for more than $60. And that is excluding marketing rebates.


Snapdragon 7c Chromebooks do indeed outperform the Pentium-based machines…

Says more about Intel’s Atom line than anything tbh.


That’s because it’s an atom-type CPU core, and not a good one (the good ones are called “Atom”). A laptop with this CPU basically is using the “efficiency core” from a recent laptop but as its main core.

There are plenty of Core-class x86 ChromeOS laptops and these smoke all ARM-based laptops excepting Apple’s.


Not always. My recently deceased Thinkpad 13 chromebook with an Intel 3855U held up despite lacking support for multithreading. My HP Mini 311 with an N270 played HL2 and CS pretty well for a single core processor (yes, the Nvidia ION helped).

Currently rocking a Lenovo Chromebook Flex 5i with an i3-1115G4 and it's outperforming my phone's Snapdragon 845 in rendering video animations by a reasonable margin.


Yes, but my M1 spends all day with the efficiency cores trucking along at about 50%.

I actually misplaced the magsafe cable before I had to charge it the first time. The little cores make the right perf/watt tradeoff for all but the heaviest workloads, such as slack. Sometimes slack pops onto a performance core for a few seconds at a time.


I have a Chromebook (Samsung 4) with a Celeron N4020 2/2 1.1GHz ("up to 2.8GHz"), 6W TDP processor. It's not going to blow anyone away with its performance, but it's entirely adequate and I love that it has all-day battery life from a $100 device (mine was $92.44 delivered&taxed on sale; the typical street price is $119).

I don't want the lowest end offerings to become 15W TDP chips in $300 laptops. I think there's a perfectly valid place for 6W chips in $100 devices, which brings computing access to more people and places.


Do you run Linux? I think even web browsing on Windows with this is challenging


It's running ChromeOS with the Linux dev system installed. I did last year's Advent of Code in Clojure on this device, including some airplane trips. (I didn't solve every puzzle, but the limitation on the ones I didn't get was me, not the Chromebook. It runs Emacs, cider, and the Clojure REPL just fine.)

I'm typing on it right now and it's fine for casual use. (It gets a fair amount of weekend use because I neither want to undock my work laptop nor carry around something that large, expensive, and heavy.)

Would I run it as my only computer if $500 wouldn't pressure my family finances? Probably not. If my choice was between this and nothing, that's an even easier choice.


The bottleneck is usually RAM. It looks like his device has 4GB, which is similar to what a cell phone has.

It doesn't take much manufacturer-supplied bloatware, bad drivers, or background processes to use up that much RAM but a Chromebook is just the kernel and Chrome, so you get a lot of bang for your buck.


Similar to what a mid-range phone has :)


15W TDP for 4x more performance means the processor can race to sleep faster, which means it may even have better battery life.
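
A rough back-of-envelope of that race-to-sleep point (a sketch assuming perfect race-to-idle and negligible idle power, which real chips only approximate):

    // toy energy comparison: energy = power x time
    const energy = (watts: number, seconds: number) => watts * seconds;
    const slow = energy(6, 100);      // 6 W chip takes 100 s   -> 600 J
    const fast = energy(15, 100 / 4); // 15 W chip, 4x faster   -> 375 J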

The prices will eventually go down.


There shouldn't need to be such a large tradeoff between efficiency and core count, and in the world of ARM CPUs it's not. It's entirely possible to pack 4+ reasonably performant cores into a 6W TDP, and likely at non-extravagant prices.


> and in the world of ARM CPUs it's not.

How can this be true? I'm not saying it isn't, but this makes it sound like ARM is just objectively better - lower power and higher performance? I assume it's more complicated.


Most of the time you can increase performance either by raising clock frequency or by doing more per clock. Raising clock speed increases power disproportionately (dynamic power scales roughly with voltage squared times frequency, and higher clocks need higher voltage). On a desktop, chasing clock speed is usually the strategy because we can put decent cooling rigs on them.

Doing more per clock is difficult on x86 compared to ARM. x86's instruction set is a hodgepodge of instructions with variable lengths and addressing modes. ARM64, on the other hand, has far fewer addressing modes and a fixed 32-bit instruction length. When an x86 chip is trying to decode ahead of the instruction stream, it needs to decode each instruction in order, or have special logic to get around that, which makes it harder to stay ahead of the processor. Normally you see an x86 chip described as having a certain number of complex decoders and a certain number of simple decoders, because some instructions are just pigs to decode. Simple decoders get instructions that decode to 3 uops or fewer, while the complex decoders handle most of the rest. Some real pigs of instructions might even be sent to the microcode sequencer, which generates a whole heap of uops and takes a while.

In the case of ARM64, every 4 bytes you have an instruction, come hell or high water. On a chip like the M1, it takes 32 bytes of instructions, splits them 4 bytes apiece across its 8 decoders, and each spits out uops in parallel. From there the chip issues those decoded instructions to the necessary execution ports. Because of the less complicated decoding, the huge increase in decode throughput, and the huge reorder buffers, an M1 can keep more of its execution ports busy. If twice as many execution ports can be kept full, you can do the same amount of work in half as many clock cycles. Because you're only running at half the clock speed, your power usage is way lower.
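
A toy sketch of the decode difference being described (hypothetical instruction streams, not real encodings): with a fixed 4-byte length every instruction boundary is known up front, so decoders can work in parallel, while variable-length decode can't locate instruction N+1 until it has at least partially decoded instruction N.

    // Fixed 4-byte instructions: all start offsets are known immediately,
    // so 8 decoders can each grab a slice in parallel.
    function fixedWidthStarts(code: Uint8Array): number[] {
      const starts: number[] = [];
      for (let off = 0; off < code.length; off += 4) starts.push(off);
      return starts; // computable without inspecting the bytes at all
    }

    // Variable-length instructions (length stored in the first byte here,
    // purely illustrative): finding each start depends on decoding the
    // previous instruction, so boundary discovery is inherently serial.
    function variableWidthStarts(code: Uint8Array): number[] {
      const starts: number[] = [];
      let off = 0;
      while (off < code.length) {
        starts.push(off);
        off += Math.max(1, code[off]); // must read instr N to find instr N+1
      }
      return starts;
    }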


Presumably the cost here is that your instructions are considerably larger, which means fitting fewer of them into cache?


The code density of ARM64 is not that much worse than x64 - especially for anything generated by a modern compiler. You may get some small-scale gains for hand-tuned code with careful instruction and register selection (i.e. where the REX prefix can be more easily avoided) - but average binary density doesn't overcome the aforementioned differences in efficiency.


For reference, it offers about half the performance of a Snapdragon 845...

It's honestly a garbage chip for disposable devices, which is just bad for everything.

I'd rather use one $300 device than 3x $100 devices over the same timespan.


Also for reference, it's about double the performance of a 3.7 GHz Pentium 4, which was a perfectly usable desktop CPU.


Pentium 4 stopped being a benchmark for desktop usability a long time ago. Any JS website will bring it to its knees.


> Any JS website will bring it to its knees.

Maybe that's a problem in and of itself?

https://en.wikipedia.org/wiki/Wirth%27s_law

There is nothing intrinsically different about computing from, say, 10 years ago. You might still want to read some text on a webpage, click on a few buttons and have them do something, maybe fill out some text fields, or even upload/download a few files.

Instead, you get "visual experiences" with overcomplicated UIs built on similarly overcomplicated underlying technologies (edit: not to say that there aren't benefits to technologies like Vue/Angular/React, however they aren't "necessary" to get things done most of the time; in many cases even server-side rendering without JS would be enough), all of which waste whatever resources you give them, especially if you don't have an ad-blocker on, in which case you get bogged down with dozens if not hundreds of malicious scripts.

Of course, this is a bit akin to shouting at the cloud, but nobody should be too proud about the state the modern web is in and use it to justify wasteful hardware and software requirements: https://idlewords.com/talks/website_obesity.htm


With an ad blocker, I've found the next bottleneck for web browsers is the GPU. I'd guess a pentium 4 with the equivalent of a five year old, well-supported integrated Intel GPU would be fine.


The Pentium MMX was also a perfectly usable desktop CPU, ran Windows XP and stuff.


I'd rather they put the extra $200 into the keyboard, display, passive cooling, audio and webcam.

Even "garbage" arm CPUs are fine these days. All my real work happens when I remote into a machine that's faster than a $3500 workstation laptop. The remaining heavy workload is the web browser (and, perhaps, corporate email client / other crapware)


> I'd rather use one $300 device than 3x $100 devices over the same timespan.

I agree, but do consider how many Chromebooks are purchased by schools. The calculus there might be different, because kids drop things.

Note that ethically, I'm not convinced this need outweighs the environmental concerns.


It is important to remember that the market is not just you.


9W Alder Lake mobile CPUs start at 5 cores (1P+4E), even for Pentium/Celeron. 6W class will apparently be E-core only and go up to 8 cores, unclear what the minimum is since details haven't been announced yet. Overall, core counts should be going way up on average this generation.


These "Windows" laptops make perfectly fine Linux machines. (Even web browsing with Firefox tends to be very manageable on as little as 4GB RAM and even a slow CPU.) It's still unreasonably hard to install an alternative OS on a Chromebook and not end up with a useless toy that will literally prompt you to wipe its storage clean at every subsequent boot unless you quickly press some obscure key combination. Yes, there might be some ways to replace the firmware and stop it from doing that, but it's just not a sensible design overall.


If the Steam Deck didn't exist, then due to the recent support for Steam[0], one of these would be my next choice for a portable Half-Life 1/2 appliance.

[0] https://www.chromium.org/chromium-os/steam-on-chromeos/


Unless things have changed over the last few years, I think having 8/16 is a bit of overkill for an OS this restricted. It says they all have a 15W TDP, but I'd hope that's just the max they have been configured to work at. Hence methinks even the 4/8 5425C should be plenty for web browsing and running Android apps for many, many years.


Good hardware restricted by software reminds me once again of what iPadOS is. Despite excellent processors, the software is essentially iOS, with all its restrictions. Though ChromeOS is Linux-based, at least.

Question for any curious or innovative HN readers - what would you suggest to do with iPadOS' restrictions? I'm speaking both as an ipad owner disappointed with the software but also as an M1/Ax chip fan.

Potential solutions I can easily think of are: 1. Jailbreak - but it needs specific software and can be finicky, and very likely forces you to not get security updates

2. Physically remove the storage and *do something*. Except I don't know what even is possible, assuming that you're okay performing BGA soldering on a $$$ device.


Waiting and hoping that somewhere like the EU forces Apple to open up the software/store restrictions has seemed like the most realistically hopeful path to me. The alternatives are waiting for someone to find a way to hack the bootloader open and add Asahi support for it (for this and every device that comes out for the rest of time), or for Apple to allow the bootloader to be opened like they do on its Mac counterparts, but obviously that's not what Apple wants to do or they wouldn't have released the M1 iPad fully locked.


Even better, Apple could port macOS to the M1 iPad. Maybe they could merge iOS and macOS and allow people to switch between tablet and desktop mode (like Samsung has done with its devices). They could sell an external keyboard + trackpad for the desktop mode.


I would be entirely unsurprised if Apple doesn’t already have macOS builds targeting the M1 iPad…


For sure, there were A14 based mac minis given to devs during the switch. The only "missing" components are drivers for the display/speakers etc which is the smallest part of getting a working system (compared to the OS + kernel).

It's also this thing that infuriates me to some extent. Apple could do amazing things if it wanted to, but... it doesn't appear to care about consumer benefit.


More freedom for users is less money in their pocket.


A12Z-based.


Why is that even conjecture?

It’s the same processor as in the M1 Mac and the dev ARM kits had A14s.


They probably won't ever fully merge the UIs of macOS and iOS, but given that the underpinnings of the two are so similar it would make a lot of sense for iPadOS to be able to suspend its touch-based userland and boot up a macOS-based KB+mouse userland.


Mac Catalyst is basically UIKit userland for macOS. I think SpringBoard and Finder will remain discrete UI paradigms, but I think iPadOS might be going towards a place where it can use either depending on what it has connected (see: https://www.macrumors.com/2022/05/06/apple-patent-ipad-with-...).


Sadly, to fully use the M1/Ax processors I think the best thing to do with it is to sell it and buy a MacBook, and encourage others to not buy the high end iPads right now.


That's what I think too; however, there are 2 issues that come to mind immediately: 1. There is a market for tablets - for note taking or reading magazines, it really is convenient. You could probably switch to a Samsung tablet and likely get a very decent experience, but a lot of the "good" apps are still iOS-only (Procreate, Goodnotes, Notability etc). Not to mention a decent aspect ratio.

2. If buying a proper computer, then personally, unless you only use macOS, I think it's prudent to get an x64 chip. Intel's 12th-gen chips are (fortunately, finally!) again competitive even with M1s. An Intel/AMD chip can run Windows/Linux/macOS/BSD/most OSes, but M1 Macs unfortunately can't.

Ironically I plan to upgrade from my Air to a Pro for the high refresh rate. Getting a 90hz phone really spoiled me in the most first-world way possible.

- Things 3 is another classic example of an app that would be very easy to port to other platforms if so wished, but the devs aren't interested in going outside Apple's Walled Garden. And if it makes them good money I can't even blame them.


> Things 3 is another classic example of an app that would be very easy to port to other platforms if so wished, but the devs aren't interested in going outside Apple's Walled Garden. And if it makes them good money I can't even blame them.

A big factor is likely the quality of the UI frameworks on other platforms. On Windows, only the older "legacy" frameworks come close to the depth of AppKit but are a bear to work with (and in questionable maintenance status). GTK isn't the worst, but version 3 and up makes no attempt to fit in on non-Linux desktops. Qt probably comes closest but it comes with the caveat of being tied to C++ or Python, and distribution can be a pain. With Electron you have to bring your own everything.

I follow some Apple platform devs (on top of being one myself) and there's a desire among them to produce software for other platforms, but only once there's an option as nice as AppKit/UIKit to do so with.


Yeah that's quite understandable. I just wish Microsoft/Google would attempt to improve this aspect - they probably already are doing things but from the sound of it not enough.


The base iPad with an Apple Pencil is under $500. I find 64GB to be workable, but if you want 256GB, it's another $150 (ouch).

Granted, it’s not Samsung tablet cheap.


Samsung also makes very expensive and very high quality tablets. All they're missing imo is a bit more processor oomph.


(Replying because I can't edit - the italics are accidental, I intended to use an asterisk for the Things 3 point)


A touch screen and pen support is just too good to pass up sometimes. I would love an M1 class android tablet that had all of this - thankfully we're getting close.


> and encourage others to not buy the high end iPads right now.

My entire group at work bought iPads as dedicated Zoom devices at the start of the pandemic. Did we make a mistake because our needs are different from yours?


You probably did make a mistake if you paid for top storage specs, keyboards and pencils, all exclusively for zoom.


Nope! Specifically meant this advice in the context of people who looked at the top of the line iPads and were hankering "to fully use the M1/Ax processors" to the point where they had considered cracking the hardware open or jailbreaking it. If you are looking for a computing appliance on the happy path then an iPad could be perfect.

Personally I would say I don't really know what the top-of-the-line iPads are _for_ that the midrange ones can't handle, but I can imagine there are apps out there that stress them when multitasking. My point is that if you want a general-purpose computer that fully uses the hardware, you should just buy one of those instead of trying to make an iPad into one.


Ah, I had no idea that's what you meant. Thanks for explaining.


> Question for any curious or innovative HN readers - what would you suggest to do with iPadOS' restrictions?

Sell it and get a proper computer.

If ipad os is restrictive for you then the ipad is not for you.


True but I already have a normal windows laptop. I just wish something as capable as the iPad could reach its potential.


> I think having 8/16 is a bit of an overkill for an OS this restricted.

Restricted? It's capable of running:

* android apps directly from google play

* multiple linux containers (lxc)

* gpu accelerated linux gui apps (wayland/lxc)

* docker containers inside lxc

* kvm virtual machines capable of linux, windows and even macOS guests


I have an 8-core CPU + 32GB RAM + 1TB Chromebook and it's my daily driver. I have ~100 tabs open, ~2 IntelliJ projects open, some streaming service like YouTube, Netflix, Hulu, etc. I run builds that pin the CPU, such that if I had twice the cores I'd absolutely notice it.

I'd be very happy to see 16 core Chromebooks tbh, I definitely make heavy use of all 8 of mine today.


At these specs I expect the price to be pretty hefty. What was your reason for going with a Chromebook instead of another laptop + Linux?


It’s really surprising to me when people suggest that ChromeOS is worse than some other Linux. To me it’s head and shoulders above all the rest, because all the drivers always work perfectly, the touchpad works perfectly when other Linux developers are still putting out press releases every time they fix something trivial in their incredibly broken multitouch input stacks, and all the binaries including the kernel are peak-optimized with profile guidance for every specific CPU platform. There is no Linux distribution that can touch ChromeOS.


Can you use it for 'regular' computer stuff like compiling Python modules or running random binaries?

I had a cheap ChromeBook I used for quite a while, basically until the battery gave out and it turned into a desktop machine, but chromeOS was pretty limited back then so I just threw fedora on it. Almost all my Blender dev work was on that poor little underpowered thing…


Yes, on most[1] hardware: https://chromeos.dev/en/linux

[1] Released since 2019 plus these: https://sites.google.com/a/chromium.org/dev/chromium-os/chro...


Yep, and it's incredibly easy to install. You just tap one button in the settings and wait a moment.


How well does it work without a Google account? Are there degoogled distributions that are likely to work, and alternative stores?


That would seem like the wrong tool for the job. I always say you pick your application, then the best operating system for it, then the best hardware for that, and I don't see any way to start from no Google account and end at ChromeOS. Using the Linux environment requires a signed-in (not guest) session and the only identity provider for ChromeOS of which I am aware is Google.


It was like $3,400 or something like that.

It's a work laptop, although I use it almost exclusively these days since I can easily use a "personal" profile. It's very easy to manage things like SSO/device policies on Chromebooks because of the GSuite integration.

There's pretty much nothing that it's "worse" at, other than in some niche scenarios - like there's a bug where the VM will return an invalid code for a specific CPUID, and it doesn't support nested virtualization, etc. Pretty niche stuff.

Otherwise... it works. Funny enough I'm now in quite a pickle with my Ubuntu laptop, which updated to a new kernel, failed, and now I can't roll back to the previous kernel. Because of this, virtually no drivers are working, so I can't connect to the internet... making it really really fun to deal with! Stuff like this doesn't really happen on my Chromebook.


What's changed is that web pages, whether you like it or not, are full-blown applications, written in JavaScript, so an OS restricted to browsing web pages still has performance needs, especially at the top end of the market.


Yes, people still underestimate the value of single-core perf and overestimate the value of multi-core perf.

In practice: what matters to users of these devices is web browsing performance (which is still mostly a single-core job - perhaps a second core can be practical in some browser/OS combos).


We're talking Chrome here, it'll happily chew through all the cores.

Also, remember, this "restricted" OS is more than capable of bringing up a full Debian container.


In fact it's capable of bringing up many. You can create N VMs and M containers if you want to.


Chrome on a Chromebook is much more efficient than Chrome on other OSes.


It's the exact same codebase.


That codebase has to integrate with all sorts of GPUs and desktop environments. Unless you carefully tune everything in the software stack, it ends up being a slow power hog. ChromeOS comes pre-tuned.

(Disclaimer: I prefer Firefox. Sometimes I compare it to chrome. They're usually comparable, but on some machines, one completely blows the other out of the water.)


Now if only Chromebooks could offer better resolution than 1920x1080.

Do I have to buy an AMD Windows laptop, pay the Microsoft tax, and convert it to a Chromebook? (Assuming it's possible).


Microsoft only taxes laptops sold by physical retail outlets. When ordering them online, one can often find a laptop without any OS preinstalled. Vendors are usually selling them to corporations who want Win10 enterprise covered by their volume licensing contracts.

Another good thing about them: it's very uncommon for enterprise-targeted models to have soldered RAM or SSD. For instance, my secondary computer is an HP ProBook 445 G8 with a Ryzen 5 5600U which I upgraded to 32GB RAM / 2TB SSD; can recommend. However, I have no idea about ChromeOS compatibility, I'm using Windows and ordered a version with the OS license included.


For all intents and purposes, there is no Microsoft tax. Manufacturers make more than enough on the pre-installed crapware to make up for the Windows license.


Pixelbook is 4k. Seems like google discontinued the line but I use my 5 year old machine every day and it’s amazing. Boots in 1 second. Meanwhile my windows laptop is unusable.


Anyone here have any luck converting a high end ASUS or Acer Windows AMD laptop to ChromeOS? Any pitfalls to be aware of?


Is there a Pixelbook with an AMD CPU?

Nevermind - as you mentioned, it's been discontinued.


FWIW I've owned 2 4K chromebooks (replying on one right now, Pixelbook Go)


I think this is good news, even though I bought a Chromebook last year and expect to use it for at least 5 years before replacing it. Linux containers are a very nice feature for development and having more CPU cores and general power is a great thing.

My Chromebook, at $300, is a great deal compared to my new large iPad Pro (just the Magic Keyboard is $350 and the Pencil is extra; both are included with the Chromebook).


I'd be more interested in turning a Chromebook into a vanilla linux box if they moved away from soldered RAM (which is all I saw in Chromebooks a few years ago).


Yep, and slow storage too last time I checked.


Sadly, DRAM latency has stayed pretty constant over the last decade. As the number of cores per memory channel keeps increasing, it does make one wonder when more memory channels will be added.


>when more memory channels will be added

Already happened a few months ago. DDR5 doubled the number of memory channels in a normal desktop from 2 to 4. That’s 2 channels per DIMM.

Of course DDR4 still has lower latency today, but that should change next year.
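
For a rough sense of scale, a sketch using typical retail kit timings as assumptions (first-word latency in ns is roughly CAS cycles divided by the memory clock, which is half the transfer rate):

    const casNs = (cl: number, mts: number) => (cl / (mts / 2)) * 1000;
    casNs(16, 3200); // typical DDR4-3200 CL16 -> 10 ns
    casNs(36, 6000); // typical DDR5-6000 CL36 -> 12 ns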

Today the only way to reduce latency is to overclock your RAM. It’s pretty easy to get a $200 DDR4 kit to perform better than what you’d find in $5000 prebuilt PCs.


DDR5 also halved channel width.


Ironically it makes it faster. tRP and tRCD numbers are getting so big compared to transfer clock rates that it's more efficient to double the bank groups and send data on more, smaller channels than to speed up a single one.


Because bandwidth has already been increasing exponentially.


Cores are basically free.

Interconnect is expensive.

Seriously. A Pentium IV was 40M transistors. A Ryzen V 2000 has around 5 billion transistors. It could fit 100 Pentium IV cores if desired.
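
Back-of-envelope, treating those transistor counts as rough figures:

    const budgets = 5_000_000_000 / 40_000_000; // ~125 Pentium 4 cores' worth of transistors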

That's not desired -- those transistors are better spent bumping up IPC a little bit -- but we can have a perfectly adequate processor at 1% of a modern CPU.

Pentium IV single-core performance is almost identical to a modern entry-level netbook processor:

https://cpu.userbenchmark.com/Compare/Intel-Pentium-4-300GHz...


>Pentium IV single-core performance is almost identical to a modern entry-level netbook processor:

>https://cpu.userbenchmark.com/Compare/Intel-Pentium-4-300GHz...

No it isn't.

1. userbenchmark is a joke in the hardware community. just search for "userbenchmark bias". It's also banned from both /r/intel and /r/amd.

2. even they themselves admit that the "modern entry-level netbook processor" is 59% faster in single-threaded performance. The only thing making up for it is "Memory Latency", which I doubt can make up for a 59% gap in performance.


> Pentium IV single-core performance is almost identical to a modern entry-level netbook processor: https://cpu.userbenchmark.com/Compare/Intel-Pentium-4-300GHz...

Almost identical? The wimpy netbook processor is 59% faster on a single core. Ignore that site's overall speed numbers, they make no sense.

But a fair comparison has to be to a desktop chip. Let's look at an i3-12300. It rates +597% on single core performance; seven times faster.


You'd be better off comparing to a Pentium M. If you put 100 NetBurst cores on a single die, it would melt.

[edit] Or maybe a Core 2, as that was 64-bit. The E7500 with 3MB of cache was dual-core with 228M transistors, which puts you at 40 cores with 60MB of cache for 5B transistors.


I’d love to have a computer with a hundred little pentium cores to play around with — makes me wonder why nobody has made one yet (AFAICT).

Or one that doesn’t cost big dollars since the arm server chips seem to be going in this direction.

I mean, 256 x86 cores seems perfectly reasonable, right?


You could get a pair of used epyc 7601 and a motherboard for like $1200. Unlike a hundred little pentium cores, these 64 cores can run real workloads ~4x as fast as a modern desktop in the same price range.

Alternatively an old 4 node server could get you there even cheaper if you don’t care that they are separate computers in one chassis. I got a used C6100 with 24 cores across 8 CPU sockets for $600 6 years ago. You could probably get >100 cores for the same price today.


counting hyperthreading, that's 128 vcpus, yeah?


Huh I didn’t even think of that.

If threads count, the Xeon Phi 7210 64 core 256 thread would be even better at $125. Still x86


For the most part, because you'd be IO-bound, and you'd be limited by single-threaded speed.

A better architecture is to have the same, but as a co-processor for highly multithreaded tasks. That's basically a GPU. You get a few thousand cores on an NVidia 3060, in an architecture designed for this.

To the other comments in the thread:

- I'm mostly giving big-O / ballpark numbers.

- In terms of power, Pentium IV was 180 nm - 65 nm. A 7 nm process can be much more power-efficient.


Memory throughput keeps increasing. While the number of channels per core doesn't increase, the clock rate keeps increasing, and when it stops increasing then bus width typically increases.


Did adding more channels start improving performance in a significant way sometime in the past decade?

Last time I checked, single vs dual channel was like a 5-10% performance difference, mostly useful for integrated graphics (and even then latency was the bigger problem)...


It's highly workload-dependent. Many things are cache-friendly and won't care. Adding an integrated GPU definitely shows improvements with increased bandwidth and increased channels.

Similarly, adding enough cores makes the extra channels helpful as well; for that reason servers often have 4x the memory channels of desktops (8 channels vs 2). Even with 8 channels, the performance scaling from using half the cores to all the cores is often poor, at least for the top-spec chips.

AMD seems to have realized this and their next gen will have 24 channels instead of 8.


Too bad JS is single-threaded. And the overhead of Web Workers (separate threads that communicate by message passing) makes them too expensive for all but some niche workloads.
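
A minimal sketch of that model (hypothetical worker file name; the structured-clone copy on every postMessage is a big part of the overhead, unless Transferables or SharedArrayBuffer are used). The two halves below would live in separate files:

    // main.ts -- main thread: work is shipped to the worker by copying
    const worker = new Worker("sum-worker.js");
    worker.onmessage = (e: MessageEvent<number>) => console.log("sum =", e.data);
    worker.postMessage(new Float64Array(1_000_000)); // structured clone: copied, not shared

    // sum-worker.js -- runs on its own thread with its own event loop
    onmessage = (e) => {
      let sum = 0;
      for (const x of e.data) sum += x;
      postMessage(sum);
    };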


One thread for JS main loop, one for the DOM, others for network requests… Browsers on regular web pages are already quite multithreaded.


Modern Chromebooks do a lot more than just run a single website or a single web app - for example, right now I am replying to you in the Chrome browser, but I also have a Docker container running, doing some dev work in the Debian Linux console on a separate workspace.


A use case that is meaningless for 90+% of Chromebook users.


Interested to see your data - can you share a reference? Thanks!


I have been looking for a Chromebook/Linux small laptop and all the small laptops available in Japan kinda suck for various reasons.

I wish Apple would bring back the 12-inch MacBook with the new M1 chip. Its Atom-class chip and butterfly keyboard were horrible at the time, but now it would be a perfect ultralight laptop.


How much can you do with the Linux terminal and/or X11/Wayland apps (and power-hungry Android apps?) on a modern Chromebook?


Up until recently I owned a Pixelbook, and the Linux layer (Crostini) made ChromeOS a very viable development platform. The one thing I missed was the ability to start virtual machines (and I believe this may have been addressed on newer ChromeOS hardware)


Really a lot. I recently installed CloudReady (the equivalent of ChromeOS Flex) on an 8th-gen Dell Latitude. From GIMP to running 3 different Chrome browsers with different profiles, it all just works.


It really is impressive how much a cheap computer can do with the right software nowadays. The cheapest I see glancing at Amazon right now is $75. Chump change in the first world.

Probably even the cheapest part of schooling equipment now, too. I've never seen a textbook go for less than $100, at least in my experience.


Still using my 300 euro Asus 1215B from 2009.


I still really miss my dell mini laptop. It fit my small hands well and was easily lighter than a book.

I used to throw it into one of those mini fashion backpacks and bike to the park to write a bit of code on nice days.

Not much fear of breaking it because it was so cheap. It had swappable batteries too, and I sometimes brought an extra one along to swap in (which actually sounds crazy compared to how most laptops are nowadays).


It still does the job quite well; naturally I took advantage of being able to expand it to 8 GB and replace the HDD with an SSD.

And despite my Linux vs Windows posts, it is actually Linux-based and was bought that way, which is also proof that even that didn't help with regard to some common Linux desktop themes.

Anyway, it has served me well during my travels; like you, I'm used to taking it everywhere.

When it finally dies, it is going to be quite hard to find a good replacement that takes over similar responsibility.


That's some serious frugal :)


Yep, the time when we had to replace computers every two years is well behind us, and that is what strikes fear into OEM hearts.

When you're not doing Docker or microservices for everything, there is hardly any need to keep buying hardware.

WebGL 2.0 is based on OpenGL ES 3.0, so for that kind of graphics, any GPU after 2011 will spend most of its cores sleeping.

For compiled languages, even C++ (if using binary dependencies via package manager), the workflow is fast enough.

I am of the opinion that even something like an Amiga 3000 would be more than enough for what most people do with their computers. :)


Preaching to the choir, my dear. Whenever I see a "new big web project", all I see is people sending a few bits of text... sure, it's coated with useless 4K vids and high-res banners, but the crux of the protocol is still a little bit of text. And considering the average brain speed of the population, a fast Minitel would suffice :cough:... hell, the human/system impedance might even improve.


Expired Chromebooks are cheap, and great for installing Linux.

For my purposes, a $100 used Chromebook is perfectly adequate, and is the sort of device I can take on a hike, kayak, or bike ride, and not worry if it's lost, stolen, or damaged.


What is the best source to learn more about replacing ChromeOS with Linux? When I was briefly considering this, I found most of the Chromebooks came with non-replaceable eMMC (<64GB), soldered ram (~4GB), or 720p resolution.

I am willing to adjust my performance expectations considerably, but the non-expandable storage has made me think I am in for a world of annoyance if I want to use anything other than a web browser.


The trick isn't to shop for /most/ Chromebooks. The trick is to shop for /decent, expired/ Chromebooks. Chromebooks are designed around planned obsolescence, and all come with a use-by date, after which they stop updating:

https://support.google.com/chrome/a/answer/6220366?hl=en

Chromebooks near or past the planned obsolescence date can be had for a song, including decent models. The market is close to non-existent, so there's a glut of them.

My Chromebook has a 3200x1800 display, 16GB RAM, and takes an SD card (for expandable, albeit slow, storage). That's plenty for most of the types of work I'd like to do on a boat. It was under $200, almost expired. New, it would have been close to a grand.

The most popular way to install Ubuntu is with crouton:

https://ubuntu.com/tutorials/install-ubuntu-on-chromebook#1-...

However, I installed it natively. Here's a random tutorial:

https://dbtechreviews.com/2018/09/how-to-install-ubuntu-on-c...

The key annoyance (really the only difference from a "real" laptop) is you have to hit a special key sequence on every boot.

I definitely don't think of it as a "world of pain." I wouldn't use it as my primary laptop, but it's great as a device I can use in places I'd never take my primary laptop.


One of the better (most ergonomic) laptops I've used was an Acer "WinBook", which was a fanless windows laptop (probably based on their chromebooks) that happened to be extremely Linux friendly.

Any idea if they're making similar non-chromebook versions of these?


AMD chromebooks would be amazing Linux machines if it wasn't for Google messing things up.


Does anyone still care about Chromebooks (outside of the education sector, that is)?



