This $1099 config better have 16GB of memory and 512GB of storage by default. Otherwise it is a machine that costs about the same as a MacBook Air M2, with a lower display resolution, a slower CPU and GPU, and no Touch ID. This is perhaps the first time in history that Apple has the better "spec" machine for the same price. With the scale of the iPhone and iPad, Apple is now at a point where the whole PC industry can no longer compete. And that is excluding things like the speakers, trackpad, and Thunderbolt port.
>A spokesperson told Ars Technica that its business customers are being more open to the aspect ratio.
Biggest lie ever. People want 16:10, but for years the industry shoved 16:9 on customers because it was the cheaper panel.
> Arm Cortex-X1 cores at up to 3 GHz
For context, the single-core GB5 scores:
Arm Cortex-X1 cores at up to 3 GHz = ~800
Apple M1 (A14 core+) = ~1700
For a PC laptop counterpart, at an ~800 GB5 score you are looking at something like an AMD Ryzen 7 4700U limited to run at ~10W. So not too bad. Apple is the outlier here.
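Rough arithmetic on those figures, as a quick Python sketch; these are only the approximate scores quoted above, not official benchmark results:

    # Back-of-the-envelope comparison using the approximate GB5 single-core
    # scores quoted above; illustrative only.
    scores = {
        "Cortex-X1 @ ~3 GHz": 800,
        "Apple M1 (A14 core+)": 1700,
        "Ryzen 7 4700U limited to ~10W": 800,
    }
    baseline = scores["Cortex-X1 @ ~3 GHz"]
    for name, score in scores.items():
        print(f"{name}: {score} ({score / baseline:.1f}x the X1)")
    # -> the M1 core lands at roughly 2.1x the single-core score of the X1.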
> This is perhaps the first time in history that Apple has the better "spec" machine for the same price.
This was also true at times after Apple switched to Intel — for example, around the time Apple went SSD-only they were considerably cheaper _and_ faster than the competition, who were either shipping HDDs or using slower low-end SSDs on HDD-optimized interfaces. The main confound has been that they weren't selling crappy but super-cheap machines or large laptops, so you could find products in those classes which were faster (at twice the weight / half the battery life) or much cheaper, and a lot of people would make comparisons between things which were really in different classes.
The difference now is that we've hit the point where it's easier to do those side-by-side comparisons because everyone is using SSDs, solid case designs with decent thermals, etc. so there isn't an obvious confound like weight or build quality to complicate the comparisons.
The thing with Lenovo is that they advertise inflated prices. Everything is 50% off for a few months around Black Friday; the list price is only for press releases. Besides, ThinkPads are sold primarily in bulk to companies, who also get the discounted rate.
ThinkPads for some reason are disproportionately cheaper in the US. In a lot of countries in (South) East Asia they are a lot more expensive. Taiwan's own electronics brands are cheaper in the US than they are in Taiwan.
No, the MSRP is simply doubled in regions where they do that, and effective prices are exactly the same as elsewhere. A midrange model might be listed at $4999.99, struck through with an "up to 65% off launch celebration offer" and a "bonus" 15% off from auto-applying coupons, and all combined the price might come out to e.g. $1045.69 for an i5/16GB. That sort of BS. It's actually one of the aspects that has improved about ThinkPads since the Lenovo handover.
Maybe on the consumer side, but on the enterprise side I love them. PSREF is by far the best tool any manufacturer has ever made for comparing models for an enterprise selection. I am still shocked no other manufacturer has created a comparable tool.
> Biggest lie ever. People want 16:10, but for years the industry shoved 16:9 on customers because it was the cheaper panel.
Any background on this? I always assumed 16:9 became popular because of videos.
If you are authoring 16:9 content, something like 16:10 allows for some UI controls to be displayed along with 16:9 video at native resolution during previews. When programming, more space is better. So for creators, 16:10 is unambiguously better.
Agreed. Along with that, the ambiguous 1080p, 1440p, etc. monikers: so much of that shorthand assumes the aspect ratio, yet there are plenty of displays that are 21:9, 16:10, 3:2, etc.
That said: diagonal size with an aspect ratio, e.g. “13" 3:2”, is fine.
> A 16:9 screen is wider than it is tall, so if you’re programming that means fewer lines of code visible at a time.
A 16:10 screen is also wider than it is tall.
I'm not sure if I'm being trolled.
Number of usefully displayable lines is not defined by the x:y ratio.
Further, any half-way decent 16:9 monitor can, in a matter of moments, become a 9:16 monitor.
EDIT: I am aware we're talking about a laptop display, so orientation isn't flippable - but OTOH if you're trying to develop code on a 13" monitor at 1200 pixels high - your problem is not a ratio one.
That's not how 16:10 monitors work. Every one that I've seen has the same pixel width as a 16:9 screen but more vertical pixels. 2560x1600 vs 2560x1440 for example, or 1920x1200 vs 1920x1080.
These are standard panel sizes. No one is making a 1728x1080 panel to get to 16:10.
True, but if you have two windows side by side, the wider 16:9 gives you more horizontal space to do so.
One more thing about display sizes and aspect ratio:
Since display sizes are usually given by the length of the diagonal, those aspect ratios that are closer to a square (1:1) will have a bigger area for the same diagonal. With the same diagonal length, a 16:10 has a ~5% larger area than a 16:9, and a 4:3 has a ~12% larger area than the 16:9.
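For anyone who wants to check those percentages, here is the geometry as a minimal Python sketch (nothing display-specific, just diagonal and aspect ratio):

    import math

    def area_for_diagonal(diag, rw, rh):
        """Screen area for a given diagonal length and w:h aspect ratio."""
        scale = diag / math.hypot(rw, rh)
        return (rw * scale) * (rh * scale)

    base = area_for_diagonal(13.3, 16, 9)
    for rw, rh in [(16, 10), (4, 3)]:
        extra = area_for_diagonal(13.3, rw, rh) / base - 1
        print(f"{rw}:{rh} vs 16:9 at the same diagonal: {extra:+.1%} area")
    # -> roughly +5% for 16:10 and +12% for 4:3, matching the figures above.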
In most cases that I've seen, a 16:10 vs 16:9 display has the same pixel width (the 16 part) and more vertical pixels. So you were never actually sacrificing width as you claim.
I've had 3 16:10 panels in the last 10 years and this was the case each time: 1920x1200 (vs 1920x1080), 2560x1600 (vs 2560x1440) and now a 3840x2400 (vs 3840x2160).
16:10 monitors were out of fashion for most of the last 10 years, but are making a comeback lately. They were hard to find for a while but worth the effort imo.
> In most cases that I've seen, a 16:10 vs 16:9 display has the same pixel width (the 16 part) and more vertical pixels. So you were never actually sacrificing width as you claim.
Sure, that's true for the pixels (which might also be the more important part), but the actual width (inches) is still bigger on a 16:9 (for the same diagonal).
Sure, but you said that when viewing documents side by side a 16:9 would give you more space, when really it's the same number of pixels, and a trivial amount of physical difference (less than an inch on a 27" panel).
I was just clarifying that this is technically true but not really a noticeable difference on that axis.
> the actual width (inches) is still bigger on a 16:9 (for the same diagonal).
That's hypothetical; they don't have the same diagonal in the real world (e.g. 15.6" laptop displays are 16:9 while the 16:10 ones are 16", and they have the same width after all: http://www.displaywars.com/15,6-inch-16x9-vs-16-inch-16x10)
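The same kind of back-of-the-envelope geometry for that specific pairing (a rough sketch, ignoring bezels):

    import math

    def width_height(diag, rw, rh):
        """Physical width and height for a given diagonal and w:h aspect ratio."""
        scale = diag / math.hypot(rw, rh)
        return rw * scale, rh * scale

    for name, diag, rw, rh in [('15.6" 16:9', 15.6, 16, 9), ('16" 16:10', 16.0, 16, 10)]:
        w, h = width_height(diag, rw, rh)
        print(f"{name}: {w:.1f} in wide x {h:.1f} in tall")
    # -> both come out ~13.6 in wide; the 16:10 panel is roughly 0.8 in taller.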
A lot of people on this thread seem to think one way is better than the other, but I'm happy with the market being a mixture of devices. I prefer wider and shorter screens because I always have multiple things on screen at once and want to see them all - but I appreciate other people work differently.
I get why 16:10 is preferable (I'd like 4:3 even more, or even 1:1 on a desktop). What I don't understand is the incentive to push 16:9. Is it just because fewer pixels means lower cost?
If you're authoring 16:9 content, you can display 1080p video at native resolution with UI controls on a 2560x1440 monitor, which is also 16:9.
16:10 in a vacuum is a useless metric. Real world stuff like actual resolution, screen dimensions, font sizes, scaling, etc. only make 16:10 ambiguously better.
I have a MacBook Pro with 3456 × 2234 (around 16:10.5) resolution and it is perfect. I don't know why people would want laptops at 16:9 at all. I can play videos just fine on this thing and the other 95% of the time it's a better resolution for doing things like coding and spreadsheets.
New MBPs are exactly 16:10 for full-screen, non-"notch-aware" apps - the screen is basically 16:10 below the notch, plus the two side areas. Rather neat solution imho.
On a related note, how much I wish 1440 / 1600 vertical had become dominant for PCs, rather than just "bigger number better".
It's so much more of a performance : sharpness sweet spot than 1080p - sorry, I can see pixel effects on a typical laptop - or 4k - which even desktops typically don't drive well.
>> I don't know why people would want laptops at 16:9 at all.
I don't believe almost any of the HN audience does. The cramped vertical space of almost all laptops is what makes a second monitor practically mandatory for me.
> 95% of the time it's a better resolution for doing things like coding and spreadsheets.
Depends on the use case. My wife (an accountant) lives in Excel, and I set her up with two 21:9 monitors side by side (42:9 overall) and she loves that setup compared to the one she has in the office.
When it's a choice between something like a 16:10 1440x900 and a 16:9 1600x900 screen I'll pick the latter every time; more resolution is more resolution. A 4K screen would give you more "space" than your MBP offers.
It's easier to run LCD production lines if all the panels are produced on the line at the same aspect ratio. Given the prevalence of 16:9 video panels, this made it easier to produce 16:9 panels for laptops. (So, you're essentially correct.)
I'd expect laptop screens, having a different pixel density, to be manufactured on different lines so that having the same aspect ratio as some other size wouldn't count that much.
If the pixel density differs (a lot) then yes, it doesn't matter. But you still need economies of scale. This matters less now because the current LCD panel industry is essentially what the smartphone market left behind, so you get a lot more flexibility than when everyone was trying to print millions of the same 16:9 panels.
No up-to-date numbers (because I no longer care as much). Before the pandemic and chip shortage, the trajectory was that OLED would be cheaper than LCD by 2022/2023. That is excluding other benefits like a thinner display, so you can fit a larger battery, and higher-res (spec) numbers for better marketing. It is only a matter of time before LCD gets completely squeezed out of smartphones.
This is something that scares me. Since Linux had a large dominance on ARM, it was common for vendors to make drivers for their devices. By making drivers, I mean getting the driver properly mainlined. Of course, there has always been the mostly proprietary and closed board support package, but it was to their disadvantage and there was a huge incentive to write proper drivers.
My fear is that, with Windows becoming popular on ARM, this incentive will disappear and Linux support on these machines will be weak.
Would it be a situation where they would have to do less work because of existing Linux support, rather than building new Windows support? Dunno, this looks like another mess then.
Tl;dr: our peripheral vision extends more widthwise than heightwise. That 4:3 was popular at all was an artifact of display technology not being able to take advantage of that. Now, of course, for using a computer vs. just watching movies or playing games, 4:3 makes a lot of sense, but I think we'd see 3:4 (portrait letter size) before 1:1.
I think I'll always associate anything that runs on Qualcomm (though it's not just them) with being temporary, due to its short lifespan and lack of vendor support, based on my experience of buying Android phones, tablets and media devices. I think this is already true of some (all?) Chromebooks, which actually have expiration dates.
This is probably not a fair comparison since it's a laptop and Windows-based, but it would take a lot to win me back, given how much money, time and effort I have lost over the years with these products... not to mention the amount of e-waste that has been generated.
Even more true with my experience with Motorola (Lenovo); $600+ phone stops getting support/updates after 6 months of ownership... trade-in value of $28. Never again.
Then again, seeing Intel with some serious competition warms me up a little bit. Also the thought of an ecosystem of blossoming Linux distros for ARM laptops/desktops makes me excited.
Well they didn't have much choice here. MediaTek, Samsung, and the rest are equally bad. The ARM SoC market is flooded with chips designed to have the shortest SW support lifespan possible.
Maybe Nvidia can get its shit together and put out a competitive ARM chip.
MTK has the advantage of somewhat available (leaked) design information and source code, and there's a pretty thriving Chinese developer community --- at least around their phones/tablets.
Sadly, I don't think Qualcomm is even any worse than average. I don't think for example MediaTek, Samsung or Nvidia are doing any better.
I think even the Pinephones with their Allwinners and Rockchips aren't perfect, with both models being based on I believe older SoCs (although the Rockchip in the Pro has been modified, afaik).
Interesting. I was recently looking to upgrade my personal T480s, which I use for development, but didn't see anything in Lenovo's lineup that was reasonably light/powerful/quiet/cool enough to make me put in the order. I've grown so sick of laptops constantly hissing at me when I do something more complicated than browsing the web. My work-issued Dell is basically unusable in a quiet room without headphones. So as an impulse buy I sidegraded to an M1 Air instead, just to see what the fuss is all about.
Well, I'm not really wanting to go back to regular intel/amd laptops now, I have been spoiled.
My Thinkpad E14 Gen3 is quiet most of the time. My T470p is boiling my hands and the fan is constantly whirring (I still like it :)).
But I agree, Apple's MacBook is of a different kind. The only time I heard the fan was when playing Metro Exodus and when I ran Cinebench :). My MacBook Pro outperforms every other computer in my possession.
But I'm still interested in this Thinkpad, because I'm not too happy with macOS anymore. But I assume there are too many limitations with Windows for ARM as well :). And considering the price of this Thinkpad, I'm probably just buying another MacBook, which has better battery life and better performance (and probably a better screen, Thinkpads often have terrible screens imho).
I've got a 6th gen X1 Carbon and under Windows the thing has the fan on high all the time. Switched to Linux (Mint) and I went a couple of months before I heard the fan, doing all the same things. I don't know what Windows is doing these days, but it isn't doing it for me, the end user.
T460s user here. On Windows (and Hackintosh, for the sake of a bad comparison), my system idled at ~40-45°C and the fans ran constantly. Put Linux on it, add auto-cpufreq, and then it idles at ~27°C (30°C with an external display). No fans unless I launch games, have heavy network usage for >5 minutes, or compile something.
I'd probably give the MacBooks (or this new ThinkPad) a closer look if I didn't work with Docker constantly. As it stands though, ARM and x86 are still not an apples-to-apples comparison, and still not even remotely capable of the same workloads. I have high hopes for the future of RISC arches, but we're undeniably trapped in an age of x86 dominance.
On Windows, did you use the Lenovo-provided software to manage cooling? If you did a fresh install of Windows and didn't install the OEM fan and CPU tooling, but then did install CPU tooling on Linux, it's not quite a fair comparison.
I've got a 460s and it will run pretty warm if you let it. If you tell it to not run so hot it'll keep the CPU throttled down a bit more aggressively and keep it cooler. That generation of Intel CPU was always a bit on the warm side when it wanted to actually do anything. Especially anything related to video encoding/decoding, using stuff like Zoom or Meet or Teams really makes the machine get warm.
I did a fresh install, but Lenovo's management software has been notoriously bloatware-ish, so I was apprehensive about keeping it on my system. I was never really claiming it to be a fair comparison either; even if this thing somehow ran hotter on Linux, I probably wouldn't use Windows anyway.
The machine can definitely run pretty hot if you crank the performance profiles though, that's for sure. I've managed to hit 70°C while playing music and running CPU-intensive games on an external monitor, and I definitely think you could push it further with more CPU twiddling. For regular use though, a hearty underclock still renders the device usable with low temps and a good amount of battery life extension.
Well, for starters, virtualizing anything on MacOS is a pain in the butt. It's an open secret that Docker on Mac has performance issues, kernel interface problems and general compatibility hiccups that don't exist when the host is running Linux. Then there's the fact that quite a number of Docker containers aren't ARM-ready, and even the ones that are won't be an accurate portrayal of how they'll behave on other arches (most systems I deploy to are still x86).
Like I said, ARM may well have its day, but right now it's just full of compromises that I simply can't make. So long as x86 benefits from the same big.LITTLE architecture that ARM has been transitioning to, I don't think I'll really have much use for another arch until RISC-V hits the mainstream.
On the other hand, I use Windows on an Intel m3-6Y30 machine and the fan hardly ever runs, despite it being a fairly anemic system by today's standards.
One has to think that all these users who can't seem to configure and operate Windows properly are probably not configuring and operating Linux (which exposes much deeper control and complexity to the user) properly either. "It's a poor craftsman that blames their tools", after all.
True. I was not interested in Intel MacBooks because the pluses, even with that great screen, didn't win me over from ThinkPads, but the M1 has changed the game for me. The battery and performance improvements are just too good to ignore.
I have an M1 but never use it at home, the battery is only really useful if I need to spend a few days on the road away from power.
My daily driver is an Intel 1185G7-based laptop which I can work on for a whole day without charging if needed (tops out at about 10 hours), and any more than that doesn't really add any value. If Linux support gets better on the M1 I think I would consider switching, but honestly when I put them side by side I don't notice a big performance difference. The M1 just uses a lot less power to get that performance.
Btw, how is the E-series doing for you? I've had X and T for the last 20 years and I wouldn't have thought about getting an E. I've thought of them as too fragile to lug around. Years ago I broke the plastic bottom of an early 14" Yoga just by packing it too tightly, so I've stayed with Ts and Xs.
My E14gen3 feels absolutely solid and sturdy. I mean, the E14s are not _that_ cheap :D. I wanted a Ryzen CPU, keyboard backlight and a fingerprint sensor. Also the keyboard is great. Battery life is great as well.
I think it's not only about ARM vs x86. I have a work-provided ThinkPad and a work-provided MacBook Pro (with a Skylake CPU) and I prefer to work on the MacBook for the same reason. It's completely silent 99% of the time, whereas the ThinkPad has its fans blazing 99% of the time and gets very hot. When I look in Task Manager, it's the usual suspects - corporate bloatware disguised as data protection and antivirus software, taking 40-50% CPU time when idling. Add Chrome and a few apps and you get 90-100% CPU usage most of the time.
The difference between a laptop with mandatory corporate cruft and the same hardware without it is quite something. My personal laptop is weaker spec-wise than the work-issued one, but is so much snappier.
> When I look in Task Manager, it's the usual suspects - corporate bloatware disguised as data protection and antivirus software
Similar experience here. If I only had experience with ThinkPads from work I certainly would hate them. Slow, loud and shitty screen.
My T14 Gen1, which I run at home in balanced power mode, is quiet and fast. The NVMe drive I could swap, RAM upgraded to 32GB, fingerprint reader and Windows Hello, all for ~1100 EUR.
Apple also sets up very different fan curves from most of the PC industry. They won't turn on the fan until it hits 95°C. This keeps it quiet, but it affects performance and maybe longevity.
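To make "fan curve" concrete, here is a purely hypothetical sketch in Python of the trade-off being described; neither table reflects Apple's or any other vendor's real firmware values:

    # Hypothetical fan curves: (temperature °C, fan duty %). Illustration only.
    QUIET_CURVE = [(0, 0), (94, 0), (95, 30), (100, 100)]   # fan stays off until very hot
    EAGER_CURVE = [(0, 0), (50, 20), (70, 50), (85, 100)]   # fan ramps up early

    def fan_duty(temp_c, curve):
        """Linearly interpolate a fan duty cycle (%) from a (temp, duty) curve."""
        for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
            if t0 <= temp_c <= t1:
                return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
        return curve[-1][1] if temp_c > curve[-1][0] else curve[0][1]

    for temp in (60, 80, 94):
        print(temp, fan_duty(temp, QUIET_CURVE), fan_duty(temp, EAGER_CURVE))
    # The "quiet" curve keeps the fan off under most loads, at the cost of
    # letting the silicon sit close to its thermal limit before spinning up.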
Also worth noting that most vendors have a way to adjust the fan curves. Sometimes right in the BIOS, oftentimes through an external utility.
I recall at one point there was someone doing some work getting the Legion Fan Control app working for non-Legion Lenovos, but I cannot for the life of me find the link now.
My personal T480 is silent most of the time. Though I now have an M1 Mac from work, and my next personal laptop will likely be an M1 as well based on my experience with it so far. Not just the chip: Apple finally created the perfect form factor IMHO, with just the right screen aspect ratio, port options (Magsafe plus USB-C charging!), and keyboard.
FWIW, I switched from X270 to X13 Gen2 AMD (5850U, 32GB) recently and the difference has been night-and-day - even just web browsing on X270 caused audible noise, while X13G2A is practically completely silent unless I do something very demanding.
So maybe look into T14s Gen2 AMD (which may be pretty similar to X13 Gen2 since they share the HW maintenance manual). But YMMV of course, and I have no experience of M1.
I got the T14 Gen1, basically the direct successor of the T480.
What I notice is that when I set the power profile to balanced it's very quiet, but of course it does not power up all the way. Fast enough for my needs, though it might not be a solution for everyone.
I was looking at the T14, but I'm used to having a slim laptop; the s variants have been slightly less powerful but lighter and with a slimmer profile. The thing with the M1 is that the difference versus the 8th-gen i5 in my T480s is 2x for some workloads, if not more, and the difference in some Lightroom tools is easily even bigger. The M1 is amazing. Plus, it's quiet. I didn't know I missed silence so much. It's hard to go back now.
It only takes a maximum of 24GB of RAM. Right now I'm OK with it, but I'd like to have some future-proofing at some point; everybody is going crazy with containers nowadays. I've been moving my work off the desktop, and while I can upgrade the NVMe as I please, I'm stuck with the memory limit.
I also have 64GB on my work T480 (non-s) despite "max" being 32GB.
The Lenovo-reported max RAM specs often do not take into account newer larger memory modules released after the laptop (or at least I believe this is the reason for the discrepancy).
Intel has been absolutely incorrect on this a few times. Quite often there is a conflict in ranges that the ACPI spec included in the system firmware can't deal with, but if an operating system is clever enough it can work around the problem.
It may require some ACPI trickery, but it absolutely is doable. If the firmware guys are super lazy it may just work without ACPI trickery.
I have 40GB ram in my T480s, but it is single-channel.
I have also put an NVMe drive in the WWAN port. I found out that the Transcend MTE452T (TS512GMTE452T) has the right M.2 keying (B+M), and it is a compatible NVMe drive and not a SATA drive, which some of the disks of that form factor are.
Thanks for the info, I'll have a look. Transcend is weird in a good way; over the years I have bought some hard-to-find-config drives from them, like a 512GB mSATA 5 or so years ago.
If these ARM laptops support the ServerReady platform, I guess it should be possible to just UEFI-boot a plain Debian or Ubuntu arm64 ISO on these machines, without needing to tailor a specific image?
If not, I'm going to be disappointed.
Edit: According to a Reddit comment[0], ARM ServerReady, UEFI and ACPI are all requirements for Windows on ARM, so if this thing runs Windows, we should be able to assume all those other things are in place too. Unless someone has locked the Secure Boot settings, these laptops should be able to run Linux too without too much trouble.
I'd encourage you to do your own research on this: the kind of research where you assume you're getting a soon-to-be-abandoned piece of techno-capital-consumerist e-waste, and critically evaluate anecdata against a rubric composed from spending time researching the hairy corners of the space. This comes from a bleeding-heart optimist burned almost too many times.
Here are some items to look for:
- a device tree for the chipset for your device
- a u-boot fork or patches for the chipset/device/both
- a Chromebook built with that chipset, due to how ChromeOS development is done
- any sign of commitment from the chipset manufacturer
This topic is a lot like early Android bootloaders before... one of the early Motorola devices. Before there was a set of expectations/framework-support/etc for unlocking. That first unlockable Motorola device, there was 9 months of every wise-guy on the Internet swearing it was just a matter of time. And often it is, but only due to what amounts to luck.
Hint: there's a reason that every single Linux enthusiast isn't running an ARM laptop. There's a reason that Linux folks are excited about the Linux on M1 project even though it means jumping through hoops and reverse engineering and playing ball with Apple.
Just FYI, we run our entire production API on ARM on AWS. It's pretty rock-stable and mature. The stability of ARM on Linux is pretty much a solved problem at this point.
Sorry, but no. This is the kind of misinformation that causes people to wind up with $2000 paper weights.
ARM in the cloud and ARM on the desktop (aka SBCs and laptops) are nearly entirely different ballgames - certainly from a user UX perspective. Basic boot support, standards around device trees, actual driver support, firmware, quirks, ecosystem challenges due to vendors, and the nightmare of a gazillion kernel forks.
I was actually shocked to see that the Snapdragon 8 Gen 1 already has upstream support, but AFAICT that's not a tablet/laptop chipset. And the one that is again seems to only be seeing Windows support. The other mobile chipsets with decent support are few, and the useful ones even fewer - maybe three, and that's including Librem/Pinephone.
Maybe you can find me an example of the 7c/8c running, anywhere? There are at least some DTSIs for some 7c boards, but again, absolutely zero consumer hardware in the wild. I spend time every month looking at the options for running a proper upstream kernel on an Android-based laptop or device, and every time "upstream support" is added as a criterion, the options drop to zero.
I've never wanted to be wrong so much before, but again, I look into this pretty regularly and am involved in some development efforts that expose me to some of these realities first-hand (I run everything on aarch64 except my gaming PC). It's sort of amazing, actually, to boggle at how much collective time has been spent by so many random devs at random companies and then painstakingly re-discovered by OSS folks, especially for phones/SBCs.
(granted, if you can boot upstream, it's all downhill from there these days)
These Windows on ARM devices are UEFI/ACPI, not U-Boot/DT. They should work, if someone puts the effort into porting all the drivers and working around all the quirks they are sure to have. Why QC doesn't do this and get them SystemReady certified is a mystery.
Thank you for recommending Fedora. However, I'm on Void for the past 7 years now (whew time flies!). I'm not planning on switching to any distro that has systemd.
I suppose I could still use the live environment as a Linux bootstrap for installing my favorite distro, though.
I would recommend that you be very careful here. You want a distro that explicitly has ARM as one of its official targets, because there are a lot of moving components here - bootloader, system init, etc. - that all need to be targeted to the new arch.
If you are not on the Triple Trouble (Fedora, Arch, Ubuntu)... I would recommend you wait until your distro catches up.
Fedora is guaranteed rock-stable here. Amazon Linux was originally based on RHEL and there is still a lot of cross-upstreaming going on. E.g. RHEL has built-in support for 100 Gbps networking on Arm (https://access.redhat.com/solutions/5691381).
But Apple's aren't? I agree with you but they are pricing it relative to what they can get away with and Apple set the starting point for ARM based laptops.
The M1-based Macs are faster, lighter, offer seamless compatibility, run iOS software, have better screens and better battery life than modern comparable laptops. It's also faster than this Snapdragon chip. And starts from $999.
It also runs a really good Unix OS. This laptop still won't run Linux out of the box.
While I agree WSL2 is pretty good, and it's great that it's available on Windows, you still have to put up with the day-to-day annoyances of the OS.
The general UX is the main reason why I don't use Windows. I actually gave it an honest try in 2020. I was pretty excited about Windows Terminal, the inclusion of OpenSSH (I always hated PuTTY) and even went on the Insider channel to test out WSL2, with Docker Desktop and everything. And it was better than I had remembered.
But I just gave up after a few months. The laggy search in the start menu, which I had to fix with some registry change to prevent it from searching the web (!?). The random "quick access" folders in the explorer, which also required registry wrangling to remove. Bonus points for all these reverting randomly after an update.
The hidden taskbar, which would become permanently visible in case a window demanded attention, and no way (that I've found) to disable this behavior. However, it seems fixed on Windows 11.
The windows which would look active, with a blinking cursor and everything, but wouldn't actually be active, and input would go to some other random window. New windows starting up behind the active window, etc.
And my favorite: turning on the computer in the middle of the night for some reason if left in hibernation. I had set the GPO to NOT turn on to install updates, so no idea what it was up to.
I moved back from being a Linux zealot to Windows, because the day-to-day annoyances of the OS running on laptops, and for graphics programming, are much more bearable on Windows.
I've been running Linux in VMs since 2012, when VMware made it painless to do so.
I've been running Linux on my work laptops since 2018, after a long stint on MBPs. They've both worked perfectly, even though the manufacturer doesn't support Linux at all. The only thing that didn't work was the fingerprint reader on the previous one. On the new one it works, but I don't use it since it's not in a practical position.
I'm getting the same kind of battery life as on Windows, Bluetooth headphones work great, it actually sleeps when I close the lid, etc.
The current one actually worked better on Linux than on the pre-installed Windows. For some reason, Windows couldn't get the brightness all the way up, even on battery power and "performance" mode.
I don't do anything graphics-intensive, though. I'm quite happy with the integrated GPUs. Although on my desktop, I have a Radeon that I use for gaming, and it also works great. But it's true that I specifically avoided Nvidia to not gamble on Linux support.
For the curious, the laptops are HP ProBook 430 G5 and 845 G8.
> It also runs a really good Unix OS. This laptop still won't run Linux out of the box.
That's misleading - these Macs don't run Linux well at all. I would love to see Linux fully and properly supported on these great new chips, but let's not pretend that is already the case.
Apple earned it - they don't charge you just for the laptop, it's for the entire ecosystem. For example, you can run iOS apps on Apple's ARM devices. And what exactly does Lenovo bring to the table other than embedded spyware (which they've been caught shipping in the past)?
Windows also has an ecosystem, arguably made even larger with WSL2 and Android apps. This new ARM laptop is offering a fanless design and (they claim) 28 hours of battery. Don't forget that Apple has also sold plenty of "crap" laptops for four-figure sums until relatively recently (e.g. customers beta-testing new, flawed keyboard designs).
Apple might still be selling Intel systems alongside ARM systems. But they’ve made it clear which side is the future. Their performance also isn’t a loss.
Probably the biggest factor, though, is that in Windows land cross-architecture compatibility will be a much, much bigger issue. If your legacy app only works on Intel, then ARM is useless.
The M1 is literally the fastest laptop processor on the planet, by a large margin.
Current-year MBPs are a whole 'nother class of device.
This ThinkPad is going to chug in comparison to an x86 laptop because Lenovo doesn't have the technical capability or clout to design a whole new, hyper-fast ARM part and basically make arbitrary demands of the supply chain to get it into users' hands, the way Apple does.
I like this trend, I'm currently looking at next gen Surface Pro X as a replacement for my 2018 MBP.
I'm almost there with using remote development on super beefy Windows desktop (waiting for Rider remote development support) - I want a mobile device that's powerful enough for client apps - but I want to offload heavy lifting to a tower that I don't have to listen to whizzing next to my face.
I'm not sure why they keep forcing the laptop form factor on this - tablet is much more flexible, you can add the keyboard folio if you want it but I prefer using dedicated keyboard/mouse when I'm using it for anything nontrivial, and tablet is more practical in most other scenarios.
> tablet is much more flexible, you can add the keyboard folio if you want it
Maybe it's just me, but I have the first-generation Microsoft Surface Go, which I mainly use as a tablet but occasionally attach the keyboard to when I need to do more extensive typing.
Over time, I discovered the connectivity between the keyboard and the tablet became worse, until it finally failed, and the tablet did not recognize the keyboard. Trying a new keyboard did not fix the problem, so I assume the problem is with the connector on the tablet.
For now, I've bought a BT wireless keyboard, which I can use with the tablet.
For that reason, I've soured on tablets with attachable keyboards, and am looking for a small (13-14 inch) 2-in-1 laptop for working on the go.
My wife had a Yoga 2in1 and I've been underwhelmed with the experience, "tent mode" is cool but as a tablet it's too chunky, as a laptop it's underwhelming.
In my experience - the only time I use attached keyboard/mouse is when I'm at the office and I need to join a call in the conference room or a call booth in co-working space - other than that I use external BT mouse and keyboard exclusively. For those use cases I think I could get away with using a touch screen, maybe if I needed to present - but I'd just pickup my keyboard/mouse in that scenario.
X1 cores? ARM launched the X2 core back in May of last year. It's already in the Snapdragon 8 Gen 1. It would be interesting to know why Qualcomm is sandbagging the laptop chips with last-generation cores.
I'm curious if they even considered the A78C, the variant with a similar cache (except L2) that replaces all LITTLE cores with big and possibly costs less to license as it's probably a roadmap core like the A78. It's chronologically a step behind but could unlock better performance (than A78+55) with a negligible/nonexistent hit to battery life.
I eat my words. That looks like a better configuration than an octa-A78C, but only if the licensing cost is the same or you REALLY have a need for the latest extensions.
Good catch. I wasn't even aware there was an X1C! Then again, my daily driver is a 1 or 2 core Cortex-M. On special occasions (once a month or so) something more exotic lands on my desk.
I know nothing about ARM, but I think it's time to learn.
Am I right in saying you need to compile the OS and all applications to run on ARM? If so, I bet at least one thing I use daily won't work, and I don't want to commit to a laptop like this unless I know it will be a full replacement for my Intel machine.
> I know nothing about ARM, but I think it's time to learn
Is it? Unless you're routinely working with assembly, or writing low-level multithreaded native code (i.e. you're directly working with atomics and barriers), it's essentially irrelevant.
Though obviously if you assume customers are or may be switching hardware, it can be useful to test or bench on ARM to get some inkling as to their eventual experience, or make sure you’re not relying on ISA details.
Don't get me wrong, ARM is neat and all, but your comment's assertion that "it's time to learn" doesn't seem motivated by anything real, any more than "it's time to learn" about RISC-V or PPC.
> Am I right in saying you need to compile the OS and all applications to run on ARM?
Technically you could run emulated, although that has a non-trivial cost.
> If so, I bet at least one thing I use daily won't work, and I don't want to commit to a laptop like this unless I know it will be a full replacement for my Intel machine.
You could always get an ARM-based devboard (e.g. a raspberry pi or similar) to play around, that’s hardly a major investment, although obviously it is what it is.
Another alternative is to use ARM-based "cloud" systems.
Applications need recompiling to run natively on ARM. However, you can still emulate x86 for programs that need it (with a speed penalty).
Ubuntu has great support for ARM. However, there is an extra dimension: ARM laptops are also _much_ less standardised than x86. I would wait to see hardware compatibility with this precise device.
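As a small aside, here is a trivial Python sketch of the native-vs-emulated distinction: an application (or its installer) can check which architecture the runtime reports before picking a native or emulated binary. Purely illustrative, not tied to any particular OS feature:

    import platform

    # The machine string is whatever the (possibly emulated) runtime reports.
    machine = platform.machine().lower()
    if machine in ("arm64", "aarch64"):
        print("Running an ARM-native build")
    elif machine in ("x86_64", "amd64"):
        print("Running an x86-64 build - on an ARM laptop this implies emulation")
    else:
        print(f"Other architecture: {machine}")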
I give it another 2 months and I'd bet someone boots the basics at least. (Assuming the laptop is reasonably powerful and has enough users / interest) They mostly need to make sure the device tree is correct.
> (Assuming the laptop is reasonably powerful and has enough users / interest)
7cx devices are not popular and I haven't seen anybody excited about them. M1 got serious reverse engineering work done because it's worth owning in the first place, but not every random Arm device will get that.
However, running Linux in a VM still comes with all the "advantages" of modern Windows: adverts, mandatory updates with enforced reboots, irremovable spyware (telemetry) and other abusive practices.
This is completely untrue. I had an Asus laptop in 2003 on which everything worked out of the box under Red Hat Linux, save for video acceleration, but mine was an unfortunate model, since most colleagues had no problems with their much better laptops. A couple of years later I bought a (much more expensive) wonderful Fujitsu Siemens P7010 which supported Debian Linux out of the box 100% after I provided the firmware for the wireless card (a few KB of download). That machine lasted almost 10 years and was one of the most beautiful machines I ever owned. Then I purchased another Fujitsu i5-something laptop, and again: everything worked out of the box under Debian, and now my girlfriend uses it under Manjaro. Then I purchased an X240 i5 1080p ThinkPad with an additional bigger battery for a whopping 8-9 hours of continuous use, again 100% working under Debian Linux, plus 4 or 5 netbooks, all 100% supported. I could count other models I installed for friends, colleagues and relatives, all perfectly supported.
Linux supports a lot of hardware if you don't rush to buy the very last model, which wouldn't be advisable anyway for other reasons.
My experience with Linux since 1995 kind of proves otherwise; I lost track of how much hand-holding Linux on laptops has needed, including on those sold with Linux pre-installed.
Yes, but this doesn't prove that Linux desktop is only usable as WSL any more than my experience proves that windows is only usable as wine. You didn't enjoy the experience - that's ok.
I've got zero problems on a T490. All I did was grab the nixos-hardware module for it and off I went.
Maybe don't buy hardware that other people haven't tested first? Some vendors pre-install or announce Linux support but don't actually have anywhere near proper support.
Yes, IIRC that was some of the models shipping with a non-Intel network card, not all were affected. Same with the Framework and its power states not working.
Unfortunately Linux "support" means little, you have to go by other people's experiences with the specific laptop or at least the specific hardware pieces used by it. But if you check before you buy, which is a pretty trivial affair for anyone tech-savvy enough to manage their own Linux install, or if you're careless and lucky, things go swimmingly.
I've got four laptops running linux, with all their features working. I stuck a USB stick with Ubuntu/Pop OS on it and booted, they ran and everything worked.
Thinkpads are famous for working great with Linux. The HP and Asus that I had worked great until their hardware failed.
I have experienced this. ThinkPad E480, sold with pre-installed Windows 10. At some point it got a Windows update, and suddenly it frequently goes to BSOD, especially after sleep mode.
This is simply not true. I have been using Linux (Ubuntu) on my work laptop (Dell) for at least four years, and have used Linux on my personal laptop for much longer. Please don't spread FUD.
Exactly. WSL2 effectively makes Windows 10/11 the best Linux distro out there without all the steps of formatting, wiping and backing up disks and then installing it.
Then, after all those 'problems', I should be seeing tons of users choosing a Linux desktop distro (which one, out of millions?) and mass-migrating off Windows by now. Why is that still not the case after 20 years?
It seems that after 20 years, these users still do not care enough to do any of that and WSL2 has only given a reason to make backing up / migrating / wiping / installing a Linux Desktop distro even less worth it these days.
Why do you think they aren't? There's a massive gaming on Linux movement which is only going up, and Chromebooks are outselling Apple Macs. Most main OEMs sell Linux-compatible devices, that come with Linux preinstalled.
So when we say 'Linux' we're now talking about 'Chromebooks' with ChromeOS.
Not only does that defeat the purpose of someone else's point about getting rid of 'closed source' and 'spyware' controlled by Google, it is already at risk of being replaced by Fuchsia, which may still be called 'ChromeOS' but won't be based on Linux. That is my bet for this decade.
I still don't see any evidence of Windows being challenged by any Linux Desktop distro other than being used in WSL2. That is it.
There are marvelous cleaning products for that, you should inform yourself.
2% market share on the desktop market, or people giving big bucks to Apple instead of supporting Linux laptop OEMs, don't need rebuttals; they are well-known facts.
As for Android and ChromeOS, keep patting yourself on the back; maybe one day you can run GIMP on them without layers of VMs and containers.
I think Windows being a product of a big company also played a role. It mainly comes pre-installed on most devices because Microsoft pushes for it, and users, especially non-technical ones, won't bother to install something else on their machine. So, for most computer users, there is just no other choice.
You're making the presumption that more popular is better. You're also heavily implying that linux desktop users should care that it's less popular - I genuinely don't. 2022 is another year of the linux desktop for me, and I'm so glad to be off windows again.
Pluton is another core which is a security module---in this case, a Microsoft product. TrustZone (TZ) splits data and instruction access into insecure and secure classes on the original core---letting a little information pass from one side to the other without revealing the inner functionality.
There are plenty of Arm SoCs with Cortex-A cores plus an onboard Cortex-M-based trusted platform module (TPM). There are also already Arm chips using Pluton, such as the MediaTek MT3620.
Unfortunately I don't think Lenovo has the market dominance and/or desire that IBM had in the early 80s with the PC to make their ARM platform the de-facto standard, which means this is going to be yet another of many proprietary platforms whose only thing in common is the instruction set. This is no doubt going to be very different from Apple's ARM platform, for example. Even the next model or revision might have some things completely different.
I still remember the days when "IBM PC compatible" meant (at the very least, a desire to) something.
If Lenovo releases the schematics and detailed architecture reference like IBM did with the PC, they might start another PC revolution; but I doubt they will.
At what point will we see laptops that have a slot for an extra battery, like an 18650 or 26650 cell, that can sustain an extra full workday?
That would be nice, just to carry a small spare battery.
What's the efficiency of doing this compared to being able to swap the laptop battery? I've done it a few times for my M1 MacBook Air and feel like it can't be much better than 50%-75%, but it would be nice to know.
I would not take that wager. Just because it has a USB-C port doesn't necessarily mean it has support for anything other than basic USB device support.
How does this compare to X1 Nano? I'd been looking for a laptop, and the Nano seemed to fit my hand like nothing else. This one feels like a modified T line or a lineage of detachables rather than X series machines, is that the case?
Why are they doing this? Windows RT laptops flopped almost a decade ago due to the lack of software, and the landscape hasn't changed all that much AFAIK.
The difference is that, relative to the days of Windows 8, they've significantly diversified the ways you can target Windows, directly and otherwise:
- IE is finally dead, and shipping a downstream Chrome build in the form of Edge gives Win10/11 not only a revamped WebView but platform-level support for PWAs (I cannot stress enough how much of an impact there is here just from how much the Web has evolved as a platform, and as a Linux user this cuts both ways tbh)
- To this same point there's also a first-party React Native build, allowing devs that are already shipping mobile apps to extend those to target desktops
- .NET has since been open-sourced, gone cross-platform, and picked up native ARM support
- Windows Subsystem for Linux, and now Windows Subsystem for Android, let you reach outside the Windows ecosystem entirely (the latter of which, interestingly, should also compound over time with the various efforts that Google is finally making again to make large form factors viable on Android)
- Project Reunion exists now, for the stated purpose of providing a common set of APIs that Windows developers can access
That last one alone, btw, makes the landscape inherently different from WinRT, which removed Win32 support entirely and only allowed you to run UWP apps.
I'm quite certain it has. I believe Windows on ARM now ships with x86 emulation, allowing it to run any program (albeit more slowly). I'm not sure about the state of WSL on ARM, but Linux on ARM in general is a nearly indistinguishable experience from Linux on x86, with the exception of proprietary software which is for the most part relatively rare on that platform.
Some people want that but actually they are a really small percent of people overall. On CrowdSupply the Librem and Novena laptop projects did not have more than 1k orders. Although it is a huge credit to both of those projects to develop new hardware products and be so successful to begin with.
In fact, this laptop goes in the opposite direction, it's not intended to be modified as much by the user. It's one that runs Windows 11 with some feature called "S mode" ('S' as in Secure). In S mode there is some kind of secure boot and apps can only be installed through the Windows Store. A user with admin rights can disable S-mode but it can then never be re-enabled after that.
So actually this laptop is not intended to be open at all, it is marketed for business where the environment can be more locked down and secure.
Tom's Hardware is such an ad-filled cesspool of a site. I refuse to go to any pages of theirs; I've blocked them in my DNS. The tech reporting is garbage anyway, besides the ad-filled pages. A better site for this info is: https://arstechnica.com/gadgets/2022/02/lenovo-announces-the...
Who needs 28 hours of battery? I mean, the big step is that you can run an extended working day, let's say 12 hours, but after 16 hours at the latest you should be able to plug it in, shouldn't you? Preferably next to the bed you're resting in by then.
You either have never tried a ThinkPad prior to the W530 (i.e. more than 10 years ago, when they significantly changed the keyboard for the first time) or you haven't used one of the "next-gen" models with "next-gen" keyboards that they have been building ever since and that get worse with every iteration.
I have used them as far back as 15 years ago, have a backup one that is 5 years old (?), privately an X1 Extreme Gen 2 and a company L13. No significant difference in keyboards; all of those are, or were, great. Especially compared to HP or Dell, not to speak of Asus...
I agree; the keyboards are fine. I do dislike the layout changes they've made compared to the "classic" keyboards, but it's still better than most layouts out there (and they fixed some design mistakes in the first iterations of it).
I honestly thought this to be true years ago. I recently bought a MacBook and am completely sold on their keyboards. I wouldn't be happy if that was my desktop keyboard, but for a laptop it's the best out there.