Hacker News
Laptop OEMs – make a proper high-end AMD laptop (basilgohar.com)
236 points by basilgohar on Jan 7, 2020 | 170 comments



I'm convinced there is a conspiracy that prevents laptop manufacturers from shipping AMD chips on 13" high end laptops. Almost every (every?) AMD laptop I find is some 14-15" clunker with a garbage screen.

I've had to put off buying a new macbook because of the shitty keyboard situation and all laptops (mac/pc) because it appears Intel's newest CPUs are still vulnerable to Meltdown/Spectre.

A highend 13" AMD based Linux friendly laptop would be utterly amazeballs.


If by "conspiracy" you mean "business deals where Intel uses their financial clout to compel companies to not provide AMD options", then yes, that is definitely provably happening.

My google-fu is failing me now, but I swear I read something about this in the last 6 months or so. The best I have is from 2007: https://fortune.com/2007/02/15/suit-intel-paid-dell-up-to-1-...


This might not be so true after CES is done.

I was watching Linus Tech Tips[1] video yesterday, and Asus had a new array of nice laptops (albeit, gaming laptops) that were AMD based, including a 13" one with decent AMD CPUs

[1] https://www.youtube.com/watch?v=hGUESEq75ZI


About the only big shortcoming with AMD is Thunderbolt support. I think that will reach parity eventually, but probably in the USB4 timeframe over the next couple of years.

I watched the same videos, and will be looking at laptops over the next few months, so may well go the AMD route myself.


While I hear what you're saying, I think YMMV with Thunderbolt as a hard requirement.

When I bought my laptop a couple of years ago, I made sure that it had TB3 for future-proofing, but I have yet to use it, and I suspect that I'll never use it over the course of the laptop's lifetime (and I have a tendency to keep my laptops in use for a long time).

I think having USB-C is good enough for the majority of "normal" users, although I can appreciate that TB is probably a must have for some power users.


I mostly agree... I use a TB/USB3 dock at work, it works in either/or mode, though USB mode graphics sucks, it's okay for getting work done (web platform development, mostly text/terminal).

I was only pointing it out as pretty much the only thing that's likely to be a significant issue for many, especially if you're doing any kind of casual gaming while wanting to dock, vs. plugging in a few separate items. I use a desktop at home, so I'm not as concerned personally.


> I mostly agree... I use a TB/USB3 dock at work, it works in either/or mode, though USB mode graphics sucks, it's okay for getting work done (web platform development, mostly text/terminal).

Not sure if you're aware, but you're presumably using your dock in a config where it's using DisplayLink for the display output. Configurations exist that do HDMI/DisplayPort passthrough over USB/TB, which have native performance.


I know... I had to use it via USB for a few days when the USB-C/TB port on my laptop failed, until it was repaired.


I would appreciate a high-end 15" AMD "clunker" with an amazing screen, though. Preferably a ThinkPad. But such a thing is nowhere in sight either, no?

I can't work with laptops smaller than 15".


Lenovo has a bunch of AMD ThinkPads (X395, T495, and T495s at a 14" size, E595 at 15") with Ryzen 3000 series CPUs, but the 15" model is from the E series, which isn't one of their high-end lines. The T495 and T495s seem to be really nice machines though, even before a Ryzen 4000 refresh: https://www.notebookcheck.net/Lenovo-ThinkPad-T495-Review-bu... - quote: "Overall, we have to admit that the ThinkPad T495 is the better T490, especially considering its lower price." (The T490 is the Intel version.) And https://www.notebookcheck.net/Goodbye-Intel-Thinkpad-T495s-w..., the title of which is "Goodbye Intel: Thinkpad T495s with AMD Ryzen is the better choice for many users".

If these prove popular (and given that they're quite a bit cheaper than the Intel models, they might), a 15" T-series Ryzen Thinkpad probably won't be too far off.


I skipped these because the one I was going to get had single-channel RAM. Have they fixed this?


The T495 is available with dual-channel memory. 8GB of soldered-on RAM is standard on all models, and the 3500U and 3700U models are available with 16GB soldered on. You can then add an additional 8GB or 16GB in the SODIMM slot.

If you want to save some money, quality 3rd-party SODIMMs are available for a fraction of the price Lenovo wants, and it takes about 5 minutes to install one. This is what I chose to do.


The problem with that is you'd need to match the soldered-on memory's specs to get the full performance benefit, as opposed to just buying a matched set and plugging it in yourself. You're also stuck with the highest common clock speed all the RAM supports, rather than the maximum the memory controller would allow.


I'm afraid the reality is that finding a laptop without at least some of the memory soldered on is increasingly difficult. Some T series Thinkpads prior to 2019 avoided soldered memory, but that ship has sailed, and there were no AMD T series Thinkpads prior to May of 2019. So installing a matched set is not an option in the T495.

One of the great things about the T series of Thinkpads is that access to replace memory and storage is quick and easy, unlike many modern laptops. Another advantage is that they are heavily documented and have a vibrant community around them. This means determining the memory specs needed to match the soldered-on memory is a rather trivial exercise. If anyone is looking to purchase a T495, matching the soldered-on memory means using a PC4-19200 CL17 unbuffered non-ECC SODIMM. You should be able to find a good brand module for about 1/5th of what Lenovo wants to factory-install the additional memory.
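As a toy illustration of that matching exercise (the PC4-19200 CL17 unbuffered non-ECC figures are from the comment above; the checker itself is a hypothetical sketch, not anyone's real tooling):

```python
# Toy check that an aftermarket SODIMM matches the soldered-on spec quoted above.
# PC4-19200 means DDR4-2400: 2400 MT/s * 8 bytes per transfer = 19200 MB/s.
assert 2400 * 8 == 19200

SOLDERED = {"standard": "PC4-19200", "cl": 17, "ecc": False, "buffered": False}

def matches(module: dict, reference: dict) -> bool:
    """A module is a safe match when standard, ECC, and buffering agree
    and its CAS latency is no worse than the soldered module's."""
    return (module["standard"] == reference["standard"]
            and module["ecc"] == reference["ecc"]
            and module["buffered"] == reference["buffered"]
            and module["cl"] <= reference["cl"])

good = {"standard": "PC4-19200", "cl": 17, "ecc": False, "buffered": False}
bad = {"standard": "PC4-21300", "cl": 19, "ecc": True, "buffered": False}
print(matches(good, SOLDERED), matches(bad, SOLDERED))  # True False
```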


Generally speaking, Lenovo has been on top of Ryzen, especially since Zen+.


There was a bunch of Ryzen 4000 laptop chip news announced yesterday, and a version of Dell's G5 15 Special Edition laptop is planned with one of these chips.

https://www.engadget.com/2020/01/06/dell-g5-15-se-amd-ryzen/


>A highend 13" AMD based Linux friendly laptop would be utterly amazeballs.

Such as the Thinkpad x395 I am using, with Ryzen 3700U.


I second this. Been using the X395 with the R3700U and it's been a blast.


> I'm convinced there is a conspiracy that prevents laptop manufacturers from shipping AMD chips on 13" high end laptops.

Based on the Anandtech review of a Surface Laptop with Intel and AMD versions (https://www.anandtech.com/show/15213/the-microsoft-surface-l...) I suspect the conspiracy in question is "the small batteries found in 13" laptops would make the power efficiency shortfall a bit too obvious".


There's really no conspiracy. Hardware builders just have other requirements. Even if AMD's laptop chips have been better so far (which is very debatable), it was by a slim margin for some use cases. Not enough to offset existing business agreements with Intel, the more mature Intel laptop platform (it's more than a CPU; there are also motherboards and wireless chipsets), and Intel's more mature, already-existing supply chain and general manufacturer support.

AMD is now very popular amongst home PC builders because they make better chips for the money and there aren't many other requirements to consider. But AMD still has a lot of inroads to make with premade hardware builders, whether it's laptops, office PCs, or even servers. For example, although AMD is making headway in the server space, it's not going as fast as one might expect given that they've had the clearly superior server chip (for most use cases) for quite some time now.

For hardware builders it's simply a time and money investment to start using AMD hardware. And although AMD is convincing them to do it - having what appears to be clearly superior chips in the laptop space is definitely a big plus there - it's something that takes time.


>> Not enough to offset existing business agreements with Intel

Translation: antitrust-violating agreements of the sort Intel has been found guilty of in the past.


There are also legal business agreements like "buy 100,000 CPUs from me and you'll pay 10% less", which are really commonplace, don't violate antitrust at all, and AMD will also be negotiating these with major OEMs all the same.


The issue with those is when you look at the business the customer does to begin with. If they sell 103,000 laptops and you say "buy 100,000 CPUs from me and you'll pay 10% less" and a competitor sells 255,000 laptops and you say "buy 250,000 CPUs from me and you'll pay 15% less" and so on, always offering the deal where you choose between a discount for making >95% of your sales volume Intel or paying retail prices for CPUs, the fact that you're specifying that proportion of their sales as an absolute number rather than a percentage doesn't seem like it ought to change very much.
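To put the parent's point in numbers (the volume figures are the hypothetical examples from the comment above): an absolute-count threshold set just below an OEM's total sales is, in effect, a near-total exclusivity requirement.

```python
# Discount thresholds phrased as absolute unit counts, vs. the OEM's total sales
# (hypothetical figures from the comment above).
deals = [
    (100_000, 103_000, "10% off"),  # smaller OEM
    (250_000, 255_000, "15% off"),  # larger OEM
]
for threshold, total_sales, discount in deals:
    share = threshold / total_sales
    # both come out above the ">95% of sales" mark mentioned above
    print(f"{discount}: requires {share:.1%} of sales to be Intel")
```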


>for the money

No, they now simply make better chips, period.

In addition they are also cheaper.


Users don't want to switch, either. I won't take a chance on an AMD machine to save a couple of dollars.

Maybe I'm being rational, maybe I'm not, but I'm sure I'm not the only consumer who feels this way. I want to buy the "happy path" when I'm getting a product.


What's the chance you're taking? Intel has dropped the ball in many ways, both long- and short-term: their handling of Spectre/Meltdown, pricing, technology rollouts, and more. AMD has been a reliable player in the CPU hardware space for decades.

So, please explain what risks you are taking on by choosing a device that satisfies your needs but has an AMD CPU inside instead of something else.


I'll forever be baffled at people treating one of the biggest semiconductor companies in the world which has historically released products which easily outlast the rest of the components in the system as some fly-by-night alternative to Intel.

This sentiment also caused my family to get a Pentium 4 over the superior Athlon 64 years back, and I'm still salty.


...and the Athlon XP in early P4 days (Willamette and Northwood).

...and the Athlon in P3 days, but that was a closer call.

...and (IIRC?) the K6 in P2 days, but that was before my time.

In retrospect, AMD's slump after the Athlon 64 era and prior to the Ryzen era is actually the odd one out in the company's history.

https://en.wikipedia.org/wiki/List_of_AMD_microprocessors


I think most consumers are not very knowledgeable about chip manufacturers. A business guy buys Dell/Lenovo and couldn't care less what chip is inside.

At most they might check whether it has a graphics card, or the number of cores.


IT professionals usually buy the hardware, and they're typically somewhat familiar with the market. As far as CPUs go, their options are "does this person need an i5 or an i7". When buying server hardware they'll gloss over the selection of Xeon systems their budget can accommodate.

You rarely get consequences for buying Intel/Cisco/etc, no matter how overpriced and otherwise maladapted their products may be.


So Intel ~= IBM? Things make so much sense after realizing this.


Users don't switch. They buy whatever is in their price range


I'm using a 13" HP Envy x360 like that. It's really good for the price, although it had fan-spin issues until some BIOS updates arrived. Also, yeah, Intel has probably tried to bribe laptop manufacturers not to provide an AMD option, as another comment showed.


After installing those BIOS updates it became properly functional and a great laptop, though it made me wonder how they managed to ship with such a big, obvious, and easily resolvable issue in the device.


Couldn't it have anything to do with the fact that Intel literally has a more power-efficient SoC? I wouldn't be surprised to see more 13" AMD laptops with the 7nm die shrink next generation, but even then I don't see AMD being more efficient than Intel 10nm.


Why?


Well, I made two claims. 1: Because Intel 14nm was more efficient than AMD 12nm; I'm not making this prediction with high confidence. 2: Independent of the competition, 7nm should offer enough battery life in a 13" form factor to be "good enough".


Intel have a co-marketing brand they call "Ultrabook" where most components are Intel, which prevents laptop manufacturers from using the same chassis/model for AMD.

Their new "Project Athena" might be the same thing.



Out of curiosity, what would stop someone from just buying an AMD chipset, a 13-inch screen, and all the rest of the components, 3D printing the chassis, and assembling it all themselves?


From other comments, the conspiracy is that AMD either doesn't give a shit or is very bad at demonstrating otherwise. No evil required, just laziness and other priorities.


Zen and Zen+ mobile chips had really crappy idle power usage, so they're not that suitable for ultraportables. It may change with this generation.


Aren't AMD processors (and virtually every processor with speculative execution) vulnerable to Spectre?


Yes and no. Most of the vulnerabilities are Intel-only and only a few apply to AMD as well.


Some of Intel's design decisions made them uniquely vulnerable to Spectre; it's not simply a matter of them being a "bigger target", if that's what you're driving at. Every CPU does have theoretical Spectre vulnerabilities, though.


Just as almost any other consumer processor made in the last 10 years, what is your point?


I was confused by this: "I've had to put off buying ... all laptops (mac/pc) because it appears Intel's newest CPUs are still vulnerable to Meltdown/Spectre. ... A highend 13" AMD based Linux friendly laptop would be utterly amazeballs."


There are a couple of these MDS vulnerabilities that also exist on AMD, but the vast majority are Intel-specific. That's not to say that people won't find some on AMD, but a) they haven't yet, b) there are at least some that are much less likely on AMD than Intel, and c) the greater number of cores and channels is likely to make practical exploitation even less likely. Some of this also applies to ARM.


Spectre is basically a new variant in the longstanding category of timing attacks on hardware. There are things hardware vendors may be able to do to limit the impact, but it's just something software developers are going to have to learn to live with, like cache timing attacks. You mostly fix it by making the software different, not the hardware, because the performance benefits of having caches or speculative execution are too large to abandon in general.

Intel's trouble is that they're doing that kind of speculation across more security boundaries, which not only makes the attack more powerful (e.g. reading memory from the kernel/hypervisor or another process/VM instead of the active one), it also makes the mitigations more expensive. The benefits of speculative execution in those specific cases aren't worth the cost, but CPUs have a long lead time, so they're still selling silicon where that isn't fixed.

And then losing more performance to the mitigations than they gain from the speculative execution while enabling a greater attack scope for any software that doesn't implement the mitigations properly (or at all).


Intel have a great partner program. I'm a small OEM (as one arm of the company), less than 100 units a month - but still a nice number. Intel give me advance warranties, presales support, a partner centre, and certain privileged information including roadmaps and samples. I couldn't imagine better.

AMD - I'm struggling to get them to return a call despite numerous chase messages for over 3 months now.

I want to sell AMD due to Ryzen demand but literally can't find anyone... I even met an AMD rep at an industry event and he gave me his business card... non-stop voicemail, and he just never got back to me.

Again, I know I'm not huge - but it's still over 1k units a year and I'm talking to a brick wall. When I started ~15 years ago, literally doing 1-2 a week, Intel were VERY supportive.


This sums up a lot of older business models, where a call to a wine-and-dine salesman is needed. Such a dead and dull way of doing business. In the age of on-demand, make a website where OEMs can just sign up and get things done. Beyond a certain monthly sales figure, then send in the suits.


Which is what Intel had, and how we got started ~15+ years ago. As our sales figures went up, we got contacted, met with people, and gradually the relationship got better and better.


Is 100 units a month sustainable as a business? Or is this a hobby?

I am surprised because in the age of Apple, Dell, HP, Lenovo, how is it even possible to compete?


We are an MSP and mainly a Dell house at the moment - however, we have found a niche in media and are constantly building mid-range servers and high-end workstations.

On the high end, we can assemble a Xeon or i7/i9 machine with 64GB memory and a huge NVMe/SSD for roughly 50% of the price of a branded machine...

We aren't in the celeron/i3 low end game...


How/where can I learn more about your laptops?


Sorry, We only do servers and high end workstations/desktops...

I personally couldn't find an ODM that was high quality... all seemed too cheap/plasticky and would break easily... Unless you are doing ~5k+ units a month and can get better designs, the off-the-shelf prebuilt chassis were not great :(


Good timing! ASUS are showing off a pile of their ROG-branded gaming laptops at CES, which have 7nm, 8-core/16-thread Ryzen 4000-line CPUs with RTX 2060 (mobile) graphics cards and fast screens, all in a 14" form factor. Certainly not workstations, but still considerably more powerful than past AMD laptops.

More at LTT: https://www.youtube.com/watch?v=hGUESEq75ZI


Came here to post this, ASUS went from 8 to 80 on AMD.


> with RTX2060 (mobile) graphics cards

wish they had chosen a more Linux friendly card. Going full AMD would have been nice.


It will not be so easy.

Big OEMs are bound by the supply chain. Small OEMs are dependent on solution providers.

Intel pays solution providers not to work with AMD in China. That's pretty much on the record, and not a secret to anybody in the industry.

And for big OEMs, the supply chain trumps everything. Idling assembly lines are extremely costly for them. One week idle can easily wipe out 1+ month of profits for them.

Before AMD is ready to offer the whole BOM kits with inventory in mainland China, and is ready to put money behind the availability guarantee, there will be no mass transition.


> Intel pays to solution providers to not to work with AMD in China

Wasn't there literally a US antitrust case about this?


AMD is cooperating closely with Chinese firms and has them to thank, in part, for its resurgence in the desktop chip space.

I wouldn't be surprised if these partnerships lead to more AMD PCs from small OEMs in the near future, perhaps only for the Chinese market.

Chinese home-grown fabrication is also at parity with AMD Bulldozer, and likely to accelerate with acquisitions of European chipmaker technology.

Notably meanwhile China has effectively blocked Qualcomm from making similar moves purportedly on antitrust grounds.


> cooperating closely with Chinese firms

You can say Taiwanese ones. On mainland, AMD is almost unknown in OEM space


With AMD announcing truly Zen2-based mobile APUs, I felt it was important to gather all the mistakes laptop OEMs have been making and call them out so that at least a voice exists capturing these complaints. Unfortunately, you cannot vote with your wallet if the option you want does not yet exist. Here's hoping that they start to listen.


The Asus ROG Zephyrus G14 pretty much has all this. Dave Lee even has a rave review of the Zephyrus.

https://www.youtube.com/watch?v=_v5IzvVTw7A


This looks like a killer laptop. Love the exterior pixels. Hope linux will run on it.


I have a Zephyrus GX531; Ubuntu (dual-boot with Windows) works perfectly (I had to hack nothing), so I assume it would most probably work smoothly on the Zephyrus G14 as well.


Just watched the vid, that’s a pretty cool laptop!


> Hope linux will run on it

Me too. The cover "pixels" and the custom "caching" fingerprint reader are the sorts of features that easily get left behind proprietary drivers (for no good reason).

I really wish the EU would use some of their GDPR-focussed zeal and mandate that peripherals must either have open spec sheets or function on open APIs. I'm not suggesting that manufacturers have to write drivers, just ensure that anybody else can.


That looks promising!


Reading through the comments proves different strokes for different folks.

It took me quite a while to find a laptop I'm completely happy with. I ended up with a (late-2019) Razer Blade Stealth (with a GTX 1650). I run the internal screen at 1080p (at 100%), and I couldn't imagine running it at 4K. My older 13" Macbook has higher resolution, but you end up having to scale it anyway (in reality, I think it's actually doubled, then scaled back). On a 13", a good quality 1080p screen (to me) is the way to go - anything higher needs scaling.


I really don't understand why there are so few QHD screens on laptops. It's always either a 1080p or 4k, while a good QHD screen would still give higher resolution while not completely murdering my battery.


Yeah, I've got a 4K Thinkpad, and the screen is gorgeous, but it's way overkill. QHD would have been good enough, but it was 4K or 1080p, and I do want a bit more than that.


Marketing mostly. QHD screens seem superior to 4K in a laptop form factor at this point.

For that matter I see HD as generally superior to QHD in phones and think Apple has it right.


TIL that in 2019 people still have DPI problems with their software.


Only Macs have really made high-DPI work properly, especially with multiple monitors of very different DPI.


Mac makes most apps "just work", but it has issues: Any scaling besides the default makes it impossible to do 1:1 device pixel rendering from within an app.

Windows has the complete solution for non-integer pixel ratios. For instance, on this machine, my browser's window.devicePixelRatio is `1.7647058823529411`. For my 4k screen, this means apps should be rendering into 3840x2160 pixels, but scale it /as if/ it were a 2176x1224 display. (2160/devicePixelRatio = 1224.0)

This allows Windows apps to handle non-integer scaling, whereas on Mac this causes apps to get "fuzzy", since they would e.g. render into a 2176x1224 screenbuffer which is upscaled to 4k by the OS compositor.
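The relationship the parent describes can be sketched with exact arithmetic (the 4K panel size and the devicePixelRatio of 1.7647058823529411, i.e. 30/17, are the figures from the comment above):

```python
# Physical vs. logical resolution under non-integer scaling, as described above.
from fractions import Fraction

def logical_size(physical_w: int, physical_h: int, ratio: Fraction):
    """The logical size an app lays out against: physical pixels / scale ratio."""
    return physical_w / ratio, physical_h / ratio

ratio = Fraction(30, 17)  # float(ratio) == 1.7647058823529411
w, h = logical_size(3840, 2160, ratio)
print(w, h)  # 2176 1224: apps render into 3840x2160 but lay out as if 2176x1224
```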


It's 2020 now, friend.


Sorry. Time machine problems. Should be fixed now. :)


Do you not? My work laptop has very high DPI, and I hate all the scaling issues I encounter with all the corporate software I have to use. To the point that I much prefer to plug it into a 1080p monitor.


Sounds like a Windows problem. Macs have had high DPI solved since 2012.


Sounds like a legacy software problem. Old Mac software is the same.


Nope. Old Mac software doesn't suffer from scaling issues on high dpi displays.


Windows relies too much on legacy software. The built in RDP client for example fails when you remote from a high DPI machine to a windows server that doesn’t support high DPI. Microsoft’s own RDP client on Mac upscales perfectly.


Windows having DPI scaling problems is a good problem to have since it's mostly rooted in the fact that it supports much older software.


Xorg + multiple monitors = dpi scaling frustration.

Wayland implements per-monitor scaling but not much has made the transition yet.


Adobe Apps on Windows are terrible for this. All screen recording apps on Windows have problems if your internal monitor and external monitor have different scaling as well.


They did: https://www.anandtech.com/show/15213/the-microsoft-surface-l...

Spoiler alert: it does _not_ show the AMD chip in a good light. It will be interesting to see if Zen 2 does better.


It was not the best of comparisons [0]. tl;dr they had different memory and the AMD hardware was older generation (nothing better available, it's true) compared to the Intel. There were other differences too.

[0] https://news.ycombinator.com/item?id=21976211


Sure, as I say it'll be interesting to see if Zen 2 improves things. However, as things stand today, it's not surprising that AMD high end laptops are rare, because they're not very good when they are made. Zen 2 could absolutely change that, of course, if AMD's claims work out.

(AFAIK the memory difference was because the AMD chip used didn't support LPDDR4.)


I have no complaints about my Thinkpad x395, which I use heavily (on Linux; OpenBSD I can only use when an ethernet cable is within reach, since the wireless chip is too new).

I would expect a successor on 4000U series to be even better.


Until AMD (well, TSMC) had process node advantage parity it didn't make sense for AMD to compete at the high end portable market. Now they do, so they will.


Quite a few of these complaints are equally applicable to intel-based laptops as well, particularly regarding ports/quantity of ports, and thermals/cooling. A single thunderbolt / type-c port is nice and everything, but since there's only a single one, that means I can only really plug a dock/hub type device in it, or else I'm having to give up something else I need - that goes double if the vendor supplies it with a type-c power adapter as well.


Thunderbolt-3 is the way of the future. People are building thunderbolt 3 external chassis with a gaming card and extra storage in lieu of having a desktop computer and a laptop computer.


This is how I'm rolling these days (HP Spectre 13" + 1080Ti inside an Akitio Node), and it's not without its issues, but overall I'm very bullish on this type of thing. I would love a similar AMD-powered laptop, especially if they would let me mess with the fan curves etc. With some hackery my Spectre can be configured to a 20W up-TDP state over its normal 15W, and when connected to the Node I can have it in tent mode, which improves the cooling a fair bit. As long as the fans stay on full blast (which unfortunately can only be achieved by briefly pushing it to the thermal limit), it stays at a nice ~80C at 25C ambient under all-core loads. I manage 240FPS in Overwatch; it's fantastic.

This being said, I opted for the 1080p screen (wish it were 1200p), and I personally see little reason to get a UHD screen in a laptop this size. It would be nice if it were a 10-bit HDR OLED screen with 400 nits, but I can definitely live without the extra resolution; it's pretty meaningless on a screen this small, and most of the time when I'm at home I'm using a 144Hz gaming panel anyway.


It is not. Intel gave the TB3 standard away to the USB-IF, who made USB4 out of it, and that is the way of the future. It came out in August 2019; the differences are small but important: where the TB3 bus only carried PCI Express and DisplayPort signals, the USB4 Gen 3 bus also carries USB signals. Previously the USB ports on a TB3 dock/enclosure were implemented using the hotplug USB root hub in the device's TB3 controller, and it often resulted in a less than satisfactory experience (in other words, it sucked so badly that some manufacturers used two TB3 controllers to make it better).

TB3 enclosures will be usable by USB 4 hosts.


I agree, yet at the same time I avoided it, largely because it's a zero-redundancy setup: a desktop/laptop combination is fundamentally more reliable, and it allows the failure of one device to not involve panicked repairs. Plus I like the ability to segregate the devices by function, or use one device to repair the other. Also, I'm staggered by the price of these GPU adapters considering the often-crap quality of the hardware.

Yet I wouldn't be surprised to see myself go this way in the future especially because my phone only becomes a more and more capable "Backup" every year.


In that sense, it is not a very good version of the future. https://egpu.io is a good resource to get started on this. The 4-lane (sometimes only 2-lane) configuration of Thunderbolt 3 controllers, the overhead of the protocol, and Intel's insistence on reserving DisplayPort and USB bandwidth have made TB3 slower — 22Gbps maximum in a 32Gbps port hosting graphics cards that are usually electrically connected at 128Gbps (PCIe 3.0 x16).

I am going back the other way now, after having used an external enclosure for years.
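The bandwidth gap described above can be put in rough numbers (all figures are from the comment; a back-of-the-envelope sketch that ignores encoding and protocol details):

```python
# eGPU bandwidth: usable TB3 throughput vs. a desktop PCIe 3.0 x16 slot.
tb3_link_gbps = 32      # 4 PCIe 3.0 lanes behind the TB3 controller
tb3_usable_gbps = 22    # after protocol overhead and reserved DP/USB bandwidth
pcie3_x16_gbps = 128    # what a desktop GPU is usually electrically connected at

print(f"usable share of the TB3 link: {tb3_usable_gbps / tb3_link_gbps:.1%}")
print(f"vs. a desktop x16 slot: {tb3_usable_gbps / pcie3_x16_gbps:.1%}")
```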


Or just get a laptop with dual graphics, i.e. a gaming/workstation-class discrete card alongside the integrated one.

I haven't owned a desktop since 2004.


Such laptops are usually pretty bulky/heavy, or they have a very low-end discrete card. They're a compromise.


15" Thinkpad, Asus and Dell workstation laptops are pretty alright.


Great machines, sure, but it's still (to me) a compromise. Right now I have a Macbook Pro for portable use and a full-on workstation at home with a 2080Ti in it for playing with ML and CUDA, and for gaming.

I love the idea of having one machine and just plugging in to an external GPU caddy when at home for the full capability. Unfortunately MacOS and Nvidia cards aren't friends right now...

And the laptops that have got discrete GPUs are both heavier than the ideal (I'd like a portable machine coming in at 1kg or less) and far less capable than the 2080Ti.


Every laptop I've used is thermally limited if you try to do anything interesting computation wise. Simple things like 8 minute compiles turn into 12 minutes on a laptop, with fans at full blast.


Me neither, but I look at eGPU enclosures regularly. I don't even need the GPU power, but a desktop card is miles ahead of a laptop if well cooled and overclocked.

Will consume a lot of power though, and it definitely doesn't look like a portable solution.


Overclocking is something I never cared about, the prospect of being forced to buy new hardware was never appealing to me.


It's not rocket science. I've run a 30% overclock on a machine with 80% daily uptime, on air, and saw it BSOD maybe twice over 4 years. Also, as some other people in this thread have mentioned, modern CPUs will overclock themselves, and if you undervolt you can get quite a lot of headroom, especially if you get lucky with binning. With that headroom and some hackery (ThrottleStop is often all you need) you can get modern chips to hold all-core turbos indefinitely. YMMV, but I've managed to hold a stable 35% increase in sustained processing power without running too close to the thermal limits.


The risk of breaking anything by overclocking is really low these days - you'll run into instability long before you damage anything. The best argument against overclocking, IMHO, is that the time and expense are generally not worth it. With a CPU overclock you might gain 10% performance while consuming 30% more power, after investing time and a fancy cooling solution. Gone are the days of rapid improvement when you could occasionally get a stable 33% overclock with no real downside.


That, and CPUs will juice themselves up by default. Intel has turbo boosts up to like 5GHz nowadays. I couldn't do that if I tried, and frankly I don't feel any reason to.


Yep, they're very aggressive with the voltage and frequency out of the box. Continuous per-core thermal monitoring and frequency throttling changed the game - chips no longer need extra headroom to accommodate the worst thermal conditions they might see.


USB4, really. TB3 will be replaced by it, so Thunderbolt as a separate thing is becoming history.


I know it might be practical and hip if you're constantly on the move, but, eww.


I just bought a Thinkpad T495 with a Ryzen Pro 3500U, and but for the display, I think it hits the points made in the blog. Cooling is good, it's far from a clunker and comes in an even slimmer model (T495s), and the battery life so far has been a revelation (knock on wood) compared to the Dell XPS 15 9650 it replaces.


How do you find the power for intense tasks compares to that XPS 15?


> Some motherboard makers have gone ahead and included it, but so far, AMD laptops with Thunderbird 3 remain elusive.

I assume the author means Thunderbolt 3 here.


Yes. I made that typo so many times....one slipped through....

Edit: Fixed now!


> many times an AMD laptop will have ONE USB-C port and it will be used to physically power the device, thus removing it from utility.

That's indeed annoying, especially when you need to use it also to drive external monitors. I use USB-C hubs with DP output to work around that. Works well with Lenovo E495 running Linux.

Example: https://www.cablematters.com/pc-899-126-usb-c-multiport-adap...


I would like the [new] Dell XPS 2-in-1 with the 3840x2400 screen, as an OLED - using an AMD processor. There have been a lot of complaints about the keyboard, but I find it strangely fantastic. Yes, it takes getting used to, but I can type faster on _reliable_ low-profile keys. Wish the SSD weren't soldered in. Seriously, I've waited forever for 3840x2400. I like the Surface laptops but I need at least >= 2160 in the vertical so I can watch a 1080p movie at native resolution and have it take up a quarter of the screen while I websurf/program. :(


Something no one has mentioned, but I think is important to note, is that laptop volume comes from the medium-to-low-end range. Sure, a high-end ThinkPad is a fantastic laptop, but it is never going to see two hundred units at the local Walmart.

AMD needs to improve margins and take market share. Low-end laptops bring that market share. The margins will follow once Intel can no longer rely on moving the majority of chips. Until then, Intel has the upper hand for low-margin cheap laptops; that is the critical market.


> Ultramobile, slim, and desktop replacements

No! I want proper thick modular 17" desktop replacement with 256GB RAM. Platform that would replace old Thinkpads.


We're the "nonexistent" market of IT consultants with high incomes that need a high-end laptop. As a third-party to my clients, I cannot rely on the availability of external monitors, keyboards, docking stations, or WiFi.

I lug around a heavy Clevo laptop with a 17" 4K screen, 2x RJ-45 ports, quad-core 4 GHz CPU, and 64 GB of memory.

These are my minimum requirements, and practically nothing meets them anymore!

Clevo has largely stopped making 17" 4K screens in favour of 144 Hz gaming screens.

Most laptops have tiny keyboards that are uncomfortable to type on for any length of time, and are particularly irritating if you do programming.

Most laptops are 14" or smaller, which makes my eyes hurt after an eight hour day.

Most laptops focus on battery capacity and low power operation, which doesn't affect me because I work plugged in 99% of the time.

Most laptops assume "everything is wireless" and no longer have RJ-45 connectors, let alone two - but all too often guests can't get on the main corporate network over WiFi, yet can plug in to the LAN just fine.

Most laptop manufacturers are in a race to the bottom. I paid $6,000 for my laptop because it's a tax writeoff, and I'm willing to go as high as $8,000 if I get what I want.

Apparently though people like me are "too low margin to bother with" as a market...


I've thought about making (well, more about 'having' it to be honest) a 'semi-portable' computer for people like us. Basically a briefcase that when folded open would 'swing out' two screens. Maybe you would fold it over so that the rest of the case (where the mobo etc are are in) would work as a stand for the monitors. You wouldn't be able to use it on a couch or plane but it would give you a vastly superior experience when at a desk. Sort of like a portable office. Ideally the whole inside would be 3d printed (maybe some of the display panel mounts would be metal and the rest custom) so that you can make it fit for any briefcase and any hardware you want to put in it.


There are vast untapped markets that are being totally ignored. As in, you can't vote with your wallet because there's literally nowhere to put your money.

Very few laptop manufacturers allow key customisation (e.g.: using laser engraving). None allow a choice of keyboard (e.g.: with or without a keypad, with or without dedicated ins/del/home/end/pgup/pgdn keys, separate arrow keys, [fn] button left or right of [ctrl] etc...)

Nobody makes laptops with really good HDR screens and HDMI inputs capable of RAW recording. There are production crews out there that will happily drop north of $10K for something like that.

Similarly, there's only a handful of laptops with non-key inputs, such as shuttle/jog wheels and the like.

I don't think anyone on the planet makes OLED 4K screens in the 16 to 20 inch range. They're just not made, for any purpose, let alone laptops.

I've never seen a 17 inch or larger laptop with a narrow bezel. They all seem to add an extra inch on both sides just to waste precious space.

There are telco engineers out there who would appreciate a laptop with a built-in QSFP+ connector. In general, laptops with > 1 Gbps wired network connections are very rare or borderline non-existent.

Very few laptops have built in GPS, although this is slowly becoming more common.

I bet there's a lot of field engineers (think oil exploration, forestry work, etc...) who would appreciate a large, capable laptop that's semi ruggedized. As in, not a shoebox with bumper bars designed to hammer nails in, but merely rainproof. Think cooling fans but with the fans outside a sealed electronics compartment so the water isn't sucked in.

I'd like to see an internal power supply as an option to replace the battery. Alternatively, the option to switch between two batteries, or one battery plus an integrated PSU.

I'd like to see high-wattage PSUs optimised for weight instead of cost. My 130W PSU is a literal brick in size, shape, and weight. It doesn't have to be, and Apple has demonstrated that this is possible, albeit at a slightly higher cost.

Again... the race to the bottom is fine for selling laptops to Moms and Dads who need to buy their kid something for school, but it's not okay for the high end that professionals are looking for.


There have to be small manufacturers in the US. In Spain I know these two: https://www.mountain.es/en and https://slimbook.es/en/ and I've been pretty happy with their products.


Most small laptop manufacturers sell Clevo based laptops with some customization, so there's a limit to what they can customize on the Clevo base model.


I'm not sure about Slimbook, it could be, but Mountain seems to be more custom than that.


A portable computer with a high-res screen and a standard mini-ITX board with a PCIe slot would seem to do the trick. The big problem is battery life (unless they could live without a battery, which would help the size quite a bit).


What happens when you get called in to a meeting? Do you take your luggable and dominate the conference room table with it? Or carry around a second laptop or iPad?


For the occasional presentation I either use whoever's laptop is already plugged into the projector, or the dedicated machine that's already in the room. For day(s)-long meetings, I take a dedicated ultrabook.


I'm surprised you find that guests can plug into the main corporate LAN. That's a sign of totally incompetent security (unless it's a lab network or something firewalled away from the regular enterprise network). Wired Ethernet access should be locked down to only authorized devices.


You can do 802.1X cert-based authentication on a wired LAN too.

Don't dismiss other people's technical choices just because you're ignoring the existence of a certain technical possibility.
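
For reference, the wired variant is just plain 802.1X driven by wpa_supplicant. A minimal EAP-TLS client sketch (the file path, identity, and certificate locations below are placeholders, not any particular distro's defaults):

```
# /etc/wpa_supplicant/wired.conf -- hypothetical wired 802.1X client config
ap_scan=0                       # no WiFi scanning; this is a wired interface
network={
    key_mgmt=IEEE8021X          # port-based auth instead of WPA
    eap=TLS
    identity="laptop01@example.com"
    ca_cert="/etc/ssl/corp-ca.pem"
    client_cert="/etc/ssl/laptop01.pem"
    private_key="/etc/ssl/laptop01.key"
}
```

Started with something like `wpa_supplicant -i eth0 -D wired -c /etc/wpa_supplicant/wired.conf`; the switch port has to be configured as an 802.1X authenticator against a RADIUS server for any of this to matter.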


I didn't dismiss other people's technical choices or ignore the existence of a certain technical possibility. 802.1X is obviously a way of locking down a LAN to authorized devices. I assumed every network administrator would be familiar with that.


PC Specialist has some nice custom-built 17" laptops. You can configure them with 4K displays, 8-core i9s, and 64GB RAM; might be worth checking out. Looks like they have RJ45 ports as well.

https://www.pcspecialist.co.uk/saved-configurations/octaneVI...


I assume you've heard of them before, but just in case: System76's higher-end laptops have a 17" 4K display, RJ-45, high-end CPU, high-end GPU, and 64GB RAM.

https://system76.com/laptops


Doesn't the Thinkpad P7x line of laptops still tick all those boxes? The only one it might not have is dual RJ45 ports, but I'd think a USB RJ45 port would suffice in the cases where a second LAN port was required.


You and me both, but unfortunately we are an invisible minority, since just like with phones, when it comes to laptops, consumers want something sleek and sexy that looks good in a hip cafe regardless of thermal performance or repairability.

Gaming laptops are 80% there though but the ones with good build quality and modularity carry a hefty markup.


What about an amazing 14" with the quality screen of a MacBook, a newer Ryzen 7 4800U/H, 32/64GB RAM, and a PCIe 4.0 drive?

And no RAM/SSD is soldered


As much as I hate it, I think it makes business sense for vendors to solder. You can "upgrade" by buying new. I can't imagine the business value of such a laptop - esp now that hardware improvements are slowing down (for the typical use case)


What about dead SSDs or RAM defects? You swap an entire machine just for those components? And what if that 512GB drive no longer fulfills the requirements?


Time for geeks everywhere to practice their de-soldering skills.


For some reason, I do not expect Dell or HP or anyone big to release a decent, high quality workstation with AMD inside. HP has consumer laptops with Ryzen, and they're the cheapest sheet possible, as expected. I believe Dell does the same.

By decent and high quality I mean socketable CPU, GPU and RAM, as well as a good selection of ports (forget TB if it's not possible, give me eSata) and high quality displays (even if 1080p). Could not care less about the weight/size.

The only ones I can see doing that are Clevo/Sager/Eurocom, or someone with a good business plan and a well funded Kickstarter...


Some of the better pieces of hardware have been gaming laptops that feature desktop parts, which I believe are also socketable/replaceable. Not sure how far you can get with them, but that's one area I also hope to see growth in AMD-powered offerings.


Excellent challenge to the big OEMs. I doubt many will give it a second thought.

What I'd love to see is some of the gifted and talented Makers we've seen in 2019 whip out their dremels and 3D printers and say, "THIS is what I want!"

Couldn't you take a chassis with a high-end screen and fabricate a mount that would accommodate securing an AMD board to the existing mounts of a previous Intel board?

I know I sound naive, but I haven't looked into it yet. It certainly seems possible from a basic engineering standpoint.

EDIT: my horrific grammar


At least part of the problem is getting said board; laptop motherboards are often specially manufactured for each chassis, and there is no standard form factor like there is for desktops. It might be possible to buy one from a different model, but most boards aren't socketed for AMD chips so that's out, and it's not really feasible to just build your own motherboard.


Very true, and I understand your point. I was trying to suggest that you could take an AMD motherboard and 3D print some sort of adapter to fit its form factor to the particular form factor of the existing chassis.

At least as a PoC.


FFS, I hope they do this. I waited almost a year for a Lenovo that came out with single channel RAM, and got an HP instead.

Screen was only FHD, and keeping Linux working on it has been painful.


Microsoft recently offered a model of the Surface lineup with a Ryzen mobile processor. That can't count for nothing, right?


It's not high-end at all.

Only 8GB Ram, no dedicated GPU, no PCIe 4.0 drive..


It’s worth noting that neither AMD nor Intel’s latest notebook CPUs support PCIe 4.0. I imagine that won’t be coming until the next generation. Also, you can configure the Surface Laptop 3 being referred to with up to 32GB of RAM: https://www.microsoft.com/en-us/store/configure/Surface-Lapt...


The Ryzen 4th gen will support PCIe 4.0.


Not on laptops, according to the anandtech article.


Not high-end, but I bought a couple of $300 AMD laptops (Acer Aspire 5) and they are great: 15" 1080p screens, and they're faster than my "high-end" laptop from 5 years ago.

https://www.amazon.com/gp/product/B07RF1XD36



The AMD variant does not have more than a 1080p screen. Also, seems to be no longer available when you go to the "Models" link.


It simply takes time to develop everything to make this happen. These high end ultrabooks require more time to engineer. One thing is certain, anyone developing this would surely not announce it in advance. Why piss off Intel until you have to?


3:2 display? I know it harks back to CRT specifications, but 4:3 was just right for pretty much all computing purposes other than watching wide movies. Even better? 3:4, a 90-degree rotated tall screen.


"For as long as I can remember, AMD-powered laptops’ potential have been intentionally shortchanged by OEMs for any number of reasons – perceived lower value, kickbacks from Intel, or sometimes legitimate performance limitations."

Seems to be a reason missing from that list - very low demand.

There are clunky low-end Intel laptops just as much as there are AMD ones. If enough people show they are willing to buy these, then high-end AMD laptops will spring up a few minutes after.


This is patently untrue, as I also called out in the article. Voting with your wallet is not possible if the option to buy what you want is not even presented. Moreover, there's a self-fulfilling cycle: OEMs offer sub-par AMD systems, people don't want those, and then the OEMs say, "see, we offered (lousy) AMD systems and no one got them".

The problem is there are very few DESIRABLE AMD offerings, and moreover, the vast majority of people don't even care what's inside, they are looking at price.

This is all before even mentioning that AMD systems, even when performant, have been positioned as value or low-end systems, just to differentiate between them and the Intel offerings.

It's also already heavily covered elsewhere in the thread, but Intel has been furiously and notoriously working to undermine AMD with backdoor deals to keep them from a greater marketshare.


With all these CES announcements, it's sadly too late to save devices shipping later this quarter. :(


You're right, but it's a good time to highlight the problem with all the attention on AMD products now. Hopefully someone's listening and can point to this to influence, however slightly, opinions in a constructive manner.


Yes, please!


"There needs to be display options well above FHD/1080p."

With a 13 inch screen, 1920x1080 (roughly 165 PPI) is already past the pixel density (about 143 PPI) at which someone with 20/20 vision stops noticing pixellation from a distance of 24 inches. Even a 15 inch display (roughly 141 PPI) is right at that threshold. Realistically, it would make no difference in usability to go beyond 2560xwhatever in resolution on a laptop, and even that's already getting excessive. Most people with high res displays are already using magnification, which defeats the purpose of such high resolution.
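
For anyone who wants to check the arithmetic, here's a quick sketch. It assumes the standard 1-arcminute acuity figure for 20/20 vision, and the exact PPI values depend on the diagonal you assume (13.3" and 15.6" are the common panel sizes):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_ppi(distance_in, acuity_arcmin=1.0):
    """PPI at which one pixel subtends `acuity_arcmin` of arc at
    `distance_in` inches (1 arcminute ~ 20/20 vision)."""
    return 1.0 / (distance_in * math.tan(math.radians(acuity_arcmin / 60.0)))

print(f'13.3" FHD:        {ppi(1920, 1080, 13.3):.0f} PPI')  # ~166
print(f'15.6" FHD:        {ppi(1920, 1080, 15.6):.0f} PPI')  # ~141
print(f'Needed at 24 in:  {retina_ppi(24):.0f} PPI')         # ~143
print(f'Needed at 15 in:  {retina_ppi(15):.0f} PPI')         # ~229
```

The last line is the interesting one: at a more typical 15-inch viewing distance, the acuity threshold jumps to roughly 229 PPI, which is why sibling comments holding laptops closer find FHD noticeably pixelated.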

I'm using the new Honor MagicBook with Ubuntu MATE installed and am quite pleased with it. Lightweight, good battery life, good screen, good touchpad (SUPER IMPORTANT), USB-C (only one port, unfortunately), dongle-free HDMI, and enough power for development. And the price can't be beat.


Pixelation is not the only factor that makes a high-res display worth it. Smoother fonts, scaling, and overall sharper graphics are all perceptible even if you are not consciously aware of individual pixels. All of this contributes to, or helps alleviate, eye strain and fatigue.


Fonts are so god damn crispy at 4K that you have an "awwww maaaan" reaction to going back down, particularly when using an aesthetically pleasing font like Fira Code Retina with ligatures.


I use my 13" laptop screen at a comfortable distance of 15 in (40cm). 2K is definitionally not enough - text is either jagged or blurry. 4K is the next logical step because you can use 2x scaling.

Using scaling ("magnification") doesn't defeat the purpose. The purpose is to have clear text.


Unclear text is really a hinting+rendering problem but it's so often bad (cough cough Windows..) I'm also a fan of just throwing hardware at the problem. I quite like the ~3K display on my xps 13, although the occasional program (say, written in tcl or Java by a professor for a specific course) will not scale, which is "fun".

Separately though, where have all the high-dpi matte displays gone? Why's it impossible to get a 4K matte touchscreen in any laptop? I don't want to see my face in the screen while I'm trying to read something.


People use laptops at far closer than 24 inches, though?

Not all the time, sure. But it's kind of half the point of a laptop to be able to use them where you can't just clear out two feet to put between your face and the screen.

You can get that much space in an economy seat if you don't have a head, maybe.


When my laptop is in my lap, the screen is about 60 cm (24 inches) from my eyes. I could move my face 10cm closer, but not for long without being uncomfortable (any closer and my elbows would be behind my back).

When it's on a desk, the screen is at a similar distance.


That seems like a weird situation to optimize for. 24" between your laptop screen and your face is really not a lot of space.


I currently have a 4K 15.6 inch laptop; it is very easy to differentiate it from a 1080p one.


I certainly notice the difference between my 4K OLED 15.6" Thinkpad X1E and my old 1080p 17" MBP. The smoothness is unbelievable.

No idea how much of that is the 4K and how much the OLED contributes. I certainly think QHD should have been enough, but that was not an option, weirdly. Having to choose between 1080p and 4K is absurd. Give us some options in between!



