AMD’s Mobile Revival: Redefining the Notebook Business with the Ryzen 9 4900HS (anandtech.com)
322 points by partingshots on April 20, 2020 | 226 comments



What is interesting here is that multiple news outlets are calling this "AMD's revival". As far as I'm aware AMD has always been kind of "second rate" in the mobile space - until now, apparently. There's nothing they're really reviving from - it's a first.

Even when AMD was previously whooping Intel - in the Athlon 64 vs Pentium 4 days - AMD was not particularly great in the mobile space. The Athlon 64 tech scaled down to mobile use well enough, but AMD did not have a good answer to Intel's broad Centrino platform for mobile technologies. Moreover, Intel was quick to introduce the Pentium M to the mobile market, which was a much better and more competitive product than the Pentium 4 (and the direct predecessor of the "Core" branded processors that marginalized AMD again later). AMD was competitive with Intel before the Athlon 64 too, but the mobile market was more niche before the early 2000s and the differences were smaller anyway. AMD was ahead of Intel multiple times in the past, but never in laptops.

This seems to be the first time AMD is not just "sort of competitive, but meh" in mobile products - the first time they're in the lead technology-wise. The first time you could be better off buying an AMD laptop.


Agreed.

I remember you were more likely to spot a VIA processor in a laptop than an AMD processor before the Athlon 64 mobile processors arrived as Turion.

This is new.

Intel really did accelerate into the mobile laptop space with Centrino. It seemed like no one could match them on the combination of performance, aggressive power saving and battery life, consistently good WiFi, and good GPU solutions (Nvidia Optimus integration).

It looked up until now like Intel had some secret sauce in its drivers or elsewhere as well, to get such (relatively) good battery life out of the pig that Windows running non-UWP apps can be. I wonder how much is AMD getting better at software/drivers and optimization, and how much is AMD just getting such a lead in power usage and performance that they can compensate for their software.


> to get such (relatively) good battery life out of the pig that Windows running non-UWP apps can be.

What does UWP change regarding battery consumption?


UWP apps use different APIs to interact with Windows than Win32 apps do. So at least in theory, the OS can actively manage them similar to Android or iOS apps.

I've noticed that Zen 2 on desktop can easily suffer from recurring spikes in temperature, caused by background Win32 apps running on a 10-30 second timer, doing a small amount of work that causes the CPU to boost to its maximum clock, then drop back down. I could see a lot of that mitigated with drivers or CPU code that is much slower to ramp up in response to load, but it would be better still if the OS wasn't running any apps that behaved like that, especially while the user isn't actively interacting.
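
If you want to see the pattern for yourself, here's a rough sketch in Python of the kind of periodic wake-up I mean (the interval and the amount of work are made-up values); run it next to a clock/temperature monitor and the boost spikes line up with the bursts:

    import time

    INTERVAL_S = 15          # made-up value in the 10-30 s range described above
    BURST_ITERS = 2_000_000  # made-up "small amount of work"

    def small_burst():
        # Short CPU-bound loop; even a brief load like this is enough
        # to ramp a Zen 2 core to its maximum boost clock.
        acc = 0
        for i in range(BURST_ITERS):
            acc += i * i
        return acc

    while True:
        small_burst()           # spike: core boosts, temperature jumps
        time.sleep(INTERVAL_S)  # idle: clocks and temperature drop back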


Shared resources, and the ability for the OS to dictate background activity on a per-(UWP-)app basis.

Probably more; I'm not that hip on UWP.


I agree. I'm quite happy to see AMD compete effectively. More competition is ultimately good for all of us. Unless it drives them to cut corners resulting in security issues, like with the recent spate of Intel issues, but then it's good to have an alternative.

I just bought a laptop for my son, and felt this was a good opportunity to try out a Thinkpad with a Ryzen processor.


AMD offered mobile versions of their K6-2 and K6-III chips; they weren't too far behind (in speed or power) at the time.


A revival from "nonexistent" to "on the market again at all"?


That's not a revival, that's a birth. AMD has never found any success in the mobile market before. Even the APUs didn't find much success.


https://www.anandtech.com/show/15708/amds-mobile-revival-red... is the page you really want to look at along with the battery life at https://www.anandtech.com/show/15708/amds-mobile-revival-red... which just destroys the i7-9750H.


I picked my kid one of the new 4800H-based laptops for the heavy Adobe work she does at work and university. Unplugged, I can watch full-screen twitch.tv videos with all indications looking like the 9+ hour estimate might be correct - which was a bit of a shocker. I've not had a chance to do much else with it... as she won't give it back. :)


Yeah, that's quite a shocker truth be told. I bought a little "netbook" last year for my mom and I was surprised the Intel Goldmont+ [0] it has inside could pull 11 hours of Netflix.

Now having a beast like those Ryzens pull a similar feat? Sign me up! I'm looking forward to getting one of these!

[0] Goldmont(+) is an architecture where Intel pulled a few tricks from Skylake, put them in the blender with an Atom, and got a surprisingly good yet power-efficient design... for late 2018, that is. Seems AMD topped them there too, and without having to go for a less powerful design.


I've been looking for a good netbook that isn't a chromebook. Which one did you pick up?


Not OP, but if you're looking for solid performance in a tiny form-factor, I'd suggest One Netbook 3.


I'm OP: got a way too stripped-down HP Stream. For the use case ("here you go mom, you've got Firefox and Netflix installed") it was more than enough.

But it does have some severe limitations. Soldered 4GB RAM and a 32GB eMMC drive. Soldered everything, basically. The processor has a 6W TDP [0]. The free storage space is so limited that once it got a big Windows update, I had to disable virtual memory just to get Windows to update, then clean the update files and re-enable virtual memory.

OTOH, the processor has 4MB of L2 cache. That's a monster performance boost in that form factor, so it doesn't feel like previous Atoms (and props to Intel for not calling it an Atom; it's called Celeron N4000 or Pentium N5000). Do take into account this is Q4-2017 tech, so the Ryzens this year have leapt over it/are about to.

The computer doesn't have fans, but it never gets hot - a bit warm at most. The final touch is a matte screen [1] at 11", paired with a comfortable keyboard. Nothing about it is incredible, but I got it at a good discount (about $180), which was a killer price.

I did a quick amazon search and it seems the 11" model has been discontinued and people are selling at $300 (that's a good one) and the 14" stands at $240 lowest price.

As for buying something today, I've read great reviews of the Motile 11" "netbook" [2] (not the 14") that packs a Ryzen 3200 (that's Zen+, not Zen 2), which delivers good performance in a low power envelope, with decent build and good specs. They sold those at sub-$200 for a while, but then the (too?) good reviews bumped the price up to the $300+ range. Still better than other alternatives at that price, but not a killer.

I'd certainly look to buy one of those today if I was looking at the same use case, mainly because I want to support AMD, and low-power Intel is still kicking the arrival of Tremont (the successor to Goldmont+) down the road. And given the performance AMD is pulling from these mobile Ryzens (especially the Renoir chips), I doubt Intel's Tremont chips will hold their ground -- rumour was about a 30% IPC improvement over Goldmont+, but that's already a very underpowered chip.

Well this got quite a bit longer than I expected. That's all I know/care about small, lightweight, cheap laptops these days. Hope it's useful to someone!

[0] https://ark.intel.com/content/www/es/es/ark/products/128988/...

[1] I hate glossy screens; the glare always makes my eyes uncomfortable if I read too long from them.

[2] https://www.notebookcheck.net/Walmart-s-most-affordable-Moti...


I love to tinker but my best investment has been a recent purchase of an Acer 11" Chromebook ("311 spin") for the kids. Touchscreen/convertible, Celeron 4100, 64gb.

It just works, is less than 1kg, 9h battery life, no worries about what the kids might click/install/... and for myself a hassle-free Linux (VM?) for some quick evening fun.

Just wow. All my preconceptions about Chromebooks are gone.

Only downside is that for kid accounts you can't install Chrome extensions - so no uBlock. But you can install Firefox's Android version, so also not the worst.


Yeah, forgot to mention weight. The HP Stream I talked about is about 1kg/2lbs, so it's super lightweight. Seems like we're talking about similar machines, though you have extra storage space that would certainly make it more comfortable to use on a day-to-day basis.

Don't really know why HP decided to kill the 11" form factor. I suppose it "competes" with 11" tablets? Who knows. It's super comfortable to use and I've taken it on a few trips; it can certainly show a presentation slide, and even handle some light Python programming on the go (but certainly not numpy or anything "heavy").

Which is why those Ryzen 3200 notebooks looked so interesting at sub-$200! But like I said, I'm certainly waiting for Renoir and seeing if some lightweight, well-powered, energy-efficient, well-priced notebooks appear on the market.


Microsoft, please use this in the Surface Laptop update for this year. Last year's Ryzen Surface Laptop 3 was a bit anemic IMO. This would be insane; pair it with upgradeable storage and RAM and I would buy it over a MBP.


Once a machine gets old, I usually donate it to somebody who can use it further. In that regard, my Surface Book 2 is one purchase that I wholly regret. Once the battery dies there is no way it can be replaced - not even by Microsoft. They just send you a refurbished piece for $500.

This expensive machine contains high-quality components, yet it turns into e-waste once the battery dies. For the sake of the environment, that should be illegal. I will never buy Microsoft again.


The other thing Surface does wrong is no Thunderbolt. I wouldn't buy a laptop without Thunderbolt ever again because being able to attach a new GPU extends its usefulness by years. I can still play games on my 2013 13" MBP using Thunderbolt 2, anything with 6+ cores and TB3 is going to be competent for a very long time.


Absolutely agree here. However, the change from Surface Laptop 2 to Surface Laptop 3 made it a lot more repairable. Hopefully this year's model will have an easily replaceable internal battery.


The Surface Book line is more akin to a tablet than a laptop, as the guts of that machine are still in the screen, much like the Surface Pro. I don't think it's unreasonable that there isn't much in the way of repairability given its tablet form factor.

Don't get me wrong, I'm disappointed myself in this product, but it doesn't take away from the successes elsewhere in the Surface product line.


When I bought my laptop, foremost on my list of features to look for was actually having the ability to open it up and replace parts (RAM, HD, battery, etc).

Unfortunately, Microsoft went the same direction as Apple with the Surface Book (minimal to zero repairability).


It’s frustratingly hard to find a modern laptop with a replaceable battery. Which laptop did you get?


Unfortunately, I was unable to find one with a replaceable battery. I at least got one with other replaceable parts, though. I got the HP ENVY 15z (Ryzen 2500U).


Thanks for letting me know. It's so frustrating that laptops don't have replaceable batteries these days, as the batteries will last for much less time than the rest of the laptop. Plus, if you use your laptop plugged into the AC socket most of the time like I do, this is not good for battery health, and it could also be avoided if the battery was removable.


Oh even better, the Surface Book. I much prefer that form factor over traditional ones.


The Surface Book 3 is expected to be announced shortly, but early rumors and leaks point to Intel. But a boy can dream!


I get a feeling that was the plan all along.

With AMD processors in the next Xbox, they'd have pretty good visibility already on the AMD offerings coming.

I think ARM based Surface RT was a similar plan, getting in early on ARM to be ready for when their performance matched Intel. It's a shame of course that Qualcomm never got there, and it was only Apple that kept ramping ARM processor performance.


Yes. This would also make me consider moving from a Mac.


Are you on a Mac just because of the CPU?

What do you do that solely depends on the CPU and would improve so much by moving to AMD, at the cost of a completely different OS and software?

It would still be beleaguered by Windows, the main reason most of us moved to Macs. :)


I mean honestly, the only reason I would stay with Mac would be the 4 TB3 ports; the heat and performance of macOS machines pale in comparison to their Windows production counterparts.


macOS's thermal management appears to be better than Windows': https://youtu.be/LGOmbNRlZdM?t=495 (8:15)

It also gives a better battery life on average compared to Windows on the same machine.

And the new Mac Pro is apparently a beast in performance and cooling.


I will say I consistently get better benchmark scores on CPU in macOS than Windows, but anything that involves the GPU is hilariously poor on macOS compared to Windows.


> but anything that involves the GPU

Even on Metal?


> It also gives a better battery life on average compared to Windows on the same machine.

Only if you have a dedicated GPU, because OSX can easily switch between integrated and discrete graphics.

On my MacBook 12" I get 2 extra hours out of Windows 10. But on my old Retina MacBook Pro I get almost half the time out of Windows 10.

It's a shame that Apple won't go AMD, because the battery life would be awesome on OSX with a discrete GPU.


> It's a shame that Apple won't go AMD

I think they're probably just gonna go with their own custom ARM processors, which may have even better battery life.


I feel like going to ARM will alienate the customer base of developers that Apple has, won't it?


The developers that use Xcode with Objective-C probably won't feel any difference, as Xcode will likely just silently compile the code into ARM (or both, a la universal binaries from the PPC days). It's too early to tell though.


There are a lot of programmers that use macOS who don't even touch Swift or Objective-C. I'm in the web development industry, and I'd say 50% of the developers I know use Apple computers for their job. Java, Ruby, even C#.


After the Java/Ruby/C# runtimes are recompiled for ARM, those folks aren't gonna feel any change either.


That seems a little simplistic. It isn't just the runtimes, but also all the tooling that goes with them. There would be a lot of work involved with recompiling all the tooling and runtimes to work on ARM, and it would probably alienate a large portion of developers currently using Apple products as a result.


This was already done twice in the past, once on the transition from Motorola 68000 to PowerPC, and once on the transition from PowerPC to Intel. Both arrived with updated versions of interpreted language runtimes. We'll have to see what they do this time.


I thought MS Surfaces use ARM chips?


Surface Book uses Core i5/i7.

Surface Pro uses Core i3/i5/i7.

Surface Laptop uses Core i5/i7 or AMD Ryzen 5/7.

Surface Go uses Pentium Gold.

Surface Studio uses Core i7.


They experimented with that on a few models in the early days of Surface machines, but quickly gave up.



The Surface Pro X that was released last year uses an ARM64 processor.


I went from a €2000 MacBook to a €500 ThinkPad with Ryzen. Before the lockdowns. Not comparable whatsoever. Benchmarks are usually 30% to 300% faster. The Ryzen even blows away my fastest desktop machine with Intel and 4x more RAM.

Unfortunately I cannot benchmark with the fast AMD because it switches frequencies too fast. Lots of state changes. E.g. it's faster at higher load than at low load. All my benchmarking logic is wrong. You really need a locked-down kernel now for proper measurements.
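
On Linux, at least, you can get most of the way there from userspace (a sketch, assuming the standard cpufreq sysfs interface; exact paths vary by driver, and it needs root):

    import glob

    # Pin every core to the "performance" governor so the clocks stop
    # bouncing between states mid-benchmark (requires root).
    for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor"):
        with open(path, "w") as f:
            f.write("performance")

    # Some drivers (e.g. acpi-cpufreq on AMD) also expose a global boost
    # switch; disabling it removes the opportunistic boost clocks entirely.
    try:
        with open("/sys/devices/system/cpu/cpufreq/boost", "w") as f:
            f.write("0")
    except FileNotFoundError:
        pass  # not exposed by every driver/kernel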


Is that a Renoir (Ryzen 4x00) chip? As far as I've read, the 3x00 mobile (Zen+) were good but not amazing. The Renoirs seem to be blowing everyone out of the water, and I've read several times that they seem to pack "desktop performance in a mobile envelope".


Yes, just the old one, a 3200. Still much better than my Intels.


While that might be true, and I am happy that you're happy, this is something different. Intel has products competitive with your CPU - outright better ones, too.

To Ryzen 4000(mobile) they don't. It's not even close.


And apparently they won't have any against Ryzen 4000 desktop (Zen 3 on 7nm+) either. Interesting times ahead...


Since there is not _yet_ a Thinkpad available with this chip, you must have a previous-gen Ryzen chip, which, yes, was still a bit behind. That has no bearing on this new gen chip, which is out-performing all expectations.


Which ThinkPad model and chip?


Kinda weird to deep link into the middle of a review.


My mistake, I didn't realize that I'd done that. Anandtech seems to publish each part of a review as a separate page, which I didn't realize when copying the article link to post.

Maybe dang or someone else could potentially change the link to point to the beginning? I don’t think I can edit anything on my end anymore unfortunately.


For a business notebook, it links to the page comparing game performance.


I think you misread “notebook business” as “business notebook”


Oh, apologies!


It's not a business notebook.


Depends on your business


If “this laptop is the business” does that make it also a business laptop?


It depends also on your definition of “business time”: https://youtu.be/WGOohBytKTU


I misread the title as 'Redefining the business notebook...'

This looks like a very good CPU. At some point it would be nice to see a well-cooled laptop with an 8 core CPU benchmarked against an 8 core desktop. AMD seems to be doing very well on the TDP side and I wonder if we may be reaching the point where only games and a handful of really demanding (non-cloud) workloads require a desktop.


> I wonder if we may be reaching the point where only games and a handful of really demanding (non-cloud) workloads require a desktop.

I sort-of feel like we are there already. I think we are quickly reaching the point where even games and most demanding workloads don't require a desktop. I work on video games professionally and even I do most of my work on a laptop.


Do you have any suggestions on laptops? I've been using a MacBook Air 2013 as my daily driver for a good minute now.

I've been eyeing the Razer Blade; while the build quality is enticing, the battery life has me worried.


Laptops come in a huge variety of shapes and sizes. If I was buying for battery life, I would think the new AMD Ryzen 4000 series would be high on my list though.

For reasonable portability, decent gaming and excellent CPU performance, the laptop in the article is $1450, though availability is still low.

For seriously good performance on a tight budget, the Asus TUF 506 (A15) line packs a Ryzen 48xx and up to an RTX 2060 into a $1000-1200 laptop. The screen is high-spec on paper, but reviews of older versions of this laptop had bad things to say about the screen, so I'd wait for reviews.

There are other Asus options like the A17 and Zephyrus G15 which are all worth a look, and starting to show up in stock here and there.


Me, not really. We were using the Razer laptops at work, but they had some frustrating QC issues on the bigger 15" and 17" models. I had a coworker that liked her Razer Blade, but do know what you are getting - it is pretty underpowered.

Personally, I can deal with reasonably under-powered; I still have a desktop when I need it, and I mostly work on proprietary engine technology that doesn't need as many resources as, say, an artist or designer who spends most of their day fully in in-game content.

I do a lot of rendering work so I have an eGPU (Razer Core), and the integrated Intel GPUs aren't so bad anymore.

So, having said that, I have the most recent XPS 13 2-in-1 as my primary machine. It's for sure a compromise in power, but it does most of what I need it to, and I often RDP into the desktop when it doesn't.

I chose the 2-in-1 because it was the first 13" laptop that I could find that had 32GB of memory and met all my other criteria (USB-C charging, etc). I understand the non-2-in-1 model now has a 32GB configuration, so I'd consider that for sure.

Compilation is my largest frustration at this point really, but I'm playing with various remote/distributed build options to speed up that workflow.


This probably isn’t what you’re looking for but I’ve been playing RDR2 via Stadia and it works surprisingly well. It’s probably not at the level a hardcore gamer would want but streaming is well worth a look if you’ve been put off laptops because you want to play a few games now and then.


I would like to use a laptop as a desktop replacement, but the problem is laptops are too noisy.


I haven't seen any thorough analysis, but here's one example of some mild comparison:

> Core i7-9700K, a desktop processor with a 95W TDP

https://www.techradar.com/news/the-amd-ryzen-9-4900hs-is-fas...


Right. What's weird is Anandtech themselves don't seem to have the common benchmarks in their 'Bench' tab - e.g. to compare Cinebench R20, the 4900HS isn't even shown: https://www.anandtech.com/bench/CPU-2019/2580

But comparing the numbers - 5070 for the 9900KS vs. 4394 for the 4900HS, i.e. the mobile chip hitting roughly 87% of the desktop flagship's score - it looks quite good.


It was my experience that the Bench data takes a moment before it has current reviews.


On a somewhat related topic: I tried to get a laptop at the beginning of this lockdown (in Feb) and the market was looking kinda bleak. Had my eyes on a Ryzen 7 with 8 GB RAM and a 500 GB SSD which went for ~$500 at the time. Boom, gone. Also, what's left is only crap that's expensive. Laptops that were laughed at only 4 months ago now go for that $500 (I mean c'mon, paying $500 for a laptop with a decade-old CPU is laughable IMO).

Anyway, coming back to this topic, how fast do you guys reckon this platform will come to consumers, given the current circumstances? I'd wager it's not this year.


I've seen laptops with Ryzen 4800HS CPU in stock at NewEgg and Amazon. Not often at Best Buy yet.

Look for the Asus TUF 506 and Asus Zephyrus G15.


My wife needed a new laptop just recently and she's a ThinkPad-only kind of gal. It was nice to see that there are many AMD options, but you can't buy them. In fact, you can't buy much of anything from Lenovo. Outside of the L13 (what she ended up with), everything is "ships in 5+ weeks" on the Lenovo site.

Having one in stock at NewEgg or Amazon might be helpful, but they are typically base spec with soldered memory. If you want 16G+ memory, you are just generally out of luck.


Don't buy those from Lenovo yet. They do not yet have this new generation of Ryzen chip available. Don't buy a last-gen slow part.


Still waiting for a similar review of the 4900U, which I understand targets a much larger market with its very low power footprint.


Intel can still play the marketing budget game, where it can bundle Thunderbolt (for Apple) or a WiFi module all while lowering the price. (For servers they were doing it with SSDs.)

If anything, I think that was one of the reasons Apple got their MacBook Air to be priced at $999.

Despite having the better product, AMD will still need to work on its marketing message, support material, distribution channels, sales, and forecasting. Right now AMD seems to be winning on technology while Intel is winning on operations.


The TB controller is ~$9 [0]. That should be next to nothing for a premium laptop. Yet even HP, on their premium/pro Z-series, again went with Intel, despite the fact that the current-gen Z-series (a series that tops out at ~$10000 in the highest-specced models) includes configurations like an i5-8400H, Intel UHD 630 iGPU, or 8GB of RAM. Not a real "workstation" config, not high performance, no ISV certifications. Yet AMD was still left out.

I think one reason is many OEMs don't want to invest in an AMD platform until they're sure AMD will be a long-term success. The second reason is probably that they don't want to annoy Intel. I'm sure there are still plenty of tricks Intel can pull to achieve the same result as in the past but without running afoul of regulators.

[0] https://ark.intel.com/content/www/us/en/ark/products/series/...


> That should be next to nothing for a premium laptop.

For a low-end to mid-range laptop, a BOM cost of $9 is a lot. Margins are especially tight for PC manufacturers - so bad that installing crapware becomes a major source of revenue. For the high end, the marketing budget comes in the form of rebates.

One of the major reasons AMD got off to a good start with major manufacturers is actually that Intel can't even provide them enough chips in the first place. And hence Intel hasn't done much to push back at all, because they don't have the capacity right now.


Which is why I said "premium laptop". I don't expect lower end laptops to include it regardless of the cost of the controller itself. There's far more than the $9 involved: greater PCB complexity, additional ICs, extra connectors, etc.

But a premium line like HP's Z-series includes TB even in the lowest config (i5-8400, UHD 630 iGPU) starting at ~$3500. I'm reasonably certain that going with AMD CPUs would actually lower the BOM despite the extra $9 TB controller. Intel CPUs are not cheap.


We're seeing a much stronger resurgence this time. These transitions won't happen overnight, but will play out over many product cycles. It was a recent surprise revelation to the public how powerful the 4000 series is!


What's the price of adding an extra controller on a motherboard?


The same regardless of CPU so I considered it irrelevant for the comparison. The difference between an OEM building Intel or AMD based systems is in the fact that the TB controller comes for free with Intel chipsets. The extra PCB complexity, ICs, and connectors are still the OEM's burden.


The TB controller is built into the CPU with Intel. So it's still relevant.


Related: AMD 25x20 Energy Efficiency Initiative

https://www.amd.com/25x20


I've been asking AMD since January to get an update to this with Renoir. They say they are planning an update for later in the year. Perhaps there's a better Renoir system to come?


Is there a firmware available for these that can disable the AMD version of the Intel ME like the Purism/coreboot ones do?


Nah, I don't think there's a way to disable the PSP on Ryzen.


There was a talk at CCC last year by a group dissecting the PSP [1].

It would not be a stretch to say that eventually it will be possible.

[1]: https://media.ccc.de/v/thms-38-dissecting-the-amd-platform-s...


Hello,

"Disabling" the PSP in such a case would be a matter of definition. Ryzen doesn't support cache-as-RAM anymore, with the PSP doing RAM initialization. The furthest that you'd be able to go is getting a minimal PSP firmware that only handles system bringup.

Some OEMs currently offer an option to disable the PSP <-> main processor communication interface during bootup.


Would that proposed option disable PSP main memory access? It’s access to system memory + access to PCI devices (networking) that most concerns me.


I would gladly pay an extra $50 for a variant of the CPU without PSP.


Any idea how the 4900HS compares with the ARM chips in the new iPad Pro?

I saw a benchmark of the 2020 iPad Pro vs the 2020 Macbook Air where the iPad Pro won by quite a bit on pretty much every benchmark. If Apple were to migrate the Macbook line out of Intel, it would surprise me if they didn't migrate to ARM instead of AMD.


There is something interesting going on with those JavaScript Geekbench benchmarks worth bearing in mind.

Despite Apple now for years destroying everything but the latest high-clocked Intel processors running Chrome (all other browsers get hammered), this doesn't really translate into much in the real world.

Sure, the JavaScript performance is probably excellent, the best maybe, but I feel like they've only optimized for that.

I don't see video encoding or 3D rendering being a strength of Apple processors any time soon, but I guess even on desktops that's moving, or has moved, to GPUs.


I still have a hard time believing those benchmarks. Are they really comparing the same things - just compiled for different architecture? Or is there more behind it - one is a "mobile" benchmark, the other something else?


> Are they really comparing the same things - just compiled for different architecture?

IIRC they were just Geekbench 4 and 5, which are standard benchmark suites for the macOS and iOS ecosystems, and IIRC only the subsets of the benchmark suite that ran on both the MacBook Air and the iPad Pro were compared.

The only thing that I know about Geekbench is that it benchmarks application performance, e.g., by running an application and performing a task (e.g. Cinema, Adobe, etc.).

I don't know if they use the same applications and the same tasks on every OS/arch combination. If they don't, the scores cannot be compared, and being able to compare the scores is the only purpose of these tools... so I hope so (the article I read seemed ok, so I had no reason to suspect).

Does anybody know for sure?


They don't benchmark apps, but test on a set of standard workloads and algorithms.

https://www.geekbench.com/doc/geekbench5-cpu-workloads.pdf


I'm with you. Benchmarks should always be looked at with extra care. The environment of a tablet is quite different, and there aren't that many things being benchmarked. Not saying that those CPUs aren't great, but benchmarking them under the exact same conditions as an AMD or Intel CPU in a laptop, desktop or server might give results that are more contrasted.


Apple as a company is all about integration. It’s amazing how many shared parts exist between the iPhone, iPad, and yes, even the Mac to some extent with the T2 chip.

A larger, thermally unconstrained ARM octacore, with SMT? Maybe with 20 hours battery life (but realistically they’ll just make the device thinner lol).

I think it’s entirely predictable for Apple to go ARM on Mac.


There will always be a question of backward compatibility due to the nature of the Mac ecosystem that you don't have in iOS. Then you have the issue of getting developers to migrate any existing arch specific optimisations. This can be mitigated somewhat and Apple has done this a few times before, but it's still a massive undertaking.

It may be easier for them to play the CPU vendors against each other if there is real competition in the AMD64 space. There are rumours of Apple going ARM on Macs, so it may be that that path is set, but if it isn't, there is sense in holding off on the transition to ARM at this time.


Honestly, Apple has killed backward compatibility multiple times, first from PowerPC to x86-32 and x86-64, and now, killing x86-32 in Catalina.

Apple also has a good solution to this problem (the App Store solves it), and a big interest in locking developers into this solution...

So Apple doesn't need to restrict themselves to any hardware vendor when it comes to CPUs, and due to Metal, they don't have to do that either when it comes to GPUs.


I wonder how they'll get past the RISC/out of order execution issue that still is an issue with ARM chips.


Sorry, can you elaborate? Apple's "A" series CPUs have been out-of-order for years and at least a few years ago had a similar superscalar/ooo structure to Intel's Haswell chips. (Apple poached a bunch of Intel processor designers…)


And yet many of us will still buy the next MacBook Pro 13-inch refresh when it comes out with a "good keyboard" and a substandard processor. Why is that? For me - it's the total package. The software/the hardware/integrations that just work/the arguably best trackpad in the industry.

The CPU is one piece of the puzzle. I read on the message board that to unlock the top-tier AMD mobile processors you had to guarantee they wouldn't be paired with other crap hardware like vendors have done in the past. We will see - hopefully the next Microsoft refresh breaks this trend for AMD.


I'm not sure I follow what you're trying to communicate. It's true that if you only consider an Apple laptop, you will only consider an Apple laptop. Windows laptops compete against each other. Few people cross shop them. That being said, it seems that 75-80% of personal computers sold today are still running Windows, not OS X. As someone that has software that runs equally well on Windows and OS X, and who does not use a track pad, bang for buck on hardware performance and quality (and minimal OEM meddling) sells me my laptop.

It's also true that AMD has an uphill battle with convincing OEMs to make really great laptops using their APUs. But one would suspect that reading this article, you can tell that Asus has at least a foot in the door, as this is a review of a very high quality laptop - the whole package, not just the CPU.

And finally, it's of course also true that most Microsoft Surface products use Intel chips, with just one version of the Surface Laptop using an AMD chip. But they already began to buy into AMD hardware, so I would expect more along those lines this fall!


I was making an assertion that I feel a lot of people will buy Apple while it's a substandard product in some aspects, because overall it has the "package". When I refer to "many of us" I was talking about the startup/developer community, maybe not super enterprise-y shops like a Freddie or Fannie / BofA etc.

"Why is that?" was a question. I answered it for me, but I was wondering about others, who may buy a MacBook over notebooks with arguably better CPUs/configurations. Why are they doing it?

AMD has a great processor; what they need is a great platform. I hope the Microsoft partnership they have pushes them forward, because I'm tired of buying Intel MacBooks/notebooks.


I've owned/used lots of laptops (both my own & from work).

When I tried Windows laptops in recent times, I searched for the best ones in my size range - both times they were Dells (the last one a 2015 XPS 13) - and in both cases I paid close to Apple-level prices (the XPS was €2400 IIRC), and yet I had to have both repaired within about a year of buying. With the XPS it was just after the 1-year mark, which meant I had to pay several €100s for a screen replacement.

This never happened to me with a MacBook pro despite having used a lot more of these than I have PC laptops (which were supposed to be high-end laptops).

For what it's worth, I don't care about OS - my main computer is an iMac I dual-boot Mac & Win on via Boot Camp.


This is anecdotal. On the last MacBook I bought, the GPU stopped working and it black-screened after 2 years or so. It also ran hot due to Apple's weird indexers constantly using 100% CPU in the background, and it being a closed system meant you couldn't really figure out wtf was doing that or fix it.

However, my 2012-era ThinkPad is a complete tank, and even better, any issues are easily user-serviceable.


Yeah, I had a company-issued 2019 top-of-the-line XPS 15. It replaced a top-of-the-line HP from 5 years back. I would commute a lot and bring my MacBook to use in place of the Dell, which I would dual boot. I only take out the XPS to get on VPN :-/


Windows has plenty of developer community to keep the 80% market share customers happy with applications and games.


I personally have never seen a "Windows" developer do any development work on a Windows laptop (I see they have Visual Studio etc installed) - unless you're counting documentation/requirements. I have seen a lot of *nix devs and OSX developers develop on their laptops. I work as a consultant in the big enterprise world and with many startups on a range of projects.


I have seen plenty of them in the last 30 years, especially since 2006, doing consulting for Fortune 500 companies, where docking station/ThinkPad combos are pretty much standard.

A very famous Finnish telecommunications company, several well-known German companies in the pharmaceutical and life sciences domains, and a couple of car and package delivery companies.

Anecdotes.


Ahhhhh!!! You are correct about the docking station. I meant being mobile and doing work. Most definitely. I see more devs with docking stations than Apple/Linux combined. I've just never seen them use them undocked for dev though lol. Great point!


I may well end up buying it to replace my aging Macbook Pro, but it will be because the cost of switching to Windows is too high, and because there is proprietary software I need that won't run on Linux.

I am buying it because I have little choice, not because it represents good value or the "total package". Few ports, bad repairability, and an OS that won't run perfectly good 32-bit applications.


A good alternative to this is to run Linux and use Windows in a VM if necessary. Having to use a VM for something once in a while is mildly inconvenient, but it's not that bad, and on the other hand it's just bad enough to keep you on the lookout for ways to stop using it entirely.


> use Windows in a VM if necessary

That's my last vestige of Windows usage, and by the end of the coming weekend I'm hoping not to need that VM very often either.

My only other occasional need to use Windows tends to, ironically, revolve around the software needed to root some Android phones.


Interesting. I'm not stuck in the OSX ecosystem as much as you, but certain things do turn me off. I'm really in love with the trackpad and, for the most part, the overall build quality (my daily is still a MacBook Pro Retina late 2013). I would love to make the jump to Linux, but traditionally the battery life and the hardware options have not been things I've been in love with. :(


I have a feeling a lot of comments here are missing one thing which is huge for me: MacOS has a consistent UI across all applications.

Last year I decided to switch to a Thinkpad with Ubuntu, and while I can work on it, and it kind of has replacements for everything I am used to on Mac, not only is the quality of such replacements usually lower (good luck finding an equal alternative to iTerm2 or Alfred), but the shortcuts and UI elements also differ from application to application, and it bothers me a lot. I am certainly buying the upcoming MacBook, even though it's probably going to be based on an average (compared to this AMD), highly throttled Intel processor. Am I happy about it? No! But for my usage, it's the best trade-off as of now.


>MacOS has a consistent UI across all applications

Let's see my currently running apps:

Chrome - no

Firefox - no

Slack - no

intellij IDEs - no

VSCode - no

Spotify - maybe, kinda, sort of?

VLC - no

WhatsApp - no

Telegram - no

Steam - no

What is this consistent UI you speak of? Most of these apps look the same on Windows.

The only reason I own a MacBook is because I wanted a single device and I occasionally need to do some OSX/iOS work - I would be happier with a Windows laptop for sure.


The problem with Windows 10 is that it doesn't have a consistent UI even across applications from MS bundled with the system. Heck, there are even 3 or 4 different styles of context menu. They have been trying to roll out Fluent Design for what, 3 years now? The new Control Panel is also kinda laughable functionality-wise. And I write this as a W10 user.


Like Apple when it was transitioning from Aqua into the more metallic look, followed by skeuomorphism, and now everything should be dark?


How about just the following:

Cmd + q, Cmd + `, Cmd + p, Cmd + shift + 4

If I could easily do these things on Windows in every application, it would be a great start. I have to use Mac and Windows for work and I still think that Mac gets in your way less than Windows does.


Alt+F4; install Greenshot for screenshots - it's better than the OSX screen grab - don't know what the middle two do.


Just to make sure I understand you: I need to install something to have great UX on Windows? You see how quickly we're getting there - the default UX on Windows is crap, but you can install x, y, z. This is exactly why I use macOS: because I am tired of tracking down what exactly provides those things I (and many other people) need on a daily basis on Windows.


So where is the calendar widget in OSX that allows me to see my events when I click on it? Why do I have to install Itsycal when Windows has it built into the OS date/time widget, which is superior to OSX's?

It's a ridiculous argument, and I don't see why screen capture should be a part of the OS other than that you expect it from OSX.

Besides, the UX of OSX screen capture is inferior to Greenshot's - it's much easier to pick the capture target and there are way more options - and I don't want to spam my desktop with screenshots.


It might be the total package, or it might be inertia. While Apple had a head start on polished hardware, all of the producers are catching up in some ways.

If you can do your work on another OS (and, generally, you can, but habits are strong and you are just so used to your favourite brand) then another laptop doesn't seem so far-fetched.

I think this is getting closer to the truth. We are creatures of habit and staying with Apple feels cozy.


> I read on the message board that to unlock the top-tier AMD mobile processors you had to guarantee they wouldn't be paired with other crap hardware like vendors have done in the past.

Crap HW (utterly terrible keyboards) doesn't seem to have stalled Apple's MacBook sales though.


I think what makes this inherently harder with notebooks is that everything is integrated. It's not like you can decide to upgrade the keyboard or screen if you don't like the one it comes with. As a result, you're constrained by the whims of the manufacturer.

Here's hoping somebody builds a laptop with one of these CPUs that has good battery life, is quiet under light load, has upgradeable memory/storage and comes with a good 16:10 screen.


Looking at the benchmarks it's amazing how despite not having significantly more advanced gameplay compared to Civ 2, Civ 6 still manages to take up as many resources as a modern FPS.


Civilization uses a single-threaded, interpreted language for all of the AI and game logic during a turn. It's slow as molasses and hasn't benefited from modern CPU advancements.

For comparison, games like Factorio or Cities Skylines can faithfully simulate hundreds of thousands of entities in real time, but they use C++ and are multi-threaded.
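
To make the contrast concrete, here's a toy sketch in Python (the `think` function is a made-up stand-in for one unit's AI; it is not how either game actually works) showing serial turn resolution vs fanning the same work out across cores:

    import time
    from multiprocessing import Pool

    def think(unit_id):
        # Stand-in for one unit's AI: a small CPU-bound computation.
        return sum(i * i for i in range(200_000)) + unit_id

    if __name__ == "__main__":
        units = list(range(200))

        t0 = time.time()
        serial = [think(u) for u in units]  # one core, like Civ's turn logic
        t1 = time.time()
        with Pool() as pool:                # all cores, like an engine built for it
            parallel = pool.map(think, units)
        t2 = time.time()

        print(f"serial:   {t1 - t0:.2f}s")
        print(f"parallel: {t2 - t1:.2f}s")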


Video game FPSes need the GPU for graphics, not gameplay.


Civ 6 is still a 2D (gameplay-wise) tile-based strategy game with a cartoon art style. You can't even rotate the camera; there are plenty of optimizations they could've made graphics-wise.


I think the complaint is that people would like to play Civ 6 on their low-powered laptops. It's perfectly okay for the laptop to be awful at reflex/skill-based games, because those are expected to require even more processing power.


Well, what do we know. There may be some game AI computations done on the GPU as well.


The good news is you can still play Civ 2. And turns don't take 30s to resolve.


What level of machine hardware will OEMs use to market these CPUs? Typically the Intel lines brought the best overall components and AMD machines were...lacking.

Will they have similar machines, but with the choice of AMD or Intel (XPS 15 / X1 Extreme)? Microsoft went in this direction with the Surface Laptop 3 by marketing the AMD line to their business customers. This will be expensive to maintain, but may allow OEMs to breathe a little easier not being reliant on Intel for their premium lines.


> What level of machine hardware will OEMs use to market these CPUs? Typically the Intel lines brought the best overall components and AMD machines were...lacking.

Lenovo is starting to have some parallel Intel vs AMD product lines. For instance you have the Thinkpad E490 (Intel) vs E495 (AMD) and E590 (Intel) vs E595 (AMD).

Not sure about other vendors, but for those Thinkpad-models, AMD seems to come out quite favourably.


Not quite. One thing Lenovo still hasn’t done is pair a high res display with AMD. Nor do they support Thunderbolt.


Hopefully they release the T15/T14 soon. Slated for the second quarter of 2020.


I wish they did the X1 with AMD.


I'm actually considering the opposite: replacing my (2nd gen) X1 with another Thinkpad, primarily an AMD-based one.

When doing that, my primary concern is going to be serviceability and upgradability. The lack of options when it comes to upgrading my X1 has been somewhat disappointing.

For my laptop needs, a top-specced E495 looks reasonably doable. Small enough form factor, powerful enough for light dev work, and upgradable to handle future needs - and at such a low price point that I'm willing to take a chance, even though it might not be perfect.

Of course, If I hold out a little, these 4000-series mobile chips will start shipping, and I might go for those instead :)


I asked a friend in IT about that, and that's the plan.


I just want an AMD "gaming laptop" with an OLED 4K display :( Ideally 3840x2400.


I'd rather have 1440p since I can't normally see the difference. In exchange for less density, can I get a wider color gamut like DCI-P3 or AdobeRGB?


Even on a gaming laptop I'm usually playing at half-resolution, so I like 2160p or 2400p because for gaming that would be 1080 & 1200. Normally I'm running a text-dense setup where >1440p is very nice. :-)


1440p is good enough for sharp text, but you need to use fractional scaling to get reasonable UI sizes. Better to get 4K and use integer scaling.


On a 1440p screen at 14 inch you end up at 210 dpi which is 2x integer scaling.

Everyone puts 1080p or 4K panels in at 14 inch, and both require fractional scaling. It makes no sense why so few vendors opt for 1440p.


You aren't making any sense. 4K on 14 inches is 314 dpi. If you want 105 dpi equivalent that's almost exactly 3x.

Personally I find that makes things too large, but in no way does 1440p have an advantage in terms of fractional vs integer scaling.
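
The arithmetic behind both claims, for anyone who wants to check (a quick sketch; the 105 dpi baseline is the 1x figure assumed in the comment above):

    import math

    def dpi(w_px, h_px, diagonal_in):
        return math.hypot(w_px, h_px) / diagonal_in

    BASELINE = 105  # assumed comfortable 1x density, per the comment above

    for name, w, h in [("1080p", 1920, 1080),
                       ("1440p", 2560, 1440),
                       ("4K",    3840, 2160)]:
        d = dpi(w, h, 14)
        print(f'{name}: {d:.0f} dpi at 14", {d / BASELINE:.2f}x scale')
    # 1080p: 157 dpi, 1.50x -> fractional
    # 1440p: 210 dpi, 2.00x -> integer
    # 4K:    315 dpi, 3.00x -> integer (just a larger-looking UI, as noted)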


"The ASUS Zephyrus G14 as tested is set for $1449. There is a 4K version with Ryzen 7 for £1600."


I see a 1440p display :( I have this currently.


You sure? I have the XPS 15 7590 with the 4K OLED. You can snap your fingers and watch the battery drain... Sure it looks crisp and all, but it comes at a heavy price.


Does anyone own both a T490 and a T495? How do they compare? And how's AMD with Linux, i.e. Ubuntu KDE variants? I own a T490 running Kubuntu.


At this point it's probably better to wait for the T14.


For anyone else wondering like I was... The T14 will be the replacement for the T490 and T495. Likewise there will be a T14s to replace the T490s and T495s and a T15 to replace the T590. They will no longer distinguish between Intel and AMD in the model name.

https://www.notebookcheck.net/Lenovo-ThinkPad-T14-T14s-T15-w...


The T14 Intel is better than the T14 AMD. Better display and Thunderbolt.


They both have 1x Thunderbolt 3, but yes, Intel allows for the 4K panel upgrade.



There's a Thinkpad E495 (previous-gen Ryzen/Vega) review on notebookcheck. The "E" series (fka "Edge") is a budget notebook though; for example, it uses slower memory and doesn't have keyboard lighting/fingerprint sensors. I can confirm an E495 works perfectly with Ubuntu 19.10 OOTB; not sure the newer Ryzen is already fully supported by Ubuntu 20.04 LTS coming out these days.


Check out notebookcheck.net for reviews; they are quite thorough.


There is a T495 review on notebookcheck, but it's based on previous gen CPU and it lacks Linux coverage. Possibly the Ryzen 4000 model will be called T14/T14s instead of T495? No review of those either though.


I see lots of these business notebooks come with 8GB of RAM. Is it really enough for most users? I assumed 16GB should be the standard by now, especially with how memory hungry day-to-day software is.


If you do some text processing, some email and general office software, 8GB is absolutely enough. Beyond that, it's not anymore. 16GB is where most people with more serious applications are happy. The cool kids have 32GB. Anybody who needs even more knows it.


I hope that AMD having decent mobile performance in some of this year's laptops will make OEMs really embrace the platform in the next cycle of laptops. While they are catching up with performance, cooling solutions, sleek design (MS Surface, some Lenovos, etc)... there's nothing like an AMD-flavored XPS 13 I could pick up now, and no 4K option really. That would be great competition for Intel and potentially a better option for ultrabooks that are so reliant on integrated graphics.


We've yet to see a proper business notebook with those Ryzen H-series APUs. So far, the manufacturers have been eager to put them into gaming machines with a discrete GPU on board.


Depending on use case, I would expect the Ryzen U series to end up on thin and light business notebooks. 8 cores, 15W TDP, expecting fantastic battery life. But I'm still wondering when one of these OEMs will tackle Thunderbolt ports. From what I've read it's not as easy with AMD, but certainly possible, and up to the OEM's discretion.


Stumbled across this tonight... this is Lenovo marketing so take it with a grain of salt, but given what we've seen so far, I wouldn't be surprised.

> 14 hours of battery

> WiFi 6

> 3.1 lbs

> Under 15 mm / 0.6" thick

https://www.lenovo.com/us/en/coming-soon/IdeaPad-Slim-7-14AR...


The Lenovo Yoga Slim 7-14ARE has the 4800U configured as a 25W part with a battery life around 10 hours.

You will find additional details in this reddit post: https://www.reddit.com/r/Amd/comments/fhq98i/renoir_r7_4800u...


Aaaaand for HDMI, they just say "HDMI".

8K TVs are coming. You can get one for $2500 right now:

https://www.bestbuy.com/site/samsung-55-class-led-q900-serie...

Can we PLEASE get HDMI 2.1? Laptops hardly offer HDMI 2.0.


Eyes don't have the visual acuity for 8K TVs at normal sizes (sub-100in) to make any sense. On top of that, there is no content, and given how little 4K content there is, it will be years before there is a meaningful amount.

Regular cinema is 2k or 4k. There is very little reason to move beyond 4k, especially in the living room. HDR on the other hand has plenty of room for improvement.


That's ridiculous, why do people like high DPI screens then?

8k on a 55" TV would have lots of real estate and lots of DPI for very nice fonts.


You don't put your eyes 30 cm from your TV. It's about angular distance between pixels. At 3 meters, a normal viewing distance, normal eyes cannot see the difference between 4K and 8K on a 55 inch TV.
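
A quick back-of-the-envelope check (a sketch; assumes a 16:9 panel and the usual ~1 arcminute figure for 20/20 acuity):

    import math

    def pixel_arcmin(diagonal_in, width_px, distance_m):
        # Physical pixel pitch of a 16:9 panel, then its angular size.
        width_in = diagonal_in * 16 / math.hypot(16, 9)
        pitch_m = (width_in / width_px) * 0.0254
        return math.degrees(math.atan2(pitch_m, distance_m)) * 60

    # 55" panel viewed from 3 m; ~1 arcmin is normal 20/20 acuity.
    print(f"4K: {pixel_arcmin(55, 3840, 3.0):.2f} arcmin/pixel")
    print(f"8K: {pixel_arcmin(55, 7680, 3.0):.2f} arcmin/pixel")
    # 4K already comes in around 0.36 arcmin/pixel - below what the eye
    # resolves at this distance - so 8K's ~0.18 arcmin buys nothing.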


Is there even content which is 8K? It seems to me we're barely starting to have 4K content ATM.


I’m waiting for 16k tv so I can finally tile 4 4K streams on it.


People said this about 4k. Between gaming and having a huge monitor for work, I do not care about movies being in 8k.

8k, like 4k, is for gaming.


When the USB4 standard comes into play, which will be in 1-2 years' time.

USB4 contains all of Thunderbolt 3's features, even if it is not 'certified' to be called Thunderbolt 3.


An external GPU, in the case we need it, would require Thunderbolt?!


Even more baffling is that NONE of these new laptops are 13in with high pixel density displays.


Maybe the market just isn't asking for them?

High-density displays have a cost in terms of the increased battery draw to drive the display, not to mention the associated GPU.

Why take the cost, if nobody is really asking for it?

Anecdote: I may consider a "downgrade" from my current 2K laptop display to a standard 1080p display in my next laptop, because at 14" that really is enough. I’m certainly not going to limit myself to 2K+ devices only.


I have a 14" with a 16:9 1920 screen and a 13" with a 3:2 2160 screen; you can barely notice the difference, and with Linux fractional scaling being a p.i.t.a., I prefer the 14" 1080 screen. The 13" I cannot run at 100%; it's just too small at 2160x1440. I also have a MacBook 15.6" that I run at 1920 - that is perfect for me. 4K makes more sense to me on a 27" desktop monitor, but on a laptop all those pixels are lost.


> with linux and fractional scaling being p.i.t.a

Off topic, but when using Wayland-based window-managers like Sway, fractional scaling is really well supported and works flawlessly.

It can even work on a per-display basis in a mixed multi-monitor setup. In fact, I find it superior to Windows in that regard.


Only if all your apps are running in Wayland mode. If using X apps over XWayland, the scaling is quite blurry.

That being said, I'm using Sway every day, but I just tweak my font size accordingly.


But how many native Wayland apps do you run? Most of those that I run (even Chrome) did not have native Wayland support and were run through X, so they are blurry.


I have a 4K 14-inch and it's the perfect size at 2x scaling. Text looks much better than at 1080.


LCDs don't work great at non-native resolutions; even if it's perfect 2x scaling it's still a little bit blurry. So if you did that (set the display to 1080) on the same display, I'm not sure that's fair - at least that's what I noticed in my experimentation, and I had/have Dell, Apple, Huawei and Asus laptops with HiDPI. For me, the difference between high-enough and 4K on a small laptop display is not that noticeable/important, and it comes with drawbacks (at least on Linux for now).


2x scaling is not setting 1080p, it's telling the software to draw at 2x scale. That's sharp and very well supported. True fractional scaling is not yet supported in most Linux software but I do 1.3x scaling for 1440p just by setting font scaling. Has worked fine for many years. Going to 4K from 1440p makes things easier on the software side as 2x is very well supported.


What I meant is that setting an LCD whose native resolution is more than 1080 to a 1080 resolution, and comparing it to a native 1080 display, would usually not give the same results, as running a display at a lower resolution (not scaling) would usually give you blur.

Also, fractional scaling works OK-ish in Linux, but it's actually "fake" in that it uses xrandr to create a 2x resolution and shows you a scaled version of that. And it uses more resources, especially with external monitors with different DPIs.


No one ever runs displays non-natively to get any of the scaling being discussed here.

Fractional scaling works like that everywhere that I've seen. OSX does the same, and I'm not sure about Windows. The alternative would be properly scalable GUI toolkits which then creates other difficulties in mixed-DPI situations. I don't use that though. Font scaling is more than enough to get 1.3x scaling. Has been for years, and doesn't take the efficiency hit. It doesn't solve the mixed-DPI situation either though.


I am comparing with all other displays at about 150dpi (1080 on 14 in). Non-bitmap fonts are either jagged or blurry if you use hacks like hinting and subpixel antialiasing.


I think LCDs look great at 2x scaling, assuming the content is actually rendered at a higher resolution.


That is true, but there is a point of diminishing returns, where at a small enough laptop screen the increase in resolution gives you only small improvements, and at least for now it gives you some downsides besides battery life.


Would love to get one of these in a form factor like the NUC.


Do I have good news for you: https://www.anandtech.com/show/15103/simplynuc-unveils-sequo...

(now we need Renoir based NUCs)


Any reason these wouldn't work for desktop use? The website seems very kiosk/signage oriented.


Electronic signage is one of the most popular use cases for PCs with such a small form factor. Those customers are willing to pay more of a premium than the typical home desktop user who simply wants a smaller footprint. But right now, AMD's focus with these chips is almost exclusively on getting them into laptops, not SFF PCs: https://www.youtube.com/watch?v=ACBDe1obhkM


This is more to TSMC's credit. While Zen 2 certainly is a good microarchitecture, it's not much better than Comet Lake. If Intel had a working 7nm process, AMD wouldn't have taken the lead.


Sadly, on most Ryzen 4000 notebooks with a U processor so far, all I see is soldered single-channel RAM maxing out at 16GB...

I would kill for a refresh of the E595 but it's never going to happen.


Anyone tried running Fedora on this machine yet? (perhaps the beta of Fedora 32?)


The comment section on there is something else. Jesus


Maybe some of Intel's $3B "meet comp" budget went to paid trolls?


Unfortunately, most if not all of those accounts are all too genuine—often users who have been on the site for well over a decade. PC hardware attracts almost as much vapid tribalism as politics and pro sports.


> PC hardware attracts almost as much vapid tribalism as politics and pro sports.

This seems to happen a lot where a lot of money is on the line. How much of this is long-term astroturf? Just because an account has been around for a long time doesn't mean it can't be an "influencer".


Years ago, I managed one of the larger performance computer hardware forums on the web (it's still around, albeit having changed owners a time or two since I left).

We knew pretty well that most of the audience was kids, and knew most of the marketing people from the major chipmakers. The brands were just starting to wake up to the idea of community engagement at the end of the tech magazine era / start of the Youtuber era, and as best I can tell there were no signs of any of today's astroturfing behavior - yet all of the tribalism was there.

It's certainly more influenced now, but even back in the day everyone wanted to justify their own purchases by insisting it was the best possible thing ever.


> everyone wanted to justify their own purchases by insisting it was the best possible thing ever.

This post-purchase rationalization is a frighteningly powerful cognitive bias. Users can get quite hostile even if you simply suggest that they should have bought a lower-end model from the same company, and will readily lie to themselves about their usage patterns to justify their purchase of a device that is only superior in a few irrelevant microbenchmarks. And that's before anything like brand loyalty gets thrown into the mix.


It's sad; I'd expect more from people that presumably pick parts and build PCs based on benchmarks.

That level of fanboyism should be reserved for teenagers and the Xbox vs PlayStation debates about their gun shooter du jour that looks indistinguishable on both without a magnifying glass.


Just using 8k on the desktop would be nice


Really interesting times ahead! I'll be excited to pick one up to run WSL2.


The video game benchmarks really show that it's still not powerful enough to play video games less than 5 years old in good conditions. Old games are fine, as always.


Just to clarify for anyone who at first misunderstood the comment as I did: the integrated graphics are not so powerful, not the CPU itself. So you'll still want a dGPU in your laptop if you plan to game.


It's a really misleading comment (if I'm being charitable), considering that the review points out, quite clearly, that the laptop being reviewed has a dedicated GPU, which is pretty powerful, though arguably maybe 30% slower than the full-powered desktop RTX 2060 (at a third the power consumption).


The test that is linked is about the integrated GPU though.


Which is pretty much a given with integrated graphics. The Vega series iGPUs are worlds faster than Intel's, even going back to the 3000 series laptops.


I assume you mean that AMD's integrated graphics still do not compete with discrete GPUs.


Which isn't surprising - integrated graphics are never going to beat discrete GPUs with their own power delivery, cooling and dedicated high speed memory.


This is not intrinsically true. Cooling and power delivery certainly have it beat, but it seems feasible to compete on memory bandwidth. I bet a GPU uses roughly the same order of power/cooling as, say, half a Threadripper does?


No - how do you beat the wide-bus memory bandwidth of a discrete GPU with standard DRAM bandwidth? Even if your processor was not competing with the GPU for bandwidth, you'd still have a small fraction of the memory bandwidth available.


You'd have to come up with a new architecture to make this work, but perhaps you could have CPU/GPU on the same chip with dedicated GDDR memory lanes.

PS4/XBOne have an APU with powerful (ish) graphics - how does their memory work?


They work with graphics memory rather than regular memory if my memory serves. Hehe.


Aw, I was thinking of bandwidth between the CPU and GPU directly, my bad.


Aren't console GPU chips on the same die as the CPU?


Might be more accurate to say that consoles put the CPU on the same die as the GPU. The GPU portion of the console SoC die is bigger than the CPU portion and the memory controller is very much a GPU memory controller not a CPU memory controller.


HW-wise the Vega is better, but the SW (OpenCL, compared to CUDA, at least on Linux) sucks. It rarely works, and you get crashes at runtime, not at compile time.


Yes, sorry for not being explicit enough. It's still way too slow compared to a dedicated GPU.


This seems like an unreasonably high bar to set.



