RPCS3 – Open Source Playstation 3 Emulator (rpcs3.net)
312 points by notmysql_ on July 12, 2023 | 157 comments



Slightly unrelated but emulators have been a big part of my childhood because I grew up in a country where a PlayStation was unaffordable for the middle class and Nintendo simply didn't sell their consoles there.

Luckily emulation has always been legal thanks to a case 2 decades ago [1]. I see it as one of the "big wins" of the (US) judiciary because it has made the lives of many children joyous, across the globe.

Plenty of old games on now obsolete platforms can still be enjoyed thanks to emulation. I hope the precedent never goes away.

[1] https://archive.nytimes.com/www.nytimes.com/library/tech/00/...


Doesn't that case only apply in the USA?


Who cares? This person is telling a story about their lack of access to cool new tech, and what projects like this one have provided them.

Split hairs all you want but giving less privileged people access to funsies that eventually turn to deeper technical interest and (I assume) a higher earning career is the overarching message and I’m all the way about that. The weird dissection of an uplifting & cool comment like OPs is classic HN & definitely unnecessary.

Edit: And their reply resonates with me. Not fronting like I couldn’t have those things, but I dug deep into retro gaming (via emulation) in my early teens and my career can be traced back to that original interest & deep diving.


I just fail to see how a court decision in the US would have any effect on another country. It's not like OP's situation and access to emulators would have been any different if the decision went Sony's way.


> It's not like OP's situation and access to emulators would have been any different if the decision went Sony's way.

I would respectfully disagree - although it's hard to speculate on alternative histories.

The United States has an outsized impact on the software industry - out of proportion even to its status as the largest economy in the world (American big tech dominates the list of largest companies by market cap).

Preventing or impeding the development of emulation software in the US would definitely impact the rest of the world, simply because fewer people and companies could legally contribute to open-source emulation software.


People and companies in the US have contributed a whole lot of hours to emulator development, such that it's plausible emulation would not be nearly as good as it is if such development were illegal in the US. Moreover, less effort may have been put into emulator development or emulation-based-product development by foreign companies, if they were legally unable to sell their products in the US, which is a large and rich market. Further, US copyright policy has a tendency to influence international copyright regimes, over the longer term.


The actual mechanism seems pretty straightforward (the US is a big open country, if things can happen here somebody will do it and it’ll spread across the world without many speedbumps).

Their phrasing was a bit awkward though, when they say “Luckily emulation has always been legal…” one would assume they are talking about the jurisdiction they live in, by default at least.


So straightforward that it makes you realize how comically bad some people here in the comment section are at understanding even the broad-strokes implications of their own and others' actions.


You legitimately can’t comprehend how a court ruling in the US would affect anything or anyone outside the US? For real?


Well - it depends on the topic, really, and not just on the fact that a law exists in the US. A judgement in the US has no bearing on law in other countries, except where it is used to set a precedent, as others have said.

US law is actually different enough from some other countries' that what is legal in the US is outright illegal elsewhere, and vice versa. If you want some low-hanging fruit - anything to do with guns, gender identity, abortion, libel/slander and "free" speech - many countries disagree and actively oppose the US standpoint, from both sides of the US position. Another good example: anything granted in the US constitution is not a "God-given right" outside of the US (which probably intersects with guns and free speech, maybe other things). The US passing a law is not going to change that.

DMCA does not affect other countries - we have our own laws - some of which were put in place in line with the ideas that caused the DMCA, but some of which predate it. Also - "fair use" is a wholly US concept, and might not apply, depending on where you are located.


You just gotta look at DMCA or Copyright


Lol, I've got very bad news for you about where copyright came from.


Haha, didn't see that one coming ofc, stand corrected https://en.wikipedia.org/wiki/History_of_copyright


> I just fail to see how a court decision in the US would have any effect on another country.

By way of return, I fail to see how you fail to see precisely that.


At a guess, it at least doesn’t discourage developers from working on the emulators in the first place.


As demonstrated by copyright, US legislation generally ends up defining global norms when it comes to tech and entertainment.


Not really. US copyright law is based around protecting the rights to use a "work", but European copyright law tends to protect the originator. It might seem similar, but in practice it is not.

"Fair use", as an example, does not exist in the UK in the way it is defined in the US. We have "fair dealing" and it is a lot more specific. There are defined use-case exceptions to copyright; it is not "do what you like, claim 'fair use' and take it down when you get a DMCA notice".


Not really.

The general standard length of protection (life plus 70 years) is from Germany for example.


The US is a significant market and emulators being allowed there helps emulators in general?


If the USA doesn't care about emulation, those specific copyright tentacles won't poke my government, hence my country (Mexico) also won't care. Even if the USA cares, sometimes my country still won't care if it means spending a lot of money on policing the Internet.

Sometimes, the love-hate relationship between Mexico and USA has its advantages.


International copyright law is somewhat harmonized through international treaties. The Digital Millennium Copyright Act (DMCA) actually implements obligations the US took on when it joined the World Intellectual Property Organization's copyright treaties.

Copyright provisions regularly make their way into multilateral trade treaties (such as the Trans-Pacific Partnership).


It may be a US law but the result is greater availability of emulated tech. As a regular user, the legality of the emulator is secondary to the likelihood of the emulator existing.


Yes, but it doesn't need to apply anywhere else for people everywhere to be able to access the software. It only needs one place where it can be openly worked on.


Seconded. My PC hardware was never very powerful either, but I had access to the entire library of (S)NES/GB(A) games to play on my mum's phone as a child.


The fact they managed to achieve this astounds me.

I grew up with the PS3 as a teenager, and I just remember being told how complex the architecture was, as well as just how powerful the system was. Keep in mind, the US Air Force connected 1,700+ of these together for supercomputing purposes (the "Condor Cluster").

...and now a large part of it has been emulated to a high degree, and it kind of blows my mind. It's crazy to think that the PS5 will likely be emulated on a computer within the next 20 years; perhaps more easily, given its simpler architecture.


The PS5 is already emulated on computers today to some degree. Kyty doesn’t run any commercial PS5 games but it can run some homebrew PS5 apps.


The PS5 is basically a regular old desktop computer. The PS3 has a quite unique multi-"cell" architecture.


I can really recommend this tech talk regarding PS4 and PS5 hardware by their lead hardware architect https://www.youtube.com/watch?v=ph8LyNIT9sg

some really interesting topics related to what kind of performance they wanted as a baseline and how to optimize to make the PS5 do what it currently does... like no other platform.

a $700 console that:

- outputs 4k / HDR (upscaled from lower native res. ofc)

- renders games steadily at 30 / 60fps

- no hitches, framedrops

- always records your gameplay

- live video sharing of stream with PS friends

- live streaming to youtube

- updates of games being installed

- downloads of games/data

all simultaneous, with instant game switches and quick loading times. The experience cannot be replicated with a $1k gaming PC, not even a $2000 machine. They really delivered a device which is, imho, more than just "an AMD gaming PC with a custom GUI and some DRM".

Would have loved to have wide-scale keyboard/mouse support, as playing Far Cry with a controller is frustrating at best. And keyboard/mouse support is in the hands of the game devs, whether they want to support it or not.


Performance is vastly helped just by not having a fragmented ecosystem or a moving target. If I know all my customers are running my software on the exact same hardware, I can optimise the hell out of my software, and also I can see exactly how it will perform for the end user and polish the worst bits until either it's shiny enough for me or I run out of budget.

If every single user has a different setup with components chosen from a vast array of possibilities that can all do different sets of things at different speeds... well, I can get it to work on my machine, and I can try and guess what to degrade when things get bad, but ultimately I just have to throw it over the wall and hope it's not too terrible in the wild. It's impressive, really, that PC desktop games work as well as they do.


that's the case for every console ever made after the 90s; here the fact that gaming desktop PCs and consoles share similar hardware makes it, imho, all the more interesting to find use cases that deliver a better out-of-the-box experience, and have it cost a fraction of a gaming PC budget.


- no hitches, framedrops

This statement alone is pure bullshit. So the rest should be taken with a pound of salt.


While that one line is incorrect, it’s a fallacy to use that to try and sow the seeds of doubt in the other points. People make mistakes in wording all the time and it’s a low debate tactic to do what you did.

The fact is some frame hitches are gone but obviously not all.

The ones that will primarily be gone or largely mitigated are

- shader compilation hitches, because console versions of games can ship with precompiled shaders.

- data transfer hitches because the consoles have shared memory and dedicated compression blocks to optimize transfer

- system resource scheduling contention because the OS and other processes won’t start interfering with the game process since they use dedicated resource allocation.
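To make the shader-compilation point concrete, here is a toy sketch (the costs and shader names are made up, nothing here is PS5-specific): a shipped precompiled cache turns every lookup into a cheap hit, while JIT compilation pays a big one-off cost the first time each shader is seen.

```python
# Toy model of first-use (JIT) shader compilation vs. shipping a
# precompiled cache. Costs are made-up integers, just to show where
# the one-off spikes land.

COMPILE_COST_MS = 80  # hypothetical first-use compile (the "hitch")
LOOKUP_COST_MS = 1    # hypothetical cache hit

def frame_times(shaders_per_frame, cache):
    """Per-frame shader cost; misses are compiled into the cache as we go."""
    times = []
    for frame in shaders_per_frame:
        cost = 0
        for shader in frame:
            if shader in cache:
                cost += LOOKUP_COST_MS
            else:
                cost += COMPILE_COST_MS  # the JIT stall a console avoids
                cache.add(shader)
        times.append(cost)
    return times

frames = [{"sky", "terrain"}, {"terrain", "water"}, {"water"}]

jit = frame_times(frames, cache=set())                        # cold cache
pre = frame_times(frames, cache={"sky", "terrain", "water"})  # shipped precompiled

print(jit)  # [160, 81, 1] -> spikes whenever a shader is first seen
print(pre)  # [2, 2, 1]    -> flat
```

The spike pattern is why the stutter only shows up on the first encounter of an effect, and why shipping the compiled cache with the game removes it entirely.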


thanks, on an internet forum, you'd expect some leeway, but alas, it seems it's worse than a technical paper ;)

the OS and hardware working hand in hand to help overcome some of the causes for hitches and frame drops is what sets the consoles apart from the DIY PC builds where you simply don't have access to the custom design;

and it's also imho where in this generation Sony has pulled ahead of Microsoft even if both are using similar hardware


> the DIY PC builds where you simply don't have access to the custom design;

On Linux (or Steam Deck) you can precompile shaders for your specific hardware like Switch or PS5. There is nothing about "DIY PC builds" that prevent you from building an experience like this.


The Steam Deck is a single known entity of hardware. For all intents and purposes, it can be treated like a console in that regard.

But DIY PC builds, that’s a wide range of hardware to support. And it’s not just hardware, it’s driver versions, OS versions and firmware versions.

So it’s possible to do what Valve does where the first playthrough caches the shader compilations and then stores them by a configuration hash, so subsequent users get it. But the sheer number of hardware and software permutations makes it significantly harder.

It has nothing to do with Linux either.

The shaders are therefore not precompiled in the same way they are for consoles. It just means that the second playthrough of a section is a shared experience taking advantage of the first user's resources.

If a game hasn’t been played first, or you encounter an area of the game that hasn’t been encountered before you, or you’re on a slightly different hardware/software combination than the previous shader cache, you’ll hit the stuttering again.
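The configuration-hash idea can be sketched like this (a hypothetical model, not Valve's actual code): the cache key covers GPU, driver and OS versions, so on a console-like fixed configuration the second player always hits warm, while every PC hardware/software permutation starts cold.

```python
# Toy sketch of a shared shader cache keyed by a hardware/software
# configuration hash. Field names and configs are hypothetical.
import hashlib

def config_hash(gpu, driver, os_ver):
    blob = f"{gpu}|{driver}|{os_ver}".encode()
    return hashlib.sha256(blob).hexdigest()[:12]

shared_cache = {}  # config hash -> set of compiled pipelines

def play(user_config, pipelines_hit):
    """Return how many pipelines stutter (miss) for this playthrough."""
    key = config_hash(*user_config)
    cached = shared_cache.setdefault(key, set())
    misses = pipelines_hit - cached  # these compile at runtime -> hitches
    cached |= pipelines_hit          # uploaded for the next matching user
    return len(misses)

console_like = ("RDNA2", "fw1.0", "os1.0")  # one fixed config for everyone
print(play(console_like, {"p1", "p2"}))  # 2 misses: somebody always goes first
print(play(console_like, {"p1", "p2"}))  # 0: second player inherits the cache

# A slightly different driver is a different key -> cold again
print(play(("RTX4070", "536.23", "win11"), {"p1", "p2"}))  # 2 misses
```

Every extra hardware/driver/OS permutation is another population of players who start with an empty cache, which is the "significantly harder" part for DIY PCs.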


That's not how Valve caches shaders on Steam. They accommodate those DIY builds by compiling them on-machine with Fossilize, converting them to system-optimized files. For DirectX titles like Elden Ring, this effectively eliminates all shader compilation stutter in-game. It also doesn't rely on fancy "first playthrough" setups, since it's translating and optimizing the original shaders wholesale.

> It has nothing to do with Linux either.

It's an out-of-box feature with Steam on Linux. You can run all of this stuff on Windows too, but you'd have to build it from source and configure DXVK environments for each game by-hand. On Linux it all happens automatically.


It’s a factor of Steam not Linux. They could in theory do it for other platforms too.

Fossilize does require at least one playthrough because shader permutations can be generated at runtime. There’s no static shader setup that’s common to all games. It just means that the first playthrough doesn’t have to be the same person playing it right now


I believe fossilize snapshots the entire pipeline configurations. It can then replay that and generate final hardware-specific binaries, not just SPIR-V, for the cache completely ahead of time.

That's much better because it doesn't matter what hardware the first person used, the data can be used everywhere.
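A hypothetical sketch of that record/replay split (names are made up, not Fossilize's real API): the recording step captures portable pipeline descriptions, so a log recorded on anyone's machine can be replayed through the local compiler ahead of time.

```python
# Toy record/replay sketch in the spirit of what Fossilize does
# (all function and pipeline names here are hypothetical).

def record_session(pipelines_encountered, log):
    """During anyone's playthrough: append portable descriptions
    (think SPIR-V plus pipeline state), nothing tied to their GPU."""
    log.extend(pipelines_encountered)

def replay(log, compile_for_local_gpu):
    """Before the game runs here: build final binaries for THIS machine."""
    return {desc: compile_for_local_gpu(desc) for desc in log}

log = []
record_session(["pipeline_a", "pipeline_b"], log)  # recorded on some other box

# Replayed locally: the log is portable, the output binaries are local.
binaries = replay(log, lambda desc: f"local_binary({desc})")
print(binaries["pipeline_a"])  # local_binary(pipeline_a)
```

That separation is the point being made: the recorded data is hardware-independent, and only the replay step bakes it down for the specific GPU and driver in front of you.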


True, the replay aspect does help as long as nothing invalidates the pipeline, which is still a higher possibility on PCs than consoles


[flagged]


Pointing out a problem that many bad PC games have does not magically give the PS5 extra performance and better frame rates.


It does when you are misinformed believing that the only cause for frame-rate drop is JIT shader compilation.


Never said that shader compilation is the only cause, but I'm not wasting effort replying to somebody calling BS without bothering to RTFM, so ya. Nothing of value lost.


I'm not saying that the PS5 isn't a good performer for its cost. It is also clearly cost-optimized to do exactly those features without any resources wasted on extra hardware. But at the end of the day even if an equivalent desktop computer cost 5x as much the hardware and hardware architecture look identical.

Sure, this will make emulation hard right now because you don't have the huge compute advantage that you do when emulating a PS3 on modern hardware, but you shouldn't have much difficulty matching the architecture, because it already matches. Basically a PS5 emulator can look a lot more like Wine as opposed to hardware emulation like you see for NES, N64 and similar consoles which were completely custom hardware.


It's definitely done on purpose by Sony: having PC-grade hardware means they can port their games easily to the PC platform and get a larger install base for first-party titles, which previously only existed on the PlayStation. All in all, I see it as a win for end users that the platforms are converging, as software titles become available across multiple platforms and price competition becomes very relevant. In a world experiencing the worst inflation in decades, this is a very thin silver lining for sure.


> always records your gameplay
> video share
> live stream

it's called hardware video encoding. it's great that those features are present, but HW encoding (even in consoles) isn't new


Never said it was "new"; I've had Nvidia ShadowPlay running forever, plus OBS as the better open-source solution on a gaming PC. But that doesn't take away that the all-in-one polished end-user experience is very nice and required some better planning.

PS3 multitasking performance was horrid


Consoles are loss-leaders, so saying that the price is $700 doesn't tell you what it costs without including how much of a negative margin Sony was willing to take.


PS5 and Xbox Series X are AMD Ryzen 4000-series x86 cores and a GPU slapped on a die with some memory, so emulation isn't so far off


The original Xbox proved that hardware similarity doesn't necessarily make emulation easy. It took a long while for OG Xbox emulation to be decent, and that was with it using fairly commodity hardware and a very DirectX-like API.

The PS5 has custom APIs that would need to be reversed out, especially graphics APIs. It also has a large-ish pool of shared memory that makes it difficult to map to most PCs which don’t have that setup.

There are several custom hardware blocks for bespoke decompression that are routinely used, plus the equivalent of DirectStorage to speed up resource access.

It’s not impossible to port those games over as has been seen, but it’s also not easy to emulate that if the specific game builds make use of those features (and many many games do)


I think what we'll likely see is that it's (relatively) easy to get PS5 games running in a PC emulator, but running them _well_ will take ages. Primarily because PCs will have to be able to out-horsepower the PS5 by a wide margin to make up for things like the shared memory setup, texture streaming stuff, etc.


Yeah, PS5 games expect to have up to 16GB of VRAM available, and thanks to GPU vendors being stingy with VRAM to upsell, to get that much you'd need to buy expensive high-end cards.

But that doesn't do anything to help with the PS5's shared memory architecture, where, because VRAM and RAM are one and the same, textures that need to be in memory aren't duplicated between RAM and VRAM like on bog-standard PCs, which has performance implications.
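A back-of-envelope version of the duplication point (all sizes and the staging fraction below are illustrative, not measured from any real game):

```python
# Illustrative arithmetic only: on a split-memory PC, an asset being
# streamed to the GPU transiently exists in system RAM (staging copy)
# as well as in VRAM; unified memory keeps one copy in one pool.

resident_textures_mb = {"environment": 4096, "characters": 2048, "effects": 512}
total_mb = sum(resident_textures_mb.values())

unified_footprint_mb = total_mb  # one shared pool, one copy

staged_fraction = 0.25  # hypothetical share of assets in flight at once
split_footprint_mb = total_mb + int(total_mb * staged_fraction)

print(unified_footprint_mb)  # 6656
print(split_footprint_mb)    # 8320
```

The exact overhead depends on the engine's streaming strategy; the sketch is only meant to show why the same working set costs more total memory on a split-pool machine.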

Windows now has support for streaming assets directly from SSDs like the PS5 does (at least if you have a fast enough NVMe SSD installed; SATA SSDs or older NVMe drives won't cut it), but PCs still lack the hardware texture decompression of the PS5, which once again impacts performance.

The mass-market computers closest in architecture to the PS5 are actually M-series Macs, with how they also have a large pool of memory serving as both RAM and VRAM. Once the integrated GPUs on M-series SoCs achieve parity with the PS5's onboard Radeon, they might actually be the most straightforward machines to emulate a PS5 on, despite needing x86-to-ARM translation.


> PCs still lack the hardware texture decompression of the PS5 which once again impacts performance.

They might not implement it the same, but hardware-accelerated texture decompression has been around on PC for as long as SIMD has existed. With tech like ASTC floating around, I'm not sure it's appropriate to say PCs really "lack" the technology.

> they might actually be the most straightforward to emulate a PS5 on despite needing x86-to-ARM translation.

The problem with Apple Silicon is that nobody wants to use Metal. The big Switch emulator Yuzu should have also been a perfect fit for Apple Silicon too, but it took years to get "ported" and the end result used MoltenVK for the GPU API. Now that it's here, systems like the Steam Deck are hitting 60fps where M2 struggles to hold 50:

https://youtu.be/pubEj1yLknI?t=414

https://youtu.be/5BeYYuLnS3I

It would be cool to see, but nothing I've witnessed surrounding these sorts of emulators suggests that will be the case.


At the end of the day it all depends on the will of the individuals involved with the projects. Dolphin got a native Metal port for instance.


You're conflating texture compression like ASTC with generic resource compression.

https://gamingbolt.com/former-frostbite-software-engineer-ex...

Kraken is a generic resource compressor, while Oodle Texture is closer to ASTC


> The problem with Apple Silicon is that nobody wants to use Metal.

iOS games market begs to differ.


The iOS games market speaks for itself. It's littered with freemium games and low-effort asset flips, the number of shitty 2D lottery/lootbox games outnumber Monument Valleys 100:1.

The vast majority of substantial game experiences are not getting ported to iOS. The reason for this is mostly Metal-related. Apple has acknowledged this themselves on many occasions, like the last WWDC with their Game Porting Toolkit.


Why weren’t they getting ported before Metal when OpenGL 3.1 was still at parity with the rest of the industry?

The graphics API is not the significant portion of the porting issue. It’s market share and the fact that until recently, very few Macs by market share had great GPUs.

The game porting toolkit works alongside wine and Rosetta to make time to first pixel easier for developers to consider the platform.

Regardless of Metal or not, time to first pixel and consistency of hardware has always been the biggest hurdle. Most big engines support Metal just fine already, so it's not the primary hurdle people claim; otherwise we'd see more Unreal and Unity games running natively on Macs.

Now every Mac has a decent GPU (for some definition of decent) with very similar hardware.


> Why weren’t they getting ported before Metal when OpenGL 3.1 was still at parity with the rest of the industry?

They were. The number of OpenGL games was minuscule though, and Apple's underlying APIs have since broken, rendering most of these games unplayable. Apple doesn't really provide a stable gaming runtime, outside of the DirectX-like promise that if you use their proprietary APIs they won't deprecate them.

> The game porting toolkit works alongside wine and Rosetta to make time to first pixel easier for developers to consider the platform.

See, that's the thing. "time to first pixel" was an issue because of Apple's APIs. If you translate non-native program calls into native ones, then obviously you circumvent the problem.

Furthermore, the reason why Game Porting Toolkit didn't exist before now was because Apple had to write a Metal translator for DirectX. The community never wrote one like they did for Vulkan, likely because nobody wants to translate DirectX to a second proprietary API. Kinda defeats the purpose, at least for non-commercial contributors.

> Most big engines support Metal just fine already, so it's not the primary hurdle people claim; otherwise we'd see more Unreal and Unity games running natively on Macs.

Most big engines also support PS5 and Nintendo Switch as development targets. The reason why they are relatively unpopular for porting is the exact same as Apple's - the APIs are nonstandard and closed, with limited distribution and long-term support options. Why would anyone put in the majority of their effort to support a minority of the market?


The number of Mac Metal games is about the same as the number of Mac OpenGL games. Which is to say minuscule, like you said, but all that shows is that it's not about the APIs, or we'd see Unity/Unreal games aplenty.

It's just down to market share. Time to first pixel still matters for off-the-shelf-engine games because devs need to get over the hump of building it, etc., let alone consider all the possible hypotheticals of how it works, even before they get to APIs.

Game Porting Toolkit solves that. It's not meant as a general-purpose translator, just to get people over that hump.

And again, it's just down to market share. There are plenty of AAA games on iOS that use the same engines as PC games without having Mac versions. Take the Call of Duty games for iOS. Why was there no prevalent CoD on macOS?

All that proves to me is that APIs aren’t the primary reason.


PS5 and Nintendo Switch unpopular?!?

The first and second champions of game sales of this decade!

What a joke, thanks for making my day.

By the way, game studios don't have any issue translating DirectX to LibGNM/X and NVN.


> PS5 and Nintendo Switch unpopular?!?

>> for porting

I don't think my statement is wrong. People don't like porting to Switch or Playstation 5, there's a significant amount of development and testing overhead required to support either platform. The Switch has a decently popular SDK backed with Nvidia drivers, but requires deliberate ARM ports and very carefully written Vulkan code (if any). The PS5 is a little friendlier to PC-first devs, but still has a unique runtime and zero options for DirectX code. Both platforms require fairly bespoke versions of your game, compared to the "press play" porting experience of the Deck or API parity of modern DirectX on Xbox.

I wish the situation was better for these platforms, but they reap what they sow when they make highly exclusive SDKs and resist open specification.

> By the way, game studios don't have any issue translating DirectX to LibGNM/X and NVN.

Are there DirectX translators à la DXVK for GNM and NVN? As far as I'm aware, porting from DirectX has to be done by hand unless you're using an IR language like SPIR-V (at which point you may as well use native Vulkan).


I would advise spending some time at developer conferences like GDC, GDCE, PAX.

The only people that don't like porting APIs are usually indie devs in some FOSS forums; proper game studios have hardly any issues dealing with multiple backends.

Doing game engines with pluggable backends has several decades of industry experience, going back to the Atari and ColecoVision.

Games IP, game design and getting good publishing deals are what matter, not the 3D API du jour.

As for shaders, usually there is either an internal language, shader graphs, or choosing a specific one, with a translation layer in the middle.

There is no native Vulkan on PlayStation, and as far as the Switch is concerned, Vulkan and OpenGL aren't as used as FOSS folks think.


I'm sorry, but this post reads like someone a little too dyed-in-the-wool on Linux who hasn't worked in the video game industry.

Except for indies, PS5 and Switch get a ton of high-profile games. Very few companies have issues porting over, and most will have their engines able to target multiple platforms.

Very few people use Vulkan on the Switch. It, like the PS5, has its own graphics API.

Very few games, outside of the few indies that make their own engines, target DirectX or a specific API. They use an intermediary HGI that abstracts over various backends so that they can target the wide range of console behaviour that exists, from APIs to console-specific features.

Thinking about PS5 development from the perspective of DXVK or SPIR-V is the wrong way to think about it. Higher level abstractions coupled with low level backends make it easy for any well architected engine.

Like the sibling comment says, please spend some time perusing the GDC Vault or among professional game devs. Your world view on the matter is not representative of those communities. It is more representative of the external view common within the Linux gaming community that holds Vulkan on a pedestal.


Exactly. They’ll do what they need to do for any market they deem to have an adequate ROI.

I always point to Linux when people mention APIs being the issue. Linux gaming was depressing before Proton, despite having both Vulkan and up to date OpenGL. Devs could have supported them but didn’t. So the API isn’t the big reason people make it out to be


Linux actually seems like the antithesis of your point. It has the lowest ROI of any of the platforms we've mentioned, yet the highest degree of compatibility with PC and console games outside Windows. If openness and API support isn't the issue, then why didn't DXVK get written for Apple platforms first?


Not the antithesis unless you’re purposely ignoring that nobody ports games to Linux.

Almost the entire Linux gaming scene is dependent on the fact that Valve wanted to make consoles, failed with the Steam Machine and then figured the formula out with the Steam Deck. That's why DXVK exists, between funding and direct development. It was a high-ROI move for Valve to have their own platform. Nobody else cares.

Linux is not a target gaming platform. Even though it has native Vulkan and OpenGL, nobody targets it, and nobody targeted it before Proton either.


Hardly any game studio that targets Android bothers with Linux, despite the similarity of both running a Linux kernel, and despite the NDK having ISO C, ISO C++, OpenGL ES/Vulkan, OpenSL, OpenMAX.

Not even Valve managed to change their mind in this regard.


Yep. I think if anything Valve actually enshrined Linux-targeted gaming as forever-translated, since Proton is so good. There's no impetus to even bother making Linux ports and dealing with support when people are willing to translate, and to blame the lack of native support if it doesn't work well for some reason.

I imagine that's the reason Apple doesn't allow studios to ship with Game Porting Toolkit. They likely want to prevent the eternal-translation solution, especially since their GPUs are so different from the originally targeted ones.


Their Game Porting Toolkit is mostly about macOS.

There are plenty of AAA studios with iOS games, regardless if you like their business model or not.


Excellent points on memory architecture. Though I'd posit that the non-base M-series GPUs are actually already equal to or higher performing than the fairly aging Radeons in a PS4/5, depending on the respective SKUs.

The wild cards will be translation overhead, differences in TBDR access and thermal headroom.


It would be interesting to me to see an emulator that specifically required an APU to run, so it wouldn't have to design around the separate memory pool of a discrete GPU.

Either an AMD APU or, if Rosetta sticks around, the Apple Silicon chips.


OG Xbox took so long, because most of its exclusives were also available on PC. Halo 1 & 2, for example. So there wasn't much interest. Developers focused on more interesting systems like the PS2 and the GameCube at the time.


> most of its exclusives were also available on PC. Halo 1 & 2, for example

While I agree with your point about most Xbox "exclusives" being available on PC as well, Halo 2 didn't arrive on PC (Windows Vista only too iirc) until 3 years after the original Xbox release, which was after Xbox 360 was already released and in full-swing. So I think there was a bit more to it than just lack of interest. Especially considering how massively popular Halo 2 was.


Consoles usually get working emulators well after their generation ends. Recent Nintendo consoles are an exception to that rule due to their low power compared to their competitors.

When Halo 2 was released, it was too early for a working emulator to be developed. Back then PS2 and GC emulators also weren't working properly yet, though they were better than Xbox emulators.


Also, OG Xboxes were cheap and trivial to mod, including converting into debug consoles, so the homebrew community that often drives emulation didn't have as much reason to care. They could just install a debug BIOS, find a copy of the XDK floating around the ol' interwebs, and have basically the same toolset commercial game developers had.

Now that unmodified OG Xboxes are being irreparably damaged by failing clock capacitors and the used market is drying up as a result the people who still care about the platform have more reason to want a good emulator.


"Keep in mind, the US Air force connected 1000+ of these together for super computing purposes."

Which I have long suspected was just a marketing stunt, because even at the time it made no sense. Even on the day of release, the PS3 was merely right where we'd expect a console of the time to perform. No better, and certainly not massively better.

My hat's off to Sony for the quality of their propaganda around the PS3. Lots of people still seem to believe it was something amazing, rather than... a console, which worked about as well as expected. Honestly, I'd rate it a bit under par when evaluated objectively: in their zeal to pump up some metrics, they trashed others, like the way memory access works in that system. They'd probably have been better off with a more traditional architecture in the end.


The consoles were cheaper than buying the same CPU from IBM. I think that was their logic.


Cell was theoretically great at computing, but moderate (or worse) at gaming.


The hw is tricky to program (eg SPUs can't directly access main memory), but not necessarily hard to implement.


For some reason, I am very fascinated by the architecture of PS3. Its complexity makes RPCS3 the most impressive emulator in my eyes. Though to be fair, all emulators are impressive one way or another.


> For some reason, I am very fascinated by the architecture of PS3.

You’re not alone. I also am enchanted by non-standard architectures: the PS3’s Cell, its predecessor the PS2’s Emotion Engine, the Transmeta Crusoe, etc.

There’s a sibling comment to mine that talks about needless complication and dead-ends. That’s fine, but marching along with essentially optimizations to a basic architecture seems boring to me from a creative point of view (not that the achievements made haven’t been, of course, profoundly technically impressive).

And that’s on top of the fact that designing computer architectures can probably be thought of as a huge multidimensional optimization problem (where the optimum can change over time or between customer demographics). I think of the approach of iterating as helping us march up that manifold to a local maximum, and of these “exotic” architectures as sampling far away from those points to see if maybe we can find a more global maximum.

And that’s not to say that the main platforms aren’t innovating: with big-little, NUMA, etc.

But there’s a soft spot in my heart for those wild, long shot bets.


The architecture was unnecessarily complicated and it turned out to be a dead end, which is why Sony dropped it and now uses x86 like everyone else.


Everything from this generation (PS3's Cell included) was some sort of PowerPC, many previous gens also had consoles using PPC or MIPS. Sony and MS went x86 during the same generation (PS4 and Xbox One). Nintendo went from PPC to ARM (though arguably they'd already used ARM for a while because of their handheld stuff).


I believe the GP is referring to the entire computer architecture, not the instruction set architecture. I seem to recall that it had three distinct heterogeneous processors that had to be coordinated to get the most out of the system, so porting from, say, PC to PS3 wasn't necessarily straightforward without leaving performance on the table.


The Xbox 360 was the one with the triple-core CPU; the PS3 had the odd combo of one "main" CPU and 7 auxiliary ones.


IIRC there were 8 SPE cores on the Cell die; one was disabled for yield rate and one was dedicated to the OS itself, leaving 6 of them for gaming.

Then they had the super riced-out RSX (Nvidia GeForce 7000 series) for the GPU, but there wasn't really a "main" CPU, was there?


The Cell had a PowerPC core (the Power Processing Element) as its main CPU, alongside the Synergistic Processing Element co-processor units.


I think the 360 had a regular homogeneous tricore CPU, so programming it was just normal multithread programming. As you say the PS3 had several auxiliary processors that all needed to be told what to do in specific, non-portable ways, which is what made it inconvenient to work with.


Fun fact about the 360, MS bought a bunch of PowerMac G5 towers and installed a PPC build of Windows XP on them to turn them into Xbox 360 dev kits. Makes sense because those were the most readily available and most cost effective PPC boxes at the time, but kind of funny.


OG Xbox was x86 and was released around the same time as the PS2.

Then they went PPC for 360.

And now back to x86.


IMO they went AMD because that was the only real option left. PPC was dead by then outside of big IBM servers, and ARM was just getting into low-end x86 performance levels. If PPC were still being improved in the embedded space, I think they might have gone that way for backwards-compatibility reasons.


IBM put in a bid for PS4 (based on Power7 IIRC) but it wasn't selected. I think the convenience of getting everything from AMD helped them take over consoles.


AMD was also able to offer up an APU (GPU integrated with CPU) vs separate chips that would have increased cost and complexity.


Developers had to be more creative when the hardware shaped the kind of game you were making. That's what's fascinating to me.


In practice it made multiplatform games look and run worse. For example, GTA IV ran at 720p on the Xbox 360 but only at 640p on the PS3.

That's another reason why both the PlayStation and Xbox now use basically the same hardware under the hood.


I think the other aspect as well is that the 360 ended up with the somewhat better GPU in Xenos.

I'm not sure if it was obvious at the time amid all the Cell BE and "Reality Synthesizer" marketing, but I understand that gave the 360 an edge in a lot of cross-platform titles.

I think this really influenced the design of the PS4 which actually had a really weak Jaguar CPU but a solid GPU in its APU.


On the other hand, weird hardware architectures are the worst from a preservation standpoint. If no one had made a PS3 emulator, MGS4 was at serious risk of becoming lost media.


Is MGS4 really that difficult to port to a different architecture?


I don't see Konami interested in doing that anymore, and I don't see how else it could happen.


They’re releasing a “volume 1” collection of MGS games soon, which contains up to MGS3. Presumably MGS4 will be in volume 2.


I had no idea. Apparently it's coming to PC. Here's hoping I can finally play MGS4 on PC, as well.


Here's hoping they leave in the install screen somehow. Big fan of watching the main character chainsmoke for a couple of minutes.


They announced a MGS3 remake recently. So they might remake MGS4 as well.


One of the hooks of the PS3 architecture was that it sported a core count greater than 2 in consumer hardware (albeit with very simplified cores).

Rumours about Intel Larrabee were also flying around at the time, so it seemed like the future was here.


This CPU tier list by the r/rpcs3 sub should be helpful (specifically the "What do i buy" tab):

https://docs.google.com/spreadsheets/u/0/d/1Rpq_2D4Rf3g6O-x2...


No mentions of Apple Silicon, and a quick Googling didn't reveal much. Anyone have insight or experience?

Asking for an M1 Max 64GB ;)


According to this video, setting it up and running games shouldn't be a problem, but performance isn't going to be very good. He's using the M2 Pro and you can still see/feel the slowdown.

https://youtu.be/YnDAkZLXkPA?t=603


I haven't looked at RPCS3's codebase in quite a few years at this point, but when I last did there was still a bunch of x86-specific stuff that prevented it from being buildable for AArch64. If it compiles and runs at all, you'll be stuck having to use Rosetta and MoltenVK translation layers, so performance is going to suffer.


I'm trying this out on an M1 Air and while the emulator itself works great, performance is very lacking.


You could just try it out and document your experience somewhere


Check out Apple's recently announced Game Porting Toolkit.


Lacks 7000 series Ryzen...


Note that Ryzen 7000 is only missing in the tier list tab, not the "what do I buy?" one.


Also all threadrippers


Related. Others?

Playstation 3 Emulator Adds AMD FSR Upscaling - https://news.ycombinator.com/item?id=28114817 - Aug 2021 (1 comment)

RPCS3 Inside Look: A Deep-Dive into Hardware and Performance Scaling - https://news.ycombinator.com/item?id=24247586 - Aug 2020 (11 comments)

RPCS3 PS3 Emulator – January 2019 Progress Report - https://news.ycombinator.com/item?id=19415445 - March 2019 (61 comments)

RPCS3: An open-source PlayStation 3 emulator for Windows written in C++ - https://news.ycombinator.com/item?id=7457764 - March 2014 (48 comments)

(note: links to past threads like the above are just to satisfy extra-curious readers)


Am I right in thinking that the PS4 and Xbox One (and later) consoles should be easier to emulate by virtue of being based on a more standard x86 architecture?


> Am I right in thinking that the PS4 and Xbox One (and later) consoles should be easier to emulate by virtue of being based on a more standard x86 architecture?

Easier to emulate to a working state, but potentially harder to emulate to good performance if a game uses any of the quirks of a modern console like the high bandwidth shared memory, fast storage, custom hardware, etc.


Sure, but we're still talking about customized SoCs, with some additional silicon in there for extra audio and video DSPs. Also I'd bet that even on modern architectures most console games (or their base libraries) still require precise timing on the side of memory access, CPU stalls, instruction cycles, available bandwidth, etc.

Once you start requiring per-cycle emulation of any of those components we lose any advantages from having any kind of paravirtualization elsewhere.


Perhaps, but not necessarily. The original Xbox was based mostly on off-the-shelf PC hardware of its era: an Intel Celeron, a (mostly) standard motherboard architecture, an Nvidia GPU, and a regular IDE HDD. Yet it's only in the past few years that we have begun to see working emulators.


I’ve always wondered whether that was because of technical challenges or just lack of interest. After all, nearly all games on Xbox were available either on PC or other consoles. I can remember a few exclusives like Blinx, Conker, Far Cry Instincts…


100% lack of interest. There have been like 5 people total in the OG Xbox emu scene in the last 10 years; one of them didn't even do any actual work besides getting mad in Reddit comments (jayfoxrox), and another put all his effort into an HLE/binary-translation emulator, which by definition cannot work with half the library of games, since Xbox API and DirectX API calls are inlined in those games.


Matt really saved the day when he started up xemu from XQEMU. It was advancing by leaps and bounds there for a while but has slowed down again over the last year. I'm hoping it picks up steam once more.


Pentium III no?


Because the Xbox "Coppermine" CPU has the same L2 cache size as a "Coppermine" Celeron (cache size being the main differentiator between Pentium and Celeron at the time), the Xbox CPU is sometimes called a Celeron instead of a Pentium III. However, the Xbox chip retains the 8-way cache associativity of a "Coppermine" Pentium III instead of the Celeron's 4-way, so it's not quite what you'd get with a PC Celeron chip. Ultimately this configuration was only used in the Xbox, so if Intel and Microsoft want to call it a Pentium III, I guess that's what it is.


Yes; in fact the PS4 is beginning to see HLE with various emulators, most notably fpPS4 (Free Pascal PS4).

https://github.com/red-prig/fpPS4


I recently purchased an old Guitar Hero controller and connected it to RPCS3 emulating Rock Band 3. It worked shockingly well. It felt like playing on a real PS3.

This is a super impressive achievement from the RPCS3 team, because rhythm games are horrible with even a little latency.


You may be interested in Clone Hero.


I recently installed Clone Hero, with my only PC Guitar Hero-like experience being Frets on Fire probably 15 years ago, and I was blown away by how it felt exactly like Guitar Hero.

FoF felt so clunky whereas Clone Hero is a delight. It handled high FPS well, the song interface looked exactly like Guitar Hero, and the customizability is the icing on top. The community has ported over basically every song from every Rock Band or Guitar Hero game, so there's more than enough content. Clone Hero was so smooth I bought a wireless receiver to use my old Rock Band drums and play with my wife. It's a blast!


I have tried Clone Hero and frankly it drove me to RB3. It’s fine if you just want to shred, but my god that UI is hideous. Zero polish. It absolutely feels like an amateur fan project next to RB3.


Have you tried Rock Band 3 Deluxe? https://github.com/hmxmilohax/rock-band-3-deluxe


I've been using this to play a silly japanese arcade game, Gundam Extreme Vs. Full Boost -- with my friends, online, over an emulated PSN. Worked immediately on all our PCs. Some random guy even joined our lobby one time. Tested it and Tekken Tag Tournament 2 also worked online. Walked away unbelievably impressed, tbqh.


I still have a collection of cherry-picked PS3 games I kept so that I could play them one day, upscaled and potentially at better performance than the PS3 was capable of.

Despite now having hardware that could play those better than a PS3 using this emulator, I can't get the games off the discs: when my OG 60GB PS3 died I disposed of it and got a Slim, which also died, so I bought a new Slim, which as far as I can tell can't be / hasn't been hacked, so there's no way to dump my discs.

I wasn't exactly lucky with PS3s, hence having had so many: I _still_ have my original childhood PS1; I sold both my PS2s (OG and Satin Silver) to fund other purchases as a kid; and I gave away my PS4 to my nephew (who tore it down and couldn't reassemble it) when I got a PS5. From the same generation as the PS3, my 360 also died.


Why not buy a compatible Blu-ray drive and dump them on your PC?


I hadn't considered that; It's been a _while_ since I've had a PC with a disc drive; thanks for the suggestion!


This page[0] describes how you can accomplish this.

[0] https://rpcs3.net/quickstart


I'm wondering: I've heard the PS3 architecture is very complex, and the PS4 is back to x86, so what is it that makes PS4 emulators so hard to implement?

Surely nowadays hardware such as 7000 ryzen series and 4090 gpus are leaps and bounds above PS4 hardware.


The PS4 has its own graphics APIs, GNM and GNMX, that would need to be reverse-engineered and ported.

It also has memory shared between the CPU and GPU, which is harder to emulate with a discrete GPU. See how much work Dolphin has to do to handle the GameCube's various direct memory-write tricks; now imagine that kind of trickery on a bigger scale for the PS4.


Having very briefly looked into this problem a few years ago, my impression of the primary challenge was that of re-implementing a large OS API surface in a non-infringing way.


Looking how some emulators just ask users to copy firmware / software / keys from their real consoles - couldn't the same thing be done and only the hardware layer be emulated?


How do you get that software out of your console?


RPCS3 just uses the PS3's PUP update files to install the OS, so I imagine there might be a similar route for newer consoles. You can obtain the PS3 ones directly from Sony, intended for use on a flash drive to update the console.


No idea, that's why I'm asking :)

I guess there may be not-quite-legal ways for some hacker to do it and then share the result with others.

Not that I support it, but there are still people who don't see problems with such behavior.


Eventually, people will learn that dulling enthusiasm for a thing by limiting it to only people who can afford it is actually more hurtful to your success in the long run. There's literally millions of kids who can't afford your concert ticket/game/movie/whatever but who would become lifelong fans if they could experience the thing NOW.


It’s not like what you said hasn’t been tried by various companies. The most successful product ecosystems however have been the ones that can say:

- hey devs, we have hardware that is known that you can optimize for and don’t need to troubleshoot against untold permutations of random hardware

- hey gamers we give you a consistent experience on here without having to think about settings and debugging drivers

- hey investors, when people buy into our ecosystem for even one game, they’re more likely to buy even more games within that platform

That’s a lot of wins that overshadow the potential of wider sales. Often, wide isn’t as lucrative as focused.


So, how do you suggest a company would make profit while also not limiting to people who can afford paying for the product?

If you are thinking of the free to play model with microtransactions, I do not think that it would work as well for single player games.


I don’t know.

What I do know is that illegal consumption of games leads to increased legal consumption, though:

https://arstechnica.com/gaming/2017/09/eu-study-finds-piracy...


This is honestly one of the best emulators ever made! Uncharted 3 was amazing, thank you RPCS3!


I wonder if the PS5 could emulate the PS3 with reasonable performance. It’s sad that Sony seems to have just abandoned 90% of their library rather than putting forth the investment to develop and maintain their emulators like Microsoft has.


If it goes anything like how PS2 emulation went on the PS3, Sony will hire someone who worked on emulation for the console and let publishers re-release individual games that are tested to work.

I assume they're not doing that because it'd eat into their game-streaming service that has a lot of ps3 games on it. I don't think they will go the same route that Microsoft did, sadly.


> Motorstorm at 60 fps

I just remembered how undesirable the PS3 was when it came out. Then, as Blu-ray faded as a consumer media format, the console slowly became a less popular Xbox 360.


Would be cool if this runs as UWP on an Xbox :-D

The irony of compatibility and homebrew development.


good project. Much better than PCSX2.


Stenzek, who created the DuckStation emulator, moved over to PCSX2, and the updates from him and others have made that emulator really impressive, in my humble opinion. It just seems a bit odd to make a comparison like this. Is there something I'm missing as to why you would compare the two?


That sounds interesting, and I've enjoyed reading the deep dives into dolphin emulation. Could you share a link to these?


It's basically changelog notes with each update to the emulator (usually daily), and it has been very active in recent years. You can also find a wealth of information and updates on various computer and console emulators on Reddit's r/emulation; I think that'll have what you're looking for.


I'd agree with you a few years ago, but PCSX2 has made leaps and bounds since then. It's quite good now, maybe one of the best.


PCSX2 can run Burnout 3 though. What else do you need?

But this is seriously impressive. Modern Vintage Gamer has done some nice YouTube videos on it.



