Location-specific issue. Higher voltage resolves this, and a regular household socket will power it just fine. Then again, if you're spending that much money to get the job done, new wiring is probably one of the cheaper parts of the build.
I mean, it is kind of nice to notice that something isn't going to work because you have linters and tests. When your developer count goes up, the chance that erroneous behavior slips in also goes up.
I've created proofs of concept that worked perfectly but would make you cry if you looked at how they worked. Eventually they just stayed that mess. Everything is a self-contained unit, so it doesn't mess anything else up. Of course, there is always time to keep adding new stuff but never to refactor it into what it should be.
I prefer the way with linters and tests; it at least lessens the chances of whatever is put in being broken (or breaking something else). (Then again, somebody putting "return true" in a test _will_ surprise you sooner or later.)
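A minimal sketch of that "return true" failure mode (the function and test names here are made up for illustration): a test whose assertion is a tautology keeps the suite green no matter how broken the code under test is.

```python
import unittest

def process_refund(amount: float) -> bool:
    # Imagine this regressed and now silently drops every refund.
    return False

class TestRefunds(unittest.TestCase):
    def test_refund_is_processed(self):
        # The "return true" anti-pattern: the assertion can never fail,
        # so this passes even though process_refund() is broken.
        self.assertTrue(True)

# Run the suite programmatically instead of via unittest.main()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestRefunds)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True, despite the broken function
```

Coverage tools don't catch this either: the test executes, it just asserts nothing about the behavior.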
As a person from a Nordic country, that sounds to me like a very American take. At least over here, the point is to put people in a state where criminal behavior isn't desirable. Coming out of a sentence with debt (unrelated to the reason you were there) seems counterproductive.
Voting patterns. Americans, like any humans, loooove righteous violence, both witnessing and enacting it. The System in America is Americans' collective expression of this impulse.
Consumer motherboards haven't had GPUs for a while now (servers usually do, since IPMI typically comes with one); the GPU is built into the CPU instead, and not all CPUs have one. These can't usually be easily allocated to a VM.
I clicked randomly on a number of motherboards sold by the two brands that came to mind, ASRock and Gigabyte, and all of them advertised HDMI and USB-C graphics output, so I am surprised by your claim that consumer motherboards don't have GPUs. If I am not mistaken, on the AMD Ryzen platform it comes down to choosing a CPU with a G or 3D suffix, which indicates an integrated GPU.
It really still is the case that most if not all consumer motherboards don't have built-in graphics. For the most part, especially on the Intel side, they've relied on the iGPU in the CPU for output for probably 10 years now.
Well, my point still stands: you still have integrated graphics, provided not by the motherboard but by the CPU, which the host can use while you dedicate a discrete card to VM passthrough.
Desktop Zen 4 Ryzens and newer have a very small iGPU that's just enough to put up a desktop (and presumably a framebuffer fast enough to feed a discrete card's output into).
How can the host have integrated graphics, if integrated graphics don't exist?
Per Korhojoa's comment and my personal experience, plenty of desktop CPUs simply don't have integrated GPUs, and consumer mainboards simply don't come with them at all. Consider my previous workstation CPU, top of the line a few years ago and with no iGPU: https://www.amd.com/en/products/processors/desktops/ryzen/50...
Integrated GPUs are a feature of server mainboards, so that there is something to display with for troubleshooting, but not of any retail mainboards I am aware of. They are also a feature of some consumer-grade CPUs designed for either budget or low-power gaming. It simply doesn't exist on all CPUs: consider the AMD 5600, 5600X, and 5600G, last-gen mid-range CPUs adequate for gaming; the X had a little more clock speed, and the G had an iGPU.
This is a fundamentally dishonest take. I provided three specific CPUs that varied by just the letter at the end, where some had an iGPU and some didn't. I am being honest that some have it, but it isn't ubiquitous.
Well, when you buy a desktop computer in 2024 there are usually four main ways:
- buying a ready-made computer from a brand --> it always comes with an integrated GPU. Some are even such a small form factor that you have to use an external Thunderbolt-connected GPU if you want a discrete one.
- you build your computer yourself from parts --> you decide your motherboard and CPU; if VM passthrough is something you want to do, you just buy the parts that fit your use case
- you buy a configurable prebuilt computer from an online or local vendor --> you just have to choose the right option in the configuration tool so that you get a motherboard/CPU that offers an integrated GPU.
- you buy second-hand and don't get an iGPU --> you buy the cheapest GPU available, usually around $10 to $25, and you have your second GPU for the host to use.
Even when you are using a laptop, having two GPUs is really not complicated in 2024, especially with Thunderbolt external GPU enclosures/adapters.
Bottom line: you only have one GPU if you actively choose not to have two.
The average PC is already a trade-off that costs the average user around $800, and nearly 2/3 would need a substantial RAM upgrade, a new GPU, or both to make gaming through VM passthrough a reality. Most people aren't looking to buy new hardware and learn new tech to game.
It sounds like a useful toy for those who already enjoy playing with their computer as much as playing the game.
That said, wouldn't limiting the host to integrated graphics (or whatever you get for $25) be a substantial limitation compared to using Wine/Proton or dual booting?
> Most people aren't looking to buy new hardware and learn new tech to game.
Most people don't play games.
Most people who play games beyond solitaire or a web game just buy a PlayStation, Xbox, or Switch.
Only a relatively small fraction of people playing AAA games use a computer for that: the most hardcore ones, and the most willing to spend money on a gaming rig. And I am pretty sure most of them aren't the least bit interested in dual booting, because they would have a desktop gaming rig and a laptop for everything else anyway. Only a tiny fraction of gamers is probably interested in dual booting. You are part of that tiny group. Fine. The nmbl tool presented in this conference talk does not prevent dual booting anyway, so I am not even sure why people act like they should be offended that GRUB might be replaced someday by something else with more capabilities.
It doesn't make sense to try, ex post facto, to justify what people SHOULD do when we can look at what they in fact actually do.
The idea that the only people who play PC games play ONLY AAA games on their souped-up rigs is also counterfactual. People play games on everything from 8-year-old laptops to $5000 custom-built rigs with RGB everything. You are oversimplifying; the universe consists of many and varied irrational individuals, not spherical cows.
Dual booting is simple and suitable for nearly 100% of machines running Linux.
Wine/Proton is suitable for nearly 100% of machines running Linux. Steam has reduced this complexity to a few clicks for the majority of titles.
GPU passthrough is unsuitable for 70-80% of configurations and, by dint of complexity, undesirable for nearly everyone, which is why virtually nobody does it.
Because people don't want to play "games"; they often want to play a particular game, and if it doesn't work, it doesn't work. Also consider how many people are new and have an existing computer with Windows: the standard play is to dual boot first and then possibly transition to Linux-only if it works well enough for their usage.
Approximately half of gaming revenue is from PC customers. It wavers up and down depending on exactly what metric you want to use and when the last console refresh was.
You are correct about the complexity cost and how most people, even those with nice gaming computers, just don't want to deal with more complexity than needed. Even mandating a store app that works causes a significant hit to conversion rates: EA couldn't give away Dead Space, a previously successful AAA title, when it was bundled with their store.
You are thinking of pay-to-win games with microtransactions. While this trash HAS come to the PC platform, there is no reason to believe it represents any substantial portion of PC gaming revenue.
I suppose everyone is entitled to their opinion, but most people base it on something. You are free to do whatever you're doing, but I hope no one takes you seriously.
Yeup. Interesting how the older LED bulbs are particularly bad for this (and in the UK, on 50Hz mains). More modern high-quality ones seem significantly better.
GP's comment said 100Hz because that's what you get from full-wave rectification of 50Hz mains, so they were talking about that. It'd be 120Hz in countries that use 60Hz mains (e.g. in North America).
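The doubling is easy to state explicitly (a trivial sketch, just to make the arithmetic concrete): full-wave rectification folds the negative half-cycle of the sine wave up, so the bulb's brightness peaks twice per mains cycle.

```python
def flicker_hz(mains_hz: float) -> float:
    """Flicker frequency of a bulb driven by full-wave-rectified mains.

    Both half-cycles of the mains sine wave produce a brightness peak,
    so the flicker runs at twice the mains frequency.
    """
    return 2.0 * mains_hz

print(flicker_hz(50))  # 100.0 -> 50 Hz mains (UK/Europe)
print(flicker_hz(60))  # 120.0 -> 60 Hz mains (North America)
```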
As a side note, my apartment has a 3x63A main fuse. Three-phase is everywhere here. Really convenient for EV charging too.