Cybercriminals who breached Nvidia issue one of the most unusual demands ever (arstechnica.com)
720 points by jbredeche on March 4, 2022 | 669 comments



A bit off-topic, but the fact that criminals asked for help mining, instead of money or crypto-money, just gave me an epiphany:

The whole internet is showing the first signs of a digital Resource Curse [1] brought on by crypto mining. Crypto mining changed the economics of the digital world in such a drastic way that it is poisoning all kinds of internet interactions. It is not just about disrupting how money used to work, it is disrupting every kind of interaction, even those that had nothing to do with money. Things that were completely economically neutral before, so people did them just for fun, can now be exploited to extract value, so naturally some people do it and the previous innocent/neutral status quo is lost. For example: most free unix shell providers, a fun tradition among nerds from the 90s, had to shut down because there is now such a big economic incentive to abuse such a free service.

And while writing the above I realized that crypto is probably the second internet curse, with Google's algorithm being the first. When Google became the near-monopoly in the early 2000s, linking to a website ceased to be an economically neutral activity; people realized they could extract value from linking, so link spam became a big problem and forever changed the web.

[1] https://en.wikipedia.org/wiki/Resource_curse


> The whole internet is showing the first signs of a digital Resource Curse [1] brought on by crypto mining. Crypto mining changed the economics of the digital world in such a drastic way that it is poisoning all kinds of internet interactions. It is not just about disrupting how money used to work, it is disrupting every kind of interaction, even those that had nothing to do with money. Things that were completely economically neutral before, so people did them just for fun, can now be exploited to extract value, so naturally some people do it and the previous innocent/neutral status quo is lost. For example: most free unix shell providers, a fun tradition among nerds from the 90s, had to shut down because there is now such a big economic incentive to abuse such a free service.

The whole internet is showing the first signs of a digital resource curse brought on by advertising. Advertising changed the economics of the digital world in such a drastic way that it is poisoning all kinds of internet interactions. It is disrupting every kind of interaction, even those that had nothing to do with advertising. Things that were completely economically neutral before, so people did them just for fun, can now be exploited to extract value, so naturally some people do it and the previous innocent/neutral status quo is lost. For example: most free, non-monetized content providers, a fun tradition among nerds from the 90s, don't exist now because there is such a big economic incentive to abuse such a free service.


Ads at least require an audience. Crypto mining abuse requires just the ability to execute anything.

That’s why ads are like acne for the Internet, but crypto is a cancer.


I'd argue it's the opposite.

Crypto is fairly easy to avoid as an individual - the only thing I can think of that crypto has "contaminated" is GPU prices. Otherwise, don't get involved and just watch and laugh at the dumpster fire from a safe distance. You don't have to get involved in crypto to participate in society.

Advertising on the other hand is very hard to avoid, and even blocking the ads themselves doesn't isolate you from its nasty effects such as the constant tracking and that a lot of products nowadays pivoted to be ad delivery mechanisms (try to buy a consumer-grade TV that doesn't show ads or spy on you). A lot of products & services you need to use to participate in society are involved with advertising and may sell out your data (and providing fake data might be impossible/illegal). Even some government departments (DMV, etc) do it.


Crypto is easy to avoid "for now".

The endgame for crypto, in some people's eyes, is tokenizing everything.

Video game skins become NFTs that you now "own" instead of just being in-game. So you can transfer them around, sell them on composable markets etc.

Assuming things actually get worked out, you could see a merging of digital and physical-world assets.

Whether this is a good thing, or will even happen, who knows.


> Video game skins become NFTs that you now "own" instead of just being in-game.

This use case actually doesn't make sense if you think about it. Items only make sense in the context of a game, and you trust the game makers to honor them. At that point, there's no incentive for them to actually allow you to port in/port out items.

Think about it this way: what good is the BFG from Doom in Cooking Mama?

Even in related games: what good are guns from Battlefield 2042 in, say, COD Vanguard?

Sounds pretty shit.


A more interesting example would be FromSoftware issuing a “moonlight sword” NFT (maybe for an achievement like 100%-ing Elden Ring) and various indie devs creating souls-like games choosing to honor it within their game.


I would like to be polite to you here, but it's going to be hard, because this take is wrong at every single level, and it feels like you haven't tried any sort of critical thinking whatsoever:

1) NFTs are the most expensive, least efficient way to implement the least interesting part of this technology. Even if you accept the dubious premise that they do even that job well.

2) You have explicitly described a system in which every single agreement between devs would require special testing. Is the moonlight sword balanced in the other game? Don't I still have to describe literally every aspect of the moonlight sword, other than who owns it, in the second game?

3) Skins and other microtransactions are per game, by an absolute law of their design. That is the point. A game developer has no interest in honoring a microtransaction that I can prove that I paid some other game developer for. I would want my cut.

4) The condition upon which you have proposed that people can acquire the moonlight sword is... for beating Elden Ring? So it's a sign of status. Except that it's a sign of status that I can sell, so it's actually just a signal of wealth or status, maybe? Maybe I can't sell it. But wait, why did I make it an NFT then. So I must be able to sell it, but then it is presumed to have a dollar value, so why wouldn't Fromsoft sell it in the first place. Sounds a lot like a regular microtransaction.

Which brings me to my final point.

5) The market for microtransactions is already very optimized. Videogames are already very good at extracting every penny that people are willing to pay for bullshit cosmetics and pay-to-win garbage. You can't even make money hawking this shit.


Also one final point: if 10 mutually independent devs share an NFT blockchain so the moonlight sword can be transferred between games... what stops me from minting my own moonlight sword? Or super moonlight sword? Or unauthorized Mickey Mouse hats?


But again, not only does that require trust (after all, the "moonlight sword" name, image, etc. are subject to copyright and trademark), but also, if From decides to allow that, they can just open an API.


“Is thing in wallet” strikes me as much easier to implement than n^2 api integrations across companies. It’d be easier to do the api method via drm platforms like steam at the cost of lock-in to those platforms; cross pc/console access to any item would be far from guaranteed. Outsourcing all of that logic to a centralized (ex google blockchain) or decentralized blockchain and wallet just seems like a more scalable (num of actors not tps) approach with a low barrier for entry for each individual actor.

Your comment about “trust” has little to do with the technology. It’s a cross between legal gray areas (which would need to be solved/accepted if this becomes the norm) and dogma surrounding web3. I tend to agree that this isn’t likely to happen unless From Software essentially abandoned their IP for some reason. But if that happened, it’d be a cool way for the legacy to live on in future games in a way that pays homage to the original creators. It’s a use case that makes much more sense than just copying a random gun from one game to another, because there are years of history in the item.
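For concreteness, the “is thing in wallet” check really is just a couple of standard ERC-721 reads. A minimal sketch, assuming ethers v5; the RPC URL, item contract address, and wallet address are all placeholders, not anything real:

```typescript
// Minimal "does this wallet hold the item?" check, assuming ethers v5.
// The RPC URL, contract address, and wallet address are placeholders.
import { ethers } from "ethers";

// Human-readable ABI fragment for the two standard ERC-721 read calls we need.
const erc721Abi = [
  "function balanceOf(address owner) view returns (uint256)",
  "function ownerOf(uint256 tokenId) view returns (address)",
];

async function playerOwnsItem(
  rpcUrl: string,        // any Ethereum JSON-RPC endpoint
  itemContract: string,  // hypothetical "moonlight sword" collection address
  playerWallet: string,  // the player's wallet address
  tokenId?: number,      // optional: check one specific token instead of any
): Promise<boolean> {
  const provider = new ethers.providers.JsonRpcProvider(rpcUrl);
  const contract = new ethers.Contract(itemContract, erc721Abi, provider);

  if (tokenId !== undefined) {
    // Does this exact token sit in the player's wallet?
    const owner: string = await contract.ownerOf(tokenId);
    return owner.toLowerCase() === playerWallet.toLowerCase();
  }
  // Otherwise: does the player hold any token from this collection?
  const balance = await contract.balanceOf(playerWallet); // BigNumber
  return balance.gt(0);
}
```

Each game only has to agree on which contract represents the item and run a read like this, instead of integrating with every other publisher's account API.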


> Outsourcing all of that logic to a centralized (ex google blockchain) or decentralized blockchain and wallet just seems like a more scalable (num of actors not tps) approach with a low barrier for entry for each individual actor.

I do think for this to make sense you'd have to have, you know, one blockchain everyone used otherwise you still have the "10 app stores" problem, no?


if the game itself runs via a decentralized computational model then there’s no need to appeal to authority to honor the ownership. The game itself, just like the skin, would exist on the p2p network.

We are of course very far from that happening, or knowing if it can happen, but it’s still a possibility to consider


You have to make a game that supports every item. There's no such thing as a "generic" game. Games have balance - hours are spent tweaking relative attributes of items to ensure the game is, you know, fun. For instance, what's to stop me from making an NFT nuclear launcher item on my own, then porting it in and wiping out everyone else's stuff?


MINE + CRAFT


The fact that people keep trotting out this idea about the future potential of NFT gaming is evidence that people know basically nothing about making video games or the assets that go into them. And neither do the founders of any NFT projects with the rate at which they crash and burn if they aren't outright scams.


Crypto is held back by its own stupidity & inefficiency - the system self-regulates well enough.

The only thing a blockchain is good for is assets that fully live on the blockchain for which the source of truth is the blockchain itself - like cryptocurrencies.

Putting NFTs, video game/metaverse avatars, real-world assets, etc. on a blockchain will never work; it's at best a very inconvenient, inefficient database replacement, and at worst a scam.


This is something few have realized. Crypto guarantees can only ever extend to things that are wholly represented on-chain, and that's basically just cryptocurrencies. As soon as the chain reflects the real world it's at the mercy of the correct data being entered, or reality not changing around it.

And this means that, unlike pure cryptocurrencies, backed stablecoins do not offer any of the crypto guarantees either, since they are representations of assets held in trust that can be seized.

For a system to be trustless, decentralized and permissionless it must be entirely trustless, permissionless and decentralized. It's not a gradient, it's a step function. Once anything creeps in, all guarantees are void.


Interesting take on this. Something to think about.


It's certainly better than paying companies for "licenses" we don't actually own or digital goods that amount to nothing but bits in their database. With these digital content NFTs, at least we get a real market.

I'm not sure whether these things will actually get integrated into video games or how well it's gonna work. It will certainly be interesting to watch this develop.


>Crypto is fairly easy to avoid as an individual

It's probably contributing more to inflation than people believe. Both in the demand for resources and in terms of redistribution of wealth. Although probably not as much as the new work-from-home paradigm.


Couldn't be the thousands of dollars handed out, making markets discount the dollar, could it?


Yes, that too. The printing has to catch up with them eventually.


> try to buy a consumer-grade TV that doesn't show ads or spy on you

I haven't bought a consumer TV since... well, actually, I just haven't ever actually bought one except second-hand non-smart TVs a decade ago. Is it literally that bad? i.e. does everything show ads? Can you root these machines and/or install non-ad-based apps instead?


Any consumer-grade "smart" product (whether TV or home automation) now phones home (even though it can run entirely locally on the LAN), requires an account and collects data and the (often mandatory) mobile app will typically include various advertising malware such as the Facebook SDK, analytics, etc that spy on you.

The primary objective is "growth and engagement" aka an ad delivery mechanism (or data collection mechanism for more ads down the line). The functionality of the product (if any) is merely a necessary evil (from the manufacturer's point of view) to convince the mark to buy & "engage" with the device. If they could get people to "engage" with it without any functionality they'd definitely do that instead.


Do what I do and simply disable internet access for the TV. Netflix and whatnot gets run on my PS5, which mainly advertises their PSN store, so it's less intrusive.


The only good panels now are also "smart" tvs with something like a roku or samsung spyware built in in an unavoidable way. You can't simply change inputs to avoid it because the whole OS you're changing inputs within is the smart TV's OS.


looks like there might be an emerging market for 60-inch computer monitors


You can absolutely find these, usually offered as commercial products, e.g. for menus, or billboards in airports, etc.

A buddy of mine used to work in a kebab shop before finishing his IT degree and helped out there off and on. He ordered and installed some 48" monitors (simple HD TVs) and then ordered a few for himself. I think they could take 2 inputs max, but that's fine for most purposes.


Sadly, I suspect adtech is already working to target them too. What better way to fight adblockers?


Ads no longer work online, period. The ad industry is just 10 years behind acknowledging it. Most of what people pay for is just paying for the organic traffic you'd normally get with no ads so they don't give it to someone else. It's more of a protection racket than it is a source of genuine new customers.


I have a rather different experience. Most people either don't care or don't know how. Unless you're tech savvy you're watching ads. And if you're on mobile you're watching them anyway in many cases.


People ignore ads either way these days, I mean


Many free CI services have stopped being a thing because people exploited the free compute for crypto mining. That’s directly impacted a lot of developers at least.


> Crypto is fairly easy to avoid as an individual

Web3.0 folks are trying their best to change this.


> may sell you out.

Even HR departments. Got a new job and am completely spammed constantly at my brand new work email address.


> the only thing I can think of that crypto has "contaminated" is GPU prices.

And storage. See Proof of Space and Chia.


> crypto is a cancer.

Proof of Work is a cancer. The two are heavily intertwined, but they are not technically the same thing. Proof of Work needs to be stopped, regardless of what it is used for.


PoW(hatever) is a cancer as long as the "Whatever" doesn't have any value behind it; it doesn't bring value to society because it's guaranteed to take resources without leaving anything useful behind. It incentivizes people to abuse anything at hand to obtain proof of that otherwise worthless "Whatever". Whether it's "work" or "space" or "time", someone will always look to abuse others to obtain more proof without it creating any value.

Unlike money (generically) PoWhatever cryptocurrencies aren't backed by anything anyone can define as valuable. Proof of Housing_built, Proof of Goods_transported, Proof of Music_created, Proof of Space_flights all create real value. Proof of Work where the work is without doubt useless in itself and an absolute net loss for society can't result in anything practically valuable. At best it results in a very wasteful Ponzi scheme.

Imagine something as innocent as "proof of clapping hands", you may get some reward for holding a record but it's otherwise completely useless. That at least only affects you and your hands, even if a waste of time that doesn't seem to bring anything of value - joy or other practical benefits. What if someone could abuse you and force you to clap at the cost of lost productivity at work, health issues, etc. and the result of that work is... nothing?


The supposed usefulness of the work sacrificed in proof-of-work is to prevent Sybil attacks on the network.

An attacker can spawn millions of virtual nodes, yet they will never hijack consensus because of the objectively-measurable resource sacrifice.

While there may be too much reward for this sacrifice currently, the idea is a sound one.
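For the flavor of it, here is a toy hash-puzzle sketch of that "objectively-measurable sacrifice" (plain SHA-256 in Node/TypeScript; a simplified difficulty rule, not any real coin's consensus code):

```typescript
import { createHash } from "crypto";

// SHA-256 of a string, as a hex digest.
function sha256Hex(input: string): string {
  return createHash("sha256").update(input).digest("hex");
}

// Find a nonce so that sha256(data + nonce) starts with `difficulty` zero hex digits.
function mine(data: string, difficulty: number): { nonce: number; hash: string } {
  const target = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const hash = sha256Hex(data + nonce);
    if (hash.startsWith(target)) return { nonce, hash };
  }
}

// Verification is a single hash - cheap for everyone else on the network.
function verify(data: string, nonce: number, difficulty: number): boolean {
  return sha256Hex(data + nonce).startsWith("0".repeat(difficulty));
}

const { nonce, hash } = mine("block payload", 5); // ~16^5 ≈ 1M hashes on average
console.log(nonce, hash, verify("block payload", nonce, 5));
```

Producing the nonce costs an expected ~16^difficulty hashes while checking it costs one, which is what makes fake identities expensive and verification cheap.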


> The supposed usefulness of the work sacrificed in proof-of-work is to prevent Sybil attacks

The usefulness of PoW is to reinforce a system that produces something. Presumably the higher the value of that something, the more you're willing to waste enabling it. If you're burning through an estimated 110TWh of electricity per year (as of last year, one Netherland's worth) enabling that system, you should be able to show one Netherland's worth of value being created.

> While there may be too much reward for this sacrifice currently

You probably meant it the other way around, too much sacrifice for the reward. Wouldn't the concept of creating value imply that net value was created?

> the idea is a sound one.

In theory many ideas are, but the implementation matters. So in practice you'd be hard pressed to show the net value. And I'm not talking about showing me that 1 BTC is worth $50,000, any more than you can say a defunct company is worth billions because someone was doing a pump and dump.


> You probably meant it the other way around, too much sacrifice for the reward. Wouldn't the concept of creating value imply that net value was created?

Without the reward, there would be no sacrifice. The reward drives the sacrifice, and it is strictly higher than the sacrifice, otherwise people would not sacrifice in order to lose money and waste resources.

Unfortunately, right now the value created is less than the sacrifice (and I agree with you), which is in turn less than the reward (see the next point).

> So in practice you'd be hard pressed to show the net value.

The Bitcoin transaction fees are higher than $300k/day [1]. So, at least that much value is provided to those transacting, otherwise they would not transact.

Unfortunately, the current implementation still adds a hefty (100x larger) inflationary reward on top - so miners make upwards of $30M/day. [2]

While such a ratio may have been useful in the early days, now that virtually everyone interested knows about Bitcoin, I argue (and agree with you) that this inflationary reward is excessive.

In addition, I argue that the resource sacrifice is excessive because this inflationary reward is excessive. Had the creator(s) of Bitcoin predicted the meteoric rise of the price, they would have diminished this reward more quickly than the current schedule. [3]

Maybe someone could fork a "Bitcoin Green" or whatnot, with the difference that the reward schedule is sped up, and hopefully it would gather a meaningful amount of hash power to prevent 51% attacks [4].

[1] - https://www.blockchain.com/charts/transaction-fees-usd

[2] - https://www.blockchain.com/charts/miners-revenue

[3] - https://en.bitcoin.it/wiki/Controlled_supply

[4] - https://www.crypto51.app/
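As a rough illustration of the controlled-supply schedule in [3] (the interval and initial subsidy are the protocol constants; the sketch ignores satoshi-level integer rounding):

```typescript
// Block subsidy starts at 50 BTC and halves every 210,000 blocks (~4 years).
const HALVING_INTERVAL = 210_000;
const INITIAL_SUBSIDY_BTC = 50;

function blockSubsidy(height: number): number {
  const halvings = Math.floor(height / HALVING_INTERVAL);
  return halvings >= 64 ? 0 : INITIAL_SUBSIDY_BTC / 2 ** halvings;
}

// Summing the schedule shows why total supply approaches ~21 million BTC.
let total = 0;
for (let era = 0; era < 64; era++) {
  total += blockSubsidy(era * HALVING_INTERVAL) * HALVING_INTERVAL;
}
console.log(blockSubsidy(720_000), total); // 6.25 BTC per block in early 2022, ~21M total
```

Speeding the schedule up, as suggested above, would just mean a shorter HALVING_INTERVAL, shrinking the inflationary part of miner revenue sooner.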


But at the end of the day, you can only imagine proof of clapping hands because yes that example is useless.

But proof of work is not useless, hence we don't have to imagine it. Distributed consensus is useful to many people at this point.


> you can only imagine proof of clapping hands because yes that example is useless.

People do it all the time, it must have something. But no practical value. But let's not argue for the sake of arguing. It's like saying Ponzi schemes are good otherwise so many people wouldn't be into them. Or any example of things so many people do that have no value and we'd really rather they stopped.

Proof of Whatever is just about gobbling up some resource (sometimes with incredibly high cost for society in general) for some artificially driven value but of no practical positive consequences to almost anyone in the world. Except for the few that are incentivized to make it look more valuable because they've invested and its actual value never changed. It's still a solution waiting for a problem.

And before you argue more, the computer and internet you're using to post your opinions were created with far less waste and provide far more value. If they go away tomorrow the world will suffer a lot, society will literally regress decades. If proof of waste goes away the "investors" in the pyramid will suffer for being tricked and the rest of the world wouldn't bat an eye.


How many people? How useful? What is the distribution of that usefulness amongst those people? At what cost does this usefulness stop being useful?


There is one non-profit token based on BOINC that uses scientific computation as PoW. Their self-awareness with regards to their influence is impressive.


Gridcoin (the cryptocurrency you're referring to) doesn't actually use the scientific computation as proof-of-work. It's built on proof-of-stake. It just also happens to have minting set up so new coins go to people doing scientific computation, in a process that superficially resembles proof-of-work mining. The paying-people-to-do-scientific-computation part is purely something extra and isn't strictly necessary in its design to make it work as a cryptocurrency.

(It's kind of splitting hairs, but I see gridcoin brought up a lot by people arguing stuff like "Proof of stake isn't the answer. Proof of work is still good, we just need more useful proof of work" which is a big misunderstanding of it. Useful scientific computation generally doesn't work as proof-of-work and is not an example of it even in the context of Gridcoin. Only specific kinds of computation can work for proof-of-work, and the computation being useful in other contexts actually harms its usefulness for proof-of-work because it can allow the computation for 51% attacks to be subsidized by the other uses of the computation.)


Point taken. It could be argued that duplication of effort is proof of work, but perhaps that is splitting hairs.


We had the promise that this would change many years ago, I doubt it will change for many years to come.


As of right now the Ethereum foundation are still projecting ~Q2 2022 for full Proof of Stake.


When working with hard/physical assets, one needs no proof of anything. The existence of the asset is the proof.

To me, there needs to be an effective counterbalance to the need for "proof" in the first place, otherwise we are simply prolonging the petrodollar's capability to exist in a world that is clearly quite ready to abandon it.


> When working with hard/physical assets, one needs no proof of anything. The existence of the asset is the proof.

Counterpoint: Real Property


No, rampant transactionalization/financialization is the cancer. Rent-seeking leeches trying to extract money out of previously neutral activity, growing unchecked. The very definition of cancer.


This is exactly wrong. Crypto is fairly easy to make worthless. Ads are impossible to make worthless.


This comment chain made me envision a potentially terrible future direction.

Imagine a future where websites force people to consent to letting the site use their device to mine cryptocurrency while they use its services or browse its content, instead of showing ads.


Interesting tidbit: a few years back I encountered a Minecraft server with a dynamic map on a website. After checking it out for about 5 minutes I noticed some weird CPU usage so I asked the server admins.

They apparently came to an agreement with the players that, to support the server, a browser miner would be run on the dynamic map website.


I would like to get chunks off of a community grid like Nunet... There is a mod that adds LODs these days, CPU-intensive, so with a nice view the real estate does become a little valuable doesn't it? :)


I'd prefer that to ads.

With crypto mining I can buy my way out of it by having powerful hardware, or - in a hypothetical future where this practice is mainstream - offloading the mining to a separate machine so as not to affect my mobile device's CPU. Crypto miners have a defined price: give me an average CPU's hashrate while you stay on the page and that's it - so if you pay that you are good.

With ads, buying your way out is impossible - by attempting to do so you just become an even bigger target. There is no defined price for it that you can pay to "opt out".


>With ads, buying your way out is impossible

I disagree. It's just not in its final form yet. But take a peek:

Hulu: 6.99 with ads, 12.99 without

Youtube: Free with ads, 11.99 without (or bundled with another google paid service, like Fi)

HBO: 9.99 with ads, 14.99 without

---

It's coming - if you have money to throw at the problem, you can already buy your way out of a lot of advertising.

If you have time and a bit of technical savvy - setting up a pihole or uBlock Origin incurs a cost in time, but also removes ads.

At least with ads the cost is fairly clear, and it's definite. With mining... it's mostly malicious and will consume as much CPU as it can. That actually taxes me more for having a nice CPU.


You can buy your way out of ads in some places, but you can never buy your way out of the invasive tracking.



"Imagine a future where websites force people to consent "

I imagine a future where people just do not visit sites whose conditions they do not agree to.

But like others pointed out, those sites have already existed for quite a while.


How do you even know what the terms and conditions are without visiting the site? This whole "by using the site you agree" thing is completely stupid.


Isn't that the Brave browser?


No, Brave doesn't do that. Its default behavior doesn't use crypto; its optional behavior theoretically lets you earn BAT, but not via processing cycles.


Brave has its Basic Attention Token that they give users in exchange for sending notifications with ads in them. They literally pay users for their attention. Doesn't consume any CPU.

I think all advertising is inherently bad but this is the least bad form of advertising I've ever seen. Not only are users compensated, they can actually turn off the ads in the settings and simply use the browser normally, and it has a built-in adblocker even on mobile.


This was a reality a few years ago (in particular, there was a very easy to use Monero cryptominer that ran in the browser) and then every major ad blocker and many major browser distributions blocked crypto mining and it's basically become impractical for the most part. PirateBay did it for a while but I believe they have stopped, you really don't see it anymore.


Frankly I would pick that over ads any day.


Can you mine crypto in the browser? I know there are some native hooks, and presumably JavaScript is never going to give a good hash rate, but it would be interesting if sites started dropping JS tags that silently mined crypto in the background.


WASM Monero miners were a big problem years ago.


Certainly.

https://github.com/jtgrassie/xmr-wasm

Individually hash rate will be quite poor but if you make a mining pool out of all your users everything changes. Monero mining pools even have rules prohibiting that.

https://minexmr.com/miningguide

> Botnet / Web Mining Policy

> Sadly it happens that Botnets often run cryptominers on infected PCs.

> minexmr does not support botnets, web miners or any other illegal activities.

> Any account suspected of botnet mining or web mining will be banned.
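For what it's worth, the mechanics are mundane: a page only needs standard browser APIs to run a compute-heavy WASM module off the main thread. A hypothetical skeleton (the module URL and its hash_round export are made-up placeholders, not xmr-wasm's actual interface):

```typescript
// worker.ts - runs inside a Web Worker so the page stays responsive while the
// CPU burns. The page side just does: new Worker("worker.js").
async function startMining(): Promise<void> {
  // Fetch and compile the (placeholder) mining module.
  const { instance } = await WebAssembly.instantiateStreaming(fetch("/miner.wasm"));
  const hashRound = instance.exports.hash_round as (seed: number) => number;

  let seed = Date.now();
  for (;;) {
    // Each iteration does a chunk of hashing inside the WASM module.
    seed = hashRound(seed);
    // A real miner would postMessage() results back to the page for submission
    // to a pool; here we just yield so the worker can still receive messages.
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
}

startMining();
```

Which is also why blocking it largely happened at the network layer: filter lists simply stopped the fetch of the known miner scripts and pool endpoints.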


Oh certainly. Back in 2018 I found my favorite site to read online comics was turning my phone into a hot plate before I blocked JS.


> because there is such a big economic incentive to abuse such free service.

On my free personal blog, I have nothing for sale, and nobody discovers it, in spite of the fact that those who do read it enjoy it.

Another problem of advertising oligopolies monetizing everything is, if you don't pay them, you are invisible.


You should read their second paragraph.

Unless you were going for some kind of meta-joke/correction, in which case bravo.


It was definitely a serious-toned joke. Bravo from me as well. I don't know how people navigate the web without an ad blocker anymore. Do people actually do that? it's unbearable. Slows pages down, clutters the actual content, wastes more battery on phones and laptops, generally rewards companies for spying on you... it's awful.

Can we go back to when people weren't so lazy as to do everything in their power to not have to give their cash directly to someone? Sure, you don't have to hand your money to a content maker on YouTube anymore, for example, but I wonder how much the lifespan of your device is decreased by the added processing that comes from advertising, and how much that adds up in the end. You may as well just have paid them money directly in the first place.

Apply that thought to any "free-but-only-because-it's-ad-ridden" software/service.


Interesting point of comparison. The much bigger cost of ads is probably the attention loss. We don't have much attention capacity, and ads increasingly siphon it. I'd guesstimate that the attention loss for an average person is worth an order of magnitude more than the extra CPU wear and bandwidth costs required to run ads. But as with your comparison, most users aren't even realising they are paying a significant mental cost by accepting those ads.


It's not just a mental cost - someone has to pay for those ads and all the machinery & various middlemen that make it possible. That someone is all of us; the prices we pay for most goods include the overhead of polluting society with ads. You end up paying a double tax - a monetary one and an attention one.


agreed. it's multiple costs. and we all pay for that waste and the long term consequences of it.


But the example is untrue. The nerds are still hosting free content.


It's incredible how much people can talk about capitalism without mentioning it. It's all about the trends in capital, but never about the general mode of capital. Which I do get to some extent. It does make sense to talk about trends, like advertising swallowing up all aspects of media, but critique can never be made on trends alone. Trends don't dictate processes; trends are the result of processes. If you look to explain this from the bottom up you'll hit a wall of individual decision making. But from the top down you see the influence behind those decisions: it's an economic choice that is necessitated to continue the process of capitalist accumulation in a corporation, country or even market.

It's not only crypto or advertising, it's the process of capital using technologies to radically transform its production process in search of more profits. Be it by established corporations or groups of crypto miners. This mode of production will do this to any technology that can advance the generation of capital. This is the core of capitalism, the revolutionizing of production, not for humanity's sake, but for profit's sake.

To settle on a single manifestation of this mode as its cause is to wear something like horse blinders and all conversations on the topic will be as narrow as the topic you want to hate on the most. And capital will keep chugging and keep using technology for its own sake at everyone's expense.


Agreed, most of the world's ills could be mitigated with an extreme tax on digital advertising.


As Dan "FoldableHuman" Olson put it [1]:

> The end goal of the crypto machine is the financialization of everything. Any benefits of digital uniqueness are a quirk, a necessary precondition of turning everything into stocks.

[1] https://twitter.com/FoldableHuman/status/1465835083542061059


Both of your examples (cryptocurrency and paid linking) are indeed plagues on quality content, but I disagree that they’ve replaced quality content.

They’re mostly examples of new business models that people have tried to use to generate new types of content and monetization in a crowded space. Sites like HN and commenters here aren’t trading links for money, we’re just sharing interesting content. The obvious paid advertisements (going from 0 votes to +20 votes in minutes despite obvious content marketing) get flagged away quickly for the most part.

Crypto and “web3” especially has become a catch-all for people who want to catch a gold rush or have otherwise run out of ideas for traditional business success. The problem they’re discovering is that the average consumer has almost no interest in crypto unless it’s as a speculative investment to flip to someone else, which means the entire space is basically crypto people flipping things to each other and trying to convince new people to join in so they have more downstream people to flip their tokens to.

But despite the constant efforts to flood our news feeds with crypto stories, most of us navigate the internet without cryptocurrencies or NFTs because they’re entirely unnecessary and you have to go out of your way to do anything with them. Since they bring no actual benefit, we just ignore it. And it’s fine.


It's what happens when you throw a pyramid scheme into something - just like with pyramid/mlm schemes in real life, suddenly everything is about optimizing for the scheme. Every conversation becomes a sales pitch.


That's a good way to summarize the problems that traditional game-devs foresee with NFT games. Game purchases are viewed as speculative investments rather than entertainment.


Devil's advocate here, but crypto does disrupt a free sharing economy that's on the internet and part of the magic of it.

Yet maybe there are benefits in terms of making previously economically inviable things viable. People making passion projects are great, but at the end of the day they are creating value for people that's almost never reciprocated in other ways, and tokenizing things lets them capture that value or add new value to what they're selling.

E.g. with NFTs right now you have three parties: the buyer, who gets provenance ("ownership"); the digital artist, who makes more money off their art than was possible before because rarity is built into the price now (non-fungible, unlike selling prints, t-shirts, etc.); and everyone else, who can still copy and share the work for free, as before.

I feel like this should be a growth engine for digital artists, that has minimal impact on the general public, and everyone walks away happy with the transaction. (Putting aside scams b/c the market is unregulated, environmental concerns, etc. just talking economics.)


Putting aside the scams, environmental concerns, and lack of regulation is a big caveat though. They are inherent costs to the crypto space that currently lack an immediate resolution (to my knowledge).

How much "previously economically inviable things" are captured by NFTs and crypto anyways? I have not seen any potential usescases that outweigh even the inherent environmental cost of cryptocurrency.


The resource curse is that natural resources go together with oppressive government: the former can promote the latter in that an oppressor needs some form of wealth to pay their minions/allies, and international sales of extracted resources are a source of wealth that makes fewer demands on widespread cooperation from the locals than does local development.

How does this map to crypto? Who's the oppressive government we're getting more of?

It's correct that crypto changes the economics, and I'd say that's a good thing. We've tried leaving economics out of our core means of cooperation online for a few decades, and we don't seem very happy with the feudal internet that that led to. Let's try expanding the range of ways to coordinate and to fund work, with e.g. https://gitcoin.co/


I'm sure many people would agree with your diagnosis. What's the treatment, though?


Time.

Crypto is a speculative digital commodity bubble. We haven't seen a real bear market yet. Gold took 7 years from 1973, when you could own gold again, to the 1980 peak. All the increase in gold mining led to a 20-year bear market.

The crypto narrative is fundamentally unsustainable. It doesn't just have to beat the S&P 500's returns; people need to believe they are going to get rich. That November 2021 peak is going to be tough to take out. That is looking like the peak of the everything bubble from COVID. The get-rich narrative can only hold for so long until the reality sets in that crypto is a set of volatile currencies no one really uses and that has been massively over-mined for years now.


2017 wasn't a real bear market? It dropped like 95% and stayed at 50% to 80% below peak for a couple of years..


How do you define bear market for crypto? -90% is just another year (or even month) in crypto space.


I mean, I'd define it, much like any other bear market, as a sustained downward trend in the market.


Ban crypto mining. Outlaw exchanging cryptocurrencies (directly) to USD (and other fiat currencies, but USD is the big one for now).


It seems like it’s happening indirectly. I keep hearing stories about places like Coinbase not converting dirty Bitcoin that’s been associated with illegal activity.

Eventually, it seems like this will apply to all of it.


How much would it cost to purposefully contaminate a significant portion of Bitcoin? Just, you know, for reasons.


I don't think banning fiat conversion would stop this.

There are bonds and stablecoins issued that are pegged to the DeFi or crypto world.

It would reduce the value but it's unstoppable.

See fei: https://fei.money/

Proof of work or hardware-intensive proofs are only used in a few cryptocurrencies, so this is a very blunt response with regard to banning exchange.


Have you seen what happened in the finance world with respect to Russia recently? Feels to me like it is perfectly possible with the right Zeitgeist. Pretty much all the on/off ramps for Russian money are gone at the moment.


Which I acknowledge above. You can ban exchange to fiat but it will only reduce value, not completely eliminate it as there are protocols which derive value from defi ecosystem itself.

I.e. people will start directly using these currencies for payments (which they already do) without exchanging to fiat. The stability is maintained by the DeFi collateral and shared acceptance.

So you would have to ban everyone from accepting cryptocurrency and so on as payment, but that is a very catastrophic decision with large implications.

It also doesn't help that crypto completely eliminates centralized authorities when moving across borders.


> people will start directly using these currencies for payments (which they already do) without exchanging to fiat

Without fiat exchange, people can only accept crypto as payment in an economically feasible manner up to the rate at which they have expenses/outflows that can be paid in crypto. The price will only be able to stabilize at a level that is orders of magnitude lower than today. How much lower exactly depends on the health of the black market for exchanging with fiat.


So let's assume crypto is banned. Why is fiat much better? What about another 2008? What about Quantitative Easing?

It's not solving much.


Because it's not speculative garbage like literally all crypto is.


Russia is a bit more complicated... half their economy is oil. The chance of import bans and Russia nationalizing oil companies is partly priced in.


Time.

Time will make crypto boring like every other currency/commodity. No one gets all excited about corn futures.


I think the most straightforward thing would be to regulate mining operations as money transmitters and smart contracts as derivatives. Several jurisdictions have made noises about doing this; it seems to be more a matter of political will than any fundamental difficulty.


Although I agree that post-crypto incentives have changed, it's a bit much to say that it has "completely changed the digital universe"; this group is targeting the primary developer of the GPUs used for mining.

In terms of the ask, I think they are being much cleverer than dumb money, as unlocking GPU mining further could create new areas of economic power, e.g. reselling old GPUs on eBay, or pushing NVIDIA to design a specific crypto GPU?

Either way capitalism is what's driving the team.


Artificial limits on bandwidth and energy could be one way to both foster innovation and decrease resource usage.


We just saw that those artificial limits (Lite Hash Rate) fostered the "innovation" of hackers attacking Nvidia, eh?


The curse is crypto tokens and NFTs, not Bitcoin or proof-of-work. As we have seen, most of these tokens are just speculative schemes without connection to the underlying property.

On the contrary, Bitcoin, as a seamless Internet-native currency, can actually help make the Internet better, by allowing easier and better ways to monetize content instead of relying on advertising. In the coming years, we'll see more Bitcoin-based services popping up.


I have a dream of solving this with extreme-infungibility of a currency system.

I’m sure every ancient civilization noticed this problem when the ease of trade that came with a common currency created an incentive for foreign lands to conquer and tax them.


Those who had gold but no army, soon had neither. ISTM gold and other substances of "concentrated value" are inherently more dangerous than other things that ancient people could have stored, like lumber or grain.


Can we please just jump to the end and make it illegal to transfer Crypto to Fiat?


I agree with you overall - this does appear to be another resource curse - but I want to point out that the culprits in this matter are the companies that bottleneck the overly-abundant resources for profit (so Nvidia and Google), not the users who attempt to bleed constrained value through that bottleneck. Nvidia used their near-monopolistic power over the graphics card market in an attempt to globally limit the hash rate of mining. I doubt their intent was as altruistic as they spun it either - it seems very obviously a move to force miners to use more GPUs in parallel, to circumvent the limits in each single GPU. I find that to be more concerning than some hackers demanding that they stop.


> it seems very obviously a move to force miners to use more GPUs in parallel

This is very obviously wrong. The economics of crypto mining are almost entirely dominated by power cost. Nvidia's decision here simply means they will use other hardware, not more of the LHR hardware.

Nvidia already can't make enough GPUs to satisfy demand. Why would they try to increase undesirable demand (which is going away soon anyway, at least for ETH)?


I'm not sure Nvidia actually sees mining as 'undesirable demand'... it sells cards, makes them money. They wouldn't give a shit if people were buying them to use as firewood.


If that's the case, then I stand corrected, but it does not change my point that a single company being able to bottleneck a resource is problematic.


The resources are bottlenecked due to the chip shortage, which happened because the entire automotive industry mis-forecast demand for the pandemic. All the GPU manufacturers have the same problem: they don't want to build GPU production lines for a market that is going away. The problem isn't Nvidia, it's proof of work, which will bottleneck as many resources as it can by design.

This is what happens when you try and build a planetary messaging system with inefficiency as a design goal.


You're likely not going to like this, but you agreeing with their reasoning for bottle-necking a resource does not make it any less problematic that they have the power to do so.


Imagine that you're a vendor selling bread, and you have people who need to eat, and a customer comes and asks for a year's supply of bread so they can have a bread bonfire as a sacrifice to their god so their god can take over the world. You're going to do everything in your power to avoid selling bread to those people. Because they are going to have their bonfire, their god will not appear, and meanwhile everyone will starve.

Nvidia doesn't unilaterally have this power - they are making the same rational choice as all the other GPU vendors, they are just doing it somewhat more forcefully.


I’m not sure Google can be blamed for the bottleneck since user behavior says that people will only look through so many search results before moving on. That’s not abundance since no matter how many results Google shows the people problem will remain.

Advertisers and marketers fighting for top positions are where the real curse comes from IMO.


> So, NVIDIA, the choice is yours! Either: Officially make current and all future drivers for all cards open source, while keeping the Verilog and chipset trade secrets... well, secret OR Not make the drivers open source, making us release the entire silicon chip files so that everyone not only knows your driver's secrets, but also your most closely-guarded trade secrets for graphics and computer chipsets too!

Interesting. Someone got really fed up with them. I don't think their binary blob ever made them any friends.


No single piece of software has wasted more of my time than Nvidia's drivers, mostly (though not exclusively) on Linux. They've rendered my OS unbootable so many times over the years. So many times I've spent whole days of my life troubleshooting, upgrading, downgrading, configuring, rebooting. Then often reinstalling the OS after their installers mess stuff up in ways that are impossible to even know until they pop up and screw you later on.

I don't condone cybercrime but man would I just be ecstatic if Nvidia would finally follow AMD and Intel in developing an open source driver.


Linus Torvalds directly addressing NVIDIA (worth the click: it's only 17 seconds):

https://youtu.be/_36yNWw_07g


Is it the video that I think it is?

Yeah it is~ ^^


Nvidia license taints kernel.


Oh, tell me about it.

+ Installs Linux, this time determined to make it my daily driver. Why is this so slow though?

- You need to install Nvidia drivers

+ Oh, OK makes sense.

-- INSTALLS NVIDIA DRIVERS --

-- LINUX NO LONGER BOOTS, OBSCURE ERROR MESSAGE, FURIOUSLY GOOGLING ON A TINY PHONE SCREEN TRYING TO RESOLVE THE ISSUE --

Later I made myself a Hackintosh, and eventually bought an actual Mac. The Hackintosh stuff was much, much more stable than anything Nvidia ever released when I was trying to use Linux. Installing patched kext files and trying to make macOS run at full resolution and with smooth animations on unsupported hardware was much less pain than dealing with official Nvidia drivers on Linux.

It's sad to hear that graphics drivers are still not a solved problem on Linux.


> It's sad to hear that graphics drivers are still not a solved problem on Linux.

They are, Intel and AMD have shown that it can be done and it can be done excellently. NVIDIA just decided they do not want to be part of the solution.


Absolutely. On my last NVIDIA device, I ended up using Nouveau (luckily enough my specific card did not have firmware signing - NV110). There were absolutely no crashes, but some games did become unplayable (incomplete features). https://nouveau.freedesktop.org/FeatureMatrix.html

Ever since, I have only bought CPU-integrated graphics (an Intel desktop and an AMD laptop for my cousin). Proud to say I have never sponsored the mandatory signed firmware devices.


My son wanted a "gaming PC" and I gave in with the only stipulation being that it had to be Linux. He plays Minecraft and Cities: Skylines, so it's all good. The Nouveau drivers suck for this. It's a ~3-year old i7 PC with a 1660 GPU. Maybe that's the difference? Switching from Nouveau to the Ubuntu-provided recommended Nvidia driver (510.XX?) gives you a 200-300% improvement!


> Maybe that's the difference?

The difference is that free drivers don't have access to changing the GPU frequency in newer cards. Newer means anything above NV110, that is, 900-series.


Thanks for the info!


Certainly the Nouveau drivers are slower. I sacrificed both speed and capability for software freedom.

I don't know about Minecraft, but Minetest runs well for me on Nouveau, on a 7-year old i7 with a GTX 850M.

I tried RimWorld and Factorio, but they were unplayable due to I believe missing OpenGL features.

I have not tried Cities: Skylines.


Pedant nit: Intel and AMD CPUs also both have mandatory signed firmware with higher privilege levels[0] than Linux. Blame enterprises that want owner-operated backdoors into their own system.

It's still not as egregious as Nvidia's though, which is specifically designed as a defeat device to frustrate the use of third-party drivers. You at least don't need ME/PSP access to boot into a Free OS.

[0] ME/PSP are technically separate processors, but they have full control over the x86 cores. Their firmware can be minimized but they are integral to the boot process and enforce code signing on the BIOS. Speaking of, the BIOS gets to load into SMM, which has been around since the 386 and runs above both ring 0 and hypervisors.


I’m trying to be part of the solution by buying an AMD graphics card for my Linux box.


Vote with your wallet, very nice!


The problem with non-nvidia gpus is not having access to CUDA. Lots of productivity software target Nvidia for gpgpu stuff as well. As much as I love linux, if I built a gaming pc I'd swallow my pride and install windows with WSL and call it a day. I long for the day when AMD really invests in an open source alternative to CUDA and doesn't just abandon it like they did with ROCm.


While I hate unreliable drivers as much as the next guy, this is not just an Nvidia problem. Several Linux kernel developers love to break APIs just to mess with closed-source GPU drivers, e.g. replacing an API for no reason other than hiding a few indispensable functions behind a GPL-only macro. This causes nothing but avoidable grief for users and the driver maintainers.

It’s really interesting to see the difference between Linux and FreeBSD (which doesn’t break kernel APIs for shits and giggles). The damn Nvidia kernel module is still a bloated closed-source blob, but not once in >10 years did it break. The largest problem I had with this driver under FreeBSD was after the adoption of KMS, because Linux locked away the new memory management API required for efficient tearing-free frame swapping behind GPL-only macros, resulting in massive tearing (there were workarounds, like using a vsync-enabled compositor, adding latency and wasting enough power to cost me 30-60 min of battery runtime).


> It's sad to hear that graphics drivers are still not a solved problem on Linux.

This is not a Linux (or BSD) problem but purely an Nvidia issue. After my last issue with an embedded Nvidia GPU on my motherboard (~10 years ago), I stay far away from Nvidia; I will never buy anything remotely associated with Nvidia until they open source their drivers. I knew someone who worked there as an engineer and told him that; he just smiled, both of us knowing it is really beyond his control.

This is 2022, we should be well beyond the point of worrying about Video Hardware in Linux/BSD.


Interesting. I’ve not used Linux as my main driver for years. I miss it and I want to come back, but I basically don’t have any decent PC anymore, so I have the freedom to get/build one.

When I used it, ironically, NVidia was the way to go on Linux and ATI/AMD was a shit show. Is AMD OK nowadays? My needs are mostly comfortable casual gaming where I don’t care about having more than 60fps, and I don’t play online (so the anti-cheat situation doesn’t really bother me other than for ideological issues).


I never had any problems with the NVidia driver either, and I can't see how it would prevent the system from booting. (I mean, it's not linked with the kernel and not set to load the module until X starts, so you can get to a single user mode console or SSH in if X breaks. Maybe people think X and booting are the same thing?)

The only complaint I had with NVidia was that a long time ago I had one of the first consumer-grade 4k displays, and it exposed itself as 2 1920x2160 displays. That all worked ... "great". X thought the monitor was two displays, and so anything that cares about the calculation of display boundaries (say, full screen video), didn't work correctly; it would just show on half the monitor. One day I found that a config parsing bug could cause the xrandr extension to be disabled while keeping everything else working, and then full screen worked. (The options to just lie to xrandr or disable it were of course broken, which is why bugs interacting with bugs was the fix. They never fixed the bugs or the bugged bugs, so shrug I guess.) It took me months of pulling out my hair to find that workaround, and having used X since before xrandr was invented it drove me crazy. In a past life, I also pulled out all my hair to get xrandr to work. It never fucking worked. And then the one time you don't want it to work, it works and can't be made not to work. Argh!!! Just send me back to 2000 please!

(Since then I've switched to Windows, and I still have that monitor. It works perfectly under Windows. You wouldn't even know that it uses the hacky Displayport MST stuff. Probably have a nice hack hard-coded right into the driver, and if it doesn't activate for some reason, you're completely screwed.)


You're right, I suspect, as I have had this problem myself; it is bootable, but it will break the X server in such a way that I can't even change to a text-only terminal to fix it. So I have to go into single-user mode, uninstall and reinstall the graphics driver, then reboot, and it magically works again. I think it is the installer breaking if there is an old copy installed already, rather than a bug in the actual driver itself. This happens every 3-4 months on my Ubuntu gaming laptop.


Despite my complaints, for casual gaming on a desktop you shouldn't have too many issues with Nvidia's drivers these days as long as you can use the ones packaged by your distro instead of Nvidia's installer. My troubles have mostly been associated with laptops that have both Intel and Nvidia graphics, or desktops used for ML training where the distro-packaged drivers wouldn't work for one reason or another.

I've heard AMD's drivers are OK these days but haven't tried them myself because I care about ML.


>I've heard AMD's drivers are OK these days but haven't tried them myself because I care about ML.

They are OK but with some caveats, like VP9 HW decode not being available in Linux AMD iGPU drivers for reasons. That won't be a problem if you're running an AMD desktop tower PC, but if you're gonna watch Youtube on your AMD laptop, your battery will take a hit along with possible extra fan noise, which is unacceptable for a great media content consumption experience in Linux when even no-name Android phones and tablets have VP9 HW decode in Youtube out of the box. No bueno.

This is such a mess, as VP9 & AV1 HW video decode is not an issue under Linux for Intel iGPUs, but AMD seems to not give a shit about exposing VP9 decode on Linux for their iGPUs. Also, unlike with Intel iGPUs, under Linux radeontop does not show GPU decode usage for easy diagnostic purposes. I assume it's because AMD bought the video codec engine IP from some third-party fabless IP vendor and they don't have permission to open source the drivers for the video engine block as part of the licensing agreement, like they do for their in-house developed GPU block. But still, regardless of the reason, it sucks for the Linux end-users.

And another important issue nobody seems to talk about, but that one should be aware of: AMD APUs (at least up to the 5000 series) don't have unified memory between the CPU and iGPU like Intel and Apple do, so you have to configure in your laptop BIOS how much of your system RAM you want allocated to the iGPU (512MB/1GB/2GB/4GB), with the rest remaining for the CPU. This sucks big time if you're switching between productivity work, where you need more memory for applications, and video games, where you want as much memory as possible for the GPU, because you have to keep rebooting and changing the iGPU memory allocation in the BIOS. On Intel chips the whole system memory is unified and shared dynamically between the CPU and iGPU. It boils my blood, because the whole marketing gimmick of AMD's APUs was the seamless merger of CPU and GPU resources, while in practice their (5000 series) APUs are just a Ryzen CPU with a Vega GPU glued onto the same die without actually sharing resources properly. Maybe the new 6000 series with RDNA2 changed that, but I couldn't find any detailed info.

Honestly, as a laptop owner with an AMD APU: if you're planning to go Linux, just get one with an Intel chip instead. Far fewer headaches, as Intel's Linux drivers and apps are second to none and the chips have a unified memory model.

The performance and efficiency of AMD APUs is still great though, especially under Windows, but the overall implementation as a whole package seems poorly thought out and rushed out the door, especially under Linux compared to Intel.


Isn't this some distro-specific problem?

VP9 should be available since Mesa 18.1 (or so) for VCN-based GPUs (i.e. Raven Ridge and newer). VP9 is not supported in hardware on UVD-based GPUs (i.e. discrete Vega).

VCN 3.0-based GPUs (i.e. RDNA2, the 6000-series APUs) also support AV1 decoding.


>VP9 should be available since Mesa 18.1 (or so) for VCN-based GPUs (i.e. Raven Ridge and newer).

Yeah, in theory it's available, but good luck actually getting your browser to use it to HW decode YouTube videos under linux out of the box.

It usually works fine with video players like mplayer but not in browsers.

There are endless threads online about this not working. I, for one, never managed to get it to work in any browser. What a mess.


> It usually works fine with video players like mplayer but not in browsers.

Chrome doesn't support VA-API, so do not bother trying.

Some distributions do have Chromium builds with VA-API support enabled, but YMMV. Chromium uses X11 only and libva requires DRI2, so it doesn't work under XWayland, which supports only DRI3. (=> Chromium with VA-API works only in a native X11 session.)

Firefox does support VA-API. There was a period of time when it did only under Wayland, but nowadays both X11 and Wayland should work. When running under Wayland, use the native version and /not/ XWayland, because the same caveat as with Chromium applies.
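If you want to sanity-check what your VA-API stack actually advertises before blaming the browser, something like this works (a rough sketch; it assumes the vainfo tool from libva-utils is installed, and the profile names are the standard libva ones):

    # Sketch: list VA-API decode entrypoints and check for VP9/AV1 support.
    # Assumes `vainfo` from libva-utils is on the PATH.
    import subprocess

    out = subprocess.run(["vainfo"], capture_output=True, text=True)
    decode = [l.strip() for l in (out.stdout + out.stderr).splitlines()
              if "VAEntrypointVLD" in l]  # VLD = the decode entrypoint
    print("VP9 decode:", any("VP9" in l for l in decode))
    print("AV1 decode:", any("AV1" in l for l in decode))

If that prints False for VP9, no amount of browser flags will help; if it prints True and the browser still decodes in software, the problem is in the browser/VA-API plumbing described above.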


The above is a fair summation of Linux's failure as a desktop in a single post.


Maybe it is, but keep in mind that Chrome is a proprietary application - the Linux and desktop communities have exactly zero input into what will and will not be supported.

On the Firefox side, Redhat and Suse people did the work to support video acceleration.


Minor point: that BIOS setting does dedicate some amount of RAM to the GPU, so the rest of the system can't use it. The GPU can allocate more than that, though, if the program makes a slightly different allocate call to request it.

Seamlessly sharing memory between the two chips works fine with the compute toolchain (ROCm); I don't know how transparent it is for games.


>The GPU can allocate more than that though if the program makes a slightly different allocate call to request it.

Maybe in ROCm, but not in the generic Windows and Linux apps and video games that I've tried. They all cap out at whatever you set in the BIOS for the iGPU VRAM reservation and don't go beyond that into system RAM. If ROCm has this ability, how come AMD's driver, at least in Windows, doesn't do this memory-overflow trick for VRAM-hungry apps like video games? That would certainly alleviate the issue.

IMHO, it's still a worse implementation than Intel's unified model. Having a large chunk of memory constantly blocked off for a HW component, regardless of whether it's currently needed, is just horribly inefficient, especially when you're on a mobile device with 16GB of RAM or less to play with.

Disappointing.


I have built my desktop with top-end components: ASRock X570M Pro4, ASUS ROG Radeon RX 6800, AMD Ryzen 7 5800X, WD Black NVMe, and I have no issues at all with drivers or anything (I also use Gentoo with a self-configured kernel, no genkernel). It's my daily driver for working and gaming at 4k (with three 4k monitors).

Linux has no inherent issue with drivers or video cards in particular; people just need to pay some attention when buying components. Saying that Linux has issues with graphics because of Nvidia is a bit unfair: Nouveau can't use acceleration because the video card doesn't enable it unless it is loaded by the official Nvidia blob. If Linux is "supposed to have issues with graphics cards" while vendors actively create obstacles, we will never figure out the real issue and hold the right people accountable. It's Nvidia that still has issues with Linux.


That works if you don't care about deep learning and don't use any apps that need CUDA.


Yeah, but why do we blame Linux if it's the official Nvidia drivers that suck, and the video cards don't work with acceleration unless you use those drivers that suck? What is Linux supposed to do?


I don’t care who’s to blame, but I’ll just say that the proprietary drivers work just fine on servers and I imagine they’d work fine on Desktop too if distros installed them automatically. Loads of people try Linux on a PC they already have instead of buying a new one that has upstream drivers for every component. This is yet another reason why Linux desktop will never have >2% market share.


I've had a few Nvidia cards (from 9xx to 20xx) and one (very modern) AMD.

The AMD drivers run fine. They're functionally inferior to the Nvidia ones (in several aspects), but I had no big problems. I don't play on Linux, though.

I think open source makes a significant difference - with Nvidia, issues often can't be solved by software devs (e.g. LibreOffice Calc updating the screen very slowly). I suppose that with open source drivers, devs can at least get an idea of what's going on.

However, there's one dealbreaker with Nvidia - their cards are such blobs that Linux, in a default configuration (meaning an ISO with Nouveau), can't run at all with many Nvidia card series. I couldn't even boot with a GT 1030 (among others).

I actually have no idea how people install Linux on Nvidia systems, since I had to create an ad-hoc ISO with the binary drivers preinstalled.


> that Linux, on a default configuration (which means, ISO with Nouveau)

To this day I cannot think of a good reason to ship Nouveau other than "let's make users with Nvidia cards suffer". I get the idea behind the Nouveau project, but pushing an experimental driver that cannot work with 99% of the hardware it latches onto, and that forces users to disable it if they want a basic working system, is actively malicious.


I don't know the last time you used nouveau drivers but this is just wrong.

Is it feature-complete? No, but to go ahead and say it doesn't work for 99% of hardware is just lying. Here is the feature matrix: https://nouveau.freedesktop.org/FeatureMatrix.html

What you can say, though, is that it is a bad idea for GPUs that are relatively recent. It won't block you from booting or displaying to the screen, but it will probably be slow.

Also, this is distro-dependent. I remember being given the choice, with no default chosen automatically, when installing Manjaro (a derivative of Arch Linux).

I chose the proprietary drivers since they are feature-complete, but later switched to the Nouveau driver since it isn't prone to breaking on every other update. That was 2 years ago; zero problems since.


For years I'd give Nouveau a try and see if it could run Wayland. For years I'd get a black screen and have to repair things in GRUB (or, thankfully, roll back from the boot menu on NixOS).

Since it doesn't do CUDA either, Nouveau is useless to me. For my next Linux build I'll use either Intel or AMD, but you still can't economically do ML on AMD.


No, it isn't, and the malice comes entirely from circumstances foisted on the Free/Libre Open Source community by Nvidia's inherent user hostility at the behest of corporate America/the music/content delivery industry.

It isn't that hard to write a manual. It isn't that hard to respect users' freedom to use the device they purchased. It isn't that hard to just leave well enough alone.

But no.... Nvidia just couldn't.


>user hostility at the behest of corporate America/the music/content delivery industry.

Meanwhile, the company responsible for the biggest security issues of the last decade still provides its CPU microcode as a signed binary blob, and the FLOSS community is fine with that because someone at the FSF drew a random line to stand on. I will accept the FLOSS community's hatred of NVIDIA the moment it stops bending over backward for the company that brought us Meltdown, Spectre, RowHammer (who needs consumer-grade ECC anyway), etc.


You won't hear a peep from me in that regard, because I am of the same line of thinking.


Yes, AMD is fine. Intel works well if you get a computer with an iGPU, but the last time I looked, AMD's iGPUs were a bit more powerful. You'll want a Nvidia card for ML, unfortunately.


The AMD drivers are pretty good, at least a year after the cards are released. AMD has only just started doing the Intel thing of pushing driver support for GPUs before they are released so you often need to run bleeding edge kernel/Mesa or wait a while to get good support.


AMD just works, ever since their Islands-generation cards.


> It's sad to hear that graphics drivers are still not a solved problem on Linux.

Indeed, and still nothing has changed. Last time I checked, installing an Nvidia driver crashed the whole desktop, and every time you booted your Linux distro it gave you a black screen with X11 or Wayland pop-ups.

Fixing things just to get basic work done was a magnificent waste of time. I ended up using Windows with WSL2 and waiting for the new MacBook Air. I no longer see the point of dealing with anything Nvidia on Linux these days.


That's weird. I've been using an Ubuntu desktop with a GTX 1080 daily for years now, across multiple major OS versions and driver versions, with zero GPU-related issues.


Same for me: I've been using a GTX 1060 with Ubuntu LTS for over 3 years now without any issues, ever. I mostly attribute this to using Ubuntu, and the LTS version of it - it is probably one of the few desktop distros Nvidia tests/optimizes for as part of their QA.

However, during my time at uni I maintained, over the course of a couple of months, a patchset against a kernel module for my research, and I remember what a mess it was. The slightest kernel updates broke it, and even supporting just the 1-2 distros we used in our lab was very time-consuming. Even after I left, I got a couple of mails from researchers asking whether I could assist in getting it to build on newer kernel versions. And even though I had a fraction of the functionality of what Nvidia provides, I absolutely understand how difficult it is to maintain a non-upstreamed patchset over time - so I definitely believe all the bad things I hear about Nvidia on various distros/setups are 100% true. It is simply Nvidia's fault for not going the AMD route and at least trying to get as much as possible upstreamed and open-sourced.

On a side note, the only reason I went with Nvidia was that at the time, due to the crypto hype, a competitive AMD card was 50-70% more expensive (at retail, not the manufacturer-suggested price). I'd definitely go AMD next time.


Desktop is OK for the most part; hybrid graphics on laptops can still be a living nightmare, though.

Every other driver update breaks the system in unexpected ways, Wayland wasn't supported for the longest time, switching GPUs had no support at all for quite a while and still is a bad joke compared to Windows (e.g. you have to log out and log in again to switch the graphics processor unless you use 3rd party tools).


> Every other driver update breaks the system in unexpected ways

That's almost surely your distro's fault. I ran Nvidia for well over a decade on Arch with zero issues, and it's far from the only distro that properly handles Nvidia drivers.

If your distro doesn't distribute an Nvidia driver in its repos and maintain it in step with the kernel it ships, then that distro is not an option you can consider if you are using Nvidia.


I’m using Windows with WSL these days. Works great


Is CUDA usable under WSL now?


Idk, I've never used CUDA, but worst case you can still use it on Windows.


This was the single biggest "driver" for me moving from Linux to Mac. I had a really nice system that I had built and used for work. I was working on a project with a 24-hour deadline; it was only an hour of work. I needed to update a particular dependency, but decided to update everything, and lo and behold, my video output broke after the update. I made the deadline, but spent literally all day and night trying to get everything working and done. The next day, after a long nap, I shelved what should have been a perfectly good piece of hardware and bought a Mac, after using Linux for well over a decade.


This is similarly perhaps my single biggest reason for using the NixOS disto instead of Arch. If one big update breaks everything, rolling back is as simple as choosing the older state from the boot menu and voila. It's helped me sleep so much better with updates both on my personal computers and my server, to know I'm not going to bring it down for more than a minute or two even with a failed update.


This is basically what happened to me, though twenty years ago.

Since then linux has been great for me - on the servers.


> but decided to update everything

We each make that mistake once ;)


Do you now love or loathe your Mac and its cost?


I am not dan-0, but I like my Macs. I used to love them, but software quality has gone down. There are weird UI decisions, mostly around size, colours (all friggen grey!) and contrast; the older I get, the harder the default setup is to use. Their cost is also a bit of a challenge now: a Mac Mini with 16 GB of RAM and a 2 TB SSD for $1600 is a bit much, and the Pro is just eye-watering.

Windows is out of the question given how they treat their customers. Example: the dialog for declining the Win10 upgrade - click the X button in the dialog's upper right corner and be greeted by an immediate reboot and upgrade.

Linux: life is too short for me to handle the hassle on the desktop. Server is a no-brainer.


> No single piece of software has wasted more of my time than Nvidia's drivers

I see you haven't used printers much :)


I had to install a new inkjet printer because kids now need to print out their homework during COVID.

I swear to god, if I ever become wealthy, the printer industry is what I intend to completely destroy. Not for profit. No positive-sum startup thinking. It will be zero sum.

Edit: Lasers are fine. That will be left alone.


> I swear to god, if I ever become wealthy, the printer industry is what I intend to completely destroy.

Not the hero we deserve, but the hero we need!


Damn it feels good to be a gangsta...



Is now a bad time to put down your pom-poms and pitch it on Kickstarter?


Ironically, printer drivers are pretty ok on Linux.

It's just everywhere else that they are a problem.


A printer has never made my entire system unable to boot. I've experienced issues where GNOME just grey-screens and it takes me hours to diagnose and fix.


That has more to do with a screen being pretty essential to running a system than driver quality.


Or tried to install tensorflow ayoo


Eh it’s not too bad on Linux as long as you use the official docker image that has cuda, cudnn, and tensorflow with matching versions. This also lets you use tf1, tf2, and PyTorch in separate containers without upgrading cudnn in lockstep. Larger projects that use several models inevitably require all three. If you install it yourself, you’re in for a world of hurt.
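For what it's worth, once you're inside one of those containers the sanity check is short (a sketch, assuming a TF 2.x GPU build - tf.sysconfig.get_build_info() isn't available on old TF 1.x images):

    # Sketch: confirm the container's TF build and the host driver line up.
    import tensorflow as tf

    build = tf.sysconfig.get_build_info()            # TF 2.x
    print(tf.__version__)
    print(build.get("cuda_version"), build.get("cudnn_version"))
    print(tf.config.list_physical_devices("GPU"))    # [] usually means a driver/runtime mismatch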


NixOS has this decently solved. “Some assembly required”, but at least you can switch between versions without messing up your system.

So long as the driver doesn’t change, you can even run multiple versions simultaneously and nothing will break.


Imagine having to install a different OS to run something, what year is this?


Nix, the package manager, can at least be used on other distros.


I don't understand how anyone can be okay with:

"Here, run this piece of code. No don't bother trying to read, build or understand it yourself, we know you won't be able to do it, so we put it in a little mystery box over here that you can run on top of your existing operating system. Now please don't ask any questions and go away."


Well that's basically all we do isn't it? Every exe you run on windows, every apt package you get on linux, every docker container, or node module, etc. Just layers and layers of running black boxes that nobody but the original developer knows much about.

But even if you compile it yourself, what exactly are you gaining? Extra work for something you won't look into anyway? You'd think open source stuff would be more peer reviewed, but as a maintainer of a few open company repos it's surprising how much nobody actually cares and will just run whatever.

Most of the time it works fine because the average person isn't a malicious actor, but every so often something gets compromised and it's always discovered by pure chance.


Vouched, but you should know that literally all the comments on your account are [dead].

I can read and understand tf just fine and I’ve had to work around bugs (like the broken image resampling) by doing just that.

Treating tf as a black box has nothing to do with it. The problem is that none of the ABIs involved here (tf, cudnn, distro libc, nvidia driver, etc) are stable. The best you can hope for is to use the same driver for several containers with matched (tf, libc, cudnn), and that's worked out well for me over the last couple of years.


Netcat to address?


A better solution would be for AMD to invest in bringing their ML stacks up to date to work with PyTorch and such.


That would be nice. It really boggles the mind how thoroughly AMD has missed the boat on ML. And it really seems like they aren't on any trajectory to catch up even today. I've given up on them. Instead I'm hoping that Intel's imminent entry into the discrete GPU market does better. Nvidia is in desperate need of some competent competition in ML.


You seem to overlook the fact that NVIDIA poured immense resources into their ML software stack.

This was back in AMD's Bulldozer days (2011), when the company struggled both financially and technologically.

Meanwhile NVIDIA sponsored universities with graphics cards and had already developed their CUDA ecosystem (in 2007) when AMD was still busy with the ATI acquisition. In 2011 NVIDIA had an annual net income of about $500 million, while AMD had a net loss of $600 million at the same time and kept struggling for the following 5 years.

In other words, NVIDIA already had an existing ecosystem of professional grade H/W accelerators and S/W infrastructure, when AMD was still a CPU manufacturer without a dedicated GPU division. When AMD acquired ATI, NVIDIA was already in the process of transitioning their GPGPU stack from data centre-only products to consumer hardware.

AMD has powerful ML hardware today, but that's data centre and supercomputer only. They didn't miss the boat on ML - they were busy not drowning while NVIDIA was handing out goodies to academia.


Yeah, but there was a lot of poor management from AMD that led to them being in that situation; let's not pretend it was just one day of "oops, we have no money".

(And nor was it "all Intel's fault" either, AMD fucked up a lot in this time period, both technically and in their business decisions)

At one point AMD was set to merge with NVIDIA but the board couldn't get over the sticking point of Jensen wanting to be CEO of the resulting company. Had the board swallowed their pride and let that happen, I doubt he would have led the company down the Bulldozer garden path.

Instead AMD said no, and then way overpaid for ATI, which depleted their cash reserves and forced them to sell their foundries (which was ultimately probably the correct long-term move, but that isn't the proximate reason why they sold them - they were just out of cash at the time). Then they had no money and had to underfund their architectures, and move to "cost reduction" mechanisms in the design and implementation, and had a series of implementation problems and poor designs.

Phenom was late (so much so that AMD had to put out a stopgap dual-socket "Quad FX" system just to try and compete against Core2 Quad) and Phenom had a major bug where part of the cache system had to be turned off, tanking performance by like >25%. And by the time Phenom II came out, AMD was putting non-SMT quadcores against Nehalem. By the time Bulldozer came out, the 8150 was going against Sandy Bridge, and by the time AMD fixed the worst of Bulldozer's performance problems, the 8350 was going against Ivy Bridge, and it still sucked anyway. AMD just executed extremely poorly in this period and a large part of that is the cash shortages resulting from buying ATI.

The next decade of AMD's financial woes largely spring from that moment when the AMD board said "no" to the merger, and the chain of decisions and failures that resulted. And while consoles might have saved the company - they likely would not have been in that position in the first place had they said yes to the merger instead of emptying the bank account to buy ATI, instead of doing a merger-of-equals with NVIDIA.

Also, AMD still isn't taking the steps that they can to increase their ML marketshare. They are actively reducing the support levels of their ROCm package - amateurs can easily get into the basics with a consumer NVIDIA gaming GPU while AMD forces you to buy a $5000 enterprise card.


It's been obvious for 10+ years now that AI was not only incredibly important but also a ginormous market opportunity up for grabs. To the extent that they were cash strapped they should have reprioritized. The effort would have paid for itself many times over. And that excuse no longer holds water now that AMD's market cap rivals Intel's and they're still not investing enough or making the right technical decisions. There must be deeper problems.


They actually have already - PyTorch works straight up with ROCm, and so does Tensorflow. There is a little faffing to do to get it to work but they've made great progress in the last six months.
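For anyone wondering what "straight up" means in practice: ROCm builds of PyTorch reuse the torch.cuda API, so the usual sanity check looks the same as on an Nvidia box (a sketch, nothing AMD-specific beyond the wheel you install):

    # Sketch: confirm a ROCm build of PyTorch can see the GPU.
    import torch

    print(torch.__version__)                    # ROCm wheels look like "x.y.z+rocmA.B"
    print(getattr(torch.version, "hip", None))  # HIP version on ROCm builds, None otherwise
    print(torch.cuda.is_available())            # True if the ROCm runtime found a supported GPU
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))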


Except consumer GPUs don't officially support ROCm, despite consistent pressure from users for years. And there's no indication of when that situation will change.

CUDA is successful because the same software works on low-powered laptop GPUs and expensive datacenter GPUs.


Yes, it would also be nice if they would implement something like CUDA (if that's not illegal) or provide alternatives, since CUDA is used by many ML programs and forces you to buy an Nvidia card.


They have ROCm and many popular frameworks support it.

The problem is that NVIDIA has 100% of the mindshare and owns 97% of the dGPU market, so it's a self-fulfilling prophecy that proprietary tech like CUDA will stay dominant for the foreseeable future.

Hopefully Intel re-entering the dGPU market will put a dent in that and move the needle a little.


I haven't had an issue with the Nvidia proprietary drivers since the 495 series; they've gotten a lot better recently, and are more frequently updated, though still less configurable than their Windows counterparts.

Currently on driver 510.54 and playing Elden Ring on maximum settings with a 2080 super without issue.

Though you do have to add a kernel parameter to use nvidia's DRM mode for best performance which is non-obvious. And hybrid GPU laptops are a whole other thing, I guess.


Yeah. It directly led me to buying AMD hardware this time and I'm much happier for it. AMD aren't perfect but at least their drivers are in the kernel.


Isn't there a ton of IP and patents which can't be made open source?

There should not be much of an issue for Nvidia to open source it otherwise.

Open source doesn't mean free anyway


It's not about making it 'free' in the sense of money. The term 'free' often implies money, but in this regard it's more about the philosophy of freedom itself, not the cost of the product.

We as consumers should not be held back from using hardware the way that we intend to use it. Nvidia does not have any right to hinder us from using what we purchased how we see fit. (Provided that we are not infringing upon copyrights and/or I.P.)


Yes, that's always been the primary underlying reason AFAIK: there's too much licensed code involved in their driver stack, and they don't care to spend the time re-implementing those components.


Ironically, they wouldn't need to spend the time re-implementing those components and dealing with licenses if they had just made their code open source to begin with.

The community would likely be glad to help them with anything we could, and licenses would be less of an issue. It wouldn't be the first time open-source alternatives to closed-source stuff were written specifically for reasons like licensing. MP3, for instance... not a terribly hard thing to deal with anymore, but once upon a time...

Anyways. Nvidia's true reason for doing anything is ultimately in their namesake. They want people to be envious. Part of making people envious is to keep things they want away from them.

So, yeah... that's the real reason. If they actually gave a flying fuck about the rest of us, they wouldn't be that way.


I used to cringe whenever I saw a prompt to upgrade the driver. But it's getting better. Version 460.39 hasn't caused any pain so far.


Just look at the version number, 460.39.


Why don't you just not buy Nvidia hardware?


Because I care about ML which AMD has been utterly failing at for 10 years running.


Same here, I got fed up with it and went AMD (or Intel) only. It just works.


I recently tried Wayland with Fedora 35. The driver destroyed the system. I switched from Pop!_OS LTS to the latest Pop!_OS and the performance was even better (compared to LTS).

GTX 1080


Seriously, nothing makes me feel more inept than wrestling with a new GPU.


Printer drivers take the cake here, though those are not brand-specific.


Same, glad to know I'm not the only one dealing with this.


I can suggest a good reason why the Nvidia GPU drivers are opaque binary blobs.

Generally speaking, you don't build a multi-billion transistor chip, and manage to ship the A01 spin without doing some magic.

The Nvidia GPU i/o interface is a remarkable mechanism where the resource manager (the code that sits at the very lowest level) can patch the hardware interface to get around hardware issues and give the appearance of a chip that is working perfectly.

I'm not claiming the blobs work well or anything. I'm just saying there's a reason Nvidia does blobs.

Typically, an A01 chip will ship with about 50 bugs that are worked around in the resource manager and driver. They really do not want to tell you about these bugs -- because they are WORKED AROUND. You don't need to see the sausage made, so to speak.


I do not understand your argument.

You suggest that there are work-arounds in the drivers. And then you imply that people do not want to see them.

First of all, I am unsure if people not willing to see the work-arounds is generally true to begin with.

But more importantly, if someone does not want to see something, they can simply not look. I do not understand how that is meant to justify these drivers being binary blobs.


The argument is "It's embarrassing"

Outsiders are irrelevant - if something makes you feel embarrassed, it's embarrassing.


So you are saying Nvidia hardware is crap, and therefore we don’t want to see their drivers?

Either that means that AMD hardware is far superior, or people do actually want to see the drivers (including workarounds).


Isn't that essentially the same as what everyone does with CPUs?


GPUs ship with far more workarounds in the drivers because the programmer only has access to the hardware via drivers that can cover up the bugs, and because they’re not constrained by binary compatibility.


Yes. I think for the CPU they mostly stick those work-arounds into microcode?


Honest question: if you are nvidia, why not publish the interface specification for your device? If there are silicon bugs, publish the errata and the workaround. I've seen SOC manufacturers do similar. What's different about GPUs?


Reminds me of a story where someone joined a company, then immediately fixed a bug in the code that had been bothering them for years, sighed, and put in their two weeks' notice.


I'm assuming you meant this?

https://news.ycombinator.com/item?id=26663798

Comment says it's a joke, however.


Personally, I just switched to AMD and never had to think about GPU drivers again.

Ok, I do have to add the non-free firmware when I install Debian, once every installation. But there are no random problems all the time, no "your card is too old for your computer to keep working", no "your card is too new for your computer", basically nothing more than adding a single package.

I imagine NVidia works really well on Windows, because people keep saying good things about them, and there is absolutely no chance somebody on Linux will ever say something good.


Nvidia drivers are decent on Linux - almost all ML people use it and they're incredibly dependent on CUDA. Nvidia has a vested interest in this because they pivoted hard towards deep learning, but they're also comfortable because there's no good competition from AMD or ARM yet. TPUs are excellent but still niche, despite Google's best efforts to push them. Driver installation on Linux is mostly good I've found, just using native package managers and Conda for dev environments. Gaming also works fine and of course so does mining...

But there are odd annoyances - you get less control over device parameters in Linux. Things like setting the memory clock or undervolting (not just power factor) only seem to work in Windows.

This leak is a pain. If Nvidia refuses and they just leak the code, we're still dependent on Nvidia to approve an open source driver. It's like when hackers try and take code to other companies - AMD wouldn't touch any of this with a barge pole for fear of infringement. You're not going to get magical driver support in Debian from leaked code. Best case someone releases an illegal driver and it doesn't get immediately taken down.


The non-free firmware is a shame. There are basically no modern GPUs that work with linux-libre because of it. This issue doesn't get much attention.


The binary driver made me a big NVidia fan FWIW. It works, consistently and easily, on both Linux and FreeBSD. The unified release makes both of those feel like a first-class citizen alongside Windows. It's always been more reliable than the ATI/AMD drivers, even when the latter are notionally open-source. (I will say that Intel video drivers have also always been good).

Sure, it'd be nice for it to be open-source too, but frankly if the software is good enough that I never want to fix bugs in it then it's a lot less important whether it's open-source.


So long as you don’t mind getting wayland 6 years late.


I use FreeBSD now so it was however many years late for me already. And even when I was using Linux, the benefits seemed extremely marginal - like, sure, in theory I can see some situations where using two screens with different DPI settings would be useful, but it's rarely something I've wanted or needed in practice - when I got a laptop with a 4k screen I did try using it with an old 1080p monitor for a little while but it looked so bad in comparison that I replaced that with a 4k monitor as soon as I could.


BSD makes you a minority of a minority, so that's probably to be expected. Afaik Wayland offers sandboxing and a better security model, reduced screen tearing, etc. Nvidia dragging their feet has surely impacted early adoption of the project.

It's also created pain points. I'd be using Fedora, but there's no assurance about the content within the Fedora community repos. I don't really want to extend trust there, so I stick with Ubuntu for better or worse, all because of Nvidia.


Which card do you use?


I've had several over the years; I'm currently on a GTX 1060 (this machine is getting old, but I don't game anywhere near as much as I used to, especially not AAA).


I definitely support making GPUs less attractive to miners. I'd buy a card that could be remotely disabled when usage patterns matched crypto mining if it meant I could spend MSRP on a modern card.


You would allow hardware DRM in your box just to allow Nvidia to "revoke" ownership of your card when you use it "incorrectly".

Yeah I'll pass.


The parent comment makes no mention of giving Nvidia control over the GPU. You know you can have remote access features that are solely under your control, right? That's how SSH works.

You should read comments before you reply to them.


This is so painfully pedantic and obviously not what the original commenter meant. What point would a remote killswitch have to deter crypto miners if the owner of the card (the crypto miner) fully controlled it as you imply. Use your head.


> I'd buy a card that could be remotely disabled when usage patterns matched crypto mining...

What do LHR and SSH have in common? How does third-party LHR make any sense?


You may feel you own Windows.


I can see why it's attractive when the alternative is no card at all.


Short term thinking can often appear to be attractive in the short term. It's the long term implications that get you, and by then it's too late.


At what point does short term become long term? Crypto mining has been an issue for people looking to obtain GPUs going back to at least 2016, which is when I built my first PC, and its only gotten worse and worse as time has gone on.


When it spills over and becomes a common practice beyond the scope of the original problem?


Allowing/supporting crypto mining is short-term thinking.

It's a bunch of wealthy people burning absurd amounts of fossil fuels solely for the purpose of making themselves wealthier. There is no benefit to society from allowing this.


> There is no benefit to society from allowing this.

Sure there is. We get a financial system that's independent of banks and governments. I don't care how much energy it consumes, that's a worthy goal.

Also, it's not even 1% of global energy consumption and even that calculation is based on rather questionable assumptions like "energy usage is proportional to bitcoin price". If you want to stop the burning of fossil fuels, what you need to do is advocate against trade with not only China but also all western countries whose luxurious lifestyles are literally dependent on such wasteful use of resources. Gotta level heavy economic sanctions against all the highly polluting developed nations until they put a stop to it.


> We get a financial system that's independent of banks and governments. I don't care how much energy it consumes, that's a worthy goal.

I see it as exactly the opposite.

We live in a society, and as a society, we have agreed upon certain rules about how money is handled. If you want to opt out of those rules, then get out of our society.


> We live in a society, and as a society, we have agreed upon certain rules about how money is handled. If you want to opt out of those rules, then get out of our society.

Even the friends of mine who work in the banking industry will tell you that the current financial system is a rigged game that serves the elite and well-connected at the expense of ordinary people.

Reasonable people can disagree about the relative benefits and harms of cryptocurrency, but trying to make the case that the current financial system reflects some sort of happy societal consensus is roughly on par with being a lobbyist for big tobacco.


I'm not opting out of anything, nor am I leaving. Those "rules" form the backbone of the financial arm of total global surveillance. I oppose them on principle, as should everyone on this site. I want to see them either repealed or completely neutralized with subversive technology such as cryptocurrency. The only problem here is the fact that so much energy is being wasted on the failure that is bitcoin. I wouldn't mind even higher amounts being poured into Monero.

"Agreed upon" rules? This stuff is imposed on us.


Only in the same sense that laws are "imposed on" criminals. You may not have personally agreed to it, but it's part of the legal framework our society rests on.


Criminals like murderers? Yeah, everyone agrees with that, it's not controversial. Total surveillance? Not everyone accepts this fundamental injustice. Their actions violate the principles countries like the USA were literally founded upon. As far as we're concerned, they're the criminals here and this technology is just self-defense against their continued abuse.


They’re not merely making themselves wealthier - they’re providing a service. How is this different from an airline or any other user of fossil fuel?


Because the service as it stands is of dubious (or negative) worth. So far its main uses are enabling illegal activity and MLM-style get rich quick schemes. For all of the hubbub about the crypto-revolution, we are over a decade in and 99% of its use falls into those two categories.


I was sympathetic to that view right up until the Canadian government froze the bank accounts of the truckers. Now I'm openly supportive of the activities the regimes don't like in crypto.


> they’re providing a service

Doubling the price of GPUs is a pretty bad service.


The alternative is not "no card", it's "don't buy a top-of-the-line card".

Every expensive product with potential commercial use has this situation. Anything you can easily make money with has a price that reflects it. People who want stuff for limited use cannot justify those kinds of prices so they have to buy older stuff, lesser stuff or stuff that needs effort/time dumped in in order to work.


The alternative _is_ no card. Even 5 year old cards are twice the MSRP if you can get them.


Is there some practical way to delineate the two product usages that isn't arbitrary, I wonder?

Maybe make mining-only cards more performant and affordable than consumer graphics cards, so that consumer cards drop in price from reduced demand?

I find it hard to believe that a use-case-specific device couldn't dethrone graphics cards as the better choice for miners at the right price, and it should be cheaper, right? No need for all that extra graphics-related componentry like display outputs, etc.

I know stuff like that is already on the market, but it's usually more expensive since it comes from more niche manufacturing. Nvidia has the scale to make a similar device more affordable, I imagine.


You would have to saturate the market with dedicated mining devices. As with Bitcoin ASIC miners, mining difficulty has to rise to the point where using gaming graphics cards becomes unprofitable; otherwise miners will buy the dedicated mining devices and the gaming devices. The problem is that Ethereum was designed to resist ASIC approaches.

Instead of locking cards down, Nvidia could manage the supply. They could sell their cards in cooperation with Steam: only people with an active Steam account would be eligible for a new card. If their performance stats don't improve after the purchase, which means they have resold their hardware, they won't be allowed to buy another card.


> If their performance stats don't improve after the purchase, which means they have resold their hardware, they won't be allowed to buy another card.

And then you have people who lose their cards, or it gets damaged, or who don't want Steam, etc. It's a big can of worms.


Not wanting Steam is the biggest issue. They could include further shops and even individual games. There also needs to be an option for people who want to do scientific computing. Having vouchers from scientific institutions would exclude everybody who is independent. The approach has to be refined a bit, but the general principle is that you can link cards to individuals.

To balance the downsides, Nvidia doesn't have to go all in on 'Steam sales', like Sony, who sells some of their PS5 to registered gamers but not all of them.

On the other hand, I don't see the problem with damaged or lost cards. Damaged cards can be treated like warranty claims: send in your damaged card and buy a new one. Lost cards on their own are a problem, but how do you lose your card? You'd have to lose your entire gaming rig, which most likely means it was stolen. So send in a copy of your police report. That way, miners who make a business out of 'losing cards' would be punished for filing false reports.


> They could sell their cards in cooperation with Steam: Only people with an active steam account are eligible to a new card. If their performance stats don't improve after the purchase, which means they have resold their hardware, they won't be allowed to buy another card.

I have a baby. I do anything on Steam once a month at best. Crypto miners could pretty trivially make their use cases look like they spend more time gaming than a good 30% of Steam's active user base. This is a non-starter.


There are more mining rigs than people behind them, so limiting the number of cards to the number of people would be some improvement, for sure.


> Is there some practical way to delineate the two product usages that isn't arbitrary I wonder?

Highly unlikely. There might be reasonably reliable heuristics to identify gaming operations, but any signal for mining is likely to trigger for other long-process number-crunching too.

The idea puts me in mind of the other side: the police once raided a home because surveillance via helicopter identified, from the heat pattern, the property as likely to have a loft-based cannabis farm. It turned out the loft was full of equipment running coin miners. Simple checks get significant false positives.


Oh, I meant specifically avoiding those kinds of arbitrary locks and checks; by delineate I meant making two separate product lines to cater to the two different markets. My hope would be that miners would buy the one better suited for mining, taking some pressure off the graphics-cards-for-graphics market. Imagine a "3080 Mining" card that has no monitor outputs, no huge memory reserves, no extracurricular processing units. It should be much cheaper.

But perhaps miners would just buy both mining and non-mining cards, since it's not a zero sum purchase, every card added to the stack is worthwhile.


Mining does have strong indicators though: mining memory access is nothing like memory access for a game or application.

Normal games and applications won't just max the whole memory bus at 100% for hours and hours. Even in a high-load situation they will still bounce around 90-100% or whatever. Mining absolutely slams the memory bus at 100%, because current implementations are effectively proof-of-bandwidth, so that's the stat you need to maximize.

Normal games or applications will go out of their way to align threads in a warp to access contiguous memory blocks ("coalesced memory access"), because that can be efficiently handled as a single request and broadcast to every thread. Mining can't do that because every thread is working in a totally different place in memory determined by the DAG. So normal applications will have lots of coalescing happening, mining will have zero.

Normal games or applications will have cache hits, because they're re-using at least some data frequently. Mining will have an (effectively) 0% cache hit rate because the DAG is sending threads to random places in memory and memory is much larger than cache.

These are simple things that can be metered with O(1) performance counters on the hardware. If the card sees a load like that, it runs in cripple mode and nukes the memory bus to 10% performance. Then you can also add the capability for a whitelist - the driver can attest that a certain binary has been signed by the appropriate vendor, and the TPM can attest that the driver and hardware haven't been tampered with (remote attestation). So there's no need to worry about legacy code or DIY CUDA code unless it happens to look like a mining application, and then you can whitelist any edge cases.

TPM is going to be mandatory for OEM Windows 11 systems - home builders technically don't need it, but then if they happen to run some code that performs 100% random unaligned memory access you'd have to get it signed. But most code doesn't behave anything like that, so that is a real edge-case unless you are trying to write a miner.

You may be morally opposed to the above, but it's not impossible at a technical level - mining code really does look nothing like normal code at a perf-counter level.
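To make the perf-counter point concrete, here's a rough sketch of the kind of check those counters could feed. The counter names (mem_bus_util, coalesced_ratio, l2_hit_rate) and the thresholds are made up for illustration - this is not a real driver API:

    # Hypothetical heuristic: flag workloads matching the proof-of-bandwidth
    # profile based on sampled GPU performance counters.
    def looks_like_mining(mem_bus_util, coalesced_ratio, l2_hit_rate, minutes):
        return (mem_bus_util > 0.99 and      # memory bus pinned at ~100%
                coalesced_ratio < 0.01 and   # essentially no coalesced accesses
                l2_hit_rate < 0.01 and       # effectively 0% cache hits
                minutes > 30)                # sustained, not a short burst

    # A game under heavy load might sample as (0.95, 0.6, 0.4, 120) -> False,
    # while an Ethash-style miner looks like (1.0, 0.0, 0.0, 120) -> True.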


Just wait. GPU-mined currency is pretty much exclusive to Ethereum, which has dedicated plans to migrate away from PoW. Monero is set on PoW but uses a pseudo-random mining challenge to thwart dedicated hardware and target general-purpose CPUs - that kind of targeting may or may not be what you want.


Crypto mining is just one of many factors driving availability down and prices up. If it disappeared tomorrow, you still would not be able to buy a GPU at MSRP.


Eh, you can fix that problem much more simply: Nvidia can raise the MSRP.


That or they want to remove software limits on performance to mine cryptocurrency faster.


In fairness, it's hardly an outrageous view to want to remove artificial performance bottlenecks on a piece of hardware you've spent a considerable amount of money on.


Except there is nothing "fair" about this. In all fairness these crypto-bros could buy the cards specifically made for mining, but they won't because they don't have any resale value.

The only reason they are buying GPUs is that they can run them until something better comes to market and still sell them to some gamer who is actually going to use the hardware for its intended purpose.

More power to Nvidia. If I could make the decision, the hashrate of any crypto mining on gaming cards would be 0. Let the crypto-bros buy the crypto cards and leave the gaming cards for people who are actually going to enjoy them.


I don't care what the use case is. I don't support artificial performance limitation on expensive hardware. It's as simple as that.


I do care when this use case is harmful.

We shouldn't need to limit hardware to specific usages, but we already see that people won't stop themselves from wasting energy and accelerating climate change if it gives them more wealth and power.

I'd also be up for limiting all military weapons from being so destructive across the globe for example. It's mind blowing how we managed to create so many harmful tools in such a small time span of our history.

But I agree with you when the use case is actually useful for humanity.


Likening crypto mining to military weapons is ridiculous. Not to mention that you, like many who take your position on crypto, seem to have painfully failed Chesterton's Fence.


There's no likening, it was just one more example in the bag of hardware used for harmful activities, which is a very big scope.

I'd love to learn that I'm wrong on the crypto one, given how widespread it is, but I haven't seen any fully convincing argument.


Woah there. I dislike the crypto culture and mindset as much as the next guy, but there are plenty of other options for preventing the use case you presented. As if systems aren't closed enough already...

And as far as the gaming goes, I cannot see how playing games is a more noble usage of a GPU than crypto mining.


I've noticed a lot of directed outrage from people unable to find a graphics card at MSRP that I would almost treat as funny if it weren't so scary to watch how internet culture changes. I can't speak to the motivations of the grandparent or any other individual, but in other communities I have noticed a lot of moral arguments being used very selectively. It reminds me of my sociology class on the origin of the drug war: moral outrage at drug use and the harms of drugs was pushed as the reason for banning them, yet that logic was not applied consistently to drugs based on their actual harm to a community.


I think I get what you are saying, but I think it’s less of a social problem, and more of a resource scarcity problem that humanity isn’t used to solving.

As much as people are aware that resources on our planet are limited, rarely does anybody actively think that it's already becoming a reality, yet it seems like extreme addiction to technology has only just started.


"I'm just gonna artificially raise price of this limited resource even though there already exists hardware specifically made for me, but after the next generation comes out I can re-sell this one to some sucker"

If you don't see anything wrong with that, then that's about it. I really hope the US or China or someone is going to regulate cryptocurrencies to shit so crypto-bros stop destroying the planet and normal people can afford GPUs again.


I am probably not that informed on who does what artificially and intentionally, but I still don't see the point in justifying gaming usage over crypto usage. As far as the destroying-the-planet argument goes, gamers could equally be limited to using specifically designed devices for optimal gaming and entertainment purposes.

Apart from the ability to check GPU health, GPU vendors could easily implement some hardware "calculation counter" or some similar design-specific solution so resale value could be more easily evaluated.


> As far as destroying the planet argument goes, gamers could equally be limited to using a specifically designed devices for optimal gaming and entertainment purposes.

Gamers frequently are; you just described a console.


Yes, I did. Intentionally. If you forbid consumer GPUs for both sides and limit or cripple gaming tech development so that it offers a lower variety of products, you are going to see lower demand for gaming in general. Thus, "saving the planet".

It's not something I propose; I'm just giving a hypothetical example of what would really be fair to both sides.


>gamers could equally be limited to using a specifically designed devices for optimal gaming and entertainment purposes.

We are trying, but crypto-bros are buying our cards. C'mon pay a bit more attention.


If video games are that important to you, buy a video game console. The availability there is also bunk, and has nothing to do with mining - claiming that something which uses energy is automatically 'bad' is silly and relies on common ignorance and frustration. Cars use energy, lights use energy, video games use energy - crypto is new and you don't understand it past the fact that it uses energy, so you can't tolerate it. If you don't want to get rid of everything which uses considerable energy, then you need to determine the value of each thing individually, and if you say crypto is worthless, you should have a non-circular reason for that claim. Clearly the market disagrees with you.


>If video games are that important to you buy a video game console.

And play PC games how? For some reason I thought people around here were smart.


Trust me, I'd take your problem seriously if only those cards would speed up growing up for you, guys.


I hope you don't have hobbies, since apparently hobbies are now childish.

Maybe you should grow up a bit :)


So basically you are surprised that people want to buy a higher-value, better product instead of a lower-value one?

Also, for "normal people" there are plenty of 1 and 2 GB video cards at reasonable prices, unusable for mining.


Not surprised (why would that surprise me?), just saying that this throttling is justifiable, and I would go even further.


You aren't entitled to cheaper video cards just because you don't like what other people are doing with them


Conversely miners aren't entitled to cheaper video cards just because they built a business around consumer hardware.

Companies seeking to maximize their revenue is very much the nature of the free market. It's their business, you don't get to tell them how to run it. You're a customer in a market, if you don't find the product acceptable then you can go elsewhere. There are competitors offering products as well, and as a whole this determines a market rate.

If your business is no longer profitable (or profitable enough) paying the market rate, then you go out of business. That is also how the market works. Many, many businesses would be far more profitable if they could force their suppliers to cut their revenue streams.

Miners have responded to the shift in the power dynamic by throwing a tantrum and attacking and blackmailing their suppliers. Bioshock nailed it: laissez-faire is great when you're the one on top, but as soon as someone else out-competes you, or exerts their own market leverage, it's an unfair and ridiculous imposition on your own right to profit, and it's time to shout and flip the game board.

This is exactly what you see with the whole "gamers aren't entitled to cheap cards" thing you said, that was great when miners had more market power than gamers, but everyone leaves off the whole "and miners aren't entitled to cheap cards either", which is equally true. Suppliers are taking note of that market power and moving to take a cut of the revenue for themselves. Customers are free to re-shuffle to new suppliers if they no longer find the terms acceptable. And that's how the free market works.

As always - businesses that are not agile enough to adapt, will "exit the market", and create room for newer, healthier businesses.

And remember, this has been status quo for a long time. If your business depended on CAD, it probably sucked when ATI and NVIDIA started releasing workstation products and artificially limiting CAD performance on gaming cards. The world moved on though.


> Miners have responded to the shift in the power dynamic by throwing a tantrum and attacking and blackmailing their suppliers. Bioshock nailed it: laissez-faire is great when you're the one on top, but as soon as someone else out-competes you, or exerts their own market leverage, it's an unfair and ridiculous imposition on your own right to profit, and it's time to shout and flip the game board.

#notAllMiners

All you're saying is that businesses should sell to whoever makes them the most profit - so how does that explain Nvidia cutting value and lowering the price of their cards to sell to gamers rather than miners?


Seeing as Nvidia agrees with me, I think you are wrong. In any case, sales have started to drop, so if Nvidia doesn't do anything after the crypto fad ends, they'll be out of business since people aren't buying their GPUs anymore.


So you believe that Nvidia needs crypto to survive but knowingly and intentionally hampers their cards' mining performance? You seem confused.


Not what I said at all. Re-read.


wrong comment


> And as far as the gaming goes, I cannot see how playing games is more noble usage of gpu than crypto mining.

As usual the gaming crowd is utterly lacking in self-awareness.

"Hey, those crypto guys need to stop doing dumb things with GPUs so me and my friends can use that fancy hardware to waste hours of our lives pretending to be soldiers and race-car drivers on our computers in the basement"

Say what you will about the crypto crowd, at least they know people think they're ridiculous.


That's how the market works though. Your AMD 290X had artificial performance bottlenecks created (gimped FP64, driver performance limiter for enterprise software, etc) so that AMD could sell more Radeon WX cards. Your AMD APU has ECC artificially disabled so that AMD can sell more Ryzen Pro APUs. Your Epyc has its overclocking controls artificially disabled. etc etc. Those are accepted and normal practices to determine "what you can do with the hardware you paid for" in the industry.

Miners are just mad that they got a free ride for a lot of years and are now being shifted to their own segment to try and control the infinite demand they tend to periodically create. But they are a money-making asset in a business; nobody shed a tear when AMD and NVIDIA forced Raytheon to pay a premium to buy Quadros to run their CAD software at the full performance levels the hardware is capable of.

Also, generally speaking this arrangement is beneficial for consumers: if you outlawed artificial segmentation tomorrow, companies aren't going to hugely lower their prices and give up all that revenue from the enterprise market, they are going to raise prices in the consumer market. The alternative to gimped Celeron chips isn't that you get Xeon capability for Celeron prices, it's that you pay much closer to Xeon prices for your celeron. Which is like, several times as much.

That R&D has to be paid for somewhere, and margins are not all that huge considering the total lifecycle (the chips are cheap once you make them, but the R&D for the first chip costs billions). You can't just give up 80% of your enterprise revenue and make a go of it. If we really did have Xeon for Celeron prices, the alternative would be much longer product cycles and other belt-tightening in the R&D department. The beige box market has a huge business component too and they won’t have any qualms about an extra $200 on every cpu if that’s what it costs. It’s just gonna suck personally for you as a consumer.

Consumers are the “price sensitive” market that benefits from price discrimination, in this instance, and product segmentation is how you allow that. Take that away and those price-sensitive markets are the ones that will pay more, because business pricing is very inelastic and quantities are very large. They don’t care about you buying one celeron every 5 years as much as the business who buys 1000 office desktops every 3 years.

https://en.m.wikipedia.org/wiki/Price_discrimination
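A toy model of that R&D-recovery argument (all numbers invented, just to show the shape of it):

  # Recover a fixed R&D bill from two segments (numbers are made up).
  RND = 1_000_000_000                       # R&D cost to recover
  consumer_units, pro_units = 10_000_000, 2_000_000

  # With segmentation: the pro SKU carries most of the R&D.
  pro_share = 0.8
  print(RND * (1 - pro_share) / consumer_units)   # ~$20 of R&D per consumer card
  print(RND * pro_share / pro_units)              # ~$400 of R&D per pro card

  # Without segmentation: one SKU, everyone pays the blended amount.
  print(RND / (consumer_units + pro_units))       # ~$83 per card -- the consumer price rises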


There should be no software limits in the first place.


Wouldn't that just end up increasing the difficulty of mining in the long term since it would all be opensource? I feel like any advantage they get by patching the software would be short lived with many people following suit, but I could 100% be wrong about that.


Bingo, that's what this is.


That cybercriminal’s name? Uhhh Tobias Lorvalds…


There are two scenarios where carrying out their threat plausibly helps the open source community:

1) A source drop demonstrates that Nvidia incorporated GPLed code into their drivers. This is, honestly, unlikely - Nvidia has sufficiently competent lawyers to ensure that everyone they employ is extremely aware of what the consequences of that would be

2) The source drop includes the private keys used to sign Nvidia GPU firmware blobs. Nouveau is currently entirely hamstrung on the last few generations of Nvidia cards because they run extremely slowly unless appropriate signed firmware is loaded, and Nvidia refuse permission to distribute that firmware. I'm not aware of any case law around whether private keys are copyrightable (I'd assume not, given that they're supposed to be randomly generated), and whether it's a DMCA violation to make use of leaked keys if you don't violate any other technical protection mechanisms. This would potentially (given a lot of work) allow Nouveau to implement equivalent firmware and sign it, but this would presumably still just result in Nvidia switching to different keys for the next architecture.

Obviously the calculation differs if Nvidia choose under duress to release the drivers under an open source license, but that doesn't seem likely - this still very much reads as an attempt to extort Nvidia into removing restrictions on crypto mining rather than an earnest attempt to improve open source support for their hardware.


From the Nouveau devs themselves: https://nitter.eu/_Lyude/status/1498811646697000961

They claim that leaking anything would actually make their work significantly harder.


Using leaked source code in Nouveau would be illegal regardless, and the mere existence of the leak means that the devs have to audit incoming contributions more thoroughly. Something similar happened to ReactOS in 2006; they had to audit all of the existing source code for more than a year.


I doubt the hackers are actually trying to help open source; it probably just occurred to them that open sourcing would be a way to somewhat future-proof against Nvidia putting those restrictions back in for the next generation of cards.

I'm not sure they actually expect nvidia to meet their demands at all; I certainly wouldn't. Why would nvidia trust that their promise not to leak anything is meaningful?


> I'm not aware of any case law around whether private keys are copyrightable

That's often up for debate, apparently. I think most recently, Widevine private keys regularly get DMCA'd.


Distributing the keys is illegal, but what about using the keys and distributing the resulting signed firmware?


IANAL but I could imagine that this leads to different problems in some jurisdictions.

Depending on how you look at it, using another entity's cryptographic key to sign something that then "pretends" to have been produced by that entity might classify as forgery.

Though this is just my personal thoughts, not sure if that would hold up in practice. Technically you own the hardware, so "forging" the signature yourself for your hardware probably wouldn't be an issue, but distributing it might be.


> Depending on how you look at it, using another entity's cryptographic key to sign something that then "pretends" to have been produced by that entity might classify as forgery.

Perhaps, if you were attempting to make that claim to an actual person in a commercial context, deceiving them for personal gain. But in the context of "pretending" only to the hardware? Unlikely, as hardware has no standing in court. Even for-profit distribution shouldn't be an issue so long as it's clear to the human recipients that the signature is only present to fulfill technical requirements and the item wasn't actually produced by the other entity.


Good point, but also good luck explaining that to a judge.


The judicial system has started to appreciate the finer details of software. For example, in Google v. Oracle, verbatim copying of source code was still found to be a violation of copyright, but using APIs was not (it was deemed fair use).

https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_...


If NVIDIA's firmware signing private keys are sitting in an ordinary file instead of being in an HSM (Hardware Security Module), as all serious companies do, then NVIDIA can only blame themselves.

I seriously doubt that the firmware signing private keys are in the hackers' hands.


This made me think. When I was lead developer for a very big online music provider our private keys were stored in plaintext in the registry of the Windows servers. Never thought about that before. That would have been an ugly leak.


> and whether it's a DMCA violation to make use of leaked keys if you don't violate any other technical protection mechanisms.

It might depend on the jurisdiction. The devs could limit liability by not shipping any secrets at all and refusing to run unless the user provides them at build or run time.
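One way that could look (a minimal sketch; the env var name and message are made up, not anything Nouveau actually does):

  import os, sys

  # Ship no key material at all; the user must point the build at their own.
  key_path = os.environ.get("FIRMWARE_SIGNING_KEY")
  if not key_path or not os.path.isfile(key_path):
      sys.exit("FIRMWARE_SIGNING_KEY must point to key material you provide yourself; none is bundled")
  # ... sign the firmware blob with the user-supplied key here ...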


w.r.t. 2 this has been a thing with blu-ray and PS3 encryption key leaks [1]. I don't know if courts ever ruled on it

[1] https://en.wikipedia.org/wiki/PlayStation_3_homebrew


a fun exercise would be to put something copyrighted in a key. sort of like they did with a prime number that one time


>"We decided to help mining and gaming community,"

I don't understand how this demand helps the gaming community. Wouldn't this drive up the crypto-motivated demand for GPUs even more and hurt the gaming community?


> I don't understand how this demand helps the gaming community.

It doesn't. This is just miners co-opting an old argument to gain sympathy.


Eh, it absolutely helps the Linux gaming community. Not so much the Windows one though


Removing LHR doesn't help Linux gamers at all. What would help is if they leaked the source code so people could patch nvidia's shitty drivers.


They modified the demand.

On Tuesday, Lapsus$ modified its demand. Now, the group also wants Nvidia to commit to making its GPU drivers completely open source. If Nvidia does not comply, Lapsus$ says, the company can expect to see a new leak that would include the complete silicon, graphics, and computer chipset files for all its recent GPUs:

So, NVIDIA, the choice is yours! Either:

–Officially make current and all future drivers for all cards open source, while keeping the Verilog and chipset trade secrets... well, secret

OR

–Not make the drivers open source, making us release the entire silicon chip files so that everyone not only knows your driver's secrets, but also your most closely-guarded trade secrets for graphics and computer chipsets too!

YOU HAVE UNTIL FRIDAY, YOU DECIDE!


Sounds like Nvidia just needs to patent its trade secrets before Friday, thus making them useless to everyone for the next 20 years, by which point they'll be so outdated nobody will use them anyway. Regardless of what is released, if the license doesn't allow anyone to use it, it's effectively useless, aside from maybe a few pirated driver builds that fix some Linux issues and also install background bitcoin miners.

I seriously doubt Nvidia will cave.


There's a reason why Nvidia hasn't already patented this stuff. Even if patents can't be directly copied, they can inspire the competition to do similar work.


Except no one professional would touch the leaked source with a 10 ft pole. Expect continued clean-room reverse engineering, or get sued.


Of course, a cleanroom procedure is a given in this case. History has shown that it is entirely possible though - see ReactOS for one very public example.

But also, not everyone cares. See game modders who routinely use source code leaks to make game patches. I'm not talking about big companies that Nvidia would want to sue, but individuals and small groups who are not a big enough target and would likely remain anonymous anyway.


ReactOS isn't necessarily the best example: https://news.ycombinator.com/item?id=20341933


That thread goes on to describe several very plausible reasons that a clean room implementation could have resulted in all of the coincidences the commenter pointed out.


Developers of the open source nouveau driver for nvidia GPUs certainly won't:

https://nitter.eu/_Lyude/status/1498811646697000961


I honestly don’t care about nouveau. It’s a fine project, but as a user I know it as the preinstalled software that cripples my servers until I install the proprietary drivers.

If nvidia releases the drivers as source-available rather than open source that would still be incredibly useful since you can distribute patches. Their drivers are still missing basic features like fan speed control on headless machines.


In my experience nouveau was way better than the proprietary drivers, which would crash a lot, while nouveau was stable. This was long ago though, maybe nvidia are better now.

I don't think you can distribute patches against proprietary source-available drivers, since modifying proprietary code is against copyright law.


Did you leave power savings enabled? Cause that’s a recipe for disaster on Linux. My ML servers have years of uptime with “nvidia-smi -pm 0” in crontab and auto updates disabled. The latter prevents the CUDA version from getting ahead of the kernel module version.

It’s a polished turd, but at least it works which is more than I can say for nouveau

And yes you can distribute patches. Copyright only prevents you from distributing the modified version.


This was like 15 years ago (no idea about power savings stuff), I wouldn't touch the proprietary driver any more though.

nouveau has been extremely stable for me when I used it up until last year.

Patches contain parts of the original code, so I would class them as modified versions of the original code. Also, the initial modification isn't allowed either; I wasn't really referring to distributing the patches, just that creating them isn't allowed AFAIK.


You’re talking about the EULA, not the copyright license. The EULA can certainly claim you’re not allowed to do that, but it’s also unenforceable in most places so who cares?


I'm talking about neither the EULA nor the copyright license, but about copyright law, which doesn't allow modifications by default. Only a EULA or other license can then allow modifications.

OTOH, even the GPL is pretty much unenforceable these days, unless you are well funded enough to afford time consuming legal cases.



Of course they will review it in their spare time. They wouldn’t copy it, but they could certainly learn insights and apply them in their architectures.


No, they won't. Any competitors and any open-source projects will advise their engineers to stay as far away from it as possible.

It's a clean-room engineering situation, if you've even looked at it you're legally contaminated and you're now a liability to the parent company/project's IP. And now those projects have to be hyper-aware of any contributors who might "sneak in" a bit of NVIDIA code that gets the whole project taken down.

https://nitter.eu/_Lyude/status/1498811646697000961


If they release Nvidia's code it can't really help Linux compatibility out without being an IP nightmare. Miners won't have a problem using it, though.


Not necessarily. There is no law against writing an implementation of the drivers as long as you don't copy the IP-protected code but only use knowledge about how the code works.

If you need to be on the safe side, the person writing the free code should never look at the IP-protected code but instead read a summary of the findings by someone else (clean room implementation).


It doesn’t even help miners really because the difficulty will just adjust. I don’t think these guys even understand the basics of what they do.

The only way I could see it helping is if they have all Nvidia cards and other people have some AMD lol.


It hurts the resale value of those cards for gamers. Hence in the long run, it will also keep gamers from upgrading that card as easily, because they will get less for their existing card.

It's a flimsy argument, but it's the only one I know so I figure that is what they meant.


I'm not even sure this is really miners, or just garden-variety ransomware/extortion hackers who are trying to co-opt several old arguments (first LHR, now open-source) to gain sympathy.

Even "we're doing this for the miners because we oppose artificial product segmentation!" is far more sympathetic than "we're doing this for ourselves, because we like money".


For a long list of reasons, I don't like DRM-esque features like LHR, and I don't like the crypto craze.

(And to be clear, cyber-extortion is clearly a criminal activity.)

BUT, I think the "helping gamers" angle sounds a bit hollow.


What long list of reasons? LHR has absolutely zero impact on games. If anything, its presence was helping gamers because miners were at least somewhat put off buying LHR GPUs, making them more available to gamers.


"You don't own your hardware" is a very long list in my book.


By that metric you never did, and LHR doesn't change anything here. It's just yet another of the million optimizations baked into the drivers already, except this one is at the BIOS level.


Ever heard of Code 43? Or the dozens of other software locks nvidia uses to be able to sell you essentially the same silicon at 4 wildly different price points? You never owned it and you won't now.


They were trying to make their products less attractive to people who were buying out their traditional customers. You aren't even going to own a GPU if they're all hoarded by "crypto" miners.


That isn’t anything game-specific, rather something that applies to anyone who bought an NVidia card.


It helps them in one dimension and harms them in another. I think the ridiculous scarcity and price mechanics of graphics cards right now is a way worse issue than drivers, and we can advocate to have that fixed. Doesn't matter if they fix the drivers while cards are still 2k+ because I won't be using them anyway.


> I don't understand how this demand helps the gaming community.

Agreed. But I don't understand how this helps the mining community, either.

Removing the LHR restriction would lead to a vast increase in the network hash rate, and consequently the difficulty would also increase, leading to lower rewards. The result would be that (1) non-LHR card owners' rewards go down significantly, and (2) LHR card owners' rewards probably wouldn't change much, and might even go down as well.

At least 80% of Nvidia GeForce cards sold since Summer 2021 have been LHR. That's a lot of hashing power locked up.


Yep, it’s the red queen’s race problem. If LHR is lifted, your hash rate will rise, but so will everyone else’s, so you are either (1) no better off or (2) worse off because other people may use this increased efficiency per chip to buy more chips, thus increasing their hash power proportionally. No one will ever decrease their hash power.


Won't your hash rate rise more than the average hash rate since not all cards are LHR cards?


Yes, your slice of the pie would improve more than the average slice, but the size of the overall pie would be much smaller.
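Back-of-the-envelope version (all numbers invented; assumes the total reward paid out per day is fixed and split in proportion to hash rate):

  DAILY_REWARD = 1000.0                  # arbitrary units, fixed by the protocol's emission schedule

  def daily_income(my_hash, network_hash):
      return DAILY_REWARD * my_hash / network_hash

  # Before unlock: 600 MH/s of non-LHR hash + 400 MH/s from LHR cards running at half speed.
  print(daily_income(10, 600 + 400))   # non-LHR miner with 10 MH/s: 10.0
  print(daily_income(5, 600 + 400))    # LHR miner currently capped at 5 MH/s: 5.0

  # After unlock: the LHR portion doubles to 800 MH/s.
  print(daily_income(10, 600 + 800))   # non-LHR miner: ~7.1, strictly worse off
  print(daily_income(10, 600 + 800))   # LHR miner now at 10 MH/s: ~7.1, better, but nowhere near 2x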


At an individual level though, increasing the hash rate of your setup without impacting energy usage means more efficient mining (cheaper to run and faster to recoup the investment). Long-term, the least efficient miners get forced out until there is no profit to be had for the effort involved.


> increasing the hash rate of your setup without impacting energy usage means more efficient mining

No, it just means you get more MH/s. It's also (probably) not cheaper to run, unless the cards actually draw less power as well.

Whether that's cost-efficient or not depends on the reward. And with hundreds of TH/s suddenly being freed from LHR, your individual reward will all but certainly go down.


Yes, but no. The ransomers probably invested a considerable sum buying up every last used LHR card beforehand, so they'd stand to gain if Nvidia changed their policy. That's likely how they'll get caught.


If they had tons of LHR cards and the code to unlock them, the last thing they would do is share the unlock code with others (= the competition).


It's not like typing in a Faxanadu code. The ransomers are sitting on a mess of confidential information they lifted from Nvidia, and they likely have zero clue how it all works. Keeping it to themselves was probably the original plan, but once they got a taste of how tragic the engineering is behind the curtain, their only chance of recouping the investment became (1) forcing Nvidia to write the driver update for them; or (2) leaking it and hoping some open source developer volunteers.


The article is confusing. It has conflicting information (the hacker group seems to have released conflicting statements).

First, it's about LHR only.

> We want nvidia to push an update for all 30 series firmware that remove every lhr limitations otherwise we will leak hw folder. If they remove the lhr we will forget about hw folder (it's a big folder). We both know lhr impact mining and gaming

Later, it becomes about entirely open sourcing the drivers:

> So, NVIDIA, the choice is yours! Either:

> –Officially make current and all future drivers for all cards open source, while keeping the Verilog and chipset trade secrets... well, secret


Just sell more GPUs. Nvidia makes more money. We get more and better GPUs.

I'm not sure how this is bad.


So I take it you haven't tried to buy a 30-series GPU for gaming, then. It's impossible*, and that's because of mining*.

* Yeah, yeah, I know, supply chain this and pandemic demand that, but it's really mining that's the culprit.


I bought one the other day, no hassle, from a local store. They had a pretty wide selection of 30-series GPUs. The cost of electricity is high in Australia though, so it is certainly less economical to mine here.


Around here that's a rare story. Actually, I don't know anybody who tried this and succeeded. But then we don't have true offline retailers around here; those with offline stores sell most of their GPU stock in their online store at inflated prices. So I tend to chalk that up to survivorship bias ;-)

That's in Germany, where we have some of the most expensive electricity in the world, and prices have exploded since about 09/2021: previously the cheapest offer was about 0.27 €/kWh, or maybe even 2 ct less if you were lucky; now we're at 0.35 to 0.45 for new contracts, depending on the region. I haven't checked the last few months (and especially not in the last week), but when I checked in early 12/2021 mining still netted some money (IIRC in the ballpark of 3 Euro/day/GPU).
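For what it's worth, the arithmetic behind that ballpark (the card's power draw and the gross revenue are guesses on my part, not measured numbers):

  power_kw = 0.25               # assumed draw of one GPU while mining
  price_eur_per_kwh = 0.35      # the new-contract rate mentioned above
  gross_eur_per_day = 5.0       # assumed gross mining revenue per GPU in late 2021

  electricity = power_kw * 24 * price_eur_per_kwh   # 2.10 EUR/day
  print(gross_eur_per_day - electricity)            # ~2.90 EUR/day net, roughly that ~3 EUR ballpark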


Is it so? I looked up those GPUs at Russian online stores, and they are available for order, although the prices start from $1230 for RTX 3060.

If you think about it properly, it doesn't make sense to pay such a price just to kill some time playing games. A cheaper card would fit this purpose better.

Also, I cannot blame the miners. If I am not missing something, a GPU currently mines cryptocurrency worth 50% of its cost per year (in countries with cheap electricity). I doubt you can get such a return from a bank deposit or any other kind of investment.


$1230 for an RTX 3060 is absolutely insane, and is in no way a fair or reasonable price for that graphics card. The 3060 is supposedly a lower-mid range card, which have traditionally sold for closer to $300-400 (which is what the MSRP of that card is supposed to be).

By comparison, $1230 is over twice the MSRP of 2016's GTX 1080, which was a high end card at the time (and is roughly comparable spec-wise). It used to be that paying over $1000 for a graphics card got you the highest of the high end cards, not mid range stuff.

Obviously it's not just crypto miners raising this price. The chip shortage is real, there's been quite a bit of inflation since 2016, and PC gaming is more popular than it's ever been. But crypto miners are absolutely having an effect, and there's no real way to frame it as benefiting the average graphics card consumer.


Nvidia should be able to meet market demand. You just don't like crypto, maybe. If Tesla was out of Model 3s, who would you blame? Or would you correctly assess that Tesla should ramp up production?


If con artists convinced speculative investors to spend millions of dollars on Tesla Model 3s, light them on fire, and dump them into the ocean, I would not blame Tesla for the reduced availability of Model 3s.


Tesla still got paid.


that's irrelevant to the point about demand


But that's not what he or she was responding to.


You realize it's not simple to just build the chip production facilities to facilitate this? It requires major capital investment and planning.

Looks like you also missed that the chip shortage is affecting automobiles. Many articles exist that explain better than I can why this isn't a quickly resolved issue.


It's not a supply issue.

By design, proof of work requires even more resources as more resources are spent on it. One might say that it expands to consume all available supply.

Because of that, no matter how many cards Nvidia makes, crypto miners will want them all, as long as there is money to be made in crypto.


Not true at all. If enough hardware is available, the pool only expands until the reward covers electricity costs plus a small margin. I.e., bitcoin mining has been electricity-cost bound for a long time now.


Electricity costs plus cost of capital (etc).


in practice that's not even true because some miners are willing to run negative margins because they extract value elsewhere in the process

(the most prominent example is stolen electricity, but also for many miners it's a form of cash laundering. You always get less money out than you put in, but it's clean cash. In China's case they can't get money past capital controls, but you can mine at a loss and sell the bitcoins in a jurisdiction of your choice, that's worth something to them. Also, many miners are front-running exchange trades or extracting other forms of economic value.)


Interesting. How does running a mining operation help you front-run exchange trades?


standard front-running stuff. you can see the transactions that are in the mempool, you can either submit your own higher-fee transaction (likely to be included before theirs) or if you're a miner you can try to mine blocks with your transaction included and theirs specifically excluded so that yours goes first.

that essentially tells you that the price is about to go up or down, so you can make money on the movement.

https://cointelegraph.com/news/front-running-flash-bots-and-...

https://coinmarketcap.com/alexandria/glossary/miner-extracta...
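Very roughly, and with made-up helper names (this is not a real client API), the front-running loop looks like:

  # Sketch only: watch_mempool(), build_copycat_buy() and submit() are hypothetical stand-ins.
  def front_run(watch_mempool, build_copycat_buy, submit):
      for victim_tx in watch_mempool():                  # transactions everyone can see before inclusion
          if victim_tx.is_large_dex_buy():               # a trade big enough to move the price
              mine = build_copycat_buy(victim_tx)        # buy the same asset first...
              mine.fee = victim_tx.fee * 1.5             # ...and outbid on fees so it gets ordered earlier
              submit(mine)                               # then sell into the price move the victim causes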


Thanks.

Most of what you suggest doesn't benefit from having significant mining capacity yourself. (Though one thing does.)


There are limits to manufacturing capacity, especially such advanced manufacturing.

I feel like if Nvidia could, they would have scaled up. Why wouldn't they want to meet demand?


Uh, because you can inflate MSRP into the stratosphere and blame it on crypto miners?

Cryptocurrencies give NVIDIA the perfect bullshit excuse that people won't ponder too hard over, because the story of modern media is simplistic... every story must have a "good guy" and a "bad guy". It has to be dumbed down for all the idiotic plebs like us.


> Uh, because you can inflate MSRP into the stratosphere and blame it on crypto miners?

Cards are still selling for significantly more than MSRP, so if this was the plan they're doing a terrible job of it.


Nvidia doesn't make their own silicon, they go through TSMC's foundries, which means they're competing for a finite amount of production slots.


I thought their gaming line of 3000 chips is being produced at Samsung


Doesn't really matter. The 3000-series chips are on a cutting-edge node and thus competing with everything else that also wants to make chips on it.

It really is a multi-modal market. And demand for that node is different from the automotive woes...



If you have been watching, this has literally changed in the last two weeks (also, compare those prices to MSRP). It's not a coincidence that this is happening at the same time that Ethereum moving to proof of stake has become real to people.


>It's impossible

There's plenty listed on sites like ebay or rakuten. It's definitely possible to buy. Else you can go buy it when it gets restocked somewhere else or go buy it at a physical store.


> There's plenty listed on sites like ebay or rakuten

If you ignore the 90% to 200% markup from scalpers, sure.

> Else you can go buy it when it gets restocked somewhere else or go buy it at a physical store.

Ah yes, how could I forget the mythical websites that have stock for more than 30 seconds, or the physical stores you don't need to line up at for hours beforehand and follow 10 Telegram channels just to know about.


No? I can buy a 3080 and get it under a week from multiple vendors. Sure, it's expensive. It's not impossible though.


I think the argument is that driver source being open could help the (tiny!) Linux gaming community, but I agree. This is about mining.


Percentage-wise it looks small, but the numbers are impressive. Monthly active Steam users on Linux were reported at over 1 million last year: https://www.gamingonlinux.com/steam-tracker/


That will easily double in the next year with the release of the $399 Steam Deck and accompanying Proton and Steam improvements. Linux is finally about to become a serious gaming platform. (No thanks to Nvidia)


Steam deck uses AMD GPUs though, so this wouldn't help.


But the improvements Valve is making to Steam, Proton, and other areas along with game developers paying more attention to Linux compatibility will benefit all Linux users, not just Steam Deck users, and make Linux a more popular gaming platform even among people with Nvidia PCs.

Yesterday I installed Steam on my Linux machine and downloaded a random game that has no Linux port and played it. It worked perfectly, just like Windows. It was amazing.


How is that relevant to Nvidia GPU source code?


As stated upthread, "the argument is that driver source being open could help the (tiny!) Linux gaming community", and the further argument that the community will be getting larger and not only because of people using Steam Deck hardware.


I see, makes sense.


it absolutely cannot, because no official developer is going near this with a 10,000 foot pole.

https://nitter.eu/_Lyude/status/1498811646697000961


OSS terrorists blackmailing a corp into opening their code under threat of releasing IP/trade secrets - cyberpunk future here we come :)


My time has come, finally! ;)


I don't think I've seen beetlejuicing on HN before. Well done.


I doubt nvidia would have complied with their demands anyway, but the new demand "make all your code open source or else we'll release it ourselves" gives them absolutely no incentive to comply at all. If the choices are you can have your source out there but unlicensed so it's illegal for anyone else to use it, or you can choose to legally let anyone else use your source -- why would nvidia ever choose the latter?

The original version of the demand at least gave nvidia some business incentive to comply. I don't think they've thought through their demands very well.


That's a good point, although they are threatening to release slightly more than just the drivers ("the complete silicon, graphics, and computer chipset files"). But the legal vs. illegal access is definitely interesting and probably not something they took into account. I don't see, say, AMD stealing from "publicly stolen" technology, I can't imagine the insane lawsuit that would ensue. Maybe some Chinese company might do it...


I don't think this group is thinking within the constraints of the law. There can be no trust in this, the data is out there and no longer in Nvidia's control. Even if they fully accept the terms, the group can still do what they want.


who would fab their chips though? i can only imagine tsmc getting an obvious nvidia knock off and refusing to fab it for them for fear nvidia would take their cards somewhere else to fab.


Your argument doesn't make sense. If TSMC are the only people who can fab them, why would they be concerned about Nvidia going elsewhere? If there is an elsewhere, that's where the chips can be fabricated.


Think of all the security vulnerabilities and patent violations that would become apparent in case the Verilog is released too.


> I don't think they've thought through their demands very well.

If you, as a person with technical skills, aren't making money hand over fist with a normal job in the tech sector and instead are doing illegal things like this, it's very likely that there is something wrong with your judgment in general.


And what about when you have enough money?


unless my reading comprehension is failing me, they are threatening to release nvidia's hardware design files, not the driver source code.


They want the driver code public but not the hardware code.


> We decided to help mining and gaming community," Lapsus$ members wrote

Oh please. You have crypto and a vested interest in making GPUs viable again.

Sorry, but I have zero sympathy: crypto miners exacerbated the silicon shortage and leveraged their increased capital to make consumer cards inordinately difficult to get. So no, suffer.


This. I tried to get back into PC gaming for a while. I play on PS5 and XSX, but damn, for the price of a performant GPU I can buy 3 or 4 PS5s, or 1 PS5 and a lot of games for it.


You’re definitely right about the price, but I bought a PC anyways for the freedom it gives.

You can emulate older games, get games for 1/4th of the price on second hand market and choose your store.

Paying 60$ for 5 year old games on my switch changed my view on console gaming.

Even if there’s a lot of room for games in the price difference between console and PC, I’d rather pay the higher fixed cost up front and then have cheaper and more games.

Maybe it’s just an ideological thing though and doesn’t make complete economical sense.


> Paying 60$ for 5 year old games on my switch changed my view on console gaming.

LOL. That's my very same reason. Got a Switch in 2021 and I was surprised Mario Kart 8 Deluxe (a game originally for Wii U) was still full price.


Plus I don't usually buy games on consoles for full price... I wait or just buy used


Valid point. Nintendo Games are still particularly expensive, even used.

Zelda Breath of the Wild still costs 40$ or more where I am.


Much more likely they don’t have crypto and want to earn it with the excess gpus they have

A balance of crypto is unrelated to the mining market, but i understand that was an impassioned response where all things crypto are in the bad bucket in your mind

I’m writing more for anybody passing by who doesn’t understand how they’re separate


> Much more likely they don’t have crypto and want to earn it with the excess gpus they have

Doesn’t matter if they have some or want to get some; crypto miners’ actions had a significant negative impact on the price and availability of GPUs, so I wholeheartedly support mining workloads being crippled on these cards.

> but i understand that was an impassioned response where all things crypto are in the bad bucket in your mind

Between their effect on GPU price and availability, their energy impact, NFTs, and multitudes of other scams, the crypto community isn’t doing a whole lot to not land itself squarely in a lot of people’s “bad bucket”.


I'm not a big fan of limiting hardware to run only approved software. One day it's for limiting crypto mining, the next day it will be for limiting our ability to use strong encryption. It's a slippery slope.


It would be much easier if crypto could just die out.


Why? I saw a good use for it this week when journalists used it to purchase a car in war-torn Ukraine.


That's wonderful. I'm sure a handful of victories can be listed. I highly doubt they wouldn't have found another way to purchase a car, though.

Anyway, I need not remind you of how big a boon cryptocurrencies have been to criminals and con artists. I think it should be pretty clear to anyone on this site why someone would be against them, just like it's clear to me why someone would be for them.


> I highly doubt they wouldn't have found another way to purchase a car, though

There was no cash. ATMs were dry.

> boon cryptocurrencies have been to criminals and con artists

It is probably overstated.

Orders of magnitude more criminal activity would be going through cash and even orgs that are meant to KYC, such as Western Union or even iTunes gift certificates. How many grannies are sending crypto vs providing logins to their bank accounts?

I’m not a crypto-apologist, I just haven’t heard any solid arguments against it that can’t also be said about cash or the traditional banking system.


> There was no cash. ATMs were dry.

paypal, venmo, cashapp etc would have worked the same. for all of them and for crypto you need to be in the ecosystem (need wallets, accounts, apps etc) and internet access.


All very US centric apps (with the exception of Paypal), but PayPal would gouge them on fees.

I guess what I want to get across is I’m not evangelising crypto, I just don’t necessarily want the space to die. I want them to keep iterating on the concept and see what other benefits can be generated.


I highly doubt they wouldn't have found another way to purchase a car, though.

Exactly this. I'm guessing if they needed food they'd be out of luck.


You’re probably not wrong.


If war broke out in my country I’d be glad I could take my money across the border by memorizing some words rather than dragging silver or possibly now worthless currency.


Got a link to that? Cryptocurrencies seem to me to be a poor choice in a situation where there might be issues with infrastructure like the electrical grid or internet connectivity.


> Got a link

  Got a car today. Bought it with #bitcoin  as all ATM's are almost cleared out and the credit card terminals are down. Hopefully, it will allow us to report a little from #Donbass while also keeping us safe and granting us an escape if things starts heating up here. [1]

[1] https://twitter.com/efmikkelsen/status/1497283116028686336?s...

They did end up both getting shot in the leg(s), but are now safe


Afaik, you send bitcoin by radio waves from a transmitter.

To run that transmitter, you will need electricity. The electricity can be made in such situations via a generator/inverter by burning diesel.


Was that single car worth the energy waste equivalent of a small country?

Detonating a nuclear bomb to kill one person isn't exactly a praise for its effectiveness, either.


A single transaction does not use the energy equivalent of a small country. The last estimate I've seen said one used around 1719.51 kWh. That's still a whole damn lot (around 60 days of energy use for an average US household), but you won't power a country with it.
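Rough sanity check on that figure, assuming an average US household uses about 29 kWh/day:

  per_tx_kwh = 1719.51          # the per-transaction estimate quoted above
  household_kwh_per_day = 29    # rough US average (~10,600 kWh/year)
  print(per_tx_kwh / household_kwh_per_day)   # ~59 days, i.e. about two months of household use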


> Was that single car worth the energy waste equivalent of a small country?

What if this energy waste enabled the purchase of 1,000 cars for people in desperate need? Would it then be worth it?

What if it were a million cars, or a billion?


It won't, and it's a very important technology since it's the only way for humans to store digital property without relying on some government or corporation.

Also, if it ever gets attacked, much more will be at stake, including your freedom to choose what software can be run on your computer.


> It won't, and it's a very important technology since it's the only way for humans to store digital property without relying on some government or corporation.

Proof of work is the thing people have a problem with. There's lots of cryptographic techniques that you can use without proof of work.

And the only thing the likes of bitcoin need proof-of-work for is to avoid double-spending attacks.

Not quite sure what you mean by digital property. I can store valuable files just fine without proof-of-work. And we can copy them and give them to everyone in the world, too, at almost no cost.

Cryptocurrencies (and similar) allow you to keep a digital ledger. But the meaning of that ledger is still something we construct socially.

As a silly example: people agree to treat bitcoins as fungible. But they are perfectly traceable, so people could also agree to reject some specific bitcoin (and any descendent bitcoin that can be traced back to the banned ones; so if you mix them in a transaction, you just spread the taint).

For a slightly more real world example: I can imagine if Satoshi rose from the grave and held an auction to sell off the first bitcoin ever mined, he would get more than whatever bitcoin is currently trading for.
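A toy illustration of that taint-spreading point (made-up graph; real chain analysis is far more involved):

  # Toy taint propagation over a transaction graph (output -> outputs it was spent into).
  from collections import deque

  spends = {                       # made-up graph
      "banned0": ["tx1a", "tx1b"],
      "tx1a": ["tx2a"],
      "clean0": ["tx2a"],          # a clean coin mixed with a tainted one here
      "tx1b": [],
      "tx2a": [],
  }

  def tainted(start_outputs):
      seen, queue = set(start_outputs), deque(start_outputs)
      while queue:
          for child in spends.get(queue.popleft(), []):
              if child not in seen:
                  seen.add(child)
                  queue.append(child)
      return seen

  print(tainted({"banned0"}))      # includes tx2a: mixing spread the taint to the clean coin's descendant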


Cryptocurrencies, not cryptography. My software freedom is not dependent on the existence of cryptocurrencies. My hardware freedom is currently significantly hampered by them.


> Be on the side that empowers people with technology, not the side that wants to take it away.

I'm not really advocating for taking it away. I'm happy with a natural death or a market induced death. I'd definitely advocate for hardware that's terrible at mining because I'm not really being given a lot of options. You can't really say "freedom to use your hardware as you will" out of one side of your mouth while saying "you can't want hardware with restrictions" out of the other. It's a paradox. We both want the hardware that works for us, don't we? If NVidia can make cards that suck at mining, it's my economic freedom and right to choose to support that, and I would, just like it's your economic right and freedom to buy all hardware in existence to waste energy.

I can tell you 100% that cryptocurrencies are the opposite of empowering for me and those around me in my current life.


And that is where the contradiction exists — let’s say you get your way, and governments start arresting people for doing arbitrary computation on their graphics card — let’s say hashing. That would be very bad for everyone’s freedoms, software and hardware. The same anti-cryptocurrency playbook is what was used against cryptography in the 90s. Technology that challenges the powerful is dangerous!

It’s ok to not like cryptocurrency and not use it. Others can, and will. Be on the side that empowers people with technology, not the side that wants to take it away.


If we just arrested people for selling them for money instead, 99% of the problem would go away overnight. People could trade them as stamps or even barter them if they're so inclined; nobody has to be arrested for performing hash collisions on strings.


freedom for me, but not anyone else. Genius.


That reply probably sounded pretty smart in your head. Maybe you think I'm suggesting that we should outlaw or criminalize them? That makes sense based on your previous comment. But I said no such thing. Wishing it ill is not the same as taking away your freedom.


How on earth does it stop reliance on a government? Or specifically, prevent reliance on some entity monopolising force? This crypto wet dream, like all libertarian wet dreams, ultimately relies on there being a benevolent entity that prevents someone with a bigger gun than you from marching in and taking whatever they want.


There are plenty of examples of hardware being artificially constrained, and low hash rate GPUs are hardly the first. Slippery slope fallacy aside, why would you need a top of the line GPU to do strong encryption? That's something you can do with a pen and paper if you're motivated enough.


I just hacked my 3DS because Nintendo is closing the eShop permanently. The hack uses GPUs to exploit something in the OS based on known keys. The power needed to crack the code in 1-5 minutes requires powerful GPUs. Are you suggesting that if NVIDIA decided to block something like this, I should do it with pen and paper?


Yeah any artificial limitation on strong encryption would be on the decrypt side, not the encrypt side.


and you can send messages for free by lighting a fire and creating smoke signals, so why bother having the internet? /s


> why would you need a top of the line GPU to do strong encryption

He's referring to the fact that this is an assault on general purpose computing, not that he specifically needs a GPU to do encryption.


> I'm not a big fan of limiting hardware to run only approved software. One day it's for limiting crypto mining, the next day it will be for limiting our ability to use strong encryption. It's a slippery slope.

I'd assume that developing extra features costs money. If a buyer doesn't need a feature, why should they have to pay for it?

For example, if a gamer does not need drivers for a CAD program, or crypto features, why should the gamer have to foot the bill for developing these features, and suffer the consequences of them (e.g. scarcity due to cryptominers buying all gaming cards) ?


Because the hardware already does all of those things whether you paid for them or not. I don't care what the company thinks, if I pay for hardware I want the ability to use it to its fullest extent and without any question about my intentions. That is what it means to have computing freedom: I shouldn't have to answer to some bullshit corporate concerns over what I'm doing with my property.

It's one thing to test your chip and turn off defective parts so that you can sell a downgraded but still viable version and increase yields. Selling the exact same chips to everyone with software that locks out features just to segment the market is unacceptable and frankly quite offensive. It's the sort of thing that makes everyone wish they get cracked on principle, just to make things sane again.


> Because the hardware already does all of those things whether you paid for them or not.

AMD and Intel GPUs have the hardware capability of being great at CAD, but in practice they suck and nobody uses them because AMD and Intel do not have driver integration & certification with CAD software.

Why? Cause developing that costs money.

NVIDIA hardware and software only has this _because_ people that need it pay 10k$ for GFX cards that do these things.

If you'd wanted gamer cards to do these things, you'd need to add to the price tag of gamer cards the cost of developing and maintaining these things, which would make them more expensive.

The claim that this should be free because software and hardware cost nothing once they are already developed is illogical. CAD software, GPU driver software for CAD, and GPU hardware that meets CAD's demands all continue to evolve, which costs money.


There is a difference between hardware that simply doesn't support a use case and hardware that technically supports stuff but was artificially locked to no longer do so.

If the hardware was actually different then of course your argument would make sense, you could build cards that can only be used for a certain use case and sell those, but in that case you wouldn't have to lock down the card anyways


Intel have been selling CPUs with binned cores for ages for performance reasons, and some IBM mainframes require you to license some features in hardware which you've already purchased.

I know it’s counter to human nature to be denied use of the physical object that you bought, but it’s not new territory.


Reply in this comment thread: https://news.ycombinator.com/item?id=30568961


> The claim that this should be free because software and hardware cost nothing once they are already developed is illogical. CAD software, GPU driver software for CAD, and GPU hardware that meets CAD's demands all continue to evolve, which costs money.

I do agree with you, developers are expensive. And if i came across that way then that wasn't my intention.

If a gaming GPU doesn't support CAD tasks, never did so in the past, and nobody expects it to suddenly do so, that's one thing.

However, what I personally was referring to is not lacking software support for a certain feature set, but software throttling of hardware.

Sadly it is something that happens a lot: phones throttling because they received software updates, graphics cards made artificially slow because drivers include code specifically to slow them down.

In my opinion there is a big difference between "Sorry, we don't support that use case, but good luck to you" and "We are going to invest time to stop you from doing X"

The first is fine, it happens all the time. The second is (in my eyes) not.


If by software throttling hardware you mean LHR.

Then, as others have mentioned, there are two options for GPU vendors:

* Hardware unblocked: GPU can be used for mining, scarcity becomes worse, good luck finding gaming GPUs for less than 2000$ for as long as it is profitable to use them for mining (~2 years maybe?)

* Hardware blocked: GPU can't be used for mining (profitably), gaming GPUs can be bought by gamers (still scarcity due to COVID, but not as bad)

The main reason only NVIDIA does this, and AMD and Intel don't isn't because of "freedom", but rather because AMD and Intel GPUs are so bad at compute that nobody can use them for anything but gaming anyways.

Independently of whether one agrees with nvidia's decision here, and whether it actually achieved its goal or not, the thought process isn't really hard to follow.

And I mean, before LHR people were complaining that miners were buying all nvidia gpus, and they couldn't get any, and asking nvidia to do something. nvidia did something, and now that people got to get the GPUs for gaming, they are complaining that they can't mine on them, which is probably the only reason they actually were able to get one in the first place.

So :shrug:. To be honest, it was foreseeable that people were going to complain either way.


Isn’t this the case in many different products? E.g. I can buy a car which has certain features “locked”, because it’s cheaper for the manufacturing company to have a single production line, and helps ease supply chain / inventory management as well.


Yes. And I think the formulation 'I'm not a big fan of' is exactly right here:

It's a bit annoying, but also a trade-off I'd be willing to make for the right advantages.


I absolutely hate Nvidia on my Ubuntu machine geared towards development. It's been about 4 years since I've entered the CUDA(and cudnn) scene for AI development.

You'd think I'd be a guru in the installation process by now but I still occasionally make my system unbootable for one reason or another with new releases, or the whole thing "just don't work" outright.

It is really, really frustrating since every solution is basically "try again or start anew, we don't have the source so can't help you much sorry".


I would rather outlaw cryptocurrencies for being a harm for the planet with their crazy energy demands, besides also fueling criminals.


Be careful what you wish for. While I agree that the current cryptocurrency scene is a mess, prohibiting pure computation would cause more harm than good.


The computations are OK, but exchange with, well, non-crypto currency could be prohibited. Crypto would lose its worth if you couldn't convert it to dollars in a feasible way. Much of the market is driven by speculative investment and get-rich-quick schemes.

Of course, criminal organizations could still use it, but I believe it would be harder to launder money using crypto if banks no longer supported it.


I think a ban on buying and selling crypto might just be good for the cryptocoin ecosystem. The whole idea of cryptocurrency idealists is that it replaces cash transactions, so if there is any value in the system besides pure speculation, the crypto coins should still work as advertised. Services can still accept the coin and goods can be sold for the coin, but you can't launder money through crypto networks anymore.

It's crazy how banks are willing to scrutinise anything related to sex work to an extreme extent yet seem perfectly willing to support the latest weekly pump and dump by letting people exchange money for cryptocurrency at random exchanges.


That ship has sailed. FP64 is artificially limited on AMD 290X graphics cards, as is workstation performance in enterprise software, and neither of those ideas was new at the time.

That ship sailed decades ago, in fact.

And in none of these cases, by the way, is computation prohibited. LHR cards are still capable of mining. A 290X can still run Solidworks. The choice of "prohibited" here is deliberately inflammatory - a word like "slowed" or "reduced" is more appropriate.

(But it is true there are cases like ECC on AMD APUs where it actually is fully prohibited.)


I'd like to engage with this opinion because I genuinely have no idea how this is the conclusion people reach (that crypto is bad for the environment).

The configuration of transistors through which electrons are punched is obviously not the problem (at least I hope it's obvious).

So it must be about proof of work chains, right? That they create high demand? That's about the only point I do understand, but I think you'd be hard pressed to meaningfully translate a bitcoin transaction to a carbon cost. There are some mining farms powered by green energy (less altruistic and more about being off grid).

But if we're talking about energy usage and environmental damage, we have to talk mining.

There is just no debate that mining is THE singular (keyword) greatest threat to the environment. Fracking, acid mine drainage, offshore drilling, sonic bombardments of coastlines, the ridonculous requirements of smelting. The list goes on and on.

We are stuck with that industry for a long time though. I'd argue the minerals that constitute the technology you're using to view this comment may have done more damage than bitcoin and co have.

In a nutshell, here's my problem with this argument: To suggest that demand is responsible for the damage caused by supply is just a crazy warping of reality. That absolutely does not make sense.

There are other, more pressing issues with crypto (like the lack of regulation around fiat exchanges and how they constantly get rugged) and those criticisms are valid, but they get lost in the noise of "how we generate energy is not as bad as how we use it". I just can't get on board with that point of view.


Cryptocurrencies have shown that you can transform energy into money without any of that messy middle step of doing something useful.


They prevent double spending attacks.

That service is useful for some people.

You can of course disagree. Just like you can disagree that building and fueling yachts for rich people is useful.


This entire chain of replies is skewed. Yes, real world mining is a serious offender, but that doesn’t exempt crypto mining. Have one look at one photograph of a crypto mining farm for one second and this will be obvious to anyone.

On the point of double spend - collusion in bitcoin mining is very possible and decentralization is not as strong as claimed. There are tons of studies about this.

https://cowles.yale.edu/3a/parlour-miner-collusion-and-bitco...


> Yes, real world mining is a serious offender, but that doesn’t exempt crypto mining.

Agreed, they have essentially nothing in common apart from the name.

> Have one look at one photograph of a crypto mining farm for one second and this will be obvious to anyone.

Why? You can get pretty bad pictures of eg factory farming, too; still that doesn't mean people will stop eating factory farmed meat.

> On the point of double spend - collusion in bitcoin mining is very possible and decentralization is not as strong as claimed. There are tons of studies about this.

The pdf you linked says nothing about double spending.


> Agreed, they have essentially nothing in common apart from the name.

I don’t know if it’s your English, but you just said you agreed that crypto mining is an offender on the level of other forms of mining.

> Why? You can get pretty bad pictures of eg factory farming, too; still that doesn't mean people will stop eating factory farmed meat.

Except there are people who stop eating meat, reduce their intake, and there are advances in meat alternatives in the market. This is whataboutism 101. It’s quite frankly ignorant.

> The pdf you linked says nothing about double spending.

The PDF I linked is about bitcoin mining collusion, one of bitcoin’s flaws. If you knew anything about collusion and the consequences of a potential 51% attack, double spending is one of the central topics.

Your entire post sounds like paid misinformation.


> Your entire post sounds like paid misinformation.

I wish.


Yes, go on and ignore all the points made - especially your incorrect point about the value of cryptocurrency pollution in providing double spend protections - a feature that can easily be abused by a handful of powerful miners. Sweep it under the rug as you go on justifying the rampant climate impact of cryptocurrencies.


How is that different than central bank printing? Print money and hand it to your buddies so that they buy new yachts instead of producing anything of value?

At least with crypto we know where the money is going and potentially how it is spent.

I can’t believe people still argue against cryptocurrencies in 2022 when it has completely proved itself superior to fiat in every way possible, including overall energy usage.


Nice red flag: people that claim crypto is bad because it uses energy are people that actually trust Elon Musk's arbitrary opinions. Elon knows less about cryptocurrency than a fair amount of degenerate twenty year old speculators.


I think you overstate the overlap of Musk fans and crypto climate impact critics.

You would be mistaken to think that Musk fans are in large part anti-crypto. It’s a weird notion for you to even submit.


Musk has a lot of influence over casual environmentalists - he has the biggest electric car company, after all. His misunderstandings of crypto and what's important have had far-reaching effects on adoption, and plenty of ignorant friends have tried to lecture me, reinforced by what they view as such an authority.


Resource extraction and usage is harmful, and we do have many energy inefficiencies that I, for one, would love to see fixed.

However, that doesn’t excuse cryptocurrencies wasting egregious amounts of energy on top of those inefficiencies.

By design, they are antithetical to serious energy and performance optimisations (e.g. they won’t do away with mindless hashing, so their TPS will be lower by definition), so even if we are able to drastically improve the energy efficiency and minimise the environmental harm of our resource extraction and processing, cryptocurrencies will still have the worst efficiency.


[flagged]


I'm no fan of banks, but that doesn't mean I'm about to trust the same small group of people that brought us The DAO, MtGox, OpenSea and the endless churn of crypto scams.

Your arguments are the same ones that argue for cash, and not more layers of unaccountable middle men or networks that are controlled by a minority of wealthy stakeholders. I even made the argument for cash using that same exact anecdote about Cyprus that you used about a week ago[1].

At any point the government could order exchanges to take a haircut on deposits and transactions. If that happened, 99% of people who hold cryptocurrency would not have the ability to liquidate it without taking a haircut. Even LocalBitcoins needs to follow KYC laws in the US, and will comply with laws of the country as long as they want to operate in the US. If people used any legal exchange in the US, their wallets are associated with their identifying information and there will be a record of transactions on the blockchain that point back to them should they try to escape the haircut.

[1] https://news.ycombinator.com/item?id=30436347.


> What will you do when the government decides to nullify years of your life spent working?

The US Government has nuclear weapons. If the US Government decided one day to start actively screwing me over...

I'm fucked.

There's no two ways about it. Them zeroing my bank account is going to be the least of my concerns.

The argument about "if you use crypto currency you'll be safe from an evil government" holds no sway. The government knows where I sleep.

The only way to not get fucked by the US Government, is to make sure the US Government is run by good people.

You cannot mitigate an evil government. An evil government has so many tools at its disposal, an individual can do nothing.

Hence, I'm forced to trust the US Government. Because if the US Government is untrustworthy, we're all screwed.


Good news! Some of the megacorps nowadays have power almost equal to that of governments! /s


[flagged]


>>Bitcoin has been around 13 years at this point and you still don’t get why folks need to use energy to secure their money.

In those 13 years it has proven to be nothing more than a Ponzi scheme for cryptobros and probably one of the most wasteful endeavours that humanity has ever started. One can only hope that the next 13 years kill it for good, either by imploding internally, or it being outlawed in every country that can.


> Ponzi scheme

Doesn't meet the requirement: New investor money to pay old investors.

Which is not to say it's problem free, but that problem is at the exchange level. Art suffers the same flaws. And Yachts. And on and on and on. That's why we have regulations.

We desperately need regulation at the fiat/crypto exchange level.


Yes it does - Bitcoin is mainly used for speculation. Paying for groceries with Bitcoin is too expensive (transaction fees) and you typically cannot use your Bitcoin savings as a collateral for another investment due to the price volatility.

This means that if you want to cash out, you need someone else to invest their "new investor money" in Bitcoin.


Nah, a lot of things are that without being a ponzi, like Amazon stock. It pays nothing. Does nothing. Requires you to find someone who will pay more than you paid and sell it to them to make money.

It’s not.


> it being outlawed in every country that can.

Being pedantic for fun: aren't all cryptocurrencies already outlaws? They operate outside of all the financial laws and regulations we have. In fact I think some proponents cite this as a feature.


>>Aren't all cryptocurrencies already outlaws

No? Here in the UK you can legally buy/sell/hold crypto. HMRC even tells you exactly how to pay taxes on it, with official documents explaining the process.


Oh, I didn't know. I was trying to use the original definition of outlaw, where a person is excluded from protections of the law. Do crypto transactions enjoy the same level of regulatory protection (assuming UK has any) that normal cash transactions do? Like if I am defrauded in a rug pull, can I sue the rug puller?


Well, since it's a civil matter, sure, you can sue whoever defrauded you. But no, the general protections enjoyed by other financial transactions don't apply.


That’s a pretty hyperbolic response. There’s always a risk that mismanaged use of energy results in us never reaching Kardashev Type 1 (i.e. we destroy the habitability of the planet before we get there).


That is true. However I am responding to another sheer Luddite attitude on a tech forum. Hating a new technology for the sake of it is not progress.


So is liking/hyping a new technology for the sake of it (or for the sake of getting rich).


Sure, but one side can be substantiated.

The Satoshi protocol is not about money.

It is a distributed, trust-less consensus algorithm. Finance was just a good candidate to start with.

But we can expand that much further.

Some strange examples I hear about how this is silly:

"You can't have a smart contract act as an employer because it doesn't exist" But neither do companies? They're an abstraction that we agree exists, enforced by law.

It needs to be enforced by law because of the issue of trust.

But if I have a mechanism that removes trust (Merkle trees, say), then what's the issue? The contract will do what it states it does because that's the only option it has.

Granted, it's a touch utopian and I'm not advocating for setting this system up tomorrow. But it's absolutely an achievable goal.
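
To make that concrete, here is a minimal sketch of the kind of trust-removing mechanism I mean - a toy Merkle tree in Python, with made-up leaf data. The point is only that anyone can verify membership against a published root without trusting whoever handed them the proof; real chains add canonical encodings, domain separation and so on:

    # Minimal Merkle tree sketch: build a root over some leaves and verify an
    # inclusion proof against it. Illustrative only, not a production scheme.
    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(leaves):
        level = [h(x) for x in leaves]
        while len(level) > 1:
            if len(level) % 2 == 1:          # duplicate the last node on odd levels
                level.append(level[-1])
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    def merkle_proof(leaves, index):
        """Return the sibling hashes needed to recompute the root for leaf `index`."""
        level = [h(x) for x in leaves]
        proof = []
        while len(level) > 1:
            if len(level) % 2 == 1:
                level.append(level[-1])
            sibling = index ^ 1              # sibling is the neighbour in the pair
            proof.append((level[sibling], sibling < index))
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            index //= 2
        return proof

    def verify(leaf, proof, root):
        node = h(leaf)
        for sibling, sibling_is_left in proof:
            node = h(sibling + node) if sibling_is_left else h(node + sibling)
        return node == root

    leaves = [b"alice pays bob 1", b"bob pays carol 2", b"carol pays dan 3"]
    root = merkle_root(leaves)
    assert verify(leaves[1], merkle_proof(leaves, 1), root)        # honest proof checks out
    assert not verify(b"forged tx", merkle_proof(leaves, 1), root)  # tampering fails

Obviously this says nothing about whether a contract's terms are sensible - only that the data it runs over can be checked without a trusted middleman.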


Cryptocurrencies and smart contracts change some aspects, but they don't remove trust completely. And people don't want them to remove trust completely.

Remember when Ethereum forked because a smart contract many people cared about actually specified something different in its code than what the humans intended it to mean? See https://en.wikipedia.org/wiki/The_DAO_(organization)


> But if I have a mechanism that removes trust (merkle trees) then what's the issue? The contract will do what it states it does because that's the only option it has.

The problem I have with this line of reasoning is that it does not consider the interface between on-chain and real-world realities. Some of it is easily doable, like money (if I have a token, I have a token), but many things are not at all easily doable, like land use. I can write on the chain that I have a piece of land, but that does not give me any control over it. If I have to bring in the government to enforce land usage, then why not let them do the owner tracking too? If they disagree with the blockchain, it's not like I can do anything about it.


Sure, but that doesn’t apply here. I am a technical individual who can make a case for the technology, and have for many years. All you precoiners do is complain that everyone else is getting rich and/or destroying the planet. It just isn’t so!


What case do you want to make about the technology, though?

That it is cool is, I think, pretty much accepted by everyone. The objection is that just because something is cool doesn't mean we should do it, especially if it's to the detriment of the planet we live on.

That it is useful? If you have time, please make that case. To avoid having a flamewar over it, let's try to be concrete and cite an actual project (even if in progress).


> There is no better metric of human progress than energy usage.

True, but only when energy is used to manufacture products or services. When you turn energy into money there's no progress involved and no new jobs are created. The only linked activities which would benefit would be energy generation and the manufacture of mining hardware. Everything else will suffer as more and more money is diverted to buy more energy (whose prices will of course be dictated by the highest bidder) and more mining farms. Of course I expect people with vested interests in cryptocurrencies not to care at all.


Countries are getting kicked out of the global banking system, and WESTERN countries are starting to confiscate bank accounts. Inflation is at levels not seen in 30 years.

And that’s just last month.

Bitcoin is a product. It autonomously provides a service. Sound money. Hard money. Censorship-resistant and decentralized peer-to-peer money. It’s ok if you are still skeptical, but don’t wait too long!


Are you suggesting that it would be a good thing if Russia used cryptocurrencies to evade sanctions?


I am suggesting a monetary system that is beyond the influence of the state is a good thing.


Good for criminals and rogue states, certainly.


Sure, but also the dissident, someone fleeing a war zone, someone having their life savings destroyed by bankers and politicians…

It’s ok if you don’t have a use for decentralized money. I hope you never need it, and continue a sheltered life of luxury and convenience.


Someone fleeing a war zone probably shouldn't depend on currencies that require Internet access.

For defending one's life savings, I'd recommend investing in productive enterprises rather than speculative and volatile get rich quick schemes, but to each their own.

For dissidents, crypto might be some small help, but not much. (For example, crypto did help WikiLeaks accept donations, but Julian Assange is still in prison.) Even with cryptocurrency, dissidents still need to reveal themselves in the act of protest.


I'm sure this info is very helpful to the people who invested their funds in the Russian stock market - are they not productive enterprises? Now completely useless as money.


> Inflation at levels not seen in 30 years.

Depends on what country you look at.


Yeah, let's piss away a bunch of energy for no good reason and then say we're a more advanced society because of it!


You are mostly describing the mineral mining sector though :D

"Shiny metal, ug ug, me dig up".

Which is not to say gold has no practical use, but let's not pretend that's the only reason we're mining it.


Would seem to be a rather misinformed comment even to a crypto-layperson.

I’m surprised we don’t have a chorus of cryptophiles informing you that crypto emits fewer greenhouse gases than the traditional banking system, and that there are coins/tokens that don’t require melting chips to establish proof of work.

I can’t speak to the use of cryptocurrencies for crime, but I would be surprised if the total volume of transactions were greater than that of traditional currencies (ie cash).


A couple of million transactions per day: https://www.statista.com/statistics/730806/daily-number-of-b...

That's less than the number of private transactions in a moderately sized city, never mind business or financial transactions.

Random comparison: NASDAQ alone sees 30-40 million transactions per day. http://www.nasdaqtrader.com/Trader.aspx?id=DailyMarketSummar...


It’d be real interesting to do an analysis of the last mile power usage of the credit card networks / banking system. I wonder if the giant volume and ubiquity of credit card terminals tips the scales.


...but I would be surprised if the total volume of transactions were greater than that of traditional currencies (ie cash).

You cannot compare the two. If you are poor and someone gives you crypto, you still won't have better food or a better apartment until you turn the crypto into cash. It isn't surprising that a less useful, simple form of transaction notation (which is really what money is and always has been) sees fewer transactions than the sort of currency that can almost always buy you food, let you watch a movie, and pay your rent.


A symptom of being early or, more precisely, of having comparatively low liquidity. As the market cap increases, liquidity rises and the price stabilizes, which leads to the final requirements for layer 1 of a currency.


Cryptocurrencies are pretty green, apart from Bitcoin and Ethereum.


Also worthless, because they just keep printing them and giving them away to VCs, who then sucker engineers into building them a product/ecosystem.

You will see it eventually David!


> Also worthless, because they just keep printing them and giving them away to VCs

You literally just described what's happened with fiat currencies in the past decade.

Endless printing of money and giving it away to big banks.

I guess you'll see it eventually randomhodler84.


Except fiat currencies have a value in the market based on the economies of the countries that issue them.

I have yet to see a cryptocurrency that has a value in the market based on anything other than "a bigger fool might come along".


I guess you don't understand what value the seal of approval a dollar has - for that is what Bitcoin generates but without the backdoor devaluation and cronyism of private banking.


Sorry, that sentence doesn't make much sense as it is, could you please expand on it?


Why don't you start by explaining what you are confused on?


I know, that is why I am bullish on Bitcoin and bearish on shitcoins.


The relevant metric is probably something like electricity spend (or ton of CO2 emitted) per dollar of economic value created.

Bitcoin and ethereum burn a lot of electricity, but they are also worth a lot.
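
As a back-of-the-envelope illustration of that metric (all numbers below are placeholders I made up, not measurements), the comparison would look something like this:

    # Sketch of "electricity per dollar of economic value".
    # Every figure here is a hypothetical placeholder, not real data.
    def kwh_per_dollar(annual_twh: float, annual_value_usd: float) -> float:
        """Annual electricity (TWh) divided by annual economic value (USD)."""
        return annual_twh * 1e9 / annual_value_usd   # 1 TWh = 1e9 kWh

    systems = {
        # name: (hypothetical annual TWh, hypothetical annual settled value in USD)
        "network A (PoW)": (100.0, 3e12),
        "network B (PoS)": (0.01, 1e12),
        "payment rail C":  (50.0, 50e12),
    }

    for name, (twh, value) in systems.items():
        print(f"{name}: {kwh_per_dollar(twh, value):.6f} kWh per dollar settled")

The whole argument then turns on which denominator you accept as "economic value created", which is exactly where people disagree.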


However, one could argue that a proof of stake Ethereum (for example) would still be worth as much while at the same time drastically reducing the amount of CO2 emitted. So if we measure it that way then moving away from proof of work should be the way to make everyone happy?

No more crypto miners buying graphics cards, far less CO2 emissions, and crypto retains its value, since most owners don't really care about how it is made anyway and only see it as an investment opportunity.


Yes, if proof of stake can be made to work as advertised.

(Modulo some unforeseen consequences, of course.)


Could you describe your daily routine? I would like to calculate your carbon footprint and compare it with mine to see whose is higher. That way, we can figure out whether you or I (an average cryptocurrency user) am harming the planet more.


I wonder how professional these criminals are. A crime syndicate would silently ask for money. This seems either a false flag operation for a group wanting something else, or much more likely a kid playing around and finding an unlocked door.

That last case, a kid pissing off a powerful entity with a crime, generally does not end well for the kid.

I don't see NVidia publicly giving in and losing face, so there seems to be no upside for the criminal either.


So you think nVidia will want to see the release of all of its chip schematics? That's an interesting position, but I fail to see how that would be a good idea for them. Perhaps if they are hoping the hackers will be apprehended before Friday.


It's a standard rock vs hard place situation. But

1) Why should nVidia trust the attackers? nVidia might give in, and the files might still be leaked. The attackers would have to guarantee both that they won't release the files AND that nobody steals the files from them. A hard sell for the attackers, especially with a kid profile.

2) These files are legally toxic. You can't look at them and then publicly act on their content. So anything a third party can do has to happen at arm's length, parallel-construction style. This also goes for open source devs, who can't risk Nouveau getting kicked out of the legal repositories.

3) It is quite possible the leak is not as damaging as it looks. People in the industry swap jobs all the time, and take knowledge with them. People accidentally make small leaks all the time, being sloppy with data entrusted to them. It seems reasonable to assume other big organizations already have some level of knowledge of the content.

On the other side of the coin is the fully legal loss of control over their software. They also open themselves up to future ransomers ('Danegeld').

I'm not saying high-level people at nVidia aren't swearing loudly right now, but a 'let the chips fall as they may' response seems most likely to me, especially combined with a 'we'll very publicly sue the attackers into the ground, as an example for all wannabes' response.


> 1) Why should nVidia trust the attackers? nVidia might give in, and the files might still be leaked.

They might, but the files will surely be released if nVidia chooses to ignore the attackers. Given the choice between a bad outcome happening potentially and one happening surely, which would you choose?

> 3) It is quite possible the leak is not as damaging as it looks. People in the industry swap jobs all the time, and take knowledge with them. People accidentally make small leaks all the time, being sloppy with data entrusted to them. It seems reasonable to assume other big organizations already have some level of knowledge of the content.

I still fail to see the logic. Assuming the attackers are honest, the choice is between: a) the drivers public, or b) everything (which includes the drivers) public.

Option a) garners significant sympathy. What is the incentive to choose b)?

> They also open themselves up to future ransomers ('Danegeld').

This is the only argument that makes a bit of sense to me.

> especially combined with a 'we'll very publicly sue the attackers in the ground, as an example for all wannabes' response.

That can only happen if they are caught.


> These files are legally toxic. You can't look at them and then publicly act on their content

I keep reading this, but is this just a US issue? I assume people in other countries can look and clone these?

Given that close to half the world's population lives in India and China, and the US makes up < 5% of the world's population, this doesn't seem like a big concern.


What's to stop the group from releasing them anyway? What happens to the files on the hacker's drive - surely the hackers won't delete the files? That would give up the leverage they have. If the files are out there, they could even be leaked by accident (hacker loses laptop, drama in hacker group causes member to rebel, rival group hacks this group, etc)

Risk-wise, I think nVidia has no choice here. They should assume the Verilog source will eventually be made available. The only question is whether the time between now and then is valuable enough to give in to the demands.


There is no guarantee that the complete archive won't at some point be released, that is true, but there is a sizeable, reasonable chance it won't. The alternative is that it definitely is released.

Given this, what is the game theoretical reason for nVidia not to even try preventing the release of the complete archive? I think there is none: they should release the drivers as open source and try to contain the damage.


They won't want to, but I suspect they would prefer it to the actions demanded by the ransom.

Nobody legitimate can do anything with the released information if the hackers leak it. It's a huge patent red flag. They could with the drivers open sourced.


NVIDIA could have violated some patents with their design themselves (even unknowingly). There are legitimate reasons why NVIDIA wouldn't want the leak to happen, even if other companies likely wouldn't touch it.


I doubt China would have any qualms using the information


If you're in certain countries you can't do anything with the leaked information officially. But you can still look at it and learn from it without telling anyone. As long as you conceal the trail, which shouldn't be that hard, you're good to use it.

On top of that, there's the issue of potentially violating patents the other poster raised.

I'd say those are pretty good arguments for nVidia to try to prevent this scenario.


NVIDIA's terrible software (drivers and CUDA) has wasted months of my time. They have what is essentially a monopoly on hardware for "deep learning", so this should come as no surprise. Of course, to a certain extent this isn't their fault; if AMD or Intel got their acts together they could come up with a much better solution, and if it were free/open-source software people would even pay a premium to be free of NVIDIA's cruel grasp.


> Nvidia introduced LHR in February 2021 with the launch of its GeForce RTX 3060 models. Three months later, the company brought LHR to its GeForce RTX 3080, 3070, and 3060 Ti graphics cards. The reason: to make the cards less desirable to people mining Ethereum and possibly other types of cryptocurrencies.

What a bizarre move. Why would they do this?


It was an attempt to decrease demand for the cards, thereby reducing price.

The simplest market solution to high demand and inadequate supply would be to raise the price of the cards. This would maximize short-term profits but Nvidia felt it was not the right strategic move. Why not? Here are some guesses:

- pricing gamers out of the GPU market would damage Nvidia's historical market - video games

- pricing AI users out of the GPU market would damage an exciting new market Nvidia is working very hard to cultivate

- tying a large fraction of your business to cryptocurrency is risky given the volatility of that market

I think Nvidia sees gaming and AI as viable long-term markets and feels very invested in maintaining the health of those markets and Nvidia's participation in them. They are much less interested in cryptocurrency mining customers because they don't see a long-term future for themselves there. Nvidia is willing to take a short-term hit to profits to get more of their products into the hands of gamers and AI researchers.

However Nvidia has not completely forsaken short-term profitability. They have substantially raised the price of GPUs and they are making considerable amounts of money selling GPUs to miners. They are trying to take a middle path and balance competing interests.


That argument falls apart when you realize that LHR didn't cripple nVidia's cards; it merely reduced their mining performance to just a bit stronger than AMD's (but still stronger). Had they really wanted to kill miner interest in their cards they would have reduced hashing performance by something like 90% with LHR. I think all they were ever really trying to do was decrease the relative desirability of nVidia cards to miners specifically, not eliminate it entirely.


So if they ever have a bad quarter, they can increase the LHR overnight and force crypto miners to buy the ‘new’ cards to stay competitive.


Gamers buy new GPUs every cycle to every third cycle. Keep them supplied with hardware and developers will continue supporting your hardware too. A decent market, with growth potential from countries getting richer.

AI: get your tech in, get them invested at the software level, and as it scales demand will go up and stay up. Just see how long Intel has been around. A similar position and inertia is a sure way to make money.

Crypto, on the other hand, purely chases profits. And you can't really raise prices for partners (that is, board manufacturers) too much. So even if end-user prices are high, not all of that, or even a big chunk of it, transfers to Nvidia...


Nvidia as a hardware manufacturer has long lead times and low flexibility. As a result they prefer predictable and dependable sales, not random boom and bust cycles. The gaming market is dependable, mining isn't.

They don't mind selling some GPUs to miners, of course. But mining is disrupting the supply of GPUs to the gaming market which could hurt future growth of the gaming market in general as well as Nvidia's share of it. And Nvidia is a company that has no qualms about artificially segmenting markets to apply price discrimination or otherwise manipulating customers as much as they can get away with. So they are trying to separate the markets for gaming and mining by any means they can think of.


Why is the mining market not dependable?


It crashed back in 2018 and dumped a lot of cheap GPUs on the aftermarket. Nvidia plans production runs years in advance and can't handle the randomness of sudden crypto crashes.


Huge variations in the price of crypto lead to enormous mining boom/bust cycles that are impossible to plan for.

Semiconductor fab planning operates at least a year in advance, and even then there are limits, TSMC has other customers and NVIDIA can't order 100% of TSMC's supply even if they wanted to.

But in a day, crypto prices can 10x, or 0.1x, and this occurs relatively frequently (potentially tied to large capital injections from a questionable "stablecoin" called Tether). When that happens, mining suddenly becomes 10x as profitable, and miners hoover up every card on the market for the next 6 months, so the market goes through a prolonged shortage and periods of very high prices (as market economics dictate). Then when it crashes, miners dump all the cards back on to the market, and then NVIDIA has to drop prices to compete with a flood of used cards. It's all basically impossible to plan around given the realities of silicon production.


1. Positive PR from "helping gamers"

2. Product segmentation (nvidia also make mining cards)


3. To actually help gamers and other legitimate users of GPUs


I’m sorry you are being outbid by another guy who wants to mine with a graphics card, but who are you to say that guy securing a censorship-resistant and decentralized monetary system is less legit than your leisure-time amateur Rocket League tournament?

What if the owner uses it to train an ML model? Is that legit then? Or can general-purpose GPUs only be single-purpose, for games? For gamers?


I think my position is well known. I am not the first person to argue that cryptocurrencies are negative because they generate waste by design and are a technical solution to social/meatspace problems. I personally don't really need a GPU right now (lucky for me I can probably hold off a couple more years), but I am pretty concerned about the future of the planet in general and have some increased existential dread and angst arising from the fact that we are finding new and exciting ways to accelerate global warming.

We can just dispense with the part of the discussion where we argue about why we think it is or isn't legitimate and just agree that we have mutually incompatible world views.


Our views are mutually incompatible, but only one of us is correct. :-) Time, the universal currency, will be the final arbiter.

I do hope it works out for humanity though; maybe we are alone in the universe and we are squandering the most unique and precious outcome of random chance: life.


> decentralized monetary system

Binance. Mt. Gox. QuadrigaCX. Cryptopia. Poly Network.

Yeah, decentralised.

Also, censorship resistant? Tell that to the people whose accounts on the "decentralised" money system get banned for using crypto that's gone through a mixer.

The public blockchain makes censoring a given user even easier, because your transaction log is public knowledge.
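
As a toy illustration of why a public ledger makes that kind of blacklisting trivial (the addresses and transfer graph here are entirely made up), an exchange only needs a graph walk from known mixer addresses to its own deposit addresses:

    # Toy taint-tracing over a public transaction graph (all data made up).
    # An exchange can do exactly this kind of walk because every transfer is public.
    from collections import deque

    # address -> list of addresses it sent funds to
    transfers = {
        "mixer_1": ["addr_a", "addr_b"],
        "addr_a":  ["addr_c"],
        "addr_c":  ["exchange_deposit_9"],
        "addr_x":  ["exchange_deposit_7"],
    }

    def tainted(sources, graph):
        """Return every address reachable from the given 'bad' source addresses."""
        seen, queue = set(sources), deque(sources)
        while queue:
            for nxt in graph.get(queue.popleft(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    flagged = tainted({"mixer_1"}, transfers)
    print("exchange_deposit_9 flagged:", "exchange_deposit_9" in flagged)  # True
    print("exchange_deposit_7 flagged:", "exchange_deposit_7" in flagged)  # False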


> Mt Gox

I always loved this one. Magic The Gathering Online: Exchange.

No, that's not a joke, the owner took a pre-existing website for trading Magic cards and turned it into a crypto exchange. And no, that's not a typo, it's not "Magic The Gathering: Online Exchange", they weren't even trading the real cards, they were virtual cards for the game Magic The Gathering: Online.


It never actually sold cards. That was the original plan, but Satoshi’s invention changed the scope a little.

They also got robbed before they opened to the public, were insolvent for their entire existence, found a sucker to take the blame (who did serve time), then founded another shitcoin, fell out with the others, and founded yet another shitcoin.


never stop the hustle!


Or what about that guy finalizing their crypto rug pull? There are just too many bad actors in the crypto community. Most of it seems to be scams and overhyped coins that are completely useless.


Don’t disagree with you. But none of that has anything to do with folks doing PoW to secure a ledger. Infrastructure providers can’t really be blamed for the bandits. Nobody likes thieves, and Bitcoin specifically limits the power of the biggest ones.


Yeah, I think that was point 1. If they succeed in getting reasonably priced GPUs into gamers' hands (while AMD can't), they get a lot of good PR!


Good PR and actual goodness are sometimes correlated and sometimes not.


Nvidia's 30 series GPU launch has been a mess. It has been over a year now and gamers still can't get their hands on the cards. Since they can't seem to make nearly enough of them, they have tried to stabilize demand by cutting miners out of the market.


Gamers were skipping new cards in favor of cheap used mining cards. This move ensures that those mining cards (which conversely cannot be used for gaming) go to landfill, and don't eat into NVIDIA profits.

There's also their claim that it's good for gamers. It's certainly not.


nVidia's moat is not just their hardware, but also software - e.g. CUDA (you need an nVidia card), RTX and DLSS (these are big flagship features of the 3xxx series and need special software support).

If the people using nVidia's software libraries - ML and edu organizations, gamers and game developers - can't get their hands on the hardware, they won't use nVidia's proprietary libraries and will ignore its market-differentiating features. This makes nVidia's moat against competition worse in the longer term, while serving finicky cryptobros in the short term.

nVidia doesn't want that.


To make the cards less desirable to people mining Ethereum and possibly other types of cryptocurrencies.


I don't get that logic: "We decided to help mining and gaming community", while in reality removing that LHR feature would only help Ethereum miners.


There is something to be said here for individuals burning electricity to get back some of the extreme purchase cost of their GPU. It won't cover the entire difference between a normal listing price and what you actually pay for a GPU these days, but it helps a little.

If you're a gamer and you've bought a 3070, then you can earn back quite a bit of your spending by brute-forcing some crypto calculations when you're not gaming. It'll wear out the GPU a bit faster, but all in all it's probably a decent deal.
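
For a rough sense of that trade-off, here is a sketch of the break-even arithmetic; every input (hash rate, revenue per MH/s, power draw, electricity price, idle hours, markup) is a hypothetical placeholder and moves around a lot in practice:

    # Rough break-even sketch for mining on a gaming GPU during idle hours.
    # All inputs are hypothetical placeholders; plug in current numbers yourself.
    def monthly_profit(hashrate_mhs, usd_per_mhs_day, watts, usd_per_kwh, hours_per_day):
        revenue = hashrate_mhs * usd_per_mhs_day * (hours_per_day / 24) * 30
        energy_cost = watts / 1000 * hours_per_day * 30 * usd_per_kwh
        return revenue - energy_cost

    # e.g. 60 MH/s at $0.05 per MH/s per day, 220 W draw, $0.15/kWh, 16 idle hours/day
    profit = monthly_profit(60, 0.05, 220, 0.15, 16)
    card_premium = 300   # hypothetical markup paid over list price
    months_to_recoup = card_premium / profit if profit > 0 else float("inf")
    print(f"~${profit:.0f}/month, ~{months_to_recoup:.1f} months to recoup the markup")

With those made-up numbers you'd claw back the markup in well under a year, which is roughly why the "mine while idle" pitch keeps resurfacing whenever prices spike.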


Not sure if the hackers understand that NVIDIA is not necessarily the only victim if they leak the hardware. The hardware folder potentially contains IP from other companies and the standard cell library from TSMC.


It's an advertisement. Imagine the offers they must be getting from crypto cartels. Even if Nvidia commits to open source, there's no guarantee nothing will get sold off elsewhere, just not in the big public way that would momentarily tank NVDA and then probably make it skyrocket, just as the chip supply returns and another great crypto rush on GPUs begins, if it hasn't already.

It's exceptionally good timing for some, isn't it?


Imagine if they actually leaked the driver source and verilog files though...


It's the 'secret Coca-Cola recipe leaked to Pepsi' situation: the source code is still protected by copyright and patents. AMD cannot just take it.


Of course, but having it is better than not having it - not just for their competitors, but their users. We could find any fishy things they might be doing in there, make the public aware and pressure them to stop. With a better understanding of the driver internals, we (well, not me, but smart people) could create patches to fix common issues, enable features that were locked away...


There are countries that don't care about US IP laws. China comes to mind.


And, with the current crisis in mind, Russia. They'll be looking to expand their local semiconductor production from now on.


Having chip designs without a state of the art fab won't help them any more than having software without a computer would.

For China and Russia to replace TSMC they'd need TSMC's (and ASML's) secrets, as well as nVidia's.


That's if we're assuming they want to directly reproduce existing hardware - I imagine they'd actually leverage it as something their engineers can learn from for future advancement


I can tell you Russia is so far behind in technology they won't be able to do anything with this knowledge.


AMD cannot just take or copy it. But they can indirectly take advantage of techniques that will be leaked.

For example, we know that NVIDIA's SMs are split into slices, which allows them to easily adjust the size of an SM without rewriting the whole code, but we don't know many details about it; in particular, we don't know exactly how the interconnect between register banks and operand collectors is built within a slice and between slices.

This does not seem to be patented, and it is only one very specific example.


> We decided to help mining and gaming community

How does removing LHR help the gaming community?


It doesn't, it's just a PR statement to get more people on board.


Read the article?

Nvidia released it to make GPUs less desirable to crypto miners so gamers can actually get their hands on some.


Read the OP's question?

If "nvidia released it [LHR] to make GPUs less desirable to crypto miners", then, "How does removing LHR help the _gaming_ community?"


Allows owners to make back some of the money spent on the hardware via mining, a la NiceHash and friends.


Cybercriminals will release demands or rationale like this in an attempt to sway public favor as well as have the company spend resources on the demand/threat instead of remediation or forensics. They're still criminals threatening and extorting others at the end of the day.


I find three components of this very interesting:

They’re requesting an LHR unlock instead of a ransom. Surely you could get $100m out of Nvidia with this stuff; how many 30-series cards do you need for the LHR unlock to be more profitable than that, especially when everyone gets the unlock? Maybe it’s about reputation?

The fact that they’re ransoming data instead of crypto locking isn’t something I’ve seen before. It’s inevitable once targets start backing up, but interesting it’s the first move they made.

The fact that they changed their ransom to include open sourcing the drivers. This surely reduces their reputation and makes them look amateur. IANAL, but coercion by blackmail surely invalidates any otherwise legally binding obligations, right?


This is pretty high on my Cyberpunkometer


Clickbaity title, the demand is the obvious one:

Miners aren't happy that Nvidia doesn't want to be part of their game, and demand that this mining limiter be lifted.

I really hope Nvidia doesn't even consider complying. Screw these people's game and their extortion.


Absolutely insane breach overall: the hackers got everything (including unreleased products), and the only potentially mitigating factor is that the Verilog files for their chips MIGHT be encrypted. If not, it's nothing short of catastrophic.


I think Nvidia has to accept that all that stuff is no longer secret, and work from there. They would be nuts to comply with the demands and I'm sure Lapsus$ knows that but it's good publicity for them (Lapsus$).


Probably a stupid question, but even if the hackers release everything, wouldn't it be illegal for competitors to use that information, since NVIDIA has most likely already patented it?


I'd go a step further and say that any work on the open source Nouveau driver would instantly become a legal hell. When the proprietary code gets dumped illegally, you have to be extremely wary with new submissions to the project because one stolen method and nvidia's lawyers are all over you.

This is why leaked Windows source code is actually terrible for products like Proton and ReactOS.


Of course it would be illegal, but how would they prove it? No doubt competitors would take advantage of having the entirety of Nvidia’s architecture.


Employee calls Nvidia and reports it, receives fat reward.


If it’s patented then it’s already public.


We don't negotiate with crypto terrorists — Nvidia


They say they want to help miners, but...

Congratulations: they just gave lawmakers another argument to move against crypto currencies, especially PoS based ones.


You mean PoW right?


yes


The US is a small part of the world. Just a reminder. You can open a bank account in the EU that allows depositing cryptocurrency. The US is on the decline. Its propaganda is great, though.


China -> banned crypto currencies

EU -> plans to strongly control or outright ban crypto currencies

US -> might also move against crypto currencies

Idk about India, Japan etc., but if the EU and US moved, they would likely follow.

Just to be clear, I'm speaking about crypto currencies as we think of them now; I'm sure there is a future for blockchain-based currencies, potentially even spearheaded by governments.


Doesn't India pretty well want to control their own money? Even more strongly than others?

Not that they aren't, at the same time, willing to provide services, just paid for in fiat...


What fire said.

Also I have hardly any idea about how India or e.g. Japan stand wrt. crypto currencies.

But probably they do want a strong control about their own money.

More or less any stable country wants that and would only give that up for major benefits (like joining the European trade Zone).

Though one thing important to realize is that most crypto currencies are not currencies but securities. The reason is that they (especially Bitcoin) fail some of the major requirements of a well-functioning currency: it MUST NOT be deflationary (main point!), must be cheap to move (for companies at least), must have ways to constrain money laundering (which excludes full anonymity), must be as stable as possible, and other points.

Crypto "currencies" are more like securities, like stocks. But not backed by any real world value and mostly unregulated and 100% a bubble (question is how big).

An unregulated (and partially unregulatable), inflated stock not backed by real-world value should ring alarm bells, which is why I expect more regulation in the future.

Regulation which will likely also "accidentally" hinder (or even outright prevent) crypto currencies which actually act like currencies, as long as they are not state backed.


The parent is hinting at Central Bank Digital Currencies, which would in fact give them total control of their own money (more so than they have over the physical variant, I believe).


If they have the source, can't they just make their own firmware? It's probably not that simple I guess. I'm not a hw guy.


Oh, I really wish those hackers would release the sources rather than pursue their dumbass crypto-mining demands... "We decided to help mining and gaming community" - hurting the gaming community, helping the get-rich-quick "community".

My own C++ wrappers for the CUDA APIs (shameless self-plug: https://github.com/eyalroz/cuda-api-wrappers/) would really benefit a lot from behind-the-curtains access to the driver; and even if I just knew how the internal logic of the driver and the runtime works, without actually being able to hook into that logic, I would already be able to leverage this somewhat in my design considerations.

Also, NVIDIA has definitely had this coming, given its choice to promote its own proprietary execution ecosystem (CUDA) rather than build everything over OpenCL or over both, and to keep so much of its code closed.


Hypothetically speaking, they could release everything and it still wouldn't damage Nvidia much other than PR and marketing.

The driver code can't be used in open source, as that would be IP theft. Their competitors could see how they are doing things, but it isn't copy and paste; those techniques would still require years of understanding and implementation on their own hardware, assuming they are applicable at all. Even if you have the blueprint of their GPU, you can't bring it to TSMC, Samsung or Intel to build it. Their competitors could see it, but again, the magic isn't really in the GPU itself. And again, just like software, even if AMD and Intel did spot a few moves to copy, it would be 3-4 years before they appear on the market.

I don't know if that code includes CUDA, which to me seems to be the most important bit for Nvidia.

I also think there is a lot of upside for Nvidia in playing the victim and adjusting their strategy, from IP licensing to drivers. It will be interesting to see how this plays out.


Their Chinese competition would love it. They have a huge need for GPUs in China to run their surveillance infrastructure. That infrastructure is mostly run on NVIDIA chips. I am sure the CCP wouldn't care about the IP issues. They have fabs that could produce good enough chips (12nm), though not as good as TSMC (5nm).


> That infrastructure is mostly run on NVIDIA chips

Not anymore.


Having the source leaked can be leveraged as a hindrance to third parties.

While it's closed source, any coincidentally similar code is just that, a coincidence.

When it's out in the open, coincidence becomes a potential denial-of-progress attack. How would you prove beyond doubt that something was clean-room created? Sounds time-consuming and costly.


Definitely an easily condemnable criminal act - but I completely see where they're coming from.

That said, given they have all the code themselves, they can just release everything and we'll remove LHR by ourselves and finally build OSS drivers.


I don't approve of blackmail, etc.

But it seems a pretty reasonable request! If you buy a graphics card (or anything), artificially constraining it via software seems wrong.

It reminds me of how phones are updated to run more slowly once newer versions are available. It also reminds me of how white goods (washing machines, fridges) have small plastic components that are designed to fail.

Anyway, this is some sort of modern version of 'planned obsolescence'.

How any artificial obsolescence is legal or moral, when we are also worried about wasting resources, I don't know!

Well, I do know - it's because companies want to make as much money as possible, while letting the public carry the weight of their mistakes.


It's Friday. Did the group follow through on their threat?


Will they go after Intel and demand they open up the ME next?

One can dream...


I imagine Nvidia doesn't own the rights to the complete source code of the driver. Even if they wanted to open source it, things move slowly in big companies.


I would imagine that ownership of Nvidia's drivers would be more clear cut than most open source projects. Wouldn't it only be Nvidia's employees and contractors that have worked on the driver code? Then by default Nvidia would own the rights.

If Nvidia doesn't own the code, do you then imagine they're in a sort of licensing agreement with the rightsholders that allows them to distribute the binary blobs but not the code?

I don't understand what situation would lead a corporate-sponsored closed source project such as Nvidia's drivers to be in a position of complicated ownership.


It could be as simple as using a third party library or snippet of code which they don't have the license to give out the source code for.


AMD and Intel did it. The excuses ring hollow.


At least from my understanding AMD did so by making a new open source driver while the old one remained closed.


That's a fine way to do it and Nvidia should have started 10 years ago.


Irrelevant. It would be relevant if others could write an open source driver. Which, effectively, they can't.


That's why the Tegra "crippled" driver went into libdrm just recently? Bad timing? Or is this to prepare a code "release", namely some full open source driver support? That said, "open source" is not enough anymore: lean code using simple and very stable C (C89 with benign bits of C99/C11 and no more) is just as important nowadays. Cyber Robin Hood.


Hackers trying to make sure crypto users can use GPUs, limiting the supply to gamers? God, the Cyberpunk 2077 release was more of a disaster than we thought.


By the way, ‘lapsus’ as a slang-ish Russian word was around perhaps until the 1970s or 1980s. It's quite rare today. It means ‘a dumb mistake’.

It comes from Latin, and Wiktionary says that it's used in English and Romance-family languages, but I've never seen it outside of Russian. It might be used in other Slavic languages, though.

If the word is not around in Western languages, then presumably it would point to Russian origins, or someone really did their homework to misattribute blame.

(Or I might've missed the news where those guys are already uncovered—so far I only saw it being mentioned that the attack is from South America.)


It is used in e.g. fancy academic settings for writing errors.


"Lapse of judgement" is a fairly common expression for a mistake


‘Lapse’ yes, ‘lapsus’ no.


Would the release of the chip schematics allow a competing nation-state to build its own GPUs? And how likely is that to happen should the files be released?


For a long time I lived thinking that I was doing something wrong when working with nvidia drivers. The Ubuntu installation goes without a problem every time I do it, but there was a tremendous amount of issues that came exactly from installing the nvidia drivers. It was so irritating and unintuitive that I always assumed this inconsistency and issue-generating flow would be fixed in time. But it is still there, and installing the CUDA Toolkit gives you the feeling that you are applying for a junior software engineer position at nvidia and this is the technical interview.


Does the LHR impact the same calculations performed under gaming load?

I feel like nVidia aren't _that_ stupid. But I admit I know nothing about any of this


Pity they don't have the driver source code to make their own uncrippled firmware...


They do, actually; in fact it was already leaked about a week ago (the integ gpu drv RAR file has it, plus CUDA and more goodies).


Integrated NVIDIA GPU? What do you mean?


A quick Googling shows that "integdev_gpu_drv.rar" is the filename the hackers released.


integdev_gpu_drv.rar


hi


So when do we ban cryptocurrencies already? The only use of this crap is crime.


Never, actually. Also you are wrong that the only use is crime. There are literally millions of stories in the last 13 years of cryptocurrency that demonstrate this. But of course keep screaming at the sun if you wish.


Well, many of the especially relevant stories were still about something illegal.

Sure, ethically they were "right" and not illegal from an EU POV, but still illegal in the country where they happened.

Besides that, most other stories are about people getting rich through high-risk speculative investments. But the problem with mass investment into high-risk speculative assets is that it tends to create bubbles which can destabilize the economy, so that's not really that great either.

There are a lot of good use-cases for the technology behind crypto currencies.

But much of the current crypto currency landscape is not great: a lot of bubbles, a lot of pyramid schemes and scams, a lot of things which conflict with money laundering regulations. And many other problems.

I'm fully sure that the related technology (blockchain etc.) will have a future, I'm not sure about crypto currencies like we have them now.


Millions? I'd struggle to name 3.


I couldn’t name 3 baseball players, but that doesn’t mean none exist. It just means I’m ignorant of baseball.


Still "millions of stories" is quite the exaggeration. Or do you count "someone bought and sold crypto coins" as a story?


I'll start: the 10k pizza


It's a (relatively) poor argument, but the US dollar* sure facilitates a lot of crime.

There's an argument that if you want to eradicate a great deal of crime, just change the color of your money, and offer a 1-1 exchange at the bank. An awful lot of dirty money would have to surface, and can be asked about.

*other currencies are available.


You realise crypto is more stable than most developing-world governments? Does that mean the only use of 3rd-world country money is crime, too? What about iPhones that expire after 10 years of usage and get bricked by the manufacturer? Is that a better use of your USD?

Really, what is the scam? Because if you invested in Bitcoin 10 years ago, you could trade that now for a new car without breaking a sweat. They were right. The cryptobros were right that hedge funds would join the game. And they will be right when you can trade that Bitcoin for a house while the iPhone sits in history.


> You realise crypto is more stable than most developing-world governments?

It's not.


Source: "trust me, bro" Here you go, kid:

https://www.statista.com/statistics/326707/bitcoin-price-ind...

https://www.statista.com/statistics/371895/inflation-rate-in...

A great insight would take into account more prominent cryptocurrencies and 3rd-world countries, but you don't throw pearls before swine.


The claim was "more stable than most world governments".

And inflation isn't the only criteria of stability for a government.

But you're correct about pearls before swine though.


The hostilities will only cease if UBI removes the incentives.


Unusual demand or plausible deniability?


Simple, @NVIDIA: threaten to reduce LHR to 10% if anything else is released.


crypto continues to pay dividends...


Ban cryptocurrency yesterday.


The heroes we need but do not deserve.


Linux is the terrorists’ operating system.


I don't understand why gamers should have any priority over miners and programmers in access to video cards. Mining or computing is a serious business. Gaming is just a way to kill time and kids would get much more benefit if they read books or did sports instead.

Also, gamers don't need top video cards; 1 or 2 GB cards are fine for most games.

Therefore, all restrictions on hash rate should be removed.


Gamers and programmers don't deserve any more or less priority than any other group of people. They're simply a stable market, extremely unlikely to suddenly go away in a poof, as could happen to e.g. Ethereum if they (ever) switched to Proof of Stake, or if a few Western countries got fed up and banned them.

This is simply nVidia trying to extract as much profit from the situation as it can, without undue risk of having massive unsold inventory, but also without pissing off the two big markets which will certainly exist tomorrow in order to curry favor with the one which might not.

> Also, gamers don't need top video cards; 1 or 2 GB cards are fine for most games.

yeah and 640k should be enough for everyone.


I don't know which world you live in, but gaming is a serious business as well. Especially the gaming community.


Gymnastics doesn't seem to be useful either. Maybe we should stop people from doing that: They almost certainly injure themselves, and even if they don't, who benefits from someone doing a 180 split?

I don't like baseball either. The game sucks. It doesn't help with productivity either. Students should be discouraged from playing that too.


I really hope you are joking?

You know the gaming industry is larger than the MOVIE + MUSIC industries combined, right?

> 1 or 2Gb cards are fine for most games

Erm, whot now?


If you're mining, buy ASICs that do the job. That way the capacity is available for people who want to game casually, game professionally, render, or compute.


Requiring an ASIC to mine centralizes mining. Everyone should be able to contribute to the consensus process, which is why GPU mining exists. There is a large number of people who already have computers with graphics cards in them.


> Requiring an ASIC to mine centralizes mining.

As compared to graphics cards where people are largely buying cards and dedicating them to the task, how does the use of ASICs centralize mining?


>how does the use of ASICs centralize mining?

Because any gamer with modern hardware can choose to mine and make a little bit of money.



