>destructive "write lifespan until ultimate failure" real world tests on SSDs
>from 7 years ago
It's from 7 years back for good reason: they stopped doing those tests when it became impractical as endurance increased. The drives are now good enough that you can't wear them out fast enough for it to make sense in a review setting.
The whole review industry just stopped scrutinizing SSDs several years ago, right around the time manufacturers started cutting features like power loss protection and DRAT/RZAT and switching to TLC and QLC.
I find it highly improbable that you couldn't wear out a triple-level-cell (TLC) or quad-level-cell (QLC) consumer-grade SSD capable of 300-500MB/s writes with a 24x7 automated test script in just a few months. Maybe even just a couple of weeks. The published endurance ratings on these (TBW, or equivalently DWPD, drive writes per day) are not that great.
Even assuming a conservative 300MB per second, there are 86,400 seconds in one day. That's 25,920,000 MB per day, or close to 26TB per day. The Samsung 960 Pro 2TB is rated by its manufacturer for a total of 1200TB of write endurance.
Or at least leave it running for a couple of weeks and then compare the SMART-reported remaining write lifespan against the brand-new out-of-box baseline.
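The back-of-the-envelope math above is easy to script. A minimal sketch using the numbers from this thread (the sustained 300MB/s figure is an assumption; real TLC/QLC drives often throttle once their SLC cache fills, so treat this as a best case for the tester):

```python
# Back-of-envelope SSD wear-out estimate using the numbers from the thread.
# Assumes the drive sustains the full write speed 24x7, which real TLC/QLC
# drives often won't (SLC cache exhaustion, thermal throttling).

SECONDS_PER_DAY = 86_400

def days_to_exhaust_endurance(write_mb_per_s: float, rated_tbw: float) -> float:
    """Days of continuous writing needed to hit the rated TBW."""
    tb_written_per_day = write_mb_per_s * SECONDS_PER_DAY / 1_000_000
    return rated_tbw / tb_written_per_day

# Samsung 960 Pro 2TB: rated 1200 TBW; conservative 300 MB/s sustained writes
days = days_to_exhaust_endurance(300, 1200)
print(f"{days:.0f} days")  # prints "46 days", i.e. about a month and a half
```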
>Or close to 26TB per day. The Samsung 960 Pro 2TB is rated by its manufacturer for a total of 1200TB of write endurance.
Right. So around a month and a half. In a world where hardware news drops simultaneously from multiple outlets literally within minutes of news embargoes being lifted.
That's a lot of time investment to get results that are boring AF ("we tested them. they work"). Have very little real-life consumer relevance. And the manufacturer that sent you the review sample definitely doesn't want to see it published (a focus on an edge-case negative).
>Or at least leave it running for a couple of weeks and then see what the SMART-reported remaining write lifespan data reports it to be
Yeah, that would make a bit more sense: run various units down to 95% remaining life. That said, the resulting story would still have watching-paint-dry appeal only.
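Extrapolating a full lifespan from a short run like that is simple arithmetic. A sketch, assuming the SMART wear indicator declines roughly linearly with bytes written (true to a first approximation, though vendors report the counter differently):

```python
# Extrapolate total write lifespan from a short monitored test run.
# Assumes linear wear: if a fraction of rated life is consumed in
# `test_days` of continuous writing, scale up to 100% of rated life.

def extrapolate_lifetime_days(test_days: float, life_used_fraction: float) -> float:
    """Days of the same workload until rated endurance is exhausted."""
    if life_used_fraction <= 0:
        raise ValueError("no measurable wear yet; run the test longer")
    return test_days / life_used_fraction

# Example: SMART remaining life drops from 100% to 95% over two weeks
print(extrapolate_lifetime_days(14, 0.05))  # 280.0 days, ~9 months of nonstop writes
```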
> In a world where hardware news drops simultaneously from multiple outlets literally within minutes of news embargoes being lifted.
Any outlet interested in journalism rather than purely PR can purchase a retail sample on release and publish an endurance report at a later date, however long it takes.
In my experience, people buy storage whenever they need something faster or larger, unlike CPUs and GPUs, where interest peaks around release dates. Storage is evergreen in that sense.
> That's a lot of time investment to get results that are boring AF ("we tested them. they work").
What's boring about that? This is extremely valuable information for any prospective buyer. Either way, you gain a reputation as a trustworthy outlet that people can rely on for accurate information.
> Have very little real life consumer relevance.
I disagree, and I think most consumers would be extremely interested in the durability of their storage devices, especially since most of them rely on it in the absence of backups.
> And the manufacturer that sent you the review sample definitely doesn't want to see (focus on edge case negative).
That's hardly relevant. Informing potential customers about extremely serious flaws in the product is quite literally their job - at least if they wish to have any semblance of integrity, trustworthiness, and respect.
Many choose to sell out and simply echo the approved selling points they receive directly from the company, but not every outlet does this, and it shouldn't be held up as something tech journalists should aspire (or be allowed) to do.
>from 7 years ago
>It's from 7 years back for good reason: they stopped doing those tests when it became impractical as endurance increased. The drives are now good enough that you can't wear them out fast enough for it to make sense in a review setting
...unless they're fundamentally broken, like these.