
If I google for it specifically, the paper in Nature states:

"This study had some limitations. Mammograms were downsized to fit the available GPU (8 GB). As more GPU memory becomes available, future studies will be able to train models using larger image sizes, or retain the original image resolution without the need for downsizing. Retaining the full resolution of modern digital mammography images will provide finer details of the ROIs and likely improve performance." https://www.nature.com/articles/s41598-019-48995-4
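The memory pressure the authors describe is easy to see with some back-of-envelope arithmetic. A minimal sketch (the resolution, channel count, and float32 assumption below are all illustrative, not taken from the paper):

```python
# Rough estimate of why full-resolution mammograms strain an 8 GB GPU.
# All numbers here are illustrative assumptions, not from the paper.
h, w = 4096, 3328        # assumed digital mammogram resolution
bytes_per_px = 4         # float32

# The input image itself is modest:
input_mib = h * w * bytes_per_px / 2**20

# But a conv net keeps intermediate feature maps. If an early layer
# produces 32 channels at full resolution, and backprop roughly doubles
# activation storage, that single layer already dominates the budget:
channels = 32
act_gib = h * w * channels * bytes_per_px * 2 / 2**30

print(f"input image:            {input_mib:.0f} MiB")
print(f"one early conv layer:   {act_gib:.2f} GiB")
```

With dozens of layers (plus weights, gradients, and optimizer state) the total quickly exceeds 8 GB, which is why downsizing or a bigger card like the V100 is needed.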

Here they use an Nvidia V100: https://www.researchgate.net/publication/336339974_Deep_Neur...

Which, okay, is more reasonable than I thought. But the advantage will still lie with whoever has the hardware, and that's just cheap for Google.

You wouldn't need a market for models; you'd just use whatever research delivers from whoever has the most accurate data and hardware.


