Hacker News

It will always be cringe because of how so-called "AI" works. Since it's fundamentally just log-likelihood optimization under the hood, it will always produce the statistically most average image, which means it will always have that characteristic "plastic", overdone look.
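The "most average" intuition does have a concrete textbook form, for what it's worth: under a Gaussian likelihood, maximizing log-likelihood is equivalent to minimizing squared error, and the single prediction that minimizes squared error over a dataset is its mean, i.e. a smoothed-out average. A toy sketch of just that textbook fact (the data and names are made up, and modern generators are not literally doing this):

```python
# Toy sketch: under a Gaussian likelihood, maximizing log-likelihood is the
# same as minimizing squared error, whose optimum over a dataset is the mean.
# This is an illustration of the "averaging" intuition, not a claim about
# how any particular image model actually samples.
import numpy as np

rng = np.random.default_rng(0)
# Pretend these are five tiny 4-pixel "images" with sharp, varied detail.
images = rng.integers(0, 256, size=(5, 4)).astype(float)

# The maximum-likelihood point prediction under a Gaussian model:
mle_prediction = images.mean(axis=0)

def total_sq_err(candidate):
    # Total squared error of one candidate against all samples.
    return ((images - candidate) ** 2).sum()

# The pixel-wise mean beats every individual sharp sample on squared error,
# even though it looks like none of them -- the "plastic average" effect.
assert all(total_sq_err(mle_prediction) <= total_sq_err(img) for img in images)
```

Whether that intuition carries over to diffusion-style samplers, which draw from the learned distribution rather than outputting its mean, is exactly what's being argued here.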


The current state of the art in AI image generation was unimaginable a few years back. The idea that it'll stay as-is for the next century seems... silly.


If you're talking about some non-existent sci-fi future "AI" that isn't just log-likelihood optimization, then such a fantastical thing most likely wouldn't be running on Nvidia GPUs with CUDA anyway.

This hardware is only good for current-generation "AI".




