
The original article talks about moral (not even legal) objections to learning from copyrighted data.

Your comment implies that DALL-E 2 is morally okay because OpenAI doesn't distribute the model ("copy everything it knows to a USB stick") but only sells access to it for generating images, while Stable Diffusion's open-source model is a problem because it can be copied.

Most people, I'd guess, would take the exact opposite stance.




> The original article talks about moral (not even legal) objections to learning from copyrighted data.

But I am replying to:

"My brain has been trained on even more copyrighted material. Every book I read, every tv show I watch, the toys I played with as a child. It's hard to imagine that I could come up with anything that is not inspired by copyrighted work"

The difference is that your brain was not trained by someone else for profit; you trained it yourself, spending years of your life to acquire knowledge and experience.

That is morally acceptable (which does not imply that every use you make of it is legally acceptable), given that you paid a very high price, sacrificing your own time for the objective.

And your knowledge is yours alone: you can't transfer it to anyone; it doesn't even show up in your DNA.

> Your comment implies that DALL-E 2 is morally okay, because they don't distribute the model

Implication doesn't mean what you think it means.

My comment doesn't imply anything of the sort; you are arguing from a false premise.

> Most people would take the exact opposite stance I guess.

https://en.wikipedia.org/wiki/False_dilemma

https://en.wikipedia.org/wiki/False_premise
