
Yes, that was the original promise of compilers.

COBOL was designed for normal business people to use, remember?

You'll just have to program in an AI-understandable language; I'm sure there will be lots of quirks and tricks, similar to today's languages.




Right, but what do you think about the point ch4s3 made below, namely that AI is non-deterministic? Isn't that a core difference? The COBOL compiler produces the same output for the same input, i.e. the compiler acts as a pure function.


There's nothing requiring non-determinism in these models; it's just there as a means to increase variety. There are obviously scenarios where this makes less sense.
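A minimal sketch of the difference, with made-up logits (the forward pass just produces scores; randomness only enters if you choose to sample from them):

    # Hypothetical next-token logits; the numbers are made up.
    import numpy as np

    logits = np.array([2.0, 1.0, 0.5])
    probs = np.exp(logits) / np.exp(logits).sum()          # softmax

    greedy = int(np.argmax(logits))                        # deterministic: always token 0
    sampled = int(np.random.choice(len(probs), p=probs))   # varies from run to run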


Kinda. There are GPU kernels that are sped up by being non-deterministic, so you also gain performance from non-determinism.


Doesn't temp 0 give you deterministic outputs, though?

Also, you only need to translate/expand once (or multiple times: add tests, pick the best benchmarked solution).

Where this could be useful is in handling updates of packages and APIs by itself. If you integrate only by prompt/words, the AI can generate the appropriate integration for the latest lib that happens to work with your system, or whatever.
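Rough sketch of that loop, where generate(), passes_tests() and benchmark() are hypothetical stand-ins:

    # Generate several candidates, keep the ones that pass the tests,
    # return the fastest according to the benchmark. All helpers are hypothetical.
    def best_candidate(prompt, n=5):
        candidates = [generate(prompt) for _ in range(n)]     # the non-deterministic step
        passing = [c for c in candidates if passes_tests(c)]  # filter by test suite
        return min(passing, key=benchmark)                    # pick best benchmarked solution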


Even temp=0 isn't fully deterministic in practice, most likely due to floating-point non-associativity (sums evaluated in a different order on the GPU can nudge the logits slightly) and related issues.
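The associativity part is easy to see in isolation:

    # Float addition isn't associative, so a reduction summed in a different
    # order (e.g. across GPU threads) can change a logit and flip an argmax.
    a, b, c = 0.1, 0.2, 0.3
    print((a + b) + c == a + (b + c))   # False
    print((a + b) + c, a + (b + c))     # 0.6000000000000001 vs 0.6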


Is that supposed to be a good thing when you're generating code?

Accepting varied, non-deterministic input: great. The same input generating different code each time is not the "feature" you'd think it is.



