
Somehow a whole industry is now fine with Heisenbugs being a regular part of the dev workflow.


the salary-raise and promo-project industry within large corps is fine with that

then there is everyone else, who is supposed to deliver software that works, like always, and they are not fine with built-in flakiness


If you wanted to you could make the LLM return entirely deterministic results, but it wouldn’t be very helpful since a semantically identical prompt could still create an entirely different result with a single character difference.
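A minimal sketch of the point above, using a hypothetical hash-based stand-in for a real model (the `toy_llm_logits` function is invented for illustration, not a real API): greedy (argmax) decoding makes the output fully deterministic for a given prompt, yet a single-character change to the prompt can still flip the result entirely.

```python
import hashlib

VOCAB = ["yes", "no", "maybe", "unknown"]

def toy_llm_logits(prompt: str) -> list[int]:
    # Hypothetical stand-in for a real model: derive "logits" from a
    # hash of the prompt, so tiny prompt edits scramble them completely.
    digest = hashlib.sha256(prompt.encode()).digest()
    return [digest[i % len(digest)] for i in range(len(VOCAB))]

def greedy_next_token(prompt: str) -> str:
    # Greedy decoding: always pick the highest-scoring token.
    # No sampling, no randomness -> same prompt, same output, every time.
    logits = toy_llm_logits(prompt)
    return VOCAB[max(range(len(VOCAB)), key=lambda i: logits[i])]

# Deterministic: repeated calls with the identical prompt agree.
assert greedy_next_token("Is the sky blue?") == greedy_next_token("Is the sky blue?")

# But a one-character difference produces fresh logits, so the
# "semantically identical" prompt may decode to a different token.
print(greedy_next_token("Is the sky blue?"))
print(greedy_next_token("Is the sky blue ?"))
```

Real inference stacks get the same determinism by setting temperature to 0 (or a fixed seed), but that does nothing about the prompt-sensitivity half of the problem.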


I'm not fine with it, but the board decided it that way.



