RobotToaster 7 months ago | on: Can LLMs write better code if you keep asking them...
IIRC there was a post on here a while ago about how LLMs give better results if you threaten them or tell them someone is threatening you (for instance, that you'll lose your job or die if the answer is wrong).
__mharrison__ 7 months ago
The author of that post also wrote this article, and links to it in the article.
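For concreteness, a minimal sketch of the threat-framed prompting trick described above, assuming the OpenAI Python SDK; the model name and the exact threat wording are illustrative assumptions, not taken from either post:

    # Sketch of the "threat framing" trick: the system prompt claims
    # high personal stakes in the hope of a more careful answer.
    # Model name and wording are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; use whichever you have access to
        messages=[
            {"role": "system", "content": (
                "You are a senior engineer. If your answer contains a bug, "
                "I will lose my job. Check your work before replying."
            )},
            {"role": "user", "content": (
                "Write a Python function that validates ISO 8601 date strings."
            )},
        ],
    )
    print(response.choices[0].message.content)

Whether this framing actually improves output quality is exactly the anecdotal claim being discussed, not something the sketch demonstrates.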