Hacker News

Perplexity?



This is a measure of how well the supplied text matches what the model itself would have produced.

A low perplexity means the text isn't far from what the model might have output itself (which can be an indicator that it was produced by a model), whereas a high perplexity suggests it's the kind of semi-random nonsense you'd expect from a student. ;)
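Concretely, perplexity is the exponential of the average negative log-probability the model assigns to each token, so it ranges from 1 (the model predicted every token perfectly) upward. A minimal sketch, where the per-token probabilities are made up for illustration:

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability
    that the model assigned to each token in the text."""
    avg_neg_log_prob = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_log_prob)

# Hypothetical per-token probabilities a model might assign:
likely_text = [0.9, 0.8, 0.85, 0.9]        # tokens the model "expected"
surprising_text = [0.05, 0.1, 0.02, 0.07]  # tokens the model found unlikely

print(perplexity(likely_text))      # low: text looks model-like
print(perplexity(surprising_text))  # high: the text surprised the model
```

A sanity check: if a model assigns probability 1/2 to every token, the perplexity is exactly 2, matching the intuition that the model is "choosing between two options" at each step.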


Poster was illustrating the point being made.


No, the poster was using a word from the linked article. Click "download PDF" to read past the abstract.


If half the commenters had skimmed the article instead of commenting after reading only the title or, at best, the abstract, they would have answered their own questions.





