
This is sort of right. Reformer can take an entire book as a single input rather than splitting it into batches, and there are pretrained book-trained Reformer models on HuggingFace now (see the sketch below). I'm not aware of any book-length summarization models yet, but that's no longer due to the input-length limits of the previous generation of transformer models (more likely due to the lack of an obvious training set?). So your SQL example should be a thing of the past before long.
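
For anyone who wants to try one of those pretrained checkpoints, here's a minimal sketch using the HuggingFace transformers library. The google/reformer-crime-and-punishment model was trained on the full text of a single book; the generation parameters below are illustrative, not tuned.

    # Minimal sketch: load a Reformer checkpoint pretrained on one full book
    # ("Crime and Punishment") and generate a continuation from a prompt.
    # Assumes the HuggingFace transformers library is installed.
    from transformers import ReformerModelWithLMHead, ReformerTokenizer

    model_name = "google/reformer-crime-and-punishment"
    tokenizer = ReformerTokenizer.from_pretrained(model_name)
    model = ReformerModelWithLMHead.from_pretrained(model_name)

    # Encode a short prompt; Reformer's LSH attention keeps memory use low
    # enough that input lengths can reach hundreds of thousands of tokens.
    input_ids = tokenizer.encode("A few months later", return_tensors="pt")

    # Sampling settings here are just an example.
    output_ids = model.generate(
        input_ids, do_sample=True, temperature=0.7, max_length=100
    )
    print(tokenizer.decode(output_ids[0]))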


I don't think they released any benchmark results for Reformer, so yes, it can take a whole book as input, but the quality is unknown.



