
A local LLM with the entirety of Wikipedia accessible via RAG (or a better, newer technique) is legitimately a little super-powered assistant. It works when the grid goes down, and the searchability is orders of magnitude better - especially when you're not sure what you're looking for.
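
To make the RAG part concrete, here's a minimal sketch of the pattern with a small local embedding model plus a llama.cpp model. The model file, chunking, and in-memory "index" are placeholders of my own choosing - a real setup would use a proper vector store over passages extracted from a Wikipedia dump:

    # Minimal RAG sketch: embed passages, retrieve by cosine similarity,
    # stuff the top hits into a local model's prompt. All paths/models here
    # are placeholder assumptions, not specific recommendations.
    import numpy as np
    from sentence_transformers import SentenceTransformer
    from llama_cpp import Llama

    embedder = SentenceTransformer("all-MiniLM-L6-v2")            # small local embedding model
    llm = Llama(model_path="local-model.Q4_K_M.gguf", n_ctx=4096) # any local GGUF model

    # Passages extracted offline from a Wikipedia dump; in practice this is
    # millions of chunks in a vector index, not a Python list.
    chunks = ["...article passage one...", "...article passage two..."]
    chunk_vecs = embedder.encode(chunks, normalize_embeddings=True)

    def answer(question, k=3):
        q = embedder.encode([question], normalize_embeddings=True)[0]
        top = np.argsort(chunk_vecs @ q)[-k:]        # cosine similarity via dot product
        context = "\n\n".join(chunks[i] for i in top)
        prompt = (f"Answer using only this context:\n{context}\n\n"
                  f"Question: {question}\nAnswer:")
        return llm(prompt, max_tokens=256)["choices"][0]["text"]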


What better technique are you referring to?



Are local LLMs available on phones yet?


MLC have demos that work on iOS and Android:

- https://llm.mlc.ai/#ios

- https://llm.mlc.ai/#android


This model came out a few days ago and runs on an iPhone. https://huggingface.co/NousResearch/Obsidian-3B-V0.5


From their README:

    <|im_start|>user
    What is this sign about?\n<image>
    ###
    <|im_start|>assistant
    The sign is about bullying, and it is placed on a black background with a red background.
    ###

Any idea how you fill out that <image> bit?


No, but the GitHub repo linked from there includes a Gradio demo UI with an image field, so it should be possible to reverse-engineer it.
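
For what it's worth, in LLaVA-style multimodal models the <image> token is usually just a placeholder in the text prompt, and the actual pixels are handed to the processor separately. Here's a rough sketch of that general pattern using Hugging Face transformers - the model id and prompt template are stand-ins, and I haven't verified that Obsidian-3B loads this way, so the Gradio demo in the repo remains the authoritative reference:

    # Sketch of the usual LLaVA-style pattern; not verified against Obsidian-3B.
    from PIL import Image
    from transformers import AutoProcessor, LlavaForConditionalGeneration

    model_id = "llava-hf/llava-1.5-7b-hf"  # stand-in; Obsidian may need its own loader
    processor = AutoProcessor.from_pretrained(model_id)
    model = LlavaForConditionalGeneration.from_pretrained(model_id)

    image = Image.open("sign.jpg")
    # <image> marks where the vision features get spliced into the prompt
    prompt = "USER: <image>\nWhat is this sign about?\nASSISTANT:"

    inputs = processor(text=prompt, images=image, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=100)
    print(processor.decode(output[0], skip_special_tokens=True))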



