
This looks very promising. Thank you for making it open source! At first glance, I couldn't tell whether it can be run with local models (Ollama?). Is that at all possible?

Edit: for anyone else looking for this, it seems that you can: https://github.com/browser-use/browser-use/blob/70ae758a3bfa...




Yes! People love DeepSeek-Chat / R1 and the new Qwen models. It works with ChatOllama. Llama itself, however, does not handle our tool calling very well and is often confused by the structured output format.
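
For anyone who wants to try this, here is a minimal sketch of pointing browser-use at a local Ollama model through LangChain's ChatOllama, which is the path mentioned above. The model name and context-window setting are my own placeholders, not a recommendation from the maintainers; swap in whatever you have pulled locally.

    # Minimal sketch: browser-use driven by a local Ollama model via ChatOllama.
    # Assumes `pip install browser-use langchain-ollama` and a running Ollama
    # server with the model already pulled, e.g. `ollama pull qwen2.5:32b`.
    import asyncio

    from browser_use import Agent
    from langchain_ollama import ChatOllama

    async def main():
        # Model name and num_ctx are example values, not official recommendations.
        llm = ChatOllama(model="qwen2.5:32b", num_ctx=32000)
        agent = Agent(
            task="Find the current top story on Hacker News and return its title.",
            llm=llm,
        )
        result = await agent.run()
        print(result)

    asyncio.run(main())

Per the maintainer's comment above, smaller Llama variants tend to struggle with the structured tool-calling output the agent expects, so a Qwen or DeepSeek model is a safer first test.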



