This looks very promising. Thank you for making it open source! At first glance, I couldn't tell whether it can be run with local models (Ollama?). Is that possible at all?
Yes! It works with ChatOllama; people have had good results with DeepSeek-Chat / R1 and the newer Qwen models. Llama itself, however, doesn't handle our tool calling very well and is often confused by the structured output format.
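For anyone wanting to try it, here's a minimal sketch of the wiring, assuming browser-use's `Agent` accepts a LangChain chat model (the model name and task are just examples, and it needs browser-use, langchain-ollama, and a running Ollama server installed locally; check the repo docs for the exact API in your version):

```python
# Sketch only: requires browser-use, langchain-ollama, and a local Ollama server.
import asyncio

from langchain_ollama import ChatOllama
from browser_use import Agent

# Example model tag; swap in a DeepSeek or Qwen tag you've pulled with `ollama pull`.
llm = ChatOllama(model="qwen2.5:32b", num_ctx=32000)

async def main():
    agent = Agent(
        task="Find the top story on Hacker News and summarize it",
        llm=llm,
    )
    await agent.run()

asyncio.run(main())
```

Smaller models tend to struggle with the structured output the agent expects, so larger instruction-tuned variants generally work better.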
Edit: for anyone else looking for this, it seems that you can: https://github.com/browser-use/browser-use/blob/70ae758a3bfa...