Only in response to some classes of requests. They didn’t go into detail about when, but they said the local Siri LLM would evaluate each request and decide whether it could be serviced locally, in Apple’s private cloud AI, or would need to use OpenAI. In that last case it would pop up a prompt asking if you want to send the request to OpenAI. It doesn’t look like that would be a particularly common occurrence; it seems it would mainly be needed for “answerbot”-type requests where live web data is being requested.
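For what it’s worth, the tiered flow described above could be sketched like this. To be clear, this is purely illustrative — the names, heuristics, and thresholds are all made up, not anything Apple has published; it just shows the shape of “local model evaluates, escalates only when needed, asks before going to a third party”:

```python
# Hypothetical sketch of the tiered routing -- every name and heuristic here
# is invented for illustration; Apple hasn't published this logic.
from enum import Enum, auto

class Tier(Enum):
    ON_DEVICE = auto()       # local Siri LLM handles it
    PRIVATE_CLOUD = auto()   # Apple's private cloud AI
    THIRD_PARTY = auto()     # e.g. OpenAI, only with the user's consent

def route(request: str, user_consents_to_third_party) -> Tier:
    """Mimic the described flow: the local model evaluates first and
    escalates only when it can't service the request itself."""
    # Toy stand-ins for whatever evaluation the local LLM actually does:
    needs_live_web = "latest" in request or "current" in request
    too_big_for_device = len(request.split()) > 50

    if needs_live_web:
        # "Answerbot"-style request: prompt before leaving Apple's systems.
        if user_consents_to_third_party(request):
            return Tier.THIRD_PARTY
        return Tier.ON_DEVICE  # degrade gracefully if the user declines
    if too_big_for_device:
        return Tier.PRIVATE_CLOUD
    return Tier.ON_DEVICE

print(route("set a timer for ten minutes", lambda r: True))  # Tier.ON_DEVICE
```

Note that the user-consent prompt only appears on the OpenAI path, which matches why that pop-up wouldn’t be a common occurrence.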