I don't, for one, but I still think there could be legitimate safety concerns. LLMs are unpredictable, and the potential for misinformation when pitching them as search aggregators is pretty large. Disinformation can have, and has had, genuinely dangerous effects.