This is how it'll get us all. You'll decide to double-check the LLM's advice against other sources to see if it's actually safe, only to find that those sources were written by LLMs too.
People already generate books with "AI", so the odds that there's already a "cookbook" on Amazon containing such a recipe are not zero. Good luck explaining to someone why the recipe they saw in a physical book, bought from a reputable retailer, ended up killing their family.