> recently the best AI models are veering into not sucking territory
I agree with your assessment.
I find it absolutely wild that 'it almost doesn't entirely suck, if you squint' is suddenly an acceptable benchmark for a technology to be unleashed upon the public.
We have standards for cars, speakers, clothing, furniture, makeup, even literature.
Someone can't just type up a few pages of dross and put it through 100 letterboxes without being liable for littering and nuisance. The EU and UK don't allow someone to sell phones with a pre-installed app that almost performs a function that some users might theoretically want. The public domain has quality standards.
Or rather, it had quality standards. If it's apparently legal to put semi-functioning data-collectors into technologies where nobody asked for them, why isn't it legal to sell chairs that collapse unless you hold them a specific way, clothes that don't actually function as clothes but could be used to make actual clothes by a competent tailor, or headphones that can be coaxed into sporadically producing sound for minutes at a time?
Either something works to a professional standard or it doesn't.
If it doesn't, it isn't (or at least wasn't) legal to include it in consumer products.
This is why people are more angry than is justified by a single unreliable program.
I don't care that much whether LLMs perform the functions that are advertised (and they don't, half the time).
I care that after many decades of living in a first world country with consumer protection and minimum standards, all of that seems to have been washed away in the AI wave. When it recedes, we will be left paying first world prices for third world engineering, because the acceptable quality standard for everything seems to have dropped to 'it can almost certainly be used for its intended purpose at least sometimes, by some people, with a little effort'.