ELK could never deal with my logs, which are sometimes-JSON. Loki can ingest and query them just fine. Also, the query/extraction language makes a lot more sense to me.
Elasticsearch can store arbitrary text in log fields, including JSON-encoded strings. Elasticsearch can also tokenize a JSON-encoded string and provide fast full-text search over it, the same way it does for a regular plaintext string.
Why do you need to store a JSON-encoded string inside a log field? It is much better to parse the JSON into separate fields at the log shipper and store the parsed fields in Elasticsearch. This gives better query performance and may also reduce disk space usage, since the values for every parsed field are stored separately (which usually improves the compression ratio and reduces disk read IO during queries when column-oriented storage is used for per-field data).
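For example, shipper-side parsing might look like this sketch using Fluent Bit's parser filter (the `app.*` tag and the `log` key name are assumptions about your pipeline, not something universal):

```ini
# parsers.conf -- declare a JSON parser
[PARSER]
    Name    json
    Format  json

# fluent-bit.conf -- apply it to shipped records
[FILTER]
    Name         parser
    Match        app.*        ; tag of your inputs (assumed)
    Key_Name     log          ; field holding the raw message (assumed)
    Parser       json
    Reserve_Data On           ; keep the other original record fields
```

After this filter, each JSON message arrives at Elasticsearch as separate top-level fields instead of one opaque string.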
The problem is that log messages are collected from many researcher-authored applications. Some of them emit JSON, in which case I want to parse it and store the parsed fields as you say. But some of them do not emit JSON, in which case I need to store the raw text.
I was not able to do that with the log shipper. If I configured parsing, then the non-JSON messages would get dropped.
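For what it's worth, some shippers can express this fallback explicitly. A sketch in Vector's VRL remap language (assuming the raw line arrives in `.message`): attempt a JSON parse, merge the fields on success, and pass the event through untouched on failure instead of dropping it.

```coffeescript
# Try to parse the message as JSON; err is non-null for plain text.
parsed, err = parse_json(string!(.message))
if err == null {
  # JSON log: merge the parsed fields into the event.
  . = merge(., object!(parsed))
}
# Non-JSON log: no else branch needed -- the event keeps its
# original .message and is forwarded as-is, not dropped.
```

The key point is that the parse result is handled as a value rather than as a hard pipeline error, so mixed JSON/plaintext streams survive.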
If your source emits logs in the OpenTelemetry format, an OTel Collector in between could do the sometimes-JSON parsing of log content before it reaches the backend.
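As a sketch of that approach, the Collector's transform processor can apply OTTL statements conditionally, so only bodies that look like JSON get parsed (the regex guard here is an assumption about what your JSON lines look like):

```yaml
# OTel Collector config fragment (sketch): parse the log body as JSON
# only when it appears to be JSON; plain-text bodies pass through unchanged.
processors:
  transform:
    log_statements:
      - context: log
        statements:
          - merge_maps(attributes, ParseJSON(body), "upsert") where IsMatch(body, "^\\s*\\{")
```

Because the `where` clause gates the statement per record, the non-JSON messages are neither parsed nor dropped.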