1. It's by a data journalist. Datasette was originally designed for data journalism (though it's useful for all sorts of other things) so it's really great to see it being enthusiastically used in this way
2. Jeremia talks at length about Datasette's URLs! This is a key design principle of Datasette: anything you see (a filtered table, SQL query results, a configured chart visualization) should be reflected in the URL of the page, such that you can bookmark it and share it with others. And any time you can see data you should be able to add .json to that URL (or .csv or, via plugins, other extensions like .atom or .geojson) to get that data back in a useful format.
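As a quick illustration of that .json principle, here's a minimal sketch of deriving the JSON URL for a filtered table view and fetching it; the instance URL, database and table names are hypothetical placeholders:

```typescript
// Hypothetical Datasette instance and filtered-table URL.
const htmlUrl = "https://example-datasette.org/mydb/mytable?state=CA";

// The same view as JSON: append ".json" to the path, keeping the query string.
const [path, query] = htmlUrl.split("?");
const jsonUrl = query ? `${path}.json?${query}` : `${path}.json`;

async function fetchTable(): Promise<void> {
  const response = await fetch(jsonUrl);
  const data = await response.json();
  console.log(data); // The filtered rows, now in a machine-readable format.
}

fetchTable();
```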
I think this is generally a quality feature for web apps and dynamic web interfaces, one that should always be considered.
If the URLs get too large or unwieldy there is always the option of base64 encoding or other formats and tricks (see the sketch below). Human-readable URLs are always nice, but in many cases I'd rather have something ugly than no URL state at all.
If the output changes frequently over time and a bookmarked URL might go stale, one can still add a timestamp and a warning in the UI with a link to the most recent data.
What I'm saying is that there is rarely a good excuse to not do this.
In fact, when I haven't followed this principle it was typically due to self-inflicted bad decisions paired with time constraints.
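To make this concrete, here is a minimal browser-side sketch of what I mean; the ViewState shape and parameter names are made up for illustration, with larger nested state tucked into a single base64-encoded parameter:

```typescript
// Hypothetical UI state we want reflected in the URL.
interface ViewState {
  filter: string;
  sortColumn: string;
  chart: { type: string; x: string; y: string };
}

// Write the state into the query string so the current view is bookmarkable.
function syncStateToUrl(state: ViewState): void {
  const params = new URLSearchParams(window.location.search);
  params.set("filter", state.filter);
  params.set("sort", state.sortColumn);
  // Larger nested state goes into one base64-encoded parameter (ugly, but shareable).
  params.set("chart", btoa(JSON.stringify(state.chart)));
  history.replaceState(null, "", `${window.location.pathname}?${params.toString()}`);
}

// Restore the state from the URL on page load.
function readStateFromUrl(): ViewState | null {
  const params = new URLSearchParams(window.location.search);
  const filter = params.get("filter");
  const sort = params.get("sort");
  const chart = params.get("chart");
  if (filter === null || sort === null || chart === null) return null;
  return { filter, sortColumn: sort, chart: JSON.parse(atob(chart)) };
}
```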
On point 2, Tim Bray's write-up on URIs/URLs as the integration point powering AWS is a good account of the virtues of URLs at scale as a universal integration point.