What has been working surprisingly well for the sites I maintain is to have a simple but custom "captcha" like "Enter 294 here:" (it can even be static), and to exclude the pages that have submission forms from search engine indexing.
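In case it's useful, here's a minimal sketch of what the server-side check for such a static question can look like, assuming the answer is posted in a field I'm calling "human_check" (the field name and expected value are just illustrative):

    EXPECTED_ANSWER = "294"

    def passes_custom_captcha(form_fields: dict) -> bool:
        # Accept the submission only if the static question
        # ("Enter 294 here:") was answered correctly.
        answer = form_fields.get("human_check", "").strip()
        return answer == EXPECTED_ANSWER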
I had a form that got about one spam message per day.
In late 2021, I added a trivial hidden-by-CSS “If you are human, leave this field blank (required)” <input name=username> honeypot. (More details: <https://news.ycombinator.com/item?id=37058847>.)
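For anyone curious what that looks like server-side, a minimal sketch (assuming the submitted fields arrive as a plain dict; the field name matches the <input name=username> above, the rest is illustrative):

    def is_probably_bot(form_fields: dict) -> bool:
        # The real "username" field is hidden with CSS and labelled
        # "leave this field blank", so humans submit it empty; bots that
        # auto-fill every field they find give themselves away here.
        return form_fields.get("username", "").strip() != ""

The handler then just drops (or quarantines) anything where this returns True.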
For two and a half years, this filtered out all spam, except for one message in early 2023.
But I started this comment with “may not” because since 2024-02-10, I’ve received approximately 268 spam messages, of a few different patterns (still all very easy to identify visually). So some refinement of the idea may be needed. (I have no idea how many more have been filtered out; I never bothered tracking that. But I imagine that it’s still doing something useful.)
This is, of course, low-value-target stuff: scattergun spam rather than targeted spam.
From my experience coding parts of Un-static [1], the advantage of having a single collection point for submissions to thousands of forms is that you can filter these out more easily as well: you can build partial fingerprints of each incoming submission, compare their similarity with submissions arriving on other forms, and start blocking once a scatter-gun message matches partial fingerprints seen across an increasing number of form endpoints.
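Roughly the idea, as a sketch (the shingle size, threshold, and in-memory store are all made up for illustration; Un-static's actual implementation will differ):

    import hashlib
    from collections import defaultdict

    # fingerprint -> set of form endpoints that have already seen it
    seen_endpoints = defaultdict(set)
    BLOCK_THRESHOLD = 5  # distinct endpoints before we call it scatter-gun

    def partial_fingerprints(message: str, size: int = 5) -> set:
        # Hash overlapping word shingles so near-identical spam variants
        # still share fingerprints.
        words = message.lower().split()
        shingles = [" ".join(words[i:i + size])
                    for i in range(max(1, len(words) - size + 1))]
        return {hashlib.sha1(s.encode()).hexdigest() for s in shingles}

    def looks_like_scattergun(endpoint: str, message: str) -> bool:
        prints = partial_fingerprints(message)
        # How many *other* endpoints have already seen any of these fragments?
        hits = set().union(*(seen_endpoints[p] for p in prints)) - {endpoint}
        for p in prints:
            seen_endpoints[p].add(endpoint)
        return len(hits) >= BLOCK_THRESHOLD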
Definitely. Any kind of unique check (another example is a uniquely named version of a classic hidden honeypot field, like https://dev.to/felipperegazio/how-to-create-a-simple-honeypo...) is usually enough on its own until you're a higher-value target.
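And the "uniquely named" part can be as simple as deriving the field name from a per-site secret, so canned bots that special-case the well-known honeypot field names never recognize it (the secret and form id here are placeholders):

    import hashlib

    SITE_SECRET = "replace-with-a-per-site-secret"

    def honeypot_field_name(form_id: str) -> str:
        # Stable but non-obvious, e.g. "hp_3f2a9c1b", unique per site/form.
        digest = hashlib.sha256(f"{SITE_SECRET}:{form_id}".encode()).hexdigest()
        return "hp_" + digest[:8]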