Yeah, I only want JavaScript when it actually speeds up a site. Which, in my experience, boils down to:
AJAX for all form submissions
Service Worker for caching
Turbolinks[0] for page navigation
And those are easy to implement as progressive enhancements. If JS is disabled, the submit button does a regular page submit, the Service Worker simply never gets registered (so the browser falls back to your web server's cache policy), and your links remain regular hyperlinks.
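The Service Worker part really is just a couple of lines; here's a minimal sketch (the /sw.js path is a placeholder for whatever you actually serve):

    // With JS disabled this never runs, so the browser just follows
    // the web server's normal cache headers.
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/sw.js');
    }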
Honestly, I wouldn't bother with forms falling back. That sounds like double the work to support a handful of people. If you want to participate, enable JS; otherwise, view the site as read-only.
Not sure I follow? For a form submission, you'd set it up like a normal HTML form. The <input type="submit"> would do its regular thing. If JavaScript is enabled, you'd hijack the submit with event.preventDefault() and do your AJAX.
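Roughly like this; the Accept header is there so the server can tell the AJAX path apart from a native post (the selector and the success handling are just placeholders):

    // The form works as plain HTML without JS; with JS we intercept
    // the submit event and send the same fields via fetch instead.
    const form = document.querySelector('form');
    form.addEventListener('submit', (event) => {
      event.preventDefault();
      fetch(form.action, {
        method: 'POST',
        headers: { 'Accept': 'application/json' },
        // URLSearchParams serializes the fields and sets the content
        // type to application/x-www-form-urlencoded automatically.
        body: new URLSearchParams(new FormData(form)),
      })
        .then((res) => res.json())
        .then((data) => {
          // Update the page in place instead of reloading.
        });
    });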
If I'm validating inputs on the client side as well as on the server side with JSON responses, there's going to be overlapping work. It's particularly annoying to carry field values and errors across submits. The old-school submit, validate, refresh, show-errors cycle is an ugly experience for end users.
Of course it can be done; it's just not something I'm going to worry about as a solo operator developing multiple ventures.
Sure, it requires some careful planning. You'd set up the HTML form with method="post", then have the AJAX send the request with its content type set to "application/x-www-form-urlencoded". At least with Node's Express, both requests end up handled by the same route.
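A rough sketch of what that looks like on the Express side; the /subscribe route and the validator are made up for illustration:

    const express = require('express');
    const app = express();

    // Parses application/x-www-form-urlencoded bodies, so the native
    // form post and the AJAX request look identical in req.body.
    app.use(express.urlencoded({ extended: true }));

    // Hypothetical validator: returns an array of { field, code } errors.
    const validate = (body) =>
      body.email ? [] : [{ field: 'email', code: 'required' }];

    app.post('/subscribe', (req, res) => {
      const errors = validate(req.body);
      if (req.accepts(['html', 'json']) === 'json') {
        // The fetch call asked for JSON, so return data directly.
        res.json(errors.length ? { errors } : { ok: true });
      } else {
        // No-JS fallback: plain redirect, old-school style.
        res.redirect(errors.length ? '/form?error=1' : '/thanks');
      }
    });

    app.listen(3000);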
I understand how this works; I've built a billion products by this point. It's much easier and more user-friendly to post the form through AJAX and get back a JSON response on success, or an array of error codes per field on failure. Passing error messages and form state around through old-school form submits is not something I want to deal with any longer.
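For what it's worth, the per-field error handling I mean is about this much code; the { field, code } response shape, the data-error-for markup, and messageFor() are all my own stand-ins:

    // Hypothetical code -> message lookup.
    const messageFor = (code) =>
      ({ required: 'This field is required.' }[code] || 'Invalid value.');

    // Assumed response shape: { errors: [{ field: 'email', code: 'required' }] }.
    function showErrors(form, errors) {
      for (const { field, code } of errors) {
        // Assumes markup like <span data-error-for="email"></span> near each input.
        const slot = form.querySelector(`[data-error-for="${field}"]`);
        if (slot) slot.textContent = messageFor(code);
      }
    }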
[0] https://github.com/turbolinks/turbolinks