
I am not deep into the matter, but I would have given the answer: async code takes code that would otherwise be blocking and makes it non-blocking, so that other work can still happen while it completes.

Since I work a lot in embedded loops, where long-running blocking snippets can literally break your I/O or lead to visible/audible dropouts, this would be the obvious answer.



But that's the thing: async, by itself, doesn't guarantee that anything is non-blocking. For your fiber (or any other kind of user-land abstraction) to be non-blocking, you MUST ensure that it doesn't perform any blocking call.

All async does is give you (some of) the tools to make code non-blocking.
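Here's a minimal sketch of that point in Python asyncio, with time.sleep/asyncio.sleep standing in for real blocking vs. non-blocking I/O (function names and timings are just for illustration):

    import asyncio
    import time

    async def looks_async_but_blocks():
        # time.sleep() blocks the whole event loop; no other task
        # can run during this call, async keyword or not.
        time.sleep(2)

    async def actually_non_blocking():
        # await yields control back to the event loop while waiting,
        # so other tasks keep making progress.
        await asyncio.sleep(2)

    async def main():
        start = time.monotonic()
        await asyncio.gather(actually_non_blocking(), actually_non_blocking())
        print(f"non-blocking pair: {time.monotonic() - start:.1f}s")  # ~2s, concurrent

        start = time.monotonic()
        await asyncio.gather(looks_async_but_blocks(), looks_async_but_blocks())
        print(f"blocking pair: {time.monotonic() - start:.1f}s")      # ~4s, serialized

    asyncio.run(main())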


Yeah, sure. I mean, in embedded-land any async snippet could perform any number of things, like firing a delay command that puts the whole processor to sleep.

This could potentially be avoided by clever enough compilers or runtimes, but I am not sure whether that would really be beneficial.

I am a fan of making things explicit, so the closer people's idea of what async is and isn't matches reality, the better. Alternatively, we should first pin down what async should mean, and then adjust the abstractions so they give us the guarantees people would naturally assume come with it.


Yeah, I'm insisting because I recently reviewed a PR with async code calling blocking code, which made the entire exercise pointless. And that was from an experienced dev.

There used to be a few compilers that used static analysis to predict the cost of a call (where the cost of I/O was effectively considered infinite), and in which you could enforce that a branch only had a budget of N. Modern architectures tend to mess with any finite value of N, but you could fairly easily adapt such techniques to detect unbounded values.


The entire point might be to offload the blocking call and do something else while it's blocking.

There's a style of "asynchronous programming" where everything is designed to be non-blocking, and there can also be asynchronous programming with blocking code. In fact, the first style can be emulated by offloading every blocking call to a different thread/green thread/fiber, and that's basically what happens under the hood unless there is some fundamental support for non-blocking at the lower levels (sometimes all the way down to the hardware).


You can run blocking code asynchronously from other code. I think the parent means something like: if you have a blocking operation (e.g. a function call) on a thread, you can create another thread and run that blocking operation there, magically transforming a blocking call into something that looks like a non-blocking call (thus allowing other code to do something else instead of waiting).
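A rough sketch of that offloading in Python asyncio (asyncio.to_thread is a real API, Python 3.9+; blocking_io is a hypothetical stand-in for the blocking call):

    import asyncio
    import time

    def blocking_io():
        # Stand-in for a blocking call (file read, legacy driver, C library, ...).
        time.sleep(1)
        return "done"

    async def main():
        # to_thread() runs the blocking function in a worker thread and hands
        # back an awaitable, so the event loop stays free for other tasks.
        result, _ = await asyncio.gather(
            asyncio.to_thread(blocking_io),
            asyncio.sleep(0.1),  # other work keeps running meanwhile
        )
        print(result)

    asyncio.run(main())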


cries in Python asyncio


sympathizes

If you look on the bright side, it looks like free-threading is approaching, and OCaml has demonstrated how, by removing the GIL and adding exactly one primitive, you can turn a powerful enough language into a concurrency/parallelism powerhouse with minimal user-visible changes!



