
>Because it's not a huge game changer.

That is just not true.

1 - This helps parallelize regular Python code too, like the kind that goes into writing web services (a rough sketch follows the link below).

2 - While you'd still write native extensions to get optimal performance for single-threaded code, having the thread-dispatch layer in Python makes parallelizing execution convenient and robust; see, for example, the comment from the scikit-learn maintainers linked below. I'd love to see Python get composable task parallelism like Julia's, where you can freely spawn threads within threads without worrying about their lifetimes.

link: https://github.com/colesbury/nogil/issues/36#issuecomment-10...
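To make points 1 and 2 concrete, here is a rough, self-contained sketch (my own illustration, not code from the linked issue): pure-Python, CPU-bound work fanned out over threads, with each outer task freely spawning its own inner pool. On the standard GIL build these threads mostly serialize; on a nogil/free-threaded build the same code can use multiple cores with no native extension involved.

    import time
    from concurrent.futures import ThreadPoolExecutor

    def count_primes(lo, hi):
        # Deliberately naive, pure-Python CPU-bound work (no native extension).
        return sum(
            1
            for n in range(lo, hi)
            if n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
        )

    def count_block(lo, hi, inner_workers=4):
        # "Threads within threads": each outer task splits its range again and
        # fans out to its own short-lived pool, without managing thread lifetimes.
        step = (hi - lo) // inner_workers
        ranges = [(lo + i * step, lo + (i + 1) * step) for i in range(inner_workers)]
        with ThreadPoolExecutor(max_workers=inner_workers) as inner:
            return sum(inner.map(lambda r: count_primes(*r), ranges))

    if __name__ == "__main__":
        start = time.perf_counter()
        blocks = [(i * 200_000, (i + 1) * 200_000) for i in range(4)]
        with ThreadPoolExecutor(max_workers=4) as outer:
            total = sum(outer.map(lambda b: count_block(*b), blocks))
        print(total, f"done in {time.perf_counter() - start:.2f}s")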




That does not sound all that enthusiastic for a "game changer":

> helps parallelize

> convenient and robust

So it's nice, but not all that noticeable?


Well, context helps determine impact.

What percentage of overall compute time is spent in Python?

What percent of that Python compute time would be cut?

Who's using Python? (Banks, STEM researchers, everyone else?)


That has nothing to do with what I wrote and doesn't address the question.

> ...bunch of questions...

Yeah, if you claim that something is a "game changer", you should provide some arguments, not just questions meant to "help determine" the impact.

It might very well be a game changer, but thus far you have only suggested there might be an argument for that, if...



