>And anyway, from my experience, Python has never been the cause of the slowness. It's more about tweaking that DB query, smartly packaging the assets, or using a better algorithm.
That's because you constrain your use of Python to I/O-bound problems, like webpages and such. If you had any CPU-bound problems, you would have found that Python is quite slow -- and you can get 2, 20, even 100 times the performance in another language.

Python is very slow (compared to what it could be), which is what necessitates tons of workarounds: C extensions, SWIG, NumPy and SciPy (C, Fortran, etc.), Cython, PyPy, and so on. A whole cottage industry around Python's slowness. It's nice that all this exists, but it's not an ideal solution: suddenly you have two codebases to manage in two different languages, plus makefiles, portability issues, etc. All of those are barriers to entry.
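To make the gap concrete, here's a rough sketch (purely illustrative; the numbers it prints vary wildly by machine and interpreter version): summing squares in a plain Python loop versus handing the same work to NumPy, which does it in C.

```python
import timeit

import numpy as np

N = 1_000_000  # size of the CPU-bound workload

def pure_python():
    # Interpreted bytecode loop: every iteration pays dispatch
    # and boxing overhead.
    total = 0
    for i in range(N):
        total += i * i
    return total

def with_numpy():
    # Same computation, but the loop runs inside NumPy's C code.
    a = np.arange(N, dtype=np.int64)
    return int(np.dot(a, a))

assert pure_python() == with_numpy()

t_py = timeit.timeit(pure_python, number=5)
t_np = timeit.timeit(with_numpy, number=5)
print(f"pure Python: {t_py:.3f}s  NumPy: {t_np:.3f}s  ratio: ~{t_py / t_np:.0f}x")
```

On CPython the ratio typically lands in the tens; on PyPy the pure-Python loop itself gets much faster, which is exactly the argument being made below.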
There are definitely limitations in the implementation of CPython itself, which would not need to be there if the language were willing to ratchet down the dynamism in certain cases. (Even without that, there are implementation avenues that would have permitted better concurrency than the current codebase.)
However, implementation aside, I think it's invalid to assert that the existence of a cottage industry is inherently a sign of weakness; it's also a sign of just how pervasively Python is used. Find me another language that's used as widely, as popularly (i.e. not just two random guys using it that way), and in as many different kinds of settings as Python. (Java might be a contender, except that it's not really used for the really high-performance scientific workloads the way Python is.)
Python doesn't add integration requirements. If you want to just use PyPy for something, then use it in good health (without fearing that your e-penis is tiny and will hamper your 'performance' because you didn't do whatever coldtea said is necessary for ULTIMATE SPEED).
Why the juvenile mockery about "e-penis" and "ULTIMATE SPEED"? Are we 12 years old here?
I made my case explicitly, and it's about technical limitations of CPython that can (and, at some point, WILL) be overcome. PyPy is a step in that direction, though it's not mature yet.
Python, in the form of CPython -- the standard and most widely used implementation for scientific, technical, and similar programming -- DOES add integration requirements.
It necessitates the use of C/SWIG/Cython etc. extensions for performance-critical stuff. And it necessitates it not because it puts a gun to your head or says so in some contract clause, but in the pragmatic sense that people find it's not fast enough for their needs in pure form.

That's why the popular related Python projects (NumPy etc.) are not written in pure Python, but in C, Fortran, et al.
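For a feel of what that C integration looks like at its most lightweight, here's a sketch using ctypes from the standard library (ctypes isn't named above, but it's the zero-build-step flavor of the same C-integration story; the sketch assumes the system's C math library can be located):

```python
import ctypes
import ctypes.util
import math

# Locate and load the system math library (libm); on some platforms the
# math functions live in libc instead, hence the fallback.
_libname = ctypes.util.find_library("m") or ctypes.util.find_library("c")
libm = ctypes.CDLL(_libname)

# Declare the C signature: double sqrt(double). Without this, ctypes
# would default to treating arguments and return value as C ints.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0), math.sqrt(2.0))  # both compute the square root of 2
```

Note the trade-off this illustrates: per-call FFI overhead means this doesn't pay off in a tight loop, which is why NumPy-style extensions instead push whole-array operations into C and cross the boundary once.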