I'm using Arch and the transition was smooth for all the packaged libraries, including Numpy, Scipy, matplotlib, and friends, as well as pure Python packages like mpi4py and petsc4py, which I install from upstream since I track development versions. The real pain is for projects that are not pure Python libraries but want to ship portable scripts. Since the "python2" link is not ubiquitous, they cannot put "/usr/bin/env python2" in the scripts (like Arch packages do, perhaps just using sed). Unfortunately, there are still modern installs of Python that do not have a "python2" link, so this is going to be a long process when it's not feasible to support both Python 2.x and Python 3 within an identical codebase. That is challenging if you have to support e.g. Python 2.3, which will be around for a while longer (RHEL4). Also, RHEL5 (rather, the CentOS system I have a login on) does not have a python2 link (it has python2.4), and that will be around for another four years.
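For what it's worth, that sed-style fix is simple enough to script. Here is a rough Python sketch of the shebang rewrite a packager might run over a staging directory; the pkg/usr/bin path is just an illustration, not Arch's actual build code:

    import glob

    # Rewrite bare "python" shebangs to point explicitly at python2.
    # "pkg/usr/bin/*" is a hypothetical packaging staging dir, not a real Arch path.
    for path in glob.glob("pkg/usr/bin/*"):
        with open(path) as f:
            lines = f.readlines()
        first = lines[0] if lines else ""
        if first.startswith("#!") and "python" in first \
                and "python2" not in first and "python3" not in first:
            lines[0] = "#!/usr/bin/python2\n"
            with open(path, "w") as f:
                f.writelines(lines)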
I think Arch's decision was a bit premature, but waiting isn't going to make the issues much less painful if your project will always have to support an 8-year age span of distributions.
As a result, #python has been full of people complaining their Python installs are hosed. Just like ruby1.8 -> ruby1.9, this was a bad move, and we told them (Gentoo as well) before they did it. I can only fully support my friend and very experienced Python hacker Allen when he says:
20:20 <dash> well that confirms my impression that arch was invented by a bunch of guys who thought gentoo was too stable and easy to use
I've been using it for over two years now. Makes a great workstation OS. You get the latest releases and all their attendant bug fixes, security updates, and yes, new bugs.
But at least in the Python world, there's still virtualenv, buildout, pip, etc. I rarely install system packages for libs anyway. I don't really see how this is going to bring down the house. At the least, if it does work out well and makes enough people happy, there will be more reason for people to hurry up the transition to py3 already.
(An aside: as much as I disagree with py3 forgoing practicality for purity, I wish the transition would be over already so we can get on gettin on).
I noticed the news post a day before my latest update, so I was prepared too. One should always follow the package sources and general news of the distribution one uses.
Arch has the least amount of breakage of any distro I've ever used. Slack, Redhat, Gentoo, all of them would occasionally shit all over themselves, and it would sometimes be tricky to fix. The couple times that Arch did that, it was really easy to fix, because things were "normal". The Arch philosophy seems to be to defer to upstream as much as possible, which makes things easy to fix.
You can't always decide something is good or bad based on the number of people running to IRC asking for help. Well, let me make another tongue-in-cheek remark: many of them should be told to RTFM and read their distro's announcements. Don't support lazy folks who fail to do that.
Not a good move. Considering the large base of programs that expect Python 2.x, the safe thing to do would be to keep /usr/bin/python pointing at the 2.x binaries while using /usr/bin/python3 for the 3.x ones. At some point in the future if Python 2.x really is dead, you can just point both python and python3 at the same thing.
This seems more like somebody trying to evangelize for Python 3 and against Python 2, by purposely breaking a lot of old software, and quite antithetical to the Robustness Principle that I would expect most good software to at least attempt to shoot for.
Hopefully other distros will take a more conservative path. Python 2.x is going to be around for a long, long time.
The platforms making bold enough breaking changes to progress are usually better for it. One case in point: Apple's move to Intel for OS X. Granted, this is painful for legacy stuff, but that only helps keep things up to date. If people don't start doing this, Python 3 will never progress.
However, there should be no 'default' python. People should go get 2 or 3 as needed. If there is a default it should be the latest version.
I agree with that generally, but in the case of Python 3, there are real roadblocks at this point, not just inertia. Central pieces are missing. There is not a single postgres adapter that works reliably with Python 2 and 3. It's impossible to write robust web apps using Python 3 because of the WSGI conundrum.
Actually, the situation is very damaging for Python, because Python has momentum _now_ and many new users are confused and put off by the fact that they cannot use the latest Python version for their projects.
Granted, these issues affect mainly web development, which may not be the main concern of a Linux distribution.
Did Apple really make a 'breaking change' with the switch to Intel? PPC apps ran with a PPC emulator. As far as I know they didn't break, they were just deprecated.
There's a big difference between breaking and deprecation.
It's not around as `python2` on other distros though, which is the problem. Either all other distros have to change to conform to arch, or arch has to change it back to how it was.
Chances are exceedingly good that Arch package maintainers will be checking their Python applications and updating them to use /usr/bin/python2 if they don't run under Python 3. Actually, this should have already happened, since the change has apparently been in testing for a while.
I'm not really familiar with what Arch package maintainers do, but aren't package maintainers mainly packagers rather than coders?
I imagine that someone could easily be packaging 10 Python apps or libraries. He can't be upgrading all that code to Python 3, as it is not always a trivial task and would require a lot of work and constant patching if the development team keeps working with Python 2.x.
But overall, I think this is a good way to push developers to upgrade, or at least prepare more to move to Python 3.
You're misunderstanding. Arch is still shipping Python 2.7 with the python2 package.
So package maintainers don't need to rewrite upstream code if it's not compatible with Python 3; they just need to update dependencies and set paths to point to the /usr/bin/python2 binary.
I failed to find a reference, but I seem to remember the Python team deciding at some point that they intended to keep the name "python" for the Python 2.X binaries perpetually, and require Python 3.X to be invoked as "python3". Arch might be alone in making this change, and inconsistent with other Python distributions.
Arch has done this before. They only provide Ruby 1.9, even though 1.8 was (and remains today) the norm. I tried pointing this out and they told me to use a community package, which is unreliable and a huge pain. Their package philosophy is bleeding edge to an absurd degree, to the point where it simply doesn't work. I stopped experimenting with Arch because of it.
Ruby 1.8 is god awful and should be taken out back and shot. Several times. In no way should anything be done to support 1.8 as "the norm", especially after 1.9.2.
Also, it's not like anyone should be using the system Ruby anyway. That's really bad practice in my opinion and there's no excuse when rvm exists.
RVM reads .rvmrc files in each directory you change into. How does it do this? By redefining 'cd' to be a function wrapping the real cd command, with some extra RVM stuff run between changing the directory and returning.
My problems ('cd' with no arguments didn't take me back to my home directory any more) occurred about a month ago (I was using the fish interop mode - that might have been the problem), and it's probably been fixed since. But the whole thing scared me out of continuing with RVM.
If there's a bug in RVM, I sure as hell don't want it to manifest in my 'cd' command. I need to be able to change directory, and my scripts rely on it to always work.
Well if you're not living on the bleeding edge of Ruby why would you live on the bleeding edge of Linux with Arch? Any job that requires Ruby 1.8 shouldn't be using Arch.
Would you consider Linux 2.6 'bleeding edge'? I mean, they have kept the 2.4 series going. Just because some development occurs in parallel, doesn't mean the new version is 'bleeding edge'.
The Python 3 migration process, for actual Python software in the wild, is and has always been meant to be a multi-year thing. As such, Python 3 is still very much "bleeding edge" in the sense that most projects are still working on their migration, and likely will be for some time.
I don't understand why more people don't get that. Guido has always said it will take a few years for it all to get over to P3. The Django folks are just now deciding when to get Django over to P3. I have been watching the release notes for P3 versions as they come out, and it still seems they are "cleaning up".
On the other hand, making your software work with Ruby 1.9 is something you'll have to do anyway, sooner or later. As long as there are no release-critical bugs in Ruby 1.9... which I guess there aren't, because otherwise I don't think Arch would provide it.
Tangential, but I have found rvm to be incredibly helpful in such situations. The ability to run multiple versions of Ruby with various sets of gems makes developing Ruby 1.8/1.9 apps painless. So much so that this is my default mechanism for running Ruby anywhere (dev/prod).
This is one of those special-case situations: 1.9 has been the stable branch of Ruby since January of '09. That said, you're right, a sizable number of projects have stuck with 1.8.x, but you also can't blame the guys from Arch, either.
I'm grateful for arch and its users, like gentoo users before them, for being a giant expendable human wave of beta-testing to make my life with foss easier.
This has been in testing for some time in Arch. The Python maintainers decided to hold off on releasing 2.7 until all packages depending on Python could be rebuilt to make the switch from UCS-2 to UCS-4. Now Python 3 and Python 2 are both UCS-4, and the 'python2' package is 2.7.
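If you're curious which Unicode width a given interpreter was built with, sys.maxunicode is the quick check: it reports 0xFFFF on a narrow (UCS-2) build and 0x10FFFF on a wide (UCS-4) build.

    import sys

    # Narrow (UCS-2) builds report 0xFFFF; wide (UCS-4) builds report 0x10FFFF.
    if sys.maxunicode == 0xFFFF:
        print("narrow (UCS-2) build")
    else:
        print("wide (UCS-4) build")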
I updated Arch on my workstation and server today and can confirm that everything "works for me" except for some python packages installed manually from AUR (e.g. offlineimap).
Most Python programmers only use a handful of significant third-party libs. I use fewer than a dozen. What is more significant is WSGI, which still has not been finalized for Python 3. There are two competing, unimplemented PEPs: 444 and 3333.
I have also heard some comments about the Python 3 standard library leaving some things to be desired. Perhaps someone more knowledgeable can elaborate.
As for web frameworks: Google App Engine still depends on Python 2, and Django has some hesitation about the Python 3 port[1]; that thread also has some discussion about Python 3.2 and PEP 333(3).
And having a Python 3 version at all is not the same thing as that version being ready to put into production like its tried-and-true Python 2 counterpart.
As I see it, the change is minor and doesn't have any practical consequences, except for:
(1) the extra work for Arch packagers, who will need to change, in all packages, every piece of Python 2.x code that expects /usr/bin/python to be a Python 2.x interpreter.
(2) making Python 3.x feel like the de facto standard in the distribution sphere.
This update broke the python-cheetah package which I was relying on.
For now, I have to use a self-compiled Python 2.6 package, and I have locked the python and python2 packages from further updates. I hope this situation gets better soon.
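In case it helps anyone else doing the same: the usual way to hold packages back is pacman's IgnorePkg directive in /etc/pacman.conf, something like:

    # /etc/pacman.conf -- skip these during -Syu until the dust settles
    IgnorePkg = python python2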
I recently started a new project but had to go for 2.7 instead of 3.x due to the lack of 3.x support in Django.
I would have loved to use 3.x. Oh well.
Use Buildout and get a python shell for any version you want in your project directory. Then add a few things to your .gitignore and distribute/deploy.