Did the timelines seem awfully aggressive to anyone? (June 1 service stops, July 31 all data is erased)
For a service that we were supposed to be syncing our lives to, that seems like a really abrupt, customer-unfriendly ramp-down.
I would have expected something more like:
1. April 1 - no new accounts.
2. May 1 - can no longer add files to your existing account.
3. May - Dec - nagged/reminded constantly to pull your files down.
4. Dec 31 - All accounts closed, data "erased"
5. [BONUS] March '15 - Data actually erased, to provide a few months of emergency recovery for the few folks that didn't know and are emailing frantically that their family photos are up there.
Almost everyone will have at least one completely up-to-date copy of their files. It's just a matter of signing up for a competing service and re-uploading. Four months seems like a long enough time to accomplish that.
For something as important as "file storage", 12 months would be a safer option to make sure everyone was alerted and had time to react. Or Ubuntu could keep the files as torrents/tars after that for up to 12 months and charge a fee for download.
The problem with an extended shutdown is that it eliminates any sense of urgency to act. If I know I only have 2 months to find a new provider, I'll start searching and migrating now. Give me a year, and I'll ignore the problem for 10 months before acting.
Freezing the accounts would be enough to create this sense of urgency: you wouldn't be able to modify or upload new files on the system. You would have to look for an alternative at that point...
Yeah, that's what I meant. People in special situations who won't check their email may lose all their files... freezing at 4 months and keeping the data for 12 would ensure nothing valuable is lost.
It's not only about file storage. Ubuntu One offered a Web service to programmatically access the files. Imagine you've developed an application that interfaces with Ubuntu One and is mission critical for your business.
I bet you can count the number of people who did that on one hand. They're not shuttering the service because it was popular. They are open sourcing the code though, so you can run your own version and not change any code.
Not essentially different from developing your business application on a PaaS which then goes bust or raises prices. These things happen, and you have to keep that in mind.
Two months notice plus two months of shutdown time? Luxury. If you want to see abrupt then MochiMedia's 2 weeks notice announcement that they were going away and taking with it probably the largest ad network for Flash games along with metrics, highscore systems, versioning, distribution and more was abrupt!
Everyone will wait until the last second to get their data back (and complain about the overloaded service) anyway. And it's not like an OS upgrade, which could potentially break your apps or lose your data. It's just a matter of pushing the sync button to get all your data back on your local drive. People will largely have time to find an alternative solution or buy a small storage device before EOS. Afaic, Ubuntu is pulling the plug on a very good service in a reasonable fashion: we should be thankful they kept it running for so long.
It was barely used and very easy to replace with free services like Dropbox. I'm sure the few users it did have are savvy enough to migrate off within the relatively short time frame.
The cost of switching is generally very low and it's quite likely that most people with data in the UbuntuOne system that they wanted to keep were already using another service like DropBox simultaneously.
The biggest cost here is to the Ubuntu reputation. They stand on the promise of LTS releases: with the quick timeline for removal of support from what could be a critical piece of customer infrastructure they signal that they are willing to sacrifice relationships for convenience.
Oh, no I wasn't implying they were going to launch a campaign of evil and despair against their users :)
Regardless, for an online backup/sync service, I think a very compassionate ramp-down plan is on the order of 6 months with a few secret months of emergency recovery in the cases where people "forgot their child birth pictures were on there!" and the like.
If it wasn't a "backup your life" service, sure, a more aggressive ramp-down is fine... but for a service that tried for a few years to convince you to stick you life in it, I think it's overly aggressive.
I can't imagine their burn rate is _so high_ that it is not something they could have funded for a few extra months.
Anyway, just my opinion... a few of you have pointed out "how long does it take to move your data?" -- sure, if you are paying attention, have a stable/non-metered connection in a 1st-world country and are always on top of things the moment they happen... you are right, a few days maybe.
I am thinking of that small 10-15% of stragglers that miss the emails, miss the announcements and legitimately have important information sitting in One and don't realize it's gone until too late.
Apparently that's for free users, for annual subscribers "[...] the service will remain available until 1 June and data available for a further two months".
Not that it changes things a lot, but two months is definitely better than one. They probably have data to back their decision.
EDIT: I misread the post, there's no difference. Sorry for the noise.
You're not familiar with most online services out there? Most of the time you get only weeks if not days before you can do anything about your data. This is actually better than most.
That always seemed to me a bandwagon feature, like Microsoft's SkyDrive: they saw DropBox succeeding, they figured it would be easy to emulate, and they learned otherwise when they tried it. The only way this change affects me personally is that it gives me one less thing to go in and switch off when I install a new Ubuntu system, but I'm glad they've got a pragmatic attitude toward the possibility of spreading themselves too thin.
Yeah, and honestly the setup in early versions was horrible. I tried so many times to make it work and gave up in the end. They shouldn't have released it so early back then when it was obviously not production ready.
I have heard that most of the trouble everyone had with PulseAudio had less to do with PulseAudio itself and more to do with terrible audio drivers that did all sorts of awful things when commanded to do relatively basic tasks by anything more complicated than Alsa.
Not for me. ALSA was fantastic for me, I had custom routing set up with LADSPA processing, very low latency, and multi-stream mixing. Pulse doesn't do nearly as much for me.
At least one of the PulseAudio features that's enabled by default - flat volumes - also expects audio hardware to behave in ways that weren't in any spec and in some cases probably weren't even physically possible, and even on hardware that behaves perfectly it causes annoying volume glitches all the time.
Why does PulseAudio have anything to do with Ubuntu? AFAIK, there is no relation between the two, other than the fact that Ubuntu offers PulseAudio (along with almost every other distro).
I must admit that, more than once, an Ubuntu upgrade killed my audio and I had to go and fix it myself. It's really painful that a distro that wants to be number 1 is so broken at times.
I think it's not so much that they push non-production-ready packages (yes, they do) but that their update model doesn't easily allow them to fix bugs from upstream.
The vast majority of the packages in Ubuntu (most notably in the LTS releases) are custom-patched by Ubu-devs. Rather than taking the raw package and pushing it through, they have a semi-opaque method of testing, patching, and then releasing to the greater community.
I've tried several times to get an 'in' so that I could help with the process, but it seems to me like a 'club hangout' where outsiders are slightly discouraged. :(
> I've tried several times to get an 'in' so that I could help with the process, but it seems to me like a 'club hangout' where outsiders are slightly discouraged. :(
I think this is an unfortunate perception. There's an active sponsorship queue, and nominated people run shifts to review and accept outside contributions. All contributors need to do is submit a suitable diff, and subscribe the sponsorship team to the bug. See https://wiki.ubuntu.com/SponsorshipProcess for details.
If you are willing, please help!
Where outside contributions tend to fall down (IMHO) is understanding and thus compliance with existing processes. For example, outside contributors tend (again, IMHO) to underestimate the consequence of regression in stable releases. Or they contribute patches that should really go upstream, and underestimate the extra overhead and community harm of independently maintaining these patches just in Ubuntu.
> Rather than taking the raw package and pushing it through...
Where upstreams have enough QA and release process to sufficiently minimize the regression risk to existing users of a stable release, an exception can be granted so that the updates can just be pushed through: https://wiki.ubuntu.com/StableReleaseUpdates/MicroReleaseExc...
I suppose I misspoke. I wouldn't say it's discouraged, but as another comment said, it's painfully bureaucratic: it's almost too difficult for an 'average joe' developer to hop in and help patch/test their own software.
I'll share my story:
I help maintain a 3,000+ node Puppet config for a school district, running Ubuntu 12.04 on various laptops/desktops. Part of our config used XFCE's weather widget on a panel.
A few months back, the widget stopped working, as the API's URL had changed. I had tracked down the issue independently, and even found that newer versions were released for 13.10+, but alas, no backport for 12.04.
I tried to investigate how to test and sponsor the package to get it backported for the LTS, but after nearly a day's worth of digging, I gave up.
Conversely, on my Arch tech station, the XFCE widget was updated many times in between, with many bugfixes and other updates.
> The reason is the risk of regression.
I can appreciate it. It makes sense that one would want the software to be as tested and stable as possible.
However, it seems a bit disingenuous to me that the developers of the various software packages are not the 'final say' in the development.
Sure, the patches might only affect a specific version of Ubuntu, but as a developer, I would prefer to fix the issues myself.
I hope that makes sense- I'm not trying to paint Ubuntu's update method in a poor light, I just can't quite grasp the full scope of why it's done that way.
Two things: one, the quality of "developer released" packages varies wildly, so it's hard to make a global policy. Two, a distro means integrating many packages. An upstream package change might be 100% reasonable and at the same time break existing users because (e.g.) of a feature removal that went through a normal deprecation policy for that package, that distro users are obviously not aware of.
Ubuntu is supposed to integrate and add a QA layer on top of upstream packages. Just pushing any so-called stable release of upstream packages isn't going to improve the situation, even if it might in specific cases.
Personally, I don't see a way out. The very fact that a distro is an agglomeration of independently released packages means that there will be breakages like the ones you describe, because every package has its own concept of stable release, development cycle, feature deprecation, old version support, and so on.
In the case of PulseAudio, there are no stable upstream releases, so distros basically have to backport fixes themselves in order to get working audio. Red Hat (who fund most of the development) do the same. New upstream versions come with new features and can be expected to break as many things as they fix.
My impression wasn't so much "opaque" as it was "dreadfully, soul-suckingly bureaucratic". It's been a long time since I looked, so this may have improved since, but it was really hard to figure out what the first step towards helping with packaging is (aside from the triage/testing side).
We release a new version of an operating system that is used by millions of users around the world - twice a year. So there are definitely some rules to ensure quality. And if anything, end users want more quality, as seen from some of the complaints about bugs in this thread!
So there's definitely a community and it takes time to learn the rules. But there's lots of ways to get involved and if you take the time to understand the issues and then have ideas on how to improve the processes you'll be listened to - Ubuntu is ultimately a community.
Ubuntu pushed pulse before it was production ready -- I like that they did that, because it accelerated the production-readiness of a great idea with buggy implementation, and I'm using debian where stability matters anyway :P
Really? Intriguing! I'm used to MS jumping onto bandwagons late ("Oh, so Netscape is doing well? Maybe this 'inter-net' thingy isn't just a fad. Get Spyglass on the phone.") but maybe this time they had the product ready to go. Certainly they didn't push it as hard as Dropbox did; I didn't notice it existed until I started seeing it activated by default in Win8.
I think it stems from the 'Linux' way of doing things, which is to give as much information as possible, I like it, but I can see how it's off putting to some people.
Hearts and minds. It's a free product and installed and activated (and pushed!) by default in Win8, as far as I can see, so you can't really measure success, but I would estimate one in three non-tech-savvy people I talk to understand what Dropbox is, and maybe one in fifty understand what SkyDrive is. So I'd call it a failure without too much hesitation.
It's quite different, honestly. Google is a services company, and they make you eat advertising every day to support their products. Ubuntu provides an OS for free (you can turn off the Amazon nonsense anyway) and storage was just icing on the cake - it was never a key component of their strategy.
I don't think it's fair to compare Ubuntu to Fedora. Ubuntu encompasses the mindset of both RHEL and Fedora. With Ubuntu versions, the LTS releases are like RHEL and the non-LTS releases are like Fedora.
I'm a proponent of RHEL/Fedora, but still, Ubuntu can upgrade from LTS to bleeding edge fairly easily. Try that with RHEL.
My experience has been to have trouble with both bleeding edges, Fedora or non-LTS Ubuntu. I don't freak out though, as I know the purpose of the bleeding edge is to suffer the bugs so the RHEL/LTS folks of the world don't have to. That said, using a release backward from current for Fedora or non-LTS Ubuntu usually is pretty stable, supposing you're on your game enough to upgrade before repo EOL.
Interesting - I think your prejudices are out of date. Ubuntu has been an out-of-the-box thing for me for the last two or three years, on old and new hardware, laptop and desktop. Of course I avoid Acer on principle, preferring to stick with Toshiba wherever possible, and I admit I'm not thrilled with its battery sensor support, but the old problems with dual monitors, suspend/resume, sound cards, video cards etc are a shadow of what they used to be.
Meanwhile, my own experience with the Red Hat family is with CentOS, which is either kept artificially out of date for security reasons ("bleeding edge" is well-named, generally) or else I only ever saw poorly-configured old versions. So I don't know enough to compare it, but I'm comfortable with the Debian family so I'm sticking with Ubuntu.
Both have a dependency system, but they are quite different in how they work and resolve dependencies.
Apt/dpkg is also a lot more careful about its treatment of configuration files. Debian packages generally ask questions during preconfigure, build a config file, and then run out of the box. But if you already have a config file, it will either be left alone or you'll be asked to resolve the merge.
Only differences that I have noticed and cared about as a home user: Yum lets you do `yum search` but Apt splits that off with `apt-cache` (this is annoying because with yum I can just swap out that 'search' with my line editor). Apt tends to be faster for equivalent operations.
If you use aptitude instead of apt* you get most of the features rolled into one command, and it also provides a curses-based interface if you prefer that to the command line but are in a situation where a full GUI is not available (if you do have a GUI, you can use an X-based tool). I'm not sure about Ubuntu, but it has been standard issue in Debian for some time.
I hope one day there's a command line package manager that works with both DPKG and RPM. Something tells me that will expose the real reason people stick with one over the other; remembering the package manager syntax. I feel no great allegiance to one side or the other, there's nothing of real merit I can differentiate them on, the only real result of the split is package maintainers duplicating work.
I have a set of shell scripts which wrap apt, yum, and macports, so i only have to remember one set of commands. They only cover a small fraction of what those tools can do, but they cover 90% of what i do day to day. They were not at all hard to write.
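Roughly this shape, if anyone wants to copy the idea (the function names are my own invention, and this sketch only covers install and search):

pkg_install() {
  # try each package manager in turn; pass the package names straight through
  if command -v apt-get >/dev/null; then sudo apt-get install "$@"
  elif command -v yum >/dev/null; then sudo yum install "$@"
  elif command -v port >/dev/null; then sudo port install "$@"
  fi
}
pkg_search() {
  if command -v apt-cache >/dev/null; then apt-cache search "$@"
  elif command -v yum >/dev/null; then yum search "$@"
  elif command -v port >/dev/null; then port search "$@"
  fi
}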
dpkg -S /some/file # tells you what package provides that file
dpkg -L some-package # lists all the files provided by a package
apt-cache search <regex> # lists all package names matching <regex>
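And, for anyone comparing the two families, what I believe are the rough rpm/yum equivalents:

rpm -qf /some/file # tells you what package provides that file
rpm -ql some-package # lists all the files provided by a package
yum search <keyword> # searches package names and descriptions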
There used to be a massive difference, mostly because Debian had apt-get and Red Hat had just rpm. That's why we still have sites like rpmfind. Nowadays there is not that big of a difference from an end-user point of view.
As a dev I still love dpkg-buildpackage way more than rpmbuild.
I'm one of the heavy users of this. It integrates with all my devices better than other options. Google still hasn't released a Linux client, and the Dropbox app has been very flaky for me on Ubuntu.
I've set my phone camera pics auto sync to all my devices. All my music and photos are backed up with Ubuntu One. When I upgrade my OS version, I sync all important files here. When I switched jobs late last year, my music was instantly available at my new office. When I'm working and I need a file on my Windows machine, I just throw it in Ubuntu One and switch keyboards.
I'm disappointed by this, but not surprised. They haven't been doing anything with the product in a long time.
I had hoped that just meant it was a stable product.
For Google Drive, inSync works well on Linux: https://www.insynchq.com/linux (and supports multiple accounts, which GDrive doesn't support on any platform)
Same here. I have been using Ubuntu One file services extensively on a free account while at university to backup my assignments, projects and whatnot.
I used the service because for me it was better than the Dropbox app and it integrated nicely in Unity (though that could have changed in the mean time, I haven't checked it in a while).
This being said, it did have its warts (for example sometimes when uploading a file the app hung for no reason; there was even one time when I had some files reverted to an older version).
I understand their decision and I think it is better to discontinue a service than to keep it running without improving it.
I guess it's time to switch...
You could try SugarSync? It's fairly flexible and, if you're the type of person who can stomach it, works flawlessly under wine. There are solid native mobile apps and a good web interface, too.
To be honest, it's been a long time (years) since I've given it a serious try. I had problems with the top bar, it wouldn't start up properly, it often froze, and I would get errors frequently trying to access it. I don't know if these errors were with the Dropbox client or on my end.
Google Drive's app on Mac is rubbish. No progress indication for uploads - yes! that basic feature is missing. You just get a tiny icon in the menu bar that flickers to indicate progress, and you are none the wiser. Google really are the "me too" company.
Why the downvotes? Surely a program that is meant to be uploading and downloading and keeping directories in sync with an external system over a variable-speed link (my Internet connection) needs to indicate what it is doing and how far along it is in doing that? A "I'm doing it!" indication isn't enough, because if I shut down the Mac and wander off expecting to have files on Drive but it hadn't finished uploading, I have no way of knowing which of many (possibly huge) files it had uploaded. I have to guess.
The web interface gives a progress indication, so why not the native application? It's very very poor.
You could try ownCloud, though you have to go through setting it up (and have somewhere to run it). There are hosted solutions but I don't know how good they are.
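For reference, a manual install on a stock Debian/Ubuntu Apache+PHP box is roughly this (the version number is illustrative, check owncloud.org for the current release, and the paths assume Debian-style Apache):

wget https://download.owncloud.org/community/owncloud-6.0.2.tar.bz2
sudo tar xjf owncloud-6.0.2.tar.bz2 -C /var/www/
sudo chown -R www-data:www-data /var/www/owncloud
# then visit http://yourhost/owncloud and finish the setup in the browser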
"Today we are announcing plans to shut down the Ubuntu One file services. This is a tough decision, particularly when our users rely so heavily on the functionality that Ubuntu One provides"
Ok, so do users really rely so heavily on Ubuntu One? If so, then why do you shut it down? If no, then why do you say they rely on it?
I believe they don't rely on it.
"The Ubuntu One file services will not be included in the upcoming Ubuntu 14.04 LTS release, and the Ubuntu One apps in older versions of Ubuntu and in the Ubuntu, Google, and Apple stores will be updated appropriately. The current services will be unavailable from 1 June 2014; user content will remain available for download until 31 July, at which time it will be deleted."
> Ok, so do users really rely so heavily on Ubuntu One? If so, then why do you shut it down? If no, then why do you say they rely on it?
From the article:
> Additionally, the free storage wars aren’t a sustainable place for us to be, particularly with other services now regularly offering 25GB-50GB free storage.
In other words, they can't compete with Dropbox, Box, SkyDrive, Google Drive, and whatever flavor of the week is next week. It's a distraction, so they're focusing on making the OS better and getting out of the cloud storage business.
They can probably tell that there are people who rely on it (active usage logs and all that), and to those people this is a Big Deal. But those are also the people that would be underserved by U1 going forward, so it's best if they get out now.
As someone who works at SpiderOak, cloud storage isn't easy and requires a different infrastructural mindset than a traditional software vendor.
Of the four peers you mentioned, two are sizable companies in their own right but are dedicated to essentially one product, and the other two are products of giant mega-corporations that have room for product teams as large as us, Box, and DropBox put together.
It's funny because the origins of Ubuntu One were a clear attempt at getting some SAAS money. Finding revenue streams has always been an issue for Linux distributions, especially consumer-oriented ones like Ubuntu, and UbuntuOne was yet another stab at the problem. Then they found that it's hard to compete in SAAS against giants like Microsoft and Google, where that sort of service is often a commodity driving profits in other areas. Focused companies like Dropbox can do it, but if your focus is elsewhere, then you don't have a chance.
Canonical does seem to have more trouble finding reliable revenue streams. This is probably due to their focus on more end-user-oriented support and features. Among the Linux distro 'major players', though, they're kind of the odd ones out. RedHat and SUSE have been doing pretty well for themselves.
RedHat and SuSe are both oriented towards enterprise server (RH) or enterprise desktop (SuSe): large-scale deployments with big support contracts.
Ubuntu has gone for consumers (home users, home servers), where nobody else ever managed to build a profitable and sustainable business in the Linux world.
If there are people who really use it heavily, then they should give them at least 6 months, I believe. Especially because without further development efforts, just maintaining the service can't be _that_ expensive.
Perhaps the logic is that if you don't use the data for a month and a half and your only copy of it is in Ubuntu One, it must not be that important?
Do you really want to wait for 5 hours just to figure out you've requested a wrong folder and the photos you're looking for were shot in June, not July?
I would assume that such a service would generate metadata indexes (incl. thumbnails) that are stored on more accessible media. That is, in fact, one of the main reasons a technical person can't just use Amazon Glacier for this directly, in the same way they could use S3 directly for Dropbox's use-case.
I think today is the 2nd of April, so they are providing 2 months less a day to continue using the service as normal, and an additional 2 months (for a total of 4) for users to download their data (but presumably not upload new data.)
I rely on it heavily. I have my home folder backed up and synced across machines on Ubuntu One. But I am using the free tier < 5GB. I will have to move on to other alternatives.
I guess it wouldn't have been sustainable for them, seeing as others are giving a lot more in free tier. So they would have had only a handful of paying customers.
"we continue to believe in the Ubuntu One file services, the quality of the code, and the user experience, so will release the code as open source software to give others an opportunity to build on this code to create an open source file syncing platform"
"We will calculate the refund amount from today’s announcement, even though the service will remain available until 1 June and data available for a further two months."
I find the second paragraph disappointing, but unsurprising. "Our strategic priority for Ubuntu is making the best converged operating system for phones, tablets, desktops and more." My hope is that the next design fad that out-fads the current "make my 30 inch high-res monitor look like a 3 inch screen" fad will be a step back towards actual usability. Convergence is a usability nightmare.
I'd rant more, but hey this is free software so I'll just switch to another distro.
I don't understand why you would choose a distribution based on a vague "strategic priority" statement you found on some web page rather than any technical reason. Maybe you just meant to complain about Unity for the 4 millionth instance on HN?
I was (and still am) honestly baffled that Canonical never went about building and marketing Ubuntu One as a home directory backup service.
Maybe automatically encrypt and backup all text files in the home directory by default, and for free. Restore encrypted backup from Ubuntu One every time a user does a reinstall or upgrade. Charge users if they want to throw in media files or binaries.
The standard Ubuntu install includes Deja Dup, which does encrypted backups to Ubuntu One. I don't remember ever seeing much marketing of this feature from Canonical, though.
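Deja Dup is a front end to duplicity, so the equivalent by hand looks roughly like this (the key ID and paths here are placeholders):

duplicity --encrypt-key ABCD1234 ~/Documents file:///media/backup/docs # encrypted backup
duplicity verify file:///media/backup/docs ~/Documents # compare the archive against the source
duplicity restore file:///media/backup/docs ~/Documents-restored # pull it back down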
Deja Dup, however, was rather buggy... on my machines it corrupted all the backups I had, and I had to restore data from duplicity's broken indexes for at least two more people.
A rather bad track record for software that is there to prevent you from losing data :/
Agreed, I got corrupted backup data as well, and Deja Dup failed to restore a single file from a directory (the file was actually backed up correctly, so I managed to recover it by restoring the WHOLE disk backup onto another volume and fetching it from there).
On top of that, the backup process itself was unbearably slow... I really wanted a backup tool built upon inotify, so now I'm using Crashplan (it's not open source, but at least it works on Linux).
Unfortunately it's not an exception but rather yet another indication that they need to spend some energy on polishing things and fixing bugs. At least the critical ones...
The price cuts from Google last week were clearly an offensive move in this space. One of the best ways to refine a market is to run it at a loss you can sustain and watch those that cannot compete die off. Credit to Canonical for failing fast here. I hope they decide to reassess the situation and provide tooling for a BYOCS[1]-esque abstraction. This'll permit users to roll their data from one cloud storage company to another as they all start dropping off.
As a somewhat related aside: where is amazon in the consumer commodity SaaS world? No email, no calendar, no storage (albeit they do provide mp3 storage). Do they just have no interest in providing these user services?
I mean, it really doesn't cost anything to push your old code to a GitHub repo or something.
That's not true. The code must be checked for licensing issues; certain parts may depend on components they don't intend to release, or on third-party ones they can't release. It must also be checked to make sure it doesn't have any security implications (credentials, information about internal architectures, etc.). Finally, releasing it may expose them to patent attacks whose existence they might not even be aware of.
> certain parts may depend on components they don't intend to release
Then hopefully that code isn't in the same repository during development, but in a separate library. Everything else makes no sense.
The open source code can be useful even if it doesn't run without the library you didn't release.
> or on third-party ones they can't release.
Again, I hope for your sake you didn't check this stuff into your main code repository.
> It must also be checked to make sure it doesn't have any security implications (credentials, information about internal architectures, etc).
If your code has hard coded credentials you should probably start by firing your developers. As for the architecture part, if you need to do security by obscurity you're doing it wrong anyways.
> Finally, releasing it may expose them to patent attacks of which existence they might not even be aware.
Not for Canonical; they're based in the UK and there are no software patents in Europe :)
But okay, that might be an issue for some US based companies.
Software patents in the UK and Europe are actually quite a complex issue, although Canonical have publicly lobbied against them. I'll just shamelessly repost a previous comment of mine:
Software patents are not entirely avoided in the EU, just to be clear. Quoting a little from an essay I had to write: although “programs for computers” are excluded by Article 52 of the European Patent Convention (1963), inventions that include an inventive step and solve a technical problem by the utilisation of a computer program have been upheld on appeal, for example in the case of Microsoft Corporation (data transfer with expanded clipboard features, T 0469/03 - 3.5.01; URL now dead: http://www.epo.org/law-practice/case-law-appeals/recent/t030...)
They might tell us that computer programs can't be patented, but their courts say otherwise - including states involved in the upcoming European Patent (for those unaware, currently EU states choose which European Patent Office patents to accept, the European Patent will synchronise all states except Spain and Italy). Actually the situation in Europe is quite similar to America - software isn't part of their patent legislation literature either; but such patents have been upheld repeatedly.
Yes, yes, companies shouldn't do that. But I'm talking about real life, not spherical cows, and we know that companies do all that and much worse.
Besides, let's say that you're a product manager who's responsible for approving the release of that code. You know that generally all those best practices are followed, but there were a few dozen employees touching that code, and some might not even be at the company anymore. Are you really sure that none of them put sensitive stuff in there? Are you willing to bet your job on that?
> Are you really sure that none of them put sensitive stuff in there? Are you willing to bet your job on that?
If you really want an answer to that: Yes, I would. Even if there are credentials in there, I'd expect them not to be usable from the public net, but only from lan/vpn.
Troubling that Ubuntu One wasn't already open source.
Can anyone explain to me why every cloud drive is using their own proprietary protocols to do the same thing? Why we have to install a different client for each vendor? That's one thing that annoys me greatly, and I'll never use Dropbox as long as its syncing tool is proprietary.
Does anyone else remember when companies were proud that they were the custodians of a file format? Once things went to the internet it seems like that stuff is no longer important to companies.
> Troubling that Ubuntu One wasn't already open source.
Why is that troubling? Why should a service provider open source the software that they use for providing the service? You're not 'receiving' the software code of the server parts -- providing the storage functionality -- but are just using that.
In this case, the Ubuntu One client software and protocol libraries were already open source, and their statement suggests that they will also be open sourcing the server software code shortly.
Another example of a similar case -- server software code that implements the storage functionality not being available as open source -- is this quote about Tarsnap:
> While the Tarsnap code is not distributed under an open source license, Tarsnap contributes back to the open source community via bug fixes and enhancements to [...]
In these cases, the storage parts implement the competitive advantages for the provider. Now that Ubuntu One has stopped leveraging that competitive advantage, it remains a classy move to open source their implementation.
Canonical provided an answer here: http://askubuntu.com/a/15295. I think the answer is interesting, and this could apply equally well to any other vendor.
That's a silly answer. The technology behind the syncing is not a competitive advantage. The existence of dozens of these services should tell you that. The way the companies can compete is by building services on top of the cloud syncing, and by providing integrations with other services you use. That stuff makes sense to be proprietary, but the syncing itself doesn't, imo.
> The technology behind the syncing is not a competitive advantage. The existence of dozens of these services should tell you that.
I'm not sure your argument applied at the time. As technology makes progress, today's competitive advantage is tomorrow's commodity.
In any case, consider the people who pay for Dropbox today. What proportion of these people don't use integrations or services on top of the basic sync service?
Proud custodians? Yes, they wrote the format and controlled it, but it was no different than a proprietary protocol; it was just a necessary way of allowing people to send data around. .DOC and .PSD in particular have always been proprietary messes that they wanted you to treat as a black box.
I was thinking more about OpenDocument, EPub, PNG. Things we take for granted today. It seems that once efforts transferred to the web companies simply have no desire to cooperate on their protocols. Even though cloud syncing is simple enough that dozens of companies have reinvented it, the companies don't feel any obligation (as I personally feel they have) to open up these protocols.
I don't think those examples were ever representative, and recent examples still exist: Google alone has WebP, WebM and KML, at least. And PNG didn't come from a company; it came from an ad-hoc group, and we definitely still have ad-hoc groups coming up with new standard formats.
> Even though cloud syncing is simple enough that dozens of companies have reinvented it, the companies don't feel any obligation (as I personally feel they have) to open up these protocols.
I agree, but I don't agree that they used to. Most formats were and have always been proprietary. Open formats are and were an exception, and they were poorly supported by the industry at large. Yeah, we had OpenDocument, but the biggest office suite didn't support it.
Similarly, ownCloud has an open protocol for syncing, but most vendors don't support it, just like they didn't used to support open formats unless they were forced to by their popularity. It's nothing new.
You may be right, I may be looking at the past with rose-colored glasses. But still, it feels worse these days. Those formats I mentioned might have been developed by groups, but they were groups made up of companies with expertise, right?
With cloud syncing we have a few open alternatives, you mentioned ownCloud, but there's also git annex, and clearskies (btsync clone).
But those are all community projects. No corporate support. At least in the OpenDocument days you had Sun as the corporate owners.
Why does DropBox not feel some responsibility to step in here, as the industry leader? This is a company that was launched on HackerNews. You are like us, why does this not bother you?
> You may be right, I may be looking at the past with rose-colored glasses. But still, it feels worse these days. Those formats I mentioned might have been developed by groups, but they were groups made up of companies with expertise, right?
Yeah, but that still happens. There's the W3C and the WHATWG, of course, but also formats like Opus (Xiph.Org, Skype, Mozilla and others) and protocols like OData.
> At least in the OpenDocument days you had Sun as the corporate owners. (...) Why does DropBox not feel some responsibility to step in here, as the industry leader? This is a company that was launched on HackerNews. You are like us, why does this not bother you?
I think you're being a little unfair to Dropbox here. You should remember that Sun was in a very different situation: they weren't market leaders, they were users and the main vendor (MS) was asking them so much to license Office, that it was cheaper to flat-out buy StarOffice. And then opening it up was certainly socially beneficial, but it was also a way of pulling some power from Microsoft's hands and making it easier for Sun to avoid licensing their suite.
Dropbox, on the other hand, has no clear business case for opening up. They'd be adding a lot of risk to their main and only product, and for what? Good will?
I'm not saying they shouldn't open up, but I don't think we should judge them too harshly.
By the way, I'm very doubtful of the idea that HN as a whole has an ethical position on opening up. I'd say the mainstream position here is "we like open source if and when it benefits us". Which is why you see plenty of open source libraries and programming tools, but very little open consumer software. Proprietary SaaS is the mantra around here, not FOSS applications.
Don't think the server code is available as open source right now. The statement was:
> Additionally, we continue to believe in the Ubuntu One file services, the quality of the code, and the user experience, so will release the code as open source software to give others an opportunity to build on this code to create an open source file syncing platform.
So you might expect that they are (or will be) preparing for an open source code release shortly. A move with class, BTW.
That's about as classy a product shutdown as you could wish for.
Pity this didn't work out financially for Canonical, and too bad for those users that came to rely on it (but this is the issue with pretty much any service that you don't operate yourself).
> That's about as classy a product shutdown as you could wish for.
2 months continued service and another 2 months data access. It could be worse but it is a long way from the best you could wish for. 12 months to 2 years after the announcement would be what I would hope for from a big company winding up a service
This is disappointing. I have all my backup files on Ubuntu One, and I've purchased extra storage as well.
Between various bugs, shit that just plain doesn't work, cancelled projects and now Ubuntu One, it seems Canonical likes shooting themselves in the foot... Maybe if they had actually marketed Ubuntu One and made it better they could be a Dropbox competitor. Then again, maybe they just don't have the expertise to make half of this shit work.
No wonder SUSE is a billion dollar company, Red Hat a multi-billion dollar company (with over a billion in yearly revenue), and Canonical a trust fund baby...
It was expected that this kind of Ubuntu-branded service was not going to last long.
From its early beginnings, Ubuntu has been pointed out as lacking a foreseeable long-term future: a nice experiment funded by a donation from the personal fortune of a rich guy, which will eventually run out.
Ubuntu One felt like an experiment to bring some money in, in the hope of somehow helping fund the project for lack of a better business model.
It makes sense to kill the experiments that don't bring enough profit while consuming precious resources.
Anyone interested in porting the Ubuntu One code onto a new free, open source decentralized cloud network?
The spec is amazing, including client side encryption, fully anonymous, no single point of failure, no way for the network to be censored or shut down etc.
If interested, check out MaidSafe.net (http://maidsafe.net) for an overview, and if you want to talk code, join us on the maidsafe-developers Google Group. It's launching soon and it would be fantastic to have an Ubuntu One port as one of the first apps!
I've loved Ubuntu One for years. This is too sad; I would've wanted Ubuntu to use it automatically for all kinds of things, such as home directory backups etc.
Are there any good open-source self-hosted options that I could run on my little box at home? Preferably something that doesn't require a special setup or deployment on a server.
I could imagine there are file-sync solutions that just need an operable SSH account somewhere and merely automate the use of rsync to do the transfers, watching files and taking care of conflicts.
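Something along these lines would probably get me 90% of the way there (host and paths are placeholders; it needs the inotify-tools package, is one-way only, and does no conflict handling):

# watch the directory; inotifywait exits on each event, so the loop re-syncs every time
while inotifywait -r -e modify,create,delete,move ~/Sync; do
  rsync -az --delete ~/Sync/ user@myserver:backup/Sync/ # push changes over SSH
done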
Time your code release so it coincides with the Ubuntu One code appearing wherever it lands, and piggyback on the web interest. Timely blog post about how you plan to replace Ubuntu One.
Wouldn't it be better if they partnered with someone like DropBox to provide a migration path? Possibly a transparent migration path? Then convert Ubuntu One to a relabeled version of DropBox.
I for one (no pun intended) actually liked Ubuntu One. In the age where the DropBox installer for Ubuntu was funky and running it headless involved downloading a Python script and running it in a screen session, Ubuntu One provided a much better experience.
That said, I was never one to pay for DropBox or Ubuntu One because their pricing was just a little too expensive. The free tier got me enough space to share a few random files, and if I needed more than a few GB, I've got my own infrastructure for that.
Considering the fact that Ubuntu sometimes breaks on laptops, an online storage service for backing up important files has been good for me. Now that it will be gone, I'll have to configure some third-party software to get an automatic backup process. On the brighter side, I feel the Ubuntu on my laptop is getting more and more stable (no breakage for the past few months, in fact), so online backup may not seem that necessary for me now.
I never liked their storage service. On top of that, all the video files which I stored on their server got lost previously, so I never cared to use their service again. It just showed the filename of the video and displayed its size as zero bytes. A quick google at that time showed me that other people had also lost their files, and Ubuntu didn't even send an apology mail for that. (The text files were still present, though.)
Are you 100% sure it was Canonical's fault? I've had issues with dropbox, google drive and sky drive at different points in the past and never received an apology letter from any of those companies either. A simple google search will show literally thousands of people having trouble with all sorts of Microsoft services. I sure hope they get apology letters.
Yep! Mirall is the client (full disclosure, I'm a pseudo-team member working with this client's development) and the server software would also be needed.
"Our strategic priority for Ubuntu is making the best converged operating system for phones, tablets, desktops and more"
I guess a server OS is also not a strategic priority. Oh well. What is a good Debian-based server OS that is a bit more up-to-date than Debian (and also a strategic priority for its developer)?
Just use Debian Testing. If you find a "Debian-based server OS that is a bit more up-to-date", odds are they're just repackaging testing or unstable anyway.
Sysvinit is the UNIX way. Besides, if it's a server, you start everything once and leave it running. Reboot time is not important, only reboot reliability.
Canonical should buy Dropbox and integrate. That'd give others (The Goog, AAPL, MSFT) pause for thought and give the Ubuntu folk a key differentiator versus the other Linux Distros. If they can't afford it they should figure out how to afford it. Beg, borrow or steal :)
I can see disabling new signups but not shutting the service down and erasing everyone's data - it doesn't bode well for Ubuntu in the enterprise or anywhere else trust and longevity are important.
Thank you! Ubuntu One is a buggy, laggy mess and frankly an embarrassment to Canonical. What would be neat is if they made a GUI for BTSync, and sold pre-configured hard drive space for it!
I imagine this service had a low adoption rate and is costing Canonical money at a time when Mark is putting all his eggs in the phone/pc convergence basket.
Oh no, Ubuntu is shutting down a service despite "our users rely so heavily on the functionality"... clearly this means we must never trust Ubuntu again!
At least that's what people on HN say every time other companies launch something new.
The only reason it's not sustainable anymore is that send-all-my-searches-to-Amazon will be opt-in, so they will have to start paying for S3. The deal where Amazon would give away cloud storage in exchange for OS-level privacy has changed since they announced the opt-in.
Maybe the NSA came for their files and they don't want to snitch on their users, but they also got one of those security letters so they can't say anything about it. Maybe this is a heroic move for us against the US government and we'll never know. Might explain the hasty timetable too.
A business we like does something we don't like: the NSA probably caused it.
Maybe companies are just capable of doing both things we like and things we don't, and our liking them is just a cognitive bias from historically cherry-picking the good things over the bad.
Actually I'm quite ambivalent about it, I never used ubuntuOne. The problem is there is now precedent for this, and it's all secret. We can't know one way or the other about this or a ton of other stuff. It's more a comment that the NSA has thrown up a huge amount of uncertainty about a lot of things and you can now second guess so many things. And that sucks.