More than 15 years ago, I worked for a Linux services company that competed with Red Hat. We sometimes saw them, or were encouraged to see them, rather negatively (as the competition), and I particularly remember that an ad agency that wanted our business once shipped us a box containing a burnt red hat (!).
Time has shown that Red Hat really knows what they're doing, on many different levels, and they're the most successful free software business in history.
Actually took me forever to figure out he meant a literal box, containing a literal burnt red hat. I parsed that as a box (server) imaged (burnt) with Red Hat (the operating system).
Seems like they used to at least tip their hat (no pun intended) to free software, GNU, etc. --- the whole free software ecosystem on which much of their distribution is based. Though maybe I'm remembering it wrong.
Now I can't find any mention of free software on their site's About pages.
There is no billion-dollar open source company outside of Redhat, if a billion dollars in revenue is your benchmark. In pure financial success as an OSS company, they truly are unrivaled.
Otherwise, there are thousands of small open source companies that do quite well for themselves. If you look at pure profit, however, Redhat does abysmally compared to, say, Microsoft or Apple, or any traditional proprietary software company. Because of this, the valuations for purely open source companies have to be lower, which prevents a lot of the VC funding frenzy for "the next unicorn company". If you want small and sustainable, OSS can work. Otherwise, Redhat is the black swan in that they pulled off something thought to be impossible.
Mozilla should be considered in a similar category to Red Hat. They make almost half as much, in the hundreds-of-millions range. That's from focusing on search revenue instead of doing lots of products like Red Hat does. If they did that for enterprise, they'd probably be in the billion-a-year situation Red Hat is. Especially now that the competitors are surveillance engines, they could differentiate with a paid offering. Even per organization rather than per user. Plus value-added services like encrypted backups.
They have a lot of potential that's underutilized.
Maybe, but from a corporate standpoint, Mozilla's leadership is pretty abysmal. Take the entire debacle with Mozilla Messaging. It was pretty much the only serious game in town for a decent GUI mail client on Linux, or really cross-platform, and they dropped it on the floor instead of finding some way to monetize it. I'd have paid for a decent Linux mail client with sensible calendaring (that isn't Evolution, a.k.a. Outlook for Linux).
I lost a lot of faith in Mozilla years ago when the Firefox brand was starting to get really big and the Linux releases for everything Mozilla would lag behind by a few months. As a very early and avid Mozilla Phoenix/Firebird user, I lost faith in the brand as a whole. Sure, they're doing wonderful things with WebGL and the bleeding-edge JavaScript stuff, but so is Google, and arguably to a greater degree.
Their potential is underutilized due to pathetic leadership, which will continue to flounder until that is resolved.
Yeah, how long have you used Firefox on Linux? Have you used it since Phoenix/Firebird (before it was called Firefox, maybe 2003-2004, I forget the exact dates)? I did, and I distinctly remember when they pretty much shit on all Linux users after it really took off on Windows.
Nowadays their releases are always in sync, but that is a more recent thing. I also seem to recall Linux support was much better with Firefox 4 and after, but that was a long time ago and it is a bit fuzzy for me currently.
Is that really true? It seems most of their effort is funded by the corporation that makes boatloads of money off search revenue. Then, there's the nonprofit that owns it. I believe monetization is definitely one of their goals even though I can't say the priority.
I get your points on what issues they have. Yet you were talking about how Red Hat is so huge in terms of labor and money. I was only bringing up Mozilla in the sense that they're a FOSS company making a ton of money with a bunch of employees, which goes a long way in Red Hat's direction. I've got plenty of ideas for how I'd do it way better as a business. We're talking about whether they make big money, though.
Same here! I was looking for it one day wondering how much money an open-source browser was "scraping by" with... to find... "Holy shit! Hundreds of millions of dollars!?" I can only dream...
Cloudera will soon be a publicly held mostly-open source company valued well over a billion dollars.
Hortonworks is already public and almost fully open source, although their market cap is under a billion ($670MM today).
Edit: I missed that the billion refers to revenue, and that RedHat is >2 billion. That's extremely impressive! These data platform companies are doing well, but I think they're still doing far under a billion in annual revenue.
Not just the Linux kernel. Canonical is primarily a consumer of open source that does some really neat value-add on top of it (see Juju for an outstanding example). Redhat is primarily an excellent engineering company with good enterprise sales and (compared to Canonical) lousy marketing.
Was that snark really necessary? Parent had a valid question; your org could be using the OSS Ubuntu which would be free, or the enterprise version which costs money. You never really specified which, and it changes the argument drastically from a cost perspective (and therefore potentially some of the motivations behind the choice.)
They're not even in the same ballpark. RH has 10k employees and USD 1bn in revenue with a market cap of ~15bn; these guys just seem to be a ~500-person WordPress shop.
Automattic is the WordPress shop. They develop WordPress and run WordPress.com. There aren't any recent revenue numbers available, but indirectly powering ~25% of the internet is nothing to sneeze at.
That's interesting; they really don't have the brand recognition they deserve, then. I would consider myself as reasonably well-informed as the next HN reader, and I certainly hadn't heard of them. Of course I don't work with WordPress - but I have heard of WordPress...
There are many success stories among open source companies, but they are not as open as Red Hat nor as successful as Red Hat.
There is a consulting company around the corner that customizes open source software at small scale. There's CoreOS, Percona, SUSE, ...
Oracle is the only multi-billion-dollar database-centric company.
Facebook is the only successful social media company (Twitter is microblogging or whatever you call it).
Microsoft is the only desktop blah blah.
Amazon blah blah.
That is a result of being early, acquiring many small successful companies (e.g., KVM, Ceph, Ansible, ...),
and being really, truly open source.
No one else is truly open source; e.g., MySQL used to sell exceptions and proprietary features instead of selling enterprise support.
There's a ton of them. Many of them are smaller players consisting of core developers providing feature additions for money, or support contracts, or selling an "enterprise" wrapper or hosted version.
It's really difficult from a business point of view.
To have a business, you have to have something scarce to sell people - it could be simply your time, but that's difficult to scale. For most companies it's some kind of product, but open source products are inherently not scarce.
So to succeed, you have to have a source of scarcity in there somewhere, and that's not always easy, on top of the fact that no startup is ever easy.
It's bad only if that's the company you want to take to $1 billion scale.
What if you don't have hot-shot VC buddies from Princeton, and didn't go to Stanford, and need to make $1M in cash to do your real business idea? All of a sudden, doing paid features for an open source project or doing consulting sounds like a fantastic idea, because that will get you to $1M almost guaranteed (if you're generally capable).
Consulting is not a great business in a lot of ways, and $1 million is not at all easy unless 1) you're someplace where that's not a lot of money, or 2) you're famous.
I would heartily suggest avoiding it, as it's also not scalable even at a small size.
Look at the kind of stuff the 'micropreneur' guys do, like patio11's bingo thing. That was small (I agree with you there) but still something that, once running, kept making money without a daily grind like consulting.
I dunno, IBM and the likes (Sun, Oracle) seem to make a fortune off consulting, to the point where they've just about removed themselves from actually making stuff.
Consulting involves very little capital outlay, and is mostly just trading your own time and expertise for money.
Creating a product - even a small scale one - is much more of an investment (and a risk), because while you're building it, it earns nothing. The payoff is that it can make money while you're asleep.
I know a handful of people doing this right now, and they're pulling above-average developer salaries. However, they're also traveling a ton and building kick-ass résumés as well.
Ah yes, that may be it, I am fairly sure they were in the black for a while from their cloud services, but they probably blew that on their push to mobile.
The same article said they were approaching break-even before the mobile push. So maybe, maybe not. It's clear they could do it if they wanted to. It seems to have mostly lost money that Shuttleworth put into it out of his pocket, though.
That's too bad, because I like what they've done in terms of usability of Linux outside Unity. I switched to Mint at that point, until the distro got subverted way too easily. Then back to Ubuntu with shitty Unity. I can't wait for the new one without it, which was just announced since the mobile thing didn't pan out.
The Apache Software Foundation is in no way, shape, or form a business. It's a non-profit with maybe 1 or 2 full-time people (last I recall) and tons of people who commit to projects.
> I don't think it is possible to be Open Source without being Free Software, and vice versa.
It is a bit tricky here. Depending on the situation, the same software can sometimes be open source AND free software, or open source only (i.e., not free software).
E.g.: the Linux kernel. When it is run on your computer, usually it is open source and mostly free software (linux-libre would be fully free). When it is run on your router, the vendor must give you the source code, but may not allow you to modify it. This is a violation of freedom 1 (the freedom to study how the program works, and change it so it does your computing as you wish). Then it isn't free software, but just open source. You can't even confirm whether the source code they gave corresponds to the binary run in the router.
Atlassian software is Open Source but not Free Software. You can download it, inspect it, and run it... but you need a paid license to keep it running. The same is true of RHEL, because CentOS is effectively "RHEL without a paid license".
Both Atlassian's "open core" and RHEL are free software.
The difference is that Atlassian sells plenty of proprietary software and add ons too, while the only proprietary software that Red Hat sells comes from recent acquisitions (e.g., Ansible Tower), and it is all in the process of becoming open source.
Red Hat has a great track record for open sourcing software from companies that they acquired.
As I recall, the USPTO declined to grant the registration for the service mark, which was kind of sad for OSI, but they kept going anyway, encouraging people to refer to OSI-certified open source.
No, I am referring to Stallman's view on free software. See below. Software that is free to download vs. "free software" from a license PoV aren't necessarily the same. Redhat's products are best described as open source software.
Redhat's code is Free software, actually. The page you link simply says they don't follow the free software distribution guidelines (and thus are not endorsed by the FSF/GNU project) because they also allow non-free software in their repos.
A better reference for Stallman's view here is probably https://www.gnu.org/philosophy/categories.html, which shows why Red Hat's business is probably reasonably described as a free software business (particularly since "the differences in extension of the category [open source] are small").
You're incorrect. Red Hat (and SUSE)'s distributions are free-as-in-freedom software (or at the very least, the majority of the distribution is free -- the link you posted is complaining about a very small minority of the software that is part of such distributions [which is a problem but is not as big of a problem as you claim]).
The fact that you won't get official updates after a trial period (or if your subscription runs out) isn't against the principles of free software. As a user, you still have the source code and have the freedom to maintain the distribution yourself (nobody does, but the freedom is still there).
Something I learned recently after I started working for Red Hat, and which is the tl;dr of the article:
"Red Hat also commits to keeping our commercial products 100% pure open source. Even when we acquire a proprietary software company, we commit to releasing all of its code as open source."
I work for SUSE and we have a very similar view on making all of our products free software. Everything I've ever written or done for SUSE is free software and I contribute it to openSUSE before it goes into our enterprise distribution (we are a member of the openSUSE community and we very strongly believe that our contributions should go to the community before they go to our enterprise distros).
Not out yet, but you can go to https://www.ansible.com/open-tower for updates. The cool thing is that even though it isn't out yet, it will be, because that is the Red Hat way.
Export compliance is a huge PITA for Red Hat. Most open source projects and libraries are not very well set up to meet requirements from the US gov. It sucks down a ton of time making sure everything is above board.
I spent an entire weekend setting up Ubuntu, and then was extremely jaded when I could barely get anything to work, and even getting Flash in the browser was a huge pain in the ass. So I only use my Windows side partition now, and it's just easier...
> even getting Flash in the browser was a huge pain in the ass
One thing that you have to remember is that what is easy and hard in Linux is different than what is easy and hard in Windows and MacOS.
Flash is one example. Since it is not Free and Open Source Software, the Linux distributions cannot redistribute it to you without risking being sued by Adobe. So if you want Flash, you will need to jump through some hoops.
On the other hand, if the software you want is in your distro's package repository then the installation experience blows Windows and MacOS out of the water.
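For example (the package name here is just an illustration), on a Fedora/RHEL-family system a single command pulls the program and every dependency from the distro's signed repositories:

    # one command, dependencies resolved automatically
    sudo dnf install gimp

    # the Debian/Ubuntu-family equivalent
    sudo apt-get install gimp

No hunting for installers, no next-next-finish wizards.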
It's this mindset that makes me want to support Red Hat as a company. As good as the Fedora 25 experience has been for me, I'd love to see Linux laptop vendors (especially Dell) make it an installed option soon. Maybe CentOS laptops would make more sense for them from a sheer "stability for support sake" experience.
There's a sort of chicken-and-egg problem there (or at least, there has been in the past. I've been out of the loop on this for a few years). CentOS/RHEL are definitely the stable, long-term distros, but they don't always map exactly to a specific Fedora release, so you can't necessarily just build RPMs for the last few Fedora versions and also support CentOS/RHEL. Historically, when looking for software that wasn't already in a repo but packages were provided, you would often see Ubuntu and Fedora packages but not RHEL (and thus CentOS) packages.
RHEL's release schedule is really optimized around server and workstation time scales, not desktop time scales. Ubuntu's is much more amenable to desktop use, where you get an LTS release every 24 months. RHEL regularly goes 3-4 years now (with point release refreshes, but that's not too exciting for desktop users). The flip side is that RHEL and CentOS have a much longer support lifecycle.
Point RHEL releases are certainly exciting for desktop users. You get lots of driver and filesystem updates, and some point releases even rebased Mesa, X11 and GNOME. Plus, Software Collections such as Red Hat Developer Toolset provide updated MySQL/MariaDB, Ruby, Python, GCC, etc.
Red Hat was the primary reason it took me a lot longer to adopt Linux than it should have. Headbanging experiences with dependency hell and things not working as expected left me extremely discouraged. It wasn't until I dabbled a little with Solaris 7 and finally found Slackware that I realized that Linux could "just work". IMO Red Hat's success was primarily based on the critical mass of support behind it, not because it was the best distribution.
I agree, but they've certainly come a long way. When I started with Red Hat 6.2, the dependency hell of installing RPMs from random websites and using the (almost never functional) up2date left me with an awful taste in my mouth as I jumped ship to Debian and the "just works" experience of apt-get.
After switching to Fedora because we're using CentOS at work, I've come to like it. DNF and yum are fine replacements for APT. A PPA analog (copr) is just a 'dnf copr enable user/project_name'. 'dnf history' shows every transaction on my system and makes it easy to undo installations.
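For anyone who hasn't seen them, a rough sketch of those commands (the repo name is the placeholder from above, and the transaction ID is hypothetical; copr needs the dnf plugins installed):

    # enable a COPR repo, the PPA analog
    sudo dnf copr enable user/project_name

    # list every install/update/remove transaction on the system
    sudo dnf history

    # roll back a specific transaction by its ID
    sudo dnf history undo 42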
The only things I don't like on an out-of-the-box Fedora installation are the stupid, touchscreen sized title bars in GNOME 3 and SELinux - which is fortunately easily disabled.
>The only things I don't like on an out-of-the-box Fedora installation are the stupid, touchscreen sized title bars in GNOME 3 and SELinux - which is fortunately easily disabled.
You're missing out. Nix is a fantastic package manager with features that apt (and similar) simply do not have. NixOS is not perfect yet, but it's very usable, and quickly improving.
Can't say this enough. You write a config script for your OS and run it.
Need to set up device #25?
Upload and run. You're guaranteed to get the same system as device #24,23...
You can keep it in source control, etc.
If it (or guix - no systemd FTW) takes off, I can't imagine using another distro again.
Unfortunately, though, it doesn't have enough maintainers yet (a year or so ago nginx was some 4 or 5 versions behind, and they definitely don't do backports/LTS).
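To make the "config script" idea concrete, a minimal sketch (the service and packages chosen are just examples): /etc/nixos/configuration.nix describes the whole machine, and one command realizes it:

    # /etc/nixos/configuration.nix (illustrative content)
    #   { config, pkgs, ... }: {
    #     services.openssh.enable = true;
    #     environment.systemPackages = [ pkgs.git pkgs.htop ];
    #   }

    # build and activate exactly the system described above
    sudo nixos-rebuild switch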
The biggest problem Nix[OS] has right now is documentation. The wiki is deprecated, but still the only source for explanation of several key parts of the system. nix-shell is fantastic, but hard to learn simply because there is no specific place to learn. Nix language really needs some clear documentation so a user can understand how to get from "I don't have X" to "I have a working default.nix for X, and X installed in my environment."
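Something as simple as the following would go a long way: a minimal default.nix sketch (the package name, version, and the assumption of a standard configure/make build are all mine):

    # write a bare-bones derivation for some package "x"
    cat > default.nix <<'EOF'
    { pkgs ? import <nixpkgs> {} }:
    pkgs.stdenv.mkDerivation {
      name = "x-1.0";
      src = ./.;
    }
    EOF

    nix-build    # build it; the output appears under ./result
    nix-shell    # or enter a shell with the build inputs available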
GuixSD seems nice, but seems to have no method for installing non-free software, which is, unfortunately, necessary for many setups.
One thing I would like to have is a Nix/Guix wrapper for Debian packages (and possibly other packages/distros), so I could take advantage of Debian's robust ecosystem, and NixOS/GuixSD's totally functional environment.
My problem with documentation isn't so much that the Nix language is hard, but that a Nix package's configuration is (obviously) not the same as upstream's.
Before yum and dnf, dependency hell was a serious issue indeed. After that it's still a problem if you use multilib (sometimes updates get pushed for x86_64 but the i686 packages are missing), but I wouldn't really call it dependency hell anymore.
Seriously. Weekend long marathons of dependency chasing (in the snow, uphill both ways) to finally get config/make/make install to run were not only a thing but disturbingly commonplace. These kids don't know how good they've got it. God I feel old.
At least you had a package manager with a reasonable number of packages. My first distro as a main OS was Puppy Linux because it was a small download (remember dialup?) Actually at the time I didn't even have internet in my apartment, so installing software went something like this:
1) Identify a really interesting piece of software, like OpenFOAM, with (unbeknownst to me) lots of dependencies
2) Take a half hour walk to the local library with my shiny 4gb flash drive (89 dollars, a Christmas present to myself).
3) Download the .tar.gz of the software.
4) Walk home.
5) Unpack the .tar.gz and run ./configure. Watch it fail.
6) Walk back to the library to download the missing dependency. Walk home.
7) Goto 5.
I ended up very irritated that configure always fails on the first missing dependency, instead of comprehensively listing the missing requirements in one go. Things did get a bit easier when I learned to scour the documentation for any libraries referenced, but of course reading the docs still often required a trip back home to unpack the tarball...
It feels so good to not have to look at that book with the guy on a horse to get networking going every time I installed a new distro, or sometimes upgraded. Especially because I usually didn't know what I was doing to fix the issue; I didn't even know what I was looking for.
Thanks to everyone who did read that book and others cover to cover (or wrote them), so that my linux install just works.
You could recreate the experience for younger colleagues using a Slackware install, deliberately not using any of the more advanced tools (slackpkg+ &c) and hiding the existence of SlackBuild scripts.
> has always enjoyed the reputation of being the most bulletproof distro if you could afford it.
I've always heard that said about Debian more than about Red Hat (though Red Hat certainly is pretty stable).
Red Hat has a lot more success in businesses though because you can get contractual support; which may not only be useful if you don't want to get the skills in-house but also because your own customer may contractually require it.
Debian also has an impressive reputation for quality. In many cases it has achieved, through its vast user and developer network, what Red Hat could only achieve through paid staff and commercial resources.
But this is not true in all areas, especially in some aspects that matter to companies such as training/certification and having good up-to-date documentation.
And as a volunteer-driven project, I don't think Debian can ever be as responsive to end-user problems or requirements as a commercial product can be.
But it definitely gives Red Hat a good run for its money. For example, the Debian LAMP stack has long been and still is the gold standard.
Debian is truly a solid alternative. Ubuntu, less so. They're shipping an impressive amount of new features, even in LTS releases, but their QA is nowhere near as good as Debian's or Red Hat's.
UNIX vendors didn't normally ship releases or updates that introduced dependency issues. Part of their job was doing testing and quality assurance; that's what you were paying good money for. Most such issues were introduced locally.
A good sysadmin would know how to avoid dependency hell. For example this might include taking basic precautions such as testing changes in a chroot or development/testing environment, before rolling them out.
I'm not saying the problem didn't exist, only that it wasn't an actual problem if you knew what you were doing.
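On a modern system that precaution might look something like this (the paths and release version are illustrative):

    # build a throwaway root filesystem to experiment in
    sudo dnf --installroot=/srv/testroot --releasever=7 install @core

    # try the risky change inside it before touching the real system
    sudo chroot /srv/testroot /bin/bash

(A proper test chroot needs /proc and friends mounted too; this is just the gist.)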
The problem with dependency hell was more that RedHat back then didn't have such a thing as a "rolling release" or "testing" in Debian speak. So while "RedHat 6" worked fine, trying to upgrade (say) OpenOffice would require that you download the rpm from OO's website, which would require that you upgrade Gnome, which would require that you upgrade X, which would require that you upgrade glibc, and down the rathole. Considering that most people then were still on dialup, it was horribly annoying.
Also, RedHat didn't ship with (relatively) much software (though "install everything" was an option). Now you've got to configure; make; make install. Now you've got two parallel installations of libraries and software.
Oh. And the RPM database would die periodically (rpm anything would hang), requiring a reinstallation.
Since moving to Debian over a decade ago, I think I had to configure; make; make install something only once (it was an old and unmaintained Java library on SF.net). Almost everything (open source) is in the Repos.
I never install anything from source. Packages from yum repositories only.
Upgrade OpenOffice with a downloaded RPM? No. The point of RedHat is stability for enterprises. If you want the latest version of everything, RedHat is the wrong distribution.
As someone who used Redhat since 5.0 - the RPM database never died for me. But then, I've never used 'rpm --force', which many did and I suspect was the major factor in RPM database death.
I think it was more of a problem on the BerkeleyDB side than on the side of the data. BDB gave the impression of being quite a fragile thing back then, but it was (and AFAIK still is) the only source of data about installed packages.
Except this really has nothing to do with rolling releases, or the amount of software the vendor ships in the official repositories. Switching to Debian won't solve the problem either.
There is no distro today that provides packages for all the software you will ever want, or the specific version that you need. At some point, you will resort to installing software from outside the officially-supported sources, whether from experimental or user-maintained package repositories, or from a third party in binary or source form.
Until recently, this was an operation that wasn't guaranteed to be easy or straightforward or risk-free. In the worst case, it could even screw up your system in ways that are time-consuming to diagnose and fix.
In the example you gave, I would conclude that OpenOffice didn't package their RPM well, because it ended up driving RH6 users down the rabbit hole. At the very least, they could have unpacked all the files under /opt and provided static binaries, or included all the libs in the archive. Many packages still do that today, such as Vagrant, which installs under /opt/vagrant and includes its own Ruby interpreter there.
Nowadays there are efforts underway to make installing custom software safe and easy, projects like: flatpak, OSTree, appimage, and snap. Hopefully we can reach a point where you can install whatever version of whatever software you want without breaking anything.
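The experience these projects aim for is something like this (the app ID and remote are examples; hello-world is snap's demo package):

    # a sandboxed app that carries its own dependencies,
    # so it can't break the base system
    flatpak install flathub org.gnome.gedit

    # the snap equivalent
    sudo snap install hello-world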
Generally, nowadays you keep a system relatively up to date using apt-get update and upgrade, pacman or yum. Back then all you had was RPM (the equivalent of dpkg).
The attitude was "you want new versions? Go to upstream, download the .RPM and install".
Now, OO (don't remember if it existed back then) depends on certain versions of (say) GTK.
What timeframe are you talking about? I remember it didn't "just work" for me when I first started experimenting with it (V4.2, IIRC). Yes, I was new, but it wasn't the easy-install experience that it now is, for sure.
I started at around the same time -- Red Hat Linux 4 era. Yum didn't exist yet. I was also quite new to Linux, but I'm reasonably sure there really wasn't something that would do dependency resolution for you. You either had to find an RPM that had the right version, or you had to find a tarball somewhere and run configure and make yourself.
Installing software on an OS shouldn't be something that requires ANY experience. What exactly is the point of an OS without the software running on it? If it's not immediately obvious how to do any of it, for anyone from a total noob to a veteran, then it's a garbage OS in my opinion.
That's missing the point of Linux entirely. Your average end user isn't using Linux as their day-to-day desktop environment as it requires some skill to maintain. They are using Windows/OSX because they don't have to focus on the platform and are able to carry on with their workflow applications, needing minimal back end maintenance. On the flip side, many sysadmins will not run back end operations on a Windows box as you simply don't have the same level of control, configurability, and customization that you can achieve with a properly set up Linux solution. And calling any OS requiring knowledge to operate and maintain garbage is downright ignorant, as this entire thread and article proves otherwise.
Except all I've been hearing about for the last 20+ years, is that THIS YEAR is going to be the year everyone uses Linux on the desktop. That's not ever going to happen with how archaic it is to do basic tasks.
> Red Hat's success was primarily based on the critical mass of support behind it, not because it was the best distribution.
May be true, but they've given so much distribution-independent code back that I'm really glad they made it. It even seems that they contribute more to desktop Linux (GNOME, PulseAudio, systemd, kernel devs, etc.) than Canonical does (they seem to focus on Ubuntu-specific solutions mostly), which is pretty neat for a server vendor.
IMO the distro that actually made me realize that after some good configuring Linux can just work is Gentoo. After Portage no package manager was ever good enough for me.
No surprises there. While just about every other package format out there is some variant of a tarball with added metadata, the RPM is a custom binary format.
I know someone did a blog post or article on the RPM format internals, but damned if I can find it with a quick search. I just get a whole bunch of Fedora and Red Hat links...
I proudly worked for Red Hat for almost a decade, and I can honestly say that they've got pretty much everything right, from culture to work ethic to vision and execution, and throughout, it's done with the community at the forefront. It really is part of the company's DNA, and they don't just talk the talk, they walk the walk.
A great company, culture and a great place to work.
Having used Linux for over 20 years now, the 'fragmenting the Linux desktop' idea boggles my mind.
The entire Linux ecosystem consists of competing things. Nginx vs. Apache, sendmail vs. qmail vs. postfix, ... the list goes on. There is no singular 'desktop' when it comes to Linux, and there never will be one. Thinking and hoping there will be is simply not understanding what drives the whole Linux world: options and competition.
"I’m writing to let you know that we will end our investment in Unity8, the phone and convergence shell. We will shift our default Ubuntu desktop back to GNOME for Ubuntu 18.04 LTS."
I checked the date of the post really carefully as well. That is certainly a change of direction.
I guess Unity was the only thing driving Mir then? Is dropping one the same as dropping both?
Seems like good news to me; having a significant player like Canonical contributing to the community instead of fragmenting it without much benefit can't be bad, can it?
I don't use it on the desktop anyway, but I hope that moves in this direction are rewarded by the community.
My (limited) understanding is that Gnome includes Wayland as an integral component, and so reverting to Gnome as default desktop for 18.04 will preclude use of the Mir graphical server.
Huh. Maybe I'll move back upstream to Ubuntu. Although at this point LinuxMint works fine for me, so it's hard to see a compelling reason. But Ubuntu's desktop games are what finally drove me out, as a desktop.
2015/46 - visual and tactile bodysuits enable advancement in personal Virtual Realities, which begin to take the market share from TV, radio, films, and other media.
2017/48 - first universal operational machine 'Harvey' constructed by a team lead by Peter Shor at Bell labs, with funding from IBM, Lycos, RedHat and Pepsi.
That bit from the timeline was probably written around 1999-2001 or so. Kind of amazing that out of the tech companies mentioned, RedHat is the one that's still doing pretty well.
The Red Hat retail-to-Enterprise shift still leaves a little bitter taste in my mouth. I remember when they shut down public access to the RHEL binaries and the beginnings of CentOS. All legit to the letter of the license and no more.
I get it though. Anyone know how CentOS under Red Hat is nowadays?
Better staffed and as healthy as ever. Also, Karanbir, the guy who runs CentOS and always has, seems a bit less stressed out. Notice how the release cadence and the speed of security updates in CentOS have increased massively.
It was probably the best thing to happen to CentOS provided RedHat will continue letting them be an independent distribution.
Still thriving, to my knowledge. Scientific Linux is also out there as well.
My recollection is that Red Hat was actually working directly w/ CentOS on some things to streamline their build processes. Red Hat doesn't see CentOS as a threat (as it did OEL).
> Red Hat doesn't see CentOS as a threat (as it did OEL).
Quite honestly I wonder if this was a big motivator for them to take CentOS under their wing. Why go with the totally unaffiliated community distribution when you can go with Oracle for free (somehow)?
I've been using CentOS for some time now, and have never been happier, as the release cadence has only gotten better since RH started supporting the project directly. There's no real reason I'd even consider Oracle Linux at this point as a result, considering neither comes with commercial support unless you purchase a paid support subscription.
Red Hat still releases the source RPMs which allow CentOS to build the distribution. No, they aren't giving straight binaries, but it's not totally necessary as long as you have the pieces you need to build them.
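Which means anyone can rebuild, roughly like this (the exact package file name is illustrative):

    # rebuild binary RPMs from a published source RPM
    rpmbuild --rebuild bash-4.2.46-31.el7.src.rpm

    # results land under ~/rpmbuild/RPMS/<arch>/

That is essentially what CentOS does at distribution scale, plus the trademark scrubbing.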
Just started working at Red Hat about a month ago. Absolutely love working for this company! Especially that most engineers use GNU/Linux of some sort. Great to see this!
"Just started working at Red Hat about a month ago."
I have something I'd like to ask you ...
Two years ago, at the RSA convention, I ran into a redhat employee and was chatting about (unix) and I mentioned that I was the founder of rsync.net with the expectation that he knew exactly who we were and what we did.
He had no idea and had never heard of us. I felt like I had uncovered an enormous flaw in my outreach and relations efforts if someone at redhat had no idea who we were.
But now I'm not so sure ... I have a feeling that somehow the redhat ecosystem and the people that use it are quite a bit differently focused than the HN crowd and people that are UNIX enthusiasts for the sake of UNIX.
The makeup of the employees is pretty diverse, and not everyone there is knowledgeable about every piece that ships. In fact, some solutions engineers are so hyper-specialized in their specific piece of the puzzle that they may not even be familiar with more than a few specific products, or pieces of products, that Red Hat ships.
It's not a bad thing though. It allows them to be super-expert at whatever it is they know and leverage that knowledge to assist customers.
So someone who is deep into encryption may not necessarily know what rsync does; he is not devops, after all, might never have maintained a server, and may not keep local backups (shame on them though) that would require rsync.
There are also marketing people and finance people, etc.
Yeah, there are quite a diverse set of people working at Red Hat. There are a lot of different areas to be involved in as well. For instance, I am in the marketing department and am a software engineer for www.redhat.com. Nothing to do really with the engineering of the product itself. I have knowledge of rsync.net (even emailed with you once I think) but that was for personal hobbyish reasons, zfs syncing of my desktop.
I could see many people never hearing of rsync.net though, until you have the need you usually don't know about stuff. Well, unless you are on HN, but even then I gloss over a lot of stuff that currently isn't in my "need" bucket.
> Two years ago, at the RSA convention, I ran into a redhat employee and was chatting about (unix) and I mentioned that I was the founder of rsync.net with the expectation that he knew exactly who we were and what we did.
That's a bit presumptuous, isn't it? While many people have heard of rsync, I've only heard of you because of Hacker News, and not everyone reads this site. When I first saw the URL "rsync.net" I didn't think of "backups"; I thought that you maintained rsync or provided some sort of support for rsync.
> But now I'm not so sure ... I have a feeling that somehow the redhat ecosystem and the people that use it are quite a bit differently focused than the HN crowd and people that are UNIX enthusiasts for the sake of UNIX.
I work for SUSE, not RedHat, but our culture is incredibly similar (and I collaborate with people from RedHat every day). The makeup of companies like us is incredibly varied, and it's not really fair to make generalisations like that.
I know quite a few people who work for RedHat and SUSE who are super obsessive about Unix and have hobbies that are massively tied to programming. But that doesn't suddenly mean that they will automatically know every startup that does cloud storage.
You aren't exactly a household name. Does it matter if random people haven't heard of you though? The important thing is people looking for offsite backups can find you.
At SUSE engineers use a wide variety of distros. There's no company policy against using anything (some of my coworkers even use OSX). But I'd say that a fair few use openSUSE because that's the distribution that we all help maintain.
What is so great about RH? When I had the opportunity to use it, it looked pretty bad, mostly because of the lack of current/decent software in the repo. Maybe their support is amazing?
Yes, support and peace of mind. Packages that make it into their repository are fully tested and supported by RH. Almost all of them are old but they are pretty much guaranteed stable and Red Hat will support the software until the EOL of whichever RH version you are running, even if the developers of the package do not support a version of the package anymore.
Sometimes being old is a double edged sword. For instance, RHEL7 has a lot of problems on AWS because the kernel is too old to properly support many of the virtualisation features AWS provides. On bare metal RHEL7 is great, though.
Hey, sure, I should have been more specific. The enhanced networking in AWS requires ixgbevf at least version 2.14.2. In addition, we had random hard locks with ext4 under the CentOS 7 kernel. Also, we had a lot of difficulties running Docker on EBS disks with the device mapper driver - random hard lockups. That's all I recall off the top of my head. All these issues disappeared upon moving to Ubuntu 16.04 on a later kernel. CentOS 7 was great for us otherwise; it just didn't seem to work out so well with Docker + AWS.
Red Hat is pretty good about back-porting features like this into the kernel and rest of the platform. It would surprise me if the issues you describe aren't on the roadmap to address.
I work for a company that sells software which runs on top of Red Hat/CentOS servers. As a result, sometimes we have to tell our clients that the issue is on the server level and that they need RH to take a look at it, assuming they're paying for Red Hat support.
FWIW, I've never heard any of them complain about Red Hat. I've had plenty of them complain to me that I'm sending them to RH instead of fixing their server for them (because that's what software engineers should do, right?), but they've never once complained to me about RH once they reached out.
I think this is a myth, or at best relative; not all enterprises accept old for stability.
Many enterprises want up-to-date and stable software, and since you are paying for stability, why shouldn't you get fairly recent and stable software?
Most companies, for example, don't consider MS SQL 2016 less stable than MS SQL 2008 or 2012; most users expect the same level of stability from the latest SQL Server as from earlier versions, if not more because it's newer and more advanced - it might as well be more stable.
I haven't used RedHat in a while, so I don't know how bad the situation is; I hope it is not too bad.
Nobody running a thousand of anything uses the first release of $X, whether $X is a database or window manager or kernel or router firmware
SQL Server 2016 may be better than SQL Server 2008, but enterprises aren't deploying it until it has baked a few months, and maybe sp1/sp2/some major rollup ships for all the bugs the suckers find at rollout
I have software that I need to run for about 5 years. It is certified on Red Hat Linux 6, not 7. I'm not unique. Heck, we had to keep copies of IE 6 running up until a year ago for one testing vendor[1]. I, like a lot of others, need stability and paperwork, not new.
Frankly, having a lot of new companies adopting a mantra of "move fast, break things" makes me want to 'upgrade' less and less. I need stuff working Monday at 1 PM for the weekly test offering, not some oops.
[1] Deep Freeze or virtual machines are your friend.
With things like Software Collections and IUS you get to run the latest stuff on a rock-solid base. I don't want to upgrade my kernel to get a newer Python or NodeJS release!
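A rough sketch of how that works on CentOS (the collection name is just an example):

    # enable the Software Collections repo and install a newer Python
    sudo yum install centos-release-scl
    sudo yum install rh-python36

    # start a shell with the collection's Python first in $PATH;
    # the system Python is untouched
    scl enable rh-python36 bash
    python3 --version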
Software Collections are honking great, I'm actually looking at moving one of my Python 3 apps from Fedora to CentOS purely because it requires a version of Firefox that supports the Java plugin (selenium script that has to use a page with a java applet...) - since RHEL/CentOS stick with ESR releases I'll at least have 18 months to hope the applet goes away.
Ditto this. We run the power grid on RHEL and the latest and greatest features are far down our list of wants. Stability, security, and support; yeah, we'll take those. ;)
I support both Ubuntu LTS and RHEL in production (multiple releases each) and RHEL has been rock-solid for years while there's constant trouble with Ubuntu.
I thought what makes them RH was their support contract asking for money for (unsupported) CentOS installs, forbidding republishing kernel patch series if you want to continue to be supported, and the like.
But I've read that text, and have been utterly converted to RH. RH is the ultimate good, go with them, and accept their demands.
Came in 4.9 and 4.8 (the commit provides the building block).
And I also think Dave Chinner (works at RH) put a lot of effort into it? (but I'm not 100% sure)
OT: Does anyone know how to get an open-source kernel module signed by RedHat and be included in the CentOS kernel? Do they have a form/contact for this sort of request?
I really like the products not projects mantra. It kind of explains both the problems that open source projects have, but also the failure of so many Google projects.
The supported, RHEL-released version? No. I have used Katello, though we ended up just going with Foreman+Ansible, since the lack of errata in the CentOS update feed made all of the extra features pretty useless.
What problems exactly have you seen? If I was running a proper RHEL shop I would pick it up in a heartbeat along with CloudForms.
Redhat: I clicked on the link and was presented with a wall of words. Oh and there was a bloody great piccy above the words that I had to scroll past. It looked a bit rubbish. Is that really the best that a multi billion dollar company's S&M department can come up with?
Back in the day, after I'd farted around with Slackware, Yggdrasil, Mandrake etc I'd come back to RH but that was a long time ago and now is now.
Should I bother to read the blurb ... had a go ... had another go ... ... got bored ... didn't finish - soz.
I believe that RH are an important component of the (GNU)Linux ecosystem but I am not exactly fired up by this effort. I am fired up by software in general and Libre software in particular.
OK ... another go at the marketing thing ... oh, apparently RH are the leading force in open source software.
Piss off.
Open source software looks after itself.
Feel free to include say the kernel devs amongst your "life blood"
Feel free to read LWN's articles that describe how stuff is done
There is no doubt that RH has contributed to open source software in a significant way but they are not on my radar these days.
Red Hat, or the Microsoft of Linux, is the facepalm of Linux distros. Forget dependency hell; how on earth does someone choose RPM over the first-class, superior Debian package system, or distros like Ubuntu? People just buy it for the support, even if the Red Hat they use has GNOME 2 and Firefox 3.0. Then finally, the RHCA/RHCE courses: I should get certified on this distro to prove my Linux skills to companies, even though I am a Debian guy? No thanks.
Why would you use Ubuntu with its repository of stale packages and over-engineered package system when you can use pacman on Arch? On Ubuntu you end up having to add a bunch of PPAs if you need up-to-date software for any reason, and then it's a nightmare to maintain. People just use it because of its convenient point-and-click interface and config utilities. Why should I learn to use a mouse even though I'm a command-line ninja? No thanks.
(Note: satire. I use and love both Ubuntu and Arch.)
> how on earth someone choose rpm over first class Superior Debian package system
So, what makes the Debian package system superior? Having built multiple specs in the past, but not any debs, I'm curious. Given that you can often convert debs to RPMs and vice-versa, and can use the different package managers on either type of repository, it's not obvious where one has any specific benefit.
I've built both Debian packages and RPMs, and I feel that RPMs are somewhat easier to wrap your head around. It's been a while since I did Debian package building, though.
It's also undeniably true that there was a time when APT + dpkg blew Red Hat's "equivalent" tools out of the water, but that's not the case anymore.
One thing I like in particular about dnf/rpm is that I can install packages by capability, so e.g. telling yum to install perl(Net::DNS) will work and installs the perl-Net-DNS package. I'm not sure if apt can do this.
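In case it's unfamiliar, a quick sketch of that capability-based install:

    # dnf resolves the Perl module name to the owning package
    sudo dnf install 'perl(Net::DNS)'   # ends up installing perl-Net-DNS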
This is my biggest gripe with Debian packaging. Also, the use of makefiles and the magic that goes on behind the scenes with debhelper (which you probably want to use) makes packaging much less transparent.
It took me days to grok how to make a proper Debian package nearly a decade ago when I first did it - writing my first .spec file took an hour or so, and now I can hammer most of them out in 15 minutes or less.
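For comparison, a skeletal .spec of the kind described (every field below is a placeholder, and it assumes a standard autotools build with the tarball in place):

    cat > hello.spec <<'EOF'
    Name:     hello
    Version:  1.0
    Release:  1%{?dist}
    Summary:  Example package
    License:  MIT
    Source0:  hello-1.0.tar.gz

    %description
    Minimal example.

    %prep
    %setup -q

    %build
    %configure
    make %{?_smp_mflags}

    %install
    %make_install

    %files
    %{_bindir}/hello
    EOF

    rpmbuild -ba hello.spec   # builds binary and source RPMs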
The software tools and packaging formats exist only to implement that. It's Policy and the comprehensiveness of the Debian archive (60k+ packages last time I checked, a year or so back), which make it bliss. You simply rarely have to go outside it, and you can configure a system as lean or rich, or anywhere in between, as you like.
What Policy states is what the package maintainer (not the user) must or must not do, including requiring manpages (though a missing manpage is not a release-critical bug), so that documentation is standardised. The fact that /usr/share/doc/ has every single installed package listed in it, and the directory names don't include the installed version number, as RH's still do, means that you can use that directory to recover or rebuild from various packaging failures. (Why would you want to do that, you ask? Same reason you'd want to have a doctor or hospital nearby if you had a broken leg - it's not so much what you'd planned on, but it's damned convenient when you find yourself in that situation.)
Joey Hess and Martin Krafft (see the latter's The Debian System) both expanded on more elements of Debian packaging vs. other systems, particularly Red Hat / RPM.
> The fact that /usr/share/doc/ has every single installed package listed in it, and the directory names don't include the installed version number, as RH's still do, means that you can use that directory to recover or rebuild from various packaging failures.
Why wouldn't you want the version numbers? That seems like something you would want when trying to rebuild a package database. Also, what do you do if you have two versions of something installed?
1. Generally you rebuild to the latest available versions of your specified release. APT sorts version deps as it does package deps, so you don't have to.
2. You can reference content under /usr/share/doc/ without having to specify a goddamned version number that a) changes arbitrarily and b) is inconsistently specified every fucking time.
Version numbers are metadata, not directory name elements.
Changelogs, also required, give you the info if you desperately need it.
Off the top of my head, in the last few years I have (and you have also probably) used the following to get software: rpm/yum, deb/apt, pacman, ports, SysV R4 (Solaris), IPS, Homebrew, pkgin, npm, gem, pip, hex, cpan, and hmmm, probably a few others.
They all do pretty much the same thing; one might have prettier progress bars or a better CLI, but we're just talking compressed tars with some metadata files.
Competition is good and all that, but how much innovation do we need to drive in package managers? We have lots of tools all doing the same thing with negligible advantages over their competitors and I don't see it as a positive.
I have built both debs and RPMs, and I like RPMs more. Neither is perfect though (still so much fiddling around with Bash in pre/post scripts for doing anything more than extracting a bunch of files).
I've actually never found that to be true, mostly due to a weird cognitive disconnect between the desired outcome and the actions you need to perform.
Either use separate commands for different actions, or distinguish actions with arguments, but don't do both. E.g. apt-get install, apt list, apt-file update.