It's interesting that as we enter 2017, the most popular configuration among users is Windows 7 (45%), 1366x768 resolution (33%), and CPUs with 2 cores (70%), and a majority have 4 GB of RAM or less.
I have a ThinkPad Edge (the cheap version) from ~2012 with a Core i5, 8 GB of RAM and an SSD. It's incredible that it's still top of the line in these stats. PCs aren't dead; the problem for the PC industry is that old PCs are good enough, hence people don't feel the need to upgrade. Of course mobiles are now more attractive for the industry, because most of them have a life expectancy of about two years.
And it's funny that Windows 7 is still the most popular OS, even with Microsoft's aggressive upgrade tactics.
> the problem for the PC industry is that old PCs are good enough, hence people don't feel the need to upgrade.
Throwing away a working electronic device to replace it with a new one is really bad when you think of the ecological impact. I'm glad that some resist the urge to buy shiny new stuff.
I am running 5- to 16-year-old PCs on my network and they play 99% of the games on Steam.
Stuff I purchased in 2000 is still running like a CHAMP. ASUS motherboards (half died in the first year and were replaced). Once they make it past one year they seem to run forever. I have upgraded components like the disk drives, and a few of the processors from a 2.3 GHz Core2 Duo to a blazing 3.0 GHz Core2 Duo. And I replaced the GPUs with what I can as friends upgrade theirs and sell their old ones for cheap. Now that they are all running Windows 10 64-bit, I can put up to 8 GB of RAM in them. My kids love it; they can have 4 friends over and play Killing Floor, Men of War, and Terraria.
I mean for the average use case, a 5-year-old PC generally has enough compute capability.
Gaming requires updating (though 32nm CPUs are still perfectly fine).
Scientific computing will move to the cloud soon enough, as you either need absurd capabilities, or a recent laptop is generally powerful enough for data science (depending on the use case).
Hell, I dusted off a 10-year-old Core2Duo HP laptop and it does most of what I need it to (browsing, Jupyter notebooks, etc.) fine. The one common app that's painful to use on it is Facebook, along with other sites that became a sort of monstrosity (the ones with 40 NoScript dependencies you need to whitelist to see content).
I've had my HP workstation for 5 years now. I found an xw8400 workstation off eBay for around $400. Oh sure, the specs are a little lower than what you'd find on newer PCs:
- Dual Intel Xeon X5355 processors at 2.67 GHz
- 16 GB of DDR2 RAM
- Two 500 GB hard drives in RAID config
- NVIDIA Quadro FX 4600 768 MB video card
But I can expand the RAM out to 32 GB, and I've already upgraded the RAID drives to an SSD. I can also upgrade the video to a 1.5 GB or 2 GB card. I'm still impressed how stable it's been and how even the 768 MB card handles Adobe resources without any lag - I constantly have two or three Adobe programs running at a time, which is normally a huge resource hog.
I just fell in love with the performance and the stability of the HP workstations, for a fraction of the cost. And it's easily upgradeable, so I don't feel like I need to go drop 1K on a new laptop or PC. Also, with such an abundance of these on the market, you can always find a really good deal on them.
Yep, and ECC RAM starts getting pretty important above 8 GB if you value stability. Workstations typically depreciate like bricks, so a last-gen system can be had for a song if many cores is what you're after. A short while back, some Xeon engineering samples were flooding eBay and were a better deal per-core than whatever the latest i7s were at the time. I forget the model number.
I've been running 32 GB of RAM for 5 years without the slightest stability issue. ECC is probably only for really, really big machines (128 GB and up), or it's an overrated myth altogether.
Yeah, me too. I think if one needs the RAM for running VMs (so, not doing ZFS or running Prod databases), paying extra for an ECC enabled architecture isn't the greatest investment.
Could you tell me what I gain from workstations compared to standard dev boxes?
- Does the number of cores affect general performance or only helps in multitasking and data science?
- Is ECC memory worth it (in practice, not in theory)?
Server architecture: more CPU cores, more cache (resulting in better performance per clock period), detection of RAM errors (1-bit and 2-bit) and correction of 1-bit errors, built for non-stop operation.
ECC architecture is worth the money wherever correctness of the computed results or the integrity and stability of running programs is imperative: scientific and engineering calculations, database systems handling precious information (money), etc.
In practice, it seems ECC in a personal computer is not necessary for most people, but a 2009 study of DRAM errors in datacenters found ECC to be very beneficial:
"Across the entire fleet, 8.2% of all DIMMs are affected by correctable errors and an average DIMM experiences nearly 4000 correctable errors per year"
Outside of the rare socket 2011 i7s (the only real i7s, IMHO), you get more PCIe lanes and significantly more memory bandwidth with Xeons too. The extra PCIe lanes can matter for SLI setups.
But this is where Intel irritates me. I want a socket 2011 CPU with both an unlocked (or low-core, high-clock-rate) multiplier as well as ECC RAM. For some reason, Intel refuses to make such a product.
I was looking for something similar and indeed there isn't much choice in that department. I eventually switched to the high-core-count, lots-of-performance-per-buck E5-2670, especially considering its low price. But if you want high single-thread performance, take a look at the E5-1650 or, if you have an unlimited budget, the E5-1680 v2 (12% faster).
The latest v4 parts have turbo frequencies again peaking at 4 GHz, which, compared with the i7-7700, is a 12% frequency deficit. If you count in a couple percent between Haswell and Kaby Lake, it's probably at least a 15% single-thread advantage for the part that costs far less.
Hence the unlocked 2011 parts, which people routinely run up in the ~4.5 GHz range (myself included) with few issues. The only point of instability seems to be the larger cache, which can have its multiplier independently limited, at which point the stock voltages suffice for a significant clock rate increase. Why Intel couldn't bin and sell a 4.2 GHz 2011 part like Devil's Canyon is a mystery.
Thanks a lot. Does it make sense to have more cores vs faster cores if I only plan on using IDEs and doing general development? No video editing or doubling my machine as an application server to serve my personal website.
The DIMM error data and ECC point you bring up are very interesting. Thanks for that link.
Take this with a grain of salt, since this is mostly the marketing plus what I've seen in my own experience. Of course, this can differ wildly from box to box, so take that into account as well.
>> Does the number of cores affect general performance or only helps in multitasking and data science
More cores means more data can be processed at the same time. For a gamer, even if you have 20 cores, it doesn't help you because they're after speed. For myself, it's being able to run several Adobe products doing video editing, photo editing, and graphic design, all on the same machine, at the same time. Think of it as workload vs. speed. Workstations are more powerful and built for heavy workloads, whereas gaming PCs are mainly built for speed.
And yes, the main application is in crunching big data sets, CAD design, or anything heavy on the graphical side of things. I definitely saw a big swing in stability when I switched to a workstation with the myriad of Adobe products I use.
>> Is ECC memory worth it (in practice, not in theory)
So in theory this is the advantage of ECC:
ECC (which stands for Error Correction Code) RAM is very popular in servers or other systems with high-value data as it protects against data corruption by automatically detecting and correcting memory errors. Standard RAM uses banks of eight memory chips in which data is stored and provided to the CPU on demand. ECC RAM is different as it has an additional memory chip which acts as both error detection and correction for the other eight RAM chips.
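The "detect and correct" behavior described above is usually built on a Hamming-style code. Real ECC DIMMs use a wider SECDED code over 64-bit words, but a toy Hamming(7,4) sketch shows the core idea; the function names here are just illustrative:

```python
# Toy Hamming(7,4): 4 data bits protected by 3 parity bits.
# Real ECC RAM uses a wider SECDED variant of the same idea.

def hamming_encode(data):
    """Encode 4 data bits as a 7-bit codeword: p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4   # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_correct(codeword):
    """Detect and fix a single flipped bit; the syndrome names its position."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 0 means no error detected
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the bad bit back
    return c

code = hamming_encode([1, 0, 1, 1])
corrupted = list(code)
corrupted[4] ^= 1                     # simulate a single bit flip
assert hamming_correct(corrupted) == code
```

The extra chip on an ECC DIMM stores those parity bits, letting the memory controller silently repair any single-bit flip on read.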
In practice, I can say it for sure helps your system be a lot more reliable and stable. I've been running my workstation for several years without a power cycle and it's been rock solid: no issues, no hiccups, no nothing. This is the main advantage of workstations: their reliability. I also like the idea in some models of stuffing in up to 128 GB of RAM or running dual video cards.
In the end, with so many people opting for laptops and gaming systems, you can get your hands on a hell of a workhorse PC for under $500. The processors and video card are still dirt cheap (around $100). The only mildly expensive stuff is the RAM. I've had really good luck with the HP workstations, but that's just me. I'd say dig around and see what you can find. It's also pretty easy to build one of these from scratch since the parts are so widely available.
Thanks a lot. That was a very detailed experience you provided. Also, is there something I might miss on a workstation-focused motherboard compared to a consumer one? Like dedicated audio cards, SLI, etc.
I do plan to research before spending but want to get a general feel before going in.
That sounds like a lovely workstation. I had a dream machine of overclocking a dual x5670 xeon machine, but motherboards that let you do that now are too rare and expensive.
...people don't feel the need to upgrade. Exactly!
I have similar specs on my Asus from the beginning of 2012: Core i5, 8 GB of RAM (originally 4), an SSD (I replaced the DVD drive), and NVIDIA GT 540M (2 GB) graphics. I bought it for ~500 EUR. If I browse laptops within the same price range, what do I get? Core i5, 8 GB RAM, NVIDIA 920MX (2 GB), HDD, and I don't even get an HD screen (or a touch screen - I would like to try those). I would buy a new laptop, but I would have to pay double the price to get something that is much better than my current one. Until this changes, I will stick with the old one.
The difference in performance between a recent laptop and a 2012 era one is marginal, but the difference in power consumption is considerable.
My 2012 Asus has an i5-2410M, which has a Passmark score of 3154 and a TDP of 35W. My recently purchased Dell XPS 13 has an i5-6200U, which has a Passmark score of 3926 and a TDP of 15W. A 25% improvement in raw power, but a 190% improvement (almost a factor of 3!) in performance per watt.
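The factor-of-3 claim checks out against the comment's own Passmark and TDP figures; a quick sketch (`perf_per_watt` is just an illustrative helper):

```python
# Passmark score per watt of TDP for the two laptops mentioned above.
def perf_per_watt(passmark_score, tdp_watts):
    return passmark_score / tdp_watts

old = perf_per_watt(3154, 35)   # i5-2410M (2012 Asus)
new = perf_per_watt(3926, 15)   # i5-6200U (Dell XPS 13)

print(round(new / old, 2))      # ~2.9, i.e. almost a factor of 3
```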
The reality is both are fine for day to day browsing. The real difference is that the power difference means the Dell has a smaller battery, so can be thinner and lighter. Given that I use my laptop plugged in and with a monitor 99% of the time, there's no real reason to upgrade.
> 2017 and the most popular configuration for users are Windows 7 (45%), 1366x768 as resolution (33%), CPUs with 2 cores (70%) and a majority have 4 GB of RAM or less.
Not a knock on you but...surprise surprise Silicon Valley, the world isn't full of MacBook Pros.
> because most of them have a 2 years of life expectancy.
Mostly from OEM "neglect" rather than customer wishes. I used an Android 4.0 tablet until recently, and only had to give it up because the USB port gave out, so I can no longer charge it.
The whole 2 year replacement cadence reeks of planned obsolescence.
I disagree. Old PCs are awful. The problem is that a good chunk of new PCs are just as awful. The sub-$400 laptop space is filled with low-spec machines that still have 1366x768 resolution screens, low RAM, and slow processors. There has been very little spec change in that space for a long time.
Windows 7 is still popular because most companies have not upgraded to Windows 10 yet. But it is starting to happen. I expect an explosion of Win10 users in the next year.
> It's interesting that we are entering 2017 and the most popular configuration for users are...
But the stats cover Firefox users, not average PC users.
I'd guess there's a lot of overlap between (people savvy enough to keep old PCs in service) and (people savvy enough to download a browser that didn't come built-in to the system).
In particular, if you're using an older PC to browse the web, you're pretty likely to want to do some selective content blocking. I believe Firefox has more plugin options for that than the other big contenders.
Similarly, Firefox may be the best choice on Windows 7. But on a brand-new Windows 10 PC, many people may be satisfied with built-in Edge.
Even I'm no longer fond of upgrading my OS. Still refusing macOS Sierra. My PC is on Windows 10, but honestly, Windows 7 did the job pretty well. My parents hate every change on their devices. So does almost everyone I know who's not an enthusiast. So yeah, I agree. 95% of people need very basic hardware and a small, simple OS.
> the problem for the PC industry is that old PCs are good enough
Yes. I use a T61 with a 2.5 GHz Core2Duo, 3 GB of RAM, and an SSD for browsing, Python and some image editing. Yes, it could be a bit faster, but overall it's fine and I don't see any reason to upgrade.
Actually I don't want to upgrade, because I love the 14" screen with its 1400x1050 resolution. It's perfect for reading and programming.
To be honest, it's not that unusual to see the OS still being used: with all the things Windows 10 runs in the background that can't be turned off, it can get rather slow. Plus it hates 2 GB of RAM or less.
With 70% having 4 GB of RAM or less, I hope this makes it clear to the Mozilla people that the 20% memory tax of Electrolysis is simply not acceptable. I'll take lower memory usage over the benefits of a multi-process browser any day.
Most people probably don't need to run VMs. Of the people that do, most probably run GNU/Linux, which is usually fairly lightweight. So the only memory-heavy app is their web browser.
I wonder what the breakdown is between homes and offices? My office computer (I'm an adjunct instructor) is slightly below those specs and slow as crap.
I never understood the logic behind trying to charge for the upgrade after giving it away for a year. Who was seriously going to pay for it after clearly going out of their way to avoid updating prior to that?
Shipping Win64 builds by default to a wider group of users is a high priority in 2017 (stability being a big driver of it), but there's still some bugs blocking it. The work is being tracked at https://bugzilla.mozilla.org/show_bug.cgi?id=558448 and various dependencies.
FYI, this default has been flipped on Nightly, and that change should make it in to the version 53 release in April (barring any huge problems found in the 64-bit build).
For Linux and macOS it's been available for a long time.
I've also used the 64-bit Firefox on my Windows for some time now, but it's not the default download that they offer. See it here: https://www.mozilla.org/en-US/firefox/all/
In fact, 64-bit Mac will be the only version supported as of Firefox 53. This will halve the Mac installer size. 32-bit and Universal Mac builds were dropped in Firefox bug 1295375: https://bugzil.la/1295375
My first thought upon seeing this is that I was not aware that Firefox sends this kind of data to Mozilla.
Under "how is the report created" [1] it says that "Firefox automatically collects information [...] unless users disable this collection". This seems to be in conflict with Mozilla Telemetry FAQ [2] that says that data collection is disabled by default for release builds.
In fact, on Firefox 50.1.0 on Debian I no longer see the "Share additional data" setting mentioned in [3] - see screenshot in [4]. Does this mean that Telemetry is now enabled by default and can't be disabled?
The first time you start Firefox, it gives you a notification (bottom bar, I think) along the lines of "we're collecting stats for improvement" with two buttons, "Ok" and "Disable". IIRC, it defaulted to enabled, but didn't appear sneaky to me at all.
Exactly. It's easy to opt out, as long as you're paying a bit of attention when you first start up Firefox. It's also easy to opt out later if you are comfortable navigating the preferences dialogs.
If anything, I am interested in the bias this data will have since I suspect many advanced users instinctively opt out of data collection policies. I opt out of anything that I can, and I would opt out of more if the options were readily available. Since advanced users are more likely—in my estimation—to have higher-specification hardware, I suspect the data is slightly biased toward the low-end.
I consider myself an advanced user and I explicitly enable sending telemetry and crash reports in Firefox. I don't do that on Chrome, but I prefer doing it on Firefox. The point I am trying to make is, although I agree that some users may opt out, many trust Mozilla and so would usually let it collect data like this.
A good point. If there were one major organization that I'd feel comfortable sending telemetry data to, it may be Mozilla - the only organization that makes respect for its users' privacy a paramount concern.
(I more or less instinctively opt out of data collection and have never enabled it in Firefox.)
Even if you don't trust Mozilla, the list of things Firefox collects as telemetry on release is extremely limited. (You will have to trust that the binary you downloaded is built from source, though, but you can build it yourself and/or turn the pref off if you care that much.)
Mozilla collects a limited set of aggregate data by default, but it can be opted out. Users can also opt in to send a more extensive data set (also aggregate AFAIK), but that's not the default.
The screenshot from your preferences section is odd - all the boxes are there on Windows and macOS for me. Perhaps you can file a bug on bugzilla.mozilla.org.
And AFAIK distro builds of Firefox do not send telemetry to Mozilla. Thus Mozilla has limited insight into the Firefox Linux population, such as how many are running 32- vs 64-bit builds.
Ouch. I'm pretty sure that here in Germany they could be sued for that. Regarding privacy law, there is a general consensus here that opt-out is too weak in almost all cases. You need opt-in, or in some cases double opt-in.
Also, I wonder why Mozilla is playing the game that way. Aren't they afraid that actions like this may damage their reputation?
On the other hand, this may simply be the continuation of their "pragmatic" policy that led them into feeding Google Analytics with their sites and making Google the default search engine (instead of privacy-friendly alternatives).
What you say is only true for personal information, for which you do need to opt-in in Firefox. Broad stats about the users' system wouldn't fall under that. You already leak most of this information to every web site fwiw.
AFAIK you only need opt-in for personally identifiable data.
Firefox would also definitely not be the first piece of software to run in Germany that has opt-out instead of opt-in. Most software does it that way and usually much sneakier than Firefox.
"Existing hardware reports (such as those from Valve and Unity) are excellent, but represent a different group of hardware users than the majority of people who use the web."
Steam users are not average PC users. Believe it or not but gamers are actually just a small percentage of PC users. Most people don't game on their PC, if they even game at all. I'm talking of course about non-browser games.
A good example for this is that AMD actually beats NVidia in graphics market share in the Firefox survey. Of course, Intel beats them both, which is clearly a result of laptops and low-end form factors.
Similarly, just look at the absolute dominance of 2-core systems on the Firefox survey!
Of course it is. It was a good OS and when offering the Windows 10 upgrade, Microsoft forgot about Vista and XP users.
So without a (free) upgrade path, I'm pretty sure Windows XP will linger on for quite some time. And even with a free upgrade, I'm not sure those users will want it, because most users are conservative.
Or these users are still rocking a single-core CPU with a piss-poor GPU that performs terribly on any new OS, even a modern Linux distro. My sister still insists on using a laptop like this, so WinXP it is...
It's not that simple for old machines. How about playing a video? Any new video player uses a 3D API for displaying/blitting. Guess what? This kind of machine has terrible OGL/DX performance and can barely play even the simplest .avi. WinXP and its typical software still have support for overlay video and whatever was used back then, so video performance is acceptable for non-HD stuff.
What about Office? Wine is doable for older Office versions, but the performance hit is a lot more apparent on such a Pentium 4-era single-core machine.
I've also found that lightweight distros are not enough in the CPU department. Their RAM usage is great - you can easily find something that uses only 100 MB after logon, which might be even less than WinXP - but since it's all based on modern software, you will quickly find that even the simplest of tasks, like navigating with the file browser, is a lot laggier and more CPU-intensive than the WinXP software that was designed for Pentium 4-era CPUs...
Sure you could go deeper with more barebones distros and/or only use the terminal, but then that machine is nearly useless for a common user, and may require a lot of setup and tweaking.
I think it is possible to do a mix. For example, there are several file managers that are not heavy; I expect Thunar to work well on older machines, pcmanfm as well. For Office one could use something smaller like AbiWord, though I never tested whether the performance of LibreOffice is worse than Microsoft Office's.
My experience with such older machines is really that when using a good software selection, it was a lot faster and more comfortable than Windows XP. However, you might be right about video. I have a ThinkPad R50 here and I was indeed not able to get GPU support for video playback from its Radeon 7500, with the CPU being too weak to handle even SD videos. I was not aware that was a general problem XP solves better (this specific machine I never used with Windows) :/
GPU-accelerated GUIs are one of the dumber fads out there. Motif and FVWM run so much faster without a GPU than Mir/GNOME or modern Aqua or Windows 10 could ever hope to, even with a nice GPU.
SlimJet[1][2] (formerly known as SlimBoat) is based on Chromium and still supports Win XP SP3. They added a lot of Firefox-like features like sidebar, customizable toolbar buttons, and download manager.
It also supports Chrome extensions. (No affiliation with them, I just use it to test my sites on XP machines)
If you're asking about Edge, Microsoft is in the business of selling Windows, with Edge being one of the bones thrown to get users to upgrade, so not much incentive there.
On my ancient office computer I have taken to using K-Meleon. It is a bit weird, but it doesn't make the laptop's fans run at full blast like Chrome or Firefox do.
While the data isn't broken down by region, the disparity between the hardware listed here and what most modern web developers will be using on their own machines is astounding. It seems very easy to make web apps that will be pretty unusable on the most common configurations.
Zooming (Ctrl+, Ctrl-) in Firefox is terrible, and with a 4K screen zooming is a key feature. I quit using Firefox because of that. Chrome's zoom actually works.
I just changed the layout.css.devPixelsPerPx preference in Firefox. No need to mess with zooming anymore. FF should read the screen density automatically, but as annoyances go, this is a minor one (just set and forget).
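If you want a starting value for that preference, a rough rule of thumb is the panel's physical DPI divided by the 96 dpi CSS reference. A small sketch (the function name and the 13.3-inch 4K example are just illustrative):

```python
import math

# Rough starting point for Firefox's layout.css.devPixelsPerPx:
# physical DPI of the panel divided by the 96 dpi CSS reference.
def suggested_dev_pixels_per_px(width_px, height_px, diagonal_inches):
    dpi = math.hypot(width_px, height_px) / diagonal_inches
    return round(dpi / 96, 2)

# e.g. a hypothetical 13.3-inch 4K panel
print(suggested_dev_pixels_per_px(3840, 2160, 13.3))  # 3.45
```

From there you can nudge the value up or down in about:config until text is comfortable.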
It actually used to do this automatically, but it got more complaints from people who were surprised that it did so than compliments from people who were thankful that they could read things without squinting.
Same reason as why Google has buried open standards like RSS or XMPP based GTalk.
Seriously, I do not think that Google's stakeholder values are the same as those of the Mozilla Foundation (which also has a corporation, but mainly for tax and employment reasons).
I'm wondering the same thing. Nearly every other metric I've seen puts desktop Linux between 3% and 4%. I'd imagine that paranoid Linux users would be undercounted because they're less likely to let Mozilla collect stats on them. And of course, Firefox stats wouldn't include ChromeOS.
Still <1% seems awfully low. We're rare but we're not that rare.
I think more and more distros ship some build of Chromium these days, or perhaps Iceweasel/IceCat (the GNU "fork" formed after the debacle with Debian over trademarks).
I was wondering if maybe common Linux distributions ship with a different default than the Windows builds, but all I could find is that telemetry wasn't working with Ubuntu Firefox for a while in 2016[1].
Companies like 3dfx (which was once one of the best, and very notable for their Glide API) got bought by Nvidia. Imagination Technologies (IMG) left the PC GPU market to focus solely on mobile GPUs (the PowerVR series). On mobile there are still plenty of companies making GPUs, like IMG (PowerVR), Qualcomm (Adreno, which originated from ATI's mobile division), Nvidia (ULP GeForce), ARM (Mali) and Intel.
There are VIA motherboards with a built-in CPU and GPU, for one (I had one of these for a small NAS, which could have been used for a very lightweight computer as well).
Raspberry Pi and stuff like that use other brands of GPU as well.