LED and CFL lights, SEER ratings, LEED certifications, and Energy Star ratings are all having an impact.
Just in computers there have been huge advances. My MacBook Pro requires about 1/6 the electricity of my 2007 Desktop, and my LCD monitor uses about 1/6 the electricity of a CRT.
Transition to EVs may start to soak up all the gains we're making in efficiency so the usage may start to tick back up. But we're building Solar and Wind sources at an increasing rate that will effectively start trading oil for renewable electricity.
> My MacBook Pro requires about 1/6 the electricity of my 2007 Desktop
Apples to Oranges though (or Apples, but still). The comparison should be a laptop from 2007. I ran a house on solar electricity and wind power back then; my laptop was about 35 watts according to the e-meter connected to it, and my present-day laptop is roughly the same.
Laptops have always been substantially more efficient than desktops.
> Laptops have always been substantially more efficient than desktops.
Yes, but a decade-plus ago they weren't a replacement for a desktop, so you ended up having both. Now even mid-range laptops (or "low-end" Macs) are totally feasible as replacements.
I bought my last desktop in 2010 and I can't imagine buying another one.
> I bought my last desktop in 2010 and I can't imagine buying another one.
I still do 95% of my development on a desktop, and I just can't see ever not doing that. I'm older now, and a good desk, good chair, and 3 big screens take up enough space that a tower isn't taking up much more.
That, and I can put hardware into a PC for half the price of a MacBook Pro that will absolutely annihilate the laptop on performance.
Portability and power efficiency do impose a constraint.
For me, the trade-off of having two machines is worth it; having one machine that is permanently hobbled by being designed for a use case I rarely need makes no sense.
> I'm older now, and a good desk, good chair, and 3 big screens take up enough space that a tower isn't taking up much more.
I do the same, 95% of my development with a MacBook Pro with 2 big screens, keyboard and mouse attached. I don't even touch or move it on most days.
A desktop would be much cheaper - but I would still need a decent laptop for the 5% of the time when I'm not at my desk, so the savings aren't that much any more. And keeping development environments, VMs, config files, keyboard layouts, etc. synchronized on two separate computers is too much extra work.
I have an oldish but fairly beefy Vostro that does in a pinch. Keeping things synchronised isn't that hard in practice, since the Vagrant files and project files all live in git. There is some overhead, but I have that anyway since I have a desktop at the office and one at home.
> keeping things synchronised isn't that hard in practice
I disagree; I think how difficult it is to keep things synced varies significantly between users. Sure, it's easy if your work is consistent and you're disciplined, but if you find yourself switching between very different projects frequently, the overhead of syncing machines and setting up software repeatedly can become significant.
Each client gets a "team" in Bitbucket, and orchestration becomes no harder than:

    git pull && vagrant up
It works great on projects that have a simple structure (webserver/db) and projects that have complex configurations with ES/Redis as well.
The trick is to always treat configuration as code and never make manual changes to a VM (or more correctly, always make a manual change then a change to configuration code).
As an added side benefit, I can pull a project from two years ago down and be up and running in a fraction of the time without polluting my host OS.
Even when that project is currently on 14.04 with particularly weird dependencies (exactly this version of wkhtmltopdf, etc.).
Longer term I'll probably look at docker or something but vagrant/ansible has worked out great (some projects aren't even using ansible, just a bash script).
Fair enough, but... my 2010 desktop was a lot more energy efficient than my 2005 desktop. And my 2017 desktop is probably going to be made from what are effectively laptop parts in a brick the size of a large hardback, with the power supply external like a laptop.
So while most people are moving to laptop-only, even us hold-outs are seeing more energy efficiency in desktops.
Power states and performance per watt have improved a lot too. In 2007, power gating CPUs was not commonplace, and Intel was just starting to optimize for performance per watt (especially in the low-power laptop/tablet market).
What vendors have done is provide computers with the same power envelope but much more performance, so they may do your calculations at 35 watts and then immediately cut all power to large swaths of the CPU, curtailing power use by at least a few watts, if not 10 (depending on your CPU).
>Yes, but a decade-plus ago they weren't a replacement for a desktop, so you ended up having both. Now even mid-range laptops (or "low-end" Macs) are totally feasible as replacements.
You obviously don't play video games, do CAD or video stuff or develop and compile large software. My desktop is never replaced by my laptop.
Video games run perfectly reasonably for a casual player, and even some low-level competitive players, on a mid-spec laptop. Not everybody demands that the latest Crytek game run at 60fps on extra-high settings 100% of the time. Aside from that - most people don't do those things either. A laptop is totally feasible as a replacement for the 99.99% of people who could also get away with a 5 year old desktop.
True; however, desktops are getting more efficient, too. You can have an impressive amount of computing power (GPU and CPU) with a couple hundred watts.
Just a couple of years ago I had an 800W PSU to be able to run modern games. Now I have a mini-ITX machine with a 450W power supply and a lot of room to grow. I'd estimate actual power usage to be 50-60% of the nominal capacity at peak.
It's hard to compete with a laptop in power consumption, of course. My CPU's TDP is higher than 35W. It's an i3, so Intel says 51W TDP. It will still run circles around the laptop's CPU.
> desktops are getting more efficient, too. You can have an impressive amount of computing power (GPU and CPU) with a couple hundred watts.
Jeeze, my desktop, a full socket 1150 Core i3, is under 100 watts. Heck, I have a file server with twelve 3.5" hard drives and a 4 core socket 1150 Xeon that draws under 200 watts with everything spinning.
The big competitor really is AWS, especially with their multi-gpu instances. For gamers not really an option to have their GPUs at the end of a wire but for a lot of other applications it makes good sense.
>Laptops have always been substantially more efficient than desktops.
No, they haven't; not on a MIPS/Watt basis. A laptop-to-desktop comparison is entirely valid. I use a laptop now, and a desktop in the past; I do the same work, and I get better performance out of my modern laptop. I even have dual 24" monitors on the laptop (docking station).
Demanding a comparison between desktops of the mid-2000s and desktops of now is like demanding a comparison between the fuel consumption of a Corvette from the 1980s and one from now, instead of recognizing that I can buy an economy car now that gets better performance than the 1980s Corvette while also getting much better fuel economy. If you're comparing on the basis of performance and capability, you don't need to stay with the same class of device, because capabilities and performance have changed so much over time.
Why? When we're talking about overall usage, functional replacements are far more relevant than direct device-category to device-category comparisons. In much the same way, the fact that my mom has replaced her desktop with a tablet is the directly relevant comparison for my mom's "electricity usage due to personal computing".
> Apples to Oranges though (or Apples, but still). The comparison should be a laptop from 2007
Nope. In 2007 a lot of people used and needed desktops for work that is more than satisfied by the computing power available in a laptop today. It's only an apples-to-oranges comparison if everyone were still using desktops, which is not the case.
In 2007, even more people had their needs met by laptops. Heck, there were people in 2002 who went laptop full time. Anyone looking closely enough in 2000 could see the writing on the wall.
I used to have the same argument... until I realized modern laptops can be desktop replacements (without being those gigantic bricktop "mobile/portable workstations").
Just slap on a full-sized monitor and a keyboard+mouse, and it's more capable as a thin desktop (à la iMac) than the actual thing, and when you want to leave, just unplug all that shit and go.
The only call for full sized desktops is people like me, where cramming in full sized GPUs, more than one drive, and 32+GB of RAM is required.
> The only call for full sized desktops is people like me, where cramming in full sized GPUs, more than one drive, and 32+GB of RAM is required.
That, or when you need more CPU horsepower than a laptop will provide. Even if it is not thermally throttled, if it is not a big laptop, odds are it is using a very low-power part.
Definitely, heat dissipation and power envelope limitations on laptops are still a major issue, but with power gating now standard on most chips and very aggressive P-states, average power usage is down by double-digit margins on your average CPU over the last decade.
This is very beneficial for servers and workstations too: it keeps your power bill much lower and makes it financially unattractive to stick with 8+ year old hardware that draws 250 watts for half the performance of a one- or two-year-old system, which will pull 60 watts minimum and, while topping out at 250 watts, will likely average around 70 watts, saving you about $180 a year at $0.12/kWh.
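A quick sanity check on that figure, assuming the box really does run 24/7 at those averages (a rough sketch, not a precise model):

    # rough check of the savings claim above (24/7 operation assumed)
    old_watts = 250        # old server, roughly constant draw
    new_watts_avg = 70     # newer box, average draw with power gating
    rate = 0.12            # $ per kWh, as in the comment
    hours = 24 * 365

    saved_kwh = (old_watts - new_watts_avg) / 1000 * hours
    print(f"~{saved_kwh:.0f} kWh/year, ~${saved_kwh * rate:.0f}/year saved")
    # -> ~1577 kWh/year, ~$189/year saved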
I have a 2007 17" MBP, fully loaded for its time, that was still giving me great service as a classic gaming machine (via Bootcamp) as recently as six months ago. In its prime it was essentially a true desktop replacement for most work activity. Sadly, it's about ready to give up the ghost. There are fewer and fewer use cases I can justify squeezing out of it these days. But what a hell of a decade it's given me.
> Apples to Oranges though (or Apples, but still). The comparison should be a laptop from 2007. I ran a house on solar electricity and wind power back then; my laptop was about 35 watts according to the e-meter connected to it, and my present-day laptop is roughly the same.
No, it shouldn't. Many people use laptops today for the things they used Desktops in the past.
I permanently turned off my last desktop around 2007, and have been using a laptop exclusively since. I had a media PC I built under my TV, and I shut that off permanently around 2010, replacing it with a Google TV device that I later stopped using. Nowadays I have a laptop (no external monitor, even), a reasonably low-powered NAS, and a ChromeCast, and that covers my computing/media needs at home.
In that time I also went from a CRT TV to a plasma, and at some point in the next year or so I'll probably swap that out for a (lower power) OLED TV or similar.
(As an aside, 35W for a laptop today seems excessive. Assuming PowerTop is accurate, my laptop is using 9W right now. While charging it's undoubtedly using more, but even plugged in while fully charged, with fewer power-saving features enabled, 35W seems a bit high.)
The first thing I noticed about the article was the complete non-mention of energy efficiency. In the span of a decade (maybe less?) LED lighting has made huge advances.
Electric vehicles would certainly provide some growth potential, though overall they would most likely even out 24-hour usage patterns rather than overstressing the grid.
>> they would most likely even out 24-hour usage patterns rather than overstressing the grid.
Take a look at exactly how much power goes into an electric vehicle. I once worked out how many solar panels one would need to charge a Tesla once per day in my locale. The math pointed to something like a solid acre of land dedicated to sustaining a single car (not an acre of panels; at my latitude you need proportionally more land on which to mount them). A shift to all-electric transport would produce a demand for which LEDs could never compensate. Only the very wealthy would ever own enough land to actually self-sustain through renewables. Cities will need power grids for a long while.
Take electric transport out of the equation and things start looking much better. Hydrogen-powered vehicles would IMHO at least allow the possibility of "de-electrification" in North America.
Do you drive 200 miles a day? I don't. You rarely need to charge an electric car from empty.
I agree that solar is significantly less attractive in high-latitude, cloudy locations, though. All of those panels that Germany installed would have been much more useful in Arizona.
Hydrogen, of course, is a boondoggle: it's primarily a fossil fuel, needs a whole new distribution system, is less efficient than pure electric, and requires very expensive vehicles.
Worse than that: if you buy hydrogen today (e.g. for some fuel cell car prototype), it is almost certainly created from fossil hydrocarbons. So hydrogen is not even a battery now, but just a very elaborate way to burn fossil fuels. "Equivalent to batteries" is the best-case future scenario (e.g. in the presence of abundant electricity from controlled fusion power plants).
Almost no one needs to fully recharge a Tesla once per day; in fact, that's really bad for the battery. More likely, they'll be doing only a quarter- to half-charge every day, on average. Also, Teslas aren't the most efficient EVs around; they're big and high-performance. A Leaf will use less electricity.
People don't need to own land to recharge their EVs; they can just hook up to the grid, just as most people today don't have solar panels and simply buy gasoline and electricity. There's tons of room in cities for more solar panels though, on top of roofs, on large commercial buildings, and also over parking lots. Fill up all that wasted space with PV panels and we'll have more electricity than we know what to do with.
Maybe someone uses less. Maybe two people live together and drive two cars, doubling the load. It's just an example, a way to illustrate how many panels might be needed or, conversely, how much energy is actually used by an electric car. It's still going to be far more panels than most people realize.
1887 hours of sunlight in my city per year.
100 W per m2 of solar panel = 189 kWh per m2 per year.
100 kWh Tesla battery * 365 / 189 kWh per m2 = 193 m2 of panels perpendicular to and tracking the sun perfectly. Getting that in reality means far more panels on far more land. Depending on slope, horizon and weather, you get to an acre (~4000 m2) very quickly.
The math for something akin to a gas station, something that might want to "fill" 200+ Teslas per day, becomes staggering. And big electric trucks? ... there isn't enough room in cities or even the suburbs. Widespread adoption of electric vehicles will need an extensive electrical grid.
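Spelling out that arithmetic as a small script (same assumptions as above, including the pessimistic one-full-charge-per-day figure):

    # back-of-the-envelope version of the numbers above
    sun_hours_per_year = 1887     # hours of sunlight in my city
    panel_watts_per_m2 = 100      # assumed panel output per square meter
    battery_kwh = 100             # largest Tesla pack
    charges_per_year = 365        # one full charge per day (worst case)

    yield_kwh_per_m2 = sun_hours_per_year * panel_watts_per_m2 / 1000   # ~189 kWh/m2/year
    demand_kwh = battery_kwh * charges_per_year                         # 36,500 kWh/year
    print(f"{demand_kwh / yield_kwh_per_m2:.0f} m2 of perfectly tracked panels")
    # -> ~193 m2, before slope, horizon, weather and spacing losses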
There's a lot of well-founded speculation that the twin, contemporaneous technology shifts of electric cars and autonomous driving will, in the long run, lead to decreased--possibly radically decreased--car ownership, as the cars become more practical and convenient to summon on demand. That may offset a lot of the per capita charging arithmetic being discussed here.
And nonownership will lead to much smaller cars. It can be witnessed on island tourist destinations: the rentals at the destination airport are consistently smaller than the cars parked at the home airport. Without status considerations or speculative "what if" use case estimations over the whole span of ownership, people take the smallest car that will do the job. Ubers and taxis are big cars only because the driver is the most expensive "component" and cheaper hardware would not affect the cost of a ride enough to make a difference.
Worth noting that this could also lead to incredibly worsened suburban sprawl, as the factors which make driving a long distance to work suck (time, cost) are obviated.
That is indeed worth noting. As a critic of suburban sprawl, I'm disheartened that the balance of Silicon Valley's efforts in this arena seem to be in making the world better for cars, not for people.
Yeah - so far I haven't seen anyone else point this out:
* Automated driving permits you to nap, read, etc. while traveling in a car
* Coordinated automated driving permits much greater throughput on the roads with high speeds and smaller following distances
* Land farther from cities is cheap compared to land close
* Electric vehicles are much, much cheaper than petroleum-powered ones on a per-mile basis
I worry that everyone will want to live on a 3-acre estate on nice cheap land 100 miles from the city. After all, why not? Transportation is nearly free and you can kick back and nap for the 45-minute drive in to work (at an average of 133-ish mph).
100 years from now, if humanity is somehow still around, I suspect California will be a gigantic skidpad with one giant metropolis that sprawls ten times the distance it does now. Why not commute from San Luis Obispo to Los Angeles every day? Or Yosemite to San Francisco? I mean, some people already do Stockton to SF (hellishly) and that's half the distance right there. For that matter, back in college I dated a girl and drove from San Jose to LA every other weekend. I'd wake up at 4:30 AM in LA and be at work by 11 or so, traffic permitting. If I could've slept in the car and made the drive half as long I would've done it a couple times a week.
But you don't need the grid. Instead of pushing power around via wires you make the hydrogen near the power plant and push hydrogen around. That's the de-electrification, the lack of the extensive grid to move power between source and demand.
You've got to be kidding. Due to hydrogen embrittlement and fire risk from leaks there's no safe way to push hydrogen around in an underground pipeline network. The closest we can get is distributing natural gas and then converting that to hydrogen near the point of demand, which still requires a lot of electricity.
"Push" includes transport in tanks ... something that happens today. And if vehicles are going to be running on hydrogen, they are going to be transporting it in tanks too. Just like gas stations today, the hydrogen will be transported from the plant to the stations somehow. That movement of energy, as opposed to pushing electricity over wires, would reduce the need for grids as opposed to electric cars which will increase the need for grids.
I would assume that it's just simpler to synthesize methane (power-to-gas) and use that to (e.g.) fuel vehicles. Assuming you're pulling H from water and CO2 from the air, the methane is carbon neutral, so there's no huge reason to try to transport hydrogen.
I find this difficult to believe. Filling up a low-end Model S requires 60kWh. Assuming you use one "tank" of charge per day (210 miles) and you have 5 hours of sun per day, this is 12kW of panels. Nowadays you can get around 200 watts from a square meter of panels (modulo proper cosine=1 mounting) so this is 60 square meters. Multiply by 1.2 for efficiency losses and it's 72 square meters. Granted that's a lot more than most solar houses need, but it also assumes you're driving 210 miles per day, which is more than most people.
Why do you need an acre to mount 72 square meters? Even if you're in the Arctic Circle, you don't need that much. And you won't be charging in the winter anyway :-)
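The same kind of estimate spelled out, using only the numbers already stated in this comment (the 200 W/m2 panels and the 1.2 loss factor are the assumptions above):

    daily_kwh = 60           # one "tank" of a low-end Model S, ~210 miles
    sun_hours = 5            # usable full-sun hours per day
    panel_watts_per_m2 = 200
    loss_factor = 1.2        # charging/inverter losses

    panel_kw = daily_kwh / sun_hours                            # 12 kW of panels
    area_m2 = panel_kw * 1000 / panel_watts_per_m2 * loss_factor
    print(f"{panel_kw:.0f} kW of panels, ~{area_m2:.0f} m2")    # -> 12 kW, ~72 m2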
One thing I like is that it's very targeted. I have a two stage gas furnace, but I also have a leaky, draughty old century home. At night during the winter, it's cheaper to use space heaters in the bedrooms and let the main floor fall to 12C.
What's funny is that, at least where I live, it costs about the same. Per joule of energy delivered to my living space, electricity is about 7x more expensive, but when I use electric heat I can target and heat only about 1/7th of my house. Electric still feels more cozy, though.
> Transition to EVs may start to soak up all the gains we're making in efficiency so the usage may start to tick back up.
I share this concern, but my current best guess is that the EV surge won't be instantaneous, and during that ramp not only will efficiency be increasing dramatically (in terms of sharing, routing, and better hardware), but we will be swapping commutes for VR and moving to more walkable cities, such that if we do it right the net demand won't increase.
Also, as others point out: the transition to EVs "fills in the bathtub" more than it adds "new" demand. In the course of a day, there is a huge demand for electricity during the day and especially the work day and it drops off somewhat sharply as people go to bed. This evening drop-off is sometimes referred to as the "bathtub" in energy demand curves because that's what it looks like in the graphs. (This is often reflected in the "off-peak" pricing and hours from various utilities.)
The majority of EVs charge overnight already. A very rough napkin estimate I saw once showed that, assuming the majority of charging continues to happen overnight in the bathtub, you could replace more than half of all cars on the road today with EVs and the electric grid wouldn't feel it from a capacity standpoint (i.e., no new coal/nuclear/hydro plants needed).
Throw in some of the proposed "smart grid" enhancements where the electric grid has some leeway to manage how EVs charge/discharge and things are even rosier. (These proposals include the idea of allowing the grid to balance overall demand by balancing EV demand more directly. This includes "pausing" EV chargers during demand spikes and even potentially "loaning" power back from the large batteries of EVs to meet those spikes, repaying that power when demand drops again.)
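A toy illustration of the bathtub point; the hourly numbers below are made up purely for shape, but they show why off-peak EV charging doesn't add to the peak:

    # illustrative only: overnight EV charging raises the trough, not the peak
    daytime = [80, 85, 90, 95, 100, 100, 95, 90]   # GW, daytime hours
    overnight = [55, 50, 45, 45, 50, 60]           # GW, the "bathtub" hours
    ev_charging = 20                                # GW of EVs charging off-peak

    peak_before = max(daytime + overnight)
    peak_after = max(daytime + [x + ev_charging for x in overnight])
    print(peak_before, peak_after)   # 100 100 -> no new peak capacity needed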
My office used to be nice and cozy in the winter because of the heat thrown off by computers and CRTs. These days, it's really not that different from any other room in the house.
Zero chance everyone starts wearing VR headsets for office work. First, there will never be a need for the majority of routine office work to take place by wearing VR headsets (i.e., strapping smartphones to your eyeballs). Second, it would be very damaging to the eye health of employees to do so (for at least half the work day). The legal problems alone guarantee it can never happen.
Long duration gaming is still a niche and always will be. The average person doesn't game hours per day and does not want to (because of the fundamental difference between passive and active entertainment). 5 to 10 hours per week is irrelevant in terms of increasing electrical demand due to 4K gaming.
> Zero chance everyone starts wearing VR headsets for office work
I'd say unlikely, not zero. The assumption that everyone is going to strap on an Oculus for work is close to zero, but if VR finally takes off, the 4th- or 5th-generation systems will be smaller, lighter, cheaper and better; that is just the way things seem to go.
I'm old enough to remember when lugging a mobile around was seen as something that made you a bit of a twat, because they were huge, clunky, expensive, only worked intermittently and had terrible range.
Now I have a £200 phone from 3 years ago (Nexus 4) that is none of those things.
I actually think the first thing we'll see in workplaces is the first non-sucky AR systems; there are a hell of a lot of use cases when you combine them with things like facial recognition.
Receptionists, doctors, pretty much any role that deals with the public.
If they could display decent, high-quality text and graphics, they'd be excellent educational and maintenance tools. Back when I worked as an electrician I would have loved to have been able to pull up building wiring plans while working on things.
In outdoor lighting they are too bright, and over-illuminate at night, but people use them anyway. They just turn them on, walk away, and blast their surroundings with obnoxious, eye-gouging light bulbs. Especially construction crews, that install cheap temporary utility lights.
Many people install them in places where the lights point away from them, and outward at their surroundings, such that they don't have to suffer the excessive brightness. It's like subjecting oncoming traffic to your car's extra bright high-beams, except it's the house across the street, and it never goes away.
It's bad enough that vandalism is warranted. Outdoor LED lights are that bad.
> In outdoor lighting they are too bright, and over-illuminate at night, but people use them anyway.
I couldn't disagree more. Not only can you control the lumens to a much finer extent, as more cities retrofit to LEDs, you're seeing better illumination with less distortion and far lower light pollution.
There are striking images[0,2] from the LA Bureau of Streetlighting[1] that have made the rounds at a lot of lighting shows to prove this point, after they conducted the world's largest LED retrofit.
The Bureau got praise from the Dark Skies movement for reducing light pollution while increasing pedestrian safety and lowering energy costs.[1]
You might find a few in Hollywood that are nostalgic for the hazy yellows of sodium, but most photographers are really struck by the improvement, "in nearly every case, better for cinematography."[2]
EDIT - On second read, you seem to be referring to the specific case of neighbors who don't understand how many lumens they need. It is easier to crank them up on LEDs, that's true. But that's really a problem with people installing high-lumen bulbs for no reason. They could just as well install incandescent spotlights and be equally noxious. LEDs in general have so many upsides that attacking them broadly seems like throwing the baby out to spite your face.
Light is a crime deterrent, so I use LED porch lights and live next to a street lamp. One night, all my neighbors had their cars broken into, but the 2 I had parked in the driveway weren't touched. Your neighbors may have the same mindset.
What's weird -- not to say ignorant -- about this article is that it ignores the base-load / peak-load issue that lies at the heart of actual utility economics.
Electricity is sold to households at a flat rate by the joule (metered by the kWh), typically with a fixed monthly fee tacked on.
But wholesale electricity price fluctuates based on demand. Everybody knows this: in the height of summer when everybody's A/C is cranking away, the utilities have to fire up their nasty diesel-powered generators. They beg their commercial customers to reduce demand. Long distance transmission lines warm up and sag a bit. If worst comes to worst, the utilities reduce the voltage on their sendout.
The utilities have to invest capital in, well, capacity. If you know you never need more than 1 MW, and never less than 0.5 MW, you can build your distribution system for the peak, install four 0.25 MW generators, and run two of them all the time and the other two when you need them. But if you might need 10 MW on a hot day, and 0.5 MW all the time, you have to spend a bundle on reserve capacity.
Load management is persuading energy users to avoid surges, and to shut off nonessential stuff when not needed.
But the typical electric grid isn't smart enough to handle this automatically. The load manager at the power company has to telephone Wal-Mart stores and ask them to turn off some lights and raise their thermostats. My electric vehicle charges at an appointed time of day, not when there's excess power capacity.
The smarter the grid gets, the better use of capital the power company can make.
Restoring power from blackouts is the worst. Electric motors draw a surge of power when they start. If a large city gets power restored all at once, the surge is huge.
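To make the capital-sizing point concrete (the $1,000/kW figure here is purely illustrative, not a real utility number):

    # capacity must be built for the peak, even when the base load is identical
    cost_per_kw = 1000                        # assumed capital cost per kW of capacity
    base_kw = 500                             # 0.5 MW floor in both scenarios

    flat_capex = 1_000 * cost_per_kw          # sized for a 1 MW peak
    peaky_capex = 10_000 * cost_per_kw        # sized for a 10 MW hot-day peak
    print(flat_capex, peaky_capex)            # $1M vs $10M for the same base load
    print(base_kw / 1_000, base_kw / 10_000)  # base-load utilization: 50% vs 5%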
I'm not sure VMs really changed that much. Before VMs, people just ran more services on the bare metal. You had your web server, database, email, etc. daemons all going. Hosting services separated things by accounts and filesystem permissions. VMs made a lot of things easier/more secure, but the raw number of physical machines has still exploded. VMs are, bottom line, less efficient than bare metal due to virtualization overhead.
Maybe in the internet-focused UNIX sysadmin world.
Your average small to medium business in meatspace has a closet running:
- Windows Domain Controller
- Exchange Server
- File server
- Accounting server
- Several line of business apps from different vendors, each with an onsite server component (often just a distribution of MS SQL Server).
- Card access control server (if there are more than the ~4 doors that fit on one beige-box-on-the-wall controller).
- Security camera DVR.
- Cisco, Avaya, or similar PBX.
- HVAC system controller (if you're a large, modern building or complex).
The DC, file, and Exchange servers, as well as desktop management and support, are probably handled by your primary IT contractor who sells you all your servers, network gear, and desktops. They'll tell you it's not good practice to colocate any of these services on the same Windows install, and resell you (with heavy markup) a different Dell/HP box for each one.
Each line of business app has its own vendor, which created and manages your site's installation of its package.
The door controls and security cameras are likely installed and managed by the same company that does your fire alarms. The two software packages likely state they should each have a Windows Server to themselves.
The HVAC system controller is likely installed and managed by a company that knows barely enough IT to be dangerous, and also doesn't want to colocate.
The PBX for your VoIP desk phones is likely installed and managed by the same company that ran the old DEC PBX for your analog desk phones in the 80s. They might have also done your ethernet wiring.
None of these guys trust all of the others enough to share an OS instance. According to the old-timers, you previously needed a separate server in the closet for each one. Now your Microsoft Partner is also a VMware Partner and gives each of the other vendors their own little slice of an ESXi box.
Oh, and there is probably no one on your payroll who understands this stuff or even has the passwords. Just the operations manager type who manages the relationship with each vendor and knows which one to call under what circumstances.
That's the opportunity VMWare capitalized on, and that Microsoft is trying to worm its way into with Hyper-V.
A conservative nontechnical business is unlikely to invest in an unproven desktop management system, office suite, email/messaging, security, and building automation system just to beat virtualization overhead. They're even freaked out by Red Hat.
If you're adventurous, you'll just use Macs/Chromebooks, the Google suite, and cloud-based "IoT" building automation (or you won't have a building automation problem because your office is small or nonexistent). You'll use Skype/Hangouts/Slack/Hipchat/cell phones instead of a PBX.
If you're less adventurous, there's something to be said for widely used components, backed by tech giants who are likely to stick around for a few decades, which multiple interchangeable sales/support firms in your area know how to work with.
Microsoft is trying with Office 365 and their other SaaS/PaaS options. If I was building out the infrastructure for a bootstrapped business today there would have to be a good business reason for me to not go that route.
I think they have led to consolidation. In many orgs it wasn't unusual to buy servers as new applications were required, and then those were replaced individually once they died. Traditional core services might very well have been on shared servers, but if department X suddenly wanted application Y, it got its own server. With virtualization, a lot of these setups got consolidated.
Less than that, though it depends on the exact drives you're comparing and whether you're comparing idle or working. Also note that an SSD will be able to idle more since it will serve I/O requests faster (in fact, that was a common fuck-up in some old benches, which would loop work but not take the amount of work into account when measuring power consumption: an SSD would increase power draw because it would be able to keep the computer significantly more busy than spinning rust).
And SSDs are continually improving that by increasing performance, thus maximizing the length of idle time (during which an SSD draws a small fraction of a watt).
This is a common misconception - manufacturing output, beyond drops during major recessions, has steadily increased[1]. Recently, increases have slowed but outside recessions, "the general decline of US manufacturing capacity" is just not happening.
What is dropping are inputs for the same output - employment being the most obvious / painful, but power, raw materials, etc. are all being used more sparingly as manufacturers improve processes and adopt new technologies.
Actually, it's not. Those manufacturing output numbers are both tricky and misleading. They are dominated by computer/electronic production and automotive. Because of how the statistics account for imports and technology advancement, the electronic components in particular are frequently overstated. For example, when Intel was shipping chips that were 2x more powerful than the last generation, that effectively doubled the measured output.
Simultaneously, the stats undercount services and non-market production. So things like Medicare/Medicaid ($1T annually) are measured at their cost, not based on the value delivered.
The reality is that industrial production in the US is 15-20% less than it was 20 years ago. Automation kills employment but most of the value creation has been exported to Asia. Denying that is denying reality.
Can you please supply a direct quote from one of those articles to support this assertion?
> The reality is that industrial production in the US is 15-20% less than it was 20 years ago.
I wonder how recently and carefully you have read these articles. For example, the Economist and Real Clear Policy pieces are directly contradictory on the subject of measuring value vs. price for electronics.
The Bloomberg piece is 8 years old. And even then, it says:
> After the adjustments, however, the new growth rate for manufacturing output might be as small as 0.8% a year,
Growth of 0.8% a year, while anemic, is not a decline.
I certainly wouldn't count America out. However, looking at the numbers in terms of growth, there's been a pretty steady downward trend since (an unusually high) 2010 Q3: https://fred.stlouisfed.org/graph/?g=dmxz
Although the very last recorded quarter could be the start of a trend reversal... Regardless, you're correct on your original point: a reduction in energy usage would not be due to an output decline in manufacturing (which is the most energy intensive sector in most modern economies).
That's not what I meant to say. Sorry, reading back on what I wrote I can see my wording was a bit ambiguous. They are definitely positively correlated.
The chart shows that from about 2010 onward, manufacturing output has been growing, but at a steadily declining (but still positive) rate. So manufacturing output was still growing until it hit zero growth in 2014 and started crawling along the floor, roughly at a steady-state.
So between 2010 and 2014, manufacturing should have been consuming more electricity (in absolute terms) from one year to the next. I suppose if manufacturing became less energy intensive per unit of output over that period then they could have contributed to reduced energy usage. Since I'm just some random internet dude reading a chart (who doesn't live in the US) I have no idea if that's the case.
I'm still skeptical. Basically wondering if "Manufacturing Sector: Real Output" is somehow "dollar driven" vs "actual number of electric consuming plants" driven.
Government has been targeting industrial users heavily for the last decade. If you have spinning machines or industrial-scale use, you will attract investment to reduce load, and very substantial investment if you allow the utility to shed your load (i.e., turn stuff off) during peak periods.
Mainly because we're in a recovery period from the recession that started in 2008. The trend is clearly still upward, though it's starting to level off. Too soon to tell if that's temporary or not.
The main reason the manufacturing sector in the US is still large is the increasing expenditure on defense. The military-industrial complex receives huge cash infusions every year, corresponding to a large percentage of the federal budget. It is hard to predict what would happen to manufacturing in this country without such defense expenses.
Of course, his last point is probably the most telling:
"Dig more coal...the Teslas are coming."
One of his major sources is eia.gov, and their 2015 report has electric power use at 38% of total use.
Transportation is another 28% of total energy use, so the sooner that can switch over to electric, the sooner we can drop the 36% of energy being sourced from petroleum.
Better than that! I prefer to look at the actual, most recent data that the EIA has collected. It goes all the way up to January 2017. In the 12 months rolling up through January, coal was just 30.5% of US electrical generation:
https://www.eia.gov/electricity/monthly/epm_table_grapher.cf...
Below gas (33.5%). And below carbonfree energy at 35.4%.
That 35.4% carbonfree electricity can be broken down into: 19.8% nuclear (this part we really should fight to keep operating as long as possible, certainly until we shut down every fossil fuel plant... going from 80% renewables to 100% renewables may actually be harder than going from the current 15.5% to 80%).
7.6% wind, geothermal, small hydro, and maybe some biomass (unfortunately, EIA doesn't break these out separately)... though most of this is wind and geothermal
6.7% conventional hydro.
1.4% solar. (and this includes distributed solar)
As recently as 2003, over 50% of our electricity was generated from coal. We're now at 30.5%.
As long as we keep nuclear operating, it should be fairly cheap to squeeze out the rest of that 30% of coal by installing high capacity factor wind turbines where it's windy (and perhaps anti-correlated with other sites), lots and LOTS of cheap utility-scale solar, mounted on single-axis trackers to increase capacity factor, and lots in Texas and the American South where it's not already common.
Adding 15% additional grid penetration each of solar and wind, along with probably some storage (lots of pumped hydro in Appalachia to replace coal, batteries everywhere else) to smooth out demand, plus UHVDC power lines and East-West interconnects, we can squeeze out the rest of that 30% of coal without huge costs.
If we try to do this without that ~20% nuclear, we'll either need a lot more gas or a LOT more money.
Just FYI, you can get a more fuel- and technology-specific breakdown of generation and inputs from Form 923, but it's time-lagged and many plants don't show up in the monthly reports, so you have to wait for the annual.
> so the sooner that can switch over to electric, the sooner we can drop the 36% of energy being sourced from petroleum.
but it's not a 1:1 relationship. A lot of electricity in the US comes from burning petro-fuels like natural gas (if you count natural gas as a petro fuel) or coal.
There was a study[1] last year that argued that driving a Tesla in Ohio caused more pollution than driving a BMW 3-series, because the source of the Tesla's electricity was likely coal.
In general, it looks like even in the dirtiest states, driving an electric car is about as efficient as a 40mpg car.
The argument also goes that electricity generation only gets greener, so your electric car actually pollutes less over its lifetime, unlike a conventional gasoline-powered car.
I remember my college desktop and how happy I was when I got a 400W power supply! Now the Mac I travel with, which has 100x the processing, storage, etc., uses an 85W supply when it's plugged in. If you do that at scale with more efficient everything, it makes sense.
Remember when turning off the lights saved money? It was dubious then but with bulbs that use 1/10 the power, it's almost humorous.
Ya, that 8-watt LED bulb will cost you about $7/year running 24/7 at 10 cents/kWh. If it takes 2 seconds each to turn it on and off every day, that's about 24 minutes a year saved by leaving it on. If your time is worth >$20/hr, consider uninstalling the switch.
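The arithmetic behind that, if you want to check it (wattage and rate as stated above):

    watts = 8
    rate = 0.10                              # $/kWh
    kwh_per_year = watts / 1000 * 24 * 365   # ~70 kWh
    print(f"${kwh_per_year * rate:.2f}/year to leave it on")    # ~$7/year

    seconds_per_day = 4                      # ~2 s each for the on and off flips
    print(f"{seconds_per_day * 365 / 60:.0f} min/year spent at the switch")  # ~24 min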
Are you saying electricity delivered to the home costs 3 cents/kWh in Austin? I flat out don't believe that. We're paying over 20 cents on Cape Cod. I can believe 10 cents in some areas. 3 cents, sorry; no.
> As population growth slows it's only natural for electricity to as well.
You missed the point. It's not that growth in total energy use is slowing, like growth in population is slowing. The point is that per-capita energy use is _declining_. You can't describe a decline as "slowing growth" with a straight face unless you are a Fox News commentator.
A few years ago I measured the power consumption of various devices around my home, using an inline meter of unknown accuracy (e.g., TV plugs into meter, meter plugs into wall outlet).
There weren't many surprises, but my cable TV boxes were disappointing: Two different models both used the same amount of power, 25-30 W, regardless of whether they were 'on' or 'off', and no matter how long they were 'off'. No other devices that drew any significant power behaved this way; in comparison, my laptop used 1W when asleep.
The cable boxes are merely computers, and not even general-purpose ones - i.e., only a small number of predictable features have to be implemented - and the vendor is in complete control of the hardware and software. There is no reason an effective sleep mode couldn't be implemented. And think about the mass deployments of these devices by the cable companies: how much power could be saved by investing in some very standard, available tech, and what the cumulative impact on customers' electric bills and the environment would be.
However, I wonder if other models of cable boxes are the same.
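For a sense of scale, here's what a 25-30 W always-on box works out to per year (the electricity rate here is an assumption, roughly a US average):

    standby_watts = 28        # midpoint of the 25-30 W measured above
    rate = 0.13               # $/kWh, assumed
    kwh = standby_watts / 1000 * 24 * 365
    print(f"~{kwh:.0f} kWh and ~${kwh * rate:.0f} per box per year")  # ~245 kWh, ~$32
    # multiply by tens of millions of deployed boxes and the waste is enormous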
Yeah, I measured the same thing on the cable box, an object which gets about 40 minutes of use a day in my household.
So when my local utility (UK-owned National Grid) did a promo for power strips that shut off all the plugs when one plug stops drawing power, I bought one. Works great. Takes about two minutes for the cable box to boot up when I need it.
National Grid lost a 35W base load (24x7 load) by selling me that strip. 35W costs me about $45 a year, and the power strip cost me $15.
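For what it's worth, the payback on that strip works out very quickly (using only the numbers in this comment):

    base_watts = 35
    yearly_cost = 45.0                  # $ per year, per the comment
    strip_cost = 15.0

    kwh = base_watts / 1000 * 24 * 365  # ~307 kWh/year no longer drawn
    implied_rate = yearly_cost / kwh    # ~$0.15/kWh
    payback_months = strip_cost / yearly_cost * 12
    print(f"{kwh:.0f} kWh/year at ~${implied_rate:.2f}/kWh, payback in ~{payback_months:.0f} months")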
If my cable box loses power then it loses all its channel guide data, and it takes maybe 15 minutes to reload.
It really seems unbelievable. Why isn't the data stored on the local disk? How much data could that be and why does it take more than 10 seconds to download? On my box, it is all text.
Agreed. The monopoly regulators who tell the cable companies what to do should give them a limit on the electric load their customer-premises equipment (cable boxes) may draw when idle.
It should be no more than dozens of milliwatts. When the load exceeds that, the cost of that electricity to their customers should be refunded via a credit line item in each bill. That will give the cable operators a way to transition to better equipment, and an incentive to get the transition completed.
This seems a pretty clear case of misaligned incentives. There's no reason for the cable company to make more energy-efficient boxes - they don't pay for the electricity use, and customers can't generally switch to different provider in order to get a more efficient cable box.
The article doesn't make it clear if this is commercial production of power, or if it includes privately owned solar, etc. I could see centralized commercial energy production dropping due to home solar alone, but TFA isn't clear.
Either way, I'd bet that the electrification of our cars and trucks over the next few decades will absorb most of the excess capacity and drive the need for more production.
The article addresses private solar, though admits to incomplete data and some uncertainty:
"The Energy Information Administration actually started estimating power generation from small-scale solar installations at the end of 2015... and found that it accounted for only about 1 percent of U.S. electricity. That estimate could be off, and there's surely room for more study, but mismeasurement of solar generation doesn't seem to be the main explanation here."
Seems like it's mostly about traditional manufacturing, or the lack thereof. Energy price spikes (2008) kill off energy-intensive businesses. But even when prices come back down these businesses don't start back up in the US, because a competitor in China has already expanded to fill the gap, and they do it cheaper.
There's a much simpler, technical explanation: the transition from linear power supplies to switch-mode power supplies, in devices of all sorts including but not limited to computers. The timing of the peak and the beginning of the decline in per capita power demand coincides with the point at which SMPSs became a practical option for new designs, at the end of the 1970s. The decline since then may just be the result of gradual replacement of linear supplies by attrition.
Efficiency of CPUs and tech does not drive electricity demand. If anything, we use more power for tech now than ever before. This is the direct result of cheaper natural gas being on the market, which removes demand for electric heating/heat pumps, electric appliances, etc. There is also a shift in industrial use. Just this year Washington lost a large aluminum smelter, for instance.
"Consider the shift to cloud computing. From 2000 to 2005, electricity use by data centers in the U.S. increased 90 percent. From 2005 to 2010, the gain was 24 percent. As of 2014, data centers accounted for 1.8 percent of U.S. electricity use, according to a 2016 Lawrence Berkeley study, but their electricity demand growth had slowed to a crawl (4 percent from 2010 to 2014). What happened? The nation outsourced its computing needs to cloud providers, for whom cutting the massive electricity costs of their data centers became a competitive imperative. So they innovated, with more-efficient cooling systems and new ways of scaling back electricity use when servers are less busy."
I am torn because centralization leads to more R&D and cost savings and better security. I just wish the innovations would propagate to everyone. Decentralization is much better for everything except those things that come with economies of scale.
The R&D being done here doesn't only apply if you run extremely large data centers. Facebook and Google are contributing back to the Open Compute Project [1], and anyone could use this knowledge if they want to build a small data center with things like the Open Rack [2].
But it seems to me that when it comes to security, efficiency etc. only large centers can get it right, because they have more at stake. The majority of small providers will mess up. Like compare AWS uptime and security to a regular host.
And sadly this explains the rise of centralized everything including gmail and facebook and iOS app store and - for thousands of years - centralized cities and states and federations.
Rather than this being a measure of energy efficiency or reduction, could it not be a measure of artificial GDP inflation?
Though that would be a bit of a sensational title, the trend also suggests a decoupling of GDP from physical processes (manufacturing, service jobs) that directly require power.
Electricity generation (and use?) is below its 2007 levels. Is this a trend? Let's skip to the last paragraph (as one always should when beginning to read these clickbait articles):
"So is electricity use in the developed world fated to decline for years to come? Well, not exactly fated. Check out that bottom line in the last chart. Transportation now accounts for just 0.3 percent of retail electricity use in the U.S. If the shift to electric vehicles ever picks up real momentum, that's going to start growing, and fast. Dig more coal (or drill for more natural gas, or build more nuclear reactors, or put up more windmills and solar panels) -- the Teslas are coming."
If you just skipped to the end of the article, you missed the fact that the last sentence is a slightly tongue-in-cheek reference to this article from 1999:
Interestingly, he's stating that the transition to electric vehicles (EVs) will make up for the flat energy usage / non-existent growth. However, his own argument should negate a substantial amount of that: if large players such as Tesla/GM/Lyft/Uber/whoever own the majority of future transportation, they will also have a strong incentive to be energy efficient.
Never thought of it that way, but large players don't just make efficient use of hardware, but of the resources (energy etc) they consume too. That means the traditional energy industry is in for a beating...
Or a strong incentive to vertically integrate electricity production. I believe Google have done this in some form to secure and reduce cost of datacenter energy.
The source is the same as the original article (U.S. Energy Information Administration), it just shows a much broader (and deeper) picture. As a result, you may draw different conclusions.
I don't think it's related to this plateau, but it sparked the question:
If the global temperature were to increase by 1° would that cause a net gain or loss in electricity use?
If you oversimplify, you'd assume that since heating is more expensive than cooling, consumption would decrease, but I imagine it might also depend on where the majority of humans are located in terms of climate?
The first thing I notice about this article is that it talks about generation, not consumption. Where is the data on US energy consumption? It's especially odd given that he compares US generation growth to China's consumption growth toward the end of the article.
That's great! Just wait for EVs, though: you'll suddenly get all that electricity use back, and a lot more. But overall energy use per unit of GDP will fall through the floor due to their much higher energy efficiency vs. gas-powered cars.
"1993 and 2005, air conditioners in the U.S. increased in efficiency by 28%, but by 2005, homes with air conditioning increased their consumption of energy for their air conditioners by 37%."