LED and CFL lights, SEER ratings, LEED certifications, Energy Star ratings: all are having an impact.
Just in computers there have been huge advances. My MacBook Pro requires about 1/6 the electricity of my 2007 desktop, and my LCD monitor uses about 1/6 the power of a CRT.
The transition to EVs may start to soak up the gains we're making in efficiency, so usage may tick back up. But we're building solar and wind capacity at an increasing rate, which will effectively start trading oil for renewable electricity. These are all positive trends.
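For a rough sense of what that 1/6 works out to in kWh, here's a minimal sketch; every wattage in it is an illustrative assumption on my part, not a measurement:

    # Ballpark only; all wattages below are assumed for illustration.
    HOURS_PER_DAY = 8                 # assumed daily usage
    old_setup_watts = 220 + 80        # assumed 2007 desktop + CRT
    new_setup_watts = 35 + 13         # assumed MacBook Pro + LCD

    def annual_kwh(watts, hours_per_day=HOURS_PER_DAY):
        return watts * hours_per_day * 365 / 1000

    print(annual_kwh(old_setup_watts))  # ~876 kWh/year
    print(annual_kwh(new_setup_watts))  # ~140 kWh/year, roughly 1/6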
> My MacBook Pro requires about 1/6 the electricity of my 2007 desktop
Apples to oranges, though (or Apples, but still). The comparison should be a laptop from 2007. I ran a house on solar and wind power back then; my laptop drew about 35 watts according to the e-meter connected to it, and my present-day laptop is roughly the same.
Laptops have always been substantially more efficient than desktops.
> Laptops have always been substantially more efficient than desktops.
Yes, but a decade+ ago they weren't a replacement for a desktop, so you ended up with both. Now even mid-range laptops (or "low end" Macs) are totally feasible as replacements.
I bought my last desktop in 2010 and I can't imagine buying another one.
> I bought my last desktop in 2010 and I can't imagine buying another one.
I still do 95% of my development on a desktop, and I can't see that ever changing. I'm older now, and a good desk, a good chair, and three big screens take up enough space that a tower doesn't add much more.
That, and I can put hardware into a PC for half the price of a MacBook Pro that will absolutely annihilate the laptop on performance.
Portability and power efficiency do impose a constraint.
For me, the trade-off of having two machines is worth it; one machine permanently hobbled by being designed for a use case I rarely have makes no sense.
> I'm older now, and a good desk, a good chair, and three big screens take up enough space that a tower doesn't add much more.
I do the same: 95% of my development on a MacBook Pro with two big screens and a keyboard and mouse attached. I don't even touch or move it most days.
A desktop would be much cheaper, but I would still need a decent laptop for the 5% of the time when I'm not at my desk, so the savings aren't that great any more. And keeping development environments, VMs, config files, keyboard layouts, etc. synchronized on two separate computers is too much extra work.
I have an oldish but fairly beefy Vostro that will do in a pinch. Keeping things synchronised isn't that hard in practice, since the Vagrantfiles and project files all live in git. Yes, there is some overhead, but I have that anyway since I have a desktop at the office and one at home.
> keeping things synchronised isn't that hard in practice
I disagree; how difficult it is to keep things synced varies significantly between users. Sure, it's easy if your work is consistent and you're disciplined, but if you find yourself switching between very different projects frequently, the overhead of syncing machines and setting up software repeatedly can become significant.
Each client gets a "team" in Bitbucket, and orchestration becomes no harder than

    git pull && vagrant up
It works great on projects that have a simple structure (webserver/db) and projects that have complex configurations with ES/Redis as well.
The trick is to always treat configuration as code and never make manual changes to a VM (or, more correctly, whenever you do make a manual change, make the matching change to the configuration code).
As an added side benefit, I can pull down a project from two years ago and be up and running in a fraction of the time, without polluting my host OS.
Even when that project is still on Ubuntu 14.04 with peculiar dependencies (exactly this version of wkhtmltopdf, etc.).
Longer term I'll probably look at Docker or something, but Vagrant/Ansible has worked out great (some projects aren't even using Ansible, just a bash script).
Fair enough, but... my 2010 desktop was a lot more energy efficient than my 2005 desktop. And my 2017 desktop will probably be made from what are effectively laptop parts, in a brick the size of a large hardback, with an external power supply like a laptop's.
So while most people are moving to laptop-only, even us hold-outs are seeing more energy efficiency in desktops.
Power management and performance per watt have improved dramatically too. In 2007, power-gating CPUs was not commonplace, and Intel was just starting to optimize for performance per watt (especially in the low-power laptop/tablet market).
What vendors have done is provide computers with the same power envelope but much more performance: they may do your calculations at 35 watts, then immediately cut power to large swaths of the CPU, curtailing power use by at least a few watts, if not 10 (depending on your CPU).
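A minimal sketch of that "race to sleep" arithmetic; the wattages and timings are assumptions for illustration, not measurements of any particular chip:

    # Finish the work fast at high power, then power-gate for the rest
    # of the window; the average ends up far below the burst figure.
    burst_watts = 35     # assumed package power while computing
    gated_watts = 2      # assumed power with large blocks gated off
    work_seconds = 1.0   # assumed time to finish the job at full speed

    def average_watts(window_seconds):
        idle = window_seconds - work_seconds
        return (burst_watts * work_seconds + gated_watts * idle) / window_seconds

    print(average_watts(10))  # 5.3 W average over a 10-second window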
> Yes, but a decade+ ago they weren't a replacement for a desktop, so you ended up with both. Now even mid-range laptops (or "low end" Macs) are totally feasible as replacements.
You obviously don't play video games, do CAD or video work, or develop and compile large software. My desktop will never be replaced by my laptop.
Video games run perfectly reasonably for casual players, and even some low-level competitive ones, on a mid-spec laptop. Not everybody demands that the latest Crytek game run at 60fps on extra-high settings 100% of the time. Aside from that, most people don't do those things either. A laptop is totally feasible as a replacement for the 99.99% of people who could also get away with a 5-year-old desktop.
True, however desktops are getting more efficient, too. You can have an impressive amount of computing power (GPU and CPU) with a couple hundred watts.
Just a couple of years ago I had an 800 W PSU to be able to run modern games. Now I have a mini-ITX machine with a 450 W power supply and a lot of room to grow. I'd estimate actual power usage at peak to be 50-60% of the nominal capacity.
It's hard to compete with a laptop on power consumption, of course. My CPU's TDP is higher than 35 W; it's an i3, so Intel says 51 W TDP. It will still run circles around the laptop's CPU.
> desktops are getting more efficient, too. You can have an impressive amount of computing power (GPU and CPU) with a couple hundred watts.
Jeeze, my desktop, a full socket 1150 Core i3, is under 100 watts. Heck, I have a file server with twelve 3.5" hard drives and a 4 core socket 1150 Xeon that draws under 200 watts with everything spinning.
The big competitor really is AWS, especially with their multi-GPU instances. For gamers it's not really an option to have their GPUs at the end of a wire, but for a lot of other applications it makes good sense.
> Laptops have always been substantially more efficient than desktops.
No, they haven't; not on a MIPS/watt basis. A laptop-to-desktop comparison is entirely valid: I use a laptop now where I used a desktop in the past, I do the same work, and I get better performance out of my modern laptop. I even have dual 24" monitors on the laptop (via a docking station).
Demanding a comparison between desktops of the mid-2000s and desktops of now is like demanding a comparison between the fuel consumption of a Corvette from the 1980s and one from now, instead of recognizing that I can buy an economy car now that gets better performance than the 1980s Corvette while also getting much better fuel economy. If you're comparing on the basis of performance and capability, you don't need to stay with the same class of device, because capabilities and performance have changed so much over time.
Why? When we're talking about overall usage, functional replacements are far more relevant than direct device-category to device-category comparisons. In much the same way, the fact that my mom has replaced her desktop with a tablet is the directly relevant comparison for my mom's "electricity usage due to personal computing".
> Apples to oranges, though (or Apples, but still). The comparison should be a laptop from 2007
Nope. In 2007 a lot of people used and needed desktops for work that is more than satisfied by the computing power available in a laptop today. It would only be an apples-to-oranges comparison if everyone were still using desktops, which is not the case.
Even in 2007, plenty of people had their needs met by laptops. Heck, there were people in 2002 who went laptop-only full time. Anyone looking closely enough in 2000 could see the writing on the wall.
I used to make the same argument... until I realized modern laptops can be desktop replacements (without being those gigantic bricktop "mobile/portable workstations").
Just add a full-sized monitor and a keyboard and mouse, and it's more capable than an actual thin desktop (à la iMac), and when you want to leave, just unplug all that shit and go.
The only call for full-sized desktops is people like me, who need to cram in full-sized GPUs, more than one drive, and 32+ GB of RAM.
> The only call for full-sized desktops is people like me, who need to cram in full-sized GPUs, more than one drive, and 32+ GB of RAM.
That, or when you need more CPU horsepower than a laptop will provide. Even if it is not thermally throttled, if it is not a big laptop, odds are it is using a very low-power part.
Definitely; heat dissipation and power envelope limitations on laptops are still a major issue. But with power gating now standard on most chips and very aggressive P-states, average power usage is down by double-digit margins on the average CPU over the last decade.
This is very beneficial for servers and workstations too, as it keeps your power bill much lower and makes it financially unattractive to stick with 8+ year old hardware that draws 250 watts for half the performance of a system from a year or two ago, one that will pull 60 watts minimum and, while topping out at 250 watts, will likely average around 70 watts, saving you about $180 a year at $0.12/kWh.
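A quick check of that last figure, using only the numbers from the sentence above:

    # Old box at a constant 250 W versus a newer one averaging 70 W,
    # running 24/7, at $0.12/kWh.
    old_watts, new_avg_watts = 250, 70
    rate_per_kwh = 0.12
    hours_per_year = 24 * 365

    savings = (old_watts - new_avg_watts) / 1000 * hours_per_year * rate_per_kwh
    print(round(savings))  # 189 -- "about $180 a year" holds up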
I have a 2007 17" MBP, fully loaded for its time, that was still giving me great service as a classic gaming machine (via Bootcamp) as recently as six months ago. In its prime it was essentially a true desktop replacement for most work activity. Sadly, it's about ready to give up the ghost. There are fewer and fewer use cases I can justify squeezing out of it these days. But what a hell of a decade it's given me.
> Apples to oranges, though (or Apples, but still). The comparison should be a laptop from 2007. I ran a house on solar and wind power back then; my laptop drew about 35 watts according to the e-meter connected to it, and my present-day laptop is roughly the same.
No, it shouldn't. Many people use laptops today for the things they used desktops for in the past.
I permanently turned off my last desktop around 2007 and have been using a laptop exclusively since. I had a media PC I'd built under my TV, and I shut that off permanently around 2010, replacing it with a Google TV device that I later stopped using. Nowadays I have a laptop (no external monitor, even), a reasonably low-powered NAS, and a Chromecast, and that covers my computing/media needs at home.
In that time I also went from a CRT TV to a plasma, and at some point in the next year or so I'll probably swap that out for a (lower power) OLED TV or similar.
(As an aside, 35W for a laptop today seems excessive. Assuming PowerTop is accurate, my laptop is using 9W right now. While charging it's undoubtedly using more, but even plugged in while fully charged, with less of the power-saving features enabled, 35W seems a bit high.)
The first thing I noticed about the article was the complete non-mention of energy efficiency. In the span of a decade (maybe less?) LED lighting has made huge advances.
Electric vehicles would certainly provide some growth potential, though overall they would most likely even out 24-hour usage patterns rather than overstressing the grid.
> they would most likely even out 24-hour usage patterns rather than overstressing the grid.
Take a look at exactly how much power goes into an electric vehicle. I once worked out how many solar panels one would need to charge a Tesla once per day in my locale. The math pointed to something like a solid acre of land dedicated to sustaining a single car (not an acre of panels; at my latitude you need proportionally more land on which to mount the panels). A shift to all-electric transport would produce a demand for which LEDs could never compensate. Only the very wealthy would ever own enough land to self-sustain through renewables. Cities will need power grids for a long while.
Take electric transport out of the equation and things start looking much better. Hydrogen-powered vehicles would imho at least allow the possibility of "de-electrification" in North America.
Do you drive 200 miles a day? I don't. You rarely need to charge an electric car from empty.
I agree that solar is significantly less attractive in high-latitude, cloudy locations, though. All of those panels that Germany installed would have been much more useful in Arizona.
Hydrogen, of course, is a boondoggle: it's primarily a fossil fuel, needs a whole new distribution system, is less efficient than pure electric, and requires very expensive vehicles.
Worse than that: if you buy hydrogen today (e.g. for some fuel-cell car prototype), it was almost certainly made from fossil hydrocarbons. So right now hydrogen is not even a battery, just a very elaborate way to burn fossil fuels. "Equivalent to batteries" is the best-case future scenario (e.g. with abundant electricity from controlled-fusion power plants).
Almost no one needs to fully recharge a Tesla once per day; in fact, that's really bad for the battery. More likely they'll be doing a quarter- to half-charge every day, on average. Also, Teslas aren't the most efficient EVs around; they're big and high-performance. A Leaf will use less electricity.
People don't need to own land to recharge their EVs; they can just hook up to the grid, the same way most people today don't have solar panels and just buy gasoline and electricity. There's tons of room in cities for more solar panels anyway: on rooftops, on large commercial buildings, and over parking lots. Fill up all that wasted space with PV panels and we'll have more electricity than we know what to do with.
Maybe someone uses less. Maybe two people live together and drive two cars, doubling the load. It's just an example, a way to illustrate how many panels might be needed or, conversely, how much energy is actually used by an electric car. It's still going to be far more panels than most people realize.
1887 hours of sunlight in my city per year.
100 W per m² of solar panel × 1887 hours ≈ 189 kWh per m² per year.
A 100 kWh Tesla battery charged daily: 100 kWh × 365 / 189 kWh/m² ≈ 193 m² of panels, perpendicular to and tracking the sun perfectly. Getting that in reality means far more panels on far more land; depending on slope, horizon, and weather, you get to an acre (~4000 m²) very quickly.
The math for something akin to a gas station, something that might want to "fill" 200+ Teslas per day, becomes staggering. And big electric trucks? There isn't enough room in cities or even the suburbs. Widespread adoption of electric vehicles will need an extensive electrical grid.
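Running those numbers as a quick sketch (the inputs are the ones stated above: 1887 sun hours a year, 100 W/m² panels, a full 100 kWh charge every day):

    # Panel area implied by the stated assumptions.
    kwh_per_m2_per_year = 1887 * 100 / 1000      # ~189
    panel_m2 = 100 * 365 / kwh_per_m2_per_year
    print(round(panel_m2))                       # ~193 m2, tracking perfectly

    # Fixed mounts, row spacing to avoid self-shading, slope, and weather
    # multiply the land needed; an acre is ~4047 m2.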
There's a lot of well-founded speculation that the twin, contemporaneous technology shifts of electric cars and autonomous driving will, in the long run, lead to decreased--possibly radically decreased--car ownership, as the cars become more practical and convenient to summon on demand. That may offset a lot of the per capita charging arithmetic being discussed here.
And non-ownership will lead to much smaller cars. It can be witnessed at island tourist destinations: the rentals at the destination airport are consistently smaller than the cars parked at the home airport. Without status considerations or speculative "what if" use-case estimates over the whole span of ownership, people take the smallest car that will do the job. Ubers and taxis are big cars only because the driver is the most expensive "component", and cheaper hardware would not affect the cost of a ride enough to make a difference.
Worth noting that this could also lead to incredibly worsened suburban sprawl, as the factors which make driving a long distance to work suck (time, cost) are obviated.
That is indeed worth noting. As a critic of suburban sprawl, I'm disheartened that the balance of Silicon Valley's efforts in this arena seem to be in making the world better for cars, not for people.
Yeah - so far I haven't seen anyone else point this out:
* Automated driving permits you to nap, read, etc. while traveling in a car
* Coordinated automated driving permits much greater throughput on the roads with high speeds and smaller following distances
* Land farther from cities is cheap compared to land close
* Electric vehicles are much, much cheaper than petroleum-powered ones on a per-mile basis
I worry that everyone will want to live on a 3-acre estate on nice cheap land 100 miles from the city. After all, why not? Transportation is nearly free, and you can kick back and nap for the 45-minute drive in to work (at an average of 133-ish mph).
100 years from now, if humanity is somehow still around, I suspect California will be a gigantic skidpad with one giant metropolis sprawling ten times as far as cities do now. Why not commute from San Luis Obispo to Los Angeles every day? Or Yosemite to San Francisco? I mean, some people already do Stockton to SF (hellishly), and that's half the distance right there. For that matter, back in college I dated a girl and drove from San Jose to LA every other weekend. I'd wake up at 4:30 AM in LA and be at work by 11 or so, traffic permitting. If I could've slept in the car and made the drive half as long, I would've done it a couple times a week.
But you don't need the grid. Instead of pushing power around via wires, you make the hydrogen near the power plant and push the hydrogen around. That's the de-electrification: the lack of an extensive grid to move power between source and demand.
You've got to be kidding. Due to hydrogen embrittlement and fire risk from leaks there's no safe way to push hydrogen around in an underground pipeline network. The closest we can get is distributing natural gas and then converting that to hydrogen near the point of demand, which still requires a lot of electricity.
"Push" includes transport in tanks ... something that happens today. And if vehicles are going to be running on hydrogen, they are going to be transporting it in tanks too. Just like gas stations today, the hydrogen will be transported from the plant to the stations somehow. That movement of energy, as opposed to pushing electricity over wires, would reduce the need for grids as opposed to electric cars which will increase the need for grids.
I would assume it's simpler to synthesize methane (power-to-gas) and use that to fuel vehicles, for example. Assuming you're pulling the H from water and the CO2 from the air, the methane is carbon-neutral, so there's no huge reason to try to transport hydrogen.
I find this difficult to believe. Filling up a low-end Model S requires 60 kWh. Assuming you use one "tank" of charge per day (210 miles) and you have 5 hours of sun per day, that's 12 kW of panels. Nowadays you can get around 200 watts from a square meter of panels (modulo proper cosine=1 mounting), so this is 60 square meters. Multiply by 1.2 for efficiency losses and it's 72 square meters. Granted, that's a lot more than most solar houses need, but it also assumes you're driving 210 miles per day, which is more than most people do.
Why do you need an acre to mount 72 square meters? Even if you're in the Arctic Circle, you don't need that much. And you won't be charging in the winter anyway :-)
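Here's the same style of estimate with this comment's inputs (a 60 kWh daily "tank", 5 effective sun hours, 200 W/m² panels, 20% losses):

    # Panel area implied by this comment's assumptions.
    panel_kw = 60 / 5                    # 12 kW of panels
    area_m2 = panel_kw * 1000 / 200 * 1.2
    print(panel_kw, round(area_m2))      # 12.0 kW, ~72 m2

    # Most of the gap with the acre estimate upthread comes from the
    # assumed daily charge (60 vs 100 kWh) and the land-vs-panel factor.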
One thing I like is that it's very targeted. I have a two-stage gas furnace, but I also have a leaky, draughty old century home. At night during the winter, it's cheaper to use space heaters in the bedrooms and let the main floor fall to 12°C.
What's funny is that where I live it costs about the same. Per joule of energy delivered to my living space, electricity is about 7x more expensive, but with electric heat I can target and heat only about 1/7th of my house. Electric still feels cozier, though.
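A toy check of why that comes out even; it assumes (my assumption, not stated in the comment) that heat demand scales linearly with the fraction of the house kept warm:

    # Normalized prices per joule delivered to the living space.
    gas_price = 1.0
    elec_price = 7.0 * gas_price        # "about 7x more expensive"

    gas_cost = gas_price * 1.0          # heat the whole house
    elec_cost = elec_price * (1 / 7)    # heat only 1/7th of it
    print(round(gas_cost, 6), round(elec_cost, 6))  # 1.0 1.0 -- a wash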
> The transition to EVs may start to soak up the gains we're making in efficiency, so usage may tick back up.
I share this concern, but my current best guess is that the EV surge won't be instantaneous, and during that ramp not only will efficiency be increasing dramatically (in terms of sharing, routing, and better hardware), but we will be swapping commutes for VR and moving to more walkable cities, such that if we do it right, net demand won't increase.
Also, as others point out, the transition to EVs "fills in the bathtub" more than it adds "new" demand. Over the course of a day there is huge demand for electricity during the day, especially the working day, and it drops off fairly sharply as people go to bed. This overnight trough is sometimes referred to as the "bathtub" in energy demand curves because that's what it looks like on the graphs. (It's often reflected in utilities' "off-peak" pricing and hours.)
The majority of EVs already charge overnight. A very rough napkin estimate I saw once suggested that, assuming the majority of charging continues to happen overnight in the bathtub, you could replace more than half of all cars on the road today with EVs and the electric grid wouldn't feel it from a capacity standpoint (i.e. no new coal/nuclear/hydro plants needed).
Throw in some of the proposed "smart grid" enhancements, where the grid has some leeway to manage how EVs charge and discharge, and things look even rosier. (These proposals include letting the grid balance overall demand by managing EV demand directly: "pausing" EV chargers during demand spikes, and even "loaning" power back from EVs' large batteries to meet those spikes, repaying it when demand drops again.)
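A sketch of that napkin math; every figure below is an illustrative assumption, not a sourced statistic:

    # Added overnight load if half the fleet charges in the trough.
    ev_share = 0.5
    us_vehicles = 280e6        # rough US light-vehicle count (assumed)
    miles_per_day = 30         # rough average daily driving (assumed)
    kwh_per_mile = 0.3         # typical EV efficiency (assumed)
    trough_hours = 8           # charging spread across the night

    extra_gw = (ev_share * us_vehicles * miles_per_day * kwh_per_mile
                / trough_hours / 1e6)
    print(round(extra_gw))     # ~158 GW of added overnight demand

    # That's on the same scale as the normal day-night swing in demand,
    # which is why overnight charging mostly just fills the bathtub.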
My office used to be nice and cozy in the winter because of the heat thrown off by computers and CRTs. These days, it's really not that different from any other room in the house.
Zero chance everyone starts wearing VR headsets for office work. First, there will never be a need for the majority of routine office work to take place in VR headsets (i.e. strapping smartphones to your eyeballs). Second, it would be very damaging to employees' eye health to do so (for at least half the work day). The legal problems alone guarantee it can never happen.
Long-duration gaming is still a niche and always will be. The average person doesn't game for hours per day and does not want to (because of the fundamental difference between passive and active entertainment). 5 to 10 hours per week is irrelevant in terms of increasing electrical demand from 4K gaming.
> Zero chance everyone starts wearing VR headsets for office work
I'd say unlikely, not zero. The assumption that everyone is going to strap on an Oculus for work is close to zero, but if VR finally takes off, the 4th- or 5th-generation systems will be smaller, lighter, cheaper, and better; that's just the way these things seem to go.
I'm old enough to remember when lugging a mobile around was seen as something that made you a bit of a twat, because they were huge, clunky, expensive, worked only intermittently, and had terrible range.
Now I have a £200 phone from 3 years ago (a Nexus 4) that is none of those things.
I actually think the first thing we'll see in workplaces is the first non-sucky AR systems; there are a hell of a lot of use cases when you combine them with things like facial recognition.
Receptionists, doctors, pretty much any role that deals with the public.
If they could display decent high-quality text and graphics, they'd be excellent educational and maintenance tools. Back when I worked as an electrician, I would have loved to be able to pull up building wiring plans while working on things.
In outdoor lighting they are too bright, and over-illuminate at night, but people use them anyway. They just turn them on, walk away, and blast their surroundings with obnoxious, eye-gouging light. Especially construction crews installing cheap temporary utility lights.
Many people install them in places where the lights point away from them and outward at their surroundings, so they don't have to suffer the excessive brightness themselves. It's like subjecting oncoming traffic to your car's extra-bright high-beams, except it's aimed at the house across the street, and it never goes away.
It's bad enough that vandalism is warranted. Outdoor LED lights are that bad.
> In outdoor lighting they are too bright, and over-illuminate at night, but people use them anyway.
I couldn't disagree more. Not only can you control the lumens to a much finer extent; as more cities retrofit to LEDs, you're seeing better illumination with less distortion and far lower light pollution.
There are striking images[0,2] from the LA Bureau of Street Lighting[1] that have made the rounds at a lot of lighting shows to prove this point, after it conducted the world's largest LED retrofit.
The Bureau got praise from the Dark Skies movement for reducing light pollution while increasing pedestrian safety and lowering energy costs.[1]
You might find a few people in Hollywood who are nostalgic for the hazy yellows of sodium, but most photographers are really struck by the improvement: "in nearly every case, better for cinematography."[2]
EDIT - On second read, you seem to be referring to the specific case of neighbors who don't understand how many lumens they need. It is easier to crank the brightness up with LEDs, that's true. But that's really a problem of people installing high-lumen fixtures for no reason; they could just as well install incandescent spotlights and be equally noxious. LEDs in general have so many upsides that attacking them broadly seems like throwing the baby out to spite your face.
Light is a crime deterrent, so I use LED porch lights and live next to a street lamp. One night, all my neighbors had their cars broken into, but the 2 I had parked in the driveway weren't touched. Your neighbors may have the same mindset.