At first I was like "Why would you do that for only $1m? If you had that big of a breakthrough, you could easily generate that (and then a lot more) by selling it yourself." Then I read that they aren't taking the IP, and are just giving you the cash as a pure incentive. They can publish your high level approach documents, but you still own the invention.
I wish more of these contests were run that way. I think they'd yield much higher-quality and more differentiated results, with a lot more entrants.
In general yes, it's a myth that grants are only for full-time academics. However this particular grant is indeed only for full-time academics, according to its terms: http://research.google.com/university/relations/littlebox.ht... Though the full RfP does also encourage groups of students to recruit a professor and apply as a team.
Incidentally, the RfP has a nice summary of what Google thinks the main engineering challenges in winning the award are. The first one on the list is finding a way to deal with 120 Hz ripple in a way other than the current solution of huge capacitors.
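To see why those ripple capacitors end up huge: a single-phase inverter's output power pulsates at twice the line frequency, so the DC bus has to buffer that energy swing every cycle. A rough sizing sketch, where the 2 kW load, 400 V bus, and 20 V allowed ripple are illustrative assumptions rather than contest numbers:

```python
import math

# Single-phase output power pulsates at 2x line frequency (120 Hz for 60 Hz AC).
# The DC bus capacitor must absorb an energy swing of roughly P / omega_line,
# giving C ~= P / (omega * V_bus * dV) for a small allowed ripple dV.
P = 2000.0                 # assumed average output power, W
V_bus = 400.0              # assumed DC bus voltage, V
dV = 20.0                  # assumed allowed bus voltage ripple, V
omega = 2 * math.pi * 60   # line angular frequency, rad/s

C = P / (omega * V_bus * dV)  # required bus capacitance, farads
print(f"required capacitance ~ {C * 1e6:.0f} uF")
```

Hundreds of microfarads at hundreds of volts is exactly the bulky capacitor bank the RfP is asking entrants to engineer away.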
Indeed. It's a much different prospect to have to not only develop some new piece of technology but also build a business around it and figure out how to transition away from your day job, versus being handed enough money to make a lot of those risks and problems go away if you're successful. It changes the risk/reward equation for someone talented enough to maybe develop this sort of thing in their spare time.
> (1) for the purposes of allowing Google and the Judges to evaluate and test the Device for purposes of the Contest, and
> (2) in connection with advertising and promotion via communication to the public or other groups, including, but not limited to, the right to make screenshots, animations and device clips available for promotional purposes
In other words, they only get rights related to this contest; they cannot do anything beyond the contest with your IP.
I originally read that part as applying only to the 'display' part of the rights clause. Reading it again, I think you are right - perhaps Google really is getting zero licensing rights out of the contest.
Although they do say in their FAQ: "Google is not requiring any IP or licenses be granted except a non-exclusive license to be used only for the purpose of testing the inverter and publicizing the prize."
I saw this earlier and briefly considered it. 50W/inch^3 is soldering iron level heat dissipation. And my take on it was that it really isn't possible unless you can cheat and have the "inverter" be the thing on the end of a solid copper bar that is sitting in ice water on the other end :-). So really they are looking for a 10X reduction in losses, which is to say taking something that is 90% efficient and making it 99% efficient. Even looking at the wide-bandgap semiconductors they reference on the web site, I'm having a hard time getting more than a few percentage points more efficient.
I think the power density they quote (50 W/in^3) isn't for dissipation, it's inverted power. I'm not an expert in electrical engineering, but I can't think of a fundamental physics limitation that's a blocker here. If there is one, I'd guess that it'll involve the electromagnetic radiated power of the device. From the specification document [1], they're asking for 95+% efficiency.
Thanks for the link; you're correct, they want to pull 2 kVA out of a box no bigger than 40 cubic inches. These guys (http://www.apparent.com/products/) have what I consider a very workable technology, basically 2W/cubic inch but one inverter per panel. Since you have to have panels anyway, having the panel produce power directly in the form you want eliminates the need for size, as the panel is compelled to be a certain size anyway.
But instead of changing the question (an age-old trick of engineers to arrive at a feasible solution :-), the push here seems to be about efficiency. 95% efficient would be a huge improvement.
Whatever they go with, the tech is going to make a really killer subwoofer amp.
Looking at the requirements (mostly the high DC input) it seems that an IGBT class-D would be the logical starting point, but those are hampered by a 150kHz-ish max switching rate. I think Don Lancaster's Magic Sinewaves (http://www.tinaja.com/magsn01.shtml) meet the distortion requirements while offering the slowest switching requirements.
>50W/inch^3 is soldering iron level heat dissipation
...well, 50W/inch^3 is the throughput. If you assume 96% efficiency (which is common for COTS inverters), you'd have 2W/in^3 in waste heat to dissipate. And 50W/in^3 isn't very high on a per-piece level, either. Here's a Vicor DC/DC converter that has 1240W/in^3 power density:
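Putting numbers on that, using the 96% efficiency figure above and the contest's roughly 40 in^3 / 2 kVA target mentioned elsewhere in the thread:

```python
power_density = 50.0   # W per cubic inch of delivered power (contest target)
efficiency = 0.96      # typical COTS inverter efficiency, per the comment above
volume_in3 = 40.0      # approximate contest enclosure volume, cubic inches

# Waste heat is throughput times (1 - efficiency), not the full throughput.
heat_density = power_density * (1 - efficiency)             # W/in^3 of waste heat
total_heat = power_density * volume_in3 * (1 - efficiency)  # total watts to shed

print(f"{heat_density:.1f} W/in^3 waste heat, {total_heat:.0f} W total")
```

So the thermal problem is 2 W/in^3 and about 80 W overall, not 50 W/in^3, which is why the efficiency spec and the size spec are really the same problem.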
I think that is likely the winning path. When I first looked at this that was my approach: basically a flat sheet of copper with the inverter laid out on top of it, a 10 x 16" copper heatsink with holes in it where vertical components sat, Kapton tape on the back flex circuit, and D9-packaged semiconductors on the other side. My battlebots motor controller (which really was just a 24V DC to 24V PWM converter at 200 amps[1]) was in the ballpark power-wise, but FETs don't really like operating at really high voltages, so I was looking at an IGBT version.
As I was more concerned with power dissipation early on, more surface area is great for convection.
Wouldn't it be more efficient to simply drive all our electronics equipment directly off DC? Almost all electronics devices nowadays run off 5VDC (USB) or 12VDC, solar panels put out 12VDC, and all those conversions seem like a waste of energy. What if you just ran our big appliances off 120VAC and ran all our small stuff off of solar directly, along with a battery backup? It seems like if a new wiring standard were developed that had both AC and DC distribution, it would greatly reduce the cost of installing solar. In fact, I believe it would be possible to put a DC bias on top of the AC (similar to the way old-time phone lines work). Just a thought: rather than shrinking the inverter, think outside the box and get rid of the inverter altogether.
There are a few reasons for not using DC until the last metre or so.
First of all, you want to avoid having multiple wiring systems in one building. Wiring a building for 110V alone is expensive enough (both initially and later during maintenance). In the vast majority of situations, you are going to have to pick one or the other.
So: AC or DC? Either way, you need to be able to run high power appliances (e.g. a 2000W kettle), and to keep losses in your cables low you need to use as high a voltage as possible, which will keep the required current as low as possible. So either way, in a realistic scenario, you need to retain the high voltages (110/230V) that we currently use.
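The arithmetic behind "higher voltage, lower losses" is just I^2·R; the 0.1 ohm run resistance below is an assumed figure for a house circuit, not a measured one:

```python
def cable_loss(power_w, volts, r_ohms):
    """I^2 * R loss for delivering power_w at the given supply voltage."""
    current = power_w / volts
    return current ** 2 * r_ohms

R = 0.1  # assumed round-trip wiring resistance, ohms
print(cable_loss(2000, 230, R))  # ~7.6 W lost feeding the kettle at 230 V
print(cable_loss(2000, 12, R))   # ~2778 W lost at 12 V: more than the kettle uses
```

At 12 V the wires would dissipate more than the appliance, which is why any whole-house system, AC or DC, has to stay at something like mains voltage.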
When you require high power / high voltages, AC is a better solution. It is easier to switch (your kettle switch will burn out much more quickly while trying to interrupt a high-current DC supply than it will interrupting an AC supply that crosses 0V 100 times per second). Overload devices work more reliably with AC for the same reason (which makes your house less likely to burn down). AC also makes it much easier to transform voltage levels: you just use an inductor or two. DC requires relatively complex electronics. DC distribution is also more complicated, since the distribution network and everything attached to it has parasitic (or deliberate) inductance, and changes in load produce voltage spikes.
There's nothing to stop you from installing a DC system in your house, but I think you would find it significantly less practical than you had imagined.
Minor nitpick: relatively few quality solar panels generate 12VDC. Most generate around 40V.
This is something that always caught my imagination: using DC for the last mile.
I don't understand electricity well, but it seems that electronic components require regulated DC, and producing that requires an AC somewhere in the transforming circuit. Is that correct?
In a sense, you're right. Although it's perhaps not the style of AC you were thinking of. Inside every DC-DC conversion circuit there's one or more energy storage elements charging and discharging repeatedly. This internal waveform is AC. It's not sinewave AC, but it's technically alternating current.
No, you can generate 'regulated' DC from another DC source (see for example buck converters or more generally any DC-DC converter).
Regulated really just means clean DC which stays at the same voltage and doesn't contain a lot of noise. Not all devices require a regulated supply. A drill, torch or heater will happily run on a noisy supply. Your iPhone will not.
You are right. I remember more details now. The problem occurs when you want to make regulated DC from a lower voltage DC supply. Step-down is easy.
For example, my battery is 12V, but my laptop needs 19V. AFAIK, you can't get regulated 19V supply from the 12v battery without a transformer (or maybe something equivalent).
Step-up is easy too. That's what boost converters (http://en.wikipedia.org/wiki/Boost_converter) are for. Most single-cell LED flashlights contain tiny boost converters to increase the 1.5v coming from the battery to above the drop voltage of the LED.
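The ideal boost-converter relationship makes both examples above concrete. This is the lossless, continuous-conduction idealization; the 3.2 V white-LED forward voltage is an assumed figure:

```python
def boost_duty_cycle(v_in, v_out):
    """Ideal CCM boost converter: V_out = V_in / (1 - D), so D = 1 - V_in / V_out."""
    return 1 - v_in / v_out

print(boost_duty_cycle(12.0, 19.0))  # ~0.37: the 12 V battery -> 19 V laptop case
print(boost_duty_cycle(1.5, 3.2))    # ~0.53: one AA cell up past a white LED's drop
```

The switch simply spends a fraction D of each cycle charging the inductor and the rest dumping that energy to the output, so no transformer is needed.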
2. ELIGIBILITY: To be eligible to enter the Contest, you must be: (1) above the age of majority in the country, state, province or jurisdiction of residence (or at least twenty years old in Taiwan) at the time of entry; (2) not a resident of Italy, Brazil, Quebec, Cuba, Iran, Syria, North Korea, or Sudan; (3) not a person or entity under U.S. export controls or sanctions; and (4) have access to the Internet as of July 22, 2014
I wonder why Italy, Brazil, and Quebec are included. The other countries are under special sanctions regimes already but I can't think of a good reason to exclude these three or why the contest would be considered illegal there.
Even contests in Canada are either in Quebec or outside it, for the most part. I don't know the specifics, but there is an extra regulatory hurdle of some sort in Quebec that isn't always worth it.
All contests in Quebec have to go through Loto-Quebec. They are a public organisation overseeing everything related to lotteries: casinos, raffles, scratch cards, bingo and all that. They must check every contest, and that is a lot of work. They ask for 5% of the main prize in a contest, even if it is an international event in which Quebecers have a slim chance to win. This policy was not a problem before the internet.
What's the consequence if a Quebecer participates in/wins a non-approved overseas contest? Clearly Loto-Quebec can't actually do anything to the contest runner; would they bring charges against the contestant, confiscate the prize money, what?
That doesn't answer my question. A non-Canadian contest runner has no legal reason to ban Quebecers, correct? Obviously it's not good practice to let people into your contest if their government forbids it, but some contest runners might lack integrity and others might lack the legal knowledge/experience to even be aware of the problem, so it's certainly possible for someone to at least attempt payout to a Quebecer. What happens then?
Seems like we just need to switch to DC already. 60hz AC is good for resistance heat and... that's about it. We are at a point now where we're creating DC on our roofs, turning it into AC to go through the walls of our house, then turning it right back into DC to charge our cars and power our other electronics. With losses and expensive hardware at each step.
Lower frequency (50/60 Hz) AC is the only really useful choice for transmission on low cost long runs (think 100 km and more). To push high voltage DC on long runs generally means special cables and possibly super conductors. Pushing 60 Hz AC at 230 kV goes over simple cables (granted with fancy insulators for holding it to the towers).
Stepping AC up or down in voltage is simple, build a transformer. Stepping DC up or down requires switching electronics, which usually will also have a transformer (if the step is reasonably large).
That power distribution inside houses is still AC is partly a legacy problem and partly because half the things in your house still use AC motors, and generally those are the big current consumers (air conditioning, fridge, clothes washer, etc.). Your PC, phone, etc., which run on DC, draw tiny amounts of power in the typical house compared to an air conditioning unit, but running an air conditioning unit on DC would likely require an inverter to generate the AC power for the compressors and fans.
Motors like AC, it's what makes them spin best. "Brushless DC" motors use an inverter system, usually. Even brushed DC motors effectively generate AC inside themselves with commutation.
> Lower frequency (50/60 Hz) AC is the only really useful choice for transmission on low cost long runs (think 100 km and more). To push high voltage DC on long runs generally means special cables and possibly super conductors. Pushing 60 Hz AC at 230 kV goes over simple cables.
Not correct. High-voltage DC runs just fine on regular copper. (In fact, in some cases you only need a single conductor because you can use the Earth as the return. This isn't typically in the design spec, but it's sometimes used as a backup plan.) Low-voltage DC requires superconductors, because pushing low-voltage anything any distance requires superconductors.
The only reason most transmission is over AC today is Tesla (the man, not the car) didn't have power electronics to step-up or step-down voltage. We now have the technology to change the grid to DC if we want.
> We now have the technology to change the grid to DC if we want.
True, but the cost of each endpoint is significantly higher for DC than AC, especially at high power. I would imagine the reliability and longevity of power-equivalent DC converters would also be lower than that of AC units. That is to say, a complicated electronic component with many critical parts is more likely to fail than a simple transformer.
Copper is quite expensive for long cable runs, but you're right that it's low-voltage DC that needs superconductors; high-voltage DC doesn't.
You can also use the earth as a return in an AC system. This is often done with more remote areas when distributing power in order to keep the cost down. It has downsides, but it works well enough.
> Lower frequency (50/60 Hz) AC is the only really useful choice for transmission on low cost long runs (think 100 km and more). To push high voltage DC on long runs generally means special cables and possibly super conductors. Pushing 60 Hz AC at 230 kV goes over simple cables.
I don't think that's true. The point about HVDC is that it is cheaper to run over long distances than HVAC (that's why they are planning to use HVDC for the new long-distance power lines in Germany).
Why would you need special cables for HVDC and not for HVAC? DC has lower losses than AC for the same current.
The only reason AC is used for transmission lines right now is that technology for HVDC wasn't quite ready/cheap enough. It is now.
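One concrete reason HVDC wins on long runs, under simple assumptions: insulation is stressed by the peak voltage, but AC only delivers its RMS value. For the same conductor and the same peak insulation stress, DC moves about 41% more power (the 500 kV / 1000 A figures below are illustrative):

```python
import math

v_peak = 500e3    # assumed insulation-limited peak voltage, volts
current = 1000.0  # assumed conductor current limit, amps

p_dc = v_peak * current                   # DC uses the full peak voltage
p_ac = (v_peak / math.sqrt(2)) * current  # AC delivers only RMS, at unity power factor

print(p_dc / p_ac)  # ~1.414, i.e. sqrt(2) more power on the same line
```

Add the absence of skin effect and reactive/charging current and the per-kilometre economics tilt further toward DC; the converter stations at each end are what you pay for instead.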
Why does inverter size matter? The inverter is already smaller than the battery or PV panel components, so it's not immediately obvious to me what groundbreaking new applications will be possible with an even smaller one.
If inverters were smaller and more efficient, they could be placed on a per-panel basis ("string inverters" or "micro inverters") for significantly better performance, particularly when some of the panels are shaded. See this paper:
> Panels commonly run at 12V (or some low multiple thereof),
That is absolutely not true. Generally speaking, nontrivially sized photovoltaic systems are designed with panels in series such that the voltage stays just barely within the 600V rating of the wire.
For example, grid tie inverter, MPPT rated 195-550v:
You're right, panels are generally ganged together in series to avoid the 'thick copper cable' problem.
However, the individual panel assemblies do run at lower voltages (individual cells run at the band gap of the semiconductor, 1 or 2 V).
It should be noted that placing the panels in series has a significant effect on panel performance when some of the panels are shaded (The entire string outputs at the rate of the shaded panel), so it would be much better to place panels in parallel when possible.
I guess you know that, but just to clarify, there already exist micro-inverters small enough to be installed behind the panel. (E.g. http://enphase.com/m250/)
We're in the process of installing PV right now, so I've been looking into this recently. The micro inverter solution is better when there's partial shading, but the micro inverters do not have higher efficiency than the string inverters. If anything, the assertion seems to be that they actually will have lower efficiency because they generally operate at higher temperature, being up on the roof.
And contrary to the idea about string inverters needing thicker wires: because the AC wiring runs at 240V but the PV DC cabling typically runs at 400-600V, you actually need larger wires for the micro inverter solution. (But in either case, resistive losses in the wires are pretty negligible, tenths of a percent in our case.)
Perhaps size is a proxy for cost? I imagine many cost factors scale with size, such as materials cost, shipping cost, installation cost, etc.
Or perhaps tiny inverters could be a good example of disruptive innovation, an invention that looks bad along traditional dimensions but opens up nontraditional applications.
Why is there any need to make it much smaller than the solar panels that will be providing the power?
I can see where ultra-thin (and flexible) would be a benefit, but why not allow the electronics to spread out over the entire area of the solar panels? The space is being used up already.
That gets rid of the super high power density problem.
The sun delivers about 1KW per square meter, so even if the solar panels were 100% efficient, you'd have an entire square meter of room for a 1KW inverter.
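A quick sanity check on that, using the 1 kW/m^2 insolation figure above; the half-inch electronics thickness is an arbitrary assumption just to get a volumetric number:

```python
SQIN_PER_SQM = 39.37 ** 2  # square inches per square meter (~1550)

areal_density = 1000.0 / SQIN_PER_SQM      # W per square inch if spread over 1 m^2
thickness_in = 0.5                         # assumed electronics thickness, inches
volumetric = areal_density / thickness_in  # effective W per cubic inch

print(areal_density)  # ~0.65 W/in^2
print(volumetric)     # ~1.3 W/in^3, versus the contest's 50 W/in^3 target
```

Spreading the electronics across the panel footprint relaxes the power-density requirement by well over an order of magnitude, which is the point being made here.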
They used solar panels as an example. If anyone succeeds in making much smaller inverters, the technology will be used everywhere.
I applaud Google for doing this! I think Google knows there are a bunch of undiscovered Einsteins in the world, and they're just using the Internet to find them.
I like contests like this. I thought Bill Gates' condom contest was a great idea.
Google are the sort of company who would rather have 5,000 crappy, unreliable, but efficient and tiny inverters attached internally to crappy, unreliable commodity servers than spend 5000x on a single, huge, expensive, only slightly more reliable inverter covering all 5,000 servers.
It's just another way of pushing unreliability to the network edge where it minimizes systemic effects and can be replicated away cheaply, much like they did with GFS or even their UPS system (at least previous server generations at Google included a large per-server battery).
Parent comment assumes one inverter per solar panel; my comment suggests it's more like one per 19" rack unit. Today's inverter vs. the kind of size they're looking for is a difference of perhaps 4-10TB per U, or approaching enough space for an extra half petabyte of spinning rust per rack.
Switching power supplies were not needed for consumers until the labor and material cost of linear supplies became exorbitant enough to drive all manufacturing overseas.
After that, freight costs were still on the rise, and soon it became more economical to expensively engineer a lighter, smaller replacement. Despite orders of magnitude more complexity, switchers replaced simple transformers.
Inverters are like that too.
At 50W/inch^3, one ocean container of inverters will handle as much power as two ocean containers at 25W/inch^3.
Surely we need to do whatever we viably can to make solar as inexpensive and practical as possible, so I think the hope is that a smaller inverter means higher efficiency and lower production costs.
I might be completely wrong. Still, I'm sure we all agree that it's not ever a bad idea to incentivize any sort of energy innovation.
My exact question. This whole thing has an emperor's-new-clothes feel about it. Why is an inverter the size of a cooler an issue under any circumstances?
Already done. When I worked in EVs we used a HybridPack2 power module from Infineon. It's about the size of a sandwich but longer, and skinnier. You add a driver board, a logic board, a capacitor, connectors, cold plate. It's about the size of a shoe box and can deliver 100kW continuously. I pushed one under ideal conditions to 200kW.
Of course, liquid cooling means a total system that is quite a bit larger than I describe. In order to get rid of liquid cooling at that power level you'd have to get the losses down by a huge margin. We were dissipating 2-3kW at high power, so for air cooling you'd need to get that down by a factor of at least 10. The only way to drive the heat down like that is at the semiconductor device level.
This is a challenge that everyone in the field is already aware of and working on, while people outside the field have no ability to do meaningful research.
At the small scale, an Arduino with the mega-moto shield can push some hundreds of watts in a few cubic inches. So what exactly is the challenge?
Typically you want AC on the input or output and DC on the other. You can use that shield to produce AC by driving the outputs with PWM (it is intended for that). If there was a 3rd channel on the mega-moto, you could drive a 3-phase AC motor with it. If you put a transformer on it, you can connect it to the line and do DC->AC or AC->DC. Normally we want the DC voltage to be higher than the AC, so it's not the ideal shield for line connection - hence the transformer instead of simple inductors. And yes, the inductors add to the size, but if you are driving an inductive load like a motor, they are normally not needed.
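A minimal sketch of the sine-PWM idea described above, with Python standing in for the microcontroller code; the 32-step, 8-bit resolution is an arbitrary choice for illustration:

```python
import math

STEPS = 32  # samples per 60 Hz output cycle; a real design would use many more

# 8-bit duty-cycle lookup table: mid-scale (~128) is 0 V after filtering,
# 255 is full positive swing, 0 is full negative swing.
duty_table = [
    round(127.5 + 127.5 * math.sin(2 * math.pi * i / STEPS))
    for i in range(STEPS)
]
# A timer interrupt would step through duty_table at STEPS * 60 Hz, loading each
# value into the PWM compare register; an LC low-pass filter on the H-bridge
# output then recovers the 60 Hz sine from the switched waveform.
```

This is all a shield like the mega-moto's H-bridge needs from the software side; the hard parts, as the thread notes, are the output filter and handling real line voltages.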
I actually worked a little on this problem some years ago (as a lab tech; take my knowledge with a grain of salt).
One of the largest components in an inverter (such as found in a Toyota Prius) is the capacitor bank. I'll ignore the electrical design and just assert you need X capacitance to get this done. At least in the automotive world, polymer film capacitors are used for this purpose. A polymer film capacitor is made from a (very long) sheet of polymer coated on both sides with a thin layer of metal and rolled up into a cylinder. They get quite bulky at the capacitances required by these inverters. The other downside is that polymer film cannot handle high temperature. I believe the Prius includes a whole extra cooling loop (in addition to the main loop attached to the engine, which runs hotter than the capacitors) to keep the capacitors cool, so that's even more bulk.
A multi-layer ceramic capacitor of similar capacitance can be much smaller, and can handle far greater temperatures than any polymer. The reason why polymer film is preferred is that when ceramic capacitors fail, they do so catastrophically, in much the same way as a ceramic dinner plate shatters. At sufficient voltage, or at lower voltage with a sufficient defect, the ceramic will break down: a conduction path will form between the electrodes through the ceramic, which will heat the surrounding area, causing thermal expansion, shattering, and permanent destruction of the capacitor.
The same thing happens in polymer film capacitors, except that because the material is flexible, it does not shatter, and only a small hole around the defect will be ablated away. The remaining capacitor loses some capacitance, but otherwise functions normally.
So one way to create a smaller inverter is to use smaller capacitors, but you've got to match the capacitance and voltage-handling ability while still failing gracefully.
Increase the frequency, yes. That's what I thought as well.
But then you run into other issues. The higher the frequency, the less "neat" the up/down transitions are, so your power elements (MOSFETs or whatever) spend more time in that twilight zone, which is exactly where they dissipate the most power. And you want to avoid that.
Anyway, it's worth investigating along these lines.
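The "twilight zone" cost can be estimated with the usual hard-switching approximation, P_sw ≈ ½·V·I·(t_rise + t_fall)·f_sw. The bus voltage, current, and 100 ns transition time below are illustrative assumptions, not contest figures:

```python
def switching_loss(v_bus, i_load, t_transition, f_switch):
    """Hard-switching loss estimate: 0.5 * V * I * (rise + fall time) * frequency."""
    return 0.5 * v_bus * i_load * t_transition * f_switch

V, I, T = 400.0, 10.0, 100e-9  # assumed bus volts, amps, total rise+fall seconds

print(switching_loss(V, I, T, 100e3))  # 20 W at 100 kHz
print(switching_loss(V, I, T, 1e6))    # 200 W at 1 MHz: loss scales with frequency
```

Since loss grows linearly with frequency, cranking the frequency to shrink the passives only works if you also shrink the transition time (wide-bandgap devices) or switch at zero voltage/current (soft-switching topologies).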
Anyone have any ideas why they highlight only "wide bandgap device manufacturers"? I hope they'd accept a winning solution with different tech, but surely there are other possibilities they could mention right at the start?
Some solutions to the problem will utilize a high frequency, high voltage switching inverter. The performance of this type of inverter is limited by the losses generated by the devices making up the inverter. The wide bandgap devices (switches in essence) offer the lowest losses for this type of topology. Furthermore, some novel inverter designs work only if they operate at very high frequencies. Only the wide bandgap devices can switch at these high frequencies.
Wide bandgap transistors have two relevant advantages: they work at higher temperatures, so you need less cooling; and they can switch higher voltages, so you don't need a bulky transformer. The only downside is, they are more expensive.
Assuming one was starting from scratch and didn't care whether the available appliances of the day required AC or DC power, but all the power coming into the home was solar, what would the motors on the appliances look like? Would it still be desirable to use AC motors, and if not, would it be practical (other than for the obvious reasons) for appliances to use a standard DC voltage?
I don't mean to suggest that we abandon ac powered appliances, I'm just curious about what electrical wizards would come up with, if they were doing it all over again.
Motors would almost certainly still be AC. They are way more efficient, cheaper to build, and require almost no maintenance. DC motors do not have those same characteristics.
But they'd probably run on a higher voltage and frequency.
"Brushless DC" motors seem to be taking over due to switching electronics dropping in price, no? And while those are technically AC motors, it's not the kind of AC that comes right off the line.
Motors that run on constant 60Hz seem to be a historical shortcut, whose demand is fading as the control benefits of variable frequency drive are available for less and less. And if HVDC transmission is gaining popularity, then how long are utilities going to keep doing the conversion to AC "for free" ?
It seems to me that if we were in a bizarro world where common end-user power had always been DC, every motor would just be paired up with an appropriate driver circuit, even designed around the specific inductance of the motor. With solid state circuitry, all house fans would be infinitely variable, etc.
Of course there's a huge installed base of a few types of items that would need 60Hz backwards compatibility. I get a good chuckle from thinking about legacy clocks requiring an inverter that contains a high-accuracy crystal - maybe that inverter could even run ntpd.
I looked into this a while ago, when I was replacing the pump on my swimming pool. This is a bit of a special case, as there are gains to be made from using a variable-speed drive. While a variable-speed drive may increase electrical losses, the slower water flow may reduce losses due to turbulence by a greater amount, leading to a net increase in efficiency per volume of water moved.
As far as I can gather, for variable speed motors a brushless permanent magnet DC motor is more efficient for small power applications (< 1-2kW), but as the power goes up, a high efficiency three-phase induction motor with a variable-speed drive become more efficient than the DC motor. A high efficiency induction motor has extra copper in the rotor, to reduce resistive losses.
For fixed speed applications, you'd think the above variable speed performance would reflect the performance for a DC supply, as the DC supply requires switching in both cases. For a three-phase AC supply, you'd think the induction motor would win, due to the absence of switching.
Could someone explain, in layman's terms, what the difficulty in building a smaller inverter is? I unfortunately paid less attention in high school Physics than I wish I had.
The output of a solar array is a DC voltage (constant over time). Our homes are fed by an AC voltage (varying sinusoidally over time at a frequency of 60Hz).
A circuit is needed to convert the DC to AC. There is a loss in energy due to the functioning of the circuit. The circuit size and complexity depends on the specifications of the DC to AC inverter including the maximum power capability desired.
Traditional converters operate at low frequencies and lose a lot of energy due to the technological limitations of the semiconductor switches used. The switches essentially chop the DC input into a square-wave type output at a frequency in the low kHz range. This square wave output needs to be low-pass filtered to allow only the 60Hz to propagate through to the inverter output. For a low-kHz square wave, the inductors and capacitors used to make the low-pass filter are large.
New semiconductor technology has resulted in switches that can operate at MHz frequencies. The inductor and capacitors used to make the low pass filters can be much smaller for MHz frequencies. These switches also have much lower conduction losses than the previous silicon-based switches but they need to be used in more novel topologies in order to minimize what are called switching losses.
To see a real-world example of what improvements can be made with the new semiconductor technology, compare the brick power supplies that come with our laptops to the much touted FINsix Dart (http://finsix.com/dart/). The latter uses new GaN switches that operate in the MHz range AND a novel topology that minimizes switching losses.
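The component-size payoff is easy to quantify: for an LC low-pass designed against a fixed load impedance, both L and C scale as 1/f_cutoff, and the cutoff tracks the switching frequency. Illustrative numbers with an assumed 10 ohm load:

```python
import math

def lc_filter(f_cutoff, z_load):
    """L and C for a second-order LC low-pass with resonance at f_cutoff
    and characteristic impedance sqrt(L/C) equal to z_load."""
    L = z_load / (2 * math.pi * f_cutoff)
    C = 1 / (2 * math.pi * f_cutoff * z_load)
    return L, C

L_slow, C_slow = lc_filter(2e3, 10.0)    # cutoff sized for a low-kHz switcher
L_fast, C_fast = lc_filter(200e3, 10.0)  # cutoff sized for an MHz-class switcher

print(L_slow / L_fast)  # 100x smaller inductor at the higher frequency
print(C_slow / C_fast)  # 100x smaller capacitor, too
```

A 100x jump in switching frequency buys a 100x reduction in both filter components, which is essentially the whole argument for the MHz-range GaN designs mentioned above.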
Maybe a stupid question: would this work? Create a small electric motor, mount one coil on the rotating wheel and mount another on the stand. Now put DC on the fixed coil and AC would be generated on the other side (like in a transformer)... Or will this not work? :-) (I am not an electrician.)
I know, I know - mechanical parts are not optimal - and also there are losses for electric motor - but the size is in question here...
I don't know what the limits on efficiency are, and the Wikipedia article mainly addresses AC-to-DC conversion rather than DC-to-AC, but I assume simply from the fact that they aren't used these days that they aren't an improvement on solid-state inverters.
Not sure, but some Googling reveals that these three are commonly excluded from contests of all sort. They probably either prohibit all forms of contests and sweepstakes, impose taxes, or regulate them so tightly that it's better to just not bother.
Also note that it may not just be a function of the law, but also of the benefit gained by offering a contest in a given jurisdiction. I'm seeing some indications, for example, that Japan and Brazil have some similar sweepstakes regulations, but it may be more worth the organizers' time to comply with Japanese law than Brazilian. Just a guess.
http://en.wikipedia.org/wiki/Sweepstakes ("There are similar laws in Brazil, where sweepstakes must include a "cultural contest", often giveaway questions like 'which brand gives you a house?'")
http://www.theglobeandmail.com/report-on-business/small-busi... (In Quebec "contest runners have to pay tax on the value of the prize. For another thing, contests with prizes over $2,000 have to register their rules with a government agency, the Régie des alcools des courses et des jeux . . . . To top it off, contests with prizes worth more than $5,000 actually have to deposit an amount as a security with the Régie, as a means of protecting consumers should the contest runner fold or renege.")
"the province’s Lotteries Act ... require you to post security for contests open to Quebec residents where you do not have a place of business in Quebec, the value of any single prize exceeds $5,000 or the total prize value exceeds $20,000."
I'm just tickled by how many commenters have a totally obvious solution to this problem. Surely, the hundreds (thousands?) of experts at Google, the IEEE, and the ~8 manufacturers who put this contest together are just fools who couldn't come up with such amazing, brilliant ideas themselves.
Congrats in advance, and enjoy your million bucks!
I have no idea what you're talking about. I see only two posts suggesting anything close to solutions. One of those posts named an existing product with a probable misunderstanding of the details of the requirements. The other post said that we should avoid AC and not need an inverter. Neither said they had a plan that could win the competition.
I don't see a single post that fits your description. Am I looking in the wrong spot? Are you being baselessly condescending?
(Shrug) There is no good solution, because it's the wrong problem. We shouldn't be using AC at the home/light-industrial level at all. No matter how efficient the inverter is, it's going to be constrained by the inefficiency of putting switching regulators in everything from wall warts to machine tools.
That's the part of the situation that needs to change, but of course it's the chicken-and-egg problem from hell...
DC in the house doesn't remove the need for switching regulators - you still need to step down from the house voltage to the device voltage. Don't forget that devices that accept 5V or 12V from a wall wart almost all still have switching regulators to step down to 3.3, 1.8 or less. It does remove the need for transformers and rectifiers, which would buy us some efficiency.
And don't forget that if the house DC supply were at less than 120V, you'd lose more power in the house wires, because the electrician who wired your house was too cheap to buy superconducting romex.
If your house is wired with the lights and the outlets on separate circuits (mine is, because I'm an EE and I built the house), you could change out all those lights to low-voltage LEDs and supply 12-24VDC to them and light up your house just fine on the 14-gauge copper romex that's in your walls now. This works because the required current would still be well below the 15A at which the wire is rated. (Getting your electrical inspector to approve it might be another matter, though.)
In some cases, this trick would work for the outlet circuits too because most of the things we plug into outlets now are low-voltage DC wall warts, and they could be replaced with low-voltage DC-DC converters. The problem is that your refrigerator and your dryer are not run by wall warts, so outlets are tougher to convert to DC than lights.
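The ampacity-and-voltage-drop reasoning above can be sketched in a few lines. This is a rough sanity check, not an electrical design tool: the 14 AWG resistance figure is a standard textbook value, and the 100 W / 24 V / 15 m load is a hypothetical example, not anything from the comment.

```python
# Rough sanity check for running low-voltage DC lighting on existing
# 14 AWG romex, as suggested above. The resistance constant is a
# standard approximation; the example load is an assumption.

AWG14_OHMS_PER_M = 0.00827   # approx. resistance of 14 AWG copper, ohms/m
AMPACITY_A = 15.0            # typical branch-circuit rating for 14 AWG

def circuit_check(load_w, supply_v, one_way_m):
    """Return (current_A, drop_V, drop_pct) for a simple two-wire run."""
    current = load_w / supply_v
    loop_m = 2 * one_way_m                  # conductor runs out and back
    drop = current * AWG14_OHMS_PER_M * loop_m
    return current, drop, 100 * drop / supply_v

# Hypothetical example: ~100 W of LED fixtures at 24 V, 15 m from the supply.
i, vd, pct = circuit_check(load_w=100, supply_v=24, one_way_m=15)
print(f"{i:.1f} A (limit {AMPACITY_A} A), drop {vd:.2f} V ({pct:.1f}%)")
```

At these numbers the current is well under the 15 A rating and the drop stays below 5%, which is why the trick works for lighting but not for heavy loads.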
True, I should've written "rectifiers and filters." The inductor(s) are most likely still going to be needed, but you can get rid of the diode and filter capacitor losses on the primary side.
That's likely true, since DC motors need either permanent magnets or energy-wasting field coils. But motors are only a part of the energy-usage picture.
Ideally, we'd distribute "last mile" power as DC, and devices that really need AC would handle the conversion themselves. For instance, a polyphase inverter that's designed to drive a specific motor would be more efficient than a general-purpose single-phase inverter of the sort being discussed here. So even motor-drive applications could still be a net win for DC distribution.
AC gives you cheap synchronous magnet motors. These are what you find in most off-the-shelf power tools. But they can't be electrically speed-varied, and they don't produce max torque from a standstill (treadmills, for example, use DC motors for this reason).
It's not a show-stopping constraint anymore - so much so that a ton of electrical appliances can happily run directly off of 200-300V DC, because it just bypasses their internal rectifiers.
I would suggest that the most obvious reason for putting this contest up is because you do have a whole bunch of ideas for solving the problem, are pretty convinced that the problem can be solved, and would like somebody to go and spend the few years of engineering work needed to turn one of those ideas into a working product.
I don't think this is the next X prize. I think this is closer to an open contract for somebody to build the damn thing, when that is a somewhat risky venture that none of the people offering the prize feel they have the ability to execute.
There are all sorts of specifications/requirements listed (box size, ripple allowed, EMI limits) but the most interesting that is not mentioned is cost. There is no upper limit set on the BOM cost.
There's the oft-repeated engineer's mantra: "quality, price, schedule: pick two."
The contest has picked quality (the engineering requirements are not easy to hit) and schedule (there's a timeline for demoing). I bet it's expected that the cost, even if it's high right now, will only come down over time. But since there exist 0 inverters which can do this today (presumably), cost isn't a big concern if you can do something new and novel that's never been done before.
Likely peak and typical power output (an order of magnitude difference), as well as waveform (automotive inverters tend to output square-ish waveforms).
The simplest solution (just give me the $1mil now) is to cut out the middle man. I mean, there is a needless conversion here from DC to A/C, and then back to DC. Not that many devices need A/C these days - maybe just your alarm clock, if it's cheap enough (since cheap alarm clocks use the alternating current frequency for keeping time, instead of a precise resonating crystal.)
Example of what they have now: [solar-dc] -> [inverter] -> [ac/dc transformer] -> [device].
Cut out the inverter, the ac/dc transformer and you have:
[solar-dc] -> [device]
Required materials: wire cutters, cheap voltage regulator IC, some wire. Done. I'll take a cashier's check please.
People often say this without thinking about all of the details here. The largest power loads in your house (and in the world, period) are motors which are normally AC so you may not have as much of a benefit as you think.
Also, there are other small details to consider such as the safety of switches and plugs. When you disconnect AC it will arc and then self extinguish when it crosses 0V. DC will keep arcing longer than AC will making it a bit less safe. Just keep a lot of that stuff in mind.
Big appliances are already big, justifying putting an inverter on them if they can't be converted to DC. An inverter of appropriate size and power, of course, rather than a single mega one.
My argument is that this contest is a bandaid on an outdated way of thinking about power. Yes, safety is important. However, in your example we don't have to use physical-contact plugs. There are plenty of devices that can be powered or charged via induction.
Each element on an electric stove is roughly 2 kW. Let's say you have a 4-burner stove, and they're all on. That's 8kW of power.
Let's say we run your 'simplified' household at 48V (the highest voltage commonly used in boat/cabin DC systems). To run 8kW on 48V you need 167 Amps. For 167 Amps you need something like 2/0 wire. That's really thick and heavy cabling!
Can you imagine replacing all of the wiring in your walls with this stuff?
That reminds me of a funny story: a friend was working on some servers in a big telco and saw two guys carrying something that looked like a heavy, thick pipe, and a big hammer. "Hey, what's that?" "Power cable for the UPS [one that kept the whole building up]" "And what's with the hammer?" "Well, we have to bend the cable..."
This is exactly the issue. To deliver the power needed at such low voltage, you need thick cables. And thick cables are expensive in materials and installation. Can you imagine a typical j-box or load center with, say, a dozen of these bad boys coming in and leaving?
Due to the size, the changes needed would not be confined to the electrical system -- you'd have to figure out a way to maintain structural strength and fire safety despite the large openings for cables.
It's possible to imagine houses wired with low-power DC and high-power AC (both), but it'll never happen in my lifetime, hence a general lack of enthusiasm.
Erm, you can wire enough storage batteries in series to make double-ended DC at ±120V, which will actually work just fine with existing electric stoves / water heaters. Yes, there will be some new safety issues, and it will take the electrical code a while to catch up for safe residential installation (hopefully they'll deprecate romex!), but these are not insurmountable.
No, I certainly can't imagine doing that. However, a person doesn't typically cook 12 hours a day. Instead, they cook in bursts, so it makes sense to put a bank of batteries in or near the stove (firewalled of course), and trickle-charge them with normal cables.
No, it doesn't make sense to do that. The nice, efficient system we have is far superior to having everyone invest in expensive, fragile batteries everywhere and never be able to cook more than 30 minutes without recharging for 12 hours.
The "Battery Bank" appliance, with no user serviceable parts inside, and consuming valuable real estate in my kitchen? I'll pay the extra few bucks per decade to not have that in my kitchen.
Solar power is one thing. Electric cars are another thing.
Power transmission and the whole grid are based on AC. All the devices you buy are based on AC. The reason is that transmission of three-phase AC is much more efficient than DC. DC also has other side effects, like not being potential-free (hard for me to explain in English, as I am German).
But to put it simply: beyond the electrical details, the whole world is based on AC. So redesigning things like TVs, dishwashers, computers, etc. just for the few people who run their own systems is economically not possible. (The chicken-and-egg problem.)
The other thing is that everything that runs with a drive these days, like an electric car with a permanent-magnet synchronous drive, needs to convert DC to AC to run the drive. The AC frequency defines how fast the drive runs.
So even if the first thing changes, the second (running an electric drive) will not.
Apart from those points, I find this contest interesting, because in the industrial world where I work, this kind of problem is addressed every day in electrical engineering.
EDIT: for the grid, it is not only solar; wind energy is also important. Wind parks are the next big source of renewable energy, in Germany (of course), but also in Spain, China, and the US. GE Wind Power is very strong in the US, AFAIK. With wind power, AC is generated at a frequency that depends on the wind. So this AC is converted to DC, which is then converted to AC synchronized to the grid. Actually, the market leader for those inverters is based in Nuremberg, Germany, where I am from; AFAIK they have more than 60% of the market. Moving such high power requires 3 or more inverters in parallel, but running such things in parallel is a pain in the ass, because they have to run with less than 1 ms synchronization accuracy; otherwise the whole system will burn.
I actually thought transmission of DC was more efficient than AC, because you don't have inductive losses (hence HVDC underground cables etc) but that the issue is more the ease of AC transmission, since building transformers is easy but converting from AC to HVDC and back was difficult before the advent of switching converters.
Correct. HVDC is significantly more efficient than AC transmission. The problem before modern power semiconductors was always that converting voltages levels was difficult. The Pacific DC Intertie had to use huge Mercury Vapor Rectifier TUBES.
> Even for a frequency as low as 60 Hz the skin depth is less than the 1.6" radius of the conductor used for the Intertie. Hence the effective resistance is greater with AC than DC, so that more power is lost to heat. A DC line is also ideal for connecting together two AC systems that are not synchronized with each other. Also, cascading blackouts are less likely.
Interesting point that this was the longest HVDC transmission in the world until the Three Gorges Dam projects.
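The quoted skin-depth claim is easy to check with the standard formula δ = √(ρ / (π f μ)). Material constants below are standard textbook values for copper; the conductor radius is the 1.6" figure from the quote.

```python
# Skin depth for copper at 60 Hz, to sanity-check the quoted claim that
# it is smaller than the Intertie conductor's 1.6" radius.
import math

RHO_CU = 1.68e-8           # copper resistivity, ohm-m
MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m (copper mu_r ~ 1)

def skin_depth_m(freq_hz, rho=RHO_CU, mu=MU0):
    """delta = sqrt(rho / (pi * f * mu))"""
    return math.sqrt(rho / (math.pi * freq_hz * mu))

delta = skin_depth_m(60)
radius = 1.6 * 0.0254      # 1.6 inches in meters
print(f"skin depth {delta*1000:.1f} mm vs conductor radius {radius*1000:.1f} mm")
```

At 60 Hz the skin depth comes out around 8-9 mm, well under the ~41 mm radius, so most of a conductor that thick carries little AC current - consistent with the quote's point that DC uses the full cross-section.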
Most residential solar installations in the US are grid tied, meaning that excess generation heads out onto the grid (this is counted as a credit under net-metering regimes). This explains why AC inversion is required. Since most solar production is midday and most household electricity use is in the morning and evening, this arrangement allows for a net-benefit to the homeowner (and to some extent the utility since daytime electricity is more expensive for market-based customers than nighttime electricity for regulated-rate-payers). Otherwise you have to store the electricity and that is expensive and very inefficient.
There are other concerns as well (e.g. no standard DC outlet, most AC appliances with DC motors would have to be modified, there would then be two electrical standards in the house since DC is still not suitable for most transmission or distribution needs, firefighters would need to be retrained, etc).
And people have talked about that, or just having some sort of dual track with an inverter/UPS for the entire house and some sort of switch between solar and outside electric.
One problem is that we would have to create a whole new standard of how to connect DC appliances. Now I know on a smaller RV level that exists but we need something that can take more juice if I remember the conversation correctly.
No, I get it. Tri-phase is name of the game in the old transmission model. But that model is long overdue for a rebuild. Remember the 2003 blackout? Managing and synchronizing grids is more than a full time job, and this contest seems to promote a bandaid.
Anyhow, my point is that in the localized-power game (say solar cells, fuel cells, etc) it's all DC already, and it's all right near where it needs to go. Fuel cells are near cars, and solar panels are right next to what they need to power: TVs, etc.
Complications are airconditioners and big appliances, except those things are already massive enough that it makes sense to install an inverter next to each one.
That DC might as well be AC since it is not as stable as you'd like. Consider this, you have a DC source 5-15V and want to run a usb gadget from it. You still need a regulator between the wall plug and the device. Why not make it compatible with AC grid so your kitchen appliances don't need their own inverters each?
We almost went with DC. So yes, DC would have been great, but the politics at the time went with AC. I'm just an electrician, and the only downside to a DC electrical grid is voltage drop. Maybe an electrical engineer can chime in?
DC is not synonymous with low voltage - voltage drop is a red herring. Problems with DC that I can think of off the top of my head:
- Utility companies need to change out pole pigs for DC-DC converters.
- When electrocuted, muscles grab and lock up, rather than a mere painful buzzing.
- Lower mechanical switch ratings (no zero crossing for arcs to self-extinguish). Check out the printed DC ratings on a listed switch some time.
- Corrosion on exposed conductors due to constant potential difference.
- Low level magnetization of things next to power conductors.
- General disruption and uncertainty that change brings. I bet you could annotate most clauses in the NEC with the incident that prompted its addition.
- Nikola Tesla may wake up and finally use that death ray.
Anyone who has lived off the grid understands that an inverter is the least efficient solution to the problem of running consumer devices from a DC source. We laugh at the newbies running an inverter to supply a laptop PS that takes AC right back to DC. Hopelessly inefficient.
DC-DC is the way to go, or else if you need higher voltage, take an auxiliary feed from the charge controller, since most solar puts out 21-25VDC anyway. For more efficient and powerful motors, use series battery banks. Duh.
This is just another way of pandering to the people who do not understand efficiency and who are locked into the idea of "house current", in other words, dinosaurs.
Our industrialized world is so inefficient that we throw away about 80% of generated power. What a holocaust for the natural environment! When you go off-grid, that just won't fly, because no one wants to upsize their generation capacity five-fold to run some inefficient consumer device, except for the aforementioned newbies who have yet to notice an open artery.
Inverters simply extend the inefficiency of the consumer experience to alternative forms of power generation. The smart solution is not to make the inverter smaller but to lose it entirely.
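The "lose the inverter" argument boils down to multiplying conversion-stage efficiencies. The figures below are assumed round numbers purely for illustration, not measurements of any particular hardware.

```python
# Illustrative comparison of the two conversion chains described above.
# Efficiency figures are assumed for the example, not real-device data.

def chain_efficiency(*stages):
    """Overall efficiency of conversion stages in series."""
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

# Battery -> inverter -> laptop AC adapter (DC -> AC -> DC)
via_inverter = chain_efficiency(0.90, 0.87)
# Battery -> single DC-DC converter straight to the laptop's input
via_dcdc = chain_efficiency(0.92)

print(f"via inverter: {via_inverter:.0%}, via DC-DC: {via_dcdc:.0%}")
```

Whatever the exact numbers, the structural point holds: every extra conversion stage multiplies in another loss, so a single DC-DC stage beats a DC-AC-DC round trip.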