Given the relative simplicity of scaling down existing designs for lower current ratings, one would assume that Google is looking to achieve reduced heat dissipation (<=> increased conversion efficiency) from existing designs at a given current rating, resulting in a higher kVA/m^3 value.
There are a few immediately obvious ways to attempt this, but all of them involve trade-offs in cost, THD, or operating flexibility. It'll be interesting to see what the full requirements are.
There was a HN article about "Dart" (and their Kickstarter) about a month ago. They're building vastly-smaller switching power converters in the VHF frequencies (upwards of 300MHz). I'm not sure about the validity of their claims, though I believe the tech originated at MIT. This could be one compelling route?
I believe you have it backwards: Dart is a switched-mode power supply (SMPS), which is sometimes called a converter; things which take DC and return 90-240 VAC are called inverters (not to be confused with logic inverters).
A typical inverter uses MOSFET or IGBT transistors to switch a DC voltage, which is then fed through a capacitor as a form of isolation. Typical inverters are used on solar panel systems to convert the PV cells' energy into AC, in battery-backed AC systems (uninterruptible power supplies), or to turn single-phase into multiphase AC to drive AC motors efficiently.
One of the reasons they are "big" is that they typically operate at 50 or 60 Hz, and at those frequencies, if you are using magnetic fields for isolation (like you would if you drove one side of a transformer), the transformers are annoyingly large and hard to make efficient. [1] Many modern inverters start with 280-480 V DC and use a series of transistors to create an approximation of a sine wave (this is how the cheesy plug-into-your-car-lighter inverters usually work). Once you get above 100 W it starts to get a bit more difficult to do cheaply and with reasonable efficiency.
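To make that sine-approximation idea concrete, here's a minimal sketch of sine-triangle PWM, one common way it's done; all component values are illustrative, not from any particular design:

    import math

    F_LINE = 60.0      # target output frequency, Hz
    F_CARRIER = 20e3   # switching (carrier) frequency, Hz -- illustrative
    V_BUS = 340.0      # DC bus voltage, V -- illustrative

    def bridge_state(t):
        """Compare a sine reference to a triangular carrier; the result
        (+1 or -1) is which way the H-bridge transistors connect the DC
        bus to the output at time t."""
        reference = math.sin(2 * math.pi * F_LINE * t)  # -1 .. +1
        phase = (t * F_CARRIER) % 1.0
        carrier = 4 * abs(phase - 0.5) - 1              # triangle, -1 .. +1
        return 1 if reference > carrier else -1

    # Averaging the switched output over one carrier period recovers the
    # 60 Hz sine; in hardware, the output filter does this smoothing.
    t0, n = 1e-3, 1000
    avg = sum(bridge_state(t0 + i / (n * F_CARRIER)) for i in range(n)) / n
    print("duty-averaged output: %+.0f V" % (avg * V_BUS))
    print("ideal sine value:     %+.0f V" % (V_BUS * math.sin(2 * math.pi * F_LINE * t0)))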
Efficiency drains are also present in stepping up the voltage (whether you're using a boost switching circuit or a simple diode/capacitor charge pump). So getting these things to be efficient is hard, and they are of course generally fairly large per watt.
I suspect Google is looking for something to invert PV solar arrays, but high density power conversion is always valuable.
[1] That said, a lot of people made high-voltage supplies by using a 555 to switch a transistor on and off, which fed the 'low' side of a power supply transformer. I had a Xenon strobe circuit that did that; it made a nice little 600 V supply.
I'm not saying that Dart's technique is directly applicable. What I am saying: Fundamentally rethinking assumptions (like operating frequency) is probably critical to a breakthrough in inverter design.
BTW.... Dart and others (like the iPad recharger [1]) typically convert AC -> HV DC (rectifier) -> flyback (at tens of kHz) -> low-voltage DC. The intermediate conversion to DC followed by "chopping" at a higher frequency on the flyback transformer allows designers to use smaller magnetics than what would be required of "classic" 50-60 Hz wall warts. I'm sure you're already well aware of all of this given your comment. But so am I (despite the comments suggesting I'm misunderstanding converters-vs-inverters). ;-)
If the Google call is for energy generation, then there's also the added difficulty of maximum power-point tracking as well...
> Fundamentally rethinking assumptions (like operating frequency) is probably critical to a breakthrough in inverter design.
Works fine except for the output, which by spec has to be 50 or 60 Hz. I believe even existing designs use a boost switcher to convert the DC input to ~200 V DC before shaping it into something that looks nominally like a 110 V sine wave.
So if you break the problem in two, (input to source DC) and (source DC to 110 V AC sine wave), then I completely agree that advances in SMPS components and techniques can really help the first part, but I think we're still searching for a low-loss power amplifier for the second part.
The Dart and nearly all other wall-plug AC to DC converters use a rectify-invert-rectify method. They first rectify the 50/60Hz mains AC to DC, then convert it back to AC at much higher frequencies in the kHz - MHz range, or in the case of the Dart, hundreds of MHz, and finally rectify it again to produce the final DC output. For a given amount of power output, at higher frequency less energy has to be stored in the circuit per cycle (i.e. stored in reactive elements like capacitors), which enables more compact components to be used.
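Back-of-the-envelope version of that scaling, under the simplifying assumption that the converter buffers roughly one cycle's worth of energy in a capacitor (all numbers illustrative):

    # E/cycle = P / f, and the capacitance needed to hold that energy at
    # voltage V is C = 2E / V^2.
    P = 100.0  # W of throughput -- illustrative
    V = 170.0  # V across the storage capacitor -- illustrative

    for f in (60.0, 100e3, 30e6):
        e_cycle = P / f                 # J stored/released each cycle
        c_needed = 2 * e_cycle / V**2   # F
        print("f = %10.0f Hz: E/cycle = %.1e J, C ~ %.3g uF"
              % (f, e_cycle, c_needed * 1e6))

Going from 60 Hz to 100 kHz shrinks the required capacitance by over three orders of magnitude, which is exactly why the rectify-invert-rectify approach pays off.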
However, there is also a downside to a higher switching frequency. In the inverter, transistors are used to generate the high-frequency AC voltage.
A transistor generates no loss if it is in the full on or full off state (saturated). But every time a transistor switches from the on to off state (or back) it goes through its linear region. While in the linear region the transistor acts as a resistor and generates heat. If you increase the switching frequency the transistor switches more often and thus generates more heat.
The solution to this problem is to use more efficient transistors or decrease the switching time (the time it takes to switch from high to low, or back).
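The usual first-order estimate of that transition loss, with illustrative numbers (two transitions per switching cycle):

    # During each transition the device briefly sees both voltage and
    # current at once, dissipating roughly E ~ 0.5 * V * I * t_transition.
    V_BUS = 400.0     # V blocked when off -- illustrative
    I_LOAD = 10.0     # A carried when on -- illustrative
    T_SWITCH = 50e-9  # s per on<->off transition -- illustrative

    for f_sw in (20e3, 200e3, 2e6):
        p_sw = 0.5 * V_BUS * I_LOAD * T_SWITCH * 2 * f_sw
        print("f_sw = %9.0f Hz -> switching loss ~ %6.1f W" % (f_sw, p_sw))

The loss scales linearly with both the switching frequency and the transition time, which matches the advice above: better transistors, or shorter switching times.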
Of course higher switching frequencies also have lots of other problems such as radiation, skin effect, etc.
> A transistor generates no loss if it is in the full on or full off state (saturated)
Not entirely true. They make a lot LESS loss when fully on than when linear, but there's still some loss.
Even with a highly efficient transistor you can still get losses while in the linear region if your gate drive circuit can't push enough current. When designing a switching power supply you don't just hook the microcontroller output to the gate of the transistor. To do it right you might need one or two or three intermediate stages of power amplification so that you can switch the main transistor's gate very quickly.
How does switching current behave in the limit of power transistors? I know that it's significant in high-performance ASICs, but I could certainly imagine the junction capacitance of a power MOSFET being small enough that kHz or even low-MHz switching would cost a minuscule fraction of the power being switched. I mean, I've been able to switch the floating gates of sizable MOSFETs by waving my hand at them; I'd be surprised if the power usage at tens to hundreds of kHz was more than a few mW.
There are at least three different ways switching a transistor consumes energy; the power for charging the gate is only one of them, and it tends not to be the limiting factor in small power bricks. The other two are current spikes in the channel during the switch in logic circuits (you normally cannot let the output float, which means that during the switch both transistors conduct at once, though that's not a problem when driving an inductor/transformer), and energy lost to the voltage drop while the transistor moves between the on and off states (that's where most of the heat tends to come from, apart from the heat from the rectifying/free-wheeling diodes).
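For the gate-charge term specifically, the usual estimate is P_gate = Q_g * V_gs * f_sw, since the whole gate charge is pushed in and then dumped every cycle. With datasheet-order numbers (illustrative) it is indeed tiny at kHz rates and only starts to bite at VHF:

    Q_G = 50e-9  # C, total gate charge -- typical mid-size power MOSFET
    V_GS = 10.0  # V, gate drive swing -- illustrative

    for f_sw in (100e3, 1e6, 300e6):
        p_gate = Q_G * V_GS * f_sw
        print("f_sw = %11.0f Hz -> gate-drive power ~ %.3g W" % (f_sw, p_gate))

So roughly 50 mW at 100 kHz, consistent with the parent's intuition, but on the order of 150 W at Dart-style hundreds of MHz unless the gate energy is somehow recycled (e.g. with a resonant gate drive).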
It's not AC because the flow of current doesn't reverse direction; it's pulsed on/off. The last stage isn't rectification, it's an LC circuit to smooth out the resulting voltage. The reason for higher-frequency switching is that it allows smaller-value capacitors and inductors.
If we're talking off-line power supplies, it most definitely _is_ AC, for the simple reason that non-ideal transformers will pass no DC.
And, further, inductors and capacitors (LC) are all merely losses to DC signals. These components only have a meaningful function for AC waveforms (even though, in the case of a buck converter, a DC signal is superimposed).
Why don't companies do this more often? Write down a list of stuff they need or would like for their projects but aren't yet actively devoting resources to, and then offer prizes to anyone who comes up with the goods. I know that crowd-sourced design contests have been common (and controversial), but I don't hear about it so much for R&D.
But the cool thing is that the companies will attract all the non-standard R&D people: smart kids, people between jobs, university students, etc. Smart people with extra time on their hands who would love a tangible challenge to work on with real deadlines. Of course, it's also not really reasonable to devote so much time to something they aren't guaranteed to win. Maybe there could be something in between? You show the company an early prototype or some sort of promising direction, and then they pay you a stipend to continue working on it. The prize money is then distributed among the most promising results, instead of just one.
The same companies that treat 100m-1b sized acquisitions as a cheap alternative to R&D?
They are showing in practice that they are willing to pay huge amounts of money for "outsourced" R&D; sure, a prototype is worth less than an implemented product, but $1m or more seems to be a reasonable and affordable ballpark for such offers from the "buyer" perspective.
The remaining question is whether ~$1m is an appropriate amount for the inventor, given the effort required; but that seems very subjective and depends on each individual inventor and their location. In many countries you could fund a whole research laboratory at a good university for $1m.
Note that it's the norm in the film industry. For a new movie project they'll procure the elements and people they need, make the movie, and then disband.
In The Nature of the Firm [1] Ronald Coase wondered why companies didn't contract out instead of having employees and why indeed firms exist in the first place instead of using the movie industry system.
His answer was that there are transaction costs associated with using the market. There are also other costs, such as search/information costs, bargaining, trade secrets, and so on. The cost of going to the market makes it less attractive. He got the Nobel Prize for that.
The internet lowers that cost and makes this possible where it was impossible before. It's definitely a new opportunity that companies don't exploit.
I can see this kind of thing becoming much more common: there is an obvious need for X, and we're willing to offer Y. The internet allows the hobby tech guy to join the fray and change the world. Lofty goals.
As a side note, this kind of thing very much reminds me of 'nwavguy', an anonymous audiophile who created a cheap and stunning piece of hardware. Sometimes a single guy with the right motivation and knowledge can innovate.
"NwAvGuy boasted that his minimalist amplifier—which can be purchased for as little as $129 — “proves you don’t need exotic parts or esoteric circuit designs for best-in-class sound, accuracy, and performance.”"
To the inventor? Perhaps, but there is a great deal of risk and additional work involved in going it alone. Most inventors take the easy route rather than trying to market and sell their inventions themselves: there's the risk that the idea won't actually translate into a workable invention, business risk, and so on. With an easy payday in sight there is a greater incentive for people with domain expertise to spend their spare time working on such things. In contrast, working on their own they'd have to contend with the risk that the invention won't have as much of a market as they thought, or that they wouldn't be able to monetize it very effectively. With a prize, a lot of that risk becomes less important and shifts onto Google's plate instead of the inventor's.
Also, Google isn't necessarily depriving the inventor of patent/licensing rights; the $1M prize is likely just the beginning of the monetary return on the invention.
It reminds me of that show Silicon Valley where the Hooli (Google) guy offers him $10m for his algorithm, but he decides to go it alone for the chance of big bucks.
This was mentioned in the White House's FACT SHEET [1] on solar initiatives, fwiw.
Certainly the inverter is an expensive hunk of metal, but soft costs like installation and permitting dominate the average residential PV installation. And if it were easy to make a more efficient inverter, wouldn't Xantrex have made one by now?
Kinda confused. Oh well. Guess we'll find out more later.
EDIT: Maybe they're talking about microinverters. I'd still like to see the spreadsheet describing the economics though.
In order to achieve parity with coal, utility solar installations (let alone residential) need a lot of cost reductions across the board, and will likely need significant infrastructure upgrades (like Germany's) [1]. Better inverters (cheaper, smaller, longer lifetime) would help keep driving costs down on large-scale PV installations, while also giving us a lot more breathing room when upgrading our energy infrastructure to deal with these drastically different power sources.
Soft costs dominate residential installations, but not utility or commercial installations, and it's only a matter of time until they fall away significantly (especially in the US, where the market doesn't seem to have shaken them out yet). Ultimately, you want solar to be significantly cheaper than the alternatives, and in order to do that, you have to drive down all aspects of the cost.
I agree, though, that it seems odd for all the existing inverter manufacturers not to put the money in themselves, if there is an opportunity.
Ahh, thanks, that makes so much more sense! It seems that 5 kW inverters have efficiencies around 97%, which is pretty good. I wonder if they could print the micro-inverters as part of the panel manufacture and then just add the caps and inductors they need during packaging.
This might also be a good opportunity to add features like distributed power generation, so that for remote locations a power grid could be set up with residential panels but no central power station.
But given that Google is sponsoring this, what kind of application could it have? Or better yet, if we get one that is super small and meets the qualifications, what kind of potential does it have?
If Google wants to "own the grid", and we know they do, then power will need to be efficiently converted from DC to AC.
If you do this at the outlet, then LEDs and other DC-favoring electronics could be powered by DC wiring, and AC would only be used when necessary.
Think of all the devices you have with a power brick to take AC to DC. All of those are losing energy along the way (most power supplies are less than 75% efficient).
Power does not need to be stored as DC. Flywheels store rotational energy, and starting with a spinning flywheel it's pretty simple to get AC out of it.
True, but your electrical output would be variable-frequency AC at the drive terminal. This isn't particularly useful unless you use a back-to-back converter (i.e. DC link) to get constant-frequency AC.
I'm no mechanical engineer but at first glance the conversion efficiency of using a CVT with a synchronous AC drive (vs. direct coupling the motor rotor and the flywheel rotor and using a variable-frequency induction motor + back-to-back converter) would seem to be lower.
Modern VFDs have very high part-load efficiency and we can easily maintain a constant power characteristic through the full speed range by operating in the field weakening mode. One way or another you're losing energy in frequency conversion, it's just a question of whether you do that mechanically (with CVT) or electrically (with back-to-back converters).
Also keep in mind that for grid storage devices, we're usually talking about 500+ kW on each flywheel which, at low speed, is A LOT of torque.
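For a sense of scale (P = tau * omega; numbers illustrative):

    import math

    # Constant power drawn from a flywheel means the torque rises as it
    # slows down: tau = P / omega.
    P = 500e3  # W -- illustrative grid-storage flywheel rating

    for rpm in (16000, 4000, 500):
        omega = rpm * 2.0 * math.pi / 60.0  # rad/s
        print("%6d rpm -> torque ~ %8.0f N*m" % (rpm, P / omega))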
That would not be efficient. Yes, you could also store the energy by pumping water uphill and letting it turn a turbine later, but you'd lose a LOT of power.
You could store the energy in springs, a la the da Vinci cart, but again, not efficient.
All of these methods require a conversion, whereas a battery, capacitor, or Leyden jar will store DC as DC.
Converting electric power to and from a battery requires an electrochemical reaction to occur. That is a conversion just the same - I don't really buy this "no conversion required" argument, just because you started and ended with direct current.
The efficiency gained by stepping AC up to a higher voltage (and correspondingly lower current) is far greater than what is lost to skin effect. Since there is no DC equivalent of the transformer's simple, efficient voltage step-up (DC-DC conversion at scale requires comparatively lossy power electronics), AC transmission ends up being more efficient, by a long shot.
There are scenarios where HVDC can be more suitable than AC, in particular very long distance links (e.g. to a remote hydroelectric station). All of the substation equipment is much more expensive in HVDC, but the benefit of eliminating line reactance can outweigh this if you're going 1000+ km without any interconnections.
Also, interconnection between neighbouring AC grids is an important HVDC application since we don't have to worry about transient stability.
Neither AC nor DC is superior. Different technologies for different applications.
Can you explain like I'm a physics major why HVDC becomes more efficient than AC over ~1000km? A scale analysis would be great - where does the distance D come into the equation?
I get why AC is better than DC at ~100 km scales, but I don't understand how it changes again at larger scales.
Both systems have resistive losses proportional to the square of the current. However:
1. Insulation requirements are set by the peak voltage. A DC line delivers power at its full working voltage continuously, whereas an AC system delivers power in proportion to its RMS voltage (roughly 0.7 of the peak for a sine wave), so for the same conductor and insulation, more energy is transmitted at the same current level with HVDC.
2. AC systems manifest impedance, which has a resistive (aka DC) component as above as well as a reactive (aka AC) component, i.e. Z = R + jX. In DC systems X = 0. In a theoretical transmission line no energy is absorbed or supplied by line reactance, but in practice we have to transmit a certain amount of reactive power (VARs) to charge the line capacitance/inductance each AC cycle. This eats into the current capacity (limited by thermal constraints) that could otherwise carry current deliverable to the load as active power (watts).
This effect is roughly, although not directly, proportional to distance (the characteristic impedance has no dependence on line length, but voltage drops along the line due to resistive effects, meaning the deviation from the optimal, reactive-power-minimizing voltage level increases).
The effect of (1) and (2) is that for any given conductor, at a given voltage level, more usable energy can be transmitted with DC than with AC, and that differential increases with distance.
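A toy comparison of point (1) alone, holding the peak voltage (insulation limit) and current (thermal limit) fixed, and ignoring the reactive effects from point (2); numbers illustrative:

    import math

    V_PEAK = 500e3  # V, peak line voltage -- illustrative
    I_MAX = 2000.0  # A, thermal current limit -- illustrative

    p_dc = V_PEAK * I_MAX                   # DC runs at the peak continuously
    p_ac = (V_PEAK / math.sqrt(2)) * I_MAX  # AC delivers at RMS = peak / sqrt(2)

    print("DC: %.0f MW, AC: %.0f MW (%.1f%% of DC, before any reactive-power derating)"
          % (p_dc / 1e6, p_ac / 1e6, 100.0 * p_ac / p_dc))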
That being said, building DC converter and switching stations is much more expensive than AC. So for a shorter line, or one that has many switching stations, I could counter the above by simply generating 5-8% more power at the generating station and still come out ahead (because in real engineering everything is about $).
Therefore, DC is only more cost-effective ($/MVA of energy delivered) at long distances.
So reactive power is the controlling factor, then. With AC, we have to supply more reactive power to shove more real power along the line, but reactive power doesn't transmit very well.
Follow-on: in a national grid, could we just distribute the production of reactive power with capacitor banks in each town/neighbourhood? Heavy flywheels spinning at 50 Hz?
> building DC converter and switching stations is much more expensive than AC
Is this intrinsic to the technology or is it more because we have economies of scale from building infrastructure around AC for 100 years?
'Depending on voltage level and construction details, HVDC transmission losses are quoted as about 3.5% per 1,000 km, which is less than typical losses in an AC transmission system.[16]'
At line frequency, skin effect is not significant for almost any practical conductor (i.e. one reasonably conductive and with a reasonable cross-section). What does get significant for long-distance transmission is that a few-thousand-kilometer power line starts behaving like, well, a transmission line, even at frequencies as low as 50/60 Hz. Also, HVDC elegantly sidesteps the problem of grid synchronization.
Oh, that is completely not true for commercial AC power transmission -- just look up at a big power pylon and see the multiple power conductors per phase.
I assumed that the skin depth at 50/60 Hz is on the order of 20 mm, and that a 40 mm diameter conductor is not entirely practical to install on pylons (or generally in lengths longer than a few meters); obviously I was slightly off in that estimate.
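For reference, the standard formula is delta = sqrt(rho / (pi * f * mu)); plugging in handbook resistivities gives values closer to 10 mm than 20 mm:

    import math

    MU_0 = 4 * math.pi * 1e-7  # H/m; non-magnetic conductor assumed

    # Handbook room-temperature resistivities, ohm*m
    RHO = {"copper": 1.68e-8, "aluminium": 2.82e-8}

    for metal, rho in RHO.items():
        for f in (50.0, 60.0):
            delta = math.sqrt(rho / (math.pi * f * MU_0))
            print("%9s @ %2.0f Hz: skin depth ~ %4.1f mm" % (metal, f, delta * 1e3))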
It's slightly pathetic that when viewing the website with Google's own browser, the font in the "SOMETIMES THE BIGGEST..." block is rendered incorrectly with stray pixels on top of the S's, O's, and G's.
I'm assuming it's not intentional since zooming in causes the artifacts to disappear, but they come back when zoomed back out to 100%.
Chrome doesn't render Google web fonts well; it's been that way for as long as I can remember. It's a known issue, and it's been reported countless times.
You would expect this to be high priority, since reading text is the most important aspect of the internet. For whatever reason, Google doesn't seem to care, and they have no problem using these poorly rendered fonts all over the place.
This is slightly incorrect, as the problem only affects Google Chrome on Windows. Also, the Chrome team is actively working on it, as seen in its issue report [1]. The progress can be seen by launching Chrome Canary with the --enable-direct-write flag. chromestatus.com indicates that it's scheduled to be released in M36.
Chrome for Windows is one of the most common setups for browsing the internet, so this isn't an edge-case situation. That report originated in 2009, over 4.5 years ago.
It's good to hear it might be finally addressed, but I wouldn't say they've been actively working on it, when all the other popular browsers had this sorted years ago.
The problem with building the world's best power inverter in your garage is: Who do you sell it to? It's only worth $250M if you can connect with the buyer who is prepared to spend that much. The real value with something like a $1M prize is that it will lead to interest from people who are prepared to pay $250M. $1M for non-exclusive rights may be cheap for Google, but it is probably still good value for the inventor if it leads to bigger things.
I would assume if you actually did build it in the garage and it worked, the next step would be to get a patent and after that go talk to a venture capitalist.
Strangely enough, probably not much different to the way Google itself came to being.
If such incentives are already in place, Google risks nothing by drawing attention to them. Either way potential inventors go, Google may make this happen sooner.
It's a great recruiting tool. Plus there are the possibilities of how it will scale for projects like robotics/fiber/wifi, etc. Who wouldn't want an inside track like that?
It's probably a lot cheaper to crowd-source the solution. You'll get a lot more effort for the money. Crowd-sourcing a problem for a prize is a well-proven method that's centuries old. Here are some famous examples:
The one improvement would be to crowd-source the challenges and the money. Think Kickstarter, but participants vote for the projects and donate the prize money.
It's hard to tell if one is even in the intended audience. A smaller inverter? What are the specs on the current ones? How much improvement is necessary? How long does it need to last? Do I need access to a semiconductor fab?
I can sort of infer what they're asking for, since they're not making a deal with LTC or Maxim or Lambda or whomever, but it's a bit annoying to have to stand by while Google starts the hype train.