New ARM-powered chip aims for battery life measured in decades (arstechnica.com)
187 points by xbmcuser on March 31, 2015 | 88 comments



  The chip is so low power that it can be powered off energy
  capture from the body, as Andreas Eieland, Atmel's Director
  of Product Marketing for low-power products, demonstrated at
  CES earlier this year.
The human body generates more bio-electricity than a 120v battery, and over 25,000 BTUs of body heat. Combined with the Atmel SAM L21 32-bit ARM MCU, the machines had found... all the energy they would ever need.


British BTU or American BTU? Please use SI standard measures for those who don't happen to live in America :-)


He is quoting The Matrix:

    The human body generates more bioelectricity than a 120-volt battery and over
    25,000 BTUs of body heat. Combined with a form of fusion, the machines had
    found all the energy they would ever need.


Which of course always bugged me. Not just because of what I know of biology, but the second sentence.

"Combined with a form of fusion". The thing we understand to eventually provide abundant clean unlimited electricity (okay, relatively speaking).

It's a bit like saying "The human trigger finger can exert 300 pounds per square inch of force, and perform over 500,000 foot-pounds of work. Combined with a standard-issue pistol-grip chainsaw, the Zombie-Hunting Academy had found all the weapon it would ever need."


In the original script the machines used fusion for power and human brains for extra processing power but that was cut because people thought audiences would find it too confusing.


This potentially makes way more sense... except that the machines by themselves are supposed to be super-smart. Well, fiction isn't perfect. But it's always better when it aims for more plausibility.


It's like having a neural net coprocessor.


From a physics point of view it is complete bullshit.


The original script had machines using human brains for their computing power but the execs thought that would be too complicated for people to understand.


With that little tweak the movie could be almost the same, except it would make actual sense.


How would that script have ended? With humans required for computing power, there's no way the machines would give up their existence by releasing the humans.

At least I can imagine them using alternate power sources in collaboration with the humans. Cleaning the sky, for example.


Humans could volunteer their brain power for periods of time, and both humans and the machines could focus on rebuilding after the war and manufacturing enough processing power for the machines to live alongside mankind.


They could still cooperate. The movies even had a human character who, having been given freedom, chose the Matrix.

(there could also be reasons for trade; maybe the humans spend time in the Matrix in exchange for energy or whatever)


TIL. That one concept, if carried through the movies, changes a lot.


I'm trying to figure out if you're joking or not. BTU = British Thermal Unit and is pretty much the standard measure used around the world when talking about energy use associated with heating. It's the amount of energy needed to heat one pound of water by one degree Fahrenheit, regardless of where you find yourself in the world...
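
For those who do want SI, the conversion is fixed no matter which side of the Atlantic you're on: 1 BTU ≈ 1055 J (ISO definition). A quick sketch applying it to the Matrix figure quoted upthread (just the unit conversion, nothing from the article):

    #include <stdio.h>

    /* 1 BTU = energy to heat 1 lb of water by 1 degF.
       In SI: 1 BTU ~= 1055.06 J; 1 kWh = 3.6e6 J. */
    #define JOULES_PER_BTU 1055.06
    #define JOULES_PER_KWH 3.6e6

    int main(void) {
        double btu = 25000.0;  /* the Matrix's body-heat figure */
        double joules = btu * JOULES_PER_BTU;
        printf("%.0f BTU = %.2e J = %.2f kWh\n",
               btu, joules, joules / JOULES_PER_KWH);
        return 0;  /* prints: 25000 BTU = 2.64e+07 J = 7.33 kWh */
    }

So the movie's "over 25,000 BTUs" is a bit over 7 kWh, roughly what a resting human actually radiates over about three days.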


While this is only one data point, in Germany BTU is very uncommon. Energy used for heating is expressed in kWh, same as with electricity.


BTUs are used a lot in Canada, but Canada only made about 9 years of progress toward metrication between 1976 and 1985. I was still in high school when our Metric Commission was abolished, along with most legal requirements to use metric units, even though the units themselves remained official.

Maybe it's generational (I'm GenX), and I'm not sure if all Canadians muddle through both systems as much as I do, but I seem to live in a constant superposition between Metric and Imperial, with little sense of which I'll use for any particular purpose.

I use Celsius for cooking and low temperatures, but Fahrenheit for body temperature and most room temperatures; kilometres for long distances but inches and feet for height and most short distances; pounds for body weight, but never ounces (for that I use grams); for cooking and large weights I seem to mostly use metric units. Volumes are almost exclusively metric, I suspect because we also went through a transition when gallons shrank from Imperial (4.54609 litres) to US gallons (3.785 litres).

I envy European consistency when it comes to metric.


Imperial is mostly for measurements related to the body (temperature, height/length, weight), presumably due to the number of people who grew up with it and its ubiquity in American media. Pretty much every other aspect of those measurements not related to the body is done in Metric (it's -10°C out, my yogurt is 750 g, and it's about 5 km to the store). Most room temperatures are in Celsius, or have dual options, and those that aren't are typically because we heavily use American-made thermostats due to our trade connections, or it's an older model. Almost everything else uses Metric except for small exceptions like construction, which is again due to long/common use and trading with America. Everything with scientists/doctors is Metric.


As an additional data point, BTU isn't used in Britain very much these days. As with Germany, the kilowatt-hour is the preferred unit of (heating) energy.


Um, I should have elaborated in my original post - it most definitely is still heavily used in the UK by, for example, architects when evaluating how much heating a house will need. My original post wasn't terribly clear about that, my bad... Yes, consumer-facing information these days is indeed more likely to be expressed in kilowatt-hours...


I built a BTU-measuring device for a British customer, and he explained they had been using the metric system since the '70s.


"It's the amount of energy needed to heat one pound of water by one degree Fahrenheit"

Now, suppose there is no scale at hand to measure a pound, nor a thermometer to measure degrees Fahrenheit. Outside of Commonwealth territories and the US, we often use SI :)


OK, I'm going to get grumpy here, sorry about that. Your comment is a perfect demonstration of middlebrow criticism, and adds exactly zero to the discussion.

Firstly, I'm Australian. We use SI measurements - your comment is not just smug, but wrong.

Secondly, are you honestly postulating that people understand the underlying reality of what a kilowatt hour actually means? That they have a "gauge" that can measure kilowatts? Of course they don't, and for that matter most people that use BTUs don't really understand what they represent either. They just know that it's a measurement of energy-use. All they really want is something that allows them to compare two options, and frankly, the units matter not at all.


That quote is so totally non-scientific. For movies to get basic physics so terribly wrong is very sloppy.


Sometimes I feel like I'm the only person who isn't terribly bothered by sloppy science in movies. I prefer movies that are entertaining or thought-provoking, not factual.

Besides, the Matrix is about how our world is a simulation - who's to say how physics works in their "real" world?


If you want something to be 'thought provoking' then make it as close to reality as you can; that's good SF. The only thought this provokes in me is 'ouch'.


MORPHEUS: For the longest time, I wouldn't believe it. But then I saw the fields with my own eyes, watched them liquefy the dead so they could be fed intravenously to the living -

NEO (politely): Excuse me, please.

MORPHEUS: Yes, Neo?

NEO: I've kept quiet for as long as I could, but I feel a certain need to speak up at this point. The human body is the most inefficient source of energy you could possibly imagine. The efficiency of a power plant at converting thermal energy into electricity decreases as you run the turbines at lower temperatures. If you had any sort of food humans could eat, it would be more efficient to burn it in a furnace than feed it to humans. And now you're telling me that their food is the bodies of the dead, fed to the living? Haven't you ever heard of the laws of thermodynamics?

MORPHEUS: Where did you hear about the laws of thermodynamics, Neo?

NEO: Anyone who's made it past one science class in high school ought to know about the laws of thermodynamics!

MORPHEUS: Where did you go to high school, Neo?

(Pause.)

NEO: ...in the Matrix.

MORPHEUS: The machines tell elegant lies.

(Pause.)

NEO (in a small voice): Could I please have a real physics textbook?

MORPHEUS: There is no such thing, Neo. The universe doesn't run on math.

(http://hpmor.com/chapter/64)
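
Neo's point about turbine temperatures checks out numerically, too. A rough sketch of the Carnot ceiling on harvesting body heat (the temperatures and the ~100 W resting output are assumed textbook figures, not from the quote):

    #include <stdio.h>

    int main(void) {
        /* Carnot limit: eta = 1 - T_cold / T_hot, absolute temperatures.
           Assumed: body ~310 K, ambient ~293 K. */
        double t_hot = 310.0, t_cold = 293.0;
        double eta = 1.0 - t_cold / t_hot;          /* ~5.5% */

        double body_heat_watts = 100.0;             /* resting human output */
        printf("Carnot limit %.1f%% -> at most ~%.1f W of electricity\n",
               eta * 100.0, eta * body_heat_watts);
        return 0;
    }

A few watts per human, before any real-world losses, which is why the power-plant premise doesn't survive a first-year physics pass.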


120v is not a measure of power; it is a measure of electric potential.


Early drafts had the machines using human brains as CPUs, but the idea was killed by suits who thought it'd be too hard to understand. It would have made the whole premise of the movie make much more sense.


Dan Simmons' 'Hyperion' trilogy used this idea (before the Matrix), and I seem to recall it being one of the less silly things in there. A book can afford to be less mass-market-friendly than a movie, obviously, but I think it did pretty well.


It was 4 books, not 3: _Hyperion_, _Fall of Hyperion_, _Endymion_, and _Rise of Endymion_.

The Technocore AIs of the book series were considerably more clever than the Matrix AIs, in that they were able to arrange their affairs such that the humans did not even realize the AI were using human brains as their processors. There was actually a human military objective to locate the Technocore's processing hardware, which I'm sure the AIs found amusing. Kwatz!


So, I'm curious how they convert the body heat to electricity in such a small area.

If it were something like a Stirling cycle, you'd need a fairly substantial surface area to dissipate the heat difference.


Peltier? And I suppose they would. You could still make a piece of clothing with heat-sinks sewn in, but I have no idea how expensive that'd be.
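
For a sense of scale, here's a matched-load back-of-the-envelope for a small skin-mounted thermoelectric generator; every figure below is an assumed typical value for a small Bi2Te3 module, not anything from the article:

    #include <stdio.h>

    int main(void) {
        /* Matched-load TEG output: P = (S * dT)^2 / (4 * R).
           Assumed: S = 25 mV/K module Seebeck coefficient,
           R = 2 ohm internal resistance, dT = 2 K actually
           sustained across the module on skin. */
        double S = 0.025, R = 2.0, dT = 2.0;
        double p = (S * dT) * (S * dT) / (4.0 * R);
        printf("~%.0f uW at matched load\n", p * 1e6);  /* ~312 uW */
        return 0;
    }

Hundreds of microwatts is nothing for a phone, but it's plenty to duty-cycle an MCU that sleeps at a couple hundred nanoamps.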


I think they are referring to kinetic energy from body movement, although tapping into heat loss and other wasted energy sources would be awesome too.


Valid reference. Well, I think it is, might not fulfil Wikipedia guidelines though.


It all started with this:

https://www.youtube.com/watch?v=nbNnNF9JHFQ

http://www.theregister.co.uk/2012/05/03/unsung_heroes_of_tec...

ARM's low power consumption was an accident caused by financial constraints. A small team with no money (Acorn was starting to go downhill at the time) forced Sophie Wilson and Steve Furber to be clever with resources.

In 1987 the hand-routed ARM2, built with 30K transistors, ran at 8 MHz, reaching 4 MIPS (of 32-bit operations). Coincidentally, Intel's 386, built by an army of a hundred engineers supplemented by state-of-the-art emulation/validation/automation software (Mossim etc.&) and divided into large groups working on individual parts of the design, achieved... 4 MIPS at 33 MHz using 275K transistors (each 2x smaller) and >20x the power.

&http://webee.technion.ac.il/people/kolodny/ftp/IntelCADPaper...

&software born from https://en.wikipedia.org/wiki/Mead_%26_Conway_revolution and following Darpa programs


I don't think that is a fair comparison. The 386 was a true CISC design while the ARM2 was RISC. One of the major points of RISC was to reduce silicon complexity by using a less complex design (and shoving the complexity onto software).

Intel could have built an equal or better chip than the ARM2 but they were worrying about servicing the already established market for x86.

P.S. Is it even reasonable to compare MIPS between RISC and CISC designs? A RISC chip has to execute several instructions to do what can be done in one CISC instruction.


Intel did in fact build a RISC chip contemporary with the 386/486: http://en.wikipedia.org/wiki/Intel_i860

It had great instruction throughput on paper but fell short in real applications, thanks to the sufficiently smart compilers never coming to be.


The i860's theoretical throughput was achievable only in very special cases in tight floating point kernels. Most code couldn't possibly perform as well, no matter how smart the compiler.

Intel's sane RISC was the http://en.wikipedia.org/wiki/Intel_i960, somewhat by accident.


And of course, by the time we got to the Pentium Pro - and more or less everything ever since - we ended up with a hybrid pipelined superscalar design that takes advantage of the legacy of x86 software support (and, compared to VLIW, tight instruction encoding), but, as part of the decode pipeline, translates that to RISC µops inside with microcode.

The Transmeta Crusoe was a particularly notable (if not particularly successful) case in point, which brought that layer a little more visibility than most, although in all honesty it was probably mostly well-known for having been Linus Torvalds' employer.

There never was a particularly bright line between RISC and CISC, and it's only gotten blurrier with the decades as the two paradigms stole good ideas from each other.

That's not to say there isn't the occasional throwback, sometimes for a good reason. I've got an adorable little slug of a 'transparent' microprocessor on my desk which I hope sees the light of day sometime (when it actually works, because I've bricked it - first time designer == way too many errata! :P) because it's got some fun ideas for trust, like the host being able to directly read (and verify) all the software it's running.


i960 was actually a very nice architecture. It had features like proper memory protection, hardware GC support and others that we would absolutely love to have in this day and age.

It was a successor to the Intel iAPX 432 architecture. It also had shades of Burroughs B5000 in it (only RISC). I would love to see a successor to this. Heck I'd like to just have a development board to play with.


Interestingly, there is a thread on RWT that talked about this, with Linus Torvalds involved. It seems that it wasn't until the 486DX2 in 1992 that x86 began to beat RISC, and even the 68040, on the lower end. And remember that 33 MHz 386s did not even arrive until 1989 or so.


This is very true. At that time, even 68K-based workstations ran circles around any x86-based design. x86 was, however, cheap, high-volume and with a lot of compatible hardware, so there was sufficient interest to pour resources into it.


The other reason it isn't a fair comparison is Intel didn't get to start with a blank sheet of paper. The 386 had to run code written on the 8086, before anyone expected home computers to support protected mode kernels. I'm sure the x86 silicon Intel produces today would be quite a bit more efficient if it didn't have to support applications written 30+ years ago.


RISC with multiply and add instructions :)


Amusingly, Intel have been marketing a 486 as their Internet of Things solution, in the form of the Quark CPU used on the first Intel Galileo board. It's been augmented to support the Pentium instruction set and ported to a newer process node and bus, but architecturally it's a 486. It's about as competitive as you'd expect a 486 to be against modern ARM designs - not very.


I think that this just killed one of the trump cards of PIC microcontrollers.

Seriously, do Microchip uCs have anything interesting left, apart from a lot of embedded I/O hardware inside? ARM-based uCs are now cheap, low-power, and have far better IDEs and compilers...


The strongest selling point for PIC has always been the quality of the I/O hardware and the attention to the low-volume market. The CPU core is just something that you need to deal with. Atmel is always great at presenting fabulous MCUs that sometimes take forever to reach the low-volume markets.

Lead time is always an important parameter to consider when choosing a new MCU for a low-volume project :-)

I used to start new projects by looking at the Digikey website more than anything.


>> Atmel is always great at presenting fabulous MCUs that sometimes take forever to reach the low-volume markets.

But in the low-volume markets what usually wins is better software support, and with its mbed OS (and the other software frameworks available for ARM MCUs), ARM is sure to win here.


Well...

PIC is weird. But it does the job and as the other comment said, they had a lot of models with a lot of devices (ADC, DAC, USB, fully configurable ports, etc).

Also cheap and available in "hobbyist" form factors (DIP)

Yes, you would either go with Assembly (which is harder to use than on "traditional" uCs like the 8051 family, Z80, etc.), C compilers (some limited), or there was even PIC BASIC, heh

ARM? Write in C, "fire and forget" unless you are doing something more complicated (like audio synth - yes, I did this)


I quite liked PIC stuff for the hardware USB support, which I used with a PIC32 DIP chip. They also have a gcc port for their architectures now (xc8, xc16, xc32), which is quite nice. Although apparently the 'free' version of it annoyingly doesn't provide all optimizations.

I'm starting to use ARM dev boards now, because of things like the MHz and the nice compilers available.


PIC32 is MIPS. The older, smaller PICs are quite different -- also from each other.


I actually refuse to use PIC32 on principle because they took the MIPS version of gcc, added a license manager to it that disabled -O2 and higher unless you paid them a substantial sum of money for a license key, and then tried to make people use their crippled gcc by forbidding people from using the official PIC32 peripheral libraries with any other version of the compiler. It was a really sleazy end-run around the intent of the GPL.


Yeah, it's hard to justify when PIC has bytes of RAM and ARM has kB of RAM...


Ah! But this chip only achieves sub-uA current in sleep mode. So it will have the claimed decades of battery life only if it sleeps most of the time, which might be true for some kinds of IoT applications, but not if it's actually running often.
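
The duty-cycle arithmetic is easy to sanity-check. A sketch with assumed round numbers in the right ballpark (not actual SAM L21 datasheet values):

    #include <stdio.h>

    int main(void) {
        /* Average current of a duty-cycled MCU:
           I_avg = d * I_active + (1 - d) * I_sleep.
           Assumed: 5 mA active, 200 nA sleep, awake 0.01% of the time. */
        double i_active = 5e-3, i_sleep = 200e-9, d = 1e-4;
        double i_avg = d * i_active + (1.0 - d) * i_sleep;  /* ~0.7 uA */

        double battery_ah = 0.220;                  /* CR2032-class cell */
        double years = battery_ah / i_avg / (24.0 * 365.0);
        printf("I_avg = %.2f uA -> ~%.0f years (ignoring self-discharge)\n",
               i_avg * 1e6, years);
        return 0;
    }

Wake it 100x as often and the "decades" collapse to about six months; the headline number is really a statement about how rarely the thing runs.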


Meet marketing. I will personally buy pizza for one week to any marketing department that, when announcing a new product, publishes figures that are actually useful.

Oh look, we just launched an MCU with a power consumption so low that you can keep it powered for twenty years. As long as you don't do anything with it. Of course, every other figure that would make this one useful (such as per-module current consumption, or time to sleep) is missing.

It's especially funny when everyone claims they have "the best" or "state of the art" consumption, but when you factor in everything else it turns out they're pretty much within the noise figure of the current state of affairs.

In my whole engineering career, I'm yet to see a press release or a market launch that isn't full of bullshit. Just hand me the bloody datasheet and give me a break, folks.


Virtually any IoT device will (in a manner of speaking) still be running often.


Now we only need batteries that don't self-discharge within decades.


This is likely intended to be used with batteries that can recharge from very low-energy sources, e.g. energy captured from the body. It has very small energy consumption in sleep mode, so the battery can charge between wake cycles. 200 nA is, indeed, very low.


Lithium thionyl chloride batteries can last 40 years, based on a self-discharge rate of 1% per year. They're used for things like sensors buried in concrete where it's a hassle to change the batteries.
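
Worth noting the 1%/year compounds. A quick check of what's left after 40 years (the rate is the parent's figure; the rest is arithmetic):

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Remaining capacity after n years at 1%/year self-discharge:
           remaining = 0.99^n. */
        double remaining = pow(0.99, 40.0);
        printf("after 40 years: %.0f%% of capacity left\n", remaining * 100.0);
        return 0;  /* ~67% */
    }

So self-discharge alone leaves about two-thirds of the cell; the load, not the chemistry, decides the lifetime.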



Meanwhile the latest and greatest 2015 device is an all-day wristwatch the weight of a pack of cards [1] (for the smaller Apple Watch [2]) that in Apple's own "watch test" tells time for up to 48 days before dying --- whoops, no, I meant 48 hours [3]:

>Testing conducted by Apple in March 2015 using preproduction Apple Watch and software paired with an iPhone using preproduction software with 5 time checks (4 seconds each) per hour.

This is as compared with simple mechanical wristwatches that are self-charging (through wrist movement) and can be worn indefinitely, or have 72-100 hour power reserves from winding. [4] This is mechanical power we're talking about.

So don't talk to me about decades until you can get more than 48 hours of battery life....from a watch.

1 http://en.wikipedia.org/wiki/Standard_52-card_deck

2 https://www.apple.com/watch/apple-watch-edition/18-karat-yel...

3 https://www.apple.com/watch/battery.html

4 http://forum.tz-uk.com/showthread.php?250883-List-of-watches...


The Pebble has much better battery life than the Apple Watch; it can last for a week on a 130 mAh battery. To achieve that, the Pebble uses a microcontroller with 512K of flash (plus 4 MB of flash on a separate chip) and 128K of RAM. Presumably the Apple Watch uses a higher-end application processor and DRAM.


Sure, the Apple watch is fairly likely to be a ridiculous geegaw.

It's also going to do a lot more than the mechanical watches you are comparing it to.

If you just want to tell time, there is a wide market of solar watches:

http://www.casio-intl.com/asia-mea/en/wat/standard/solar_pow...

http://www.seiko-cleanenergy.com/watches/solar/

The Seiko tech runs for 6 months with no charging (left in total darkness).


Yup, I've had a Citizen eco-drive (solar powered watch) which has been going 10+ years.

It's an interesting model - there's some form of drive mechanism for the analog hands that let them move independently of the time. I.e. if you turn it to stopwatch mode, the hands will move to 12:00 and use the minute hand for marking minutes.

I've always wondered what kind of electronics are in there - and if they're even digital.


I wonder what the actual numbers for these watches are. Energy received via the solar cell, energy consumption, battery capacity.


Some bookending is at least possible. They imply that the draw is no more than 1/60th of the input (1 minute of sun for 1 hour of running). The 3-hours-for-6-months figure would imply much lower draw; checking their documentation, it enters a low-power mode that freezes the hands (it still keeps time; the hands are advanced when power becomes available).

The input is limited to something like 10 square cm (that's a bit more than 3 cm by 3cm, or a circle ~3.5 cm across). 1000 watts / square meter is usually used as peak sun, so the max input would be ~1 watt (but actual energy captured would probably be more like 1/10 of that).

3 watt-hours is a very respectable figure for a AA battery, so it clearly isn't storing that much energy. After looking up that figure, 10% efficiency for the solar cell is probably high.
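
Putting the parent's bounds in one place (all assumed figures, just re-running the arithmetic):

    #include <stdio.h>

    int main(void) {
        /* Upper bound on the watch's solar harvest:
           assumed ~10 cm^2 cell, ~1000 W/m^2 peak sun, ~10% efficiency. */
        double area_m2 = 10e-4, peak_sun = 1000.0, eff = 0.10;
        double p_in = area_m2 * peak_sun * eff;      /* ~0.1 W in full sun */

        /* "1 minute of sun per hour of running" implies draw <= input/60. */
        printf("input ~%.2f W, implied draw <= ~%.1f mW\n",
               p_in, p_in / 60.0 * 1e3);
        return 0;
    }

Given that a bare quartz movement draws on the order of a microwatt, the implied ~1.7 mW ceiling leaves the claim enormous headroom.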


This reminds me of the people who, at the dawn of the smartphone age, were proudly boasting of their dumbphone's week-long battery life vs. the nightly recharging required by smartphones.

Until they realized they were completely missing the point, since they were comparing a phone to _a pocket computer with a constant data connection_. Most have since stopped boasting and just silently plug in their phone every night.


Pebble lasts a week


If we're talking about low-power MCUs, here's something that might interest the hacker community:

A new API, to be released in April, that makes it easy to program MCUs for very low power (versus the very high complexity currently needed to get there). It uses mbed, which is of similar complexity to the Arduino:

http://community.arm.com/groups/internet-of-things/blog/2015...


True to Internet culture, Atmel couldn't help it and had to Rickroll people at their booth. I like it.


Would it be feasible to power this chip off of electromagnetic radiation?

http://www.technologyreview.com/news/413744/wireless-power-h...


These guys sell components that enable battery-free operation (according to their claims) using RF energy:

http://www.powercastco.com/

BTW, RF is generally among the weakest energy-harvesting sources; in many cases environmental light, vibration (for example, on a machine), and heat are much stronger sources, and most likely available at low cost.



It'll be an odd world if we ever reach the point where we throw hardware away before the battery runs out.


Still waiting for clockless chips here...


Been done - on ARM instruction sets, no less.

http://en.wikipedia.org/wiki/AMULET_microprocessor



Not trolling... is this still a going concern, shipping eval boards, etc? I saw this a couple of years ago, someone commented on HN, and then nothing. I liked the eval board specs and it seemed reasonably priced given the low volume (maybe a hard sell when STM is all but giving away kits).


I got my eval board one year ago. I'm sure if they had closed shop they'd have put a warning on their site, but you can always email them to make sure.


I see that it's 32-bit, is the year 2038 problem a significant concern for a device like this?


That is purely a software issue. Word size does not limit how big a number a computer can handle; it just determines how big a number it can handle efficiently. Even lowly 8-bit computers/microcontrollers can do arbitrary-precision/bignum math.


Depending on your definition of big.


Not if you run an operating system/code with a 64-bit time_t, like NetBSD or OpenBSD. I can't imagine you would run an OS on it, though.


Linux too - but you wouldn't run any of that with 32k of RAM.


Probably not. These would be useful in wireless sensors and other devices where having an accurate absolute time is not as important as knowing precisely how fast time is ticking away. If I were going to use something like this I probably wouldn't even bother noting the absolute time of a sample at the point of measurement. Rather, I would just periodically synchronize the network with a clock pulse and let each individual node figure out whether they're running fast or slow, and by how much, if it even matters.

If it doesn't then just note the time of arrival at whatever server is controlling the network and use that time exclusively. Then you can buy one 64-bit machine and leave it running all the time, and as an added bonus you no longer need to worry about whether each node has the correct time.
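
A minimal sketch of that drift-estimation idea, assuming a sync pulse nominally once a minute and a 32.768 kHz tick counter on the node (all names and figures here are hypothetical):

    #include <stdint.h>

    /* Called from the radio ISR each time a sync pulse arrives.
       Unsigned subtraction handles tick-counter wraparound. */
    void on_sync_pulse(uint32_t now_ticks) {
        static uint32_t last_pulse;
        static int have_last;
        const uint32_t NOMINAL = 32768u * 60u;  /* expected ticks per pulse */

        if (have_last) {
            /* >1.0 means this node's clock runs fast; apply as a
               correction factor to locally noted timestamps if needed. */
            double ratio = (double)(now_ticks - last_pulse) / (double)NOMINAL;
            (void)ratio;
        }
        last_pulse = now_ticks;
        have_last = 1;
    }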


Just because it's 32-bit doesn't mean its code can't store 64-bit ints. There are just a few more instructions needed to manipulate them. But it's not a big deal.

That's how computers are able to store, retrieve, and manipulate any variables (like strings) that might be bigger than their word size.
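
To illustrate the "few more instructions": here's a 64-bit add spelled out in 32-bit halves. A compiler emits the equivalent (e.g. ADDS/ADC on ARM) whenever you use uint64_t on a 32-bit core, so 2038-proof timestamps cost almost nothing:

    #include <stdint.h>

    /* 64-bit addition using only 32-bit operations, as a 32-bit
       MCU would perform it. */
    typedef struct { uint32_t lo, hi; } u64;

    static u64 add64(u64 a, u64 b) {
        u64 r;
        r.lo = a.lo + b.lo;
        r.hi = a.hi + b.hi + (r.lo < a.lo);  /* carry out of the low word */
        return r;
    }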


I'd rather use another degree of freedom: how small can I make my battery?



