
Hmm, so, if their metaphor really held, the brain's computation could be simulated with a 45 MHz CPU? Well, let's fix this up a bit...

(1) Give it 1000 clock cycles of CPU work to simulate a single neuro-tick.

(2) The clock is actually variable, 5-500 Hz (from the article).

So, 500 Hz × 5M = 2.5 GHz, and at 1000 cycles per tick that's 2500 GHz (2.5 THz) of CPU power. An Amazon large cluster instance has 8 Xeon cores at 2.93 GHz, so about 110 cluster instances to simulate a brain?
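A quick back-of-the-envelope of that estimate in Python (every input below is an assumption carried over from the numbers above, not a measured figure):

    # All inputs are the assumptions from the estimate above.
    units = 5e6             # the "5M" simulated elements
    clock_hz = 500          # worst case of the 5-500 Hz range
    cycles_per_tick = 1000  # assumed CPU cycles per neuro-tick

    required_hz = units * clock_hz * cycles_per_tick  # 2.5e12, i.e. 2.5 THz

    core_hz = 2.93e9        # one Xeon core
    cores = 8               # cores per cluster instance
    print(round(required_hz / (core_hz * cores)))     # ~107, call it ~110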

Bear in mind that you also need to somehow do a brain scan with enough resolution to map every single connection if you want something functional. Also, what are you going to do about the sensory inputs (sight, hearing, the rest of the nervous system)? I'm sure it can't be good for a person's sanity to suddenly be completely disconnected from the world.

And while a 1:1 map of the brain may be feasible now or within a few years, albeit very expensive (though getting ever cheaper), it's kind of like the straw-man AI researcher's answer to everything: build a neural network, run the training set over and over, and hope it computes what we want. Trying to paper over your lack of knowledge with clever ways to avoid solving the problem yourself is just asking for trouble in so many ways.

In the end, the point is that AI is a software problem. More hardware lets you get away with a cruder approach, but it's the effective solutions that matter most.


Imitating the brain may be a matter of having enough computation power to allow a simple method to work. It reminds me of NLP, where we seem to be coming full circle back to simple methods like Naive Bayes, which show better results than complex methods once there's enough data.
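For a concrete sense of how simple that baseline is, here's a minimal Naive Bayes text classifier sketch using scikit-learn (the toy corpus and labels are invented for illustration):

    # Toy Naive Bayes classifier; data is made up, purely illustrative.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    docs = ["great movie, loved it", "terrible plot, awful acting",
            "wonderful and fun", "boring and awful"]
    labels = ["pos", "neg", "pos", "neg"]

    vec = CountVectorizer()                 # bag-of-words counts
    clf = MultinomialNB().fit(vec.fit_transform(docs), labels)
    print(clf.predict(vec.transform(["loved the fun plot"])))  # ['pos']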


Sadly the metaphor doesn't hold. I heard somewhere that something like 19 petaflops is roughly the equivalent of the processing power required. I don't recall exactly from whose arse that number managed egress, but I suspect it's more on the mark. Obviously 1e15 transistors is not going to give you that.

A transistor makes a bad neuron. Neurons have complex behaviour in and of themselves, and maintain various kinds of state that persist after action potentials (i.e. after firing). Transistors don't do that individually, so the analogy is quite bogus. Not only that, but the connections between neurons (synapses) are not as trivial as a simple wire, and even engage in two-way communication. Considering each neuron may have multiple inbound and outbound connections, sometimes over (relatively) vast distances, counting the neurons alone is inaccurate. And I haven't even gotten to diffuse signals. There's a lot of overhead.
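To make "neurons maintain state" concrete, here's a minimal leaky integrate-and-fire sketch in Python; the parameter values are arbitrary illustrative choices, not biological measurements:

    # Leaky integrate-and-fire neuron. The membrane potential and the
    # refractory timer are state that persists between (and after) spikes,
    # which a lone transistor has no equivalent of.
    class LIFNeuron:
        def __init__(self):
            self.v = 0.0            # membrane potential (state)
            self.refrac_left = 0.0  # refractory time remaining (state)

        def step(self, current, dt=1e-4):  # dt = 0.1 msec
            if self.refrac_left > 0:       # still recovering from last spike
                self.refrac_left -= dt
                return False
            self.v += dt * (-self.v / 0.02 + current)  # leak + integrate
            if self.v >= 1.0:              # threshold crossed: fire
                self.v = 0.0
                self.refrac_left = 0.002   # 2 msec refractory period
                return True
            return False

    n = LIFNeuron()
    print(sum(n.step(current=120.0) for _ in range(10000)))  # spikes in 1 s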


I don't think the point is that 1 transistor = 1 neuron. There need to be x transistors to simulate, in software, the behavior of 1 neuron (I'm not sure if the calculations we're talking about take that into account, but your second paragraph is a straw man imo).


The article compared the processing power of the human brain to 1 transistor per neuron. No idea where you get the strawman from, since that's exactly what I was commenting on in that paragraph.


I don't think his 5-500Hz numbers are even remotely correct.

Just because that is the aggregate signal detectable outside the brain doesn't mean the individual neurons are anywhere near that slow.

In fact, considering I can easily react to something in under 1/5 of a second, it's provably wrong.

And I can react even faster to an expected stimulus, for example to keep my balance or play the piano. 500 Hz simply makes no sense.
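A rough sanity check on the reaction-time argument (the stage count below is an assumed figure for illustration, not from the article):

    # Back-of-the-envelope on the reaction-time argument.
    reaction_s = 0.2  # the ~1/5 s reaction time claimed above
    stages = 10       # assumed synaptic stages from stimulus to muscle
    print(1 / (reaction_s / stages))  # 50.0 -> each stage at >= 50 Hz

Even with that generous stage count, each stage has to turn around in about 20 ms, which already rules out the 5 Hz end of the quoted range.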


A typical spike is on the order of 2 msec, so the required time step should be of that order. In detailed neuronal simulations, one usually uses a time step of 0.1 msec.
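In Python, what that step size implies for simulation cost (the numbers are just the ones from this thread):

    # Updates implied by a 0.1 msec integration step.
    dt = 0.1e-3
    updates_per_sim_second = 1 / dt      # 10,000 updates per neuron per second
    print(updates_per_sim_second / 500)  # 20x more steps than a 500 Hz "clock"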


There's a bet over on longbets that in a hundred years' time (or so) somebody will have found a way to run AI on computers from around now.

I'd bet that's probably doable, but I believe the first version of AI will need more hardware than that, simply because we don't know what we are doing.


The networking costs will kill you.


The author was making a play on words with IT terms in the title. This has nothing to do with actual brain simulation.
