The PI was also involved in an earlier study [1, 2] that found "Jennifer Aniston neurons", i.e., neurons that fire when the subject is shown an image of Jennifer Aniston, but not when shown an image of another celebrity.
It's probably not that surprising that other cells are active during certain specific activities and inactive during others.
Anyway, it's fun seeing my old institute featured on hn.
I'd rule out patch clamp. That would be too difficult and slow.
It sounds like they have very sensitive electrodes that are either placed very close to individual neurons and then shifted around, or they have a lot of those sensitive electrodes and compute the locations of signal sources in 3D space. The latter would be a lot more economical.
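For the second approach, here's a minimal sketch of what that localization could look like, assuming (purely for illustration) that recorded amplitude falls off as 1/distance from the source; the electrode positions and numbers are made up:

```python
# Hypothetical sketch: recover a 3D source position from the amplitudes
# seen at several electrodes with known positions, assuming a crude
# 1/distance amplitude falloff. Positions and units are invented.
import numpy as np
from scipy.optimize import least_squares

electrodes = np.array([  # known electrode positions (arbitrary units)
    [0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1],
], dtype=float)
source = np.array([0.3, 0.2, 0.5])                        # unknown in practice
amps = 1.0 / np.linalg.norm(electrodes - source, axis=1)  # "measured" amplitudes

def residuals(p):
    pos, gain = p[:3], p[3]
    pred = gain / np.linalg.norm(electrodes - pos, axis=1)
    return pred - amps

fit = least_squares(residuals, x0=[0.5, 0.5, 0.5, 1.0])
print(fit.x[:3])  # should land near [0.3, 0.2, 0.5]
```

Real spike sorting is messier than this, of course, but with enough electrodes the inverse problem is overdetermined, which is presumably what makes the array approach economical.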
Ok, but I guess that recording which neurons fire when the subject is performing certain tasks does not fall into the category of "sticking a needle into a single neuron".
For humans, unlike for computers, adding and subtracting are different concepts. Think about how most people would transform "-3 + 7" into "7 - 3" and only then compute, because it's easier to "reverse" the operation than work with negative numbers.
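A quick sketch of why the two are the same concept to a computer: in fixed-width two's-complement arithmetic, subtraction is just addition of the complement (faked here in Python with an 8-bit mask):

```python
# Subtraction as addition: a - b == a + (~b + 1) in fixed-width
# two's-complement arithmetic. Emulated with an 8-bit mask.
MASK = 0xFF  # pretend we have 8-bit registers

def sub_via_add(a, b):
    return (a + ((~b + 1) & MASK)) & MASK

print(sub_via_add(7, 3))  # 4
print(sub_via_add(3, 7))  # 252, i.e. -4 in 8-bit two's complement
```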
It's not that surprising, but not for that reason (to a neuroscientist). While the circuit implementation for subtraction is non-obvious, the one for addition is, and with Dale's 'law' in mind, even without knowing the subtraction circuit, it is a very reasonable hypothesis that subtraction would be implemented by a different circuit than addition.
I'd be interested to see what difference, if any, there is in people who learned addition and subtraction in fundamentally different ways. Number line vs counting grids, pattern based, mental imagery, etc.
This is why, when someone asks me to pick a random number, I try to think of some random process going on in my surroundings, and from there derive the random number.
A common tactic for poker players who want to randomize some of their actions in a common scenario (call in one instance, or perhaps re-raise instead) is to look at their wristwatch: depending on where the second hand is, they can simulate a coin flip or a die roll.
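A toy version of the trick, with time.time() standing in for glancing at an analog watch (the thresholds are just one way to carve up the dial):

```python
# Sketch of the wristwatch randomizer: the second hand's position gives
# a coin flip (which half of the dial) or a die roll (which sixth).
import time

seconds = int(time.time()) % 60
coin = "call" if seconds < 30 else "re-raise"  # 50/50 split of the dial
die = seconds // 10 + 1                        # 1..6, one per 10-second arc
print(coin, die)
```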
GPT-3 and natural language generators of this kind can already present opinions if prompted to do so. Depending on what you mean by "opinion", you might also require some memory so it stays consistent in the opinions it presents to you. (Not that humans are always consistent either.)
If you dig any further than this into the question, you quickly get back to the age-old question of "when is it 'real' consciousness and not just an automaton that acts and sounds conscious?"
True. I'm just thinking about what makes AI different from people. Could "opinions" be such a thing?
When we say somebody has an "opinion", we often mean they have some hidden agenda behind raising it. It is not a fact but an opinion. It is frequently about what should be done, and to whom, and thus is most often self-serving.
Humans are intention-driven creatures that continuously try to advance their own agendas. So I wonder: are there AIs that exhibit similar behavior, trying to influence the behavior of others with their "opinions"? And do we need such AIs? Are they not evil?
I've confirmed this experimentally. (As in, I asked a bunch of people to "pick a number from 1 to 10").
But that was a long time ago, and I wonder if the common knowledge that '7' is the most popular would sway people to avoid it. I know I do. I should re-run this.
In any case, it's clear that 7 is the most random digit, right? The other digits are either even or otherwise "nice". 7 is chaotic and unpredictable. 7 sells loose cigarettes to middle school kids. 7 will leave the shopping cart in the middle of the parking lot. Of all the digits in [1-10], it's 7 who'll more likely than not be the one who left the bathroom stall without flushing.
You just reminded me of something from when I worked at a pizza place. Most of the time we sold pizzas cut the normal way. Either 6, 8, or 10 slices using a circular cutter. The normal style you might have in your kitchen.
But one Wednesday each month, we had a massive lunch order for a local school. Hundreds of individually-boxed slices, delivered just before 11:30. The slice box was sized for a 1/7 slice of our extra large pizza. We had to use a "wagon wheel" type slicer for those. It was a huge stainless thing that must've cost a fortune.
I always wondered why it was 7 slices and not 6 or 8. The best theory I could come up with was that these slices all had to be the exact same size; no variance from sloppy cutting. And the only way to ensure that would be to specify it as an odd number to make it impractical for the normal cutter.
Did you have to use a different cheese as well? My brief stint in the field had us dragging out the school district's special cheese and a wagon wheel cutter as well (although it was 8 pieces). I can only assume that you're correct about strict tolerances in slice size. My imagination creates a scenario where somebody got a small slice once and somebody overzealously mandated more equality.
I don't think so. I was a delivery driver and so was also responsible for pulling the pizzas off the end of the oven conveyor, slicing them, and boxing them, so I can't be sure if some special government cheese was used in the prep line. But my memory of the pizzas was that they were the same, just sliced differently.
When asked to "pick a number" or "pick a random number", why should people avoid the popular 7 or 17? What property are you expecting them to hit when you ask this question?
Also:
why was six afraid of seven?
because seven ate nine
Well when someone is asking me for a random digit, the last thing I want to be is predictable! To avoid the shame and embarrassment of being basic, I go with 2. Or maybe 9. Depends on my mood.
You feel like 1-5 are very familiar to you; you've known those numbers the best of all of them. So you'd rather pick from 5-10, which seem more spontaneous and mysterious. Your balance mechanism kicks in and you want to choose something in the middle. You choose seven because 7.5 is the actual middle, and hey, there's a 7 in that number, so let's do that.
Not very well. The brain evolved in an environment that encourages the assumption that events are dependent; the concept of independent events is not natural to us.
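A quick simulation of the point (a sketch, not from the article): even right after a streak of five heads, the next fair flip is still 50/50, however dependent it feels:

```python
# After five heads in a row, the next fair flip is still ~50% heads.
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]
run, streak_trials, heads_after = 0, 0, 0
for i in range(len(flips) - 1):
    run = run + 1 if flips[i] else 0  # length of current heads streak
    if run >= 5:
        streak_trials += 1
        heads_after += flips[i + 1]
print(heads_after / streak_trials)  # ~0.5; the streak doesn't matter
```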
I wonder if it is also plausible that these neurons are not active for calculation purposes, but are instead active because they are triggering the human emotional response of loss aversion.
> Be kind. Don't be snarky. Have curious conversation; don't cross-examine. Please don't fulminate. Please don't sneer, including at the rest of the community.
I haven't read the article, but isn't such a statement tautologically true? Because neurons exist at all abstraction levels of the thought process, if you compare any two distinct processes, you're bound to find (by the fact that they are distinct) some neurons that fire during one and not the other.
I don't think we know what the abstraction level of neurons is. It seems reasonable to assume that it's at the level of individual thought processes, but trying to prove that assumption might lead to interesting results.
People cling very hard to the idea that humans (or human "consciousness") are somehow special, and not at all like a computer. It will take a very long time to break this idea.
> if you compare any two distinct processes, you're bound to find (by the fact that they are distinct) some neurons that fire during one and not the other.
I would doubt that. It depends on which neurons fired before it; the neurons firing beforehand might drastically change the context. It doesn't seem like a single neuron is responsible for a single piece of information; it is rather an emergent phenomenon of combinations of neurons firing. But I'm speculating.
This is garbage research. Nine participants took part in a study where they were prompted to do mental arithmetic, adding or subtracting numbers between 0 and 5, while wired up with ultra-fine electrodes measuring how frequently 585 individual neurons fired while completing the task.
I presume at that point the data were mined for a publishable result. What they came up with was that by selecting a very small subset (~5%) of the neurons they measured, they were able to tease out a result that certain neurons 'encode addition or subtraction'.
Is this real? Maybe. It's in no way explanatory. It doesn't offer any sort of model hypothesis or predictions worth testing. It's really a waste of time.