alexvr's comments | Hacker News

Just a friendly reminder that morality is just a little human social construct associated with the pain-pleasure mechanism that evolved to guide intelligent machines toward evolutionarily advantageous ends. It's not a fundamental "problem" or anything if someone dies or suffers unthinkable agony. It's just that collections of molecules that avoid these things tend to survive. Outside of silly little human opinions, you're not a "bad person" if you stab people with scissors for fun, and you're not a "good person" if you cure cancer. These are both just "happenings" of nature, nothing more than what they are. Please don't waste your time pondering ethical dilemmas because it's as silly as arguing about the location of the garden of Eden or whether the population of leprechauns is greater than the population of unicorns.


Yeah, but if none of us had this construct of 'good' or 'evil', we wouldn't be able to cooperate. We would all be killing each other to get resources; physical strength and brutality would become the most important evolutionary traits.

I almost certainly would not be able to exist in such a world, so my existence (and that of millions of people like me) relies on the fact that people believe in 'good' and 'evil'. So I support the notion.

That said, the notion of 'good' and 'evil' doesn't have to be homogeneous throughout society. In our current economic environment, it pays to be slightly evil because it lets you take advantage of people who are less evil than you. If you can do 'evil' things but manage to deceive people into believing that what you're doing is actually 'good', then you have a huge evolutionary advantage.

I think hypocrisy is a huge evolutionary advantage as well.


Everything is just happenings of nature, so by itself that tells us nothing about how we should use our time. If you consider that particular action a waste, you're implicitly making a judgment using that silly social construct.


A class on moral philosophy screwed me up for a while early in college. All the critical thinking and fancy vocabulary about the topic made me think morality was in some way real - I was all concerned about violating "moral laws." It's amazing how smart people can be so grossly deluded and incorrect about things like this.

"Am I wasting my potential?! Is this action maximizing my contribution to general welfare?! Is Famous Person better than me because he helped more people?!" Totally neurotic.

This kicked off an era of serious philosophizing, and I began to see countless contradictions and paradoxes with utilitarianism, etc.

For example, I started to see that the notion of "selfhood" was just a social invention or cognitive construct, because I reasoned that we're just perpetually changing aspects of nature, and our separateness is just opinion. So then I wondered how the hell anyone could be deserving of blame or credit if they don't actually exist, or if it was their "former self" who committed the crime, etc.

It's kind of annoying but cute to see some popular "thinkers" and writers -- fancy-smarty-pants _neuroscientists_ and _atheists_, even -- who actually think morality is real, as though there are actual objective problems out there somewhere. As though you could actually do a "bad thing" or a "good thing." That grinds my gears a little because it's very hypocritical: They'll write an entire book disparaging religious people who believe things without evidence, and they'll write another book on why, according to their pseudoscientific-philosophical horse shit, morality can be "derived from science" [vomit].

But it's easy for smart people to cling to morality as an existential anchor point when they don't have religion to fall back on. It's hard to accept that you're in free fall. But it's nice once you come around and accept reality for what it is.


What words would you use to tell your friends and family about why someone shouldn't stick a gun in your face and take anything they want?

And if the complicated and often contradictory paths through considerations of ethics are 'annoying' to you, why? Are they 'wrong' on some moral plane that 'doesn't exist'?

Ethics, life, and why we are all here is hard stuff. But if you think there's no point, then please don't vote in the rest of our elections this fall!


It's not wrong to stick a gun in someone's face. It's fine to pull the trigger too, if you're not considering the laws our civilization invented. It's all just opinion: You might not like being robbed, shot, or killed, but that's your opinion; it's not "bad" in any real way whatsoever. I don't like pain or the idea of dying before I'm ready either, but it's not an actual problem or anything like that. Just preference. It would be quite remarkable to somehow violate the way things are. Next time you see someone do something "wrong," or "bad," or "unethical," please try to use your senses to observe the "bad" or "problem" in the situation. Where is it? I'd love to see a picture of a real violation of nature, a real problem.

To your second question, such things are annoying to me because it is my nature to be annoyed by ignorance. Many humans are naturally compelled to seek understanding. There is nothing wrong with ignorance; it's just my nature to find it annoying.

Also, downvoting my comments doesn't make them incorrect.


> why someone shouldn't stick a gun in your face and take anything they want?

If they're the government, they call that taxation.


>It's kind of annoying but cute to see some popular "thinkers" and writers -- fancy-smarty-pants _neuroscientists_ and _atheists_, even -- who actually think morality is real, as though there are actual objective problems out there somewhere. As though you could actually do a "bad thing" or a "good thing."

It seems that you conflate real with "made of molecules".

Things like morals are real in the sense that people agree on them.

It doesn't even have to be all people -- after all, some people disagree even about concrete, made-of-molecules stuff (e.g. a crazy person believing a tree is a demon, a conspiracy theorist not believing in the moon landing, or a psychotic seeing spiders on his arms).

And, in that respect, it's quite easy to see that helping an old person who has fallen get back up is something good, while raping children is not.

It's not even the case that people will take sides on the matter; the huge majority will agree on both those labels.

>That grinds my gears a little because it's very hypocritical: They'll write an entire book disparaging religious people who believe things without evidence, and they'll write another book on why, according to their pseudoscientific-philosophical horse shit, morality can be "derived from science" [vomit].

You do understand that both your charge against this "hypocrisy" and your disgust at "pseudo-philosophy" are based upon a moral stance, right?


> Things like morals are real in the sense that people agree on them.

That is sufficient for a cultural relativist. But not for the bulk of moral realists, who want to say that it is possible for whole cultures, or even the human race as a whole to be just wrong about some moral proposition.

It's not even a hypothetical exercise. For example: to the moral realist, branding slavery as unacceptable was not just a matter of switching from one social convention to another. It was about switching from being wrong (both factually and morally) to being right.


>And, in that respect, it's quite easy to see that helping an old person who fell down to get up is something good, while raping children is not.

I think it's a bit more difficult than that. After all, in many cultures, it's fine to rape children, as long as you rape the right children.


That would only mean that morality is relative across cultures, not that morality doesn't exist within the same culture.


That doesn't really get to the heart of the question, as regards the social psychology of morality: why has some culture arrived at the particular moral notions it teaches and enforces? What properties, objects, or circumstances are they representing?


>why has some culture arrived to the particular moral notions it teaches and enforces?

Based on its history -- and because enough people came to believe that such a moral code promotes its wellbeing and interests better than alternatives.


That is no kind of precise causal explanation.


Morality is "real" to the extent that we acknowledge that all humans express (or are capable of expressing) moral intuition. You can't hold the emotion of happiness or the feeling of hunger in your hand, but I would wager that you have strong intuitions about their existence as "real" things.

Utilitarianism isn't the end-all-be-all of moral philosophy. By modern standards, it isn't even particularly popular, compared to hybrid (partially deontological) theories. I think you would also be hard-pressed to find a philosopher who thinks that moral theories can be derived from science alone. Scientific knowledge is often used by philosophers to explain an intuition or support a theory, but few are likely to advocate for cannibalism because we have observed it in other species (and some remote uncontacted groups). That would be an appeal to naturalism or the mere state of things, neither of which is compelling in theories that are meant to explain how we ought to behave.

Thinking about morality as a nonabsolute lands you squarely in the land of relativism. That's a very comfortable place to be, until you meet someone who likes to burn the paws of cats for fun (to borrow Singer's analogy) and have no recourse against their unambiguously immoral behavior.

To wrap up, the goal of moral philosophy is not to "anchor" your preexisting morals and make yourself more comfortable - it is to take (all) moral principles to their logical limits, exploring inconsistencies and gaps that would be unacceptable if applied consistently.


Thanks for the comment. I have to disagree with burning cat paws being unarguably immoral, unless you're talking about a completely fabricated, cognitive, social definition of morality. If you declare, rather arbitrarily, that anything causing pain intentionally for no good reason is in our discussion referred to as "bad" or "immoral," I will agree that burning cat paws is immoral. But that's just a definition we invented; in reality there's no issue with burning cat paws for fun or cutting your fingers off when they get dirty or shooting rockets at the tailgater behind your car. Yeah there's going to be pain involved, but there's nothing fundamentally wrong with pain at all.


Consider the following: What type of world would ours be if everybody tortured cats for fun? Intuitively speaking, would that be a "good" world, or at least one that is more "moral" than our current one?

Morality itself may not be universal in the same way that protons and neutrons are - it's unlikely that CERN will ever find a fundamental particle that interacts with humans to produce moral behavior. That being said, it isn't necessarily impossible to conceive of a (universal, consistent, intuitive) code whose rules maximize some moral good[1] (whether that good is pleasure, longevity, number of M&Ms per capita, etc).

We can argue back and forth about whether moral goods are social or universal (I personally think that it depends on the moral good being considered), but neither option seems to detract from the core calculus of moral philosophy - whether or not someone ought to do something. If that fundamental question resonates with you, then you've just made a moral consideration. The trick then is to understand why you've made that consideration, whether or not it is a consistent one, and what first principles inspire it. If you think that question doesn't resonate with you, you might be an amoralist[2].

[1]: I apologize for re-using "good" here - I'm using it in the sense of resources, not the moral sense. In the context of (utilitarian) moral philosophy, moral goods are those resources that we ought to maximize.

[2]: Bernard Williams has written very extensively on amoralism, and particularly on the inconsistencies that amoralists must concede upon being presented with their own behavior. Ethics and the Limits of Philosophy and Morality: An Introduction to Ethics both contain great passages/chapters on amoralism and its inviability as a philosophical model.


>For example, I started to see that the notion of "selfhood" was just a social invention or cognitive construct, because I reasoned that we're just perpetually changing aspects of nature, and our separateness is just opinion. So then I wondered how the hell anyone could be deserving of blame or credit if they don't actually exist, or if it was their "former self" who committed the crime, etc.

I have a great rebuttal to that kind of nihilism -- not that you need it anymore. People get weird ideas because they subject about half of the universe to nihilism while forgetting the other half.

If everything is just meaningless matter and meat, then so are ideas like credit and blame. "Blame" is just an abstraction for a class of neural signal configurations. Someone might say that there's nothing "wrong" or "right" when it's all just bullshit atoms, but then there's also nothing "wrong" with thinking right and wrong are real.

These ideas don't need to be somehow universal truths... they just need to help physical systems of matter do the shit that physical systems of matter "want" to do.


I think it's obvious that there's also nothing wrong with thinking that nonexistent things are fundamental aspects of reality. You (not you specifically) can have mental models that are not in line with reality to your heart's content. It's not a problem. Trees and rocks don't know what's going on either.


You sound dismissive and petulant. Objective moral truths have to be real (or at least tractable if you're, e.g., a utilitarian) unless you're willing to make some very dubious concessions, such as: it's not morally and objectively wrong to hurt someone for no good reason.

Of course, some DO make such concessions (Singer being one of them), but I think that's just throwing the baby out with the bathwater.


It's NOT objectively wrong to hurt people on a whim! I concede this very easily. It's not dubious whatsoever. It might be illegal or make you unhappy, but please let me know why this is objectively wrong. Do you think "wrong" is more than a little human symbol referring to a little human opinion? How does hurting someone violate nature? How is it a problem? Are there invisible arrows that point to "bad" things? Do you actually think pain is a fundamental problem? What's "dubious" is thinking there's such a thing as an objective "bad" out there somewhere. Go find it and take a picture. Measure it and please document it for the human scientific enterprise because it would be a fascinating (but very disturbing) discovery! I'm not trying to be mean here - I just want more people to realize this. Our society is brainwashed. Nothing matters whatsoever outside of pure opinion, as far as we know. Wake up, people.


> Singer being one of them

Example??


Singer, to defend (late-term) abortion, argues that infants aren't "people", as personhood is a function of quality of life. A baby or toddler, for example, has very little quality of life compared to a full-grown adult [1][2].

This is a pretty unintuitive position (and imo wrong), but at least the man takes it to its logical conclusion, so, you know... respect.

[1] http://www.utilitarian.net/singer/by/1993----.htm

[2] https://www.lifesitenews.com/opinion/just-being-human-doesnt...


But the reason this is important, to my understanding, is that the cost of bringing up a baby born with severe mental issues is a lot of money.

Human lives have a cost; it varies a bit depending on how you measure it, but it's around $1 million in Western countries.

So if the cost to bring up this child is more than that, you are effectively killing someone else, since the same resources could have saved another life.

So by stopping the creation of this costly life, society is better off and lives are saved.

Certainly the concept that a baby suddenly becomes a life just after being born is very flawed.

So Singer doesn't take the position you describe: he believes personhood doesn't exist at that point. He is not arguing for just killing someone.

If you could scientifically prove a baby is a person, he would change his opinion immediately.


Some of your points resonate. But are you saying you choose a definition of "exist" whereby nobody exists?


I'm glad. They are disturbing thoughts at first but then they become profoundly liberating.

The "self" or "ego" is an illusion. It's very challenging to see beyond it when you're raised in the competitive, egocentric America. It's not at all obvious until it hits you.

I'm of course not saying that bodies and brains don't exist, just that they are only subjectively and almost arbitrarily separated from Nature. For example, outside of opinions, people are no more significant as entities than, say, doors. Can't doors be considered walls? Sure; it's an opinion, a name for a fuzzily-defined feature of the world. It's not a special entity in a computer system. Same with "person": It's just a name for a concept we have. It's arbitrarily defined. You could just as easily create a new word, like "personite," and define it to mean what we normally think of as a "person" plus everything in a 10-ft-radius sphere from their nose.

It's just like "solar system" - it's almost arbitrarily defined. There's not a hidden sphere delineating solar systems just like there's not a hidden mesh delineating our bodies. We just approximately agreed on these definitions. It's also just like the notion of "alive": there's not a cosmic Boolean keeping track of whether or not an aspect of nature is alive or not; "life" is just a concept we invented, almost arbitrarily defined.

As I see it, we're always "dying" in a sense: Are you really the person you were this morning? Can you have a conversation with him? I'm pretty sure he's gone, or in a position along the dimension of time that we can't access from this point in time. The idea that you're in some way the same "person" is an illusion of memory: It's just that a very similar brain has memories of what it was like to experience the world in a very similar body this morning.

So yeah, the thing "my body" refers to certainly exists, but it's not some kind of separate or significant "entity" except within our opinionated minds, perhaps; it's just "what the Cosmos is doing in that general area."


I share that slightly nihilistic vantage point in principle. I don't find it helps make decisions too often. I'm much more helped by thinking I exist, treating existence as a concept, requiring no metaphysics.

Nitpick on "illusion": A map is not an illusion of territory; it's a tool for navigating. Mirages and disappearing coins are illusions, useless for navigation.

Nitpick on "bla bla competitive America": lots of people outside America believe they exist.


Remind me how the correlation between openness to experience and IQ test scores suggests that seeking novelty might possibly increase your cognitive potential. I think I missed that part. With the same kind of "reasoning" I might have to write an article claiming that you should take lots of psychoactive drugs and have more sex to increase your cognitive potential, because there are positive correlations between these things and IQ scores. Also, if you want to get smarter, listen to the bands on the right of this chart! http://musicthatmakesyoudumb.virgil.gr/ ...why? Because correlations, of course! Much scientificful.


Certainly it's a little misguided to suggest that an OCEAN correlate implies you can train intelligence; OCEAN is relatively stable across your life (that's what makes it a good personality metric). That said, I know at least one person who scored low on O taking an OCEAN battery but is in fact very diligent about seeking out novelty, so I think the O-factor depends more on how enthusiastically you view the world at large versus deciding you want to be a polymath and working very hard at accomplishing that goal.

On the other hand, being able to have lots of sexual partners implies being able to convince lots of people you're worth having sex with (this is one of the two main goals which developed human intelligence in the first place, the other being killing people who were doing the first too effectively), and taking a lot of psychoactive drugs I think also implies a sort of mental fortitude (it's very difficult to habitually take strong psychedelics without having a solid grasp on reality and how to adjust your actions to account for the change in perspective, the people who can't do this but enjoy psychedelics anyway aren't who anyone is thinking of when they say intelligent people use psychedelics).

I listen to mostly metal, so you can trust what I say [https://www.theguardian.com/music/musicblog/2007/mar/21/whym...]


From the article: 'Excellent learning condition = Novel Activity—>triggers dopamine—>creates a higher motivational state—>which fuels engagement and primes neurons—>neurogenesis can take place + increase in synaptic plasticity (increase in new neural connections, or learning).'

Not sure if it's scientifically correct, though.


My rather unscientific observation is that 'seeking novelty' enables you to think about things in different ways, to consider new approaches. I think 'expanding one's consciousness' could be aptly said in this regard.

Playing a sport on a team involves so many dynamics that just don't exist when thinking in a classical intellectual sense whilst writing code, for example.

Living in a country for an extended period wherein the prevailing language and culture is not your own ... this can be really quite mind altering.

I think that the above correlation, pulled from the article, maybe doesn't quite capture it. They basically state that 'novelty keeps you interested and motivated'; I suspect there might be more to it than that.

Though obviously it's hard to discern since we don't really have a true model for brain functions and cognition.


What? Why doesn't it work like this:

Cell phones have SIM cards with an ID and a secret key. Cell service providers have a database of these SIM associations. Cell phones encrypt IP packets in their entirety with the symmetric key and send them as the payload of some cell protocol packet that might expose my ID, if anything. Assuming the cell provider is secure and not on the dark side, this is the safest part of my packet's trip.

I don't understand how a cell-site simulator could see what websites I visit, much less the messages I send, without knowing my key. And it's not like one could trick my phone into thinking it's the actual cell site, because it won't be able to respond to my transmission with a message that my key can decrypt.
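For concreteness, here's a toy sketch of the model in my head, in Python. Everything in it is made up by me -- the names, and HMAC standing in for whatever keyed algorithm real SIMs use -- so it's my mental model, not the actual cellular protocol:

    import hashlib, hmac, os

    # The provider's SIM database: subscriber ID -> shared secret key.
    SIM_DB = {"imsi-12345": os.urandom(32)}

    def sim_response(key, challenge):
        # Stand-in for the SIM's keyed algorithm.
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def tower_verifies(imsi, challenge, response):
        # A cell-site simulator has no access to SIM_DB, so in this
        # model it could never verify (or forge) a valid exchange.
        key = SIM_DB[imsi]
        return hmac.compare_digest(sim_response(key, challenge), response)

    challenge = os.urandom(16)
    response = sim_response(SIM_DB["imsi-12345"], challenge)
    print(tower_verifies("imsi-12345", challenge, response))  # True

In that model, a fake tower without the provider's key database can never complete the exchange.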

What the heck am I missing?


FBI: "Hey, cellular provider, give us the secret key and ID for X." Provider: "Sure, thing, just one moment." ... "Here you go."

---

Or, if your provider has a bit of a spine:

FBI: "Hey, cellular provider, give us the secret key and ID for X." Provider: "Got a warrant?" FBI: "No problem, give a half hour to call our go-to judge." / "No, but here's a NSL."


That's assuming that is even necessary. Harris made an upgrade to their Stingray equipment, called Hailstorm, that can intercept 3G and 4G traffic.


2G ruins everything. It is effectively wide open now, and handsets will connect to the strongest signal. This is one of the oldest problems in cryptography: it doesn't matter how great the latest and greatest is so long as the old broken standard is still widely used and supported.


Until you can purchase a phone that is not compatible with 2G, you will always be at risk of fallback attacks.

It works sort of like what you are describing in 3G & 4G networks.

So to answer your question: what you're missing is that there are no phones that simply won't use 2G (short of a disable function buried in a user-unfriendly engineering menu).


The ID is probably the most important thing for them to track.


These devices do not necessarily have insight into the contents of your communications; their main feature is that they can uniquely identify and locate a phone.


https://www.youtube.com/watch?v=DU8hg4FTm0g

I watched this DEFCON talk a while ago. Not sure if it's still relevant, but it is quite worrying.


I find this deeply disturbing, particularly because there has never been a better time in recent history to NOT give a damn about how you perform in traditional school systems, if you are unlucky or ignorant enough to be imprisoned in one in the first place. Competitive anxiety is one thing when it compels people to achieve amazing things, and another entirely when it clouds judgment and makes people miserable and on edge so they can merely out-compete the next person, even if the entire competition is largely just a contrived and futile ego game to begin with. People should first think long and hard about why they are striving for whatever it is they are stressing over.

But the truly disturbing thing is that traditional schools still exist, and a frightening number of people -- even some smart people! -- still consider the system humane and somehow necessary. I sometimes can't believe that my younger self was effectively forced to do push-ups, run laps, eat lunch in a crowded cafeteria (that is, navigate the social landscape as someone with anxiety), and memorize court cases, poems, and Greek column types because some older person decided that it was good for me. Because overall it wasn't, and the reason I managed to finish K-12 as a reasonably educated human being instead of a sheep is that I realized the contrived nature of traditional education very early -- like, second grade -- and placed actual education (and, maybe unfortunately, silly but outrageously fun MMORPGs) above “school for the sake of grades”. I learned a lot from school, and even plenty from coursework, but I can hardly imagine the superior human I might be had I been allowed to focus on my personal interests and learn at my own pace.

School is a really bad place for smart kids. It’s disgusting. For every would-be gang member or fry cook who turns out significantly better because he was forced to attend school, there are probably many thousands of bright young people who would much rather be inventing something, experimenting, productively socializing, or studying what they find interesting. If I do one meaningful thing in my life, it will be helping to make that the norm. It’s really a sad thing that some kids with plenty of potential are misguided into thinking that a GPA is more important than knowledge, experience, curiosity, and comprehension. And it’s an atrocity that staggeringly gifted young people can grow up thinking they’re poorly endowed weirdos because, instead of memorizing their way through school, they struggle to actually understand things since their minds refuse to take “that’s just how it works” for an answer, and they bother to "waste time" wondering, "What if...?"

There are far more humane ways to criminally detract from the most important years of young American lives, if that’s the goal of compulsory education.


Such a great comment!

"And it's an atrocity that staggeringly gifted young people can grow up thinking they're poorly endowed weirdos because, instead of memorizing their way through school, they struggle to actually understand things since their minds refuse to take “that's just how it works” for an answer, and they bother to "waste time" wondering, "What if...?"

This is exactly how I felt through school, and I did poorly because of it. I see these issues with my son too, but he is excelling because I heavily encourage his curiosity, and he has been lucky enough to have some great teachers who appreciate his obsessive need to know "why" or dig in deeper. I do worry, though, that he won't be so lucky as he moves into high school.

"It's really a sad thing that some kids with plenty of potential are misguided into thinking that a GPA is more important than knowledge, experience, curiosity, and comprehension."

This is so true. Something that recently really stood out to me was the lack of critical thinking and troubleshooting skills in kids, which I discovered in a summer workshop I was part of. Only about 10% of the kids (ages 8 to 12) had decent critical thinking or troubleshooting skills. The rest just wanted to be told what to do and how to do it. Why? Is it because we are so focused on getting kids to reach certain milestones that we are not teaching them or even encouraging critical thinking?


By the time kids who are taught to program in K-12 enter the workforce, knowing syntax, computer details, and data structures will be a thing of the past or a thing for really hardcore engineers who have to use a strict language like Python. By that time, most common programming tasks will be a matter of writing plain English and testing the AI interpreters, and students will be stuck learning archaic programming just as they are presently forced to learn archaic and impractical math techniques that could be done very intuitively with a few lines of Python and arithmetic. I say this jokingly, but there may be some truth to it.


I hope he gets out of this rut and finds a way to be happy and do more good things. It's a little painful to watch a smart guy with billions spend his time making gimmicky video games and partying and being generally unhappy.


I've been worried about him since long before the Microsoft sale. I hope he gets some real help.


Thanks for the warning. I was really hoping to see Halo or Quake or something, because in my experience practicing these games in a casual-competitive way, I hit a performance ceiling. In high school I played Halo nearly every day, and a bright friend of mine, who didn't even own the game, was a surprisingly good player for his limited experience.

On a similar note, I once read that reaction time and IQ are correlated (not sure how strongly), which is interesting because you might expect motor functions like that to be orthogonal to higher-order cognitive abilities.


> On a similar note, I once read that reaction time and IQ are correlated (not sure how strongly), which is interesting because you might expect motor functions like that to be orthogonal to higher-order cognitive abilities.

Speed is good, in many contexts. Say you've got two people, and person A has overall cognitive speed 25% greater than the other's.

That doesn't just help with Jeopardy!, it gives you 25% more time to think in normal conversations, on SAT tests, while playing video games, at work, etc.

Even a 10% speed deficit relative to a baseline human is a big, big disadvantage.


> Even a 10% slower speed differential from a baseline human is a big, big disadvantage.

This would make sense if all thinking were equal and only the rate of thinking varied. In practice, the quality of thought processes seems much more important than the rate at which such processes are carried out.

For example, attempting to assess "cognitive speed" can be very dangerous in interviews. It seems like such a promising metric. A candidate who answers questions 10% faster than average will generally be much more impressive than a candidate who answers questions 10% slower than average, and it's tempting to think person A will be 22% more productive (1.1/0.9 ≈ 1.22). Of course, over time, it may turn out that he occasionally provides more bad solutions to problems, or can only solve superficial problems, or solves the wrong problems. And you realize the person who consistently solves problems correctly (albeit at a slower pace) is a better choice than the person who introduces new problems as quickly as he solves existing problems.

It could always be argued that person B is actually thinking faster, but thinking through the problem much more carefully, causing him to verbalize his answer later. Or perhaps both people are thinking at the same speed, but person B has simply thought more. Maybe person B is actually thinking slower but more effectively. From an outsider's perspective, we can't know what's happening under the hood, which makes "cognitive speed" a weak metric for judging effectiveness in practice.


> In practice, the quality of thought processes seems much more important than the rate at which such processes are carried out.

Of course the quality of thought matters.

But if you have two individuals that have the same quality, but different speeds, the faster one will seem smarter.

Think of a fast flowing conversation. A slower thinker is going to miss references and associations that a faster one will get. So the faster thinker will learn more from the conversation, and be able to participate more. This has many benefits.


This. You want some easy proof? Watch a video of Steve Jobs answering questions, like the WWDC 1997 Q&A video. Time how long it takes him before he starts speaking after the questions are asked, and note the quality of the answer.


Yes, but presumably in those long pauses, a faster thinker would have time to consider more possible replies, make predictions on how the audience would receive them, and generally make more adjustments and improvements than a slower thinker.

In other words, faster cognition would probably help you formulate a slow, considered reply just as much as it would help you answer more quickly. That might be especially helpful in a public speech to a group asking unplanned questions.

I agree with freyr, though, that quality of cognition/ideation matters more than speed, and doubt that 'thinking speed' is necessarily correlated to better thought output.

Like IQ itself, faster basic cognition probably means something, but I don't think anybody knows what it means or how (or even if) it relates to "intelligence" (whatever that means).

In my personal experience, I have seen more great ideas and solutions come from "weird thinkers" than "fast thinkers". I realize that sentence does reduce to the cliche, "Think outside the box, bro," but it seems true over the course of all that I have observed in my own life.


In that example, it may also simply have been that Jobs had been prepped (or prepped himself) thoroughly with answers to a lot of likely questions.

I don't put a lot of credence in the speed thing here. I know one senior executive who made a point of writing down questions on cards when he was asked them live at an event. Which I always thought was a rather clever approach to 1.) make sure he understood the question correctly and 2.) give himself some time to formulate the best response.


Reaction time is good in general, but rather useless in traditional IQ tests, i.e. culture-neutral, multiple-choice, pattern-based questions. If you are 25% faster than your neighbour, it gives you the benefit of an additional 25% of the reaction time, not of the total answering time, which is usually much larger in the context of IQ tests. Or nearly any test, for that matter.


Oh, sorry if I wasn't clear. Reaction time is strongly correlated with the overall speed of brain function. So it applies to other brain operations too.


Normal IQ tests should also contain a computerized part where you have to react to and solve problems on a computer under a time limit. Last time I did that, they still used Win2000, but I can only imagine the use of computerized testing has increased since.


A possible explanation for the correlation with reaction time is brain damage. If you are exposed to more lead as a child, that will damage both your motor cortex and your higher cognitive functions, even if those aren't otherwise connected. The same goes for mutational load: a random mutation might make your neurons slightly less efficient across the board (everyone carries such mutations, but our genetics is redundant enough to handle most of them).

But I wouldn't expect a perfectly "normal", undamaged human to be a genius; they would just have superior reflexes. And there are tons of animals that are dumb but have amazing reflexes.


Processing speed is only one part of IQ tests. You can see a pattern at higher IQs where processing speed is lower than normal but other parts nearly max out. People with jobs where you always have to be meticulous (like software) can show this pattern. Also, processing speed is weighted lower in IQ tests than things like pattern matching or problem solving.


In an FPS like Halo, the relevance of different skill sets is much harder to measure, but I think it's a delicate balance.

If I were forced to choose, I'd say motor skills and reaction time combined with spatial intelligence are the key drivers of success. I have yet to see an FPS professional who lacks tremendous shot accuracy and agility and relies only on strategic skill. But I have seen some with great shots and spatial awareness make many questionable decisions. But my exposure is limited and my perception is biased.

I'd guess general intelligence shares relevance almost equally as it affects weapon utilization, detection avoidance, enemy action prediction, strategic positioning, verbal skills (team games), random trickery, etc.


Well, you could think about it in the sense that quick mental reaction time is necessary but not sufficient for quick physical reaction time.


It's really sad, but in one way kind of laughable, to see so many ostensibly bright students wasting their time cramming useless crap into their brains for AP history and calculus tests, and doing socially acceptable "extracurricular activities," all so they can get into socially acceptable or "impressive" universities where they can pay obscene tuitions and try to figure out what they want to do with life. Compulsory K-12 education is really bad for smart students who know what they love to do and what they're good at, because it's a massive, needlessly competitive distraction. Not all smart kids need to learn the details of calculus or chemistry or history, it turns out, because not all smart kids want to be math professors or biochemists or historians! Who woulda thought?!

-----

Oh, you're a talented young engineer? You can write a program in 3 minutes to approximate nearly any integral? Fuck you, we don't even teach programming in high school. In fact, such witchcraft is prohibited here. Instead, spend half a year of your life memorizing these integration tricks because we're definitely still in the 19th century. Speaking of which, don't forget that you have a big history project due tomorrow because you will no doubt be required to distinguish between Greek column types when you're in the real world!

Oh, you have a natural gift for writing? Too bad you don't know what a gerund or past participle is! You must be dumb! Let me tell you: In the real world, it is imperative that you be capable of diagramming sentences. Yes, you deserve to fail grammar tests even if your grammar is impeccable in practice.

Oh, you taught yourself conceptual aerospace engineering in elementary school? Fuck you and your creativity and advanced knowledge; you must follow directions to the letter in engineering class to build this cardboard rocket! Engineering is all about following directions!

-----

Yeah, I've seen some nasty things in school.

I hope I see the day when students don't feel pressured to learn things just to make the grade or "keep up" with the fierce competition. Maybe young people will increasingly realize that competing to be #1 in the Great Conformity Competition is really dumb because it actually makes them less competitive where GPAs don't matter.

Imagine where you'd be if you were given the chance and encouragement to really focus on the things you loved while growing up. Young people should not be led to believe that there is one correct path for everyone.


How could you possibly write a program that "approximates nearly any integral" without having learned calculus in the first place? And by that I don't mean "spending half a year memorizing" any tricks. (If that's what they taught you, I am sorry to say you had a pretty awful teacher and you should seek to educate yourself elsewhere.)

Oh yes, I forgot about the 3-minute part. You didn't write anything from scratch. At best you googled for a math library and hooked it up to your hello-world program... if not downloaded the whole damned thing from easy-Aplus.com!!!

Nevermind, rest assured that grownups never had to learn anything in order to build the infrastructure you happen to enjoy today.


First, these are all exaggerated cases, inspired by experience. I appreciate the infrastructure a great deal, but I'm saying that there's a tremendous amount you can do if you focus on learning concepts and have a computer. School doesn't even care if you understand concepts; memorization is usually sufficient. And anyone who understands basic conceptual 1-var calculus and programming can indeed easily write functions in plain C that perform numerical integration and differentiation, because the hard part about calculus is putting up with the countless algebra tricks that are not obvious to mortals. Don't know how to manipulate that monstrous algebraic expression to take the limit? Good thing your computer can use some tiny floats. People who are not going to be math/physics professors or NASA physicists who need to plan a Mars voyage to the nearest nanometer would really benefit from focusing on the concepts instead of the algebra. And my calculus teachers were pretty superb, by the way.
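To make that concrete, here's a minimal sketch of what I mean. It's in Python for brevity (the same dozen lines translate directly to plain C), and the function names are just mine:

    import math

    def integrate(f, a, b, n=100000):
        # Midpoint Riemann sum: chop [a, b] into n strips and add them up.
        dx = (b - a) / n
        return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

    def derivative(f, x, h=1e-6):
        # Symmetric difference quotient: the limit definition with a tiny h.
        return (f(x + h) - f(x - h)) / (2 * h)

    print(integrate(math.sin, 0, math.pi))  # ~2.0
    print(derivative(math.exp, 0.0))        # ~1.0

No algebra tricks anywhere; the only calculus required is the conceptual definition of the integral and the derivative.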


Why am I not able to downvote this? You are wrong on so many levels I don't even know where to start.

I'll just say that a good general knowledge of various topics will greatly help in life in a variety of situations. What you get taught in high school, after all, is the basics of the basics. If you don't know what a gerund is... sorry, you're ignorant, and I don't care how many stars your JavaScript library has on GitHub.


I don't know what a gerund is, and I've never needed to except in 8th grade grammar tests, which I failed. Yet I got a 79/80 on the writing section of the SAT because my grammar is pretty solid. I've been in a remedial English class, actually, with ESL kids. Like yourself and most other rational people, I am a big proponent of acquiring knowledge on diverse topics. I just think people should be able to pursue the topics they're interested in instead of dealing with distracting busywork that old people decided was a good fundamental curriculum. Do you fail to see the problem with schools forcing kids to memorize things about Native Americans and Greek columns and long division (etc.) when 1) most do not care about such things and 2) even more will never need to know such things? It's one problem to have an outdated curriculum and another to force a set curriculum, even a great one that works really well for the average student, on kids.


I understand and completely agree with your issue about memorization. In my experience good teachers don't force you to simply memorize formulas/details but make you appreciate the meaning of the concepts behind those. After understanding the concepts the formulas come out by themselves and look obvious.

E.g.: it's important to know that Greek history is divided into three pretty different main periods and that each of them had different column designs, but beyond that, knowing the details of those designs is pretty unimportant.

A gerund is a pretty easy and obvious notion. Personally, I just know it without even thinking about it, probably because I was taught about it at a pretty young age. To me, someone not knowing what a gerund is is like someone not knowing what multiplication is.

Honestly, anything before college is a pretty young age, and 99% of people aren't sure (or shouldn't be sure) about what they're going to do with their life because they don't know enough about it yet. Even that 1% will benefit from basic knowledge in the long run, because that's what makes you a person with basic culture who can converse on a range of topics.

We humans are not computer programs specialized on solving a single task. High school makes us more like an OS, providing the features to solve (hopefully) any kind of problem we encounter in life.


Those "integration tricks" are introducing you to some of the most basic fundamentals of mathematical thought. Pay attention, kid, you'll have a lot harder time picking up maths in the future if you let it slip now.


Actually, most algebraic tricks in calculus, which are what make it challenging, are pretty unnecessary to know if you can program a computer or use existing math software. Conceptual tricks, on the other hand, which might be more aptly called "conceptual applications," should be understood and derived. The unfortunate thing is that many students get so accustomed to memorizing tricks that would indeed require really advanced math abilities and lots of time to derive that they gloss over valuable concepts that they should actually understand very well.
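For example (a hypothetical illustration using SymPy, one piece of existing math software): an integral that takes a memorized trick by hand is one line here, while knowing what an integral even means still has to come from you:

    import sympy as sp

    x = sp.symbols('x')
    # By hand this takes integration by parts twice plus an algebraic
    # rearrangement; the software does the manipulation for you.
    print(sp.integrate(sp.exp(x) * sp.sin(x), x))
    # -> exp(x)*sin(x)/2 - exp(x)*cos(x)/2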


No, not actually.

I only used the word "trick" because you did. There are NO TRICKS in math. If a practitioner uses the word "trick" it's because he's being facetious or joking. The algebraic manipulations which you find challenging are absolutely essential to master inside-out to be able to follow more advanced mathematical reasoning. You'll see these "tricks" again in a profoundly more generalized form if you study Algebra again (abstract algebra, that is).

There is utility in doing algebra on your computer when you're dealing with expressions that run literally pages long. That is done to save time and reduce the probability of errors -- and NOT because you "can't" otherwise do it. Even then, you'll need to manually sanity-check the work using skills you learned doing all those tedious problem sets.


A culture of memorization is a social issue and an issue with teachers, not with the curriculum. Or do you think a teacher can't tell the difference between a student who memorized a few formulas and a student who understood the concepts behind them?

EDIT: When I was in high school there were a few teachers who frowned upon memorizing things. They would give low grades to such students. Then, halfway through my education, I went to live in another country where memorization was the accepted learning method, and I felt the difference greatly, so I know what you're talking about.


I guaran-fuckin-tee you that Ernest Hemingway knew what a gerund is.


Isn't this obvious to anyone who takes a little time to think about evolution and biodiversity? Yes, little changes caused by chance persist if they don't hinder the creature's ability to survive & reproduce. And these relatively benign mutations can accumulate until one family is very different from one that was once more obviously related. I don't see how this "surprising" and "provocative" proposal that chance plays a role in biodiversity is at all "controversial" or even new. What's more, the journalist seems to think that this "finding" means she can downplay natural selection, when in fact this article should really be nothing more than an emphasis of the role of chance in speciation since natural selection is basically predicated on the biodiversity caused by chance mutations. This really seems like publishing something 100 years after the first airplane was built to say: "Hear ye! Amazing new finding: In the absence of wings or power, an airplane simply falls due to gravity!" Maybe the actual research/paper is of more merit than the article suggests?


In the 1950s, nobody would have believed that "genetic drift" happened at all. The reason was that back then, the thinking was that any genetic change must be adaptive and caused by natural selection. The accumulation of "neutral" changes -- genetic drift -- was very controversial when it was first proposed, and wasn't really widely accepted until at least the 1980s or 1990s.

The main thrust of this paper seems to be the idea that genetic drift is the primary driver of speciation (the process of creating new species). The paper also makes claims about a "speciation clock" that seems to tick every 2 million years, for many different species.

These two claims seem somewhat at odds. Presumably genetic drift will happen more rapidly in species like fruit flies or beetles that have short generation times. Genetic drift should be slow in species like whales, since they take so long to reproduce, and drift can't happen except during reproduction. So if species are splitting off every 2 million years for both whales and fruit flies, that suggests genetic drift can't be the only factor, since it's 1000x faster (or more) in fruit flies than in whales. Anyway, I didn't read the paper that closely, so maybe I'm missing something.
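As a toy illustration of the generation-time point (my own sketch, not anything from the paper): neutral drift only acts at reproduction, so its speed in calendar time scales with generations per year.

    import random

    def generations_until_fixed_or_lost(pop_size=100):
        # Wright-Fisher drift of a neutral allele starting at 50% frequency:
        # each generation, every gene copy is drawn at random from the
        # previous generation's pool. No selection anywhere.
        count = pop_size // 2
        gens = 0
        while 0 < count < pop_size:
            freq = count / pop_size
            count = sum(random.random() < freq for _ in range(pop_size))
            gens += 1
        return gens

    runs = [generations_until_fixed_or_lost() for _ in range(200)]
    print(sum(runs) / len(runs))  # on the order of pop_size generations

Converting those generations into years is where fruit flies (weeks per generation) and whales (decades) diverge by orders of magnitude, which is why a uniform 2-million-year clock would be surprising under drift alone.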

In general, speciation is a complicated and poorly understood topic with a lot of controversies remaining. For example, is it more common for a species to split into two because of a physical obstruction (body of water, mountain range, etc.), or because different sub-populations start occupying different ecological niches due to slightly different environments, and gradually grow more distinct? (allopatric speciation versus parapatric speciation). It's hard to actually measure how much mixing there was between populations millions of years ago, so this is still controversial. And parapatric speciation gives more weight to natural selection so it used to be the preferred explanation. But this is far from a settled debate.


Just look at those graphs of distributions of speciation times. Reading the description of the "speciation clock", I would have imagined a sharp distribution with a very tall peak at 2 million years. Yes, the mode is about 2 million years. But the distribution is veeery broad. Looking at the graphs, there is no reason to believe that whales and flies have similar speciation times. A very misleading description, unfortunately quite typical of the scientific news cycle.


I think the interesting conclusion is that most speciation, the emergence of populations that can no longer interbreed, is the result of neutral mutations, not ones that are naturally selected for. Adaptive mutations, I suppose, become universal before they split the species.

