irrealfriend

Note that the publication date of the original article was 19 December 2012. That doesn’t mean that it is no longer true! But if you’re feeling some déjà vu, there’s a good reason.

Via kenobi-wan-obi:

IQ Myth Debunked by Canadian Researchers

An individual’s IQ score… is not a valid way of assessing brainpower, say Canadian researchers…

The results showed that… perform[ance] could only be explained with at least three distinct components: short-term memory, reasoning and verbal ability… not any single measure, such as an intelligence quotient…

They found that each cognitive component related to distinct circuits in the brain, supporting the idea of multiple specialized brain systems, each one with its own capacity.

Moral of the story: Don’t hold… ‘IQ’ with high regard to your intelligence. It doesn’t…represent your actual mental capabilities and will likely misinform more than inform…

Intelligence quotient v. influence quotient

For better or worse, it is ironic how obsessed we are now with quantification*, e.g. social influence through Klout or Kred, impact factors for journal publications, the researcher h-index, etc. However, conclusions such as this study’s, regarding the lack of meaning of an IQ score, are warmly welcomed by nearly all. I believe that the study’s conclusions are mostly reasonable; more about that in a future post.

I’ll belabor the point. Humor me. Which is more representative of an individual: an IQ percentile (remember, raw scores aren’t comparable across the many different IQ tests) or a Klout score? Fortunately, many people acknowledge the inadequacy of Klout-like tools outside of very limited contexts, or so I hope!

* Self-quantification, also known as the quantified self, is different! I believe that it can be very useful to some people, e.g. for weight management, diabetes-related blood sugar control, or tracking athletic achievement. That is because “quantified self” data shouldn’t be used for comparisons that leave one feeling unnecessarily bad, or good, about oneself. It is probably most meaningful when the results aren’t shared with anyone.

A story in doodles.

This is a lovely blue ballpoint pen drawing.

Christoph Niemann (I remember him!) has an irregular spot with the New York Times now. The prior entry was nearly two months ago. But that seems appropriate for something this whimsical.

A stream-of-consciousness mind map while waiting on the phone, on hold: Clair de Lune muzak… Mitt Romney… campaign finances… fluffy hair… woolly… sheep… Dimm-dimm dah-dideldimm… shepherds… playing lutes and dancing…

azspot

People have known about metaphor for millennia. Until recently, metaphor was seen as a linguistic device in which you call one thing by the name of another thing that it’s similar to.

But in their 1980 book Metaphors We Live By, George Lakoff and Mark Johnson proposed a new explanation… if metaphor is only based on similarity, then you should be able to metaphorically describe anything in terms of anything else that it resembles. But Lakoff and Johnson observed that metaphor wasn’t used haphazardly, but rather, systematically and coherently.

You don’t metaphorically describe any thing as anything else.

Metaphor is unidirectional, from concrete to abstract. 

Morality is more abstract than cleanliness. And you can’t reverse a metaphor. “He’s clean” means he has no criminal record, but “He’s moral” wouldn’t be used to mean that “he bathed recently”.

Metaphorical expressions are coherent with one another. Consider understanding and seeing. All metaphorical expressions coherently cast aspects of understanding in terms of specific aspects of seeing. You always describe the… understood idea as the seen object, the act of understanding as seeing, the understandability of the idea as the visibility of the object… the aspects of seeing you use to talk about aspects of understanding consistently map to each other.

This led Lakoff and Johnson to propose that metaphor was deeper than just the words… the reason metaphorical language exists and the reason why it’s systematic and coherent is that people think metaphorically. You don’t just talk about understanding as seeing; you think about understanding as seeing…

You talk metaphorically because you think metaphorically!

Yes. That’s how I learn new concepts. Metaphor is also my mnemonic for later recall.

But I’ve been reading, quite often lately, that my cognitive (epistemological?) process is seriously flawed. Reductive. Simple-minded. How nice to see that in 1980 it was okay to do what I do! I hope it hasn’t been disproved, or found not to be reproducible, since then!

If so, if I am denied my metaphors, I may be forced to resort to similes. Did Lakoff and Johnson have anything to say about similes, I wonder…

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Reductive? As used by Madonna recently, to describe Lady Gaga’s musical originality, or lack thereof? No, not exactly. If I were Madonna, I would have chosen to say derivative instead of reductive.

Honestly, I don’t know what I would have said! It wasn’t a fun question. I like many songs by Lady Gaga and by Madonna. But the Lady Gaga song in question does seem quite similar to Madonna’s song of 15+ years earlier. It is awkward, because Madonna probably doesn’t want to play the copyright game, nor be called out for making accusations in a live interview. Yet she doesn’t want to feel forced to lie, and compromise herself, not at this point in her life.

Reductive may be the superior word choice after all! It avoids the legal connotations of derivative, which is a good thing! Yet it allows Madonna to make her point.

I don’t think this has anything to do with cognition, metaphor, or Lakoff and Johnson. Maybe I have a slight anti-SOPA hangover… Yesterday was intense!

rodica

On a fine autumn day, three mathematicians go duck hunting

via rodica:

Sighting their first duck in flight, the _applied mathematician_ carefully calculates his angle of fire, correcting for gravity and wind velocity, and pulls the trigger. Forgetting to correct his path for the acceleration of the duck, he misses to the right.

The _pure mathematician_ takes aim, calculating his trajectory based on the apparent motion of the duck, velocity of his projectile and the curvature of the earth. Forgetting to correct for air friction, he misses to the left.

The _statistician_ jumps out of his seat and exclaims:

We got him!

— Francis Chang

via MIT Technology Review

Back in the early 1990s, British anthropologist Robin Dunbar began studying the social groups of various kinds of primates… Primates tend to maintain social contact with a limited number of individuals within their group.

Dunbar noticed that primates with bigger brains tended to have more friends. He reasoned that the number of individuals a primate could track was limited by brain volume.

Then he plotted brain size versus number of contacts and

… extrapolated to see how many friends a human ought to be able to handle. The number turned out to be about 150. This number appears to have been constant throughout human history—from the size of neolithic villages to military units to 20th century contact books.

Dunbar’s Number = 150
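
For the curious, here is a minimal sketch of the plot-fit-extrapolate step described above, in Python. The primate data points and the human neocortex ratio below are invented for illustration only; Dunbar’s actual regression used published primate brain and group-size data, and it is that analysis which yields roughly 150.

```python
# A toy sketch of a Dunbar-style extrapolation, NOT his actual data or figures.
import numpy as np

# Hypothetical (neocortex ratio, mean group size) pairs, for illustration only.
neocortex_ratio = np.array([1.8, 2.2, 2.6, 3.0, 3.4])
group_size = np.array([8, 15, 25, 45, 70])

# Dunbar's published fits are log-linear, so fit log(group size) against log(ratio).
slope, intercept = np.polyfit(np.log(neocortex_ratio), np.log(group_size), 1)

# Extrapolate to an assumed human-like neocortex ratio (a made-up value here).
human_ratio = 4.0
predicted_group = np.exp(intercept + slope * np.log(human_ratio))
print(f"Predicted human group size: {predicted_group:.0f}")
```

The number printed depends entirely on the invented inputs; the point is only the shape of the method: fit on a log-log scale, then read off the prediction at the human value.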


But now we have Twitter! And Facebook! Do these modern social networks allow us to break through the biological barrier and physical limitations dictated by Dunbar’s Number?

No.

Not according to this recent research paper Validation of Dunbar’s Number In Twitter Conversations:

…even though modern social networks help us to log all the people with whom we meet and interact, they are unable to overcome the biological and physical constraints that limit stable social relations

The bottom line is this: social networking allows us to vastly increase the number of individuals we can connect with. But it does nothing to change our capability to socialise. However hard we try, we cannot maintain close links with more than about 150 buddies.

And if Dunbar is correct, that’s the way it’ll stay until somebody finds a way to increase human brain size.