Joi Ito named head of MIT Media Lab

I just wanted to throw up a quick post to congratulate Joi Ito on his new position as the head of the MIT Media Lab. I first met Joi years ago through his IRC channel, which played an integral role in exposing me to the internet and its culture, and gave me a wonderful community of friends all over the world. Joi’s work is a great example of how embracing technology can make the world a better place, and I’m sure we’ll see exciting things happen during his tenure at MIT.

Update: Joi’s post about this.

Kottke on Neurons

“Our brains have Oprah neurons, Aniston neurons, Eiffel Tower neurons, and Saddam neurons that fire when we see pictures or hear the names of these people and places.” - Jason Kottke

While I’m all for public interest in science, especially neuroscience, it’s a pity when open questions are reported as solved.

The issue in question is one of the neural coding of semantic information. Jason and the New Scientist article he links to describe what is known as the Grandmother Cell theory. In short, the theory argues that most distinct semantic concepts each have their own dedicated neuron which fires when we access that concept.

The problem with this theory, beyond the fact that real brains are never actually this simple, is that there simply aren’t enough neurons in the right areas to encode all the possible content we might encounter. What would happen when we run out of neurons?
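The capacity worry can be made concrete with a toy count (my own back-of-the-envelope illustration, not a claim about real neuron numbers): a strict grandmother-cell code spends one neuron per concept, whereas a code that uses patterns of activity across the same neurons can distinguish exponentially many concepts.

```python
def localist_capacity(n_neurons: int) -> int:
    # Grandmother-cell scheme: one dedicated neuron per concept,
    # so capacity grows only linearly with neuron count.
    return n_neurons

def distributed_capacity(n_neurons: int) -> int:
    # If each concept is instead a distinct on/off pattern across
    # all n neurons, 2**n patterns are available.
    return 2 ** n_neurons

n = 30
print(localist_capacity(n))     # 30 concepts
print(distributed_capacity(n))  # 1073741824 distinct patterns
```

Even this crude comparison shows why a one-neuron-per-concept scheme runs into trouble long before a pattern-based one does.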

An alternative to the Grandmother Cell theory is the Distributed Representation theory (the idea behind neural network models), which argues that semantic content is encoded in the pattern of connections across many neurons. This, to me, sounds much more reasonable. Realistically, though (and as the article seems to suggest, without saying it outright), our brains probably work in a way that combines the two theories.
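To make the distributed idea a little more tangible, here is a toy sketch of my own (the concept names are just borrowed from the quote above, and the patterns are random): each concept is a pattern of activity over the *same* small pool of “neurons,” so the number of representable concepts isn’t capped by the neuron count, and related concepts can share active units.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 8

# Each concept is a binary activity pattern across the same 8 neurons,
# rather than a single dedicated cell per concept.
concepts = {
    "Aniston":      rng.integers(0, 2, n_neurons),
    "Oprah":        rng.integers(0, 2, n_neurons),
    "Eiffel Tower": rng.integers(0, 2, n_neurons),
}

for name, pattern in concepts.items():
    print(name, pattern)

# Two concepts can overlap: the dot product counts neurons that are
# active in both patterns.
overlap = int(concepts["Aniston"] @ concepts["Oprah"])
print("shared active neurons:", overlap)
```

In a scheme like this, a single neuron firing for a picture of Jennifer Aniston doesn’t imply a dedicated “Aniston neuron”; it may simply be one unit participating in many overlapping patterns.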