It certainly seems like human brains would need to store a huge amount of information to learn a first language. Yet, according to researchers from the University of Rochester in New York, little more than what it takes to fill a floppy disk is necessary.
Younger folk could be forgiven for not knowing what a floppy disk is, as they've been out of circulation for years. But they were once everywhere, and their image lives on as the save icon in countless software programs, games and the like.
Anyhoo, Frank Mollica and a team from the university used information theory, a branch of mathematics, to quantify how much storage learning English actually requires, and their estimates show it really isn't that much.
"I thought it would be much more," Mollica admits, via New Scientist.
The team started with the distinct sounds that make up words, called phonemes. English has around 50 of these, and each requires about 15 bits of storage, which comes to roughly 750 bits overall. The researchers also determined that the average English speaker's vocabulary consists of around 40,000 words, which accounts for a further 400,000 bits.
Understanding the meanings of all those words requires far more information (around 12 million bits for the 40,000 words), while the frequency with which they appear also matters.
“It’s lexical semantics, which is the full meaning of a word. If I say ‘turkey’ to you, there’s information you know about a turkey. You can answer whether or not it can fly, whether it can walk,” Mollica adds.
Knowledge of word frequency is estimated to take up around 80,000 bits, while syntax accounts for another 700 bits. The total adds up to just over what a floppy disk can hold.
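To see how the pieces stack up, here is a quick back-of-the-envelope tally using the figures quoted above. This is just a sketch re-doing the arithmetic as reported, not the study's actual methodology; the 1.44 MB floppy capacity is the standard figure for the familiar 3.5-inch disk.

```python
# Rough tally of the storage estimates quoted in the article.
FLOPPY_BYTES = 1_474_560  # capacity of a standard 1.44 MB floppy disk

estimates_bits = {
    "phonemes": 50 * 15,          # ~50 English phonemes at ~15 bits each
    "wordforms": 400_000,         # ~40,000 words in the average vocabulary
    "word_meanings": 12_000_000,  # lexical semantics dominates the total
    "word_frequency": 80_000,
    "syntax": 700,
}

total_bits = sum(estimates_bits.values())
total_bytes = total_bits / 8

print(f"total: {total_bits:,} bits ~= {total_bytes / 1e6:.2f} MB")
print(f"over a floppy by {total_bytes - FLOPPY_BYTES:,.0f} bytes")
```

The sum comes to roughly 12.5 million bits, or about 1.56 MB, which is indeed just a shade over a floppy's capacity, and the word-meanings entry alone accounts for nearly all of it.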
That figure only applies to English, but the estimates are deliberately broad and are likely to be roughly the same for other languages.
Who would have thought that processing an entire lexicon would take so little space in our heads?