Linguistics and Semantics
Today I went to a meeting with some very clever guys. Sometimes it amazes me how intelligent some people are, but also how they are able to apply their intelligence to solve practical problems and even to turn a financial profit.
We were talking about linguistics, specifically about how to apply linguistic theory to make computer searches and man-machine communication intelligent. Semantics is the study of the meaning of words and phrases. But before we even get to phrases, words are not unambiguous! Say "orange": do you eat it, paint or color with it, drink it, or communicate with it? Maybe phrases are easier: at least you get some context to work with.
And what about word morphology, synonyms, homonyms and homophones? Humans have an uncanny ability to tell the parts of speech apart.
Imagine how much more useful a Yahoo search would be if you could ask your question in your own words, in English or any other natural language (and have that translated into "good" English), instead of some form where you have to guess what your computer might like to hear. Google just hired its first (real) linguist, so their team now comprises not just the computer programmers and analysts that make up the usual crop of its employees.
Linguistics is a theoretical framework used to describe language. The representation can be graphic: trees or other structures. Work has been going on, largely unsuccessfully, for about sixty years, mostly at Harvard and MIT in the Boston area, and largely influenced by the work of Noam Chomsky. Others include Steven Pinker (The Stuff of Thought, The Blank Slate) and Jerry Fodor, who loves carburetors and doorknobs, a childhood fetish of his.
The problem is that in 60 years these guys haven't gotten very far. Sure, they understand a lot about the structure, but as for something usable for the problem of explaining how humans develop language, well, they haven't produced much. Most of their knowledge is tied up in academic papers, seminar papers, discourses and dissertations.
How do we trawl through masses of data? Do we need to build tags with human intervention? Should the search ask the human user for help along the way ("do you mean an orange coloured pencil?")?
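The idea above, guessing the sense of an ambiguous word from its neighbours and falling back to asking the user, can be sketched in a few lines of Python. Everything here is invented for illustration: the sense inventory, the cue words, and the function name are my own assumptions, not any real search engine's method.

```python
# Hypothetical mini sense inventory for "orange": each candidate sense
# is paired with context words that hint at it. All cues are made up.
SENSES = {
    "fruit": {"eat", "peel", "sweet", "ripe"},
    "color": {"paint", "pencil", "bright", "wall"},
    "drink": {"drink", "glass", "fresh", "juice"},
    "phone": {"call", "network", "mobile", "operator"},
}

def disambiguate(query_words):
    """Score each sense by its overlap with the query words.
    Return (best_sense, None) when context decides, or
    (None, clarifying_question) when no cue word matches."""
    words = set(query_words)
    scores = {sense: len(cues & words) for sense, cues in SENSES.items()}
    best = max(scores, key=scores.get)
    if scores[best] == 0:
        return None, "Do you mean the fruit, the color, the drink, or the phone company?"
    return best, None
```

For example, `disambiguate(["orange", "coloured", "pencil"])` picks the "color" sense because "pencil" matches a cue, while a bare `disambiguate(["orange"])` has no context and returns the clarifying question instead.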
So how do we bridge the gap to provide natural language for Google, Yahoo and any database retrieval work? Everyone is using the term "semantic web", but most people don't have a clue what it is. We still don't really have Web 2 and they're already talking about Web 3 using semantic web technology. Can Google maintain its market dominance as smaller and leaner startups develop real technologies?
There are a number of theories on how the human brain comprehends words and translates concepts. Do we know all these things as part of our biology, what the witty Fodor calls Extreme Nativism? Or, at the other end of the spectrum, is it Geoffrey Nunberg's Radical Pragmatism, where no word has any meaning in itself and everything depends on context and the culture of the speaker (and listener, if they are to understand each other)? Deirdre Wilson agrees, and so do others from outside linguistics.
Well then, is linguistics a science as defined by Wikipedia? Do all languages work by the same rules? Are some more or less dependent on context? On culture? Are some brains wired with innate language ability? Can new-born babies differentiate speech from nonsensical sound? In any language? Backwards speech? Would a human brought up by apes (Tarzan) speak? If so, in what language?
This is a cool subject. I want to see more on it!