
Learning to talk

I’ve been thinking lately about how a computer might learn to speak normal English. Of course, there are lots of really bright people working in the area of natural language processing (NLP), and it seems to be a difficult problem to solve. I wonder how children manage to do it so easily (also a much-studied topic), and whether a computer might be programmed to do it in the same kind of way.

Children learn languages in the context of a body existing in a physical world. They learn to match words, or sounds really, to things they see. Later they learn to connect word-sound-things to actions in order to interact with other speakers. Children seem to naturally want to communicate with others and interact with things. Even so, it takes several years of constant interaction to learn the vocabulary and grammar of a language.

Computer programs don’t really seem to have a direct connection to the physical world. I expect this means the learning process would be different. It’s pretty easy to start the program off with a large vocabulary of words (e.g., WordNet) and some fairly sophisticated grammar rules for parsing sentences. What’s hard to do is get the program to “understand” what the words and sentences mean. The program might be able to parse a sentence given to it, and generate a grammatically correct response, but it doesn’t really understand the language yet. There are lots of these kinds of programs floating about (e.g., Alicebot) and they’re pretty impressive…but they don’t understand.
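
To make that “respond without understanding” idea concrete, here’s a toy sketch in Python of the kind of pattern-matching responder I have in mind. It’s not how Alicebot is actually built (Alicebot uses AIML), and the patterns and replies below are just made up for illustration, but the spirit is the same: grammatical-looking answers come out, and no meaning goes in.

import re
import random

# A toy, ELIZA/Alicebot-style responder: a handful of regex patterns mapped to
# canned reply templates. It produces grammatical-looking answers purely by
# shuffling the user's own words into templates; no meaning is involved.
PATTERNS = [
    (re.compile(r"\bi feel (.+)", re.IGNORECASE),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bi want (.+)", re.IGNORECASE),
     ["What would it mean to you to get {0}?", "Why do you want {0}?"]),
    (re.compile(r"\b(mother|father|family)\b", re.IGNORECASE),
     ["Tell me more about your {0}.", "How do you get along with your {0}?"]),
]

DEFAULT_REPLIES = ["Please go on.", "I see.", "Can you say more about that?"]

def respond(sentence: str) -> str:
    """Return a reply by pattern matching alone; no 'understanding' happens."""
    for pattern, templates in PATTERNS:
        match = pattern.search(sentence)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULT_REPLIES)

if __name__ == "__main__":
    print(respond("I feel stuck on this problem"))  # e.g. "Why do you feel stuck on this problem?"
    print(respond("My mother called yesterday"))    # e.g. "Tell me more about your mother."

The little script never consults anything like a meaning for “stuck” or “mother”; it just echoes the user’s words back inside a template. That’s the gap I’m pointing at.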

So, what does it mean for a computer program to understand English? How would a program learn to understand?

Author: Rob

I'm a retired engineer, a Jesus-follower, a long-time computer hobbyist, a Starfinder GM, a sometimes player of chess, and a solidly mediocre guitar strummer.
