http://www.sciam.com/podcast/episode.cfm?id=thinking-of-human-as-machine-09-02-24
This is the first idea about speech recognition that has sounded right to me in a long time: try to understand how the human brain picks up speech and decodes it, and use that as a guide for how we might make computers do the same thing.
While the snippet is short on details, it mentions the idea that different neurons respond to different frequencies. I have no idea how state-of-the-art speech recognition is done these days, but I bet there are a lot of things we could learn from seeing how the brain does it. The premise the researcher in the link above is working from is that speech perception is a more mechanical process in the brain than we think, and that maybe we can leverage that.
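To make that "different neurons respond to different frequencies" idea concrete, here's a minimal Python sketch of my own (not from the podcast): treat the energy in each frequency band of a sound as the response of one tuned "neuron". The function name and all the parameters here are made up for illustration; real speech front-ends use fancier, perceptually spaced filterbanks.

    import numpy as np

    def band_energies(signal, n_bands=8, frame_size=512):
        """Rough analogue of frequency-tuned neurons: split one frame's
        spectrum into bands and report the energy in each band."""
        # Window one frame of audio and take its magnitude spectrum.
        frame = signal[:frame_size] * np.hanning(frame_size)
        spectrum = np.abs(np.fft.rfft(frame))
        # Split the spectrum into n_bands equal-width bands; each band's
        # summed energy plays the role of one frequency-tuned "neuron".
        bands = np.array_split(spectrum, n_bands)
        return [float(np.sum(b ** 2)) for b in bands]

    # A 440 Hz tone should mostly excite the lowest-frequency "neuron".
    sr = 16000
    t = np.arange(sr) / sr
    tone = np.sin(2 * np.pi * 440 * t)
    print(band_energies(tone))

Run that and nearly all the energy lands in the first band, which is the whole point: a bank of frequency-tuned units turns a raw waveform into something much closer to what the rest of a recognizer (biological or otherwise) can work with.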
Kind of cool. Makes me wonder if we might eventually get this stuff to work after all.