Models analyze language
If improvement is just a matter of increasing both computational power and the training data, then Beguš thinks that language models will eventually surpass us in language skills.
Mortensen said that current models are somewhat limited. 'They’re trained to do something very specific: given a history of tokens [or words], to predict the next token,' he said. 'They have some trouble generalizing by virtue of the way they’re trained.'
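The training objective Mortensen describes can be illustrated with a toy sketch: a bigram lookup table stands in for a trained language model, predicting the next token from the history. This is purely illustrative; the corpus and function names here are hypothetical, and real models use learned neural networks over far longer histories.

```python
from collections import Counter

# Hypothetical toy corpus; a real model trains on billions of tokens.
corpus = "the cat sat on the mat the cat ran".split()

# Count which token follows each token in the training text.
bigrams = Counter(zip(corpus, corpus[1:]))

def predict_next(token):
    """Return the most frequent token observed after `token` in the corpus."""
    candidates = {nxt: c for (prev, nxt), c in bigrams.items() if prev == token}
    return max(candidates, key=candidates.get) if candidates else None

print(predict_next("the"))  # "cat" follows "the" more often than "mat" does
```

The narrow objective is visible here: the model only learns which token tends to follow which, which is why generalizing beyond patterns seen in training is hard.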
But in view of recent progress, Mortensen said he doesn’t see why language models won’t eventually demonstrate an understanding of our language that’s better than our own. 'It’s only a matter of time before we are able to build models that generalize better from less data in a way that is more creative.'
The new results show a steady chipping away at properties that had been regarded as the exclusive domain of human language, Beguš said. 'It appears that we’re less unique than we previously thought we were.'