It would be super cool if the tech media would simply stop credulously repeating every value-juicing statement the C-suite makes, like they're the in-house PR department. The only difference between these hollow promises and the guy crying, "The end is near!" on the street corner is funding.
Like she's blatantly full of shit. No one will follow up on this. The models have already absorbed every publicly available dissertation ever written, so what's stopping them from being this "smart" now? And it still doesn't address the "hallucination" problem. It's a nonsense statement.
What statements like this are hiding is that basically every LLM from the major players has already reached the end of the training-data universe. They've absorbed everything there is to scrape. They're in desperate need of new data to build larger models, and that data isn't being created fast enough.
Not to mention they're sucking up so much energy that companies are starting to invest in nuclear fusion startups just to train bigger models. This is insanity, all in service of keeping the VC money flowing.
wapo.st/3KSgQxx