Unlike intercontinental ballistic missiles, these nuclear-powered ramjet cruise missiles could be recalled... I don't think I would want this thing coming back 😅
I guess this guy doesn't like to get misgendered
I think you're right, the Liberals will struggle, especially given the Canadian tradition of Prime Ministers not stepping down as party leader until they've lost an election. If Trudeau had enacted the electoral reform he promised, the NDP would at least have had a chance. I'll miss my teeth 😬
🤡 We are the dumbest western country by far.
I guess you missed the whole Trump thing
I know the difference. Neither OpenAI, Google, nor Anthropic has admitted they can't scale up their chat bots. That statement is not true.
Currently very few jobs should be replaced with AI. But many jobs should be augmented with AI. Human-in-the-loop AI amplifies the finite resource of smart humans.
That may have been true for the early LLM chatbots, but not anymore. ChatGPT, for instance, now writes code to answer logical questions. The o1 models have background token usage because each response is actually the result of multiple background LLM responses.
The title of the article is literally a lie which is easily fact checked. Follow the links to quotes in the article to see what the quoted individuals actually said about the topic.
No, a chat bot as it's talked about here is not an LLM. This article is discussing limitations of LLM training data and implying that chat bots cannot scale as a result. There are many techniques that can be used to continue to improve chat bots.
I'm sorry if I'm coming across as condescending, that's not my intent. It's never been "as simple as just throwing more data and CPU at the problem". There were algorithmic challenges for every LLM evolution. There are still lots of potential improvements using the existing training data. But even if there wasn't, we'll still see loads of improvements in chat bots because of other techniques.
Edit: typo
People who don't understand those terms are using them interchangeably
Yes, of course I'm asserting that. While the performance of LLMs may be plateauing, the cost, context window, and efficiency are still getting much better. When you chat with a modern chat bot, it's not just sending your input to an LLM like the first public version of ChatGPT did. Nowadays a single chat bot response may require many LLM requests along with other techniques to mitigate the deficiencies of LLMs. Just ask the free version of ChatGPT a question that requires some calculation and you'll have a better understanding of what's going on and the direction of the industry.
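To make the point concrete, here's a minimal sketch of the pattern described above: a calculation question gets routed through generated code that is actually executed, instead of the LLM predicting the arithmetic token by token. The `llm_write_code` function is a hypothetical stand-in for an LLM call, not any real API.

```python
# Hypothetical sketch: chatbot = LLM + tool execution, not a bare LLM.
# llm_write_code stands in for an LLM request that emits Python code;
# here we hard-code the kind of snippet it might return.

def llm_write_code(question: str) -> str:
    # LLM call #1 (simulated): translate the question into code.
    return "result = 1234 * 0.175"

def chatbot_answer(question: str) -> float:
    code = llm_write_code(question)
    scope: dict = {}
    exec(code, scope)        # deterministic execution, no hallucinated arithmetic
    # A real system would make LLM call #2 to phrase the reply around this value.
    return scope["result"]

print(chatbot_answer("What is 17.5% of 1234?"))
```

The division of labour is the design point: the LLM handles language-to-code translation, while the runtime handles the part LLMs are unreliable at.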
OpenAI, Google, Anthropic admit they can’t scale up their chatbots any further
Lol, no they didn't. The quotes this article uses are talking about LLMs, not chatbots. This is yet another stupid article from someone who doesn't understand the technology. There is a lot of legitimate criticism of the way this technology is being implemented, but FFS get the basics right at least.
No, they'll be weird looking hairless bears
Oh, you're just asking for a bear attack!
2 Kings 2:23-24
23 From there Elisha went up to Bethel. As he was walking along the road, some boys came out of the town and jeered at him. “Get out of here, baldy!” they said. “Get out of here, baldy!” 24 He turned around, looked at them and called down a curse on them in the name of the Lord. Then two bears came out of the woods and mauled forty-two of the boys.
I hope they include this. We should instill the fear of random bear attacks in boys. Otherwise they'll make fun of me when I inevitably go bald.
How did your parents react meeting your now husband? And how did his parents react meeting you?
To be fair, some of my weekends have been a mistake.
I'm not sure about -15C, but don't operate a freezer at -15K, as that violates the fundamental laws of thermodynamics.
100%, the anti-AI hype is as misinformed as the AI hype. We have so much work ahead of us to effectively utilize the current LLMs.
Burrowing badgers cause £100,000 damage to road
The animals have dug a sett under the A52 at Seaholme Road in Mablethorpe, the council says.
Sixty-three thousand of the limited edition coins were reported stolen in July.
Imagine trying to offload 40K in one-dollar Bluey coins 🤣