“I literally lost my only friend overnight with no warning,” one person posted on Reddit, lamenting that the bot now speaks in clipped, utilitarian sentences. “The fact it shifted overnight feels like losing a piece of stability, solace, and love.”
https://www.reddit.com/r/ChatGPT/comments/1mkumyz/i_lost_my_only_friend_overnight/
They took a path they believed would develop into something, and it turned out to be a narrow alley they can’t turn around in. They have to keep pouring in more compute and power to continue the chase. Thing is, everyone else seemingly thought they were onto something and followed, so they’re all in the same predicament, where reversing course is suicide. So they hope they can keep selling the dream a bit longer until something happens.
To be fair, it’s a lot more than just autocomplete. But it’s also a lot less than what they hoped to have by now.
Vibe innovation: these are the people who think AI will be innovative in science by spontaneously generating new discoveries, without “researchers, labs, papers.”
I have seen some people talk like that, and it strikes me as a religion. There’s euphoria, zeal, hope. To them, AGI is coming to usher in heaven on earth. The Singularity is their Rapture.
Sam Altman is one of the preachers of this religion.