It’s getting weird out there.
I deal with a few bureaucrats and office workers. Up until about a year ago, their emails were pretty simple and read like someone had tapped them out on the toilet.
Now they sound robotic and machine-like: very polite, concise, to the point, and very professional. A year ago these same people would just ask a vague question and not really know what to say.
Now they’ve automatically become professional writers sending me a polite note.
It’s good … but it just makes me wonder where all this is going.
It’s putting lipstick on a pig … no matter how much you dress it up, it’s still a pig that likes to eat garbage and cover itself in mud.
Humanity has found yet another way to pass the buck, so it'll be interesting to see the diminishing returns of LLMs as they begin to feed more and more on derivative content made by other LLMs.
It’s interesting, because people say they can only get better, but I’m not sure that’s true. What happens when most new text data is being generated by LLMs, or we accidentally start labeling images created through diffusion as real? It seems like there’s a potential for these models to implode.
They’ve actually tested that: training a model using only the outputs of the previous generation of the model. It takes fewer iterations of that to completely lose quality than you’d think.
They go insane pretty quickly, don’t they? As in, it all just becomes a jumble.
Do you have any links on that? It was something I’d wanted to explore but never had the time or money for.
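For anyone who wants to poke at the idea without a GPU budget, here is a toy analogue of the "train each generation only on the previous generation's output" loop, not the actual study's setup: each "generation" is just a Gaussian fitted to samples drawn from the previous fit, and with small samples the fitted distribution drifts further and further from the original.

```python
# Toy analogue of model collapse: fit a distribution to data, sample a fresh
# dataset from that fit, refit on those samples, and repeat.
# (Illustrative only; the real experiments do this with full language models.)
import random
import statistics

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(50)]  # "real" data, generation 0

for gen in range(1, 31):
    mu = statistics.fmean(data)        # "train" this generation on current data
    sigma = statistics.pstdev(data)
    # the next generation only ever sees samples produced by this one
    data = [random.gauss(mu, sigma) for _ in range(50)]
    if gen % 5 == 0:
        print(f"generation {gen:2d}: mean={mu:+.3f}  stdev={sigma:.3f}")
```

The published experiments swap the Gaussian for a full language model, which is where the "jumble" effect mentioned above shows up.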
Given that people quite frequently try to present AI-generated content as real, I’d say this will be a huge problem in the future.
Microsoft has shown with Phi-2 (https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/) that synthetic data can be a great source of training data.
Even before LLMs, back when I was on Reddit, I would sometimes see conversations of three or four bots replying to each other with scraped content (usually in the personal-advice subs) and getting upvotes.
I only noticed because I used to hunt bots as a hobby.
ChatGPT, and the many other similar systems, are unable to conceive of anything new or original; they merely imitate what has already come before.
Students who use it to write essays are shooting themselves in the foot, because, chances are, they can’t think for themselves either.
ChatGPT, and the many other similar systems, are unable to conceive of anything new or original; they merely imitate what has already come before.
This. God, do I hate that LLMs are called generative AI.
I think of ChatGPT like a sometimes-inaccurate calculator. There may be some legitimate uses for the technology, but it’s still nice to know how to multiply numbers without it.
ChatGPT, and the many other similar systems, are unable to conceive of anything new or original; they merely imitate what has already come before.
So it does what grade school teachers expect of their students?
Except they can absolutely come up with new things; their responses aren’t just cut-and-pasted bits of previous text snippets. They are generated based on a neural network’s idea of what the most likely next token is, and tokens are often fragments of words. There’s a reason you can have it do arbitrary things with text: because it’s doing something slightly deeper than just imitation.
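To make that concrete, here is a minimal sketch of the next-token step being described. The vocabulary and the scores are made up, since a real model computes the scores with a neural network conditioned on the text so far, but the mechanics are the same idea: score every token fragment, convert the scores to probabilities, sample one.

```python
# Minimal sketch of next-token sampling over a toy vocabulary of word
# fragments. A real LLM produces the logits with a neural network; here they
# are hard-coded just to show the softmax-and-sample step.
import math
import random

vocab  = ["imit", "ation", " isn", "'t", " the", " whole", " story", "."]
logits = [0.2, 0.1, 2.3, 0.4, 1.1, 0.9, 1.6, 0.5]   # pretend network output

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(next_token)  # one fragment chosen by probability, not a pasted snippet
```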
Or you could just learn how to use the tool to do better work instead of bitching about progress. Hur dur, calculators can’t do math, they need unique input. No fucking shit, Sherlock. lern2technology, you fucking boomer.
My experience with AI, and the absolute fucking dorks that talk endlessly about it, is that neither are capable of original thought.
It’s not the technology I don’t like, it’s the users.
Also, why so angery?
Removed by mod
Lol.
Or you could grow the fuck up and actually be an adult instead of justifying using corporate-controlled software to do your thinking and living for you.
Learn to read and write. Learn to think for yourself. Accept that effort, humility, and a willingness to learn and achieve are expected of you, and try to meet those expectations instead of getting angry that anything, literally anything, is asked of you in order to engage with other people.
Live life.
Humans: create tools to help them do things better than they used to, just like their grandparents did, and theirs before them.
Also humans: how could we do this?!?!?
It’s not education anymore if people are doing that.
They’re turning education into the pointless rigmarole they accuse it of being, because they don’t get that education is more important than feeding oneself. Survival is easy; animals do that. Education is about humanizing you and connecting you with the universe you live in. It’s about something higher and better than mere survival. It’s about actually living.
But tell that to the troglodytes using ChatGPT to think for them, who truly only care about themselves.