Yes, we have
Only LLMs that work “well enough” are “new”
AI exists since about 1950
It hasn’t existed “since about 1950”, mate. It’s actually hilarious just thinking about the sheer volume of transistors they’d have needed to do even basic AI operations back then. :D
The concept has existed for a long while, sure, but it wasn’t until around 2010 that we got “deep learning” as the first stepping stone towards “something like” AI.
It most definitely wasn’t possible on home PC hardware, and it would be absolutely ridiculous to have an “AI” doing this sort of thing when it can be achieved with a tiny fraction of the resources by a simple weighting and sorting algorithm…
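For what it’s worth, the “weighting and sorting algorithm” being referred to might look something like this minimal sketch; the feature names, weights, and data here are made-up examples, not anyone’s actual implementation:

```python
# Hypothetical sketch: rank items by a hand-tuned weighted sum of their
# features, then sort. No machine learning involved -- just arithmetic.

def rank_items(items, weights):
    """Sort items (dicts of feature -> value) by weighted score, best first."""
    def score(item):
        return sum(weights.get(feature, 0.0) * value
                   for feature, value in item.items())
    return sorted(items, key=score, reverse=True)

# Example data (entirely invented for illustration):
items = [
    {"relevance": 0.9, "recency": 0.2},  # scores 2.0*0.9 + 1.0*0.2 = 2.0
    {"relevance": 0.4, "recency": 0.8},  # scores 2.0*0.4 + 1.0*0.8 = 1.6
]
weights = {"relevance": 2.0, "recency": 1.0}

ranked = rank_items(items, weights)  # first item wins, 2.0 > 1.6
```

This is the kind of thing that ran fine on 1990s home PCs, which is the point being made: no “AI” needed.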
It’s all about the definition of AI, it seems to me: you only consider token transformers that produce text and images to be AI, while I consider AI an umbrella term that includes machine learning, which has existed since the 1950s, mate
AI is very specifically defined as “Artificial Intelligence”. A fairly complex calculator (the kind of device possible in the ’50s) is not an AI.
To be fair, the current tech we call “AI” is not really AI either. We’re a couple of decades away from that still.