This sort of ignores the fact that advances in this technology are widely applicable to all tasks; we literally just started with text and image generation because:
The training data is plentiful and basically free to get your hands on
It’s easy to verify it works
LLMs will crawl so that ship-breaking robots can run.
We’re in the first days, and every day I add a new model or technique to my reading list. We’re close to talking to our CPUs. We’re building these stacks. We’re solving the memory problems. You don’t need RAG with a million-token context, the Gorilla model can talk to APIs, most models are great at Python, which is versatile as fuck. I can see the singularity on the horizon.
Try Ollama if you want to test things yourself.
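If you want to see what "testing things yourself" looks like, Ollama runs models locally and exposes a small HTTP API on port 11434. A minimal sketch, assuming an Ollama server is already running and a model (here `llama3`, as an example) has been pulled with `ollama pull llama3`:

```python
import json
import urllib.request

# Default endpoint for Ollama's local generate API.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return the reply."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be something like `generate("llama3", "Why is the sky blue?")` — no cloud account, no API key, just a model running on your own box.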
Use GPT-4 if you want to get an inkling of the potential that’s coming. I mean really use it.
Second this.