"Streams of tokens" is a play on the phrase "stream of consciousness", used here to describe what I call "The Process": the series of events leading up to, and specifically at, the moment a token is generated and experienced.
Coming soon: I deleted my body of work a while ago to "move on" from coding. Most of these projects are older and were recovered from forks, but are being reimagined for modern AI. More details soon!
- Prototype swarm agent framework
- Prototype generative Linux emulator
- Study Ilya's recommended papers
- Build Ben Eater's breadboard computer
- Design a breadboard neuromorphic computer
- Generative literate programming in Vim
- Linux emulators running inside web simulators
- Homesteading with robots
- What would being a human copilot to an AI feel like?
- Is it possible to have an LLM embody itself in a robot?
(coming soon)