A course called Intelligence
Twenty years in, I keep noticing that the people who get the most from AI use the least of it. A function is just a small block of intelligence — and maybe the most important thing we can do right now is keep expanding our own.
Twenty-plus years into this craft, and if I'm honest, the future of programming as I've known it feels non-existent. Models will do the typing. Agents will do the gluing. The job, as a thing you do with your hands on a keyboard, is fading — and that isn't really what's been bothering me. What's been bothering me is the part nobody seems to be measuring.
When I sit down with the worst AI model I can find and try to build something real — recently it was Donkey Kong, end-to-end, one prompt, around 500,000 tokens, a working game — I notice that the number of tokens I need to get to done is a fraction of what most people around me are spending. Not because I'm faster at typing, but because I think about the problem differently before I open the chat. The leverage isn't in the model anymore. It's in the operator. And the more I sit with that, the more I think it's the only thing left worth investing in.
Looking back at twenty years of code, every function I've ever written was the same gesture: take something complex, and lay out a small, precise block of intelligence that makes it simple. That's all programming really is — compose enough of those blocks and you've built a system; compose enough systems and you've built a worldview. Which is why I've come to believe that the muscle that makes someone good at programming — carrying parameters in your head, passing them through functions, holding state without losing the thread — is the same muscle that makes someone good at thinking, full stop. My grade school teacher called it mental math. I dismissed it at the time. I was wrong.
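A minimal sketch of that gesture in Python (the example is mine, not one of the functions from those twenty years): a tangle of calendar rules, compressed into one small, named block of intelligence that makes the complexity disappear at the call site.

```python
def is_leap_year(year: int) -> bool:
    """Compress the Gregorian leap-year rules into one readable call:
    divisible by 4, except centuries, except every fourth century."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The caller never re-derives the rules; the complexity lives in one place.
assert is_leap_year(2000)      # fourth-century exception
assert not is_leap_year(1900)  # century exception
assert is_leap_year(2024)
```

Compose enough of these and the system underneath stops being something you decipher and becomes something you read.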
Watch what happens when someone outsources every thought to a model: they stop holding the parameters, stop composing, stop carrying. I don't think AI is making us dumber. I think it's making it optional to be sharp, and a lot of people are taking the option. That's the part I want to push back on, because the slow drift of human thinking onto the model is starting to feel like a bigger story than the model itself.
So here's the pitch — or really, what I keep mulling over. A course called Intelligence. Not after-school enrichment, not a bootcamp, not a credential. A course that doesn't try to make you employable, doesn't teach you a framework, doesn't promise a job at the end. It just tries to expand your neural network, with programming as the medium. Mental math, but for functions. Holding a dozen parameters in your head. Walking through a system in your mind before you touch a keyboard. Reading code the way a musician reads a score, and composing intelligence the way a composer composes a melody — because for me, the only other thing that ever did this was music.
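The kind of exercise I have in mind, sketched as a small Python function (the code and names are mine, purely illustrative): read it cold and predict the result before you run it — holding both lists, the loop, and the leftover tail in your head at once.

```python
def weave(a, b):
    """Interleave two lists pairwise, then append whatever
    remains of the longer one."""
    out = []
    for x, y in zip(a, b):
        out += [x, y]
    longer = a if len(a) > len(b) else b
    return out + longer[min(len(a), len(b)):]

# Trace it in your head before running it:
# what is weave([1, 3, 5], [2, 4])?
```

No keyboard, no REPL — the point is the carrying, not the answer.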
I think that's the answer. Or at least, I think it's an answer. But I'm putting it here because I'm not sure, and I'm curious what you think.
— Matt