Generating vs. Searching
March 13, 2026
2 min read
My Software for One post got some interesting comments on LinkedIn. One commenter called it "fast fashion for software": disposable, low-quality, and wasteful. Another raised environmental concerns: AI consumes a lot of energy and water, and those costs aren't reflected in the $0.59.
Both are excellent points. I think we all need some time to process what this will mean. My post was meant to be provocative and a bit philosophical at the same time. We can already see the impact that scaling up AI and using it widely is having on our resources, energy in particular. It's not fun, and it adds to the challenges we already have. On the other hand, it's also not yet clear to me what benefits AI will bring.
For these two small apps specifically though, generating them is actually more energy efficient than I thought. The inference energy of Kimi K2 for 37,000 tokens comes out at about 0.000649 kWh: a few minutes of an LED bulb, or roughly 15 Google searches (based on figures from energy.inference.net and this paper). This seems like a fair like-for-like comparison, since Google search energy figures are also based on operational energy.
To find the right versions of these apps, ones that fit my requirements (minimal, no ads, open source), I would definitely need quite a few Google searches, plus time spent browsing websites and trying different versions. So there is a case to be made that even with current tech, generating these small apps on the fly is more energy efficient than searching for them.
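The back-of-the-envelope arithmetic above can be sketched in a few lines. Only the 0.000649 kWh for 37,000 tokens comes from the post; the 10 W LED wattage and the ~0.043 Wh-per-search figure are assumptions for illustration (published per-search estimates vary widely):

```python
# Back-of-the-envelope comparison of inference energy vs. everyday references.
# From the post: 0.000649 kWh for 37,000 tokens of Kimi K2 inference.
# Assumed for illustration: a 10 W LED bulb and ~0.043 Wh per Google search.

inference_kwh = 0.000649
tokens = 37_000
led_bulb_watts = 10.0   # assumed typical LED bulb
search_wh = 0.043       # assumed per-search energy; estimates vary

wh = inference_kwh * 1000                  # total energy in watt-hours
per_token_uwh = wh / tokens * 1_000_000    # micro-watt-hours per token
led_minutes = wh / led_bulb_watts * 60     # minutes of LED runtime
equivalent_searches = wh / search_wh       # number of searches

print(f"{wh:.3f} Wh total, ~{per_token_uwh:.1f} µWh per token")
print(f"~{led_minutes:.1f} minutes of a {led_bulb_watts:.0f} W LED bulb")
print(f"~{equivalent_searches:.0f} Google searches")
```

Under these assumptions the 0.649 Wh works out to roughly four minutes of LED light or about fifteen searches, matching the figures above; swapping in a different per-search estimate changes the ratio accordingly.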