My thoughts on AI
Over the past few years, I have written extensively about large language models and machine learning. Back in 2024, I even presented at DevCon Midwest on this topic. While I advocate for the smart use of AI, I have noticed an alarming trend. In late 2023 and early 2024, many applications emerged that were little more than thin wrappers around platforms like ChatGPT. I believe this approach is short-sighted. If your business depends on ChatGPT remaining available and token prices staying stable, what happens when either assumption breaks? Your business could become unviable overnight.
By 2025, the widespread use of tools like ChatGPT and Gemini has led to a growing demand for new data centers across the country. This has left communities grappling with the implications of such rapid development. Companies like NVIDIA are using cloud capacity buyback agreements and GPU-backed debt structures to effectively lend companies like Google, OpenAI, and Meta over $100 billion to buy GPUs. Meanwhile, companies like Applied Digital and CoreWeave are building the actual data centers, so cloud AI providers aren’t on the hook if demand dips.
My concerns from the early years of cloud AI aren’t much different from my concerns in 2025 and 2026. If you don’t “own the stack”, the risk of something disastrous happening is too high. There are ways of using AI responsibly, though. I demonstrated in my DevCon Midwest presentation that you can use AI at scale with hardware you control. That hardware could be local or remote. You might need to be more careful about how you implement it, but if a company running on borrowed GPUs fails to pay for its leased space in a third-party data center, it won’t take your app down with it.
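To make the “hardware you control” idea concrete, here is a minimal sketch of calling a self-hosted model through Ollama’s local HTTP API. It assumes an Ollama server running at its default port (11434) and a model you have already pulled; the model name `llama3` is just an illustrative choice, not a recommendation. Nothing here depends on a third-party AI vendor staying in business or holding token prices steady.

```python
import json
import urllib.request

# Ollama's default generate endpoint on the machine you control.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for a non-streaming Ollama generate request."""
    body = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(body).encode("utf-8")

def ask_local_model(model: str, prompt: str, url: str = OLLAMA_URL) -> str:
    """Send a prompt to a locally hosted model and return its response text."""
    req = urllib.request.Request(
        url,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server and a pulled model):
#   print(ask_local_model("llama3", "Summarize local-first AI in one sentence."))
```

Because the endpoint is just a URL, the same code points at a workstation under your desk or a rack you lease directly; swapping models is a config change, not a vendor migration.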
Not everything needs AI integration. It’s a really cool new tool, but it doesn’t solve every problem. Please use it wisely.
#ChatGPT #Gemini #Ollama