"We argue that the expansion of the token budget does not resolve a deeper constraint: under structural uncertainty, the decisive variable is not how many questions can be answered but which questions are worth asking -- a problem of agency and direction that computation alone cannot solve."
https://arxiv.org/abs/2603.06630
#LLM #physics #informationTheory #thermodynamics #entropy #economics

Photons = Tokens: The Physics of AI and the Economics of Knowledge
Debates about artificial intelligence capabilities and risks are often conducted without quantitative grounding. This paper applies the methodology of MacKay (2009) -- who reframed energy policy as arithmetic -- to the economy of AI computation. We define the token, the elementary unit of large language model input and output, as a physical quantity with measurable thermodynamic cost. Using Landauer's principle, Shannon's channel capacity, and current infrastructure data, we construct a supply-and-demand balance sheet for global token production. We then derive a finite question budget: the number of meaningful queries humanity can direct at AI systems under physical, information-theoretic, and economic constraints. We apply Coase's theory of the firm and the durable-goods monopoly problem to the AI value chain -- from photon to atom to chip to power to token to question -- to identify where economic value concentrates and where regulatory intervention is warranted. We argue that the expansion of the token budget does not resolve a deeper constraint: under structural uncertainty, the decisive variable is not how many questions can be answered but which questions are worth asking -- a problem of agency and direction that computation alone cannot solve. We connect limits of measurement in the token economy to a structural parallel between Goodhart's law and the Heisenberg uncertainty principle, and to Arrow's impossibility result for efficient information pricing. The framework yields order-of-magnitude estimates that discipline policy discussion: at current efficiency, the projected 2028 US AI energy allocation of 326 TWh could support roughly 6.5 × 10^17 tokens per year, or 225,000 tokens per person per day -- more than three orders of magnitude above estimated mid-2024 utilization.
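
A quick sanity check of that closing arithmetic, as a minimal Python sketch. The world-population divisor (~8e9) and the implied joules-per-token efficiency are my inferences from the abstract's own numbers, not figures stated in it:

# Reproduce the abstract's order-of-magnitude estimate.
# Assumption (not stated in the abstract): the per-person figure
# divides the annual token budget by world population, ~8e9.
ENERGY_TWH = 326           # projected 2028 US AI energy allocation (from abstract)
TOKENS_PER_YEAR = 6.5e17   # token budget at current efficiency (from abstract)
WORLD_POP = 8.0e9          # assumed divisor for the per-person figure

joules = ENERGY_TWH * 1e12 * 3600             # 1 TWh = 1e12 Wh; 1 Wh = 3600 J
print(joules / TOKENS_PER_YEAR)               # implied cost: ~1.8 J/token
print(TOKENS_PER_YEAR / (WORLD_POP * 365))    # ~223,000 tokens/person/day

The two prints land at roughly 1.8 J per token and ~223,000 tokens per person per day, consistent with the abstract's 225,000 figure, so the headline numbers hang together under that assumed population divisor.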