"Goodhart’s Law is really a statement about the process of trying to make policy based on proxy measures of “internal states of complex systems” which are not themselves directly observable. "

Got to this piece via a link from Ben Recht, and it finally nailed something that's always bothered me about the constant invocation of Goodhart's law across software.

https://backofmind.substack.com/p/goodhart-as-epistemologist

"Goodhart as epistemologist: what did that 'law' really say?" (Dan Davies, Back of Mind)

First of all, I actually think people usually mean to reference behavioral incentive change, in which case they should probably be citing Campbell's Law; second of all, neither of these "laws" ever meant that measurement can't be meaningful

"So the message of Goodhart’s Law is that if you’re setting targets, they ought to target the thing that you care about, not something which you believe to be related to it, no matter how much easier that intermediate thing is to measure."

Of course, many people come to these conversations so convinced that no shared meaningful measurement is ever possible that they would prefer to say: all measures are always going to be subject to Goodhart's law

I don't believe that, though. I think for most of us, if you're going to take a medicine, you will want to know how many people were helped or harmed after that medicine was administered. Measurement is necessary. I do understand we may have reasonable disagreements about the when and the how.

@grimalkina I think the key graf of Goodhart's original observation that goes carefully uninterrogated is the words "for control purposes". In his context, the next obvious question might have been "control of what", but a more pointed interpretation today might be, "of whom".
@grimalkina (In that light, the desire to handwave away the invocation of math might deserve some sympathy, even if it’s misguided, given people’s collective experience of the worst of Taylorism.)
@mhoye that's fine imho, but if that's ALL we ever talk about, I think we obscure the moments when rendering certain things illegible and invisible is very handy for abuse and discrimination. For instance, the current US admin's erasure of racial and gender data will render many effects illegible, which makes it difficult for people to advocate for policy change and show the true cost of things. I have faced many such situations where the "human side" (for lack of a better word) is...
@mhoye ...attacked even by the people who otherwise agree with the goals, because they can't help but bikeshed forever on this, and have a holy panic about measurement that isn't always warranted. Yeah, of course, OF COURSE sometimes it's warranted, but not every loudmouth white guy making a million dollars a year should spend his entire career never being accountable for any outcome because of an abstract ideal about how "humans are unmeasurable" that magically only seems to apply to him, yk?

@grimalkina @mhoye

It comes down to something like "feedback control isn't evil, evil people doing the controlling is evil".

In an ideal world of cooperation and non-domination, we'd all be talking all the time about the measurements we were making and how we should respond to them. ("Hey, the percentage of people who are at risk of homelessness is increasing, let's do something!") Putting the response/control in the hands of a tiny few self-centered people is the bad idea, not the act of measuring.