People abroad trusting America perhaps? America seems like it’s really destroyed its reputation lately.
Or it could be referring to the decay of the rule of law with ICE and the like; will the public ever trust the government again?
I know what Biden means; I'm saying that there is nothing worth going "back" to. People didn't trust the USA; the wealthy understood they could secure their power by cooperating with it and using its capacity for violence. That's how colonialism functions.