State of client side HTTP in #python

- stdlib http.client - HTTP/1.1 only; its own docs recommend using requests instead.
- requests - poorly maintained; in 2020 it broke when servers dropped older TLS versions. HTTP/1.1 only, sync only.
- #httpx - comparatively slow, and its API design is driven by browser compatibility. Some users plug in aiohttp underneath for better performance. Supports HTTP/2, but it's discouraged as not optimized.
- #aiohttp - good API; both client and server, but seems more focused on the server side. HTTP/1.1 only.
- #niquests - fork of requests adding async and HTTP/2, but it depends on a forked urllib3 that installs under the same package name as the original, which messes up deployments.
- #aioquic - client & server, HTTP/3 only
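For a sense of the HTTP/1.1 baseline all of these wrap, and of the async gap the stdlib leaves, here's a hand-rolled GET over asyncio streams. A sketch only: plain TCP, no TLS, no chunked-encoding handling, and the host/port are placeholders.

```python
import asyncio

async def get(host: str, port: int, path: str = "/") -> tuple[int, bytes]:
    # Open a plain TCP connection (no TLS, for brevity).
    reader, writer = await asyncio.open_connection(host, port)
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    writer.write(request.encode("ascii"))
    await writer.drain()
    raw = await reader.read()  # read until the server closes the connection
    writer.close()
    await writer.wait_closed()
    # Split the head from the body and pull the status code out of
    # the status line, e.g. b"HTTP/1.1 200 OK".
    head, _, body = raw.partition(b"\r\n\r\n")
    status = int(head.split(b" ", 2)[1])
    return status, body
```

Which is roughly the amount of plumbing the third-party libraries exist to hide.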

🤦

#pythonRequests #programming

@mattesilver

Don't forget about the standard library. I have a friend who might be using it happily. Ignorance is bliss?

@cJ
first line of the `http.client` docs:

> See also: The Requests package is recommended for a higher-level HTTP client interface.
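Even so, the module the docs point away from is serviceable for simple cases. A minimal sketch, assuming a plain-HTTP endpoint (host, port, and path are placeholders):

```python
import http.client

def fetch(host: str, port: int, path: str = "/") -> tuple[int, bytes]:
    # Plain HTTP/1.1 GET with the stdlib client.
    conn = http.client.HTTPConnection(host, port, timeout=10)
    try:
        conn.request("GET", path, headers={"Accept": "application/json"})
        resp = conn.getresponse()
        return resp.status, resp.read()
    finally:
        conn.close()
```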

@mattesilver I’ve banned requests at my place of work after too many gremlins. urllib3 is a bit limited, but straightforward once one stops thinking in requests terms.
@ljmc virtually every single HTTP tutorial recommends `requests`.
The difference AFAIK is cookies and some convenience APIs for JSON and HTTP methods in requests; did I miss something? Still, neither supports async nor HTTP/2+.
And instead of JSON->dict I'd rather parse it directly into objects with pydantic|msgspec.
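To sketch that last point: a JSON payload parsed straight into a typed object. This uses a stdlib dataclass purely for illustration; pydantic or msgspec would add actual validation and speed, and the `User` shape here is hypothetical.

```python
import json
from dataclasses import dataclass

@dataclass
class User:
    # Hypothetical payload shape, for illustration only.
    id: int
    name: str

def parse_user(raw: bytes) -> User:
    # json -> dict -> typed object; pydantic/msgspec would also
    # validate the field types here instead of trusting the payload.
    data = json.loads(raw)
    return User(id=data["id"], name=data["name"])
```

After which the rest of the code works with `user.name`, not `data["name"]`.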