Internship / Ph.D. proposal (w. J-B Fermanian):
"Exploring Conformal Prediction in Long-Tail Scenarios"
Come and work on @plantnet.bsky.social data with us!
🚨 New blog post 🚨
"**Optimal prediction sets for plant identification: an interactive guide**"
https://josephsalmon.eu/blog/long-tail/
Joint work with Tiffany Ding and Jean-Baptiste Fermanian.
#longtail
#PlantNet
#AppliedConformalPrediction
#ConformalPrediction
#statstab #357 Uncertainty Estimation with Conformal Prediction
Thoughts: Haven't parsed this properly yet, but it may be an interesting discussion point. How best to quantify uncertainty?
#conformalprediction #bayesian #confidenceintervals #uncertainty
#statstab #223 Conformal predictions w/ {marginaleffects}
Thoughts: Sometimes you need a range of likely future values. To get an assumption-free Prediction Interval, use conformal methods.
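A minimal sketch of how such an interval can be built with split conformal prediction (toy data, a stand-in linear model, and `alpha = 0.1` are all illustrative assumptions, not the package's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise; any regressor would do in place of the
# least-squares fit below.
x = rng.uniform(0, 1, 500)
y = 2 * x + rng.normal(0, 0.1, 500)

# Split the data: fit on one half, calibrate on the other.
x_fit, y_fit = x[:250], y[:250]
x_cal, y_cal = x[250:], y[250:]

# "Model": least-squares slope through the origin (a stand-in for any model).
slope = (x_fit @ y_fit) / (x_fit @ x_fit)

def predict(x_new):
    return slope * x_new

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile with the finite-sample correction; this is what
# buys the distribution-free >= 1 - alpha coverage guarantee.
alpha = 0.1
n = len(scores)
level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
q = np.quantile(scores, level, method="higher")

# Prediction interval for a new point.
x_new = 0.7
lo, hi = predict(x_new) - q, predict(x_new) + q
print(f"90% prediction interval at x={x_new}: [{lo:.3f}, {hi:.3f}]")
```

The only assumption used is exchangeability of calibration and test points, which is what makes the interval "assumption-free" in the distributional sense.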
In the last couple of weeks I've been learning about #ConformalPrediction, a family of algorithms to measure the uncertainty of predictions made by #MachineLearning models.
Here are a few links to get you started:
- CP course by @ChristophMolnar https://mindfulmodeler.substack.com/p/week-1-getting-started-with-conformal
- Multi-class notebook (in Spanish) https://nbviewer.org/github/MMdeCastro/Uncertainty_Quantification_XAI/blob/main/UQ_multiclass.ipynb
- MAPIE library: https://mapie.readthedocs.io/en/latest/index.html
- TorchCP library: https://github.com/ml-stat-Sustech/TorchCP
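For multi-class problems like the notebook above, the core recipe is short. A hedged sketch of split conformal classification with the simple "1 minus probability of the true class" score (the synthetic softmax outputs and `alpha = 0.1` are made up for illustration; MAPIE and TorchCP implement this recipe along with more refined scores):

```python
import numpy as np

rng = np.random.default_rng(1)
n_cal, n_classes, alpha = 500, 4, 0.1

# Stand-ins for a pre-trained classifier's softmax outputs on a held-out
# calibration set, plus the true labels (all synthetic here).
logits = rng.normal(0, 1, (n_cal, n_classes))
labels = rng.integers(0, n_classes, n_cal)
logits[np.arange(n_cal), labels] += 2.0  # make the "model" better than chance
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Nonconformity score: 1 - probability assigned to the true class.
scores = 1 - probs[np.arange(n_cal), labels]

# Conformal quantile with the finite-sample correction.
level = min(1.0, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)
qhat = np.quantile(scores, level, method="higher")

def prediction_set(p):
    """All classes whose softmax probability is at least 1 - qhat."""
    return np.where(p >= 1 - qhat)[0]
```

On exchangeable test points, the resulting set contains the true class with probability at least 1 - alpha, marginally over the randomness in the calibration and test data.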
Making the rounds again...
...Blackbox #MachineLearning models are now routinely used in high-risk settings, like medical diagnostics, which demand uncertainty quantification to avoid consequential model failures... #ConformalPrediction is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of such models...
[1] https://arxiv.org/abs/2107.07511
[2] https://arxiv.org/abs/2106.06137
Black-box machine learning models are now routinely used in high-risk settings, like medical diagnostics, which demand uncertainty quantification to avoid consequential model failures. Conformal prediction is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of such models. Critically, the sets are valid in a distribution-free sense: they possess explicit, non-asymptotic guarantees even without distributional assumptions or model assumptions.

One can use conformal prediction with any pre-trained model, such as a neural network, to produce sets that are guaranteed to contain the ground truth with a user-specified probability, such as 90%. It is easy to understand, easy to use, and general, applying naturally to problems arising in the fields of computer vision, natural language processing, deep reinforcement learning, and so on.

This hands-on introduction aims to provide the reader with a working understanding of conformal prediction and related distribution-free uncertainty quantification techniques in one self-contained document. We lead the reader through practical theory for and examples of conformal prediction and describe its extensions to complex machine learning tasks involving structured outputs, distribution shift, time-series, outliers, models that abstain, and more. Throughout, there are many explanatory illustrations, examples, and code samples in Python. With each code sample comes a Jupyter notebook implementing the method on a real-data example; the notebooks can be accessed and easily run using our codebase.
⏩ Accelerated analytics with Shapelets and conformal prediction
https://fediverse.tv/videos/watch/8c55336e-d713-4829-ac8f-9e4b0178e4bd
See you *today* at our March meetup: ⏩ Accelerated analytics with Shapelets and conformal prediction, this month at The Bridge
https://www.meetup.com/pydata-madrid/events/299749589/
We'll see you at 19:00! And afterwards, networking 🗣️
#PyDataMadrid #PyData #python #MachineLearning #ConformalPrediction #shapelets
The distinction between marginal and conditional coverage finally clicked for me. #conformalprediction provides the former but not the latter, and for many (most?) real-world use cases in ML one wants the latter.
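A quick synthetic experiment (all data made up: two subpopulations with different noise scales and a trivial model that predicts zero) shows how marginal coverage can hold while conditional coverage fails badly:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two subpopulations with very different noise levels; the "model"
# predicts 0 everywhere, so nonconformity scores are just |y|.
def sample(n):
    group = rng.integers(0, 2, n)
    y = rng.normal(0, np.where(group == 0, 0.1, 1.0))
    return group, y

# Standard split conformal calibration, ignoring group membership.
_, y_cal = sample(2000)
scores = np.abs(y_cal)
alpha = 0.1
n = len(scores)
level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
q = np.quantile(scores, level, method="higher")

# Coverage on fresh data: fine on average, unequal across groups.
group_t, y_t = sample(20000)
covered = np.abs(y_t) <= q
marginal = covered.mean()             # close to the promised 0.90
cond0 = covered[group_t == 0].mean()  # low-noise group: over-covered, near 1.0
cond1 = covered[group_t == 1].mean()  # high-noise group: under-covered, near 0.80
```

Group-conditional variants, e.g. calibrating a separate quantile per group (Mondrian conformal prediction), recover per-group coverage at the cost of smaller calibration sets.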
If it sounds too good to be true...