Severe testing of deep learning models of cognition (i)
https://errorstatistics.com/2026/01/29/severe-testing-of-deep-learning-models-of-cognition-i/
Drop #764 (2026-01-29): Thursday Three-fer
https://dailydrop.hrbrmstr.dev/2026/01/29/drop-764-2026-01-29-thursday-three-fer/
New training opportunities at FIZ Karlsruhe!
Since 20 January 2026, we have been a partner institute of @Hochschule Darmstadt for the Master of Advanced Studies (MAS) "Data & Information Specialist". This expands our traineeship (Volontariat) to include an academic degree.
Our trainees work hands-on in the field of Patent & Scientific Information, right at the heart of data, knowledge & innovation.
🔗 https://information-specialist.h-da.de/doku.php/?id=wd:start
#Ausbildung #Volontariat #MAS #Informationswissenschaft #DataScience #OpenScience
📊 The DATAZUR Data Workshop took part yesterday in a collective “Data Session” at LAMHESS – Laboratory of Human Motricity, Expertise, Sport, and Health. Many thanks to Stephen Ramanoel for the invitation, and to the thirty faculty members, researchers, and doctoral students who attended for these very interesting discussions.
Together, we explored the challenges of managing and opening research data in the fields of elite performance and sport for health. 🏃‍♀️
OpenAI deploys GPT-5.2 data agent across 600PB infrastructure serving 3,500 users. Analysis time drops from days to minutes across 70K datasets.
#AdwaitX #OpenAI #GPT52 #AI #DataScience #EnterpriseAI #news
https://www.adwaitx.com/openai-gpt-5-2-data-agent-600-petabytes/
The Algorithmic Eye: How Artificial Intelligence is Discovering New Worlds
For centuries, the discovery of a new planet was a rare, monumental event, often the result of a lone astronomer spending years peering through glass and calculating orbits by hand. Today, we are in the midst of a cosmic census that has identified over 5,500 exoplanets—worlds orbiting stars other than our Sun. This explosion of discovery is not due to a sudden increase in the number of telescopes, but rather a revolution in how we process information. Artificial Intelligence (AI) and Machine Learning (ML) have become the indispensable partners of modern astrophysics, acting as an “algorithmic eye” capable of seeing patterns in the dark that human observers simply cannot perceive.
The empirical challenge of modern astronomy is one of volume. Telescopes like NASA’s TESS (Transiting Exoplanet Survey Satellite) and the European Space Agency’s Gaia mission generate petabytes of data, capturing the brightness of millions of stars simultaneously. Somewhere in that sea of noise is the tiny, periodic dip in light that signals a planet passing in front of its star. Finding that needle in a digital haystack requires the speed and precision that only AI can provide.
Beyond Human Perception: Neural Networks in the Light Curve
The primary method for finding exoplanets is “Transit Photometry.” When a planet passes between Earth and its host star, it blocks a minuscule fraction of the star’s light. Humans are naturally good at pattern recognition, but we are slow and prone to fatigue. Furthermore, the signal of a small, Earth-sized planet is often buried under “stellar noise”—the natural flickering and sunspots of the star itself.
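To make the scale of the problem concrete, the depth of a transit is roughly the squared ratio of the planet's radius to the star's radius. A minimal sketch of that arithmetic in Python (the radius figures are standard reference values, not drawn from the article):

```python
# Transit depth: the fraction of starlight blocked as a planet crosses its star.
# depth ≈ (R_planet / R_star)^2

R_SUN_KM = 695_700       # solar radius
R_EARTH_KM = 6_371       # Earth radius
R_JUPITER_KM = 69_911    # Jupiter radius

def transit_depth(r_planet_km: float, r_star_km: float = R_SUN_KM) -> float:
    """Fractional dip in brightness during a transit."""
    return (r_planet_km / r_star_km) ** 2

print(f"Earth-like:   {transit_depth(R_EARTH_KM) * 1e6:,.0f} ppm")    # ~84 ppm
print(f"Jupiter-like: {transit_depth(R_JUPITER_KM) * 1e6:,.0f} ppm")  # ~10,000 ppm
```

An Earth-sized dip of roughly 84 parts per million is far smaller than typical stellar flicker, which is why the signal is so easy to lose in the noise.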
Empirical research has shown that Deep Learning models, specifically Convolutional Neural Networks (CNNs), are exceptionally effective at distinguishing these subtle planetary signals from random noise. In 2017, NASA announced the discovery of Kepler-90i, the eighth planet in its system, which was found not by a human, but by a neural network trained to scan Kepler mission data. This confirmed that AI could find “hidden” planets in old data that had already been searched by conventional means. By 2026, these algorithms have evolved to be 100 times more efficient, allowing researchers to re-process decades of archival data to find worlds we never knew existed.
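The article does not describe the networks themselves, so as a rough illustration, here is a minimal 1D convolutional classifier in the spirit of those used on Kepler light curves; the layer sizes and the 2001-point input length are illustrative assumptions, not the actual NASA pipeline:

```python
import torch
import torch.nn as nn

class LightCurveCNN(nn.Module):
    """Toy 1D CNN mapping a phase-folded light curve to a planet/not-planet score."""
    def __init__(self, n_points: int = 2001):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_points // 16), 64), nn.ReLU(),
            nn.Linear(64, 1),  # logit: transit vs. false positive
        )

    def forward(self, x):  # x: (batch, 1, n_points)
        return self.classifier(self.features(x))

model = LightCurveCNN()
flux = torch.randn(8, 1, 2001)      # stand-in for normalized, phase-folded flux
probs = torch.sigmoid(model(flux))  # probability each curve contains a transit
```

The convolutional layers learn the characteristic U-shaped dip directly from labeled examples, which is what lets such models pick out signals a human scanner would miss.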
AI as a Filter: Managing the Data Deluge
The Vera C. Rubin Observatory in Chile is expected to produce 20 terabytes of data every single night. If humans had to review every “alert” or change in the sky detected by such an instrument, it would take centuries. AI serves as a critical filter, performing a task known as “Automated Transient Classification.”
These systems automatically identify whether a change in brightness is a supernova, an asteroid moving through the field, a variable star, or a potential exoplanet transit. By the time a human astronomer arrives at their desk in the morning, the AI has already discarded the 99% of alerts that are “noise” and presented a curated list of high-priority targets. This synergy allows the scientific community to focus on the “why” and “how” of these worlds, rather than the tedious “where.”
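As a toy illustration of that filtering step, the sketch below triages alerts by the score an upstream classifier assigns to each class; the class names, scores, and 0.9 threshold are all invented for illustration:

```python
# Toy alert triage: keep only candidate transits whose classifier score
# clears a confidence threshold, most confident first.

alerts = [
    {"id": "alert-001", "scores": {"supernova": 0.90, "asteroid": 0.05, "transit": 0.05}},
    {"id": "alert-002", "scores": {"supernova": 0.03, "asteroid": 0.02, "transit": 0.95}},
    {"id": "alert-003", "scores": {"supernova": 0.40, "asteroid": 0.35, "transit": 0.25}},
]

def triage(alerts, target="transit", threshold=0.9):
    """Discard low-confidence alerts and rank the survivors."""
    keep = [a for a in alerts if a["scores"][target] >= threshold]
    return sorted(keep, key=lambda a: a["scores"][target], reverse=True)

for a in triage(alerts):
    print(a["id"], a["scores"]["transit"])  # only alert-002 survives
```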
Characterizing Habitability: The Search for Biosignatures
Finding a planet is only the first step. The next, more difficult frontier is determining what that planet is made of. The James Webb Space Telescope (JWST) uses spectroscopy to analyze the chemical composition of exoplanet atmospheres. This process produces complex spectra with thousands of overlapping lines representing different elements like water, methane, and carbon dioxide.
AI is now being used to perform “Spectral Retrieval.” These algorithms compare the observed data against millions of simulated atmospheric models in seconds. This allows scientists to identify potential “biosignatures”—chemical combinations that might suggest the presence of life. While AI has not yet confirmed extraterrestrial life, it is the tool that makes the search feasible by calculating the probability of certain gases existing in equilibrium.
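In spirit, a retrieval is a fit of observed data against a grid of forward models. Here is a deliberately simplified sketch using synthetic Gaussian “absorption bands”; the band centers, model grid, and noise level are stand-ins, not real JWST data:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(1.0, 5.0, 200)  # microns (illustrative range)

def toy_model(water: float, methane: float) -> np.ndarray:
    """Fake transmission spectrum with Gaussian dips at fixed band centers."""
    spec = np.ones_like(wavelengths)
    spec -= water * np.exp(-((wavelengths - 1.4) ** 2) / 0.02)    # "water" band
    spec -= methane * np.exp(-((wavelengths - 3.3) ** 2) / 0.02)  # "methane" band
    return spec

# Grid of candidate atmospheres (441 models)
grid = [(w, m) for w in np.linspace(0, 0.02, 21) for m in np.linspace(0, 0.02, 21)]

# Synthetic "observation": a true atmosphere plus photon noise
sigma = 0.001
observed = toy_model(0.012, 0.006) + rng.normal(0, sigma, wavelengths.size)

# Chi-squared against every model; the minimum is the retrieved atmosphere
chi2 = [np.sum((observed - toy_model(w, m)) ** 2) / sigma**2 for w, m in grid]
best_w, best_m = grid[int(np.argmin(chi2))]
print(f"retrieved water={best_w:.3f}, methane={best_m:.3f}")  # near 0.012, 0.006
```

Real retrievals compare millions of models over far more parameters, but the principle is the same: the best-fitting atmosphere is the one whose simulated spectrum most closely matches the observed one.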
The Convergence of AI, Technology, and Private Enterprise
The rise of AI in space discovery mirrors the themes of “Technological Supremacy” and “AI Colleagues” frequently discussed in The Boreal Times. Private companies are now entering the fray, developing proprietary algorithms to mine public space data for potential commercial or scientific value.
This creates a new era of “Computational Astrophysics,” where the most valuable asset a space agency possesses is not just its hardware, but its training datasets. As AI becomes more autonomous, we are approaching a point where telescopes may be able to self-direct their observations—identifying an anomaly and re-targeting themselves in real-time to capture a once-in-a-lifetime event without waiting for human intervention.
Citizen Science and the AI-Human Partnership
Despite the power of machine learning, the human element remains vital. Programs like Planet Hunters TESS allow members of the public to classify light curves. Interestingly, the best results often come from a “Hybrid Approach,” where AI performs the bulk of the classification, and humans review the “edge cases” where the algorithm is uncertain. This teaches the AI to be better, while allowing humans to use their intuition to spot anomalies that the algorithm might have been programmed to ignore.
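One common way to implement such a hybrid pipeline is a simple confidence band: the model auto-labels what it is sure about and escalates the rest. A minimal sketch, where the 0.2/0.8 band edges are an assumed design choice:

```python
def route(prob_planet: float, low: float = 0.2, high: float = 0.8) -> str:
    """Auto-accept or auto-reject confident predictions; escalate the rest."""
    if prob_planet >= high:
        return "auto: candidate planet"
    if prob_planet <= low:
        return "auto: discard"
    return "human review"  # edge case: volunteers classify it

for p in (0.97, 0.55, 0.03):
    print(f"p={p:.2f} -> {route(p)}")
```

The human labels collected on the uncertain middle band can then be fed back as training data, which is the “teaches the AI to be better” loop described above.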
For enthusiasts, this is the most accessible era of astronomy. You do not need a multi-million dollar observatory to contribute; you need a computer and an understanding of data. Learning Python or basic data science is now as fundamental to astronomy as knowing how to align a telescope.
A New Map for a New Era
Artificial Intelligence has transformed the universe from a collection of distant, blurry points of light into a detailed map of potential homes. We are no longer limited by the biological constraints of our eyes or the short duration of a human life. AI allows us to see the cosmos through the lens of mathematics and probability, revealing a galaxy teeming with diversity.
As we continue to refine these “Algorithmic Eyes,” the question is no longer if we will find an Earth-like world, but when. In the vast archives of our digital telescopes, the answer is already waiting. We just need the right code to unlock it.
#AIInAstronomy #artificialIntelligence #ComputationalAstrophysics #DataScience #DeepSpaceExploration #ExoplanetDetectionAlgorithms #MachineLearningSpaceData #TESSMissionAI
🎥 Recording now available!
The Git & GitHub: Practical Version Control for Data Work session with R-Ladies Rome is now on YouTube.
We focus on understanding Git first:
• Git as a local tool (no GitHub needed)
• commits, staging, and history
• how to undo safely
• when GitHub fits in
📺 Watch the recording: https://youtu.be/Sa8NPcaNYLo
📄 Materials: https://tinyurl.com/bdczzcf6
#Git #GitHub #VersionControl #ReproducibleResearch #DataScience #RLadiesRome
My Missing Data Imputation in R course is now available as a fully self-paced course, with access to all materials and recordings and the option to ask questions at any time in the comments section.
More info: https://statisticsglobe.com/online-course-missing-data-imputation-r