Our sun follows an 11-year cycle of energetic activity, but solar observation technology advances far more quickly. A new study shows that AI can bridge the gaps between old and new solar data, revealing overlooked aspects of the sun's long-term evolution.
New solar telescopes capture unprecedented details of solar flares and magnetic fields, essential for understanding the sun's complex behavior and driving discoveries. However, differences in resolution, calibration, and image quality make datasets from different instruments incompatible, complicating long-term studies, the study explains.
The AI approach identifies patterns across diverse datasets and translates them into a standardized format, enriching archives for research on historic sunspots, rare events, and combined datasets, the authors say.
“AI can’t replace observations, but it maximizes the value of existing data,” said Robert Jarolim, who led the study.
Jarolim's AI method translates observations between instruments that never operated simultaneously, and it can be applied to astrophysical imaging data more broadly.
The process uses two neural networks. The first simulates degraded images from high-quality ones, learning each instrument's characteristic distortions; the second learns to undo that degradation, restoring images to high quality. Older, lower-quality data can then benefit from newer instruments' capabilities, gaining resolution and shedding noise without distorting genuine solar features.
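The degrade-then-restore idea can be illustrated with a toy stand-in (the study itself uses deep neural networks; everything below is illustrative). Here a hand-written blur-plus-noise model plays the role of the learned degradation network, and an ordinary least-squares linear filter, fit on the synthetic (degraded, original) pairs, plays the role of the restoration network:

```python
import numpy as np

rng = np.random.default_rng(0)

def degrade(img, blur=3, noise=0.05):
    # Toy degradation: moving-average blur along each axis plus Gaussian
    # noise, standing in for a learned instrument-degradation network.
    kernel = np.ones(blur) / blur
    out = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    out = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    return out + rng.normal(0.0, noise, img.shape)

def fit_restorer(lo_flat, hi_flat):
    # Toy "restoration network": a linear map A with lo_flat @ A ~ hi_flat,
    # fit by least squares on pairs generated from high-quality images.
    A, *_ = np.linalg.lstsq(lo_flat, hi_flat, rcond=None)
    return A

# Synthetic stand-ins for high-quality observations (8x8 patches).
hi = rng.random((200, 8, 8))
lo = np.stack([degrade(im) for im in hi])

# Learn the restoration map from the synthetic pairs.
A = fit_restorer(lo.reshape(200, -1), hi.reshape(200, -1))

# Restoring an unseen degraded patch should reduce error vs. the original.
test = rng.random((8, 8))
test_lo = degrade(test)
restored = (test_lo.reshape(1, -1) @ A).reshape(8, 8)
err_before = np.mean((test_lo - test) ** 2)
err_after = np.mean((restored - test) ** 2)
print(err_after < err_before)
```

The key design point, mirrored from the study's setup, is that the restorer is trained only on pairs manufactured from high-quality data, so no aligned observations from two real instruments are ever required.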
Study co-author Tatiana Podladchikova said the project revitalizes historical data, creating a universal language for studying the sun's evolution over time.
The method was tested on data from multiple space telescopes spanning two solar cycles (20+ years). It enhanced full-disk solar images, reduced atmospheric distortion in ground-based observations, and estimated magnetic fields on the sun’s far side.
Applied to sunspot NOAA 11106, observed in September 2010, the AI produced magnetic maps sharper than the original Solar and Heliospheric Observatory data, revealing the region's magnetic structure more clearly.
“Ultimately, we’re building a future where every observation, past or future, can speak the same scientific language,” Podladchikova said.
This research was published April 2 in Nature Communications.