MIT graduate student Alex Kachkine once spent nine months meticulously restoring a damaged baroque Italian painting, which left him plenty of time to wonder whether technology could speed things up. Last week, MIT News announced his solution: a technique that uses AI-generated polymer films to physically restore damaged paintings in hours instead of months. The research appears in Nature.
Kachkine's method works by printing a transparent "mask" containing thousands of precisely color-matched regions that conservators can apply directly to an original artwork. Unlike traditional restoration, which permanently alters the painting, these masks can be removed whenever needed, making the process reversible rather than a permanent change to the painting.
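To make the idea concrete, here is a minimal sketch of how such a removable overlay could be previewed in software. It is an illustration, not Kachkine's actual pipeline: it assumes the printed film can be modeled as an RGBA image that is transparent everywhere except over damaged regions, and the file names and mask format are placeholders.

```python
# Illustrative sketch only, not the pipeline from the paper. Assumes the mask is
# an RGBA image that is fully transparent except over damaged regions, where it
# carries the color-matched fill.
from PIL import Image


def preview_masked_restoration(damaged_path: str, mask_path: str, out_path: str) -> None:
    """Composite a transparent color mask over a scan of the damaged painting.

    damaged_path: RGB scan of the damaged artwork.
    mask_path: RGBA mask, transparent outside damaged regions (hypothetical format).
    """
    damaged = Image.open(damaged_path).convert("RGBA")
    mask = Image.open(mask_path).convert("RGBA")
    # Alpha compositing stands in for physically laying the film on the painting;
    # "removing" the mask just means discarding the overlay, so the original
    # pixels underneath are never modified.
    preview = Image.alpha_composite(damaged, mask)
    preview.convert("RGB").save(out_path)


# Example usage (file names are placeholders):
# preview_masked_restoration("damaged_scan.png", "printed_mask.png", "preview.png")
```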
"Because there's a digital record of what mask was used, in 100 years, the next time someone is working with this, they'll have an extremely clear understanding of what was done to the painting," Kachkine told MIT News. "And that's never really been possible in conservation before."
Figure 1 from the paper. Credit: MIT
Nature reports that up to 70 percent of institutional art collections remain hidden from public view due to damage, a vast amount of cultural heritage sitting in storage. Traditional restoration methods, in which conservators fill damaged areas one at a time while mixing exact color matches for each region, can take anywhere from weeks to decades for a single painting. It is skilled work that requires both artistic talent and deep technical knowledge, and there simply aren't enough conservators to tackle the backlog.
Kachkine, a mechanical engineering student at MIT, came up with the idea during a 2021 cross-country drive, when gallery visits revealed how much art remains hidden because of damage and restoration backlogs. As someone who restores paintings as a hobby, he understood both the problem and the potential for a technical solution.
To demonstrate his method, Kachkine chose a challenging test case: a 15th-century oil painting requiring repairs across 5,612 separate regions. An AI model identified the damage patterns and generated 57,314 different colors to match the original work. The entire restoration process reportedly took 3.5 hours, roughly 66 times faster than traditional hand-painting methods.
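As a rough illustration of that kind of region and color bookkeeping (not the paper's actual algorithm), the sketch below assumes an aligned scan of the damaged painting and a digitally restored reference, finds connected damaged regions by thresholding their difference, and records a fill color for each region. The threshold value and the one-color-per-region simplification are assumptions made for clarity; the published restoration used far more colors than regions.

```python
# Hedged sketch of damage-region detection and color matching, assuming we
# already have a digitally restored reference aligned to the damaged scan.
import numpy as np
from scipy import ndimage


def damage_regions_and_colors(damaged: np.ndarray, restored: np.ndarray, thresh: int = 12):
    """Return a labeled map of damaged regions and a mean fill color per region.

    damaged, restored: aligned uint8 RGB arrays of identical shape.
    thresh: per-pixel difference (hypothetical value) above which a pixel counts as damaged.
    """
    diff = np.abs(damaged.astype(int) - restored.astype(int)).max(axis=2)
    damage_mask = diff > thresh                      # True where paint is lost or altered
    labels, n_regions = ndimage.label(damage_mask)   # connected damaged regions
    # For each region, take the color the restored reference says should be there.
    colors = {
        region: restored[labels == region].mean(axis=0).round().astype(np.uint8)
        for region in range(1, n_regions + 1)
    }
    return labels, colors
```

How many regions and colors such a pass yields depends entirely on the input images and the threshold; the 5,612 regions and 57,314 colors cited above come from the paper's test case, not from this sketch.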

Notably, Kachkine avoided using generative AI models like Stable Diffusion or the "full-area application" of generative adversarial networks (GANs) for the digital restoration step. According to the Nature paper, these models cause "spatial distortion" that would prevent proper alignment between the restored image and the damaged original.
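The alignment requirement can be illustrated with a simple check (again a hedged sketch, not code from the paper): a digital restoration is only usable for a printed mask if every pixel outside the damaged regions stays exactly where it was, which is what "spatial distortion" from generative models would break.

```python
# Illustrative check that a digital restoration left undamaged pixels untouched,
# i.e. that it stayed spatially aligned with the original scan.
import numpy as np


def undamaged_pixels_preserved(damaged: np.ndarray,
                               restored: np.ndarray,
                               damage_mask: np.ndarray,
                               tol: int = 2) -> bool:
    """Return True if restored matches damaged everywhere outside damage_mask.

    damaged, restored: aligned uint8 RGB arrays of identical shape.
    damage_mask: boolean array, True over regions the restoration is allowed to change.
    tol: small per-channel tolerance (hypothetical) for scanning/compression noise.
    """
    diff = np.abs(damaged.astype(int) - restored.astype(int)).max(axis=2)
    # If any pixel outside the damaged regions shifted, the restored image no
    # longer registers with the physical painting and a printed mask would land
    # in the wrong place.
    return bool(np.all(diff[~damage_mask] <= tol))
```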