Samsung published a post on its company blog explaining the mechanisms it uses to improve image quality when photographing the moon with its smartphones. The Korean manufacturer effectively confirmed the theory put forward by the Reddit “whistleblower”, which had led the public to question what modern photography really is.
Recently, a Reddit user ran an experiment demonstrating how the AI algorithms on Samsung phones actually work when capturing moon images. The experiment proved easy to reproduce: replace the moon in the lens with an artificially blurred image of it on a computer monitor, and the algorithm simply draws the missing detail into the photo.
Photo quality improvement on Samsung smartphones is handled by the Scene Optimizer function, which has supported moon shots since the Galaxy S21 and combines several methods for improving quality and detail. At zoom levels of 25x and above, the upscaling function merges data from 10 or more exposures to reduce noise and increase image clarity. To reduce blur when shooting the moon against a dark sky, Zoom Lock is activated, which combines digital and optical image stabilization.
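The multi-exposure merging step can be illustrated with a minimal sketch. This is not Samsung's implementation, just a toy NumPy example (with a synthetic gradient standing in for the moon and made-up noise levels) showing why averaging 10 aligned frames suppresses sensor noise:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" scene: a smooth gradient standing in for the moon.
scene = np.linspace(0.0, 1.0, 64).reshape(8, 8)

# Simulate 10 noisy exposures of the same, perfectly aligned scene.
frames = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(10)]

# Merging: average the frames; for independent noise, the noise standard
# deviation drops by roughly sqrt(N) while the scene itself is preserved.
merged = np.mean(frames, axis=0)

noise_single = np.abs(frames[0] - scene).mean()
noise_merged = np.abs(merged - scene).mean()
```

In practice the frames must first be registered (aligned), which is exactly what stabilization features such as Zoom Lock help with; averaging misaligned frames would blur the result instead of cleaning it up.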
Samsung also showed a flowchart describing how image detail is enhanced with a convolutional neural network that compares the system’s output against a “high-resolution reference”. Apparently it is at this stage that the AI adds details missing from the original, and it is this phase that caused public discontent: Samsung has been accused of misleading consumers about its phones’ ability to scale images. However, the company does little that deviates from general industry practice: the modest capabilities of smartphone cameras are increasingly compensated for by computational photography technologies.
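Why a high-resolution reference lets a system "add" detail that the capture never contained can be shown without any neural network. The following is a deliberately simplified NumPy sketch (the box blur, image sizes, and the direct residual transfer are all illustrative assumptions, not Samsung's pipeline): the high-frequency detail is extracted from the reference and injected into the blurred input, so the restored detail comes from the reference, not from the camera:

```python
import numpy as np

def box_blur(img, k=3):
    # Naive k-by-k box blur (illustrative only, O(n^2 * k^2)).
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

rng = np.random.default_rng(0)
reference = rng.random((16, 16))          # stand-in "high-resolution reference"
blurred_input = box_blur(reference)       # detail-poor capture, as in the Reddit test

# Detail injection: the high-frequency residual present in the reference
# but absent from the blurred capture is simply added back in.
detail = reference - box_blur(reference)
enhanced = blurred_input + detail

err_before = np.abs(blurred_input - reference).mean()
err_after = np.abs(enhanced - reference).mean()
```

The point of the toy: the enhanced image matches the reference almost perfectly, yet none of that fine detail was recoverable from the blurred input alone. A trained network does this statistically rather than by literal subtraction, which is precisely why the result can look like invented detail.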