
iPhone photos are badly over-sharpened: is computational photography really to blame?

Author: Wolf Photography

Last time, we discussed the iPhone's ghosting problem. Ghosting can perhaps be tolerated, but over-sharpening is truly unbearable!

In 2019, Apple introduced Deep Fusion with the iPhone 11 series to show off its computational photography. But starting with the iPhone 12 series, Deep Fusion has backfired: in the name of detail, contrast is boosted aggressively, producing very noticeable over-sharpening.
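To see why aggressive detail boosting looks bad, here is a minimal, illustrative unsharp-mask sketch in Python with NumPy. This is not Apple's pipeline; the blur and the `amount` parameter are assumptions chosen to show the mechanism: a large `amount` amplifies noise right along with edges, which is the "dirty" over-sharpened look described above.

```python
import numpy as np

def box_blur(img, k=3):
    """Simple k x k box blur with edge replication (illustrative, not optimized)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=1.0, k=3):
    """Sharpen by adding back the high-frequency residual (img - blurred).

    `amount` controls strength: values well above 1 exaggerate edges
    and noise alike, producing the over-sharpened look.
    """
    blurred = box_blur(img, k)
    return np.clip(img + amount * (img - blurred), 0, 255)
```

On a flat gray patch with a single noisy pixel, `amount=3.0` roughly triples the pixel's deviation from its surroundings: sharpening cannot tell fine detail from noise, so boosting one boosts the other.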


Worse, the problem has not improved with new models or system updates; it has grown more severe generation after generation, and iPhone users have long suffered from it.


A casual search on social platforms turns up plenty of complaints.

A typical example is shooting cats and dogs: because Deep Fusion enhances texture detail, animal fur loses the soft, plush look it has to the naked eye. The boosted texture and contrast instead fill the frame with noise and make it look dirty.


Even if you don't shoot pets, you can see over-sharpening in everyday shots of flowers, trees, buildings, and the sky. This problem used to appear only in night scenes or low light, but on the iPhone 15 generation, brightness and contrast are boosted aggressively even in good light; after multi-frame fusion, the picture's clarity drops noticeably and image quality falls apart.


Shot with an iPhone 15 Pro

Most outrageous of all, the preview in the viewfinder looks perfectly fine, but the moment you press the shutter you can watch the photo being over-sharpened with the naked eye: a clear, bright image turns dirtier and dirtier right in front of you. It is disheartening enough to kill any enthusiasm for shooting. Looking at results like this, who still wants to pull out their phone to take pictures?


Shot with an iPhone 15 Pro

What's worse, although users have been complaining about iPhone over-sharpening for years, there still seems to be no good solution.

On the one hand, Apple applies Deep Fusion to every camera and to nearly every shooting mode, including Live Photo and Portrait, with no way for the user to turn it off. As a result, the over-sharpening problem grows more prominent generation after generation. In Portrait mode, for example, aggressive sharpening often makes a subject's hair look as if it hasn't been washed for a month.

Of course, the internet offers many workarounds for iPhone over-sharpening, but a little research shows that most of them are unreliable.

For example: shoot a Live Photo and then pick a different frame in the Photos app as the key photo; hold down the shutter and swipe left to trigger burst mode, since Deep Fusion is not activated during burst shooting; or, on Pro models, shoot in ProRAW and re-save the photo through the Photos editor to get a less over-sharpened result......


The problem with these methods is obvious: they are cumbersome, they eat up a lot of phone storage, and you still have to re-edit everything in the Photos app. What's the point?

Another option is third-party camera apps such as NOMO RAW, Focus, ProCam, and Halide Mark II, but these are paid apps and carry a learning curve for ordinary users. In effect, because of Apple's so-called computational photography, many iPhone users have no choice but to live with over-sharpened photos.


If ghosting is ultimately caused by hardware, why can't over-sharpening, a software-side problem, be effectively improved and optimized?

Some blame computational photography itself. Before you even press the shutter, Deep Fusion captures four frames at a high shutter speed; at the moment the shutter is pressed, it takes four more frames at standard exposure plus one long exposure. The Bionic chip then analyzes these photos with pixel-level precision, and the nine frames are fused and optimized with different weights.
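Apple has not published Deep Fusion's internals, but the weighted multi-frame fusion described above can be sketched roughly as follows. Everything here is an illustrative assumption: the detail measure (local gradient magnitude) and the per-pixel weighting are stand-ins for whatever Apple's proprietary pipeline actually does.

```python
import numpy as np

def fuse_frames(short_frames, standard_frames, long_frame):
    """Hypothetical weighted multi-frame fusion (a sketch of the idea only).

    Frames get higher weight at pixels where they show more local
    detail (here: gradient magnitude), so sharp short exposures
    dominate in textured areas while flat areas average out.
    """
    frames = list(short_frames) + list(standard_frames) + [long_frame]
    stacked = np.stack([f.astype(float) for f in frames])  # (9, H, W)

    # Per-frame, per-pixel detail score: absolute gradient magnitude.
    gy = np.abs(np.diff(stacked, axis=1, prepend=stacked[:, :1, :]))
    gx = np.abs(np.diff(stacked, axis=2, prepend=stacked[:, :, :1]))
    detail = gy + gx + 1e-6  # epsilon avoids all-zero weights

    # Normalize weights across the 9 frames at each pixel, then blend.
    weights = detail / detail.sum(axis=0, keepdims=True)
    return (weights * stacked).sum(axis=0)
```

This also hints at why over-aggressive weighting over-sharpens: if high-detail frames win everywhere, noise (which looks like detail to a gradient measure) gets preserved and amplified rather than averaged away.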


In the early days, when the iPhone's 12 MP sensor had limited resolving power, adding Deep Fusion genuinely improved sharpness and detail, lifting the iPhone's imaging level through algorithms alone.

Computational photography has shone elsewhere, too. To this day, major phone makers have not stopped pursuing it: Google, Huawei, vivo, OPPO, and Xiaomi have all made real progress with computational photography, and some have even developed distinctive styles of their own.

It's true that domestic phone makers also had problems in the early stages of applying computational photography, such as over-brightening, obvious smearing, and excessive beautification. But once stylized tones and shadows were taken seriously, the light, shadow, and color of these phones developed their own character through computational photography. Sharpening, smearing, multi-frame synthesis, and HDR are no longer the keys to image quality; how to genuinely render changes in light and shadow, chiaroscuro, and color style has become the central topic of computational photography.


You could even say that the recent co-branding between phone makers and camera manufacturers is an attempt to borrow the extra value that stylization brings. In raw hardware capability, phones may never catch up with mirrorless and DSLR cameras, but well-judged computational photography can gradually erase that gap in perception.

The iPhone's over-sharpening problem should not be blamed entirely on computational photography. The crux is that Apple, relying on powerful algorithms and chip compute, tries to use computational photography to artificially paper over its hardware gap. Can algorithms really deliver real-world performance that the hardware cannot achieve?

If Apple wants to return to the throne of mobile photography, software algorithms alone won't get it there. Apple has now begun to invest in sensor hardware as well; the iPhone 16 series is rumored to get an even larger sensor. But as the hardware improves, the crux for Apple is keeping its computational-photography algorithms in step and finding its own way to advance hardware and software together.

For now, Apple has gone from leader to chaser; when it can catch up remains to be seen.
