Inside Apple’s iPhone XS Camera Technology

This perfectly frozen and exposed photo, however, was taken with an iPhone XS.

“We set a reference frame and fuse in information from multiple frames,” Marineau-Mes said. The image I saw was a composite of multiple frames. Some of those frames contained pieces of what would become the final image, like the perfectly sharp hair and water.

“As you stack the frames, if you have the same image, you have lower and lower noise and better and better detail,” he explained.
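Apple hasn't published the fusion algorithm itself, but the noise math Marineau-Mes describes is easy to demonstrate: averaging N aligned frames of the same scene cuts random sensor noise by roughly a factor of the square root of N. Here's a minimal Python sketch of that stacking effect, which assumes the frames are already perfectly aligned (the hard part a real pipeline has to solve):

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of aligned frames to suppress random sensor noise.

    Stacking N frames of the same scene reduces zero-mean noise by
    roughly sqrt(N): the "lower and lower noise, better and better
    detail" effect Marineau-Mes describes.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a burst: one clean scene plus independent noise per frame.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(480, 640))
burst = [scene + rng.normal(0, 12, scene.shape) for _ in range(8)]

fused = stack_frames(burst)
print("single-frame noise:", np.std(burst[0] - scene))  # ~12
print("fused noise:       ", np.std(fused - scene))     # ~12 / sqrt(8), about 4.2
```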

It takes an incredibly powerful ISP and neural engine backed by an equally powerful GPU and CPU to do all this processing, Marineau-Mes said.

All that heavy lifting starts before you even press the iPhone camera app’s virtual shutter button. Schiller said that what users see on their iPhone XS, XS Max, and XR screens is not dramatically different from the final image.

“You need for that to happen,” said Schiller. “It needs to feel real-time for the user.”
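Schiller didn't say how the pipeline gets its head start, but a common zero-shutter-lag design keeps a rolling buffer of the most recent preview frames, so that by the time you tap the shutter, candidate frames already exist. A hypothetical sketch (the class and method names are mine, not Apple's):

```python
from collections import deque

class FrameRingBuffer:
    """Keep the last few preview frames so capture effectively starts
    before the shutter press (a common zero-shutter-lag design; Apple's
    actual pipeline is not public)."""

    def __init__(self, capacity=8):
        # A deque with maxlen silently drops the oldest frame when full.
        self.frames = deque(maxlen=capacity)

    def on_preview_frame(self, frame):
        self.frames.append(frame)

    def on_shutter_press(self):
        # Hand everything buffered so far to the fusion stage.
        return list(self.frames)
```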

When I asked what all this gathered information meant for file size, they told me that Apple’s HEIF format results in higher quality but smaller file sizes.
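Apple didn't quantify the savings in our conversation, but HEIF's HEVC-based compression is straightforward to test yourself. A quick sketch using Pillow with the third-party pillow-heif plugin (the file names are placeholders, and the exact savings depend on the image):

```python
import os
from PIL import Image
from pillow_heif import register_heif_opener  # third-party HEIF plugin for Pillow

register_heif_opener()  # lets Pillow read and write .heic files

img = Image.open("photo.jpg")  # placeholder: any test image on disk
img.save("test.jpg", quality=90)
img.save("test.heic", quality=90)

for path in ("test.jpg", "test.heic"):
    print(path, os.path.getsize(path), "bytes")
# The HEVC-based HEIC file typically comes out well under the JPEG
# at comparable visual quality.
```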

Sometimes Apple’s engineers arrive at better image technology almost by accident. Last year, Apple introduced flicker detection, which identifies the refresh frequencies of light sources and tries to reduce flicker in still and video imagery. While incandescent and fluorescent lights have consistent refresh frequencies, which makes it easy to figure out exposure times, modern energy-saving LEDs operate at all sorts of different frequencies, especially the ones that change hue, Townsend explained.

This year, Apple engineers widened the range of recognized frequencies to further cut down on flicker. In doing so, they realized they could now also immediately identify when the sun is in the picture (“The sun doesn’t flicker,” Townsend noted) and instantly adjust the white balance for natural light.

“Our engineers were kind of working at that, and they spotted this extra information. So, this is the bonus that we get from the flicker detect,” Townsend said.
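Apple's detector is proprietary, but the underlying idea it describes (find the dominant refresh frequency in the scene's brightness over time, and treat the absence of flicker as a daylight cue) can be sketched with a Fourier transform. The sampling rate and threshold below are illustrative, not Apple's:

```python
import numpy as np

def dominant_flicker_hz(brightness, sample_rate_hz):
    """Estimate the strongest flicker frequency in a brightness signal.

    brightness: average scene luminance sampled over time.
    Returns the peak frequency in Hz, or None when the illumination is
    steady (sunlight, for instance, which doesn't flicker).
    """
    signal = brightness - np.mean(brightness)       # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    peak = int(np.argmax(spectrum))
    # Illustrative threshold: the peak must clearly dominate the spectrum.
    if spectrum[peak] < 4 * np.mean(spectrum) + 1e-9:
        return None
    return float(freqs[peak])

# A 60 Hz mains bulb flickers at 120 Hz; sample brightness at 2 kHz.
t = np.arange(0, 0.5, 1 / 2000.0)
mains_light = 100 + 10 * np.sin(2 * np.pi * 120 * t)
steady_sun = np.full_like(t, 100.0)

print(dominant_flicker_hz(mains_light, 2000))  # ~120.0
print(dominant_flicker_hz(steady_sun, 2000))   # None: treat as daylight
```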

Video and new frontiers

All of these image-capture gymnastics extend to video as well, where the same frame-to-frame analysis happens in real time to produce video with more detail in both high and low light.
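Apple's real-time video processing surely goes further than this, but a classic simplified form of frame-to-frame noise reduction is an exponential moving average over successive frames. A minimal sketch, ignoring the motion compensation a real pipeline would need (the names in the usage comment are placeholders):

```python
import numpy as np

def temporal_denoise(frames, alpha=0.2):
    """Blend each incoming frame with a running average of earlier ones.

    A simplified stand-in for real-time frame-to-frame noise reduction:
    alpha trades noise suppression against motion ghosting. A production
    pipeline would align frames for motion before blending.
    """
    avg = None
    for frame in frames:
        frame = frame.astype(np.float64)
        avg = frame if avg is None else alpha * frame + (1 - alpha) * avg
        yield avg

# Usage: stream frames through as they arrive from the camera,
# e.g. for clean in temporal_denoise(camera_frames): display(clean)
```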

It occurred to me that, with all that intelligence, Apple could probably apply the depth editor to video, adding the professional polish of a defocused background to everyday video shoots. But when I asked Schiller about it, he would only say that Apple does not comment on future plans.

Video or stills, the end result is a new high-water mark for Apple and, perhaps, for smartphone photography in general. The company gets emails, Townsend told me, in which people say, “I can’t believe I took this picture.” It’s an indication that Apple is achieving a larger goal.

“We make cameras for real people in real situations,” Townsend said. “They’re not on tripods; they’re not in the labs. They want their picture to be a beautiful picture without thinking very much about it.”