More important this year is the upgraded computational power and the software it enables: the A12 Bionic processor, the eight-core "Neural Engine," and the image signal processor (ISP) dedicated to the camera functions. The results include a new Smart HDR feature that rapidly combines multiple exposures for every capture, and improved depth-of-field simulation in Portrait mode. (All the examples throughout are straight out of the device.)

Smart HDR

This feature intrigued me the most, because last year's iPhone 8, iPhone 8 Plus, and iPhone X introduced HDR as an always-on feature. (See "HDR is enabled by default on the iPhone 8 Plus, and that's a really good thing.") HDR typically blends two or more images of varying exposures to end up with a shot with increased dynamic range, but doing so introduces time as a factor: if objects are in motion, the delay between captures makes those objects blurry.

Smart HDR instead captures many interframes to gather additional highlight information, and may help avoid motion blur when all the slices are merged into the final product.
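Apple hasn't published how Smart HDR's merge actually works, but the generic exposure-fusion idea it builds on is easy to sketch: weight each frame's pixels by how well exposed they are, then blend the stack. The code below is only a minimal illustration of that idea, not Apple's pipeline; the function name, the Gaussian "well-exposedness" weighting, and the sample frames are all my own assumptions, and it omits the frame alignment and motion rejection a real merge would need.

```python
# Toy multi-exposure merge -- a sketch of the generic exposure-fusion idea,
# NOT Apple's (unpublished) Smart HDR pipeline.
import numpy as np

def merge_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Blend exposure-bracketed frames (same shape, values in [0, 1]).

    Each pixel is weighted by how close it is to mid-gray (0.5), so
    highlight detail comes from the darker frames and shadow detail
    from the brighter ones.
    """
    stack = np.stack(frames).astype(np.float64)               # (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))  # well-exposedness
    weights /= weights.sum(axis=0, keepdims=True) + 1e-8      # normalize per pixel
    return (weights * stack).sum(axis=0)

if __name__ == "__main__":
    scene = np.random.rand(4, 6)             # stand-in scene radiance
    under = np.clip(scene * 0.25, 0.0, 1.0)  # underexposed: preserves highlights
    over = np.clip(scene * 4.0, 0.0, 1.0)    # overexposed: preserves shadows
    print(merge_exposures([under, scene, over]))
```

Real implementations blend across image pyramids to avoid visible seams; the point here is only that the merged result draws each region from whichever slice exposed it best.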
Testing Smart HDR proved to be a challenge at first, because unlike with the HDR feature in earlier models, the Photos app doesn't automatically label all Smart HDR images as such. After shooting in conditions that would be ripe for HDR – bright backgrounds and dark foregrounds, low-light conditions at dusk – nothing had that HDR indicator. I wasn't initially sure whether the image quality was due to Smart HDR or to the larger sensor pixels; no doubt some credit is due to the latter, but it couldn't be that much. As it turns out, it's only once you've enabled the option to keep the original image that you'll see an HDR label on your photos. However, there's no way to force it on.

Comparing shots with those taken with an iPhone X reveals the enhanced effect of Smart HDR. In the following photo at dusk, I wanted to see how well the cameras performed in the fading light and also with motion in the scene (the flying sand). The iPhone X image is dark, but you still get a fair bit of detail in the girl's face and legs, which are away from the sun.

The iPhone XS image almost looks as if it was shot using an off-camera flash, likely because the interframes allow highlight retention and motion freezing even as "shutter speeds" become longer.

As another example, you can see Smart HDR on the iPhone XS working in even darker light compared to the iPhone X shot. At this point there's more noise in both images, but it's far more pronounced in the iPhone X photo.

Smart HDR doesn't seem to kick in when shooting in burst mode, or the effect isn't as pronounced. Considering the following photo was captured at 1/1000 second, and the foreground isn't a silhouette, the result isn't bad.

The iPhone XS non-burst image, captured less than a minute after the photo above, is dark but picks up the detail in the sand.

Portrait Mode

The iPhone's Portrait mode is a clever cheat involving a lot of processing power. On the iPhone X and iPhone 8 Plus, Apple used the dual rear cameras to create a depth map to isolate a foreground subject – usually a person, but not limited to people-shaped objects – and then blur the background based on depth. It was a hit-or-miss feature that sometimes created a nice shallow depth-of-field effect, and sometimes resulted in laughable, blurry misfires.

On the iPhone XS and iPhone XS Max, Apple augments the dual cameras with Neural Engine processing to generate better depth maps, including a segmentation mask that improves detail around the edges of the subject. It's still not perfect, and one pro photographer I know immediately called out what he thought was a terrible appearance, but it is improved, and in some cases most people may not recognize that it's all done in software.
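Apple's depth pipeline runs on the Neural Engine and isn't public, but the basic recipe described above – blur the background in proportion to depth, and use a segmentation mask to keep the subject's edges crisp – can be sketched in a few lines. Everything below is an illustrative assumption: the function name, the choice of Gaussian blur, and the premise that a depth map and subject mask already exist as inputs.

```python
# Toy depth-based background blur -- a sketch of the idea behind Portrait
# mode, not Apple's implementation. Assumes a depth map and subject mask
# are already available.
import numpy as np
from scipy.ndimage import gaussian_filter

def portrait_blur(image, depth, subject_mask, max_sigma=6.0):
    """image, depth, subject_mask: (h, w) float arrays in [0, 1].
    depth: 1.0 = far; subject_mask: 1.0 = subject pixel."""
    # Precompute a handful of blur strengths, then pick one per pixel by depth.
    sigmas = np.linspace(0.0, max_sigma, num=5)
    layers = [gaussian_filter(image, s) if s > 0 else image for s in sigmas]
    idx = np.clip((depth * (len(sigmas) - 1)).round().astype(int),
                  0, len(sigmas) - 1)
    blurred = np.choose(idx, layers)
    # The segmentation mask is what keeps edge detail: the subject is
    # composited back in sharp regardless of its depth values.
    return subject_mask * image + (1.0 - subject_mask) * blurred

if __name__ == "__main__":
    h, w = 32, 32
    img = np.random.rand(h, w)
    depth = np.tile(np.linspace(0.0, 1.0, w), (h, 1))  # farther to the right
    mask = np.zeros((h, w))
    mask[8:24, 8:24] = 1.0                             # fake subject region
    print(portrait_blur(img, depth, mask).shape)       # (32, 32)
```

Picking from a few precomputed blur levels is crude next to a true depth-varying lens simulation, but it shows why a bad depth map or mask produces the "blurry misfires" mentioned above: any pixel assigned the wrong depth or mask value gets visibly the wrong treatment.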