
Inside Google’s Pixel 4 photography: Overhauled portrait mode, dual cameras and more



Google Pixel phones offer a portrait mode that artificially blurs the background to focus on a subject. The Pixel 4’s dual cameras bring improvements. (Photo: Stephen Shankland/CNET)
Over the last three years, Google’s Pixel phones have earned a well-deserved reputation for photographic strength. With the Pixel 4, the company is flexing new camera hardware and software muscles.

The new flagship Android smartphone, which the search giant unveiled on Tuesday, gets a second 12-megapixel camera, a key component in an overhauled portrait mode that focuses your attention on the subject by artificially blurring the background. The new portrait mode works more accurately and now handles more subjects and more compositional styles.

The additional camera, a feature Google itself leaked, is just one of the advances in the Pixel 4, which boasts a host of new abilities that stem from the company’s prowess in computational photography technology. Other new features include better zooming, live-view HDR+ for fine-tuning your shots and the extension of Night Sight to astrophotography. 

The new features are the surest way Google can stand out in the ruthless, crowded smartphone market. Google knows a lot is riding on the phones. They’re a blip in the marketplace compared with models from smartphone superpowers Samsung and Apple. Google improved its prospects with the low-priced Pixel 3A in June. But to succeed, Google also needs better alliances with carriers and other retail partners that can steer customers to a Pixel over a Samsung Galaxy.

Improving photography is something Google can do on its own, and photography is important. We’re taking more and more photos as we record our lives and share moments with friends. No wonder Google employs a handful of full-time professional photographers to evaluate its products. So I sat down with the Pixel 4’s camera leaders — Google distinguished engineer Marc Levoy and Pixel camera product manager Isaac Reynolds — to learn how the phone takes advantage of all the new technology.

Two ways to see three dimensions

To distinguish a close subject from a distant background, the Pixel 4’s portrait mode sees in 3D using an approach that borrows from our own stereoscopic vision. Humans perceive depth by comparing the slightly different views from our two eyes.

The Pixel 4 makes two such comparisons: across a short 1mm gap from one side of its tiny lens to the other, and across a longer gap, about 10 times that, between the two cameras. These two baselines of different lengths, an industry first, let the camera judge depth for both close and distant subjects.

“You get to use the best of each. When one is weak, the other one kicks in,” Reynolds said.

Those two gaps are oriented perpendicularly, too, which means one method can judge up-down differences while the other judges left-right differences. That should improve 3D accuracy, especially with things like fences with lots of vertical lines.
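The geometry behind this is the classic stereo relation: depth equals focal length times baseline, divided by the parallax (disparity) between the two views. A minimal sketch with hypothetical numbers (not Pixel 4 calibration data) shows why the wider camera-to-camera gap helps with distant subjects:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic stereo relation: depth = focal_length * baseline / disparity.
    A wider baseline produces more pixels of parallax at the same depth,
    so distant subjects become measurable; a narrow baseline still works
    for nearby subjects. (Illustrative values only.)"""
    return focal_px * baseline_mm / disparity_px

focal_px = 3000.0  # focal length expressed in pixel units (hypothetical)

# For a subject 2 m (2000 mm) away, the narrow 1 mm baseline yields only
# ~1.5 px of parallax, while a ~10 mm dual-camera gap yields ~15 px,
# which is far easier to measure reliably.
print(depth_from_disparity(focal_px, 1.0, 1.5))    # 2000.0 mm
print(depth_from_disparity(focal_px, 10.0, 15.0))  # 2000.0 mm
```

The same 2-meter depth produces ten times the parallax on the wider baseline, which is why "when one is weak, the other one kicks in."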

Levoy, sitting at Google’s Mountain View, California, headquarters, flipped through photos on his MacBook Pro to show results. In one shot, a motorcycle in its full mechanical glory spans the full width of a shot. In another, a man stands far enough from the camera that you can see him head to toe. The smoothly blurred background in both shots would have been impossible with the Pixel 3 portrait mode.

Continuous zoom

Google wants you to think of the Pixel 4’s dual cameras as a single unit with a traditional camera’s continuous zoom flexibility. The telephoto camera’s focal length is 1.85X that of the main camera, but the Pixel 4 will digitally zoom up to 3X with quality comparable to optical zoom.

That’s because of Google’s technology called Super Res Zoom that cleverly transforms shaky hands from a problem into an asset. Small wobbles let the camera collect more detailed scene data so the phone can magnify the photo better.

“I regularly use it up to 4X, 5X or 6X and don’t even think about it,” Levoy said.
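The multi-frame idea underneath this is shift-and-add super-resolution: because hand shake offsets each frame by a sub-pixel amount, several low-resolution frames can be placed onto a finer grid and averaged. A toy one-dimensional sketch (a simplification of the general technique, not Google’s actual Super Res Zoom pipeline) with perfectly known offsets:

```python
import numpy as np

def shift_and_add(frames, offsets, upscale):
    """Toy shift-and-add super-resolution: place each low-res frame onto a
    finer grid at its known sub-pixel offset and average the contributions.
    (A sketch of the general multi-frame idea, not Google's pipeline.)"""
    n = len(frames[0])
    hi = np.zeros(n * upscale)
    counts = np.zeros(n * upscale)
    for frame, off in zip(frames, offsets):
        idx = np.arange(n) * upscale + off  # sub-pixel slot for this frame
        hi[idx] += frame
        counts[idx] += 1
    mask = counts > 0
    hi[mask] /= counts[mask]
    return hi

# A "scene" at 2x resolution, and two hand-shake frames offset by half a pixel.
scene = np.array([0., 1., 2., 3., 4., 5., 6., 7.])
frames = [scene[0::2], scene[1::2]]  # even samples, odd samples
restored = shift_and_add(frames, offsets=[0, 1], upscale=2)
print(restored)  # recovers the full 2x-resolution scene
```

In the real system the offsets come from uncontrolled hand wobble and must be estimated per frame, which is the hard part; but this is the sense in which shake becomes an asset rather than a problem.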

The Pixel 4’s Super Res Zoom uses processing tricks to zoom beyond its camera’s optical abilities. (Image: Google)

HDR+ view as you compose photos

HDR+ is Google’s high dynamic range technology to capture details in both bright and dark areas. It works by blending up to nine heavily underexposed shots taken in rapid succession into a single photo — a computationally intense process that until now took place only after the photo was taken. The Pixel 4, however, applies HDR+ to the scene you see as you’re composing a photo.

That gives you a better idea of what you’ll get so you don’t need to worry about tapping on the screen to set exposure, Levoy said.
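The merge step can be sketched in a few lines: averaging a burst of deliberately underexposed frames suppresses noise without clipping highlights, and a tone curve then lifts the shadows. This is a heavily simplified model of the published HDR+ idea, not Google’s actual pipeline:

```python
import numpy as np

def hdr_merge(burst, gamma=0.5):
    """Toy HDR+-style merge: average a burst of underexposed frames
    (averaging cancels random noise while underexposure preserves
    highlights), then apply a tone curve to brighten dark regions.
    Simplified sketch, not Google's pipeline."""
    merged = np.mean(burst, axis=0)            # noise shrinks ~ 1/sqrt(N)
    return np.clip(merged, 0.0, 1.0) ** gamma  # gamma < 1 lifts shadows

rng = np.random.default_rng(0)
true_scene = np.full(1000, 0.1)  # a dark, underexposed scene
burst = np.array([true_scene + rng.normal(0, 0.05, 1000) for _ in range(9)])
result = hdr_merge(burst)
print(result.mean())  # shadows lifted toward roughly sqrt(0.1) ~ 0.32
```

Running this whole merge at preview time, on every viewfinder frame, is what makes "live" HDR+ computationally demanding.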

Separate camera controls for bright and dark

Live HDR+ lets Google offer better camera controls. Instead of just a single exposure slider to brighten or darken the photo, the Pixel 4 offers separate sliders for bright and dark regions.

That means you can show a shadowed face in the foreground without worrying you’ll wash out the sky behind. Or you can show details both on a white wedding dress and a dark tuxedo.

The dual-control approach is unique, and not just among smartphones, Levoy said. “There’s no camera that’s got live control over two variables of exposure like that.”
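One way to model two independent exposure controls (an illustrative guess at the behavior; Google has not published its exact tone-mapping math) is a global brightness gain plus a gamma-style shadow lift that mostly affects dark pixels:

```python
import numpy as np

def dual_exposure(img, brightness=1.0, shadows=1.0):
    """Toy model of separate bright/dark sliders: 'brightness' scales the
    whole image, while 'shadows' applies a gamma lift that strongly affects
    dark pixels and leaves highlights nearly unchanged. (Illustrative model
    only, not Google's actual tone mapping.)"""
    lifted = np.clip(img, 0.0, 1.0) ** (1.0 / shadows)  # shadows > 1 lifts dark tones
    return np.clip(lifted * brightness, 0.0, 1.0)

img = np.array([0.05, 0.5, 0.95])  # shadow, midtone, highlight pixels
out = dual_exposure(img, brightness=1.0, shadows=2.0)
print(out)  # the shadow pixel more than quadruples; the highlight barely moves
```

This separation is what lets a shadowed face brighten without washing out the sky behind it.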


Shoot the stars with astrophotography

In 2018, Google extended HDR+ with Night Sight, a path-breaking ability to shoot in dim restaurants and on urban streets by night. On a clear night, the Pixel 4 can go a step further with a special astrophotography mode for stars.

The phone takes 16 quarter-minute shots for a 4-minute total exposure time, reduces sensor noise, then marries the images together into one shot.
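The arithmetic works out because averaging N frames preserves the signal while shrinking random sensor noise by roughly the square root of N, so sixteen 15-second frames behave like one 4-minute exposure without star-trail smearing. A minimal simulation of the stacking step (a sketch of the principle, not Google’s implementation):

```python
import numpy as np

def stack_exposures(frames):
    """Average a stack of short exposures: the signal is preserved while
    random sensor noise shrinks by roughly sqrt(N). (Sketch of the
    principle only; real stacking must also align the frames.)"""
    return np.mean(frames, axis=0)

rng = np.random.default_rng(42)
star_field = np.zeros(10_000)
star_field[::500] = 1.0  # faint stars on a black sky
frames = np.array([star_field + rng.normal(0, 0.2, star_field.size)
                   for _ in range(16)])

single_noise = np.std(frames[0] - star_field)
stacked_noise = np.std(stack_exposures(frames) - star_field)
print(f"noise: {single_noise:.3f} -> {stacked_noise:.3f}")  # roughly 4x lower
```

Keeping each exposure to a quarter minute also keeps the stars point-like, since the sky rotates noticeably over a true 4-minute exposure.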

The Pixel 4’s Night Sight mode can photograph the Milky Way and individual stars, if the sky is clear enough. (Image: Google)

AI color correction

Digital cameras try to compensate for color casts, like blue shade, yellow streetlights and orange candlelight, that can mar photos. The Pixel 4 now makes this adjustment, called white balance, based in part on AI software trained on countless real-world photos.

Levoy showed me an example where it makes a difference, a photo of a woman whose face had natural skin tones even though she stood in a richly blue ice cave.
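For contrast, the kind of hand-tuned heuristic that learned white balance improves on is the classic “gray world” rule: assume the scene averages to neutral gray and scale each channel to match. A sketch of that baseline (not Google’s trained model), applied to a gray wall under a strong blue cast:

```python
import numpy as np

def gray_world_white_balance(img):
    """Classic 'gray world' white balance: assume the scene averages to
    neutral gray and apply per-channel gains to make it so. This heuristic
    fails in scenes that really are dominated by one color (like an ice
    cave), which is where a learned model helps. (Baseline sketch only.)"""
    means = img.reshape(-1, 3).mean(axis=0)  # per-channel average
    gains = means.mean() / means             # push each channel toward gray
    return np.clip(img * gains, 0.0, 1.0)

# A neutral gray wall photographed under a heavy blue color cast.
wall = np.full((4, 4, 3), [0.3, 0.3, 0.6])
corrected = gray_world_white_balance(wall)
print(corrected[0, 0])  # roughly equal R, G and B after correction
```

The ice-cave example is exactly where this heuristic breaks: the scene really is blue, so the gray-world assumption would over-correct skin tones, while a model trained on real photos can tell the cast from the content.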

All these features represent a massive investment in computational photography — one Apple is mirroring with its own Night Mode, Smart HDR and Deep Fusion. Google has to “run faster and breathe deeper in order to stay ahead,” Levoy acknowledged.

But Apple also brings more attention to Google’s work. “If Apple follows us, that’s a form of flattery.”
