Wednesday, October 16, 2019

Can the Pixel 4 win back Google’s camera crown? - The Verge

After Apple’s iPhone 11 event, I noted that while the company was catching up on features like ultrawide and night mode, it was unclear whether it’d be able to get on the Pixel’s level in terms of basic image quality. Over the course of our review process, it became clear that Apple had indeed achieved that. Apple says it has a class-leading camera every year, but this time it actually does.

The next question, then, is how big a leap will come with Google’s new Pixel 4. We can’t answer that yet, just as we didn’t know how good the iPhone 11 was the day after its announcement. (You can see some quick comparisons here, but stay tuned for the full review.) We can, however, take a lot from what Google did — and didn’t — have to say on stage yesterday.

“We didn’t forget about the camera,” Google’s Sabrina Ellis said near the end of the presentation. “With Pixel 4 we’re raising the bar yet again, and it all starts with this little square.” “Little” is perhaps a charitable description of the Pixel 4’s conspicuous camera bump, but that bump does of course house what should amount to the biggest change to the Pixel camera.

Let’s just get this out of the way: it’s weird that Google went for a telephoto lens as its second option. “Wide angle can be fun, but we think telephoto is more important,” computational photography lead Marc Levoy said on stage. That’s not an unreasonable position — Apple certainly agreed during the last three years it put out dual-camera phones before switching to ultrawide with the iPhone 11. But Google spent a lot of time last year touting its Super Res Zoom feature that uses multi-frame algorithms to improve the quality of traditional digital zoom. It wasn’t better than an optical telephoto lens, of course, but it was better than nothing, and Google is continuing to use it for the Pixel 4’s extended zoom range.
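To get a feel for why merging a burst beats a single shot, here’s a toy align-and-average sketch in Python. It’s a minimal illustration of the multi-frame idea, not Google’s pipeline; the real Super Res Zoom also exploits sub-pixel offsets from hand shake to recover detail, while this sketch only shows the noise-reduction half.

    import numpy as np

    rng = np.random.default_rng(0)
    scene = rng.random((64, 64))              # stand-in for the true scene

    def capture(scene, dy, dx, noise=0.05):
        """Simulate one burst frame: a shifted view of the scene plus sensor noise."""
        frame = np.roll(scene, (dy, dx), axis=(0, 1))  # wrap-around shift keeps the toy simple
        return frame + rng.normal(0, noise, scene.shape)

    # Four frames with known one-pixel offsets; a real camera estimates
    # these by aligning the frames against each other.
    offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]
    frames = [capture(scene, dy, dx) for dy, dx in offsets]

    # Merge: undo each frame's shift, then average the aligned stack.
    aligned = [np.roll(f, (-dy, -dx), axis=(0, 1))
               for f, (dy, dx) in zip(frames, offsets)]
    merged = np.mean(aligned, axis=0)

    # Noise falls by roughly sqrt(number of frames): ~0.04 -> ~0.02 here.
    print("single-frame error:", np.abs(aligned[0] - scene).mean())
    print("merged error:      ", np.abs(merged - scene).mean())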

An ultrawide lens, on the other hand, can’t be faked in software. The reason to include one is that it’s the only way to achieve that perspective. Why, when finally deciding to add a second lens after years of insisting it wasn’t necessary, would Google choose glass that solves a problem it already had a passable solution for, instead of something that makes entirely new types of photos possible?

Or, and hear me out here, why not just add an ultrawide and a telephoto? These are expensive phones. That is a big camera bump. Every single one of Google’s competitors in the premium market now sells phones with triple-camera setups — it’s not that exotic a feature any more. I’m sure we’ll see it on the Pixel 5.

Overall, Google had almost nothing to say about the Pixel 4’s camera hardware on stage beyond the acknowledgement of the second lens. It turns out the main lens has a slightly wider aperture, moving to f/1.7 from f/1.8, while the sensor remains 12 megapixels and is presumably the same size as before.
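As a back-of-envelope check (our arithmetic, not a Google claim), light gathered scales with the inverse square of the f-number, so the change works out to

    (1.8 / 1.7)^2 ≈ 1.12

or roughly 12 percent more light per frame. A modest bump on its own, though every frame in an HDR+ burst gets it.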

“But the hardware isn’t what makes our camera so much better,” Ellis went on. “The special sauce that makes our Pixel camera unique is computational photography.” That is certainly true, and what followed was an engaging presentation from Levoy on how the Pixel works its magic. Or, as he described it in a wry swipe at Apple: “It’s not mad science, it’s just simple physics.”

After explaining the basic principles of HDR+, Levoy detailed four new computational additions to the Pixel 4 camera. The first was Live HDR+, which uses machine learning to calculate HDR+ in real time so you can see it in the viewfinder. This also allows you to control the camera’s exposure with sliders for brightness and shadows, which could make it a lot easier to lock in the results you want.
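To make the two-slider idea concrete, here’s a minimal Python sketch of one plausible split between the controls. This is our illustration, not Google’s published tone-mapping: brightness acts as a global gain, while shadows applies a gamma-style curve that lifts dark tones far more than highlights.

    import numpy as np

    def dual_exposure(image, brightness=1.0, shadows=1.0):
        """image: float array scaled to [0, 1].
        brightness multiplies every pixel equally; shadows < 1 lifts
        dark tones (gamma compression) while barely touching highlights."""
        out = np.clip(image * brightness, 0.0, 1.0)
        return out ** shadows

    tones = np.array([0.05, 0.25, 0.50, 0.75, 1.00])
    print(dual_exposure(tones, shadows=0.6))
    # the 0.05 shadow tone jumps to ~0.17 while pure white stays at 1.00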

Elsewhere, the Pixel 4 now uses machine learning for white balance in all modes, instead of just Night Sight — the examples Levoy used included a snow scene, which trips up traditional cameras all the time. The Pixel 4 portrait mode uses the telephoto lens to create a better depth map and work with a wider range of subjects. And Night Sight now has an astrophotography function that merges multiple frames captured over a four-minute exposure to produce pin-sharp photos of stars.
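The reason to break four minutes into shorter frames rather than hold the shutter open is that the sky rotates: a single four-minute exposure would smear stars into trails, while short frames can be aligned and then averaged. Here’s a toy numpy sketch of the stacking math, with the frames already registered and the 16-frame split chosen purely for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    sky = np.zeros(1000)
    sky[::100] = 1.0                   # a handful of "stars" on a black sky

    n_frames = 16                      # e.g. 16 x 15 s = 4 min (illustrative split)
    frames = [sky + rng.normal(0, 0.5, sky.size) for _ in range(n_frames)]
    stacked = np.mean(frames, axis=0)  # real frames would be aligned first

    def snr(img):
        """Star brightness relative to the noise in the empty-sky pixels."""
        return img[sky == 1].mean() / img[sky == 0].std()

    print(f"single-frame SNR: {snr(frames[0]):.1f}")  # ~2: stars drown in noise
    print(f"stacked SNR:      {snr(stacked):.1f}")    # ~8: sqrt(16) = 4x better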

All of these features sound cool, and I could watch Marc Levoy talk about HDR algorithms for hours. But it’s hard to know whether the announcements will add up to meaningful, noticeable improvements in pure image quality for the types of photos we take every day. I’m looking forward to taking pictures of stars on my phone, but I don’t know if it’s a selling point. What matters is the extent to which Google has managed to push its core photographic results forward.

This is basically what I thought about the iPhone 11 after its launch, of course, and it turned out to be the biggest leap for Apple in several years despite what appeared to be near-identical hardware. We won’t know how good the Pixel 4 camera really is until we’ve spent more time with it. But the Pixel 3 camera is still one of the best in the world, so there’s no reason to expect anything short of greatness.

We just wonder about that missing ultrawide.



https://www.theverge.com/2019/10/16/20916938/pixel-4-camera-vs-iphone-11-pro
