In Questions: Google Pixel Camera Q&A with Brian Rakowski & Tim Knight



Here at FoneArena, we are big fans of the Pixel camera and all that Google has been able to accomplish using a mixture of computational photography and machine learning. That’s not to say that the camera is perfect of course. We recently got a chance to sit down for a group session with Brian Rakowski and Tim Knight from Google.

Brian is VP of Product Management at Google, while Tim leads the camera efforts for the Pixel lineup. Tim was previously Director of Engineering at Lytro, the company behind the Illum light field camera that lets you shift focus in post. Over a half-hour discussion, we managed to get answers to a lot of questions that we and the community at large have been trying to answer. Read on for our full interview after the jump.

Q: Hi Brian, from what we’ve seen so far and what we’ve been hearing, the focus seems to be on still image quality, as was also the case with the Pixel 1. OIS seems to have gotten a lot of attention, but going by reports and our own testing, video quality seems to be a bit soft, and the bitrate isn’t as high as the competition’s. Audio recording is noticeably worse than the competition and even the Pixel 1, especially in high ambient noise scenarios like a concert. On video recording specifically, is there any scope for improvements and an increased bitrate?

Brian: On some of the things you’ve mentioned, we’ve noticed a few scenarios where our tuning hasn’t been perfect, and we’re working on fixes for those. On audio, we’ve observed a few instances where our noise suppression algorithms are too aggressive. We’ll be tuning and fixing those in updates.

Tim: I’d say video quality is very important to us, and there’s a very challenging balance to strike between dynamic range, noise, detail, bitrate, artefacts and a lot of other parameters, some of which are at odds with each other. We have a real battery of objective and subjective tests that we put videos through, and we believe that the overall tradeoff between noise and sharpness places us at the top of video quality. That’s not to say that every single aspect is perfect. There are always things to improve, and in future updates there can be tweaks to fix little things here and there as they are discovered, but as far as we are aware, what we have today takes pretty great videos even in challenging conditions. In terms of bitrates, for 4K video we have identified that the bitrate is something we are looking at. For 1080p video, the bitrate is good enough in our opinion.

Q: Could you comment further on the audio recording?

Brian: On the audio situation, there is one particular case on the Pixel XL, which we have confirmed, where the noise suppression algorithms interact badly with background noise. We’ve developed a software fix for that, which will be rolling out in the next couple of weeks. That should make the experience much better on the Pixel XL where background noise has been interfering with the noise suppression.

Q: Also, while we’re on the topic of audio recording: is there any possibility of adding support for external microphones in the camera app?

Brian: I’m not aware of any plans to do so yet. We’ll certainly take the request under consideration back to the team. I’m assuming you are referring to peripheral microphone support or a pro audio use case. I don’t think we have any plans at the moment, but I’ll certainly add it to the list.

Q: Low light video at 4K is quite noisy, and the camera often tries to brighten up the image too much. Any fixes in the works to reduce noise?

Brian: The first thing you mentioned, where brightening up the image seems to increase noise, is something we discussed internally as we tuned the camera. There’s a very clear tradeoff. On one hand, the user likes bright video where they can see everything that’s happening, but if you brighten the image to make things more visible, you also brighten the noise, so that’s a very obvious tradeoff that we think a lot about, and we’ve tried to strike a balance there. If you compare the Pixel to some other cameras in the category, you’ll notice that we’re brighter. It’s very easy to make the noise go away by making the image darker; we’ve decided that we’d rather let the user see more of the scene by making it brighter, even if there’s more noise. On 4K specifically versus 1080p: with 4K, there’s a lot more data flowing through the system. 1080p is 2MP and 4K is 8MP, so there’s 4 times the amount of data. What this means is that within a fixed budget of power consumption, CPU resources or whatever the constraint might be, 4K generally has fewer resources available per pixel; with 1080p, there’s more headroom to do heavier-weight processing. You’ll notice that 1080p video is a little bit less noisy than 4K video, and that’s part of the reason. Also, since we’re downscaling from the original 12MP sensor to a 2MP 1080p frame, there’s an averaging effect that further reduces noise. So there are a few different factors at play here, but I think if you compare the Pixel 2 against other phones, you’ll notice that the noise is not higher than other devices once you factor in the brightness.

Tim: It obviously lags behind prosumer cameras but compared to other smartphones, it does quite well.

Brian: DXO does a lot of objective tests, covering SNR, dynamic range and more, and they gave the Pixel 2 the highest score. So I think objectively, in a head-to-head battle, our video is good.
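
Brian’s 4K-versus-1080p point is easy to quantify. Here’s a quick back-of-the-envelope sketch in Kotlin (our arithmetic based on the figures he cites, not Google’s pipeline) of the per-frame data gap and the noise reduction you get from averaging when downscaling a 12MP sensor to 1080p:

```kotlin
import kotlin.math.sqrt

fun main() {
    // Approximate pixel counts per frame.
    val uhd = 3840 * 2160          // 4K: ~8.3 MP
    val fhd = 1920 * 1080          // 1080p: ~2.1 MP
    println("4K pushes %.1fx the pixels of 1080p per frame".format(uhd.toDouble() / fhd))

    // Downscaling the 12 MP sensor output to a ~2 MP 1080p frame averages
    // several sensor pixels into each output pixel; for independent noise,
    // averaging n samples reduces noise by roughly sqrt(n).
    val n = 12_000_000.0 / fhd
    println("~%.1f sensor pixels per 1080p pixel -> noise down ~%.1fx".format(n, sqrt(n)))
}
```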

Q: 4K, 60FPS?

Brian: 4K at 60fps is not something we’re going to bring to the Pixel 2. We’ll certainly consider it for future products, but on the Pixel 2, 4K at 30fps and 1080p at 60fps is what we’re going to support.

Q: Will AR stickers come to the Pixel 1 as well, or only to the Pixel 2?

Brian: If I remember correctly, and I’ll have to double-check this, AR stickers should be available on both the Pixel 1 and the Pixel 2. The difference is that the Pixel 2 is factory calibrated for AR stickers, which means we can make sure that they stick well to the right surfaces and that surfaces are detected correctly and more resiliently. I’ll double-check, but we’ll be bringing it to both the Pixel 1 and the Pixel 2; the quality should just be a bit better on the 2.

Q: Is the Visual Core chip going to be used for anything other than faster HDR and giving other apps access to similar HDR processing?

Brian: The Visual Core, which we will be turning on in the coming weeks, will primarily be for 3rd-party apps. The cool thing about it is that it gives pretty good performance in a default capture scenario, so when 3rd parties use the camera APIs, they’ll be able to get those high-quality HDR-processed images. We’re really looking forward to seeing what they do with it. It turns out we already do pretty sophisticated processing, optimising and tuning in the camera app itself to get the maximum performance possible; we do ZSL and fast buffering to get fast HDR capture, so we don’t take advantage of the Pixel Visual Core there because we don’t need to. That means you won’t see changes in the pictures captured from the default camera app in the coming weeks. What you’ll see is that pictures taken in 3rd-party apps will get significantly better as they start benefiting from some of the HDR processing. We’re pretty excited about what it’ll deliver, and we’re looking forward to all your favourite apps that use the camera taking much better pictures as a result.
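
The interesting implication of Brian’s answer is that developers don’t do anything Visual Core-specific: an app simply captures through Android’s standard camera APIs, and once the chip is enabled, the HDR+ processing comes along for free. Here is a minimal Camera2 sketch of such an ordinary still-capture request (standard Android API; the transparent HDR+ behaviour is our reading of the answer above, not something this code requests explicitly):

```kotlin
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CaptureRequest
import android.view.Surface

// Nothing below is Visual Core-specific: the app makes a normal
// still-capture request via Camera2, and (per the answer above) HDR+
// processing is applied transparently once the chip is enabled.
fun buildStillCapture(device: CameraDevice, target: Surface): CaptureRequest =
    device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE).apply {
        addTarget(target) // e.g. an ImageReader surface configured for JPEG
    }.build()
```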

Q: Any enhancements to the panorama mode lined up?

Brian: The panorama mode takes pretty decent shots, but we haven’t updated it recently. The team has been focussed more on core image quality and on making sure that our shots and videos look as great as possible. In terms of usability, panorama mode hasn’t been a priority for us, so there’s nothing new to announce there, but it is something we can take a look at for future updates.

Q: Any work being done on manual controls?

Brian + Tim: This is something we actually think a lot about. If you think about DSLRs, they have certain manual controls, ISO, aperture and so on, that people are very comfortable with. The way people use DSLRs, especially pros, they’ve really come to learn and master these tools, and that makes a lot of sense for the medium: you’ve got a camera with a physical aperture that you can dial down and adjust. Some camera apps put basically the same sliders in the camera experience, and honestly, I’m not convinced that this is a good interface for manual controls on a smartphone camera. For example, with HDR+ we do a lot of sophisticated processing, using 5 to 10 frames, doing a lot of tone mapping and many other things. You can’t express all of that as a slider. If we gave users an interface with an ISO slider or an exposure slider and it turned off HDR+, that would be an awful feature, as image quality would plummet. I think finding ways to give users control over composition, which is the purpose of manual controls, is a very important thing to figure out. I don’t think the answer is to take the same sliders as an SLR and put them in the camera app; the medium is different, and the kind of controls that make sense on a smartphone are different. That’s my philosophy about it. It’s something we think a lot about, and in future software updates for the Pixel line there might be some additional controls that we add as we figure out what the right interface model is, but at the moment, don’t expect to see an ISO slider appear any time soon.

Q: Even if you don’t have manual controls, RAW capture lets you get so much more data, and that’s important for photographers. With HDR+ we’re at the mercy of what the algorithms allow, and with a JPEG file we can’t do much to make adjustments in post-processing. Why not add a RAW capture mode?

Tim: Thank you for the suggestion. Definitely something we’re looking at. We’ve had similar feedback from other sources too. We don’t have anything to announce today but we’re definitely looking into it.
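
Worth noting while we wait: Android’s standard Camera2 API already lets a 3rd-party app query whether the hardware exposes RAW capture, independent of what the stock camera app offers. A minimal capability check, as an illustration (standard Android API, not anything announced in this interview):

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Returns true if the given camera advertises RAW capture support.
// A 3rd-party app could pair this with an ImageReader using
// ImageFormat.RAW_SENSOR to save DNG files.
fun supportsRaw(context: Context, cameraId: String): Boolean {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val caps = manager.getCameraCharacteristics(cameraId)
        .get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
    return caps?.contains(
        CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_RAW) == true
}
```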

Q: With the current implementation of the camera app, long exposure photography isn’t really possible. Even with a 3rd-party camera app, exposure is capped at about 3 seconds or so. Is that something that’s simply not a priority at Google?

Brian: I’d say the priority is to focus on the kind of photos that our consumers want to take, so we don’t have any preset notions about the maximum exposure time we should support. If it turns out that a lot of users want a long exposure mode and to put their phone on a tripod, we’ll certainly look into that, but from our data and user studies, we don’t think many users want to put their phone on a tripod and take long exposures. That’s more or less a DSLR use case; not many people even carry a tripod with them all the time. So it’s not something we have prioritised, though it’s certainly something we can consider. It takes us back to the earlier question: manual controls from a DSLR may not translate very well to a smartphone. It’s a different medium, and that’s our higher-level philosophy. As for the 2 or 3 second limit, I don’t think that was a specific number we chose; going by the tuning we had and the components we used, that number made sense. If there is user feedback that this needs to be longer, we can certainly consider that for a future update.

Q: The default white balance on the Pixel 2 seems to be on the cooler side compared to the Pixel 1.

Brian + Tim: On both the Pixel 1 and the Pixel 2, we tried to get the white balance to be very accurate. We have identified that on the Pixel 2, skin tones can occasionally be on the cooler side in terms of color rendition, and that’s something we already have a fix in the pipeline for. It’ll make the white balance look more natural. It’s not a common or large-scale issue; it’s just that in some particular scenes there can be a bit of coolness to the skin tones.

Q: Can we also expect a fix for the iffy flicker reduction under certain LED lights?

Brian: Yes. The camera certainly does have flicker detection and correction; we already had that on the Pixel 1, and almost every phone camera has it. We did identify a software bug where, in certain conditions and scenarios, detection doesn’t work properly. We’ll have a fix for that out very soon.

Q: The lens on the Pixel 2 has a 24 or 25mm equivalent focal length, but in portrait mode this is closer to 38mm. With a 12MP sensor, cropping from roughly 25mm to 38mm means upscaling is involved. How are you countering the possible image quality loss?

Tim: We do have a pretty sophisticated upscaling technique that is being used for the first time on the Pixel 2. Basically, we have a machine-learning-trained upscaling filter, an upscaler that is better at preserving sharp edges without over-sharpening. It’s something we have also used in digital zoom in general, not just portrait mode. I don’t think we’ve made any public announcements around this, so I’m not sure how much I’m allowed to talk about it.
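
To put rough numbers on the question (our arithmetic using the ~25mm and ~38mm figures cited above, not Google’s): the portrait crop leaves roughly 5MP that the ML upscaler then has to restore to full resolution.

```kotlin
import kotlin.math.pow

fun main() {
    // Figures taken from the question (assumed): ~25mm main lens,
    // ~38mm equivalent framing in portrait mode.
    val cropFactor = 38.0 / 25.0                 // ~1.5x tighter framing
    val croppedMp = 12.2 / cropFactor.pow(2)     // ~5.3 MP survives the crop
    println("A %.2fx crop leaves ~%.1f MP for the ML upscaler to restore to 12.2 MP"
        .format(cropFactor, croppedMp))
}
```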

Q: The pixel size has dropped from 1.55 microns to 1.4 microns on the Pixel 2. What kind of effect does this have on low light performance, and what has been done to counter it?

The pixel size reduction is countered by an aperture increase: the f/2.0 lens became an f/1.8 lens, which compensates for the reduced pixel size.
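
That compensation is easy to sanity-check: light gathered per pixel scales with pixel area and inversely with the square of the f-number. A quick back-of-the-envelope calculation (our numbers, using the figures from the question):

```kotlin
import kotlin.math.pow

fun main() {
    // Light per pixel scales with pixel area / f-number^2.
    val areaRatio = (1.4 / 1.55).pow(2)     // smaller pixels gather ~0.82x the light
    val apertureRatio = (2.0 / 1.8).pow(2)  // f/1.8 passes ~1.23x the light of f/2.0
    println("Net light per pixel: %.2fx".format(areaRatio * apertureRatio))
    // Prints ~1.01x: the faster lens almost exactly offsets the smaller pixels.
}
```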

Q: If you look at the approach that other manufacturers like Apple and Samsung are taking, they are embracing creators like photographers and videographers by giving them all sorts of manual controls and tools. A lot of filmmakers are now using the iPhone to make short films and more. Google’s ideology seems to be more focused on the average consumer who just wants a no-hassle point-and-shoot experience. Is this assumption correct?

Tim: I’m certainly no expert on the iPhone space, but from what I understand, there are some 3rd-party video apps that are pretty popular for capturing content and exposing controls more along the lines of what a pro videographer would like to use. I think some of those are also available on Android; I haven’t tested them to see whether they’re good enough, but in theory, where the same app is available on both iPhone and Android, the same video experience could run on the Pixel 2. I’m not aware whether the default iPhone app has all these advanced controls, but then again, I’m not very up to date on the iPhone feature set.

Brian: Our team is very focussed on metrics and on seeing what our consumers are doing and actually using our phones for; we are working on matching the use cases our consumers actually use their phones for. We’re really proud of the results we’ve gotten from our camera, and our users are happy as well. As for some of the pro mode functionality, I wouldn’t say it’s something we don’t care about; it’s more a question of priority for us. We’ve been focussed on delivering the best default experience first. Coming in on our second year, I think we’ve done a great job, and we can always add more controls over time. You can already do quite a bit, and we’ve seen some impressive results from some pro photographers. The most important thing for us is to make our customers really happy. It’s early days, but we’re very happy with what we’ve done. It’s not important to us how many phones we’ve sold but how happy our customers are with the product, so we’re really focussed on making sure that our customers, the early adopters, are thrilled with the devices. The feedback from those early adopters is going to shape and influence the features of the camera and the phone.

Note: Minor edits for cohesiveness. Additional questions posed by other journalists have been included for the benefit of the readers.


Author: Dhruv Bhutani

Your friendly neighborhood techie. Currently using a Pixel 2 XL. Catch him on Twitter (@DhruvBhutani) or Facebook.