Above: A frangipani caterpillar photographed in RAW mode and cropped on a Samsung Note 9. Photo by Mark Lyndersay.
BitDepth#1171 for November 15, 2018
The Note 9 is an amazing communications device, and not just because of what Samsung has built into it over the seven years since it introduced the original Note, then one of the largest form-factor phones on the market.
Looking at the device now, it seems normal, maybe even a bit small compared to its hefty competition. The illusion of its Infinity screen gently curving into the brushed metal back of the device blends it even further into the background.
It’s such a definite, mature design that most protective cases don’t even try to cover that defining arc of glass, leaving it largely unprotected in a fall.
I’d had some concerns about the need to liquid-cool the device, but it hasn’t built up heat to the point of being uncomfortable, not even after I stumbled through a few sessions of Fortnite in search of the elusive skin, mostly dying ghastly and undignified deaths in the rendered landscape.
Over the last three years, I’ve been bouncing between a small prosumer point-and-shoot camera and a smartphone for news coverage, and I keenly anticipated working with the longer lens (56mm isn’t telephoto; it’s closer to “normal”) that Samsung has built into the Note 8, S9 Plus and Note 9 since it began adding more lenses to its smartphones.
Since the introduction of the Note 9, the company has gone further with the A9, which doubles again the number of lenses available on a Samsung smartphone, offering a 24MP main camera, the 56mm ‘telephoto’ lens and a 120-degree ultra-wide lens (equivalent to a 13mm lens on a DSLR). The fourth, a depth lens, gathers spatial information.
When Huawei introduced the twin-lens P9 in April 2016, it struck out firmly in the direction of computational photography, with an emphasis on plenoptics: using the binocular vision of two capture optics to gather data that makes some surprising adjustments possible.
Most of those adjustments have been seen in what Samsung calls Live Focus, the ability to mimic very shallow depth of field using data gathered by a pair of lenses.
The iPhone uses depth mapping to darken backgrounds and to shift focus.
Samsung has dramatically improved its camera capture app, taking some design cues from Leica’s heads-up-display-inspired work on the P9 and removing that irritating option to add cute fuzzy ears from the main interface.
But there’s a problem with the Note 9’s camera system, and it’s probably present in all of its current devices that use two or more lenses.
The company is using all of the computational photography advancements at its disposal to ensure that every picture taken with the phone is awesome, and it largely succeeds at that.
I use Pro mode almost entirely and was surprised to discover that the Note 9 doesn’t use the 56mm lens in any predictable way.
Pro mode on a smartphone camera is an indirect analogue of manual mode on a traditional camera. On a digital camera, manual mode releases all control to the user. On a smartphone, every setting in Pro mode remains automatic until the user expressly switches it to manual control.
It’s a bit unusual for a serious photographer, but understandable from the manufacturer’s perspective: it improves the chances of recording a quality image by ceding control to the device by default until the user specifically overrides a setting.
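That auto-until-overridden model can be sketched in a few lines of code. This is a hypothetical illustration, not Samsung’s implementation; the setting names are invented for the example.

```python
# Hypothetical sketch of smartphone Pro mode: every setting starts
# under automatic control, and only the settings the user explicitly
# overrides leave automatic. A traditional camera's manual mode would
# instead hand every setting to the user at once.

AUTO = "auto"

class ProMode:
    def __init__(self):
        # All settings begin in automatic control (names are illustrative)
        self.settings = {"iso": AUTO, "shutter": AUTO, "white_balance": AUTO}

    def override(self, name, value):
        # Only this one setting leaves automatic; the rest stay on auto
        self.settings[name] = value

cam = ProMode()
cam.override("iso", 400)
print(cam.settings)  # {'iso': 400, 'shutter': 'auto', 'white_balance': 'auto'}
```

The design favours the casual user: forget to set the shutter speed and the camera still picks a sensible one, which is exactly the trade-off the manufacturer is making.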
In that mode, I capture images in RAW format (a direct dump of the sensor data) alongside a JPEG file.
Looking at the capture files, it became clear that when I thought I was switching to the longer lens, I was actually getting digital zoom: a closer view created by cropping and resizing the JPEG file.
Unfortunately, the resizing, particularly at higher ISO settings, is substandard and prone to adding artifacts. I got better results working with the RAW file and cropping the image after the fact.
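The distinction matters because digital zoom manufactures pixels rather than capturing them. A minimal sketch of 2X digital zoom (hypothetical, using simple nearest-neighbour duplication; real pipelines use smarter resampling, which still cannot add detail the sensor never recorded):

```python
# Toy illustration of 2X "digital zoom": crop the central half of the
# frame, then upscale it back to full size by duplicating each pixel.
# The output has the same dimensions but only a quarter of the real
# sensor data, which is why artifacts appear, especially at high ISO.

def digital_zoom_2x(image):
    """image: square 2D list of pixel values; returns a same-size image."""
    n = len(image)
    q = n // 4  # margins that isolate the central half of the frame
    cropped = [row[q:n - q] for row in image[q:n - q]]
    zoomed = []
    for row in cropped:
        wide = [p for p in row for _ in (0, 1)]  # duplicate horizontally
        zoomed.append(wide)
        zoomed.append(list(wide))                # duplicate vertically
    return zoomed

frame = [[r * 8 + c for c in range(8)] for r in range(8)]
out = digital_zoom_2x(frame)
print(len(out), len(out[0]))  # 8 8 -- same size, half the real resolution per axis
```

Cropping the RAW file after the fact keeps every captured pixel at its original fidelity, which is why it produced better results than the in-camera resize.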
I tried switching between the f1.5 and f2.4 apertures, and setting the camera to 2X in Auto mode before switching to Pro mode (the setting sticks, but you still get a cropped and resized JPEG).
Eventually, I blocked the main camera lens to see exactly when the 56mm lens is activated.
Zooming to 2X (swipe the shutter button left or right) in Auto and Pro modes didn’t trigger it; I finally saw an image appear through the 56mm lens only in Live Focus mode while zooming in.
In any other mode on the camera, this behaviour would be perfectly fine; I’d actually expected it to work that way in Live Focus mode. But in Pro mode, it’s reasonable to expect that the device will give full control to the user.
It does not.
There doesn’t seem to be any good reason for this decision. Most users, even those with cocky aspirations, won’t spend much time in Pro mode, but those who do expect the device to respond to their selected settings.
It’s something that could probably be fixed easily in software. But for that to happen, Samsung needs to be a lot clearer about what Pro mode means to an admittedly small slice of its users, and perhaps consult its more savvy customers about what they expect the device to do in Pro image capture mode.