THE NOTE 9 is an amazing communications device. Not just because of what Samsung has built into it over the seven years since the original Note was introduced, then one of the largest form-factor phones on the market.
Looked at now, the device just seems normal, maybe even a bit small compared to its hefty competition, the illusion of its Infinity screen gently curving into the brushed-metal back blending it even further into the background.
Over the last three years, I’ve been bouncing between a small prosumer point-and-shoot camera and a smartphone for news coverage, and since Samsung began adding more lenses to its smartphones, I’ve keenly anticipated working with the longer lens built into the Note 8, S9 Plus and Note 9.
Since the introduction of the Note 9, the company has gone further with the A9, which doubles again the number of lenses on a Samsung smartphone: a 24MP main camera, the 56mm telephoto lens, a 120-degree ultra-wide lens (equivalent to a 13mm lens on a DSLR), and a fourth, depth-sensing lens that gathers spatial information.
When Huawei introduced the P9 with twin lenses in April 2016, it struck out firmly in the direction of computational photography with an emphasis on plenoptics, which uses information gathered by the binocular vision of two capture optics to provide data that makes some surprising adjustments possible.
On Samsung’s phones, most of those adjustments show up in what the company calls Live Focus, the ability to mimic very shallow depth of field using data gathered by a pair of lenses.
But there’s a problem with the Note 9’s camera system, and it’s probably present in all of the company’s current devices that use two or more lenses.
I shoot almost exclusively in Pro mode and was surprised to discover that the Note 9 doesn’t use the 56mm lens in any predictable way.
In that mode, the phone captures images in RAW format, a direct dump of the sensor data, alongside a JPEG file.
Looking at the capture files, I realised that when I thought I was switching to the longer lens, what I was actually getting was digital zoom: a closer view created by cropping and resizing the JPEG file.
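For anyone who wants to check their own captures, the JPEG’s EXIF data records the focal length of the lens that actually fired, so a “2X” frame can be compared with a 1X frame. What follows is a minimal sketch, assuming Python with a recent version of the Pillow library; the filenames are placeholders, not files from my tests.

    from PIL import Image

    # Compare a normal frame with one shot at "2X"; filenames are hypothetical.
    for path in ("pro_1x.jpg", "pro_2x.jpg"):
        exif = Image.open(path).getexif()
        ifd = exif.get_ifd(0x8769)      # Exif sub-IFD, where lens data lives
        focal = ifd.get(0x920A)         # FocalLength, in mm
        focal_35 = ifd.get(0xA405)      # FocalLengthIn35mmFilm, the equivalent value
        print(f"{path}: {focal} mm ({focal_35} mm equivalent)")

If the 2X frame reports the same focal length as the 1X frame, it came from the main lens and was cropped and resized, not captured on the telephoto.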
Unfortunately, the resizing, particularly at higher ISO settings, is substandard and prone to adding artefacts. I got better results working with the RAW file and cropping the image after the fact.
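What the phone is doing at that point amounts, roughly, to the following: crop the centre of an already-compressed JPEG and blow it back up. This is a sketch of that process, not Samsung’s actual pipeline; again it assumes Python with Pillow, and the filenames are placeholders.

    from PIL import Image

    def digital_zoom(path, factor=2.0):
        # Crop the centre of the frame and enlarge it back to full size.
        img = Image.open(path)
        w, h = img.size
        cw, ch = int(w / factor), int(h / factor)
        left, top = (w - cw) // 2, (h - ch) // 2
        cropped = img.crop((left, top, left + cw, top + ch))
        # Enlarging a lossy JPEG magnifies its compression artefacts, especially in
        # high-ISO shots, which is why cropping the RAW file afterwards holds up better.
        return cropped.resize((w, h), Image.LANCZOS)

    digital_zoom("pro_2x_source.jpg").save("simulated_2x.jpg")  # hypothetical files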
I tried switching between the f/1.5 and f/2.4 apertures, and setting the camera to 2X in Auto mode before switching to Pro mode (the zoom setting sticks, but you still get a cropped and resized JPEG).
Eventually, I blocked the main camera lens to see exactly when the 56mm lens was activated.
Zooming to 2X (by swiping the shutter button right or left) in Auto and Pro modes didn’t help; the only time I saw an image come through the 56mm lens was in Live Focus mode while zooming in.
In any other mode on the camera, this behaviour would be perfectly fine; I’d actually expected it to work that way in Live Focus mode. But in Pro mode, it’s reasonable to expect the device to give full control to the user.
It does not.
It’s something that could probably be fixed easily in software, but for that to happen, Samsung needs to be a lot clearer about what Pro mode means to an admittedly small slice of its users, and perhaps consult its more savvy customers about what they expect the camera to do when they take full control of image capture.
Mark Lyndersay is the editor of technewstt.com. An expanded version of this column can be found there.