I want Google’s camera app level in all my cameras
Ever since the Pixel 5 came out in 2020, Google's camera app has included a handy level feature that I truly love using. It's so good, I'm convinced that just about every other smartphone and camera manufacturer should blatantly rip it off for our collective betterment.
The Pixel camera app automatically calls up some nifty virtual horizon lines by default when you hold the phone steady to line up a shot. You guide the two tilting white lines toward a static yellow one, and when your picture is perfectly level from side to side (roll) and front to back (pitch), they all align and turn yellow. You even get a nice haptic buzz once your horizon reaches an even zero degrees. The UI for this assistive tool changes when you point the phone downward or upward, replacing the lines with two crosshairs. Line up the moving white crosshair with the yellow static one, and bada bing bada boom, you’ve got a perfectly level zoom.
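Under the hood, a leveler like this only needs the phone's orientation sensors. Below is a minimal, hypothetical Kotlin sketch of the straight-down flat lay case (the crosshair mode), using Android's rotation vector sensor: when both roll and pitch sit near zero, the camera is pointing straight down and level. The class name, the onLevelChanged callback, and the half-degree tolerance are my own illustrative assumptions, not how Google's app actually does it.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs

// Hypothetical sketch of a flat-lay level indicator, not Google's actual code.
class LevelIndicator(
    context: Context,
    private val onLevelChanged: (rollDeg: Double, pitchDeg: Double, isLevel: Boolean) -> Unit,
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val rotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)
    private val rotationMatrix = FloatArray(9)
    private val orientation = FloatArray(3) // azimuth, pitch, roll in radians

    fun start() {
        rotationSensor?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ROTATION_VECTOR) return
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        SensorManager.getOrientation(rotationMatrix, orientation)

        val pitchDeg = Math.toDegrees(orientation[1].toDouble()) // front-to-back tilt
        val rollDeg = Math.toDegrees(orientation[2].toDouble())  // side-to-side tilt

        // Assumed tolerance: within half a degree on both axes counts as "level",
        // the point where the UI could snap the crosshairs together and buzz.
        val isLevel = abs(rollDeg) < 0.5 && abs(pitchDeg) < 0.5
        onLevelChanged(rollDeg, pitchDeg, isLevel)
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

A real camera app would presumably also smooth the sensor readings and switch between the horizon-line UI and the crosshair UI depending on how far the phone is pitched up or down, but the core signal is just these two angles.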
I can’t stress enough how handy this is when shooting flat lay images, like an aesthetically pleasing arrangement of items on a table or a lovely meal you just got served at a fancy restaurant — you know, the stuff that’s perfect for the ‘gram. Those shots look awkward as hell when they’re crooked because even a slight tilt throws off the proportions.
Google isn’t the originator of all this leveling business with the Pixel. Apple started doing the double-crosshair thing in 2017 with iOS 11 on the iPhone X, and I know Samsung has had it since at least the Galaxy Note 10 generation in 2019. But Apple and Samsung only enable the crosshair leveler when you turn on their cameras’ grid overlays, and they lack level lines for standard photos. So Google imitated but also innovated a bit, making a better experience overall.
As for full-size cameras, they’ve had virtual horizons for many years, but their implementations have been outclassed by smartphones. The first camera I recall having one was the Nikon D700 I shot with in 2009, though its version was fairly basic. The sad part is that larger cameras haven’t improved much on it since then. The virtual horizons of expensive flagships like a Sony A1, Nikon Z9, Canon R3, or Leica SL2 work great for standard shots but get lost as soon as you point them downward for a simple flat lay. As someone who frequently uses Sony A9 II and A7 IV cameras for weddings, it drives me mad knowing that the phone in my back pocket could help me nail that kind of shot faster than these state-of-the-art mirrorless cams.
Why am I so obsessed with this? Maybe I secretly want to achieve true level, or maybe it’s because I have a bad habit of taking ever-so-slightly crooked shots when I’m working quickly, and I’m a bit of a perfectionist. Regardless, I’m confident that it’s not just nit-picky part-time wedding photographers like me who would benefit from this feature. Product and food photography are whole categories that often require straight-down flat lay imagery, and trust me when I say it’s no fun having to bust out a physical bubble level to ensure you get it right in-camera.
Yes, you can take a bunch of shots to ensure you get it right, use software to correct crooked and off-axis proportions later, buy one of those silly bubble levels that takes up your hot shoe, or “git gud” and somehow nail a perfect handheld shot every time, but the technology exists to help anyone get this right consistently on the first try. It should be implemented in all cameras, smartphones, and pro cams alike. Plus, you get to learn how many tables and desks in the world are actually slightly crooked.