It's no secret most of the world's photos are now shot with, and viewed on, a smartphone. For casual photography, the impressive on-board processing of modern phones is usually enough: Just shoot and share. But if you're a little more fussy about your images, or are photographing difficult subjects or in tricky lighting, then you're likely to want to do some image editing. Even with clever apps like Lightroom CC and Snapseed, that can be painful on a phone's small screen. Fortunately, there are now some ways to stay light but still have a better platform for editing.

Perhaps taking pity on me for carrying 20 pounds of photo gear around every tech briefing I cover, Google challenged me to see what I could do with a Pixelbook, a Pixel 2, and Lightroom. So I've been relying on that combination whenever possible for the last few weeks to see how complete a mobile photography solution I can make it. I've supplemented it with either my Canon G9 X point-and-shoot or my Nikon D7500 for capturing images beyond what the phone can do on its own. Here's how it's worked out, along with some options for tuning your own mobile photography workflow.

Besides the Pixelbook and Pixel 2, I've been carrying either a compact or small DSLR for shots needing more zoom or other additional capabilities. Along with a couple SD cards and a mouse (yes, I still use one), that's enough for a surprising number of photo projects.

Making the Most out of Your Smartphone Camera

First, for the photos I care about, I shoot in RAW. In the case of the Pixel 2 or my personal OnePlus 5, that is typically DNG, although with Lightroom I can now also take advantage of Adobe's Raw HDR workflow. The latter produces a high-dynamic-range, floating point DNG image that has already been given a default tone mapping, but still has much more range to work with than a simple JPEG or traditional DNG. Other than the requirement for post-processing, the only other reason not to shoot in RAW is that some of the super-clever computational imaging done by high-end phones is only accessible when you shoot JPEG. For example, Google's HDR+ technology that combines multiple frames to make a single, superior image only works for JPEGs shot with the Camera app.
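
To make the post-processing side of that tradeoff concrete, here is a minimal sketch of a generic RAW workflow in Python, using the open-source rawpy library to demosaic a DNG into a 16-bit image you can tone-map yourself. The file name is a placeholder, and this only illustrates the general idea, not Lightroom's or Google's actual processing.

```python
import rawpy        # pip install rawpy
import numpy as np

# Demosaic a phone-generated DNG ourselves instead of relying on the camera JPEG.
# "IMG_0001.dng" is a placeholder file name.
with rawpy.imread("IMG_0001.dng") as raw:
    rgb16 = raw.postprocess(
        output_bps=16,          # keep 16 bits per channel for editing headroom
        no_auto_bright=True,    # leave brightness decisions to us
        use_camera_wb=True,     # start from the white balance the phone recorded
    )

# A crude manual tone mapping: apply a gamma curve to lift the shadows,
# then quantize to 8 bits only at the very end for sharing.
linear = rgb16.astype(np.float32) / 65535.0
jpeg_ready = (np.clip(linear ** (1 / 2.2), 0, 1) * 255).astype(np.uint8)
```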

Lightroom Mobile offers a wide variety of editing tools, starting with these simple slider-based adjustments

Even though I have a bunch of camera apps on my phones, for this article I've been using the camera capability built into Lightroom Mobile and the default Google Camera app for the Pixel 2. Lightroom's camera syncs my images to the Adobe cloud and to my other Lightroom devices automatically, and also supports RAW image capture, which the default Google Camera application for the Pixel 2 does not. Google's Camera app gives me the full benefit of Google's impressive computational imaging HDR+ technology, and syncs with Google Photos automatically.

The images shot with the Lightroom camera are also stored in Google Photos, which is pretty cool, since all images shot with a Pixel 2 between now and 2020 will be stored at original resolution for free by Google. As I've written about previously, though, the proliferation of various backend photo clouds is adding some confusion, as each vendor's apps currently work best with their own cloud.

As long as both devices are online, images shot on the Pixel 2 sync through Lightroom automatically to the Pixelbook (and to the Lightroom desktop machine in my studio, and my main laptop). To get images off my Nikon D7500 or Canon G9 X, I either need to use Wi-Fi or an SD card. Unfortunately, while the Pixel 2 paired nicely with the D7500, the Pixelbook didn't. Ultimately I'm not sure how big a deal that is, as the cameras' Wi-Fi is too slow to transfer large numbers of images or RAW files. So a USB SD card reader is the small and simple answer. For the Pixelbook, you'll either need a USB-C version or an inexpensive adapter. The good news is that everything including the D7500, flash, chargers, and cables fits in a convenient bag like the pictured MindShift PhotoCross.

Not surprisingly, the workflow starting with the smartphone camera is a whole lot simpler. But how does it measure up for challenging photography situations? High-dynamic-range and low-light scenes have been some of the toughest to capture with smartphones. A variety of computational imaging technologies under the loose heading "HDR" are now available to address those shortcomings. We'll take a look at them and how they perform.

HDR on Smartphones is Improving by Leaps and Bounds

When HDR first appeared on smartphones, it was clever, but fairly clunky. It mimicked the process of bracketing on a standalone camera by (relatively slowly) capturing 2-3 images and then tone mapping them in a straightforward fashion. Now, the Pixel 2, for example, captures up to 10 images in a fraction of a second, then aligns and assembles them using the full power of the phone's GPU and image processing hardware. Finally, it does noise reduction and an AI-based tone mapping that takes into account local contrast and the overall scene. Even the initial exposure is calculated based on a machine learning engine that has been trained on thousands of sample scenes. Apple, Samsung, and other high-end phone makers have similar systems, although they vary in how many images they capture, whether the images all have the same exposure, and in the quality of the post-processing and artifact suppression.
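
To illustrate just the align-and-merge idea, here is a rough sketch using OpenCV's exposure-fusion tools on a handful of burst frames. It is only a conceptual stand-in: Google's actual HDR+ pipeline uses its own burst alignment, merging, and tone-mapping algorithms on dedicated hardware, and the file names here are placeholders.

```python
import cv2
import numpy as np

# Placeholder burst of frames captured in quick succession.
frames = [cv2.imread(f"burst_{i}.jpg") for i in range(10)]

# Compensate for hand shake by aligning the frames to each other.
cv2.createAlignMTB().process(frames, frames)

# Mertens exposure fusion needs no exposure metadata and returns a
# float32 image in roughly the 0-1 range.
fused = cv2.createMergeMertens().process(frames)

# Quantize back to 8 bits only at the very end, for sharing.
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```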

This is one of the scenes that really sold me on Google's HDR+ as implemented in the Pixel 2. It was captured in full Auto mode, and is straight out of the camera. You can click through to see a 50 percent down-sampled version of the original.

The result of Google's HDR+ (and similar features in other phones) is an effective extension of the phone camera's dynamic range well beyond what the 10-bit image sensor can provide natively. Google, as well as Apple, Samsung, and a couple other phone makers, have also done an excellent job reducing or eliminating the artifacts that come along with doing all that image fusion. You can still fool them with enough motion in the scene, but it is getting harder. For anyone who wants an instantly usable image, this in-camera HDR produces a standard JPEG you can share right away. But if you want the ultimate in HDR, Adobe has pushed things even further.
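
As a back-of-the-envelope illustration of why frame stacking extends dynamic range (assuming, for simplicity, that noise is uncorrelated between frames, which real pipelines only approximate), averaging N frames improves shadow signal-to-noise by roughly the square root of N, which works out to about half a stop per doubling of the burst size:

```python
import math

def extra_stops(num_frames: int) -> float:
    """Approximate extra usable dynamic range, in stops (EV), gained by
    averaging num_frames equally exposed shots: SNR scales with sqrt(N)."""
    return 0.5 * math.log2(num_frames)

# A 10-frame burst, like the Pixel 2's, buys roughly 1.7 stops in the shadows.
print(f"{extra_stops(10):.2f} stops")
```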

While doing an excellent job of rendering the high-dynamic-range scene, this Adobe RAW HDR shows ghosting in the cyclist's leg despite a nominal shutter speed of 1/1800s.

With the newest version of Lightroom Mobile, if you have one of the supported smartphones, Lightroom Mobile's camera feature can painlessly capture enough individual images to record both the shadow and highlight areas of a scene. It then automatically merges the individual RAW images into a high-fidelity floating point RAW version for follow-on processing. The results are very impressive, at least for static scenes. The process is slower than the built-in HDR+ feature, so it doesn't work as well when there is motion in the scene. Also, because this is a twist on the RAW format that is unique to Adobe, images in this format aren't widely supported, at least not yet. For example, the Adobe HDR images I shot with the Pixel 2 aren't viewable in Google Photos. However, they fit right into Lightroom, which brings us to the next piece of the puzzle, image processing.
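
For a sense of what a floating point HDR merge involves, here is a sketch that fuses a bracketed set of exposures into a 32-bit float image with OpenCV and tone-maps it only at the end. This is conceptually similar to, but not the same as, Adobe's proprietary HDR DNG merge, and the file names and exposure times are placeholders.

```python
import cv2
import numpy as np

# Placeholder bracketed exposures and their shutter times in seconds.
files = ["dark.jpg", "mid.jpg", "bright.jpg"]
times = np.array([1 / 1000, 1 / 250, 1 / 60], dtype=np.float32)
images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge into a 32-bit float HDR image.
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)

# The float HDR image keeps far more range than any single 8-bit frame;
# tone-map it only when you finally need something displayable.
ldr = cv2.createTonemap(2.2).process(hdr)
cv2.imwrite("merged_tonemapped.jpg", np.clip(ldr * 255, 0, 255).astype("uint8"))
```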

I was able to make some quick adjustments using Lightroom Mobile, shown by the Adjustment Brush mask, which were synced automatically to my desktop and the Pixelbook, where I could do further editing.

Lightroom Now Spans Just About Every Device from the Largest to the Smallest

Once only available on full-on computers, thanks in part to a complex interface that begged for a keyboard, mouse, and large display, Lightroom is now easily accessible on phones, tablets, computers, and even the latest Chromebooks that have Android support. While the available feature set varies between devices, as you'd expect, even the Mobile version has become quite powerful. When used on a Pixelbook or large tablet, you can do a large amount of professional-class image editing with it. If that isn't enough, all of your images can be automatically synced to your computers for further editing, still in full fidelity.

The Pixelbook isn't your Father's Chromebook

Lightroom Mobile on the Pixelbook provides a similar interface to the one on the phone, but with even more power

When I tried to use the original Google Pixel as my traveling computer in 2013, it drove me nuts. There weren't any great image editors for Chrome OS, and I had access to neither my familiar Windows apps nor their Android equivalents. The addition of Android support, the availability of Lightroom Mobile, and the option for an active stylus help make the new Pixelbook an entirely different experience. I can now do almost the same editing on the Pixelbook that I'd do on the road with my Windows laptop. And with Lightroom Mobile, my edits won't be wasted if I decide to do more work on an image later on my Windows desktop in Lightroom or Photoshop.

Overall, Google has put together an effective one-two punch for photographers who want to travel light, but still have a high-end workflow. That said, if you don't need the keyboard on the Pixelbook, then an iPad or an Android tablet with an active stylus would be a less expensive, and lighter-weight, alternative to the Pixelbook. Similarly, if you want one of the best smartphone cameras on the market, the Pixel 2 is ideal. But if you're on a budget, you can find less-expensive models that still support some form of automatic HDR and Adobe's RAW HDR capability. For example, my cheaper OnePlus 5 fits just as nicely into this workflow, although it doesn't produce the same image quality as the Pixel 2.

[Images by David Cardinal]