Lightroom truly seems to be having a bit of an identity crisis right now. For many years, photographers used the Lightroom suite of products mostly for basic edits. Over the past few updates, Adobe proved that you more or less didn't need Photoshop anymore unless you were retouching, and even then, much of that retouching could be done in Lightroom. Now, it's getting more generative AI features. The biggest update is called Generative Remove, and Adobe is touting that you can eliminate anything from any photo non-destructively with a single click. Essentially, it means that you won't need to clone or heal out distracting elements when shooting weddings or portraits.
We’ll be updating our review of Lightroom soon.
Generative Remove in Lightroom
Mind you, this latest Lightroom update doesn't include some of Photoshop's weirder capabilities, like the ability to add a leopard into the middle of an image. So while Lightroom still can't do everything Photoshop can, the update arguably augments what Lightroom can do, and makes it easier.
Considering that Adobe is spending lots of money on ads about photographers who shoot on their phones and couldn't do their work without post-production, this kind of feature seems on-brand.
The new Generative Remove is powered by Adobe Firefly, which is trained on licensed photographs from Adobe Stock. In its press release, Adobe states that Firefly is "designed to generate content for commercial use that does not infringe on copyright and other intellectual property (IP) rights such as trademarks and logos."
It's fascinating, then, that Generative Remove is powered by Firefly rather than being an enhancement of something like Content-Aware Fill. I'm very curious to see how photographers will use it, and I can already think of a few occasions where people could use it rather maliciously. However, I'll need to test this out.
Lens Blur
In addition to Generative Remove, the Lens Blur tool is now fully enabled and out of early access. In my initial testing, it looked pretty good when used to mimic real optical blur. However, it can also be used in ways that make no sense; the results won't even resemble what a tilt-shift lens would produce. To use Lens Blur, you enable it, let Lightroom analyze the photo, and then configure the blur with tools for lens depth, a brush, and more.
Truly, it's going to be best in the hands of someone skilled. Though in time, we're sure that people who don't understand how depth of field works will try to make something viral of their own. It can also only do so much: if a subject is slightly out of focus, Lightroom might be able to make them a tad sharper, but don't expect a whole lot.
Here are some more features from Adobe in the new Lightroom:
Expanded tethering support for new cameras, including the latest Sony digital cameras such as the Alpha 7 IV and Alpha 7R, provides real-time access to photos in Lightroom Classic, saving time in everyday editing workflows and enabling better collaboration across teams;
HDR Optimization, already used to edit tens of millions of images, enables anyone capturing photos to edit and export their photos with brighter highlights, deeper shadows, and more vivid colors, as seen in real life;
Instant access to photo libraries in Lightroom mobile and desktop apps empowers faster editing than ever before;
Lightroom’s all-new mobile editing experience streamlines the mobile toolbar to prioritize the most popular features, while making it faster and more intuitive to edit.