Adobe kicked off its annual Max 2025 event with a first look at new and upcoming generative AI tools for the company's Photoshop, Premiere Pro, and Lightroom Creative Cloud apps.
These include updates to Photoshop’s generative fill feature, designed to give creators more control when adding, removing, or modifying content, and tools that can automate some of the time-consuming elements of photo and video editing.
More is on the way, including a new Firefly AI image model, AI audio and video tools, and Adobe's AI assistant.

Opening its apps to outside models is a notable change for a company that has spent the past two years rolling out its own proprietary artificial intelligence (AI). In practice, it means that if you're retouching a photo in Photoshop, you can now pick which AI engine handles your request.
Adobe says this gives you more creative options, though it’s not yet clear whether most people will care which model does the heavy lifting or if they just want good results.
Firefly Image 5
On Tuesday, Adobe announced the launch of the latest version of its image generation model, Firefly Image 5. It can now operate at up to 4 megapixels of native resolution, a big jump over the previous generation, which produced 1-megapixel images and then upscaled them to 4 megapixels. The new model also renders people better, the company said.

Image 5 also allows for layered, prompt-based editing – the model treats different objects as layers and lets you edit them using prompts or tools like resizing and rotating. The company says editing these layers preserves the image's detail and integrity.
Adobe's Firefly website already supports third-party models from AI labs such as OpenAI, Google, Runway, Topaz, and Black Forest Labs (maker of Flux) to broaden its appeal to creative customers, and the company is now going a step further by letting users create custom models based on their own artistic style.
Photoshop
Adobe is now letting Photoshop users power the app's generative fill capabilities with third-party AI models from Google and Black Forest Labs. After selecting part of an image and giving generative fill a prompt – such as describing an object to insert, or something to replace an existing object or person with – users can switch between Google's Gemini 2.5 Flash, Black Forest Labs' Flux.1 Kontext, and Adobe's own Firefly image model, providing a wider variety of results to choose from.

Photoshop for the web is also launching an AI assistant in private beta. It's a chatbot-like interface that takes descriptive instructions, such as "increase saturation," and automatically edits files by drawing on a wide range of Photoshop tools. The capability was first previewed in April, and a similar feature is currently rolling out in public beta in the Adobe Express app.
Adobe Lightroom
Adobe Lightroom is introducing another beta feature called "Assisted Culling", which can sort through large collections of photos – filtering by focus, angle, and brightness – and recommend the best shots for photographers to edit.
Firefly AI Image Model
The Firefly Image 5 model that powers these editing tools renders images at native 4-megapixel resolution without upscaling and has been optimized to produce more realistic people. It also supports Adobe's new prompt-based editing features, which make specific adjustments based on text descriptions, and a layered editing tool coming to Photoshop that makes precise, contextual adjustments, such as automatically correcting shadows as you move objects around an image.
Premiere Pro
For video editors, Adobe also introduced a new Premiere Pro tool that automatically outlines people and objects in video frames. Called AI Object Mask and available in public beta, it aims to make it easier to quickly color grade, blur, and add visual effects to moving backgrounds, eliminating the need to manually mask objects with the pen tool.
AI Audio and Video Tools
Adobe is offering filmmakers new generative AI audio tools that let them quickly add thematically appropriate soundtracks and narration to their videos. The updated Adobe Firefly AI app introduces “Generate Soundtrack” and “Generate Speech”, and Adobe is also rolling out a new online video creation tool that combines multiple AI features with a simple editing timeline.

The “Generate Soundtrack” tool is now available in public beta in the Firefly app. It works by evaluating an uploaded video and generating a selection of instrumental soundtracks that are automatically synced to the footage. Users can control the style of the music by choosing from a selection of presets, such as lo-fi, hip-hop, classical, EDM, and more.
Another filmmaking tool in the works is the “Firefly video editor”, which Adobe describes as a “multi-channel timeline editor for generating, organizing, trimming, and sequencing clips.” It combines Adobe’s various tools for dubbing, soundtracking, and title generation into a single web app, along with frame-by-frame editing features and style settings.

The “Firefly video editor” will be available in private beta next month, and potential users will need to sign up for a waitlist to get early access.
Adobe's AI Assistant
Adobe's cloud-based Express design platform is getting a new generative AI experience that can transform projects when users loosely describe the changes they want.
Adobe describes the “AI Assistant in Adobe Express,” which is launching in public beta today, as a conversational creative agent that “empowers people of all skill levels” to quickly create visual content without requiring them to understand specific design terms or creative tools.
The feature is available as a toggle in the top-left corner of the Adobe Express web app. When activated, the usual homepage interface and tool options are replaced by a chatbot-style text box, with options to create a new design or edit existing images.
YouTube Shorts Made Easy

YouTube users will soon be able to access Adobe Premiere editing tools through a new hub called Create for YouTube Shorts, which will launch both in the new Premiere mobile app and directly within YouTube itself.