Adobe’s AI Object Masking: A Long-Overdue Answer to Video’s Most Tedious Task
For the editors behind the 85% of this year's Sundance premieres cut on Creative Cloud, the suite's most hated task, rotoscoping, might finally be dead. Adobe's latest updates to Premiere Pro and After Effects focus on "assistive" AI, aiming to reclaim the hours lost to frame-by-frame masking, a chore that has long bottlenecked professional post-production.
Object Mask: Closing the Gap with DaVinci Resolve
The centerpiece of this update is the new Object Mask tool, a "hover and click" system designed to identify and track subjects automatically. The AI generates a mask overlay in seconds, and the tool layers on professional-grade refinements: lasso and rectangular addition/subtraction, plus expansion and feathering controls.
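Adobe has not published the tool's internals, but the interaction model resembles click-prompted segmentation systems such as Meta's Segment Anything: a click seeds the mask, and expansion and feathering are comparatively cheap post-processing passes on the resulting alpha. The sketch below shows what those two refinement controls typically do, using OpenCV; the file paths and parameter names are assumptions, not Adobe's API.

```python
# Sketch of the "expand" and "feather" refinement passes an object-mask
# tool typically applies after AI segmentation. The binary mask here is
# hypothetical; in practice it would come from a click-prompted model.
import cv2
import numpy as np

def refine_mask(mask: np.ndarray, expand_px: int = 4, feather_px: int = 6) -> np.ndarray:
    """mask: uint8 array, 255 = subject, 0 = background."""
    # Expansion: grow (or shrink, if negative) the mask edge with morphology.
    if expand_px > 0:
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (2 * expand_px + 1,) * 2)
        mask = cv2.dilate(mask, kernel)
    elif expand_px < 0:
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (-2 * expand_px + 1,) * 2)
        mask = cv2.erode(mask, kernel)
    # Feathering: soften the hard edge into an alpha ramp with a Gaussian blur.
    k = 2 * feather_px + 1
    return cv2.GaussianBlur(mask, (k, k), 0)

# Usage: composite the subject over a dark fill using the feathered alpha.
frame = cv2.imread("frame.png")                                  # hypothetical paths
raw_mask = cv2.imread("ai_mask.png", cv2.IMREAD_GRAYSCALE)
alpha = refine_mask(raw_mask)[..., None].astype(np.float32) / 255.0
graded = (frame * alpha + np.float32([40, 40, 40]) * (1 - alpha)).astype(np.uint8)
cv2.imwrite("composite.png", graded)
```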
However, Adobe is playing catch-up here. Professional colorists and editors have had DaVinci Resolve's "Magic Mask" for several release cycles, and browser-based platforms like Runway have already normalized AI-driven segmentation. Adobe's late entry into automated rotoscoping isn't about pioneering the technology, but about keeping its massive user base from jumping ship to more automated workflows.
20x Faster Tracking and the Hardware Hurdle
Adobe has overhauled its underlying engine, claiming that standard shape masks (now accessible directly from the toolbar) can track objects 20 times faster than in previous versions. In practical terms, this speed boost changes the room's energy: an editor can now track a complex subject during a live client review rather than telling the director, "I'll have this ready for you tomorrow."
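Adobe hasn't said how the rebuilt tracker works. One textbook way to propagate a mask between frames, useful for intuition about what "tracking a mask" even means, is dense optical flow: estimate where each pixel came from, then warp the previous frame's mask accordingly. A toy sketch with OpenCV's Farneback flow follows; the clip and seed-mask paths are assumed.

```python
# Toy mask propagation: estimate dense optical flow from each new frame
# back to the previous one, then pull the previous mask forward along
# that flow field. Production trackers are far more robust; this only
# demonstrates the basic mechanic. Paths are hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("shot.mp4")
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
mask = cv2.imread("first_frame_mask.png", cv2.IMREAD_GRAYSCALE)  # seed mask

h, w = prev_gray.shape
grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Backward flow: for each pixel in the current frame, where it sat
    # in the previous frame.
    flow = cv2.calcOpticalFlowFarneback(gray, prev_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    mask = cv2.remap(mask, map_x, map_y, cv2.INTER_LINEAR)
    prev_gray = gray
    frame_idx += 1
    cv2.imwrite(f"mask_{frame_idx:04d}.png", mask)
```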
The "Reality Check" here is the hardware requirement. Adobe’s move toward on-device processing means these performance gains are heavily dependent on modern silicon. While editors on high-end workstations or the latest Apple M-series chips will see immediate benefits, those working on older laptops will likely find themselves left behind as the software evolves beyond their local processing power.
Bridging the Gap from Concept to Timeline
Adobe is also tightening the loop between its generative ideation tools and the NLE. Premiere Pro now supports direct imports from Firefly Boards, allowing production teams to move from collaborative storyboarding into the assembly phase without the friction of traditional asset handoffs.
Other workflow refinements include:
- Frame.io V4 Integration: A native panel that consolidates media management and versioning within the edit.
- Audio Remix: Utilizing Adobe Sensei to automatically re-time background tracks to match clip lengths, avoiding the need for manual blade cuts (a rough approximation is sketched below this list).
- Integrated Adobe Stock: Direct asset access within the primary interface to minimize application switching.
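Adobe hasn't detailed Audio Remix's algorithm, though its Sensei-era demos suggest it rearranges whole musical sections at natural boundaries rather than stretching audio. The crude stand-in below only illustrates the goal (fitting a music bed to a target duration) via a uniform time-stretch with librosa; the file names and target length are assumptions.

```python
# Naive stand-in for Audio Remix: fit a music bed to a clip's duration
# by uniform time-stretching. Adobe's tool instead rearranges sections
# at musically sensible boundaries, preserving tempo; this only shows
# the "match this duration" goal. File paths are hypothetical.
import librosa
import soundfile as sf

TARGET_SECONDS = 42.0                       # length of the video clip

y, sr = librosa.load("music_bed.wav", sr=None)
source_seconds = len(y) / sr

# rate > 1 shortens the audio, rate < 1 lengthens it.
rate = source_seconds / TARGET_SECONDS
y_fit = librosa.effects.time_stretch(y, rate=rate)

sf.write("music_bed_fit.wav", y_fit, sr)
print(f"stretched {source_seconds:.1f}s -> {len(y_fit) / sr:.1f}s")
```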
The Privacy Pivot: On-Device vs. The Cloud
The tension between creative professionals and AI vendors over training data remains at an all-time high. To address concerns over intellectual property, Adobe has specified that the new Object Mask and its associated models operate entirely on-device.
By keeping the computation local, Adobe is attempting to draw a hard line: the company explicitly stated it does not use customer footage or activity to train its generative models. This localized approach serves a dual purpose: it offers a layer of security for high-end production assets and eliminates the latency inherent in cloud-based AI tools. For a professional editor, the assurance that a client's unreleased footage isn't being used to "teach" an algorithm is just as important as the time saved on the edit.
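In practice, "on-device" means the model ships with the application and runs through a local inference runtime, so no frame ever crosses the network. A rough sketch of that pattern with ONNX Runtime follows; the model file, its input and output shapes, and the preprocessing are all assumptions, not Adobe's actual pipeline. It also shows why the hardware caveat above matters: the execution provider is where modern silicon earns its keep.

```python
# Sketch of the on-device pattern: a locally bundled model plus a local
# inference runtime, with no network I/O anywhere in the path. The model
# file, its 512x512 single-logit output, and the preprocessing are all
# hypothetical.
import cv2
import numpy as np
import onnxruntime as ort

# CPU keeps the sketch portable; on capable hardware you would list a
# CoreML, DirectML, or CUDA execution provider first for the speedups.
session = ort.InferenceSession("segmentation.onnx",
                               providers=["CPUExecutionProvider"])

frame = cv2.imread("frame.png")
# Assumed preprocessing: 512x512 RGB, scaled to [0, 1], NCHW layout.
blob = cv2.resize(frame, (512, 512))[..., ::-1].astype(np.float32) / 255.0
blob = blob.transpose(2, 0, 1)[None]

input_name = session.get_inputs()[0].name
(logits,) = session.run(None, {input_name: blob})   # assumes one output
mask = (logits[0, 0] > 0).astype(np.uint8) * 255
cv2.imwrite("mask.png", cv2.resize(mask, frame.shape[1::-1]))
```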
