Premiere Rush CC is Adobe’s new all-in-one video editing tool for desktop and mobile

Earlier this year, Adobe previewed Project Rush, a new multi-platform video editing tool. Today, at its Max conference, the company announced that Project Rush is now Premiere Rush CC and an official part of the Creative Cloud suite.

The idea behind Rush is pretty straightforward. It’s meant to provide video creators with a modern all-in-one video editing solution that allows them to quickly edit a video and publish it on platforms like YouTube and other social networks. Indeed, it’s very much meant to be the video editing tool for the YouTube generation.

Rush takes the core parts of Adobe’s suite of video and audio editing tools and combines them in a single mobile and desktop experience. That means you get a set of Motion Graphics templates, for example, that were specifically designed to give Rush users easy access to customizable titles. The color correction system is built on top of the same technology that powers the more fully featured and complicated Premiere Pro editing tool. And the audio-editing features, including one-click ducking, are powered by the same code as their counterparts in Audition.

All of those edits easily sync between platforms, giving creators the ability to start editing on their phone, for example, and then finish their work on a laptop.

The current version of Rush is pretty much what Adobe announced a few months ago, but the company also used today’s announcement to preview what’s next. Soon, you’ll also be able to edit on Android (Adobe promises a release in 2019) and get speed controls so you can speed up and slow down your videos (a feature that YouTube creators are bound to overuse), as well as the ability to more easily create different versions of your videos for multiple platforms. The team also promises to increase performance over time.

Premiere Rush is now available to all Creative Cloud All Apps, Premiere Pro CC single app and Student plan subscribers. There is also a single app plan for $9.99/month for individuals (or $19.99/month for teams). In addition, there is a free starter plan that gives users access to all of the app's features but limits exports to three projects.

Adobe launches new AR and drawing tools 

At its Max conference in Los Angeles, Adobe today announced a number of new products in its Creative Cloud suite. Among those are Project Aero, a new tool for building AR experiences, and Project Gemini, an app for painting and drawing on the iPad.

The ‘Project’ moniker is Adobe’s way of signifying that these are still early-stage products and not quite ready for prime time yet. Over time, though, they typically become fully named parts of the Creative Cloud suite.

The fact that Adobe is launching a tool for building AR experiences doesn't come as a major surprise. Adobe isn't one to stand by as hype builds around a new technology (see: Adobe's early support for VR). Project Aero, which integrates with both Adobe Dimension and Photoshop for creating and importing assets, is now in private beta, with plans for a wider release in 2019.

The other new tool is Project Gemini, which takes some of Adobe’s Photoshop technology, including its painting engine, to create a stand-alone drawing app for the iPad. The app also takes some cues from existing drawing tools from Adobe like Photoshop Sketch and Illustrator Draw. Indeed, it gets its time-lapse recording feature and support for Photoshop brushes from these — but in a new package that also includes selection and masking tools, grids, drawing guides and a mix of raster and vector drawing capabilities.

One interesting note here is that Kyle T. Webster is behind this new project; Adobe bought Webster's Photoshop brush business almost exactly one year ago.

“Through rigorous testing with artists of all skill levels, we reconsidered how drawing tools work. All of Project Gemini’s features are focused on accelerating drawing and painting workflows,” Adobe writes in today’s announcement. “Illustrators can expect the most natural brushes handcrafted by Kyle Webster, dynamic brushes such as watercolors and oils, new ways to select, mask, and transform, and the integration of technology by the Adobe research team.”

Gemini will support the same brushes that are available in Photoshop, as well as dynamic brushes, and feature the ability to move files between the two programs.

Like many of the projects the company announced today, Project Gemini is still in closed beta. For now, it’s also only scheduled for release on iPad, though Adobe says it’ll come to other platforms, too. I take that to mean Windows given Adobe’s and Microsoft’s close relationship and the lack of compelling Android tablets these days.

Adobe XD now lets you prototype voice apps 

Adobe XD, the company’s platform for designing and prototyping user interfaces and experiences, is adding support for a different kind of application to its lineup: voice apps. Those could be applications that are purely voice-based — maybe for Alexa or Google Home — or mobile apps that also take voice input.

The voice experience is powered by Sayspring, which Adobe acquired earlier this year. As Sayspring’s founder and former CEO Mark Webster told me, the team has been working on integrating these features into XD since he joined the company.

To support designers who are building these apps, XD now includes voice triggers and speech playback. That user experience is tightly integrated with the rest of XD and in a demo I saw ahead of today’s reveal, building voice apps didn’t look all that different from prototyping any other kind of app in XD.

To make the user experience realistic, XD can now trigger speech playback when it hears a specific word or phrase. This isn't a fully featured natural language understanding system, of course, since the idea here is only to mock up what the user experience would look like.

“Voice is weird,” Webster told me. “It’s both a platform like Amazon Alexa and the Google Assistant, but also a form of interaction […] Our starting point has been to treat it as a form of interaction — and how do we give designers access to the medium of voice and speech in order to create all kinds of experiences. A huge use case for that would be designing for platforms like Amazon Alexa, Google Assistant and Microsoft Cortana.”

And these days, with the advent of smart displays from Google and its partners, as well as the Amazon Echo Show, these platforms are also becoming increasingly visual. As Webster noted, the combination of screen design and voice is becoming more and more important, so adding voice technology to XD seemed like a no-brainer.

Adobe's product management lead for XD, Andrew Shorten, stressed that before Adobe acquired Sayspring and integrated it into XD, users had a hard time building voice experiences. "We started to have interactions with customers who were beginning to experiment with creating experiences for voice," he said. "And then they were describing the pain and the frustration — and all the tools that they'd use to be able to prototype didn't help them in this regard. And so they had to pull back to working with developers and bringing people in to help with making prototypes."

XD is getting a few other new features, too. It now features a full range of plugins, for example, that are meant to automate some tasks and integrate it with third-party tools.

Also new is auto-animate, which brings relatively complex animations to XD that play when you transition between screens in your prototype app. The interesting part here, of course, is that this is automated. To see it in action, all you have to do is duplicate an existing artboard, modify some of the elements on the page and tell XD to handle the animations for you.

The release also features a number of other new tools. Drag Gestures now allows you to re-create the standard drag gestures in mobile apps, maybe for building an image carousel, for example, while linked symbols make it easier to apply changes across artboards. There is also now a deeper integration with Adobe Illustrator and you can export XD designs to After Effects, Adobe’s animation tool for those cases where you need full control over animations inside your applications.

Adobe is bringing Photoshop CC to the iPad 

It’s no secret that Adobe is currently in the process of modernizing its Creative Cloud apps and bringing them to every major platform. Today, the company is using its Max conference in Los Angeles to officially announce Photoshop CC for the iPad.

Sadly, you won't be able to try it today, but come 2019, you'll be able to retouch all of your images right on the iPad. And while it won't include every feature of the desktop version from the get-go, the company promises that it'll add them over time.

As with all of Adobe’s releases, Photoshop for iPad will play nicely with all other versions of Photoshop and sync all the changes you make to PSD files across devices. Unsurprisingly, the user experience has been rethought from the ground up and redesigned for touch. It’ll feature most of the standard Photoshop image editing tools and the layers panel. Of course, it’ll also support your digital stylus.

Adobe says the iPad version shares the same code base as Photoshop for the desktop, “so there’s no compromises on power and performance or editing results.”

For now, though, that’s pretty much all we know about Photoshop CC on the iPad. For more, we’ll have to wait until 2019. In a way though, that’s probably all you need to know. Adobe has long said that it wants to enable its users to do their work wherever they are. Early on, that meant lots of smaller specialized apps that synced with the larger Creative Cloud ecosystem, but now it looks as if the company is moving toward bringing full versions of its larger monoliths like Photoshop to mobile, too.