Adobe has released the Firefly mobile app for iOS and Android. Previously, Firefly was only available in the browser or as tools integrated into Photoshop. As of June 2025, users can create images or short videos from text descriptions, edit them with generative fill and expand tools, and sync the results with projects in Creative Cloud.
I'm happier than ever to tell you that Adobe partnered with different AI model makers. Our partners and models you can use in Adobe Firefly (Boards, Mobile, etc.):
Google DeepMind (Veo3, Veo2, Imagen 4)
Flux Kontext, Flux Pro, etc.
Ideogram 3.0
Pika
Luma
Runway
GPT
🎉 pic.twitter.com/cQ4CJ4AY2V
— Kris Kashtanova (@icreatelife) June 17, 2025
In the app, you can choose between Adobe's own models, such as “Image 3” and “Video 1”, or switch to partner models from OpenAI, Google, Ideogram, Luma, Pika, and Runway, which are not even available in the web version.
The Firefly app supports prompt history, layer-based editing, and “Firefly Boards”, so ideas can be developed step by step across different devices. Free users get unlimited basic generations, while enhanced features and third-party models require additional credits, included in a $10 monthly subscription.