The Making of a Generative AI Powerhouse

Lessons from Adobe’s new Firefly creative models for image, document, and experience makers

With all the hoopla over artificial intelligence in recent months, it may feel as though generative AI sprang out of nowhere this year. But the really hard work of creating and improving images through AI and machine learning (ML) has been years in the making.

Take Adobe, whose founders released Adobe PostScript 40 years ago, igniting the desktop publishing revolution. Since then, the company has continually developed technologies from Acrobat to Photoshop to After Effects that empower people to create digital content and experiences with increasing impact and ease. Adobe’s AI and ML journey started more than a decade ago when it introduced ML-powered red-eye removal in Photoshop. It released an AI platform in 2017, and today hundreds of AI-powered features embedded in Adobe’s Creative, Document, and Experience Cloud platforms enable users to push the boundaries of creativity and productivity.

Adobe Firefly—a family of creative models that reimagines how content is designed, as well as how users and computers interact—is the next step in the company’s AI evolution. Says Ely Greenfield, CTO, Digital Media at Adobe, “The nature of Firefly’s advancements puts it in a category all its own.”

When Firefly launched earlier this year, it was focused on generating images and text effects within a free website. Adobe then brought the technology into its flagship subscription-based photo, illustration, and experience applications. Today, simply typing “parrot on shoulder” on Firefly.com can instantly generate multiple photorealistic bird options to add to any portrait. And in the near future, the company plans to enable Firefly to help people of every background and skill level develop video, 3D, and other types of content with a “magical level of ease,” Greenfield says.

As of this writing, users have generated more than two billion images with Firefly, making it the most successful beta release in Adobe’s history.

Out Ahead of the Curve

According to Greenfield, Firefly has been evolving in Adobe’s research and development laboratories for several years. Starting in early 2019, the company’s research team began exploring the potential of creative AI, introducing precursor capabilities in its applications, including Neural Filters in Photoshop, Content-Aware Fill in After Effects, Liquid Mode in Acrobat, and Attribution AI in Adobe Experience Platform. “By starting early with these AI-powered tools,” Greenfield says, “we built up internal teams of experts who deeply understood both generative AI technology and how to integrate it in our applications.”

These teams didn’t comprise only engineers. Firefly’s capabilities and guardrails were also built by groups of Adobe researchers, designers, executives, and even customers, brought together to harness AI’s potential to enhance the way people create and work. Beyond building Firefly’s user experience, Adobe designers became deeply involved in how Firefly was differentiated and brought to market. “Researchers described it as a passion project,” Greenfield says. “This commitment fueled the Firefly team during nights and weekends ahead of the launch, inspired quick adjustments based on feedback, and continues to propel the expansion of Firefly’s footprint across every corner of our ecosystem.”

Just as Neural Filters introduced an intuitive new framework for photo editing, including simple sliders that can increase or reduce a person’s smile, for example, Firefly generative AI combines once-unimaginable computing power with natural controls. “There’s no doubt that this technology can positively change how people create—as long as it’s thoughtfully and ethically designed to empower rather than replace humans, meeting real-world customer needs and workflow requirements,” says Greenfield.

Adds the CTO, “Our designers—most of them talented artists themselves—spoke early and often with the creative community, triangulating features and issues that mattered most to creative professionals.”

Rather than deploying the technology first and considering practical issues later, Adobe addressed creative concerns up front, including control, credentialing, and the protection of professionals’ creative styles. The company also concentrated on integrating Firefly deep within current creative workflows, bringing the power of generative AI to users’ fingertips in already familiar platforms.

Embedding Responsibility

Adobe’s AI Ethics principles of accountability, responsibility, and transparency served as a foundation and a North Star for the entire Firefly process. The company designed the first model to be commercially safe, training it on images it had the right to use, including high-quality Adobe Stock content, openly licensed content, and public domain content where copyright had expired. The team also worked for months to tackle known ethical challenges, addressing industry-scale AI issues with important innovations, including developing several new ML models to reduce bias and supporting prompts in more than 100 languages.

And because generative AI continues to evolve, Adobe also instituted a rigorous, multifaceted feedback program that allows it to keep tuning and improving its models. “These guardrails reduce the chance Firefly will generate content that infringes on someone else’s copyright or brand as well as other harmful outputs, helping customers feel more confident in using it in their work,” Greenfield says.

In embedding Firefly into the applications that creative professionals use every day, Adobe aims to help them reduce and even eliminate busywork, ideate more quickly, expand their ability to create across more techniques and surfaces, and ultimately have more focus and time to create better work. It also applies Firefly in ways that make its software easier and more intuitive to use for people of all backgrounds and skill levels. Greenfield expects Adobe to bring new AI-powered features to its Creative Cloud, Document Cloud, and Experience Cloud applications. “Firefly’s journey has only just begun,” he says.

Advice for CTOs

Greenfield offers some broad suggestions for technologists looking to bring the power of generative AI to their businesses.

First, leverage diverse perspectives and subject matter expertise. “Firefly’s early success is the result of many different teams sharing knowledge and ideas and working collaboratively with our community,” he says. “This diversity of perspective allowed us to create a model that’s built to minimize or eliminate harmful stereotypes as well as providing a great overall experience for our customers.”

Second, stay centered on your principles. Greenfield attributes Firefly’s success to Adobe consciously designing a model that would be safe for customers to use in their business, building both confidence and trust in the company as a partner. “In a dynamic landscape like we’re facing with generative AI, it’s also important to be able and willing to evolve responsibly, aligning with new information and new realities,” he says.

This leads to his final recommendation to his fellow CTOs: Keep evolving. “Even though Firefly is out in the world, we’re still learning, improving, and expanding it every day,” he says. Adobe used the standalone Firefly website to gather the creative community’s feedback, and it continues to use that knowledge to improve the models. As the company brings Firefly capabilities into its applications, it will have new opportunities to better understand and innovate.

All of this depends on keeping users’ goals and experiences squarely in sight. “Our real success will be measured by how much value we deliver to our customers,” Greenfield says. “Because ultimately generative AI—like all technologies—is a tool to help amplify their talents, their objectives, and their successes.”

If you’d like to try Firefly for yourself, visit Firefly.adobe.com.


This story was written by the Adobe Communications team, and edited by WIRED Brand Lab.