AI is more than a product; it’s a platform that will change how and what we design—and who gets involved.
I’ve always been amazed by the things people have created in Figma. Multiplayer games. Digital cities. And in the case of a precocious middle-schooler named Maddie, an expertly argued case made in FigJam for why her parents should buy her a bunny. But every so often, someone comes along and builds something that truly blows your mind. This was the case with Jordan Singer.
Three years ago, Jordan tweeted a short video of a plugin he built for Figma that leveraged GPT-3 to generate design ideas from a simple prompt. He called it “Designer,” and it set #DesignTwitter ablaze. Two years later, he founded Diagram, which has been recognized as one of the most promising startups building at the intersection of design and AI.
Today, I’m excited to share that Figma has acquired Diagram. We’re thrilled to officially welcome Jordan, Siddarth, Andrew, Marco, and Vincent to the Figma team.
We’ve been following Jordan and the Diagram team’s work closely, because AI is something Figma has been focused on for some time. It has the potential to transform every part of the product development process. Over the past year, we’ve built and grown a team dedicated to machine learning and accelerated investments in early development of Figma’s AI platform. And we’ve done this as members of the Figma community, including Diagram, have used our open API to build nearly 100 AI-powered plugins of their own.
We share our community’s enthusiasm. In this new era of AI, the possibilities are endless, not just for design but across the entire product development process. It will play a central role in how teams work in Figma, helping them get to a first draft faster and then design and build through to great products.
For example, in the discovery phase, we’re exploring how AI might help you generate and synthesize early ideas with a simple prompt. Or perhaps it can help you move faster during the design phase by tapping into your existing designs and design systems, and surfacing component recommendations for you to build upon. And at the development phase, we’re iterating on ways to help developers infer more context, faster, so they can generate better production code.
In short, AI can help us do more—across every part of the product development process—faster. It’s not a feature, but a core capability; more than a product, it’s a platform that can up-level our work to the plane of problem solving—arguably the core pursuit of our craft, and the reason many of us got into design and product building in the first place.
But I get it: It’s hard to look at what AI is capable of these days and not wonder, “If AI can do that, what does it all mean for me?” In moments like this, it can be helpful to look to the past. When we do, we see that design has always evolved with technology. Whether it was the printing press or the smartphone, innovation has never replaced the need for thoughtful design.
Designers are no strangers to change. We’ve adapted to new platforms. We’ve become more collaborative. We’ve learned to work in hybrid work environments. And with each of these fundamental shifts, we adapt, we learn, and still—we design. This, of course, doesn’t mean things won’t be different: They will, and I believe for the better; starting with how we design.
When design systems came along, we all rejoiced at the possibility of spending less time on the little, monotonous things, like corner radii, and more time on the big things, like applying taste and directing concepts. Atomic structures** (pixels) began to coalesce into molecular ones (components), speeding up workflows and elevating our craft.
**Web designer Brad Frost’s 2016 book Atomic Design popularized a design methodology for interfaces based on atoms, molecules, organisms, templates, and pages. During our recent chat, Brad underscored that “increasingly, design is curatorial,” an idea he explores in the post “Design systems in the time of AI.”
I strongly suspect AI will do what design systems did for design and development on an even greater scale. It will allow us to build higher-level and higher-leverage molecular structures. For example, instead of focusing on the design system components that make a login screen—the email field, the password, the pill button—we’ll be imagining new ways to log in that might replace email, phone, or Touch ID.
In short, AI can move us from pixels to patterns, lifting us beyond the digital interfaces we know today toward experiences that are much smoother and more intuitive—and, arguably, more human.
Which brings us to what we design, and how that will change. Somewhat paradoxically, emerging AI technology like ChatGPT is moving us away from the websites and apps that make up the bulk of our current digital experiences back to what we’ve been doing since the dawn of Google: typing in questions and getting answers.
AI can more seamlessly bridge the gap between intention and action. Today, calling an Uber to the airport requires deciding that you need a ride, opening the app, setting your location, evaluating the options, and requesting a car, but the user really just wants to say, “Get me to JFK.” As product builders, we’ll need to ask ourselves: Can AI deliver that same—or perhaps an even better—user experience with fewer clicks and decisions?
One of the most important pieces of the puzzle is how product roles and collaboration might change in the age of AI. At Figma, we believe the personas and workflows of those who do product design and development will shift. More people will become visual creators, and existing designers will be able to create more ambitious experiences than ever before.
One way to think about it is the total space of design having a ceiling and a floor: The ceiling is how good a designer can be at designing, which is constrained by the available tooling; the floor is the minimum skill required for someone to participate in design. AI will lift this ceiling, leading to more creative outputs made possible by more powerful tools; it will also lower the floor, making it easier for anyone to design and visually collaborate.
The end result, in effect, will be a shared space—and bringing people into this shared design space means we’re truly working in the same place. It’s a trajectory that feels natural to Figma, where real-time collaboration has blurred the boundaries between roles and building products has increasingly become a shared responsibility.
Like the web and smartphone before it, the rise of AI represents a platform shift that’s reshaping how we build, what we build, and who builds it. It’s still early days. There are a lot of unknowns, and a lot left to build and figure out. It’s exciting and fascinating. It can also be a bit…scary.
I invite all of us to show up with curiosity, empathy, and creativity to help guide this technology—rather than letting it guide us. Ask questions, explore new ideas, and keep in mind that the core craft of design and product building remains the same: to solve problems.
And as a reminder, we want to design and build in the open with you. We’d love your feedback. What do you want to see? What would help? We want to hear it.