Creatives on AI: Carly Ayres
Ayo Fagbemi kicks off our Creatives on AI series with a conversation with Carly Ayres on how AI is reshaping content, strategy, and creative workflows.
Written by Ayo Fagbemi
Published on Jun 5, 2025
6 min. read

Welcome to Creatives on AI, our series of conversations with people across the industry exploring the nuance behind AI’s role in our work. We’re shifting our focus past the polarizing one-liners and digging into how AI may help, hurt, and fundamentally reshape our creative worlds. While AI isn't systematically replacing creative roles, it sure is changing things (and fast!). If you want to understand how people are navigating this seismic shift, stay tuned for new perspectives each week this month.

Ayo Fagbemi is kicking off the conversation with Carly Ayres—Program Director at AIR (air.collabfund.com), and previously of Figma, Google, and HAWRAF.

I’m a big fan of your writing, and how you think through the nuance of AI. How are you using it?

It’s wild to look back even a few months and see how much my workflow has shifted. I now use ChatGPT daily—often across multiple tabs—whether I’m drafting communications for AIR or issues of my newsletter. Perplexity is usually open in the next window, helping me chase down context or citations. These tools have become as natural to my process as opening a Google Doc.

When we reviewed over 200 applications for AIR’s Cohort One, I used ChatGPT to “chat” with the submissions—extracting links, surfacing themes, summarizing takeaways (and, of course, verifying after). It became this interactive layer between raw input and human decisions.

That’s the pattern I’ve seen emerge across projects: once I define the creative direction—voice, structure, intent—I’ll use AI to fill in the blanks, draft variations, or speed up production. It’s less about delegation and more about leverage.

How else do you think knowledge work will be rearranged due to AI? 

It’s sharpening the question of where our time actually creates value. Prioritization has always mattered, but now that baseline capacity has multiplied, the stakes feel higher. If you can do five times the work, what really deserves your attention?

For me, that’s meant being more ruthless about what needs me. The work that benefits from originality (concept development, creative strategy, narrative framing) gets my full attention. The rest, I try to systematize. If I catch myself manually writing 30 lines of boilerplate copy, I know I’ve made a bad trade.

AI hasn’t changed what makes creative work good. It’s just clarified what’s worth doing yourself, and what isn’t. It’s also made the cost of misallocating time more obvious. What was once tolerable busywork now feels like a leak in the system.

Despite its merits, why do some in our discipline still turn away from using AI?

A lot of it is fear. When someone says “This is bad,” what they often mean is “This can’t do what I do”—and they’re hoping that stays true. That shows up as critique of craft, but really it’s about uncertainty. We ask, “Is this better than me?” when the better question is, “What is this better at than me—and how do I use that?” And, inversely, what am I uniquely good at? What’s worth doubling down on?

There’s also a strange disconnect between leadership and execution. Execs say, “Use more AI!” but there’s no clear reward structure. If you use AI and your output improves, the reward is often...more work. And more scrutiny. I’ve noticed my work gets picked apart more when I disclose I used AI, compared to when I don’t. So people keep their workflows quiet. They experiment in DMs and group chats instead.

Then you add in climate concerns, ethical qualms, internalized hustle culture—it’s a messy cocktail. I don’t blame anyone for holding back. But these tools are evolving quickly, and the longer you wait, the more you fall behind—not in technical skills, but in mental models. You need fluency to engage, but also to push back credibly.

This feels like the first tech shift happening to people vs. with them. How does AI’s rise affect long-term demand for strategists and writers?

*cracks knuckles* Hot take: The content farm is getting automated, and that’s fine. None of us grew up dreaming of cranking out SEO-optimized blog posts.

If your edge is producing high volumes of generic content, you should be thinking about what unique perspective you bring to that work; otherwise, these next few years might be rough. If you bring original thinking, can frame ideas, and synthesize complex inputs—you’re fine. Better than fine. You might finally have space to do the work you’ve always wanted to do.

We’ll likely see a split: on one side, people orchestrating AI to scale content efficiently; on the other, people setting the direction and standards for what gets made. The middle—the implementers without a distinct point of view—will feel the squeeze.

This shift also breaks the traditional learning ladder. Writers and designers used to build fluency through repetition and volume—precisely what AI now handles. So how will new creatives get those skills? If we don’t design new scaffolding, we risk a generation that never got to develop depth.

When younger strategists or writers ask for advice, what should we tell them?

  • Find your fingerprint. What do you see that others don’t? What weird cross-pollinations shape how you think?
  • Get good at asking questions. A well-phrased prompt is half the job. The better the question, the better the outcome. Same with people—the person who asks the best questions often gets the most interesting answers.
  • Share your work. A blog, a newsletter, a thread—it doesn’t matter. What matters is leaving breadcrumbs. Every job I’ve gotten has come through work I’ve made public: HAWRAF’s working docs, my tweets, Good Graf.
  • Become a strong editor. The skill isn’t just in making things—it’s in knowing what to cut, what to keep, when to push, when to stop.
  • And maybe most important: tools will change, but the principles of great storytelling won’t. Don’t confuse mastering software with mastering craft.

What do you think the job spec of a writer and strategist will look like in a couple of years? 

I think we’ll see a shift from doing to directing, from volume to vision. When content becomes infinite, the job is no longer about making more; it’s about deciding what’s worth making, and why. You’ll still write, still shape, still edit. But increasingly, your job will be to set the strategy, guide the tone, define the system. The ability to say, “This is the idea, and here’s how it connects to everything else we’re doing”—that’s what will matter most.

I still like the titles “Content Strategist” and “Content Designer.” They hold up. But the expectations are shifting. Think: “Strong editorial judgment. Systems thinker. Able to collaborate with AI tools. Must have taste.”

Taste is a very interesting one. What’s the role of taste and public persona in the age of AI?

There’s a lot of talk about taste right now. Ruby Justice Thelot has great writing on this. When anyone can generate anything, curation is everything. Your lens becomes your signature. Your filter is the product.

A recognizable voice, a distinct way of seeing, that’s what makes you stand out now. It’s not about who can write the most or design the fastest. It’s about who knows what’s worth our time and attention.

I have a complicated relationship with personal branding. I haven’t had a portfolio site since 2016 and hope I never will again. But I can’t deny that public posting has been central to my career—and, often, my exits. Some people get jobs quietly. I’m just not one of them.

What future skills will give strategists and writers an edge as AI advances?

Discernment. Direction. Synthesis. The edge belongs to those who can spot what’s good, what’s not, and why—and who can turn that into a path forward others can follow.

Cross-domain thinking helps, too. Pulling from literature, science, subcultures, memes. The unexpected mashups are where new ideas live.

Systems thinking will be big. Not just making content, but designing the infrastructure for it. You create more leverage by building a framework than by shipping individual pieces.

And process fluency. You don’t need to be an AI expert, but you should understand how these tools work, where they excel, and where they fall short.

You’re always at the forefront of tech in our industry. What can we learn from early AI adopters in strategy and writing?

  1. Let the work speak for itself. Clients care about results, not your method. I’ve found showing the output before discussing the process works better than leading with “I made this with AI.”
  2. Use AI to think more, not less. The best users aren’t automating thinking, they’re leveraging it to explore more possibilities and pressure-test more ideas.
  3. Prompt iteratively. The first thing it spits out? Never the best. But an ongoing back-and-forth not only pushes the work further, but develops your prompting skills.
  4. Learn where AI is useful, and where it’s not. Use it for the former, focus your energy on the latter. This discernment is becoming a skill itself.
  5. And lastly: We’re so early. Few (if any) know what they’re doing, and all of us are winging it. It’s a great time to be an early adopter.

I found you through HAWRAF—my old ECD at Wieden shared your transparent docs, and they were super inspiring. What would the studio’s approach to AI have been?

HAWRAF closed in 2019, but the best indicator of what we would have done with AI is probably what we’re doing now. We’ve each taken the same spirit in different directions. I’m now the Program Director at AIR, an incubator for design-led consumer AI products, where I’ve been designing a space for others to learn and build—bringing in speakers, hosting events, and creating room for curiosity, collaboration, and experimentation.

After hearing how Pedro was approaching AI, I invited him to join our initial cohort. The work he’s doing there is everything I’d hoped: playful, inventive, and deeply social—building tools that help friends and communities make things together. Watching him pull on a loose thread of an idea, not quite knowing where it’ll lead, felt like slipping back into an old rhythm. That first week, he shared a working prototype and had the entire AIR office cracking up. It was silly, surprising, and delightful.

Neither of us fully knows what we’re building yet—and that’s kind of the point. At HAWRAF, we learned to stay with the ambiguity, to let the process show us what was possible. That approach still holds. For me, it usually starts with talking it out—sharing what I’m working on, testing the edges, looking for what’s weird and worth following. That’s always where the magic happened.

Stay in the loop with The Subtext! Subscribe to our newsletter for the latest articles, exclusive interviews, and writing tips delivered straight to your inbox. Join our community of passionate writers and never miss a beat.