AI Era Survival Guide Part 8: Survival Strategies for Designers in the Age of Midjourney and Figma AI
By Youngju Kim (@fjvbn20031)
When Midjourney and Stable Diffusion launched in the fall of 2022, the design community was shaken. Watching high-quality images emerge from a few lines of text, many designers asked, "What does this mean for us?"
More than three years have passed since then. AI image generation tools have grown even more powerful, AI has arrived inside Figma, and AI features have been integrated across the entire Adobe product suite. And in practice, some design tasks are being automated.
But something interesting happened. Demand for designers didn't disappear — what designers are asked to do has changed. And the new work is, in many ways, more interesting and more valuable than what came before.
This article is an honest guide for designers who are feeling uncertain.
1. The Current State of AI Image Generation Tools
First, you need to understand how rapidly AI image generation tools have evolved. Knowing each tool's characteristics lets you use them strategically rather than fear them.
Midjourney: The King of Ideation
Midjourney remains unrivaled for producing beautiful images. It shines especially in art direction, mood board creation, and concept exploration.
# Example Midjourney prompt (for conceptual understanding)
# Generating a brand identity mood board:
# "minimalist Korean tech startup branding,
# clean typography, soft gradient, trust and innovation,
# --ar 16:9 --style raw --v 6.1"
Key characteristics of Midjourney:
- Exceptional artistic sensibility
- Style consistency, once a notable weakness, improved markedly from v6 onward with the character reference feature
- Limited for precise text placement or accurate UI reproduction
DALL-E 3: The Best at Following Prompts
OpenAI's DALL-E 3 follows prompts most accurately: given complex descriptions, it reflects the intent most faithfully. Its integration into ChatGPT, which lets you refine images conversationally, is another major advantage.
In particular, "placing accurate text within an image" is where DALL-E 3 excels. Previous-generation AIs scrambled text, but now it's considerably more accurate.
Stable Diffusion: The Ultimate in Control
Stable Diffusion stands apart for being open source. Extensions like ControlNet allow fine-grained control over pose, composition, depth, and more.
# Concept of precision control with ControlNet:
# - Pose Control: specify a person's pose as desired
# - Depth Control: maintain spatial depth and perspective
# - Edge Detection: render high-quality output from a sketch
# - This enables accurate reproduction of "the composition you want"
The limitation of Stable Diffusion is its learning curve. Using it well requires understanding models, LoRAs, samplers, and more. But that also means a high degree of freedom.
Adobe Firefly: Guaranteed Commercial Safety
Adobe Firefly's differentiator is one thing: fully cleared commercial licensing. Its training data consists solely of Adobe Stock images and public domain content, making it safe for commercial use without copyright disputes.
If legal safety matters for corporate client work, advertising, or product package design, Firefly is the best option.
Practical Use Cases for Each Tool
| Situation | Recommended Tool |
|---|---|
| Initial concept ideation | Midjourney |
| Client presentations | Midjourney + DALL-E 3 |
| Precise compositional control | Stable Diffusion + ControlNet |
| Commercial advertising assets | Adobe Firefly |
| Conversational iteration | ChatGPT + DALL-E 3 |
2. What Figma AI and Adobe AI Are Automating
Key Features of Figma AI
AI features integrated into Figma since 2025 are transforming design workflows.
Auto Layout + AI Suggestions
Auto Layout was already powerful; now AI can suggest layout rules automatically, reading "how these components should be arranged" and proposing rules accordingly.
Make Designs (Text-to-UI Generation)
Type something like "dark mode dashboard, sidebar included, data table in the main area" and it generates a rough wireframe. The output isn't highly polished, but it's useful as a starting point for sketching.
Rename Layers
Automatically reorganizes dozens of layers into meaningful names. A tedious but important task that AI handles for you.
Translation Automation
Automatically translates text layers when designing multilingual services. Time spent on manual translation work has been significantly reduced.
Adobe AI (Generative Fill, Express AI, etc.)
Adobe Photoshop's Generative Fill is already used daily by many designers.
- Image background extension (Generative Expand)
- Object removal and automatic background fill
- Partial edits (Inpainting)
- Texture generation
Adobe Illustrator's AI features continue to expand as well, including vector image generation, pattern automation, and font matching.
Time Saved with Automation vs. Time Still Needed
Let me be honest. The following tasks have become much faster with AI:
- Basic asset generation (icon sets, background images, mockup images)
- Generating dozens of screens based on a design system
- Retouching old images and unifying styles
- Testing multilingual layouts
Meanwhile, tasks that still take significant time:
- Defining the uniqueness of a brand identity
- Designing complex user flows
- Designing with accessibility and inclusivity in mind
- Defining subtle interaction details
3. Design Competencies AI Cannot Replace
Now for the core of the matter. No matter how capable AI becomes, there are design competencies it cannot replace. And the value of those competencies is actually increasing.
User Research and Genuine Empathy
AI can analyze existing data. But discovering needs that don't yet exist is a different matter.
The moment in a user interview when you realize "oh, that's actually how they use it"; the ability to identify the real problem from what users don't say; discovering latent needs through observation — these remain core competencies of a human designer.
More importantly, AI cannot "empathize." When a designer genuinely empathizes with a user's pain, and that emotion is woven into design decisions, users feel it. The sense that "this product understands me" comes from empathy, not data analysis.
Brand Strategy and Direction
"What color should our brand use?" is a surface-level question. The real questions are: "What emotions should our brand evoke? What values should it convey? Will this direction still be valid ten years from now?"
AI can say "in this industry, these colors are typically used." But the strategic courage to say "we need to break convention to differentiate from our competitors" is something AI cannot produce.
Naver's green, Kakao's yellow, Toss's blue — these colors became brand assets not because data recommended them, but because of deep understanding of what the brand values and its emotional connection with users.
Complex UX Architecture
Designing not just a single screen, but a user journey where dozens of screens are organically connected, is something AI does poorly.
# Example of complex UX architecture: loan application flow for a financial app
#
# Branching by user state:
# - New user vs. existing user
# - Different flows per credit score range
# - Progress state based on document submission status
# - Handling bank system response delays
# - Recovery when user abandons mid-flow and returns
# - Accessibility requirements (visually impaired, elderly users)
# - Mobile vs. desktop differentiation
#
# Designing all these cases with consistency
# requires both systems thinking and domain knowledge simultaneously.
Transitions between each state, exception handling, error states, empty states, loading states — designing all of these consistently remains the domain of an experienced UX designer.
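The branching described above can be made explicit in code. Here is a minimal sketch, in TypeScript, of modeling flow states as a discriminated union so that every transition and exception path must be handled somewhere. All type names, state names, and thresholds are hypothetical illustrations, not a real product's API:

```typescript
// Each branch of the hypothetical loan flow is a case of one union type.
type LoanFlowState =
  | { step: "identify"; isExistingUser: boolean }
  | { step: "creditCheck"; creditBand: "low" | "mid" | "high" }
  | { step: "documents"; submitted: number; required: number }
  | { step: "bankPending"; waitedMs: number }
  | { step: "abandoned"; lastStep: string };

// Decide which screen to show. Exception paths (bank delays,
// mid-flow abandonment) are ordinary cases, not afterthoughts.
function screenFor(state: LoanFlowState): string {
  switch (state.step) {
    case "identify":
      return state.isExistingUser ? "quick-verify" : "full-onboarding";
    case "creditCheck":
      return state.creditBand === "low" ? "alternative-offers" : "loan-terms";
    case "documents":
      return state.submitted < state.required ? "upload-checklist" : "review";
    case "bankPending":
      // Reassure the user once the bank response is noticeably slow.
      return state.waitedMs > 5000 ? "delay-notice" : "spinner";
    case "abandoned":
      return "resume-prompt"; // recovery when the user comes back
  }
}
```

The point is not the code itself but the discipline it encodes: a designer who enumerates states this exhaustively has, in effect, already done the systems thinking the section describes.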
Storytelling and Persuasion
Good design contains a story: how a single landing page turns a visitor into a customer, what emotions it evokes along that journey, what doubts it resolves, and at what moment it prompts action.
This storytelling ability is a composite of copywriting, visual design, interaction design, and psychology. AI can support each element to some degree, but crafting an integrated story is still the work of a human designer.
4. Using AI as a Powerful Design Tool
Accelerating Ideation with Midjourney
Midjourney is powerful in the early stages of brand concepting and when aligning on direction with clients.
A practical mood board workflow:
# Step 1: Extract keywords
# Extract core emotional keywords from the client brief
# e.g., "trust, innovation, warmth, Korean sensibility, future-oriented"
# Step 2: Experiment with prompt combinations
# "trustworthy Korean fintech brand identity,
# warm yet professional, young professionals target,
# soft blue and warm beige palette,
# --style raw --ar 16:9"
# Step 3: Explore multiple directions at once
# Visualize 5-10 different mood directions in 30 minutes
# Step 4: Direction discussion with client
# Much more efficient discussion with concrete visual materials
In the past, creating a single mood board took half a day. Now you can show a client five directions in under 30 minutes.
Accelerating Mockup Creation with AI
AI saves enormous amounts of time in product photography, app screenshots, and marketing material production.
Practical use of Photoshop Generative Fill:
- Background replacement for product photos (reducing set production costs)
- Automatic changes to backgrounds for different seasons and times of day
- Clothing wear simulation without models
- Restoring damaged brand materials
App screen mockup creation:
The process of applying design file screenshots to actual device mockups has also been accelerated by AI. Plugins are available that automatically apply dozens of screens to various device frames.
The Code-Literate Designer: Figma Dev Mode
A newly important competency for designers in the AI era is the ability to collaborate with developers.
Figma Dev Mode offers functionality to automatically convert designs into code.
/* Example CSS auto-generated by Figma Dev Mode */
.button-primary {
display: flex;
flex-direction: row;
justify-content: center;
align-items: center;
padding: 12px 24px;
gap: 8px;
background: #3b82f6;
border-radius: 8px;
font-family: 'Pretendard', sans-serif;
font-weight: 600;
font-size: 16px;
color: #ffffff;
}
Designers don't need to understand this code perfectly. But knowing:
- What `border-radius` means
- What `flex` means
- The difference between pixels and rem
...makes collaboration with developers significantly smoother. It enables far more productive conversations when trying to understand "why did this turn out differently?"
Automating Repetitive Work with AI
Designer work involves more repetitive tasks than you might think. Using AI and plugins can significantly reduce that time.
Figma plugin usage:
- Content Reel: automatically fills in realistic dummy data
- Rename It: bulk-renames layers
- Autoflow: automatically generates flow arrows
- Stark: automated accessibility checks (color contrast, vision impairment simulation)
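To make the bulk-rename idea concrete, here is a sketch of the kind of pure function a Rename It-style plugin applies. This is illustration only, with a hypothetical naming pattern; a real Figma plugin would set each node's `name` through the Plugin API rather than return strings:

```typescript
// Map messy layer names ("Rectangle 12", "Frame 3", ...) to a
// systematic pattern like "card/01", "card/02", ...
function bulkRename(layers: string[], prefix: string): string[] {
  return layers.map((_, i) => {
    const index = String(i + 1).padStart(2, "0"); // 01, 02, ...
    return `${prefix}/${index}`;
  });
}

// bulkRename(["Rectangle 12", "Frame 3"], "card")
// -> ["card/01", "card/02"]
```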
5. The Unique Nature of AI Product Design: AI UX
Designing the UX for products that include AI features requires special considerations beyond general UX. This is the fastest-growing area of specialization for designers in 2026.
Core Principles of AI UX
1. Expressing Uncertainty
AI doesn't always provide definitive answers. There is uncertainty, like "93% probability that this photo contains a cat." How do you visually represent that uncertainty?
# Patterns for expressing AI uncertainty:
# - Confidence bar: express confidence level as a visual progress bar
# - Text labels: "AI-generated", "AI recommended", "High confidence"
# - Progressive disclosure: core result -> detailed explanation -> source data
# - Feedback request: "Was this result helpful? Thumbs up / Thumbs down"
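The confidence-bar and text-label patterns above can be sketched as one small function that maps a raw model score to what the UI shows. The thresholds and wording here are illustrative assumptions, not values from any real product:

```typescript
// Turn a model confidence score (0..1) into a display label and a
// bar width. Thresholds (0.9, 0.6) are hypothetical design choices.
function confidenceDisplay(score: number): { label: string; barPercent: number } {
  const barPercent = Math.round(score * 100);
  if (score >= 0.9) return { label: "High confidence", barPercent };
  if (score >= 0.6) return { label: "Likely", barPercent };
  return { label: "Low confidence — please verify", barPercent };
}
```

The design decision worth noticing: the same underlying number produces both a precise visual (the bar) and a softer verbal framing (the label), so users who ignore one channel still get the other.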
2. Graceful Handling of AI Failures
UX design for situations where AI is wrong or cannot answer.
- Empty State: when AI can't make a recommendation due to insufficient training data
- Error State: when the AI API fails to respond
- Low Confidence State: when AI can only guess
- Fallback State: transitioning to manual functionality instead of AI
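The failure states above can be modeled the same way as any other UI state, so the interface is forced to render something sensible for each. A minimal sketch, with hypothetical names and copy (low confidence is folded into the success case here as a variant of "ok"):

```typescript
// Every AI outcome the UI must handle, as one discriminated union.
type AiResult =
  | { kind: "ok"; answer: string; confidence: number }
  | { kind: "empty" }     // not enough data to recommend anything
  | { kind: "error" }     // the AI API failed to respond
  | { kind: "fallback" }; // hand control back to manual features

function messageFor(result: AiResult): string {
  switch (result.kind) {
    case "ok":
      // Low-confidence answers are shown, but framed as a guess.
      return result.confidence < 0.6
        ? `Best guess (low confidence): ${result.answer}`
        : result.answer;
    case "empty":
      return "Not enough data for a recommendation yet.";
    case "error":
      return "The AI service is unavailable. Please try again shortly.";
    case "fallback":
      return "Switched to manual mode.";
  }
}
```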
3. The UX of Streaming Responses
Interfaces where text is generated in real time, like ChatGPT. This "typing effect" is not just a visual effect. It reduces user cognitive load and makes waiting feel more natural. These subtle interaction design choices significantly change the user experience.
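Mechanically, the typing effect is just incremental rendering of a stream. Here is a minimal sketch: an async generator stands in for the network stream (in a real product the chunks would come from an SSE or fetch response body), and a render callback is invoked with each partial result:

```typescript
// Fake stream: yields the text in small chunks, like tokens arriving.
async function* fakeStream(text: string, chunkSize = 4): AsyncGenerator<string> {
  for (let i = 0; i < text.length; i += chunkSize) {
    yield text.slice(i, i + chunkSize);
  }
}

// Accumulate chunks and re-render the partial text after each one.
async function renderStreaming(
  stream: AsyncGenerator<string>,
  onUpdate: (partial: string) => void
): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    full += chunk;
    onUpdate(full); // the UI repaints here, producing the typing effect
  }
  return full;
}
```

Because the user sees progress from the first chunk, perceived latency drops even though total generation time is unchanged, which is the real UX payoff of the pattern.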
4. Transparency and Explainability
Should you show users "why this recommendation was made"? If so, how should you express it?
Netflix's "Because you watched [movie]" label is a good example of AI UX transparency. It's simple but makes users understand and trust the AI's logic.
Basic AI Concepts Designers Need to Know
You don't need to understand them fully, but you should know the following concepts:
- Hallucination: The phenomenon where an LLM fabricates facts. UX design is needed to prevent this (e.g., displaying information sources).
- Context window: The limit on the amount of text an LLM can process at once. How do you handle the UX for an AI "forgetting" earlier parts of a long conversation?
- Temperature: The level of creativity/randomness in AI responses. The UX for a creative AI differs from that of an accurate AI.
- Prompt injection: Attempts by malicious users to manipulate an AI system. Security-conscious UX design is required.
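The context-window concept above has a direct UX consequence: when a conversation outgrows the window, something must be dropped, and the interface should be able to say so. A minimal sketch of a sliding-window trim, with a hypothetical message shape and character counts standing in for real tokenization:

```typescript
interface ChatMessage { role: "user" | "assistant"; text: string }

// Keep only the most recent messages that fit the budget, and report
// how many older ones were dropped so the UI can show a notice like
// "earlier messages are no longer visible to the AI".
function trimToWindow(
  history: ChatMessage[],
  budgetChars: number
): { kept: ChatMessage[]; droppedCount: number } {
  const kept: ChatMessage[] = [];
  let used = 0;
  // Walk backwards so the newest messages survive.
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = history[i].text.length;
    if (used + cost > budgetChars) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return { kept, droppedCount: history.length - kept.length };
}
```

A real system would count tokens with the model's tokenizer and often summarize dropped turns instead of discarding them, but the UX question is the same: how do you tell the user what the AI can no longer "remember"?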
6. Positioning Strategies for Designers in the AI Era
Positioning Option 1: AI UX Specialist
A designer who specializes in designing the UX for AI products. This is currently one of the most in-demand directions.
Required competencies:
- Understanding of AI system characteristics (hallucination, uncertainty, etc.)
- Understanding of AI product metrics (user satisfaction, AI trust level, etc.)
- Designing prompt interfaces
- Designing AI agent interfaces
Positioning Option 2: Brand Identity Specialist
Paradoxically, the more AI mass-produces images, the more valuable a truly unique brand identity becomes. When visuals that look AI-generated are everywhere, truly human and distinctive brands stand out.
Designers taking this direction:
- Use AI tools for ideation
- But arrive at the final identity from deep brand strategy
- Develop a distinctive personal style
Positioning Option 3: Full-Stack Designer
A designer who understands code, uses AI tools, and can discuss product strategy. Demand is very high among small teams and startups.
# Example tech stack for a full-stack designer:
# - Figma (UI/UX design)
# - Framer or Webflow (design to website)
# - HTML/CSS basics (developer collaboration)
# - Midjourney/Firefly (asset generation)
# - ChatGPT/Claude (copywriting assistance)
# - Notion (documentation and UX writing)
Such a designer can single-handedly handle "design + some frontend + AI tool usage" at a solo-founder venture or small startup.
Positioning Option 4: AI Tool Evangelist
The role of leading AI tool adoption within the design team. Many design teams still haven't systematically adopted AI tools.
- Designing the team's AI workflow
- Experimenting with and evaluating AI tools
- Internal training and guideline creation
- Documenting AI usage best practices
7. Portfolio: Showcasing AI Usage + Human-Centered Design
Core Principles for a Design Portfolio in the AI Era
Simply showing "pretty screens" is no longer enough, because AI can mass-produce attractive images.
A strong portfolio shows the thinking process behind "why was it designed this way."
Case Study Structure
Organize each project with the following structure:
# Case study structure:
#
# 1. Problem Definition (Problem)
# - Which user's pain point was solved?
# - What was the business objective?
#
# 2. Research and Discovery (Research and Insights)
# - What method was used to understand users?
# - What were the key insights?
#
# 3. Design Process (Design Process)
# - What ideas were explored? (including how AI tools were used)
# - What decisions were made and why?
# - What alternatives were explored and why weren't they chosen?
#
# 4. Final Solution (Solution)
# - The completed design
# - How were AI tools used?
#
# 5. Impact (Impact)
# - What results occurred after launch?
# - What was the user response?
Being Transparent About AI Usage
Don't hide the fact that you used AI tools. It's actually a strength.
Example of a good portfolio story:
"In the early concept stage, I explored 15 mood directions with Midjourney. This allowed the client and me to align on direction in 30 minutes, instead of a 3-hour meeting. Based on the chosen direction, I then designed the actual brand assets directly. AI-generated images were used only as a source of inspiration; all final assets were produced by hand."
This kind of story shows "I know how to use AI strategically and responsibly."
Including AI UX Projects
If possible, include a case of UX design for a product with an AI feature in your portfolio.
- AI chatbot interface design
- UX for AI recommendation systems
- Trust indicator design for AI-generated content
- AI agent dashboards
If you don't have such a case, a hypothetical project — "how I would redesign this existing AI product" — is also great portfolio material.
8. A 12-Month Growth Roadmap for Designers in the AI Era
Months 1-3: AI Tool Mastery
Practice goals:
- Intensive Midjourney practice: 30 minutes daily, experimenting with various style prompts
- Try every Figma AI feature
- Produce one complete marketing piece with Adobe Firefly
Learning goals:
- Understand copyright issues in AI image generation
- Collect and analyze 20 AI UX patterns (from ChatGPT, Perplexity, Notion AI, etc.)
Months 4-6: Deepening AI UX Understanding
Practice goals:
- Complete the UX design for one hypothetical app with an AI feature
- Produce 5 different concepts for expressing AI uncertainty
Learning goals:
- Read 20 articles on "Design for AI" (Nielsen Norman Group's AI UX guidelines)
- Understand basic LLM concepts (2-3 hours of investment)
Months 7-9: Specialization and Portfolio Assembly
- Complete one representative project in your chosen positioning direction
- Publish 3 case studies on your portfolio site
- Present to a design community (Pagchi-won, UXKL, UX Seoul, etc.)
Months 10-12: Career Transition or Position Upgrade
- Clearly define your target role
- Strengthen networking
- Execute your job change or internal positioning strategy
Closing Thoughts: The Essence of Designers in the AI Era
Paradoxically, as AI mass-produces images, the importance of the question "why does this design have to be this way?" has grown.
AI has learned from millions of design patterns. But answering "the emotions this brand's users feel," "why this interaction feels natural in this context," "what this color means in this culture" — these questions are still the domain of a human designer.
Designers are not people who make beautiful things. They are people who solve human problems through visual experience. The more AI handles the visual execution, the more important the designer's role becomes in giving direction and meaning to the results.
Don't be afraid of AI tools. They are extensions of your hands and eyes. What matters is your judgment, your empathy, and your creativity in deciding what to create with those tools.
The AI Era Survival Guide series continues. The next installment takes a comprehensive look at job search and career change strategies in the AI era.