Google Stitch: From simple prompt to working app UI

Łukasz Grochal
Source: stitch.withgoogle.com

Stitch (stitch.withgoogle.com) is an experimental Google Labs tool that turns text prompts, sketches, or screenshots into ready‑to-edit UI designs and front‑end code for web and mobile apps. It is aimed at designers, developers, product folks, and even non‑technical teams who want to move from idea to interface much faster, without starting from a blank canvas in Figma or a code editor.

You describe what you need in natural language, optionally add an image or wireframe, and Stitch generates multi‑screen layouts, responsive variants, and HTML/CSS you can inspect or export. The tool runs on Google’s Gemini 2.5 models and offers two main modes: a faster Standard mode for quick drafts, and a slower Experimental mode that accepts visual input and pushes for higher fidelity. From there you can iterate conversationally, branch variations, tweak themes, and then send everything directly into Figma with Auto Layout preserved, or hand it off to developers through generated code or integration with Google AI Studio.

Instead of replacing full design suites, Stitch sits earlier in the workflow as an ideation and layout engine that helps teams explore concepts, validate structures, and get “good enough” first passes before doing detailed visual design and interaction work elsewhere. It is currently available for free via Google Labs, with usage limits depending on the chosen mode, which makes it easy for individuals and small teams to experiment without budget friction. In practice it works best when you treat it like a visual collaborator: give it a clear description (for example, a mobile habit tracker, an admin dashboard for beekeeping, or an onboarding flow for a library app), review what it proposes, and then refine the strongest direction with follow‑up prompts or manual edits after exporting.
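To make the hand-off concrete: the code Stitch exports is plain HTML and CSS that developers can drop into a project and refine. The snippet below is a hypothetical sketch of what an exported screen fragment might look like for the habit-tracker example above; the class names, styles, and structure are invented for illustration and are not actual Stitch output.

```html
<!-- Hypothetical sketch of a Stitch-style export: one habit-tracker card.
     Class names and styles are invented for illustration, not real Stitch output. -->
<div class="habit-card">
  <h2 class="habit-title">Morning run</h2>
  <p class="habit-streak">12-day streak</p>
  <button class="habit-check">Mark done</button>
</div>
<style>
  .habit-card   { max-width: 320px; padding: 16px; border-radius: 12px;
                  box-shadow: 0 1px 4px rgba(0, 0, 0, 0.15);
                  font-family: sans-serif; }
  .habit-title  { margin: 0 0 4px; font-size: 1.1rem; }
  .habit-streak { margin: 0 0 12px; color: #5f6368; }
  .habit-check  { padding: 8px 16px; border: none; border-radius: 8px;
                  background: #1a73e8; color: #fff; cursor: pointer; }
</style>
```

Because the output is ordinary markup rather than a proprietary format, a first pass like this can be reviewed in the browser, restyled by hand, or folded into an existing component library.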

There are still trade‑offs. Stitch is not as deep as Figma or Adobe XD when it comes to component systems, design tokens, or prototyping, and complex product design still demands human judgment. Even so, it can dramatically reduce the time needed to get from a rough idea to tangible screens that stakeholders can react to.
