The Handoff Problem: Why Developers Chase Ghost Specs
Every product team has lived through this scenario: a designer spends days perfecting a screen, annotates every margin, exports assets, and hands it off with confidence. The developer starts building, then discovers a missing hover state, an undefined breakpoint, or a color that only exists in the designer's head. Hours of back-and-forth follow. The developer is now chasing a ghost—a spec that was never fully documented. This is the design handoff shotgun: you fire a wide blast of information, hoping enough hits the target, but many details miss entirely.
Why Ghost Specs Emerge
Ghost specs often appear because of three root causes. First, designers assume shared context. A designer who has worked on a project for weeks knows that the error state uses a red border with 2px thickness, but that detail never makes it into the handoff file. The developer, new to the project, sees only the happy path. Second, tools create fragmentation. A Figma file might have 15 frames, but only 5 are fully annotated. Developers pick a frame, start coding, and later discover another frame has the real button behavior. Third, time pressure encourages shortcuts. When a sprint deadline looms, teams skip the final pass, assuming the developer will "figure it out." That assumption is the seed of a ghost.
How the Shotgun Metaphor Works
Think of a shotgun's spread pattern. At close range, it's tight and effective. At long range, pellets scatter. Design handoffs are similar: when a team works closely, with constant communication, the handoff is tight. But in remote or async environments, the handoff distance grows. Pellets (spec details) scatter. Developers miss the center of the target (core functionality) and waste time on fringe details or missing pieces. The 4 Checks framework is designed to tighten that spread, ensuring every spec lands clearly.
This section sets the stage. The rest of the guide will walk through each check in detail, with checklists and examples you can apply immediately. Teams often find that implementing even two of these checks reduces rework by a significant margin—though exact numbers vary, practitioners report noticeable drops in ticket reopenings and clarification questions.
Check #1: The Alignment Check—Ensuring Everyone Sees the Same Target
Alignment is the foundation of any successful handoff. Before a developer writes a single line of code, both designer and developer must agree on what success looks like. The Alignment Check is a structured conversation that confirms three things: the design intent, the technical constraints, and the priority of edge cases. Without this check, developers may implement a design that looks right to them but misses the designer's intent, leading to rework or, worse, a shipped feature that fails user testing.
Conducting the Alignment Check in 15 Minutes
Schedule a 15-minute sync before the handoff. Start by walking through the key screens together. The designer explains the rationale for the layout, the interaction flow, and the states that matter most (loading, empty, error, success). The developer asks clarifying questions: "Is this dropdown custom or native?" "Does this animation have a required timing function?" "Are there any accessibility constraints I should know about?" The goal is not to document everything, but to build a shared mental model. Many teams use a simple checklist: (1) Confirm the core user flow, (2) Identify the three most important states, (3) Discuss the spacing system, (4) Validate component library usage, (5) Note any unresolved design decisions.
A Composite Scenario: The E-Commerce Checkout Failure
Consider a team building an e-commerce checkout flow. The designer created a beautiful multi-step form with animated transitions. The developer, working from static screens, implemented the form as a single page with conditional visibility. When the designer saw the prototype, the animation was missing, and the form felt clunky. A 15-minute alignment check would have revealed that the designer expected step-by-step navigation with fade transitions, while the developer assumed a single-page approach was simpler and faster. Both were technically valid, but they were not aligned. The fix required rebuilding the component, costing two days of rework. The Alignment Check would have surfaced this gap in minutes.
The Alignment Check also prevents "design drift"—where small deviations accumulate across a project. When each developer interprets ambiguous specs slightly differently, the final product becomes inconsistent. By aligning early, you set a baseline that guides every subsequent decision. Teams that skip this check often report higher bug counts and lower developer satisfaction because they feel they are building in the dark. Make this check a mandatory step before any major handoff, especially for complex screens or interactions. It takes little time but pays dividends in clarity and trust.
Check #2: The Completeness Check—Filling in the Missing Pellets
Completeness is about ensuring every state, variation, and interaction is documented before the developer starts coding. The Completeness Check examines the design file for gaps: missing error states, undefined hover effects, unspecified loading indicators, and omitted responsive behaviors. Developers often call these "the hidden 20%"—the part of the design that exists in the designer's mind but not in the file. When these gaps surface during development, they cause delays and force developers to make assumptions that may conflict with the designer's vision.
How to Run a Completeness Audit
Create a checklist of common states for every component in the handoff. For buttons: default, hover, active, disabled, loading, and focus. For text inputs: empty, filled, error, success, disabled, and focus. For cards: default, hover, selected, and loading skeleton. Walk through each screen and annotate every state that exists. If a state is missing, add it or explicitly note that it defaults to the component library's behavior. Many design tools support component variants; use them to enforce completeness. For example, in Figma, create a variant set for each button with all states included. This makes it impossible to forget a state because the variant set requires it.
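The state checklist above can also be enforced programmatically. Here is a minimal sketch, assuming you can export the variant names present in your design file; the `REQUIRED_STATES` table and `findMissingStates` function are illustrative, not part of any tool's API:

```typescript
// Required states per component type, mirroring the checklist above.
// These lists are illustrative; adapt them to your own design system.
const REQUIRED_STATES: Record<string, string[]> = {
  button: ["default", "hover", "active", "disabled", "loading", "focus"],
  input: ["empty", "filled", "error", "success", "disabled", "focus"],
  card: ["default", "hover", "selected", "loading-skeleton"],
};

// Given a component type and the variant names actually documented in
// the design file, return the states that are still missing.
function findMissingStates(type: string, documented: string[]): string[] {
  const required = REQUIRED_STATES[type] ?? [];
  return required.filter((state) => !documented.includes(state));
}

// Example: a button where only the happy-path states were documented.
const gaps = findMissingStates("button", ["default", "hover", "active"]);
console.log(gaps); // the states still missing from the handoff
```

Running a script like this over an exported component list turns the Completeness Check from a manual walkthrough into a repeatable gate.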
Composite Scenario: The Dashboard That Broke at 3 PM
Consider a team building a real-time dashboard for a logistics company. The designer handed off a beautiful main screen with graphs and tables. The developer implemented it perfectly for the desktop view. But the design file had no mobile breakpoint, no tablet layout, and no loading state for the data tables. When the developer tested on a tablet, the graphs overlapped, and the tables overflowed. The team had to pause the sprint to create responsive layouts, adding three days of work. A Completeness Check would have flagged these gaps before development began, allowing the designer to provide mobile mockups and loading states upfront.

The Completeness Check also applies to interactions. If a screen has a swipe gesture, is the behavior documented? If a modal closes on click outside, is that specified? If a drag-and-drop list reorders items, is the animation defined? These details are easy to overlook but critical for implementation. Use a shared spreadsheet or a tool like Zeplin or Avocode to track completeness. Mark each component as "documented" or "needs spec." Do not hand off until all items are documented. This may slow down the initial handoff, but it speeds up development significantly by eliminating the need for mid-sprint clarifications.
Check #3: The Consistency Check—Making the Pellet Pattern Repeatable
Consistency ensures that similar components behave the same way across the entire product. Without it, developers build the same button three different times with three different behaviors, and users experience a disjointed interface. The Consistency Check compares the handoff against the existing design system or component library. It catches mismatches: a primary button that uses a different border radius than the library standard, a card that has a shadow when the library uses a border, or a font size that drifts from the type scale.
How to Perform a Consistency Diff
Start by identifying the design system's core tokens: colors, typography, spacing, shadows, border radii, and iconography. Then, for each screen in the handoff, check every element against these tokens. Use tools like Design System Checker (a Figma plugin) or manually review with a side-by-side comparison. Flag any deviation and decide whether it is intentional (a new pattern for a specific context) or an error. For example, if the library specifies a 4px spacing grid but a form uses 6px, the designer must justify the change. If it's intentional, document it as a new token. If not, correct it.
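A basic version of this diff is easy to automate. The sketch below is an assumption-laden illustration — the token values are invented, and a real check would load tokens from your design system rather than hard-coding them — but it shows the shape of the comparison: flag spacing values that fall off the grid and colors absent from the palette:

```typescript
// Hypothetical core tokens; in practice, load these from the design
// system source of truth instead of hard-coding them here.
const SPACING_GRID = 4; // px
const COLOR_TOKENS = new Set(["#1a73e8", "#ffffff", "#202124"]);

interface Deviation {
  property: string;
  value: string;
  reason: string;
}

// Flag values that deviate from the tokens. Each flagged item still
// needs a human decision: intentional (document it) or error (fix it).
function checkElement(styles: {
  spacing: number[];
  colors: string[];
}): Deviation[] {
  const deviations: Deviation[] = [];
  for (const s of styles.spacing) {
    if (s % SPACING_GRID !== 0) {
      deviations.push({
        property: "spacing",
        value: `${s}px`,
        reason: `not on the ${SPACING_GRID}px grid`,
      });
    }
  }
  for (const c of styles.colors) {
    if (!COLOR_TOKENS.has(c.toLowerCase())) {
      deviations.push({ property: "color", value: c, reason: "not a design token" });
    }
  }
  return deviations;
}
```

The 6px form padding from the example above would surface as a deviation, forcing the "intentional or error?" conversation before handoff rather than mid-sprint.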
Composite Scenario: The App That Looked Like Two Different Products
A mobile app team was building a feature for profile management. The designer created a new settings screen using a custom card component with rounded corners and a drop shadow. The existing library used flat cards with a border. The developer built the new screen as designed, but when users navigated from the main app to the settings screen, the visual shift was jarring. Users complained that the app felt inconsistent. The team had to spend a sprint refactoring the new screen to match the library, including updating all the cards, buttons, and typography. A Consistency Check before handoff would have flagged the deviation early, allowing the designer to either use the library component or create a new one with proper justification and documentation.
Consistency also applies to interaction patterns. If the app uses a bottom sheet for all secondary menus, a new screen should not use a modal unless there is a strong reason. If all form validation errors appear inline below the field, a new form should not use a toast message. These patterns are part of the product's cognitive consistency. Breaking them confuses users and increases support requests. The Consistency Check is not about stifling creativity—it's about ensuring that intentional changes are documented and justified, while unintentional errors are caught early. Teams that enforce consistency see fewer UI bugs and faster onboarding for new developers.
Check #4: The Feasibility Check—Making Sure the Pellets Can Fly
Feasibility is the final check, and it often reveals the most painful surprises. The Feasibility Check evaluates whether the design can be built within the project's technical constraints: performance targets, platform capabilities, timeline, and existing codebase architecture. A design may look beautiful but require heavy animations that cause jank on mid-range devices, or rely on a gesture that the platform's standard library does not support. Without this check, developers may start building, hit a technical wall, and have to go back to the designer for a redesign—wasting days of effort.
How to Conduct a Feasibility Review
Schedule a 30-minute meeting with the lead developer or technical architect. Walk through the most complex screens and interactions. Ask: "Is this animation achievable with our current stack?" "How many API calls does this screen trigger?" "Are there any components that need to be built from scratch?" "Are there performance concerns with this layout?" Document any item flagged as high risk. For each risk, decide: (1) proceed with a prototype to test, (2) simplify the design, or (3) delay the feature. The goal is not to kill creativity but to identify trade-offs early. For example, a designer may want a parallax scrolling header, but the developer knows it will cause performance issues on older browsers. The team can agree on a simpler scroll effect that still looks good.
Composite Scenario: The Animation That Crashed the Browser
A team building a marketing landing page wanted a complex particle animation in the hero section. The designer created it in After Effects, and it looked stunning. The developer spent three days implementing it with CSS and JavaScript, but in testing on a budget Android phone, the animation dropped to 15 frames per second and the page became unusable. The team had to strip the animation entirely and replace it with a static background image, losing the visual impact. A Feasibility Check before development would have flagged the performance risk. The team could have tested a simple prototype first, or opted for a CSS animation with lower complexity that still created visual interest without the performance cost.
The Feasibility Check also covers data requirements. If a screen shows real-time analytics, does the backend API support that polling frequency? If a form requires autocomplete, does the search index exist? These are not design failures—they are alignment failures between design and engineering. By catching them early, you avoid the frustration of a developer who is told to "make it work" and has to negotiate scope reductions mid-sprint. The Feasibility Check builds trust because it shows designers respect technical constraints, and developers respect design intent. When both sides feel heard, the handoff becomes collaborative rather than adversarial.
Putting the 4 Checks into Practice: A Step-by-Step Workflow
Knowing the four checks is one thing; integrating them into a real workflow is another. This section provides a step-by-step guide for embedding the checks into your team's sprint cycle, from design finalization to development kickoff. The goal is to make the checks a lightweight, repeatable habit, not a bureaucratic burden. Many teams start by applying the checks to complex features first, then expand to all handoffs as the process becomes natural.
Step 1: Design Finalization (Day Before Handoff)
The designer completes the screens and runs the Alignment Check internally. They review the design against the product requirements document (PRD) and note any assumptions. They then run the Completeness Check using a state checklist, ensuring every component has all states documented. Finally, they run the Consistency Check by comparing against the design system tokens. Any deviations are flagged and either resolved or documented with a rationale. The designer exports the handoff package: Figma files, assets, and a summary document. This step should take about 30 minutes for a standard feature.
Step 2: Pre-Handoff Sync (30 Minutes)
The designer and developer meet for the Alignment Check. They walk through the key screens, discuss the intent, and review the flagged deviations. The developer asks technical questions and notes any feasibility concerns. If the Feasibility Check reveals a high-risk item, the team decides on a path forward: prototype, simplify, or defer. The meeting ends with a shared understanding of what will be built and any open questions that need resolution within 24 hours.
Step 3: Handoff and Documentation (Same Day)
The designer uploads the handoff package to the team's handoff tool (Figma Dev Mode, Zeplin, or similar). They include a README-style summary: (1) list of screens, (2) expected states, (3) interaction descriptions, (4) design system deviations, (5) known limitations or assumptions. The developer receives a notification and can start reviewing. The team uses a shared checklist in a project management tool (e.g., Notion, Jira) to track that each check was completed.
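The shared checklist described above can double as an automated gate in your tooling. This is a sketch under stated assumptions — the `Handoff` type and `outstandingChecks` function are invented for illustration, not an API of Notion, Jira, or any handoff tool — showing how a team might refuse to mark a ticket "ready for dev" until all four checks are recorded:

```typescript
type CheckName = "alignment" | "completeness" | "consistency" | "feasibility";

// Hypothetical record of a handoff and which checks have been completed.
interface Handoff {
  feature: string;
  checks: Record<CheckName, boolean>;
}

// A handoff is ready for development only when every check is done.
// Returns the names of any checks still outstanding.
function outstandingChecks(handoff: Handoff): CheckName[] {
  return (Object.keys(handoff.checks) as CheckName[]).filter(
    (name) => !handoff.checks[name]
  );
}

const checkout: Handoff = {
  feature: "checkout-flow",
  checks: { alignment: true, completeness: true, consistency: false, feasibility: true },
};
console.log(outstandingChecks(checkout)); // checks still blocking this handoff
```

Wiring a rule like this into the project board makes the checks visible to the whole team instead of living in one person's head.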
Step 4: Development Kickoff and Verification
The developer begins implementation, but after completing the first screen, they do a quick verification pass against the handoff. They check for any missing states or inconsistencies that were not caught. If they find a gap, they flag it immediately rather than assuming. This creates a feedback loop that improves the handoff process over time. After the feature is built, the team conducts a brief retrospective to discuss what went well and what needs improvement in the handoff. Teams that follow this workflow consistently report fewer clarification questions and a higher percentage of "first-time-right" implementations.
Comparing Handoff Tools: Which One Supports the 4 Checks Best?
Not all handoff tools are equal when it comes to supporting the 4 Checks framework. This section compares three popular tools—Figma Dev Mode, Zeplin, and Avocode—across criteria like state documentation, design system integration, and annotation capabilities. The goal is to help your team choose the tool that minimizes ghost specs based on your specific workflow. Remember, no tool replaces the human checks, but the right tool can make the checks easier to enforce.
| Feature | Figma Dev Mode | Zeplin | Avocode |
|---|---|---|---|
| State documentation | Uses component variants; can show all states natively | Supports notes per layer; no built-in state management | Manual annotation with color coding; no automatic state detection |
| Design system integration | Strong; can link to shared component library and tokens | Moderate; can import design tokens but requires manual setup | Moderate; supports style guides but less automated token syncing |
| Annotation and comments | Inline comments and Dev Mode annotations with auto-measurement | Section notes and screenshots with developer-focused specs | Layer-specific annotations with version history |
| Responsive preview | Built-in device frames and breakpoint previews | No native responsive preview; relies on uploaded artboards | Supports multiple artboard sizes but no live responsive simulation |
| Version control | Automatic via Figma version history | Version history for projects | Built-in version comparison |
| Best for | Teams using Figma end-to-end; strong design system maturity | Teams needing a lightweight, developer-focused spec export | Teams working with multiple design tools (Sketch, Photoshop, Figma) |
Figma Dev Mode is the strongest for the Alignment and Consistency Checks because its component variant system forces designers to define all states, and its design system integration makes token mismatches visible. Zeplin excels for the Completeness Check because its annotation system is developer-friendly, but it lacks native state management, so designers must manually add notes for each state. Avocode is best for heterogeneous teams that use different design tools, but its lack of automated state detection means the human checks are even more critical. Ultimately, choose the tool that your team will actually use consistently. The best tool is the one that your designers and developers both adopt without friction.
Frequently Asked Questions About the Design Handoff Shotgun
This section addresses common questions that arise when teams first implement the 4 Checks framework. These questions come from real team discussions and reflect the concerns that busy product managers and designers often voice. The answers aim to be practical and honest, acknowledging trade-offs where they exist.
Q: Won't these checks slow down the handoff process?
Initially, yes—especially if your team is not used to structured handoffs. However, the time saved during development far outweighs the upfront investment. For every hour spent on checks, you can save multiple hours of rework, clarification meetings, and bug fixes. Most teams report that after three to four sprints, the checks become second nature and take only 15-20 minutes per handoff. The key is to start small: apply the checks to one feature per sprint, then expand as the team builds the habit.
Q: What if the developer disagrees with a design decision during the Alignment Check?
That is exactly the purpose of the check—to surface disagreements early. Treat the Alignment Check as a negotiation, not a handoff. The designer explains the intent, the developer explains the technical constraints, and together they find a solution that serves the user. For example, if a developer says a custom animation will take too long, the designer can propose a simpler transition that still feels smooth. The goal is not to win an argument but to deliver the best possible product within constraints.
Q: Should we apply the 4 Checks to every handoff, even small ones?
No, use judgment. For a minor bug fix or a single button change, a full Alignment Check is overkill. Create a tiered system: Level 1 (small change: one component, no new states) requires only a Completeness Check. Level 2 (new screen or feature) requires Alignment and Completeness Checks. Level 3 (complex feature with new interactions) requires all four checks. This prevents the checks from becoming a bottleneck while still protecting against ghost specs for high-risk work.
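The tiers described above map cleanly to a lookup that a team could bake into its ticket templates. The level numbers and check names follow the answer; the function itself is an illustrative sketch, not part of any tool:

```typescript
type Check = "alignment" | "completeness" | "consistency" | "feasibility";

// Level 1: small change (one component, no new states) — Completeness only.
// Level 2: new screen or feature — Alignment + Completeness.
// Level 3: complex feature with new interactions — all four checks.
function requiredChecks(level: 1 | 2 | 3): Check[] {
  switch (level) {
    case 1:
      return ["completeness"];
    case 2:
      return ["alignment", "completeness"];
    case 3:
      return ["alignment", "completeness", "consistency", "feasibility"];
  }
}

console.log(requiredChecks(2)); // checks required for a new screen or feature
```

Encoding the tiers this way keeps the judgment call explicit: the only decision left per ticket is which level it is.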
Q: How do we handle handoffs when the designer and developer are in different time zones?
Async Alignment Checks are possible. The designer records a Loom video walking through the key screens and states. The developer watches it and replies with written questions. The Feasibility Check can be done via a shared document where the developer lists risks. The key is to ensure that all questions are answered before development begins, even if it takes 24-48 hours. Avoid the temptation to start coding while waiting for answers—that is how ghost specs proliferate.
Q: What if the design system is incomplete or new?
In that case, the Consistency Check becomes even more important. Treat each handoff as an opportunity to grow the design system. When a component does not match an existing token, decide whether to update the token or create a new one. Document every decision so that future handoffs have a clearer reference. Over time, the design system matures, and the Consistency Check becomes faster because there are fewer deviations to review.
Conclusion: From Shotgun to Rifle
The design handoff shotgun does not have to be the default. By implementing the 4 Checks—Alignment, Completeness, Consistency, and Feasibility—you transform a scattered process into a targeted, predictable one. Developers stop chasing ghost specs because the specs are complete, clear, and achievable. Designers regain confidence that their vision will be implemented faithfully. Product managers see fewer delays and lower rework costs. The transition requires effort, especially in the first few sprints, but the payoff is a team that works faster, with less frustration, and with more trust between disciplines.
Start today by picking one check to implement for your next handoff. Run the Alignment Check for a feature that is about to be developed. See how it changes the conversation. Then add the Completeness Check for the following sprint. Within a month, your team will wonder how they ever worked without these checks. The shotgun becomes a rifle—aimed, precise, and effective.