You've finished a design. You're proud of it, but you need to know if it actually works. So you ask for feedback.
The problem? Most feedback isn't useful. You get "looks good!" or "I like it" or "maybe change the color?"—responses that don't help you make better decisions.
Here's how to get feedback that actually helps.
Useful Feedback vs. Nice Feedback
Nice feedback makes you feel good but doesn't help you improve:
- "Looks great!"
- "I like it!"
- "This is really nice."
Useful feedback helps you make decisions:
- "I understood the main action immediately."
- "I wasn't sure what to click first."
- "This feels trustworthy because of the social proof."
The difference? Useful feedback is specific, actionable, and based on behavior—not just preference.
Who to Ask (And When)
Users (Best for: Real-world validation)
Ask users when you need to know if your design actually works for your target audience. Users don't know you, so they'll be honest. They also represent your actual audience.
When to use: Testing landing pages, signup flows, pricing pages, or any design that needs to convert.
How to find them: Share your design in relevant communities, use feedback platforms, or run A/B tests with real voters.
Peers (Best for: Technical critique)
Ask fellow designers when you need feedback on visual hierarchy, spacing, typography, or design patterns. Peers understand design principles and can spot issues you might miss.
When to use: Polishing visual details, checking accessibility, or validating design system consistency.
How to find them: Design communities, Slack groups, or design critique sessions.
Friends (Best for: Emotional support, not validation)
Friends are great for encouragement, but they're biased. They want to support you, so they'll find something nice to say even if the design has problems.
When to use: When you need a confidence boost, not when you need honest validation.
Avoid using friends for: Conversion-critical decisions, headline tests, or any design that needs unbiased feedback.
What to Ask (10 Copy-Pastable Questions)
The questions you ask determine the quality of feedback you get. Here are 10 questions that lead to useful responses:
1. "What's the main action you'd take on this page?" (Tests clarity of purpose)
2. "What do you think this product/service does?" (Tests comprehension)
3. "What feels confusing or unclear?" (Surfaces friction points)
4. "Which element draws your attention first?" (Tests visual hierarchy)
5. "Does this feel trustworthy? Why or why not?" (Tests credibility signals)
6. "What would make you more likely to [take action]?" (Tests motivation)
7. "If you had to describe this in one sentence, what would you say?" (Tests messaging clarity)
8. "What's missing that you'd expect to see?" (Surfaces gaps)
9. "Which version feels more [professional/clear/compelling]?" (For A/B tests)
10. "What would stop you from [taking action]?" (Surfaces objections)
These questions force people to think critically instead of defaulting to "looks good."
How to Structure Feedback Sessions
Async vs. Live
Async feedback (best for: Quick validation, A/B tests)
- Share designs via link or image
- Ask specific questions
- Collect responses over 24-48 hours
- Pros: Fast, scalable, less pressure on voters
- Cons: No follow-up questions, less context
Live feedback (best for: Deep dives, complex flows)
- Schedule 15-30 minute sessions
- Walk through designs together
- Ask follow-up questions in real-time
- Pros: Rich context, immediate clarification
- Cons: Time-consuming, harder to scale
Timeboxing
Set a time limit for feedback sessions. 15-30 minutes is usually enough. Longer sessions lead to diminishing returns and fatigue.
Capturing Notes
Don't rely on memory. Take notes during live sessions, or use a structured form for async feedback. Capture:
- What they said (direct quotes)
- What they did (behavior, not just words)
- Patterns (if multiple people say the same thing, it's probably important)
The "Two Options" Method
Instead of asking "what do you think of this design?", show two options and ask "which one works better?"
This is called A/B testing, and it's powerful because:
- It forces a choice (no vague "it's good" responses)
- It reduces bias (people compare, not just compliment)
- It gives you data (percentages, not just opinions)
- It's faster (voters spend seconds, not minutes)
Example: Instead of asking "is this headline good?", show two headlines and ask "which one makes you want to read more?"
The two-option method turns subjective impressions into countable data you can act on.
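If you run your own async A/B vote instead of using a platform, it helps to know when a result is decisive rather than noise. Here's a minimal sketch in Python (the function name and the 50/50 baseline are illustrative assumptions, not part of any specific tool) that turns raw vote counts into shares and applies a rough normal-approximation check against an even split:

```python
# Hypothetical tally for a two-option (A/B) design test.
from math import sqrt

def summarize_votes(votes_a: int, votes_b: int) -> dict:
    """Return each option's share of the vote and a rough
    significance check against a 50/50 split."""
    total = votes_a + votes_b
    # z-score of option A's count vs. an even split,
    # using the normal approximation to the binomial.
    # |z| > 1.96 roughly corresponds to p < 0.05.
    z = (votes_a - total / 2) / sqrt(total * 0.25)
    return {
        "share_a": round(votes_a / total, 3),
        "share_b": round(votes_b / total, 3),
        "decisive": abs(z) > 1.96,
    }

# 34 vs. 16 out of 50 votes: a 68% share, enough to clear the threshold.
print(summarize_votes(34, 16))
```

The rule of thumb here: a 26-24 split on 50 votes is a coin flip, while 34-16 is a real signal. For small vote counts, treat the check as a sanity filter, not a verdict.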
Before You Ask for Feedback: A Checklist
Before you share your design, make sure you're ready:
- [ ] You have a clear question (not just "thoughts?")
- [ ] You've identified who to ask (users vs. peers vs. friends)
- [ ] You've chosen the right format (async vs. live)
- [ ] You've prepared specific questions (use the list above)
- [ ] You've set expectations (time limit, what you're testing)
- [ ] You're ready to act on the feedback (don't ask if you won't listen)
If you can't check all of these, you're not ready for feedback yet.
Get Unbiased Feedback Fast
The fastest way to get useful feedback? Create an A/B test on DesignPick.
Upload two versions of your design, share with the design community, and get real votes on which one works better. You'll have results in hours, not days—and the feedback will be honest because voters don't know you.
The Bottom Line
Useful feedback comes from asking the right people the right questions at the right time. Avoid "looks good" responses by being specific, comparing options, and testing with people who represent your actual audience.
Want more feedback strategies? Browse more experiments on the blog.