Use this template to assess how users find and understand your documentation. Gather feedback on navigation, clarity, examples, and code accuracy to improve your developer experience (DX).
What's Included
AI-Powered Questions
Intelligent follow-up questions based on responses
Automated Analysis
Real-time sentiment and insight detection
Smart Distribution
Target the right audience automatically
Detailed Reports
Comprehensive insights and recommendations
Sample Survey Items
Q1
Chat Message
Please focus on one specific documentation session within the last 60 days. Answer the questions about that single session.
Q2
Short Text
Which product, API, or tool were you reading about?
Max 100 chars
Q3
Dropdown
What was your main goal during that session?
Set up or install
Understand a concept
Reference API syntax or parameters
Troubleshoot an error
Evaluate options or compare approaches
Check constraints, limits, or pricing
Other/Not listed
Q4
Numeric
About how many minutes did it take to find the right page? If you went directly to it, enter 0.
Accepts a numeric value
Whole numbers only
Q5
Opinion Scale
How easy was it to find the right topic?
Range: 1 – 10
Min: Very hard
Mid: Neutral
Max: Very easy
Q6
Multiple Choice
How did you get to the relevant docs? Select up to three.
Site search on the docs
External search engine
Sidebar or table of contents
On-page find (Ctrl/Command+F)
Direct link or bookmark
Link from another page
Tutorial or example page
Q7
Matrix
Rate the clarity of the site structure you used in that session.
Columns: Very unclear, Unclear, Neutral, Clear, Very clear
Rows:
Section titles matched content
Heading hierarchy made sense
Sidebar grouping was logical
Breadcrumbs showed location
Page summary/overview was useful
Search results were relevant
Next steps/pagination were clear
Q8
Multiple Choice
Which best describes the examples you saw?
Runnable examples embedded in the page
Copy-paste examples only
No examples
Not sure
Q9
Matrix
Please rate these statements about examples and code. Select Not applicable if none were provided.
Columns: Strongly disagree, Disagree, Neutral, Agree, Strongly agree, Not applicable
Rows:
Examples matched my language or stack
Examples were annotated or explained
I could copy-paste and run with minimal changes
Sample code covered edge cases
Example output matched expectations
Q10
Rating
Overall, how clear was the writing on the page(s) you used?
Scale: 10 (star)
Min: Very unclear
Max: Very clear
Q11
Multiple Choice
Attention check: To confirm you’re paying attention, please select “I am paying attention” and no other option.
I am paying attention
I am not paying attention
Please ignore this option
All of the above
Q12
Numeric
If you ran sample code, what percent worked without changes (0–100)? If you didn’t run code, enter 0.
Accepts a numeric value
Whole numbers only
Q13
Multiple Choice
Which issues did you encounter when trying code or steps? Select all that apply.
Version mismatch with my environment
Missing imports or dependencies
Deprecated API or outdated instructions
Wrong language variant shown
Environment/setup steps missing
Typos or syntax errors
Permissions/quotas blocked progress
Unclear or hidden prerequisites
No issues
Q14
Constant Sum
Roughly how did you spend your time in that session? Allocate a total of 100 points across the categories.
Total must equal 100
Min per option: 0
Whole numbers only
Q15
Ranking
Rank the most frustrating aspects of that session (top = most frustrating).
Drag to order (top = most frustrating)
Finding the right page/topic
Dense or unclear writing
Too few examples
Examples didn’t match my stack
Inaccurate or outdated code
Missing troubleshooting guidance
Navigation didn’t reflect task flow
Q16
Long Text
What is the single most important change that would improve these docs?
Max 600 chars
Q17
Website
Optional: Link to a doc page you used.
Q18
Rating
How likely are you to recommend these docs to a colleague?
Scale: 10 (star)
Min: Not at all likely
Max: Extremely likely
Q19
Dropdown
Which best describes your role?
Software engineer
Data scientist/ML engineer
Student
Product manager
Technical writer
Researcher
Other
Q20
Dropdown
How many years of professional experience do you have?
0–1
2–4
5–9
10+
Q21
Multiple Choice
Which programming languages did you primarily use in that session? Select up to two.
Python
JavaScript/TypeScript
Java
C#/.NET
Go
Ruby
C/C++
Other
Q22
Dropdown
Where are you primarily located?
North America
Latin America
Europe
Middle East
Africa
East Asia
South Asia
Southeast Asia
Oceania
Prefer not to say
Q23
Opinion Scale
Before that session, how familiar were you with this product/tool?
Range: 1 – 10
Min: Not at all familiar
Max: Very familiar
Q24
Long Text
Any other feedback or context you’d like to share?
Max 600 chars
Q25
AI Interview
AI Interview: 2 Follow-up Questions about your documentation session
Length: 2
Personality: Expert Interviewer
Mode: Fast
Q26
Chat Message
Thanks for completing the survey! Your feedback will help us improve the documentation experience.
Ready to Get Started?
Launch your survey in minutes with this pre-built template