The First Attempt: What We Learned

During the NHA App pilot, we distributed surveys to parents and staff to collect feedback. This page documents what was asked, what came back, and why the approach fell short of producing data we could act on.

Background

In Spring 2026, two Google Forms surveys were distributed to pilot participants — one for parents, one for staff. The goal, as far as we can determine, was to gauge general satisfaction with the NHA App during its mid-pilot phase and identify areas for improvement before a full network rollout.

Both surveys were anonymous, and response rates were low. While the effort was well-intentioned, the data that came back did not give us anything specific enough to prioritize development work, identify root causes of user frustration, or follow up with individuals who reported problems.

Below is a complete reconstruction of each survey, followed by our analysis.

Parent Mid-Pilot Experience Survey

The parent survey had ten questions covering onboarding, feature ratings, SPARK, in-app payments, and general satisfaction, including three open-ended prompts. The final question was an optional contact field.

NHA App: Parent Mid-Pilot Experience Survey
Help us build a better connection! You have been using the new NHA App for four weeks. Your feedback will directly shape how this app works for thousands of families across the country.
1. How easy was it to get started with the NHA App?
Seamless: Downloaded and logged in on the first try.
A little tricky: Had some trouble with my password/login at first.
Difficult: Needed help from the school office to get in.
N/A: I haven't downloaded the app yet.
2. On a scale of 1 (Frustrating) to 5 (Excellent), how would you rate your experience with:
1 / 2 / 3 / 4 / 5 / N/A
Messaging your child's teacher
Reading school announcements/posts
Viewing the school calendar
3. Have you seen Spark yet?
Yes, I love seeing my child's points!
Yes, but I am not sure what it means yet.
No, I haven't seen any points yet.
Other
4. Have you used in-app payments?
Yes, I've made a payment (it was easy).
No, I haven't needed to pay for anything yet.
I tried, but ran into an error.
Other
5. Please rate the following:
Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree
The app is easy to navigate.
I can quickly find the information I need.
Notifications are timely and helpful.
The app feels simpler than SchoolConnect.
6. If the NHA App fully replaced SchoolConnect, I would feel:
Very Comfortable
Somewhat Comfortable
Unsure
Uncomfortable
7. What is the BEST thing about the new NHA App?
Your answer
8. What is ONE thing we should change to make it easier for you to stay connected?
Your answer
9. What is the one thing that would most improve the app for you?
Your answer
10. If you would like us to follow up with you, please provide your name, school, and email address:
Your answer

Analysis: Parent Survey

Structural Problems

No clear objective. The survey tries to cover everything — onboarding, feature satisfaction, SPARK, payments, navigation, notifications, and SchoolConnect replacement readiness — in 10 questions. Without a defined goal, the data scatters across too many dimensions to draw meaningful conclusions from any one of them.

Questions 7, 8, and 9 ask the same thing three ways. "Best thing," "one thing we should change," and "one thing that would most improve" — these overlap heavily. Open-ended questions are expensive (low response quality, high analysis effort). Using three of them with subtle wording differences dilutes the signal from all three.

Question-Level Issues

Question 5: "Notifications are timely and helpful." This bundles two distinct concepts. A notification can be timely but unhelpful (irrelevant content), or helpful but delayed. When a parent rates this a 2, we don't know which problem they're reporting.

Question 6: "I would feel..." Asking parents whether they'd feel "comfortable" replacing SchoolConnect measures anxiety, not product quality. A parent could be uncomfortable with change itself regardless of whether the NHA App is better. This tells us about change management, not about the app.

Questions 3 and 4 (SPARK, Payments): These are feature-awareness checks embedded in a satisfaction survey. They don't help us improve either feature — they just tell us whether people found them. Discovery and usability are different research questions requiring different instruments.

What This Data Can Tell Us

The one useful signal from this survey is Question 2 — the experience rating matrix for messaging, posts, and calendar. Even here, the data is limited because a low rating doesn't explain why the experience was poor. "I rated messaging a 2" could mean the feature is broken, the teacher doesn't respond, or the parent couldn't find it.
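The collapse described above can be made concrete with a small sketch. The three causes are hypothetical examples taken from the analysis; the point is that once only the rating is stored, they become indistinguishable:

```python
# Sketch of the ambiguity: three different root causes all collapse to the
# same stored value once only the 1-5 rating is captured. The cause labels
# are hypothetical examples, not real survey data.
reports = [
    {"rating": 2, "actual_cause": "feature is broken"},
    {"rating": 2, "actual_cause": "teacher never responds"},
    {"rating": 2, "actual_cause": "could not find the feature"},
]

# What the survey actually records:
stored = [r["rating"] for r in reports]
print(stored)  # [2, 2, 2]; the cause is unrecoverable
```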

Staff Mid-Pilot Pulse Survey

The staff survey had 11 questions covering general sentiment, feature ratings, parent adoption, training, support, and readiness to replace SchoolConnect.

NHA App: Staff Mid-Pilot Pulse Survey
Thank you for being an NHA App Pioneer! Your honest feedback is the most critical piece of data we will collect. Please take 5 minutes to help us refine the platform before it reaches every school in the network.
1. How are you feeling about the NHA App so far?
Love it - It's a major improvement.
Getting there - It's fine, just takes some getting used to.
Frustrated - It's making my day harder right now.
2. Why did you choose the answer above?
Your answer
3. Rate the following features on a scale of 1 (Poor) to 5 (Excellent):
Poor / Fair / Good / Very Good / Excellent
Direct Messaging
Posting to your class/school
SPARK
Calendar
4. Are your parents successfully making the switch?
Yes, most are using the NHA App exclusively.
It's split; I'm still getting messages in SchoolConnect.
No, I'm having to remind them constantly.
5. Training and Support
Definitely Not / Kinda / Neutral / Probably / Definitely
I felt adequately trained before go live.
Training materials are easy to access.
I know where to go for help.
6. Have the support responses from the NHA App Team been helpful?
Yes, very responsive.
They respond, but the issue is still there.
I haven't needed support yet.
7. What is the #1 technical issue (if any) that you have encountered?
Your answer
8. If you could change ONE thing about the app's interface tomorrow, what would it be?
Your answer
9. Overall, the NHA App is ready to replace SchoolConnect:
Yes
Yes, with minor changes
Not yet
No
10. Why did you choose the answer above?
Your answer
11. What is the most important improvement we should make before full rollout?
Your answer

Analysis: Staff Survey

Observations

Question 1 provides a quick temperature check. "Love it / Getting there / Frustrated" captures overall sentiment but doesn't explain the cause. A teacher who selected "Getting there" because messaging is unreliable and one who selected it because they haven't had time to explore need different interventions. The follow-up ("Why?") is where the real insight lives, but open-ended answers from anonymous respondents are harder to act on at scale.

The training section (Q5) uses an informal scale. "Definitely Not / Kinda / Neutral / Probably / Definitely" is approachable language, but it makes it difficult to benchmark against standard survey instruments or compare results across future administrations.
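If results from this administration ever need to be compared against a future one, the informal labels would first have to be mapped onto a numeric scale. A minimal sketch, assuming a straightforward 1-to-5 mapping (the mapping itself is our assumption; the labels are the ones used in Q5):

```python
# Map the informal Q5 labels onto a 1-5 scale so results can be averaged
# and compared across survey administrations. The numeric assignment is
# an assumption, not something the original instrument defined.
SCALE = {
    "Definitely Not": 1,
    "Kinda": 2,
    "Neutral": 3,
    "Probably": 4,
    "Definitely": 5,
}

def mean_score(responses):
    """Average the mapped scores, ignoring unrecognized answers."""
    scores = [SCALE[r] for r in responses if r in SCALE]
    return sum(scores) / len(scores) if scores else None

print(mean_score(["Kinda", "Probably", "Definitely"]))  # averages to roughly 3.7
```

Even with such a mapping, "Kinda" and "Probably" are not obviously equidistant from "Neutral," which is exactly why standard Likert wording is easier to benchmark.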

Question-Level Issues

Question 4 asks staff to assess parent behavior. A teacher's perception of whether parents have "switched" is filtered through their own experience. Some teachers may not notice because they don't monitor SchoolConnect anymore. Others may overestimate the problem because a few vocal parents are the ones they hear from. This is secondhand data presented as a direct measurement.

Question 6 is missing an option. The choices — "Yes, very responsive" / "They respond, but the issue is still there" / "I haven't needed support yet" — include no option for "I contacted support and didn't get a response." The question assumes support always responds, so respondents who were ignored have no accurate choice.

Questions 9-10 repeat the SchoolConnect readiness theme from the parent survey. Asking whether the app is "ready" is a policy question, not a product question. The answer depends on the school, the staff member's role, which features they rely on, and what "ready" means to them. It produces a number that feels useful but isn't.

What This Data Can Tell Us

Question 7 ("#1 technical issue") and Question 8 ("change ONE thing") are the most useful questions on this survey. They're direct, specific, and constrained to one item. If this survey had been non-anonymous with 10x the respondents, these two questions alone would have been more valuable than the rest combined.

Methodology Problems

Beyond the individual question issues, the survey program had structural problems that limited the value of any data it could produce: responses were anonymous, so nobody who reported a problem could be contacted; response rates were low; and neither survey had a clearly defined objective.

What the Data Showed

Across both surveys, 114 responses came back — 47 from parents and 67 from staff — representing three pilot schools. Even with the structural limitations described above, the numbers tell a story.

114 total responses (47 parents, 67 staff) across 3 pilot schools
49% of parents had seamless login
15% of parents needed office help to log in
38% of parents uncomfortable replacing SchoolConnect
6% of staff "love it" (66% "getting there," 28% frustrated)
47% of parents reported no push notifications
70% of parents never saw Spark points
52% of staff say the app is not ready to replace SchoolConnect
18% of staff say parents use the NHA App exclusively

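The headline percentages above are simple count-over-total ratios. A minimal sketch of the tally, using hypothetical raw counts back-calculated to be consistent with the reported figures (they are illustrations, not the actual response data):

```python
# Reproduce the headline percentages from raw counts. The counts below are
# hypothetical, chosen only to be consistent with the reported percentages.
def pct(count, total):
    """Round a count/total ratio to a whole-number percentage."""
    return round(100 * count / total)

PARENTS, STAFF = 47, 67
assert PARENTS + STAFF == 114  # total responses

print(pct(23, PARENTS))  # seamless login -> 49
print(pct(7, PARENTS))   # needed office help -> 15
print(pct(4, STAFF))     # staff "love it" -> 6
```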
What the Data Actually Revealed

Despite the methodology issues, the surveys surfaced genuine problems. The top issues — broken session persistence, missing push notifications, inability to search parents by student name — were specific and actionable. These came primarily from the constrained open-ended questions and the issue checklists, not from the satisfaction scales.

The Likert ratings produced numbers but didn't explain the cause; knowing, for example, that 47% of parents reported missing push notifications doesn't tell us why the notifications failed. The prioritized bug lists were the most valuable output of the entire exercise.

These findings directly shaped our new survey design. The three functions that broke down most visibly — messaging, posts, and notifications — became the sole focus of the new instrument. And the inability to follow up with the 47% who reported missing notifications, or the 15% who couldn't log in without help, reinforced the value of authenticated responses — being able to follow up with specific users when issues are identified or resolved.
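The follow-up capability hinges on one thing: a respondent identifier attached to each record. A minimal sketch of what that enables, where all field names, IDs, and issue labels are hypothetical:

```python
# Sketch: why authenticated responses enable follow-up. With a user ID on
# each record, everyone who reported a specific issue can be re-contacted
# when it is investigated or fixed. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SurveyResponse:
    user_id: str                      # authenticated respondent, not anonymous
    school: str
    reported_issues: list = field(default_factory=list)

def follow_up_list(responses, issue):
    """Return the user IDs of everyone who reported a given issue."""
    return [r.user_id for r in responses if issue in r.reported_issues]

responses = [
    SurveyResponse("p-101", "School A", ["missing_notifications"]),
    SurveyResponse("p-102", "School B"),
    SurveyResponse("p-103", "School A", ["missing_notifications", "login"]),
]
print(follow_up_list(responses, "missing_notifications"))  # ['p-101', 'p-103']
```

With the anonymous pilot surveys, the equivalent of `user_id` simply did not exist, so the 47% who reported missing notifications could never be told the issue was resolved.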

The Verdict

The first surveys gave us a useful starting point and surfaced real issues that shaped development priorities. Building on those lessons, the next round is designed to go deeper — focused on the three core communication functions, authenticated so we can follow up with respondents directly, and structured to produce data that maps to specific development decisions.
