“Transform Your Student Programming with Microfeedback: A 3-Step Blueprint for Data-Driven Decisions” ~Arel Moodie, Talkadot, arel@talkadot.com
As a student affairs professional, you’re probably thinking, “Of course I gather feedback from my students. What kind of professional would I be if I didn’t?” But here’s the thing: collecting feedback is just the first step. It’s like buying all the ingredients for a cake but never actually baking it. Sure, you have all the necessary components, but without taking the time to process, analyze, and turn them into actionable insights, they’re just a pile of ingredients collecting dust in your pantry.
Imagine this: you’re at an APCA after-conference get-together with some colleagues, and someone asks you how your programming is going. You excitedly reply, “Great! We’ve been getting a lot of feedback from our students.” But then they follow up with the million-dollar question: “Oh, that’s fantastic! So, what are you doing with all that feedback?” And you’re left stuttering, “Um, well, we’re just collecting it. You know, for future reference.”
Let’s be real: no one wants to be that person. And more importantly, collecting feedback without taking the necessary steps to process, analyze, and turn it into actionable insights is a wasted opportunity to improve the educational experience for your students.
I know, I know. It sounds scary, time-consuming, and overwhelming to sift through all the feedback and make sense of it. But here’s the thing: it’s not just about finding the good feedback and ignoring the bad. It’s about understanding the root cause of any negative feedback and finding ways to improve. And, let’s be honest, isn’t that what we’re all here for? To make the educational experience better for our students?
And here’s some food for thought: A study by the University of Southern California found that data-driven decision making leads to a 19% increase in student engagement and a 15% increase in retention rates. Now, I don’t know about you, but those numbers speak for themselves.
Now, I know what you might be thinking: “But wait, I don’t have time for this, I’ve got a million other things on my plate.” And while it’s true that we’re all busy, it’s important to remember that this is the core of our job. We’re here to enhance the educational experience for our students, and without processing and analyzing feedback, we’re just guessing at what they need.
Close the Experience Expectation Gap
But don’t worry, we’ve got you covered! Closing what we call the “experience expectation gap” (the difference between the experience you believe you’re offering and what students are actually feeling) is a 3-step process:
Step 1) Collecting Quality Information
The initial, and hardest, part of making data-driven decisions is actually collecting data in the first place. We have found that “microfeedback” is the way to go. Have you ever used the restroom at the airport and seen one of those smiley-face feedback screens as you leave? You simply press an emoji and you’re done. Super fast and super simple. Microfeedback lets you tap into the voice of your students in the same way.
Asking people to fill out long, convoluted paper forms or lengthy questionnaires is NOT something this generation will do.
The key is to get quick feedback that’s easy to give and takes as little lift as humanly possible. Survey fatigue is real. Consider how massive amounts of short feedback compiled over time can be way more valuable than a little feedback from a small slice of your audience.
Aim to keep the ENTIRE start-to-finish survey experience under two minutes.
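To make the “little lift” idea concrete, here’s a minimal sketch (in Python) of what capturing a single emoji tap could look like. Everything in it, the record_tap function, the feedback.csv file, the four-emoji scale, is a hypothetical illustration rather than any particular product’s setup:

```python
# Minimal sketch of capturing one "tap and done" microfeedback response.
# All names here (record_tap, feedback.csv, the emoji scale) are hypothetical.
import csv
from datetime import datetime, timezone

# Map each smiley-face button to a simple 1-4 score.
EMOJI_SCORES = {"😞": 1, "😐": 2, "🙂": 3, "😄": 4}

def record_tap(event_name: str, emoji: str, path: str = "feedback.csv") -> None:
    """Append one emoji tap, with its event and timestamp, to a CSV file."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            event_name,
            EMOJI_SCORES[emoji],
        ])

# One tap from a student leaving a (made-up) event:
record_tap("Welcome Week Kickoff", "🙂")
```

One tap, one row. That’s the entire lift for the student.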
Step 2) Analyzing the Collected Information
This is admittedly the hardest part for most people. Just putting up an online form and dumping the answers into a spreadsheet that no one looks at has NEVER been the solution.
It’s like filling your trunk with items you want to donate to Goodwill and then just driving around for months with those donations sitting in your trunk. Good intent, poor execution.
The key is to set up a system that turns your quantitative data into clear charts that make sense to you, and that lets you compare those charts over time. It’s also best not to make survey participants think too much, or they’ll “nope out” of finishing the survey. A simple Likert scale works great, and yes/no questions are fast and insightful.
Here’s an example of what you can measure for your educational programming efforts (encoded as a simple schema in the sketch after this list):
- The number of people who took the survey (which you can compare to the total number of people who attended as well)
- Whether they found the event valuable
- Whether they want to attend future events like this
- Whether it was relevant to their lives right now
- Whether the information was actionable
- How inspiring the experience was
- How engaging the delivery of the presentation was
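Here’s one way those measures might be encoded as a tiny survey schema; the question IDs, wording, and scale choices below are illustrative assumptions, not a fixed format:

```python
# One way to encode the measures above as a tiny survey schema.
# IDs, wording, and scale choices are illustrative, not prescriptive.
SURVEY = [
    {"id": "valuable",   "text": "Did you find this event valuable?",         "type": "yes/no"},
    {"id": "return",     "text": "Would you attend future events like this?", "type": "yes/no"},
    {"id": "relevant",   "text": "Was it relevant to your life right now?",   "type": "yes/no"},
    {"id": "actionable", "text": "Was the information actionable?",           "type": "yes/no"},
    {"id": "inspiring",  "text": "How inspiring was the experience?",         "type": "likert 1-5"},
    {"id": "engaging",   "text": "How engaging was the delivery?",            "type": "likert 1-5"},
]
```

Six quick taps like these keep the whole experience well under the two-minute mark.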
This is a great way to benchmark what is working and what is not working over time. Gathering very quick qualitative data works well too: having people explain what they learned and what could be improved will go a long way.
This will allow you to be responsive to the changing needs of students and use the data to identify and address their most pressing, in-the-moment needs.
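On the qualitative side, even a rough first pass can surface themes in open-ended answers. Here’s a small sketch using made-up responses and a tiny, hypothetical filler-word list:

```python
# First-pass sketch for surfacing themes in open-ended answers:
# count the most common words after dropping filler words.
# The sample responses and stopword list are made up for illustration.
from collections import Counter
import re

responses = [
    "Loved the speaker, but the room was too small",
    "The room was freezing and hard to find",
    "Great speaker! More events like this please",
]

STOPWORDS = {"the", "was", "and", "but", "too", "a", "to", "this", "more", "like"}

words = [
    w
    for text in responses
    for w in re.findall(r"[a-z']+", text.lower())
    if w not in STOPWORDS
]

# Repeated words ("room", "speaker") hint at what to fix and what to keep.
print(Counter(words).most_common(5))
```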
Step 3) Turning Those Insights into Actions
Once you have your data analyzed, this is where the rubber meets the road.
Collecting data is hard, but actually turning it into something useful is even harder. And when you consider how often people change jobs, so much institutional data gets lost along the way. Data shouldn’t be lost simply because someone changes roles.
Collecting and processing data is vital to understanding ongoing trends in attendee satisfaction, so you can be more responsive to changes in your attendees’ needs and gain deeper insights into shifts in students’ activity in real time.
If you set up your data to turn into visual reports automatically, so you don’t have to build them manually, you can easily look for trends as the data comes in.
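As a sketch of that idea, the snippet below reads the hypothetical feedback.csv from the earlier sketch, rolls responses up by month, and saves a trend chart. Rerun it (or schedule it) and the report rebuilds itself as new taps arrive:

```python
# Sketch of an "automatic visual report": read raw responses,
# roll them up by month, and plot the trend. Column names match
# the hypothetical feedback.csv from the earlier sketch.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("feedback.csv", names=["timestamp", "event", "score"],
                 parse_dates=["timestamp"])

# Average satisfaction per month.
monthly = df.set_index("timestamp")["score"].resample("MS").mean()

monthly.plot(marker="o", title="Average event satisfaction by month")
plt.ylabel("Mean score (1-4)")
plt.tight_layout()
plt.savefig("satisfaction_trend.png")
```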
Here are some examples of trends to look for:
Notice Quantities
Analyze which events get the most feedback as well as the highest positive sentiment. This will help you set internal benchmarks for what’s resonating (beyond just eyeballing it). Are attendees asking for changes you can actually make? For example, having celebrity speakers might be nice but not in your budget.
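A sketch of that benchmark, again assuming the hypothetical feedback.csv: count responses per event and compute the share that were positive (a 3 or 4 on the 1-4 emoji scale):

```python
# Per-event benchmark: response count plus share of positive scores.
# Uses the same hypothetical feedback.csv as the earlier sketches.
import pandas as pd

df = pd.read_csv("feedback.csv", names=["timestamp", "event", "score"])

benchmarks = df.groupby("event")["score"].agg(
    responses="count",
    positive_share=lambda s: (s >= 3).mean(),
)

# Strongest performers float to the top.
print(benchmarks.sort_values(["positive_share", "responses"], ascending=False))
```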
What is unexpected?
One client of ours discovered that a phrase they were using was actually offensive to an ethnic group; they had no idea. Because of this data, uncomfortable use of that phrase on campus could be avoided in the future.
By following these steps, you’ll be able to improve your programming, better understand the needs of your students, and avoid being the person stuttering at networking events. You’re not just collecting feedback for the sake of collecting it; you’re using it to make a real impact. Plus, you’ll be able to bake that metaphorical cake with confidence, and who doesn’t love a good cake?