I have made many, many feedback forms for events I have run or been a part of. Here are some simple heuristics of mine, written for others to learn from and for my future collaborators. Most of my events have had between 50 and 500 people; that's the rough range I have in mind.
1. The default format for any question is a mandatory multiple-choice, then an optional text box
Most of your form should be 1-10 questions! (e.g. "How was the food, from 1-10?") Then next to it give people an optional space to provide additional text.
All forms I make primarily look like a stack of these questions.
This is because you can get a lot of signal cheaply through getting ~100 people giving a 1-10 on how the food was, or how the talks were, or some other thing. An average of 7/10 is very different from an average of 4/10, and the latter suggests you screwed up and need to do better.
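The aggregation itself is trivial. Here's a minimal Python sketch of the kind of summary I mean; the `summarize_ratings` helper and the `flag_below` threshold are hypothetical names of mine, assuming the ratings arrive as a list of integers:

```python
from statistics import mean

def summarize_ratings(ratings, flag_below=5.0):
    """Summarize a list of 1-10 ratings.

    `flag_below` is an arbitrary threshold for "you screwed up
    and need to do better".
    """
    avg = mean(ratings)
    return {
        "n": len(ratings),
        "average": round(avg, 1),
        "needs_attention": avg < flag_below,
    }

# ~100 responses averaging around 7 vs around 4 tell very different stories:
good = summarize_ratings([7, 8, 6, 7, 9, 7, 6, 8])
bad = summarize_ratings([4, 3, 5, 4, 2, 5, 4, 3])
```

The point is that even this one number per question, over ~100 respondents, is cheap to collect and immediately actionable.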
Most of the time, asking for text is very costly: it takes much more of the respondent's time, and often the text isn't relevant. The text box is there for when they need to tell you something more.
And it's common that they want to! A common experience when someone has something to say is that they feel the number is insufficient to convey their experience, and are compelled to use the free text box.
"How was the food? Oh dear, I got a terrible allergy from something that was poorly labeled, yet overall it was very tasty, healthy, and voluminous. I'm going to put 2/10 because of my terrible reaction, but I have more to say than a simple number!"
This person uses the text box, but most people don't.
Also, sometimes people let you know some important reason why you shouldn't count their datapoint. For example, someone might rate the food 1/10, which sounds terrible, but then they'll clarify that they weren't there during mealtimes and didn't eat the food, and only gave it 1/10 because the question was mandatory! This is rarely predictable, but especially with autistic people you occasionally get odd edge-cases like this.
2. All the areas of participant experience, and all areas you put serious work into, should have a multiple-choice question, and probably that should be 1-10.
Which areas?
Anything that cost a lot of money, or took a lot of staff time, or that was a big part of the participant experience.
Examples of things that I have asked about:
Yes, sponsorships! If you sold sponsors part of your event, find out how positive or negative attendees found it! It can turn out either way, and it's worth checking.
3. Whatever the key reasons are you ran the event, or whatever makes this event different from other events, should also have multiple choice questions!
The most important parts of your event also probably just need a single 1-10 question.
Don't ask for free-text. You won't have enough time to keep reading it all, and it will be hard to get an aggregate sense.
As an example, after the FTX explosion I ran a rationalist town hall to discuss it. Surely I wanted to ask for mini-essays from everyone about their feelings and how the event shifted them? No, not really. Here were the main questions:
To be clear, I broke rule number 1 here: I didn't include the optional text fields. Partly this is because I feel bad about taking up space in Google Forms; that's one way Airtable is better (it has better layout and density).
4. Ask how good the event was overall!
People sometimes don't ask this. It's an important question: the difference between an average of 9.2 and 5.6 is big. It also helps you compare with other events (e.g. if you end up running an event series, or an annual event, or you just run 3 different events and are curious which ones people liked more).
It also helps a lot when interpreting other questions. "They made lots of picky comments about the food and venue, yet overall gave it a 9/10, which suggests it wasn't a big determinant of their experience."
Extension: Having a consistent question between feedback forms is similarly good. I almost always use the exact wording of the NPS question, which isn't a great question, but helps me do comparisons with events run by other people (who often use the same question). I would like to hear other proposals for good questions to have over all of my events.
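For reference, the standard NPS computation takes 0-10 answers to the "How likely are you to recommend this to a friend or colleague?" question and reports the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). A minimal Python sketch (the `nps` function name is mine):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on the standard 0-10 'How likely are you to recommend...' scale.
    Result ranges from -100 to +100."""
    n = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / n)

example = nps([10, 9, 9, 8, 7, 7, 6, 5, 10, 3])  # 4 promoters, 3 detractors
```

Note that the 7s and 8s ("passives") count in the denominator but neither for nor against you, which is part of why the raw 1-10 averages are often more informative for your own purposes.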
5. Have at most 3 mandatory free-text questions; all other free-text fields should be deleted or clearly marked optional.
Free text fields are very costly in terms of time. The only time to have more than three is if you're paying people to fill it out (e.g. they're staff you employ, or the form is a paid survey for science or something).
6. My standard 3 free-text questions are "Best", "Worst", and "Change"
In essentially all user interviews I do, about any product, service, or event, I ask
This gives me a ton of detail.
7. "Name" should either be mandatory and first, or optional and last.
Either let them know up front that the info is going to be de-anonymized, or let them fill it all out and then get to reflect on whether they're happy to share. It sucks to get to the end of a form where you complain a lot and judge other people, only to find out your name is going to be attached. And it's hard to decide at the top of a form whether to add your name, you want to see what information you're sharing first.
8. Ask about the best people and worst people. Same for sessions.
Here is a set of questions I've begun to ask in my feedback forms:
Why?
Well, how much people contribute to others' experience is heavy-tailed. You can find out which people are providing a ton of value and lean on them. It turns out there are a few people you should make sure are always at your events because they provide so much value.
And the bad? At most events, basically nothing comes up. Hurrah! But sometimes it does, and then it's super helpful that you got the flag. I once had light concerns about someone at an event, then got a lot of flags in the form, which caused me to investigate further, and I've now uninvited them from future events.
9. Babble then prune. First write a form that is too long.
I wrote out 50+ questions for my first feedback form for Inkhaven, over the course of the first week, before cutting most of them for not being worth everyone's time. This helped me find the right questions, ones I didn't normally ask and that weren't obvious to me, like:
These all helped me identify people struggling and make an effort to help them.
10. Make sure someone in the target reference class fills it out in full before you send it out to everyone.
Make sure they fill it out fully.
Make sure that they can submit the form. (I once had a form that would not submit on airtable, an hour before the closing session would start for ~400 people. I spent the entire time recreating it fresh in a format that would work. It was stressful.)
Make sure that it takes them a reasonable amount of time, without you forcing them to go fast. Ideally it should take 5-10 mins, not more than 15.
Bonus Tips
1. You should have a ~10-minute section in your closing session for filling out the form. Then you actually get serious numbers of people filling it out.
2. Make an interface for the aggregate data, and show it live as it comes in! This makes the experience rewarding for people, because literally anything at all happens to them as a result. (If you're concerned about Goodharting, you can show it only after all the data has come in.)
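The interface doesn't have to be fancy; even a text histogram that you re-render as responses arrive does the job. A minimal Python sketch (the `live_summary` name and format are hypothetical):

```python
from collections import Counter

def live_summary(ratings, width=20):
    """Render a text histogram of 1-10 ratings, suitable for
    re-rendering each time a new response comes in."""
    counts = Counter(ratings)
    total = len(ratings)
    lines = [f"{total} responses so far"]
    for score in range(10, 0, -1):
        n = counts.get(score, 0)
        bar = "#" * round(width * n / total) if total else ""
        lines.append(f"{score:>2} | {bar} {n}")
    return "\n".join(lines)
```

In practice, most form tools (Google Forms, Airtable) can show this kind of aggregate view for you; the point is just to put it somewhere participants can see it.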
3. Make the feedback form when you're first announcing and planning the event (e.g. 2 months ahead of time), so that it helps you think about what you're measuring.