Most staff sustainability surveys are useless. Someone in marketing sends a SurveyMonkey link to the whole company on a Tuesday, asks twenty questions like “How important is sustainability to you?” on a 1-5 scale, and three weeks later a slide gets made saying 87% of staff “care about the environment.” Nothing changes. Nobody on the line ever sees the results. The dishwashers and the dish room porters and the prep cooks who actually handle waste every shift get treated as a single anonymous data point.
This guide is for the operator who wants a different outcome. You run the survey to learn things you don’t already know — about waste streams, about purchasing pain points, about training gaps, about the gap between what your sustainability policy says and what people on the floor actually do at 9pm on a Friday. You write questions that produce decisions. You report back so people see that filling it out mattered. And next year you do it again so you can measure whether the things you changed actually worked.
I’ve helped run surveys at three operations: a 200-seat farm-to-table restaurant in Portland, a contract catering operation servicing two corporate campuses in the Bay Area, and a 14-property regional hotel chain. The patterns are similar. Here’s the working version.
Why most sustainability surveys produce nothing
Before the playbook, the failure modes. If you recognize your last attempt in any of these, you’ll know what to fix.
Surveying the wrong people. The corporate sustainability lead writes the survey. It gets distributed through HR, which sends it to everyone with a company email. That cuts out about 60% of your hourly hospitality staff who don’t check work email, who get paged through the manager Slack, or who are part-time and only work weekends. You measure the views of office staff and call it staff views.
Asking about attitudes when you need to ask about behaviors. “How committed is the team to sustainability?” tells you nothing. “When the green bin is full during a busy service, what do you actually do with the next bag of food scraps?” tells you whether your composting program works after 7pm. The second question is harder to design and harder to ask without making people defensive. It’s also the one worth asking.
No anonymity, or fake anonymity. A survey with “department” and “shift” and “years at company” fields, sent to a team of forty, is not anonymous. People know it. The line cook who’s tired of being told to scrape plates harder will not say so on a form their sous chef will see. You’ll get a clean, polite, useless dataset.
Asking too much. A twenty-five-question survey gets a 12% completion rate from hourly staff. A six-question survey gets 60%. Length is the single biggest predictor of whether you get usable data. Cut.
No closing the loop. Even when surveys produce decent data, the results never come back to the floor. Six months later when you run it again, response rate drops by half because last time nothing happened. The closing-the-loop step is not optional. If you can’t commit to sharing results and reporting on what changed, don’t run the survey.
Step 1: Decide what specific decisions the survey will inform
This is the planning step everyone skips. Before you write a single question, write down three decisions you want to make based on the results. If you can’t list three, the survey isn’t ready.
Decisions look like:
- Whether to switch our compostable container supplier (currently World Centric, looking at Eco-Products and Vegware)
- Whether to add a second composting pickup per week, or buy a second outdoor bin
- Whether to invest in the half-day green chef training, or build something internal
- Which kitchen station needs new signage or relabeled bins
- Whether to push back on the corporate single-use water bottle policy
Each decision has cost attached. The survey is the cheap research that helps you spend the bigger money correctly. If a decision is already made — leadership has decided we’re switching to compostables and it doesn’t matter what staff think — don’t ask about it. Asking and then ignoring is how you lose trust for years.
For the Portland restaurant survey, the operator went in with four decisions: replace or keep current bagasse supplier, add weekend compost pickup yes/no, build internal training or contract it, and reorganize the back-of-house bin layout. Each question on the survey mapped back to one of those four.
Step 2: Choose your channel by where staff actually are
For an office-heavy operation, an email link is fine. For a hospitality or foodservice operation, an email link is sabotage. Here are the channels that actually work in production environments:
- QR code printed on a half-sheet poster in the break room and the back hall. People scan with their phone during break. Use a tool that works mobile-first — Google Forms or Tally.so, not corporate SurveyMonkey, which still has clunky mobile rendering.
- Tablet on a clipboard at the back-of-house manager’s desk. Walk-by completion. Good for short surveys, terrible for long ones.
- Paper. Yes, paper. Especially for staff over 40 and especially in kitchens. A six-question paper survey on a clipboard with a pen on a string in the break room gets responses that nothing else gets. Plan for one person to spend an hour transcribing.
- Manager-led 1:1. Worst data quality but highest response rate. Use this only when you specifically want a manager’s interpretation of how their team is thinking, not the team’s own views. They are different things.
For the Bay Area catering operation we used QR + paper. Response rate was 71% (89 of 125). Email-only the year before got 23%. The QR poster had a $25 Sweetgreen gift card raffle attached, drawn weekly while the survey was open. That mattered.
Step 3: Write questions that produce decisions
Three rules that will fix 80% of what’s wrong with your draft questions:
Ask about behaviors and observations, not attitudes. “Have you ever seen a coworker put compostable items in the trash because the compost bin was full or hard to reach? How often?” produces actionable data. “How important is composting to you?” produces noise.
Use observable specifics. “The compostable cup we use for to-go coffee — have you ever had a customer ask you whether it’s actually compostable? When you answer, what do you tell them?” tells you whether your training on customer-facing claims is working and what version of the message is going out. “Do you feel confident discussing sustainability with customers?” doesn’t tell you anything useful.
Open-ended is okay when the question is specific enough. A general open-ended “any other comments?” produces complaints about parking. A specific open-ended “describe one thing that wastes the most food or material in your station and what you think would fix it” produces a half-page of practical answers.
Here’s a six-question template that worked well for a kitchen-heavy operation:
- Which station do you work most often? (pick one — open kitchen, prep, dish, expo, front-of-house, catering off-site, other)
- In the last two weeks, how often did you encounter a situation where compostable items ended up in the trash because of bin location, fullness, or labeling? (never / once or twice / weekly / daily / multiple times per shift)
- The single biggest source of waste at your station that you think is fixable is _____ . What’s one specific change you’d try? (open-ended)
- Our current single-use packaging — how does it perform during service? (open-ended; prompt for specific products if helpful: “the bagasse plates? the PLA cups? the wooden utensils?”)
- If we ran a half-day paid training on sustainability practices and the products we use, what one topic would be most useful? (open-ended)
- Anything else? (open-ended)
Six questions, four of them open-ended. The open-ended responses are where the value is. The multiple-choice question at position 2 anchors the frequency claim in concrete time bounds.
Step 4: Sample size, demographics, and the anonymity problem
You don’t need every staff member to respond. You need enough to be confident the results aren’t dominated by one shift or one personality.
For an operation of 100 staff, 50-60 responses is plenty. For 30 staff, 18-22 is plenty. Be careful about demographic questions when teams are small. If you have four people on the late-night dish line and you ask “shift” and “department,” those four people are no longer anonymous to anyone with the data. They know it. They’ll either skip the survey or answer carefully.
For teams under 50, ask only one demographic question — the station/department one — and only if you actually need it to make a decision. For larger operations, you can ask shift and department, but treat the data with judgment. Don’t quote a result with a sample size of three.
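If you want a rough number behind thresholds like "50-60 of 100," the standard margin-of-error formula with a finite population correction is enough. A sketch (the function name and the z=1.96 default are my choices, not from any survey tool):

```python
import math

def margin_of_error(population, sample, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion,
    with a finite population correction for small teams."""
    se = math.sqrt(p * (1 - p) / sample)                      # standard error
    fpc = math.sqrt((population - sample) / (population - 1)) # small-team correction
    return z * se * fpc

# 55 responses from a staff of 100: roughly +/- 9 points
print(round(margin_of_error(100, 55) * 100, 1))

# 20 responses from a staff of 30: roughly +/- 13 points
print(round(margin_of_error(30, 20) * 100, 1))
```

The point isn't precision; it's that once you're past half the team, collecting the last few stragglers barely tightens the estimate, so spend that effort on the open-ended analysis instead.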
If you want trustworthy data on something sensitive — sustainability frustrations with management, or pushback on a specific manager’s enforcement of policy — consider a third-party tool. SurveyMonkey or Typeform with no identifying fields. Or even better, a printed paper survey with a literal locked drop box. Old technology, but staff trust it because they can see how it works.
Step 5: Run the survey for two weeks, not two days, not two months
Two weeks is the sweet spot. Less than that and you miss the weekend-only staff and the swing shift. More than that and the first week’s respondents start asking why nothing has happened yet, and you lose the urgency.
Push the survey at three points: launch day, day five, and day twelve. Each push should be different. Launch day is the announcement and the why. Day five is “here’s how many have completed — we’re trying to hit 60% before close.” Day twelve is “two days left, last chance, results coming in three weeks.”
Have a date on the calendar for sharing results. “Results coming in three weeks” with no actual date is how the closing-the-loop step gets forgotten.
Step 6: Analyzing the open-ended responses
This is the step that takes real time and that nobody allocates for. A 60-response survey with four open-ended questions has 240 paragraph-or-bullet answers to read. Reading them carefully takes three to five hours. Skim-reading them produces nothing.
Read every response. Write down themes as you go. The themes will emerge somewhere around response 25 — you’ll start to see the same complaints, the same observations, the same suggestions. Tag each response with the themes it touches. By the end you’ll have a clean list of themes ranked by frequency. That ranking is your answer to “what should we change first.”
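Once you've chosen themes by hand during the read-through, a small keyword tally keeps the frequency ranking honest. A sketch with made-up keywords and responses (the THEMES map is hypothetical; it has to come from your own reading, not a preset list):

```python
from collections import Counter

# Hypothetical keyword-to-theme map. Build it as themes emerge
# during the read-through, not in advance.
THEMES = {
    "bin full": "bin capacity",
    "overflow": "bin capacity",
    "splinter": "utensil quality",
    "label": "signage/labeling",
    "sign": "signage/labeling",
}

def tag(response):
    """Return the set of themes a free-text response touches."""
    text = response.lower()
    return {theme for kw, theme in THEMES.items() if kw in text}

responses = [
    "prep compost bin overflows before noon",
    "forks splinter at the table, embarrassing",
    "no sign on the new bin, people guess",
    "bin full by lunch every day",
]

# Ranked theme list: your answer to "what should we change first"
counts = Counter(theme for r in responses for theme in tag(r))
print(counts.most_common())
```

This is a tally helper, not a replacement for reading. The keywords only exist because you read all the responses first and noticed them.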
For the Portland restaurant survey, the analysis produced six clear themes: (1) compostable utensils were splintering at customer tables — staff embarrassed, (2) prep station compost bin was too small and overflowed daily before lunch service, (3) staff didn’t know what the difference was between PLA and PHA when customers asked, (4) the cardboard sleeves on the hot cups were getting wet and slipping, (5) night-shift cleaning crew was tossing the compost bag in trash because the dumpster pickup schedule confused them, (6) staff wanted a single laminated reference card showing which products in the supply room were compostable, recyclable, or trash.
Each of those six themes mapped to a specific operational fix. None of them would have surfaced from a multiple-choice survey about attitudes.
Step 7: Report back, with specifics, on a deadline
Three weeks after the survey closes, send a one-page report to all staff who took it (and to the rest of the team). The report has three sections:
- What we learned. Three to six clear findings stated as facts, with rough percentages if you have them. “67% of you said the prep station compost bin overflows during lunch. 41% mentioned utensil quality.”
- What we’re changing. Three to five specific things, with timelines. “We’re ordering a second 32-gallon green bin for the prep station, arriving by [date]. We’re switching utensil suppliers — testing samples from two vendors in October. We’re printing a laminated bin guide for every station.”
- What we’re not doing, and why. This builds trust. “Several of you asked about composting customer napkins. Our current hauler doesn’t accept printed paper with food contact. We’re going to ask them again in six months.”
The third section is the one that builds the willingness to fill out next year’s survey. People who feel heard, including when their suggestion was declined with a reason, will respond again. People who got silence won’t.
If you want outside evidence on closing the loop with employee feedback, the federal Office of Personnel Management's employee survey program (designed for federal agencies, but the methodology notes carry over) documents how response rates collapse when results aren't shared.
Step 8: Plan the year-two version before you finish year one
When you write the year-one report, also write down what year-two will measure. The whole point of running the survey annually is to see whether the things you changed worked. If you re-asked the same six questions, would the answers shift? You’d want to know.
So note it: “Next year we’ll ask question 2 again to see if the second prep-station bin solved the overflow. We’ll add a question about the new utensils. We’ll drop the training topic question since we already ran the half-day training in November.”
Year two becomes a much faster planning conversation because you already know what you’re tracking.
What to do with surprising answers
You will get answers you didn’t expect. Two examples from the catering operation:
The first surprise: 31% of staff said they would personally pay $0.50 more per shift meal if the cafeteria offered compostable rather than recyclable packaging. The operator hadn’t asked the question expecting that — it was a throwaway curiosity question. It became one of the data points used to justify the supplier change.
The second surprise: when asked about training topics, the top answer was not anything to do with sustainability. It was conflict resolution between front-of-house and back-of-house staff during waste sorting at the end of service. Apparently expediters were dumping mixed waste in the kitchen compost bin in their rush to get off the floor, and the dish team was the one cleaning it up. The operator had had no idea this was happening. It came out of the survey because question 3 was open-ended and broad enough to let people say what was actually on their mind.
You can’t plan for surprising answers. You can plan to have at least one open-ended question that lets them surface.
A note on what to leave off
A few things people add to staff sustainability surveys that consistently produce nothing useful:
- Carbon footprint questions. “Do you know your personal carbon footprint?” — nobody knows, the answers are random, the question signals you’re more interested in checking a box than learning something.
- Value rankings. “Rank these eight sustainability values in order of importance.” This is something corporate consultants love and nobody else can do. The ranking changes based on what mood the respondent is in.
- Self-assessment of skill. “Rate your knowledge of composting standards from 1-10.” People who know nothing rate themselves 7. People who know a lot rate themselves 4. The data is noise.
- Future commitments from the respondent. “Will you commit to bringing your own coffee cup to work?” Behavior surveys ask about past behavior, not future intentions.
Tools to use, and not use
For a survey of under 100 responses, you don’t need anything fancy. Google Forms is free and produces a workable spreadsheet of results. Tally.so has cleaner mobile rendering for embedded QR-driven surveys. SurveyMonkey works fine for office staff but is more heavyweight than you need.
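If you export the Forms results as CSV, tabulating the multiple-choice answers takes a dozen lines of stdlib Python. A sketch with made-up rows (swap the sample string for your real export; the column headers here are invented):

```python
import csv
import io
from collections import Counter

# A Google Forms export is just a CSV with one column per question.
# Sample rows below are fabricated; in practice, read your real file
# with open("responses.csv") instead of io.StringIO(SAMPLE).
SAMPLE = """Timestamp,Station,Compostables in trash?
2024-05-01,prep,daily
2024-05-01,dish,weekly
2024-05-02,prep,daily
2024-05-02,front-of-house,once or twice
"""

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
freq = Counter(row["Compostables in trash?"] for row in rows)
total = len(rows)
for answer, n in freq.most_common():
    print(f"{answer}: {n}/{total} ({n/total:.0%})")
```

That covers the closed questions. The open-ended columns still get read by hand.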
For the open-ended analysis, do it by hand. AI summarization tools are tempting but they smooth over the specificity that’s the entire point of asking open-ended questions in the first place. A summary that says “staff have concerns about bin placement” is worth nothing. A read-through that surfaces “the prep station bin is too small, three respondents mentioned it overflows before noon” is worth a $200 bin order.
Have a colleague double-check your theme ranking. Two people reading the same responses sometimes see different patterns. Compare notes.
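If you want that comparison to be more than a gut check, a simple overlap score (Jaccard similarity) between the two readers' tags per response works. A sketch with hypothetical tags:

```python
def jaccard(a, b):
    """Overlap between two readers' theme tags for one response:
    1.0 = identical tags, 0.0 = no tags in common."""
    if not a and not b:
        return 1.0  # both readers saw nothing noteworthy: full agreement
    return len(a & b) / len(a | b)

# Hypothetical tags from two readers for the same three responses
reader_1 = [{"bin capacity"}, {"utensil quality"}, {"signage"}]
reader_2 = [{"bin capacity"}, {"utensil quality", "training"}, set()]

scores = [jaccard(a, b) for a, b in zip(reader_1, reader_2)]
print(round(sum(scores) / len(scores), 2))
```

A low average overlap is a sign to sit down together and reconcile the theme list before trusting the ranking.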
The actual benefit of doing this well
The best operators I’ve worked with treat the annual staff sustainability survey as a research tool that pays for itself many times over. The Bay Area catering operation ran the survey, made four changes based on it (new utensil supplier, additional bin in prep, laminated reference card, weekend pickup), and over the next year tracked a 19% reduction in contamination in their compost stream — the hauler measured it at intake. That meant fewer rejected loads, lower hauling cost.
The Portland restaurant’s survey produced a similar ROI: their new utensil supplier (switched after the splintering complaints) cost slightly more per piece but the breakage rate at table dropped by half, which meant fewer comp’d desserts and more importantly fewer customer complaints. The owner credits the survey with making the switch he had been putting off for a year.
The hotel chain found that what staff most wanted was simpler signage. They spent $800 on professionally designed bin labels in three languages and the audit-measured contamination rate at the back-of-house dropped 40% in one quarter. None of that would have happened from a top-down sustainability committee memo.
The survey isn’t the work. The survey is the input to the work. If your operation runs through compostable foodware at any scale — and you’re sourcing from a supplier like ours, browsing our compostable food containers, compostable tableware, or compostable utensils — your staff knows things about how those products perform in your specific service flow that no salesperson and no spec sheet can tell you. You just have to ask the right questions, in the right way, and then close the loop.
Six-question survey template, ready to use
Copy this into Google Forms or print on a half-sheet:
- Which station do you work most often? (open kitchen / prep / dish / expo / front-of-house / catering / other)
- In the last two weeks, how often did you encounter compostable items ending up in the trash because of bin location, fullness, or labeling? (never / once or twice / weekly / daily / multiple times per shift)
- The single biggest source of waste at your station that you think is fixable is _____ . What’s one specific change you’d try?
- How does our current single-use packaging perform during service? (Mention specific items if you can — the plates, cups, utensils, take-out containers.)
- If we ran a half-day paid training on sustainability practices and the products we use, what one topic would be most useful?
- Anything else we should know?
Print it. QR-code it. Put it in the break room next Monday. Two weeks later, read every response carefully. Three weeks after that, post the results and the changes on the same break-room wall.
Run it again next year. That’s the whole playbook.