Outcomes Measurement Platform: Case Study

Beyond Numbers: Rethinking How Foundations Measure Impact

What happens when a major foundation realizes their impact measurement system isn't working? This was the situation facing the Ontario Trillium Foundation (OTF) when they approached us about revolutionizing their evaluation approach.

The Challenge: One Size Doesn't Fit All

Imagine trying to measure the impact of a volunteer-run sports club in northern Ontario using the same tools as a large urban social service agency. That was OTF's challenge. Their grantees ranged from small community groups to complex multi-service organizations, each doing vital but very different work across the province.

The existing system of standardized surveys wasn't working. Surveys were too long, didn't capture population-level outcomes, and failed to gather the kind of feedback grantees needed to improve their programs.

Building a Better Way

Working with OTF, we developed a new approach that balanced flexibility with consistency.

The solution? A digital platform that combined user-friendly technology with human expertise.

Key Features That Made the Difference:

  • A secure bilingual portal that any organization could easily navigate

  • Customizable survey tools that let grantees choose from proven, strengths-based measures

  • Real-time dashboards that turned data into actionable insights

  • Expert evaluation coaching available within one business day

  • Qualitative measures that captured the human stories behind the numbers

The Human Touch Makes the Difference

Here's what surprised us: even minimal coaching support had a dramatic impact on how well organizations could plan and learn from their evaluations. Technology alone wasn't enough – organizations needed human guidance to make the most of these tools.

The Results Speak for Themselves

Over four years, the platform managed 300,000 survey responses. But the real success wasn't in the quantity of data – it was in its quality and usefulness. Organizations could finally collect information that actually helped them improve their programs, while OTF gained meaningful insights across their entire portfolio.

Key Lessons for Foundations:

  1. User-centered design is crucial for adoption

  2. Real-time data visualization encourages active use

  3. Qualitative measures bring quantitative data to life

  4. A little coaching goes a long way

  5. Flexibility within bounds works better than rigid standardization

Want to Learn More?

Download our detailed case study to discover how this approach could work for your foundation.

Get in touch with us.
