Key takeaways:
- Attendance numbers do not prove impact. Funders want before-and-after evidence of what changed. Most community programmes cannot provide this, and it costs them funding.
- Six UK organisations solved this problem. Govanhill HA secured £600,000 in follow-on funding. Heart and Sound secured three years of additional funding. They did it by measuring connections, not counting heads.
- Not all connections are equal. Programmes that only build bonds between similar people plateau. The ones that build bridges across groups sustain themselves. This is measurable.
- You do not need a big budget or a data team. You need a framework that captures what actually changes when your programme works. This article explains how.
The Problem You Already Know About
You run a good programme. You know it works. The tenants tell you. The community feels different. People who were isolated are showing up, making friends, getting involved.
Then funding renewal comes around, and someone asks: “What evidence do you have that this made a difference?”
You have attendance sheets. You have a folder of thank-you cards. You might have satisfaction survey results showing 85% of participants enjoyed it. And none of that answers the question.
The question is not “did people come?” It is “what changed because they came?”
This is the evidence gap that costs community programmes their funding, their budget lines, and sometimes their existence. It is not a skills problem. Most community development officers we work with are excellent practitioners. It is a measurement problem. The tools most organisations use were designed to count activity, not capture change.
What Funders and Regulators Actually Want
Let us be specific about the pressure.
The Scottish Social Housing Charter requires evidence against Outcome 6 (estate management and neighbourhood quality) and Outcome 3 (participation). Your Annual Return on the Charter needs quantitative data, not anecdotes.
Tenant Satisfaction Measures in England now include TP06 (tenants feel listened to, currently at 64% sector average) and TP11 (landlord makes a positive contribution to the neighbourhood). These are published and compared across the sector.
The Sustainability Reporting Standard (SRS) has been endorsed by 35 lenders who between them hold £105.4 billion in housing association loans. Those lenders now expect ESG reporting from every organisation they fund, and the “S” is the hardest to evidence. Right now most organisations are leaving it blank or filling it with vague narrative.
The Community Wealth Building Act (Scotland, 2025) creates new duties on anchor institutions, but guidance is still being defined. Nobody knows what “good” looks like yet, which is both a risk and an opportunity.
The common thread: every one of these asks for outcomes, not outputs. They want to know what changed in the community, not how many events you ran. If you can provide that evidence, you are ahead of most of the sector.
£105.4 billion in housing association lending is now covered by the SRS. The 35 lenders behind it expect you to evidence your social impact.
The Measurement Landscape: What Works and What Doesn’t
There is no shortage of ways to claim you are measuring impact. But they are not all measuring the same thing.
| Approach | What it captures | Where it falls short |
|---|---|---|
| Attendance tracking | How many people showed up | “100 people attended” says nothing about outcomes |
| Satisfaction surveys | How people felt about the event | Measures sentiment, not structural change |
| Financial proxies (e.g. HACT Social Value Bank) | Monetary value of outcomes | Cannot show who is connected to whom, or how the community changed |
| Network mapping | The actual connections between people, categorised by type, tracked over time | Requires participants to map their connections (a platform like Nectis guides them through this) |
Each has a role. Surveys capture sentiment. Financial proxies speak the language of boards. But if the question is “what changed in this community?”, only one of these approaches can answer it.
Network mapping shows the real connections between people in a defined group. Each person is a node. Each relationship is recorded, including what type it is, whether it is reciprocal, and how much trust exists. Participants log into a platform, map their own connections in their own time, and the software generates the maps and calculates the statistics automatically. When you do this at the start and end of a programme, you get a before-and-after picture that no other method provides.
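To make that before-and-after picture concrete, here is a minimal sketch of the kind of data a mapping exercise produces and two of the statistics a platform calculates from it. The people, field names, categories, and trust threshold are illustrative assumptions, not Nectis’s actual schema.

```python
# Each record: (from, to, connection type, trust score 1-5).
# Names, types, and the trust threshold below are illustrative assumptions.
connections = [
    ("Amy", "Ben", "close", 5),
    ("Ben", "Amy", "close", 4),          # Amy and Ben name each other: reciprocal
    ("Amy", "Cara", "cross-group", 3),
    ("Dan", "Cllr Smith", "institutional", 2),
]

# A tie is reciprocal when both people named each other.
pairs = {(a, b) for a, b, _, _ in connections}
reciprocal = sum(1 for a, b in pairs if (b, a) in pairs) // 2

# Call a tie "trusting" at a score of 4 or 5 (an assumed cut-off).
trusting = sum(1 for _, _, _, t in connections if t >= 4)

print(f"{len(connections)} connections, {reciprocal} reciprocal, {trusting} trusting")
```

Run the same calculation on the baseline map and the end-of-programme map and the difference between the two sets of numbers is your evidence of change.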
Three Types of Connections (and Why the Balance Matters)
When you start mapping the relationships in a community, a pattern emerges quickly. Not all connections do the same thing. Understanding the difference changes how you design programmes and what you measure.
Close ties within a group
These are the strong bonds between similar people. Neighbours on the same street. Members of the same faith group. The same eight people who come to every tenants’ meeting. They provide emotional support and day-to-day help. They are essential.
But they have limits. A group with only these close ties can become insular. Everyone supports each other, but nobody connects to new opportunities, resources, or perspectives outside the circle.
Ties that reach across groups
These are the connections between people who are different from each other: different backgrounds, different estates, different interests. They allow information and opportunities to flow between groups that would otherwise stay separate.
This is where the data gets interesting. The Eden Project’s community programme tracked this transition precisely. Total connections increased by 487%. Reciprocal connections increased by 648%. But the critical finding: the shift from close-group ties to cross-group connections was the single best predictor of whether someone stayed engaged. Members with mostly cross-group connections remained active. Members with only close ties eventually dropped off.
If your programme creates spaces where people only meet others like themselves, engagement will peak and then decline. If it creates opportunities to connect across groups, engagement sustains itself.
Ties to decision-makers and institutions
These are the vertical connections between community members and decision-makers: councillors, housing officers, funders, service providers. They are the hardest to build and the most transformative. They are what turn a community group from a self-help network into an organisation that can influence policy and access funding.
Housing example: A residents’ group that, through a tenant participation programme, builds direct relationships with the housing association’s board, the local authority’s regeneration team, and a community foundation. Those connections give residents a seat at the table where decisions get made.
The balance is the insight
A Scottish Government programme (the Social Innovation and Active Living Fund, delivered through VAF) measured this balance precisely across six grassroots organisations. At baseline: 311 connections, with 68% close-group ties, 25% cross-group connections, and 7% ties to decision-makers. By the end: 1,003 connections, a 223% increase. Trusting connections grew from 222 to 767.
The total growth matters. But the real story is in how the balance shifted. A programme where 68% of connections are within existing circles is building comfort zones. A programme that deliberately cultivates connections across groups and up to decision-makers is building pathways to opportunity.
If you only count total participation, you will never see this. You need to know what types of connections are forming, not just how many.
In practice: Track the ratio of close-group, cross-group, and institutional connections at baseline and follow-up. If the ratio is not shifting over time, your activities may need redesigning, even if total connection numbers are growing.
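That ratio check takes only a few lines to run. The baseline counts below reconstruct the VAF SIALF percentages cited earlier; the follow-up split is an illustrative assumption, not the programme’s actual end-state breakdown.

```python
# Compare the mix of connection types at baseline and follow-up.
def mix(counts):
    """Return each connection type as a whole-number percentage of the total."""
    total = sum(counts.values())
    return {k: round(100 * v / total) for k, v in counts.items()}

# Baseline reconstructs the VAF SIALF figures: 311 connections at 68/25/7.
baseline = {"close": 211, "cross": 78, "institutional": 22}
# Follow-up: an illustrative split of the 1,003 end-of-programme connections.
followup = {"close": 500, "cross": 380, "institutional": 123}

print(mix(baseline))   # {'close': 68, 'cross': 25, 'institutional': 7}
print(mix(followup))   # {'close': 50, 'cross': 38, 'institutional': 12}
```

If the follow-up percentages look like the baseline percentages, total growth is masking a programme that is only deepening existing circles.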
What This Looks Like in Practice: Six Programmes, Real Numbers
The most persuasive evidence for this approach is what it produces for organisations like yours. Here are six programmes, all using real data.
Heart and Sound
A breakfast club for young men aged 16 to 24 in Fife. The lads were isolated, many dealing with depression and low self-esteem. Connections grew from 41 to 161, a 293% increase. The organisation used this data to secure three years of additional funding. Before the maps, they had anecdotes. After, they had proof.
Govanhill Housing Association
Scotland’s most diverse community. Their tenant participation programme generated 3,316 connections from 55 meetings and events over 12 months. Two new Registered Tenant Organisations, a cooperative constitution, and the evidence base that secured £600,000 in follow-on funding.
Glasgow Disability Alliance
GDA used network mapping over a decade with its 3,000+ members. Before joining, the average member had just 1 connection. After sustained engagement: 158 connections per member, including 11 close ties, 6 reciprocal relationships, and 6 trusting connections. Overall network connectivity grew by 420% across the programme.
GDA’s Covid Response programme shows this in real data: a baseline map of the starting connections alongside a follow-up map of what grew.
The network was 69% cross-group connections. Members were not just meeting people like themselves. They were building relationships across different disability groups, age ranges, and communities. The programme’s impact was independently verified, and the published evaluation carried a foreword by Angela Constance, then Cabinet Secretary for Communities.
Eden Project Communities
The Eden Project’s UK-wide programme produced the numbers cited earlier: 487% increase in total connections, 648% increase in reciprocal connections. What made those numbers useful was the design insight they revealed. The programme ran activities that deliberately mixed participants across neighbourhoods and backgrounds. By tracking which connection types formed at each stage, they could see exactly when engagement tipped from short-term attendance to sustained involvement. Several organisations have since used that finding to restructure how they design community events.
648% increase in reciprocal connections when the Eden Project built cross-group ties into their programme design
NGHA Regeneration
New Gorbals Housing Association faced a question every HA dreads during regeneration: what happens to the community when you demolish their homes? When you rehouse 276 families across the city, you do not just move them. You destroy every relationship that held their daily life together. The neighbour who watched their kids. The friend they borrowed milk from. The person who checked on them when they had not been seen for a few days.
NGHA tracked those 276 families over 8 years through exactly this process: tower block demolition and community regeneration. Their longitudinal data captures what happens to the connections in a community during displacement: the steady erosion of close ties, the loss of the informal support networks that do not show up in any official record, and the slow, deliberate work of rebuilding.
Eight years of baseline data is almost unheard of in this sector. Most evaluations capture a snapshot. NGHA captured the whole story: before demolition, during displacement, and through the rebuilding of a new community. For any housing association planning regeneration, this data makes visible what is normally invisible. You can see the cost of disruption and the conditions under which communities recover. It turns a one-off programme into a long-term investment case.
VAF SIALF (programme-wide)
The SIALF programme, described in the “balance” section above, is notable for a different reason: it was one of the first Scottish Government-funded programmes to use network data as its primary evaluation method. The Scottish Government accepted the before-and-after connection data as evidence of impact. That set a precedent. If your funder asks “what evidence format do you want us to use?”, this is a reference point.
The Qualitative Side: Understanding How People Experience Their Connections
Network mapping shows you the structure: who is connected, what types of relationships exist, and how the picture changed. But community strength also has qualitative dimensions that the numbers alone cannot capture. Do people actually trust their neighbours? Do they feel they belong? Is there a culture of helping each other out?
This is where complementary assessment tools come in. Alongside network mapping, you can measure these qualitative dimensions at individual and group level, with baseline and follow-up scores across areas like trust, reciprocity, shared norms, and sense of belonging. Together, the two approaches give you the quantitative structure and the qualitative depth. One shows what exists. The other shows how people experience it.
For reporting purposes, this combination is compelling. You can tell the board that connections grew by 223%, that trust scores shifted measurably between baseline and follow-up, and that the community itself reports feeling more connected. Numbers plus narrative.
How to Use This in Your Organisation
You do not need to redesign your programmes from scratch. Here are practical steps drawn from the data above.
1. Audit your current evidence. What are you measuring right now? If the answer is attendance and satisfaction, you have the gap this article describes. That is not a criticism. Almost everyone starts here.
2. Run a baseline assessment. You cannot show change without a starting point. Even a mapping exercise with 20 to 30 participants will reveal patterns you did not expect: who is connected to whom, where the gaps are, and what types of connections dominate.
3. Design for cross-group connections, not just close ties. Most community activities naturally produce connections within existing groups. People attend with people they already know. If you want connections that reach across groups, you need to design for them: mixed-group activities, cross-estate projects, skills exchanges between different tenant groups.
68% of connections at baseline were close-group ties across the VAF SIALF programme. Without deliberate design, that ratio does not shift.
4. Build connections to decision-makers deliberately. They do not happen by accident. Create spaces where community members interact with decision-makers on equal terms: joint planning workshops, co-design sessions, community representatives on governance boards.
5. Measure the balance, not just the total. A programme that produces 500 new connections sounds impressive. But if 480 are close ties within an already tight-knit group, the community’s resilience has barely changed. Track the ratio over time.
6. Use mid-programme data to adapt. If your data shows cross-group connections are not forming, you can adjust while there is still time. Change the format. Introduce cross-group activities. The measurement is not just for reporting. It is a design tool.
7. Build measurement in from day one. Retrofitting evidence after a programme ends is always harder and less convincing than measuring as you go. If you are designing a new programme now, this is the single easiest thing to get right. Decide what you will measure before you start, and the evidence takes care of itself.
The most common regret we hear: “I wish we had started measuring this years ago.” Even a simple baseline gives you a foundation. The best time was at the start. The second best time is now.
The Financial Case
Community development is often the first budget line to get cut. Every year, teams fight to justify their existence. The ones that survive are the ones who can connect their work to outcomes the board understands.
£150,000 to £250,000 potential annual savings from a 2% improvement in sustained tenancies for a 2,500-home association
Every tenancy sustained rather than failed avoids £3,000 to £5,000 in void costs. If your community programme contributes to tenant retention, and there is growing evidence across the sector that connected tenants stay longer, that is a financial case your board can work with.
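The callout figure follows directly from those two numbers. A quick worked check of the arithmetic:

```python
# Worked version of the retention savings estimate above.
homes = 2500
improvement = 0.02                            # 2% more tenancies sustained
void_cost_low, void_cost_high = 3000, 5000    # £ avoided per sustained tenancy

tenancies_saved = homes * improvement         # 50 tenancies per year
low = tenancies_saved * void_cost_low
high = tenancies_saved * void_cost_high

print(f"£{low:,.0f} to £{high:,.0f} per year")  # £150,000 to £250,000 per year
```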
There is a wider framing too. The SRS requires housing associations to report on social value. Most are struggling to fill that section with anything quantitative. If you can show before-and-after network data, you are not just evidencing your programme. You are filling a gap in your organisation’s ESG reporting that 35 lenders are watching.
9 Community Councils Group, East Ayrshire
Nine formerly isolated community councils used our Compass assessment to build trust and allocate wind farm community benefit funds transparently. First round: £809,848 in grants awarded, leveraging £1.9M in match funding (a 1:2.35 ratio). 79 projects funded, 77 jobs supported. Won the Community Wealth Building Award 2024. The trajectory points to over £100M in community benefit over the wind farm lifetime.
The organisations in this article did not just save their programmes. They grew them. 9CCG turned distrust into £100M in shared prosperity. Govanhill secured £600,000. Heart and Sound secured three years of funding. The common factor was not better marketing. It was better evidence.
What to Do Next
If any of this resonated with where you are right now, here is where to start:
- Look at your next funding report. What evidence can you provide beyond attendance? If there is a gap, that is the problem to solve.
- Talk to your team about the three types of connections. Once your team understands the difference between close-group ties, cross-group connections, and links to decision-makers, they will start noticing which ones their activities produce, without needing new tools or budgets.
- Consider a baseline. Even a simple starting point gives you a foundation to build on, and the best time to capture it is before your next programme begins.
If you want to explore how this could work for your organisation, get in touch. We have spent over 20 years working alongside housing associations, disability organisations, and government programmes to measure and evidence community impact. We are always happy to have a conversation about what would fit your context.