While larger charities devote resources to measuring their impact, many smaller organisations don’t necessarily have the capacity.
Measuring outcomes has been a key focus for Anglicare WA for the best part of a decade.
The not for profit, which runs 89 social services programs, has been working on how best to measure the success of its offerings to better communicate its value to funders, staff and clients.
The organisation has built a team of data and social impact specialists who help support its frontline staff to evaluate their work.
The approach it has settled on is called results-based accountability, which asks three questions of a program: how much it did, how well it did it, and whether anyone is better off.
“For a for-purpose organisation, that’s actually really critical to help people to understand how they are part of the picture making positive change in the world,” Ms Boldy told Business News.
“We try not to treat it like dull data entry but to support people to understand how their data is important to understanding the difference they are creating through the work they are doing.”
Ms Boldy said the process of implementing evaluation tools had presented challenges.
It was difficult to assess whether progress was due to a particular program or other factors in the community, she said.
“That’s what you would expect; people are complex, communities are complex and so we are a little bit cautious about attribution when it comes to speaking about the change we have been able to facilitate,” Ms Boldy said.
She said staff required ongoing support, coaching, encouragement and training.
While the organisation could evaluate individual programs, it was working to aggregate data to arrive at figures that could describe the whole organisation.
“We have got a really strong desire to understand the full impact, the full picture of the impact that Anglicare has had in WA, but it’s really not possible to measure the whole organisation with just one or a few measures,” Ms Boldy said.
Anglicare WA’s size gave it an advantage over smaller organisations when it came to measuring outcomes, she said.
“As a larger organisation, we can attract specialists in-house to guide this work and we are very fortunate we can do that, but also really mindful of the fact that smaller organisations, or those with fewer discretionary resources, would really struggle to be able to invest as much time and effort into the task of outcomes measurement as we can,” Ms Boldy told Business News.
“Broadly speaking, in the community services sector there is much greater awareness of the value of measuring outcomes, but capacity is a real issue.”
In 2019, research from the Centre for Social Impact found larger organisations (turnover greater than $1 million) were more likely to measure outcomes than smaller organisations.
It found 5.3 per cent of organisations surveyed had a research and evaluation unit, 9.5 per cent said they trained staff in data analysis, and 9.2 per cent indicated they had staff employed to undertake data collection.
Fremantle-based not for profit St Patrick’s Community Support Centre chief executive, Michael Piu, said measuring impact was important to the organisation but finding the resources was difficult.
St Pat’s, with revenue of about $7.2 million in the 2020 financial year, runs several programs offering services for people who are homeless.
Mr Piu said St Pat’s evaluation of its programs was ad hoc and mostly driven by the requirements of funders.
He said some government funding required the use of certain databases, which caused challenges internally.
“One, you have to make that the priority, working with that database, and two, it limits your ability to collect data inhouse; that’s been a bit of a frustration,” Mr Piu told Business News.
“Then, on the other hand, you’ve got other programs that are not funded at all, and you have to make do within your own resourcing.”
However, he said St Pat’s had recently commissioned a social enterprise in the UK to roll out a data collection program that could be used by all staff across all services.
“Therefore, what you are achieving is less time taken up by staff data input, which means they are more likely to engage,” Mr Piu said.
“You get better quality data, we can control the data we collect, and there’s efficiency on the other end in terms of reporting.”
Speaking at a recent Centre for Social Impact webinar, University of New South Wales PhD candidate Karen Wilcox said methods like this, which draw on data an organisation already produces, allowed it to analyse its effectiveness on a budget.
Ms Wilcox said information organisations already collected, like case notes, could be useful.
“It’s about using what you are already doing and bringing your team, frontline team members, on board to understanding that turning this work into an outcomes language is actually beneficial for everyone, including the people you are working with and including the team,” Ms Wilcox told the webinar.
Youth Focus instructs its frontline staff to map its clients’ progress using clinical tools to evaluate the effectiveness of its mental health programs.
However, chief executive Arthur Papakotsias said it was sometimes difficult to assess progress in the mental health space because it was hard to know what the trajectory of an illness was.
“When people go into a health care service, particularly a mental health care service, some people improve, some people stay the same and some people get worse,” Mr Papakotsias told Business News.
“A lot of people who come to us come to us in a time of crisis, they are on a path of deterioration. So, a lot of what we try to do is stop that deterioration as a priority and stop the effect of that crisis and then try to help them develop their own resilience and their own coping skills.”
Mr Papakotsias said it was beneficial for staff to sit with clients and chat about the tool to help both parties be on the same page.
Youth Focus also uses customer feedback surveys to judge whether people have had a positive experience.
“Getting better is an important thing, but having a positive experience is a very important thing,” Mr Papakotsias said.
“If people have a negative experience of the service, then they won’t want to go back to it.”
He said it was difficult to retain funding without being able to verify that the organisation’s work was delivering reasonable outcomes.
“In this day and age, whether it’s a corporate partnership where the partners require you to demonstrate the effectiveness of the service, or if it’s government funding, people do want to know what impacts your work has made,” Mr Papakotsias said.
“A lot of funders are moving away from outputs to outcomes, and even going from outcomes to impacts.
“We are now starting to look at social impact; what impact has the service had in terms of their quality of life?”
St Pat’s chief executive, Mr Piu, said the push towards measuring impact was beneficial for all parties.
“From a philanthropic point of view, they want to support us in building the evidence base so if it is a good, effective program, government might look to step into that space,” he said.
Mr Piu said program evaluation had proved its worth when evidence from St Pat’s 20 Homes 20 Lives project, gathered by the University of Western Australia, was used in the state government’s WA Housing Strategy 2020-2030.
“All of the evidence for that was quite pivotal really in the state government making really important investments in the Housing First approach,” he said.