Most days, I love my job as Research and Evaluation Manager at The Ian Potter Foundation – meeting passionate people pursuing interesting projects. But there’s another, darker side of this job – closing final reports on grants made.
Since arriving at The Ian Potter Foundation four years ago, I have closed 997 final reports.
Sometimes this activity can be exhilarating: reading about the collaboration, partnerships, sustainability and leverage of hard-working grantees. But sometimes it is painful, requiring copious amounts of chocolate to navigate the dark labyrinth of intentions gone awry.
The foundation has been compiling grantee key learnings by program area for all grants closed after 1 January 2010 (this was a lengthy process as final reports were paper-based prior to 2015).
These key learnings include advice for grantseekers such as:
“Capture data from the very first activity. In one case, the grantee believed conducting a survey before the screening of a film would have been useful in terms of measuring the difference in audience understanding of and engagement with an issue before and after the screening.”
“When seeking feedback on events, a grantee provided the following suggestions:
- A portable tablet rather than a fixed station makes it easier to solicit feedback.
- A staff member should be available to encourage and guide users.
- The survey should take no longer than four minutes to complete.
- An incentive such as a prize draw could increase participation rates.”
“Plan how to manage large amounts of data. Use a database system to avoid double-entering into Excel.”
And the list goes on.
Still, there is much more to cover. For example, closing 997 grants means that I have read 997 acquittal budgets (for the love of Pete, sign your acquittal budget – auditors do check this).
For this reason, our key learnings documents each contain sections on budgeting advice. However, in this article, I would like to discuss one tiny, specific sub-topic about which I am very passionate: the evaluation budget within a grant budget.
Here are 10 tips I’d love to pass along, and I hope they will save some headaches:
1. Fully budget for the evaluation in your project budget.
Refusing to fund evaluation in a grant budget became passé in Australia in 2014. Seriously, if you have applied for evaluation funding within a grant budget, that should be just fine. Do not try to “trim it” to be more competitive.
A grantmaker looks at the merits of the project, then funds accordingly. If you have properly budgeted for evaluation in your application and it is eliminated, someone needs to have a frank conversation. I’m happy to give them a ring.
2. Assess the cost for a quality evaluation.
Send a Request for Quotation to three or four evaluators to get a clear sense of the cost range. Write up a brief plan (this Microsoft Word Doc can provide a model) and send it to quality evaluators. The Ian Potter Foundation compiles a list of recommended evaluators, which is updated each year.
How do you know if they are quality? Ask to read a work sample. Ask how they enhance clients’ evaluation capacity. Ask if their evaluation was of a high enough standard to lead to government adoption of a program and/or profound improvement. Don’t simply select the cheapest evaluator; choose the one whose work impresses you and whose values are well-aligned with yours.
I’ve seen several multi-year grantees scramble for evaluators halfway through a project after the first evaluator produced a sub-standard initial report. Not fun. And costly.
3. Budget for your evaluation from day one.
Baseline data does not grow on trees. Neither does ethics approval. If you have a multi-year project, include an evaluation budget for every year, not just the final year. Consider the timing of surveys: could running them at a different time increase the number of responses? Be aware that there are substantial challenges (including the time taken) in obtaining data from government organisations. One grantee waited almost eight months to receive relevant government data. This can affect a budget – be prepared.
4. Reimburse your staff appropriately for data collection.
If your staff are collecting the data, ensure that you have budgeted for their extra time. It is impossible to be a full-time social worker and collect baseline data on 20 young people without extra hours or assistance. Factor this in so staff do not resent the evaluation process.
5. It’s not how much data you have; it is how you use it.
Avoid over-collection and survey fatigue. Collecting data just to “grow dusty” on an Excel spreadsheet or to impress a funder is poor form and, in my mind, unethical. Pick the clear questions that you need to answer for your end-game stakeholders. For instance, a state government may want to see numeracy levels increasing. If that’s the case, then collecting data on students’ literacy gains is neither relevant nor necessary.
6. Budget for secure IT.
If you are collecting sensitive information (including birthdates, mental health status and many other pieces of personal information), ensure that you are storing that data safely. Do you need a password-protected laptop? Is data stored on a secure server? Talk this through with your evaluator and/or IT team. It’s more impressive for an organisation to ask for funding for a reasonable data security plan than to expose clients’ data. Foundations depend on maintaining their reputations – please don’t put ours at risk by exposing your clients’ private data.
7. Evaluation should be continuous and can include monitoring.
During your program/project, you should monitor for “regression”. For example, one reading program’s success stemmed from the one-to-one relationship between the volunteer buddy and the child; but without monitoring, some teachers used the buddy as a general classroom aide rather than a dedicated helper for one child.
8. Budget time for comparison data.
Are you planning to use the school down the road? A randomised control group? Federal standards? These all add credibility but take time, effort and money. Do not get caught out at the eleventh hour trying to put together a control group.
9. Draw on your resources.
Recognise that clients/participants can also be a source of feedback to ensure continual improvement. Unemployed jobseekers, for example, can act as experts, providing the grantee with insight into user behaviour and motivations and thereby enabling the creation of a much more effective product or service.
10. Learn from mistakes – and if something goes wrong, talk to your funder.
Quality evaluation includes continuous improvement. For example, the original bionic eye failed to meet the surgical and patient requirements, so the grantee developed its own bionic eye components to fulfil the anatomical, surgical, patient and safety needs.
Hopefully, these tips will help you avoid making the same mistakes when it comes to evaluation budgets.