
Insight
We have been conducting summative assessments – also known as 'evaluations' – for several years, providing local authorities, universities and other organisations across the UK with evaluations of their programmes and interventions. This has predominantly been for interventions funded by the European Regional Development Fund (ERDF), where an evaluation is a condition of the funding contract.
A summative assessment is important to demonstrate the impact of the programme on beneficiaries (often SMEs), as well as the broader regional economic impact, such as contribution to GVA and employment. In addition, evaluations measure the outputs and outcomes of the programme against the metrics which are typically recorded in the project Logic Model prior to the programme commencing. Programme processes are also evaluated, specifically project management and delivery, and the value-for-money of the programme is also assessed (described below).
In this article we detail our approach to undertaking these revealing and important studies that reflect, shape and inform future regional economic development initiatives.
Our Evaluation Methodology
Our project approach and methodology incorporate elements from both ERDF and central government guidance. In alignment with the Magenta Book (government guidance on evaluation), we adopt mixed methodologies to ensure successful delivery of a robust programme (intervention) evaluation – this strategy is adopted whether the requirement is for an Interim, a Final, or a combined Interim and Final Evaluation.
We believe mixed methodologies offer variety in evaluation theories and practices, rather than adhering to a singular approach that may not be the most appropriate for the assessment. This includes a process evaluation and an impact evaluation, each commencing with a series of questions which form the focus of the assessment, for example:
- What can be learned from how the intervention was delivered? (process)
- What difference did the intervention make? (impact)
We also conduct a value-for-money evaluation which assesses the cost-effectiveness of achieving programme outcomes compared to alternative ways of producing the same/similar outputs. Again, we begin with questions to answer, for example:
- Is the intervention the best use of resources?
We also adopt the three-phased approach outlined in the ESIF-GN-1-033 Summative Assessment Guidance, having successfully implemented this approach across multiple summative assessment projects in the past. The activities we typically carry out are as follows:
Phase 1 – Assessment Planning
Activity 1 – Start-up Meeting
A kick-off meeting attended by key members of the programme/intervention delivery team, designed to build understanding of the project rationale and context, and to fine-tune the evaluation approach. This includes discussing the project Logic Model and Summative Assessment Plan in advance (assuming they are available, which sometimes they are not – in that situation we'd likely support in developing these key documents).
Activity 2 – Review Project Data
A comprehensive review of key project documents, including the full project application and any subsequent amendments; the project Logic Model (if available); reports/evaluations of beneficiary projects; case studies; quarterly reports to the Managing Authority; and any other available documents.
Phase 2 – Data Collection and Reporting
Activity 3 – Project Team Interviews
Interviews conducted with key members of the project management and delivery team to obtain feedback on project progress and outputs, and any issues that have arisen during the project regarding project management, support delivery and reporting.
Activity 4 – Stakeholder Survey
Detailed discussions with internal stakeholders and up to four key external stakeholders. The purpose of this engagement is to obtain feedback on the delivery and impact of the intervention from stakeholders that have a high level of understanding of business support needs and economic development strategies.
Activity 5 – Beneficiary Survey
Surveying the beneficiary population to assess programme delivery and management, project indicators, deliverables, the impact of ERDF cross-cutting themes, and related matters. Further analysis of the project's strategic alignment and market need is also obtained at this stage of the evaluation.
Activity 6 – Data Analysis
We continuously analyse all data and information we collect during the evaluation.
Phase 3 – Reporting and Dissemination
This phase considers the economic and policy context, including market failure, project objectives and rationale for the delivery approach. It entails qualitative analysis of the implementation of the project, selection procedures, delivery performance, governance and management.
Activity 7 – Draft Report
A draft version of the summative assessment report is delivered to the client for review and comment.
Activity 8 – Final Report
The final version of the summative assessment report, incorporating feedback on the draft and often accompanied by an ERDF Summary using the ESIF template, is usually delivered in advance of programme completion.
Activity 9 – Dissemination Materials
A package of materials for distribution is produced. These present and highlight the summative assessment findings and are customised to the client's needs and the stakeholders who will receive them; infographics, online articles and presentations all form part of the package.
Client review meetings occur at key intervals during the project, allowing the client to comment, raise any issues or concerns, and receive progress updates against the project plan and milestones.
Added Value
Here are some added-value extras we tend to offer clients, as well as areas we believe offer value-for-money more generally:
- Data monitoring system support – advising on data gathered and monitored over the course of the remainder of the programme or for future programmes
- Customised/tailored dissemination materials – a package of materials designed for varied mediums (online, events, etc), and tailored to the needs of the stakeholders receiving them
- Review against legacy RD&I metrics – benchmarking the programme's progress against legacy RD&I metrics, such as 'contribution to GVA as a result of R&D'
- Domain expertise – summative assessment, SME support initiatives, varied sectors, RD&I
- Cross-cutting themes analysis – analysing horizontal principles within programme delivery
