FREE Online Webinar

22 May 2024 - 15:00 GMT

Evaluating Digital Projects: Managing ‘Theory of Change’ and Turning Challenges into Opportunities.

Register

Evaluating Digital Projects

The Core Principles

PUBLIC’s latest guidance provides advice for teams evaluating their digital and technology projects.

Check out a sneak peek at the 18 core principles to consider when conducting such digital evaluations, or dive deeper with the Full Guidebook, complete with application to a hypothetical use case.
Read the Full Guidebook

What makes evaluating a digital project different?

Evaluations of other types of project are typically divided into three strands: process, impact and economic evaluations. While digital projects differ in their aims, methods and scale, they usually share some important common factors, which often shape how they should be evaluated.

These include, but are not limited to, the following factors:

  • Digital projects are designed to be iterative and agile, changing quickly to meet users’ needs and the wider organisational context.
  • Available technologies affecting digital projects can also change rapidly, meaning that new things can become possible, or significantly more affordable, during the lifecycle of a project.
  • Digital projects are usually intended to be accompanied by broader process, organisational or cultural change, as well as technical system change.

As we discuss the key principles for conducting a digital evaluation in the sections below, we aim to highlight areas that differ from non-digital or traditional evaluations.

Digital projects should make services better

Digital projects are almost always enablers of other outcomes or service priorities; they are not an end in themselves. Usually, they aim to make processes smoother, faster, more intuitive, or cheaper - in other words, better.

Many digital projects have 'channel-shift' benefits

Many digital projects will ultimately aim to replace a non-digital channel (in-person visits, telephone calls, letters, etc.) with a fully digital channel. This is called a ‘channel shift’, and a project’s efficiency and cost benefits are sometimes calculated by measuring the time saved from in-person visits - for example to post offices, local council buildings, or passport offices.
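
As an illustration of the arithmetic behind such channel-shift estimates, here is a minimal sketch; all figures (visit volumes, minutes saved, staff cost rates) are hypothetical assumptions rather than values from the guidebook.

```python
# Minimal sketch of a channel-shift benefit calculation.
# All figures below are illustrative assumptions, not values from the guidebook.

in_person_visits_avoided = 12_000      # visits per year shifted to the digital channel
minutes_saved_per_visit = 45           # citizen travel and queueing time avoided
staff_minutes_saved_per_visit = 10     # counter/processing time avoided
staff_cost_per_hour = 28.0             # assumed fully loaded staff cost (GBP)

citizen_hours_saved = in_person_visits_avoided * minutes_saved_per_visit / 60
staff_hours_saved = in_person_visits_avoided * staff_minutes_saved_per_visit / 60
staff_cost_saving = staff_hours_saved * staff_cost_per_hour

print(f"Citizen time saved: {citizen_hours_saved:,.0f} hours/year")
print(f"Staff time saved:   {staff_hours_saved:,.0f} hours/year")
print(f"Staff cost saving:  £{staff_cost_saving:,.0f}/year")
```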

Improved digital services can drive citizens’ trust in and perception of government

A more intuitive digital service makes it easier for the user, or the operator, to complete the service journey. It might also make people more willing to engage with the service or organisation in question. Internally, having more user-friendly services for operators could yield important benefits, like increased staff well-being and retention.

Performance and usability are linked outcomes

When tracking outcomes, it is important to measure the technical performance of the service (its uptime, speed and latency, for instance) as well as the usability of the service (how much users enjoy using it, what challenges they have overcome or still face). Performance and usability are linked and should be treated as such in the Theory of Change.
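
As a rough illustration of how the technical side of these linked outcomes might be quantified, the sketch below derives an uptime percentage and a 95th-percentile latency from monitoring samples; the sample data and field layout are assumptions for illustration only.

```python
# Minimal sketch: deriving uptime and latency figures from raw monitoring samples.
# The sample data and record layout are illustrative assumptions.
from statistics import quantiles

# (timestamp, response_time_ms, succeeded) tuples from a monitoring probe
samples = [
    ("2024-05-01T10:00", 120, True),
    ("2024-05-01T10:01", 340, True),
    ("2024-05-01T10:02", 95,  True),
    ("2024-05-01T10:03", 0,   False),   # failed request counts against uptime
    ("2024-05-01T10:04", 210, True),
]

uptime = sum(ok for _, _, ok in samples) / len(samples)
latencies = sorted(ms for _, ms, ok in samples if ok)
p95 = quantiles(latencies, n=20)[-1]    # estimate of the 95th-percentile latency

print(f"Uptime: {uptime:.1%}, p95 latency: {p95:.0f} ms")
```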

Digital projects can increase cyber resilience and reduce cyber risk

Modern technology almost always offers better cyber security than legacy systems as it typically uses more advanced authentication methods like biometrics and two-factor authentication. It is more regularly maintained and patched, and will include security-by-design considerations in the development process. In particular, cloud-hosted services are often more secure than legacy systems.

Digital projects can be highly scalable, but not always

One of the major claimed benefits of digital projects and services is that they can be easy to scale, due to assets being re-usable. However, scaling services from one context to another requires large amounts of dedicated time, resource and technical skill. Being realistic about barriers and constraints to a digital solution’s scalability is key.

Digital skill uptake is best measured in practice on a specific task or project

Digital skills are broader and more context-dependent than ‘traditional’ skills, and often relate to how people solve problems in a variety of contexts. For teams measuring improvements in the digital skills of a group of people, the best approach is usually to observe those skills being applied in a real-world context. This can be done by applying ethnographic methods, observing specific projects or tasks.

Digital projects can lead to sustainability outcomes

Teams should measure how their service has driven more sustainable outcomes compared with previous services, but should also track the sustainability of the new service itself. Across all hardware, software and services, teams can take steps to reduce the environmental footprint of their technology project, many of which are covered in Defra’s ‘Greening Government ICT’ report.

Teams should anticipate flexibility in their evaluation design

All projects are subject to change and review, but digital projects are designed to be agile and iterative, and to pivot quickly. The general advice is to ensure that Theories of Change and objectives are reviewed regularly throughout the lifecycle of ongoing projects. In particular, teams should expect the overall vision of their Theory of Change to stay the same, while the sub-steps needed to get there may change. The earlier such changes are reflected in the evaluation design, the more focused the evaluation effort remains.

Digital projects often have a unique adoption profile

Digital projects are often innovative and therefore have a distinct adoption profile. A common profile for digital projects is to have very few first movers followed by a number of early adopters, who champion the service. This builds trust and leads to mass adoption, often triggered by a specific event or use case. Finally, innovation adoption often has a stubborn long tail of potential users who adopt a service very late - or sometimes not at all.

Sampling strategies should be guided by ‘user personas’

Digital projects will often segment their potential users into ‘personas’, representing different types of users, contexts and motivations. When evaluation teams develop their sampling or engagement approach, they should ensure they can segment the data into different user personas, and report outcomes against them.
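
As a simple illustration of segmenting evaluation data by persona, the sketch below reports a completion rate and average completion time per persona; the persona labels and outcome fields are hypothetical.

```python
# Minimal sketch of reporting an outcome measure by user persona.
# The persona labels and outcome fields are illustrative assumptions.
import pandas as pd

responses = pd.DataFrame({
    "persona": ["first-time applicant", "frequent user", "assisted-digital user",
                "frequent user", "first-time applicant", "assisted-digital user"],
    "completed": [True, True, False, True, False, False],
    "minutes_taken": [14, 6, 22, 5, 18, 25],
})

by_persona = responses.groupby("persona").agg(
    completion_rate=("completed", "mean"),
    avg_minutes=("minutes_taken", "mean"),
    sample_size=("completed", "size"),
)
print(by_persona)
```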

Inclusion and accessibility should be tested as key outcomes

Digital teams will always think about the inclusiveness and accessibility of their services, to ensure that they can be used by everyone. Inclusion should therefore also be a key consideration for evaluation teams. GDS has developed a Digital Inclusion Toolkit and a 9-point Digital Inclusion Scale, which can be helpful starting points for transparently testing how outcomes have affected different target participant groups.

Extrapolations and counterfactuals should be grounded in digital realities

Teams can supplement more traditional matching variables, like population or size, with specific digital considerations that could impact an organisation’s ability to adopt an intervention. This could include the use of operating systems, current digital maturity, or presence of legacy technology within the organisation.
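
One simple way to operationalise this is to include digital variables in the distance measure used to pick comparator organisations. The sketch below is a hypothetical nearest-comparator selection, not a prescribed matching methodology; the organisations, scores and weights are illustrative assumptions.

```python
# Minimal sketch of selecting a comparator organisation using digital matching
# variables alongside a traditional one. All names, scores and weights are
# illustrative assumptions.

# (name, population_thousands, digital_maturity_score_0_to_5, has_legacy_system)
candidates = [
    ("Authority A", 180, 3.5, 1),
    ("Authority B", 420, 2.0, 1),
    ("Authority C", 200, 3.0, 0),
    ("Authority D", 190, 1.5, 1),
]
treated = ("Pilot Authority", 195, 3.2, 0)

def distance(a, b):
    # Roughly normalise each dimension to a comparable scale before comparing.
    return (abs(a[1] - b[1]) / 500      # population
            + abs(a[2] - b[2]) / 5      # digital maturity
            + abs(a[3] - b[3]))         # presence of legacy technology

best_match = min(candidates, key=lambda c: distance(treated, c))
print("Closest comparator:", best_match[0])
```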

Evaluation design should adapt to changing technology contexts

Digital projects can change a lot. And quickly. In particular, the technology context around them can change (as can the underlying legal and regulatory context). Evaluation teams need to constantly monitor relevant technological changes and assess if the approaches originally decided upon are still valid in light of these changes, or if they need to be amended.

Service Assessments can provide rich context for evaluations

Service Assessments - which follow the Service Standard laid out by GDS - are detailed appraisals of digital projects, conducted by external assessors, covering a number of important project success factors. Our advice for evaluation teams is to use these Service Assessment reports and documentation in their evaluations, especially in their process evaluations, which may cover very similar topics.

Web analytics data is important for measuring online services

Most websites and online services collect a number of common user analytics to monitor their performance. These web analytics can form an important part of any digital service evaluation, and aligning with ongoing GDS and UK Government approaches using Google Analytics 4 can help to ensure that evaluations are joined up across Government.
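
As one illustration, the sketch below computes a journey completion rate from an exported analytics event log. The event names ("journey_start", "journey_complete") and the CSV export format are assumptions for illustration; they are not a specific GA4 schema or API.

```python
# Minimal sketch of turning an exported web analytics event log into a
# completion rate. Event names and CSV layout are illustrative assumptions.
import csv
from collections import Counter

def completion_rate(csv_path: str) -> float:
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):          # expects an 'event_name' column
            counts[row["event_name"]] += 1
    starts = counts["journey_start"]
    completes = counts["journey_complete"]
    return completes / starts if starts else 0.0

# Example usage: print(f"{completion_rate('analytics_export.csv'):.1%}")
```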

Digital projects can embed native data collection tools

Digital projects often offer the opportunity to embed tools to run randomised controlled trials. This is because - using A/B testing approaches - there can be a randomised assignment of which users engage with a given service vs. a previous version of the service. This means that, at the level of the user, it is possible to compare outcomes for the new service with those for the old one. The technology that underpins this kind of randomisation is usually just a simple 50-50 random outcome generator for when a user clicks on a link, or starts an online journey.
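
A minimal sketch of that 50-50 assignment is shown below; hashing a stable user identifier (an assumption about how users are identified, not part of the original text) keeps each user in the same arm across repeat visits, which makes outcome comparison cleaner.

```python
# Minimal sketch of 50-50 random assignment for an A/B test.
# Hashing a stable user identifier is an illustrative assumption that keeps
# each user in the same arm across repeat visits.
import hashlib

def assign_variant(user_id: str) -> str:
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "new_service" if int(digest, 16) % 2 == 0 else "old_service"

for uid in ["user-001", "user-002", "user-003"]:
    print(uid, "->", assign_variant(uid))
```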

Contract and spend data can be useful for some digital evaluations

One source of data that can be highly useful for teams evaluating public sector digital projects is contract data. Contract data - or spend data - is data about the public contracts that public authorities have with their suppliers. In a digital context, these are the contracts that public authorities hold with their digital vendors. This data can be used for many things: like measuring direct cost savings, comparing different authorities, or measuring changes in public sector vendor markets.
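
As a simple illustration of the first of those uses, the sketch below compares annual digital contract spend before and after a switch away from legacy hosting; the authorities, categories and values are entirely hypothetical.

```python
# Minimal sketch of comparing digital contract spend before and after an
# intervention. The records below are illustrative assumptions; real spend
# data would come from published contract registers.

contracts = [
    {"authority": "Council X", "year": 2021, "category": "legacy hosting", "value": 350_000},
    {"authority": "Council X", "year": 2023, "category": "cloud hosting",  "value": 210_000},
    {"authority": "Council Y", "year": 2021, "category": "legacy hosting", "value": 400_000},
    {"authority": "Council Y", "year": 2023, "category": "cloud hosting",  "value": 390_000},
]

def annual_spend(authority: str, year: int) -> int:
    return sum(c["value"] for c in contracts
               if c["authority"] == authority and c["year"] == year)

for authority in ("Council X", "Council Y"):
    before, after = annual_spend(authority, 2021), annual_spend(authority, 2023)
    print(f"{authority}: £{before - after:,} annual saving")
```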

The Boot Camp Challenges and Regions for 2023

GovTech Connect will run four boot camps covering all European regions and several industries across 2023 and 2024. In 2023, the project will be focusing on the following challenges and regions:

Northern Europe: Achieving Net-Zero - Supporting the green transition of services
1. Open Sustainability Data: How can we use technologies to promote transparency about sustainability outcomes in government projects in the Nordic-Baltic region?
2. Energy-Efficient Buildings: How can we use technology to improve energy efficiency in public buildings in the Nordic-Baltic region?
3. Sustainable Tourism: How can we utilize technology to enhance sustainable tourism in the Nordic-Baltic region and reduce its environmental impact?

Southern Europe: Digitising Public Services for Enhanced Civic Engagement, Accessibility, and Transparency
1. Inclusive and Accessible Services: How can we use new technologies to make public services more inclusive and accessible for all users and communities?
2. Transparent and Open Government: How can we use new technologies and data to promote greater transparency and openness about public services and government outcomes?
3. Community Engagement and Collaboration: How can we use new technologies to facilitate more meaningful and productive dialogues between public authorities and the communities they serve?

Key Dates

Live Q&A: 20 and 25 April 2023
Applications (extended): open until 17 May 2023 for Northern Europe, and 23 June 2023 for Southern Europe
Selection Release: middle of May 2023
Boot Camp 1: 5 June to 30 June 2023
Boot Camp 2: 4 September to 29 September 2023

FAQ

Live Q&A Session

What is GovTech Connect?

The European Commission’s DG Connect has recently launched GovTech Connect, a pilot project aiming to foster the digitisation of the public sector through the use of an innovative European GovTech Platform - GovTech Connect. Carried out by a consortium formed by Intellera Consulting, PUBLIC Deutschland, The Lisbon Council, and Politecnico di Milano, GovTech Connect aims to support public administrations in the adoption of cost-effective and flexible digital solutions by articulating the GovTech ecosystem in the European public sector, over the course of two years.

What are the GovTech Connect Boot camps?

The project will run two open innovation activities per year, including two boot camps aiming to identify innovative solutions in different regions across Europe, to solve key pressing challenges.

The boot camps will run as accelerated and intensive training programmes. Each boot camp will be delivered online over four weeks and will culminate with a final online pitch event in week 6.

Content will be delivered in the form of masterclasses, workshops, roundtables and focus groups. Each week we will start with a full day of training, housing both lecture and activity-based sessions.

Within the boot camps, we will also facilitate a number of opportunities for start-ups to work together with citizens and co-develop user-centric products. Each start-up will be matched with a small pool of citizens that will be recruited by our consortium or directly by the start-ups, at the beginning of the programme.

What are the regions and challenges for 2023?

The project will run two open innovation activities per year, including two boot camps aiming to identify innovative solutions in different regions across Europe, to solve key pressing challenges.

For 2023, the regions and challenges are the following:

  • Northern Europe: Achieving Net-Zero - Supporting the green transition of services.
  • Southern Europe: Digitising Public Services for Enhanced Civic Engagement, Accessibility, and Transparency.

Who is running the Boot camps?

The boot camps will be delivered by PUBLIC, with contributions from Politecnico di Milano, Intellera and Lisbon Council, all part of the Consortium implementing the pilot project GovTech Connect on behalf of the European Commission. PUBLIC has significant experience in working with European governments and innovators, having facilitated 100+ collaborations between government organisations and SMEs across 4 countries through thought-leadership and consulting activities, product development, innovation programmes and events.

Boot Camp Format

What is the time and resource commitment?

The boot camps will run as accelerated and intensive training programmes for 4 weeks, with a dedicated learning day once per week. At least 1 startup team member must attend all sessions. The boot camps are funded by the EU.

What is the learning content of the Boot camps?

The content of the boot camps includes topics on working with the government, innovative business models, sources of funding, pitch preparation and a closing final pitch.

Content will be delivered in the form of masterclasses, workshops, roundtables and focus groups. Each week we will start with a full day of training, housing both lecture and activity-based sessions.

Within the boot camps, we will also facilitate a number of opportunities for start-ups to work together with citizens and co-develop user-centric products. Each start-up will be matched with a small pool of citizens that will be recruited by our consortium or directly by the start-ups, at the beginning of the programme.

When will the Boot camps be delivered?

  • Northern Europe, Achieving Net-Zero - Supporting the green transition of services: 5 June to 30 June 2023.
  • Southern Europe, Digitising Public Services for Enhanced Civic Engagement, Accessibility, and Transparency: 4 September to 29 September 2023.

Who will be delivering the sessions?

Training sessions are region-specific and will be delivered, based on their personal experience, by recognised experts drawn from our network of leading entrepreneurs, intrapreneurs, academics and public sector officials, with diverse expertise that spans public sector procurement, investment, entrepreneurship, service design, design thinking and more.

Open Call

What startups can apply?

The boot camp is targeted at early-stage start-ups that are developing technology-based solutions with the capacity to generate value for the public sector. These startups are usually in their pre-seed/seed phase. However, more mature startups that are new to working with governments are also encouraged to apply. Startups must fit within one of the challenges and its corresponding region:

  • Companies operating in Northern Europe with solutions to the Net Zero challenge. Must have an office in any of these countries: Denmark, Estonia, Finland, Iceland, Latvia, Lithuania, Norway, Sweden.
  • Companies operating in Southern Europe with solutions to the Digitising Public Services challenge. Must have an office in any of these countries: Cyprus, Greece, Italy, Malta, Portugal, Spain, Albania, Montenegro, North Macedonia, Serbia, Kosovo, Bosnia.

What does the application process look like?

Interested startups can apply to one of the boot camps via the application portal where they will be requested to provide general information on their company and team as well as more detailed information on their proposed solution and how it addresses the relevant challenge.

What is the application deadline?

The application deadline for both boot camps is 5 May 2023.

What are the benefits for the participating startups?

The boot camps offer a number of unique benefits to startups:

  • Learn how to effectively approach working with governments and hear from a variety of stakeholders with first-hand experience on how to implement a GovTech solution successfully
  • Explore key methods for co-designing user-centred and compliant GovTech products together with citizens
  • Receive expert advice on how to successfully build your B2G (Business to Government) model and pricing strategy
  • Gain knowledge, skills and tools to master the process of accessing funding and raising investments in the GovTech sector
  • Showcase your solution to investors and buyers on pitch day and receive buy-in from relevant stakeholders to continue collaborating through future pilot phases

What are other ways to participate if I’m not eligible?

We encourage all startups, including those from other regions of Europe, to subscribe to our JoinUp page and follow the project LinkedIn page to receive updates on any other project activities, in particular the upcoming boot camps with a focus on Western Europe and Eastern & Central Europe in 2024.

General Information

What does the selection process look like?

Applications will be assessed by a jury panel made up of representatives from the project team, sectoral experts and citizen engagement leads.

Eligible startups will be scored against the following criteria: Team Experience, Company Experience, Solution Feasibility and Citizen Engagement.

What is the expected resolution date?

The resolution is expected in the middle of May 2023.

*More questions can be submitted before the Live Q&A sessions through this link.

Want to find out more?

The full Evaluating Digital Projects guidebook provides an introduction to evaluations and digital evaluations, expands upon the 18 principles listed above, and applies these principles to a hypothetical 'CivicAI' use case!

Alternatively, feel free to reach out to the authors at the email below.
Read the Full Guidebook
Get in touch