Infrastructure Project Authority

IPA Benchmarking Hub Service (Beta)

22 Incomplete applications

15 SME, 7 large

2 Completed applications

2 SME, 0 large

Important dates

Published
Tuesday 1 June 2021
Deadline for asking questions
Tuesday 8 June 2021 at 11:59pm GMT
Closing date for applications
Tuesday 15 June 2021 at 11:59pm GMT

Overview

Off-payroll (IR35) determination
Summary of the work
The IPA Benchmarking Service product will house a core set of features derived from user research and co-design workshops during Alpha.
These features will make up the MVP to be developed during Private Beta, which is the current scope and requirement for the next 5 months.
Latest start date
Thursday 1 July 2021
Expected contract length
3-5 months, role dependent
Location
No specific location, for example they can work remotely
Organisation the work is for
Infrastructure Project Authority
Budget range
£300,000 to £400,000

About the work

Why the work is being done
The benchmarking hub will collect project data from completed projects and provide access to this data to project leaders across government to support project investment decisions, helping to leverage the UK Government’s project portfolio data to support and shape future investment decisions. The platform will collect data on both previously completed projects and future projects. The IPA would like to have a final product available by 31 March 2022.
Problem to be solved
Infrastructure projects across government and ALBs continue to be delivered with a significant gap between estimated costs and actual costs.

Benchmark project cost data is often procured from private companies to support cost estimating activity, because good quality, accessible and trusted data is otherwise lacking.

There is no clear or fully informed consensus on why some infrastructure projects cost what they do when compared to one another, or on the underlying drivers of inaccurate forecasts, which often result in implementation delays and overspend.

The ultimate outcomes that the IPA Benchmarking Service intends to achieve are:

(a) reduced or eliminated cost variances between business case cost forecasts/estimates and actual project outturn costs

(b) an improved relationship between government and ALB organisations and the supply chain, i.e. enabling stronger commercial decision making through accessible and centrally managed cost data; this will be a significant benefit for organisations when negotiating the cost of future projects ahead of and during formal procurement activity
Who the users are and what they need to do
The following user groups and segments have been identified through user research and analysis. A key finding is that many of the captured needs are common across all users:

User groups
Benchmarking Service Custodian (IPA - Head of Central Benchmarking Service)
Benchmarking Service Contributors (Government Departments & ALBs)
Benchmarking Service Consumers (External Parties)

User Segments:

Analysis and research identified common user needs across the Contributor and Consumer user groups.

Segments:

1. As Head of Benchmarking

I require the ability to align to, access and explore a centrally managed and maintained benchmarking platform, which houses cross government and ALB project cost data for infrastructure and IT projects

2. As an SRO / Commissioner
3. As a Programme / Project Manager
4. As a Cost Estimator
5. As a Commercial Lead
6. As a Finance Lead

I require the ability to access credible, quality and centrally maintained / managed project cost data

I require the ability to explore project cost data by asset sector and class

I require the ability to explore project cost forecast data by asset sector and class

I require the ability to compare assets to one another, where common attributes are shared
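
For illustration only, a minimal sketch of how the shared needs above (exploring cost data by asset sector and class, and comparing assets with common attributes) might translate into a data shape and simple queries. Every field and function name below is hypothetical and not part of the IPA's specification.

```typescript
// Hypothetical shape of a benchmarking record; field names are illustrative only.
interface ProjectCostRecord {
  projectId: string;
  assetSector: string;  // e.g. "Rail", "Highways"
  assetClass: string;   // e.g. "Bridge", "Station"
  forecastCost: number; // business case estimate, in £
  outturnCost: number;  // actual cost at completion, in £
}

// Explore records by asset sector and class, as the shared user needs describe.
function exploreByAsset(
  records: ProjectCostRecord[],
  sector: string,
  assetClass: string,
): ProjectCostRecord[] {
  return records.filter(
    (r) => r.assetSector === sector && r.assetClass === assetClass,
  );
}

// Relative cost variance (outturn vs forecast), one way of comparing assets
// that share common attributes.
function costVariance(record: ProjectCostRecord): number {
  return (record.outturnCost - record.forecastCost) / record.forecastCost;
}
```
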
Early market engagement
Any work that’s already been done
A prototype developed by the team, which has been well tested and explored
Existing team
In place: a PM/Business Analyst and a Technical Solution Architect
Current phase
Alpha

Work setup

Address where the work will take place
From home. There may be a need to come into London for meetings (expenses not covered by the IPA), subject to COVID-19 conditions
Working arrangements
Weekly Teams meetings/stand-ups.
Working from home.
May require London HQ meetings
Travelling expenses not provided. No expectation of overnight stays.
Security clearance
A minimum of Baseline Personnel Security Standard (BPSS) standard is required for all roles.

Additional information

Additional terms and conditions

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • Demonstrate experience in providing data analysis and statistical (data science) consultation on large scale data projects
  • Demonstrate experience of designing and developing inclusive services meeting the GDS Digital Service Standards
  • Experience building data-centred services including front-end/back-end development and reporting dashboards (using tools such as Tableau/Power BI)
  • Experience doing user research with senior stakeholders across multiple departments
  • Experience implementing data processing (ETL) pipelines
Nice-to-have skills and experience
  • Experience with building data reporting dashboards with Tableau
  • Experience of implementing data ingestion from Excel files (see the illustrative sketch after this list)
  • Building digital services that have passed GDS Beta assessments
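
As a hedged illustration of the Excel ingestion and ETL experience referenced in the skills lists above, the sketch below shows one common Node.js/TypeScript pattern: read a workbook and load its rows into PostgreSQL. The packages used (xlsx, pg) and the table and column names are assumptions for illustration, not IPA requirements.

```typescript
// Minimal Excel-to-PostgreSQL ingestion sketch; table and column names are hypothetical.
import * as XLSX from "xlsx";
import { Client } from "pg";

async function ingestWorkbook(path: string): Promise<void> {
  const workbook = XLSX.readFile(path);
  const sheet = workbook.Sheets[workbook.SheetNames[0]];
  // Each row becomes a plain object keyed by the spreadsheet's header row.
  const rows = XLSX.utils.sheet_to_json<Record<string, unknown>>(sheet);

  const client = new Client(); // connection details read from PG* environment variables
  await client.connect();
  try {
    for (const row of rows) {
      await client.query(
        "INSERT INTO project_costs (project_id, asset_sector, outturn_cost) VALUES ($1, $2, $3)",
        [row["Project ID"], row["Asset Sector"], row["Outturn Cost"]],
      );
    }
  } finally {
    await client.end();
  }
}
```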

How suppliers will be evaluated

All suppliers will be asked to provide a written proposal.

How many suppliers to evaluate
5
Proposal criteria
  • Proposed supplier team structure, with evidence to demonstrate a strong team and how their skills and experience will deliver the required outcomes
  • Evidence that your proposed approach and methodology for delivery will successfully deliver working software to meet user needs and meet specified acceptance criteria
  • Value for money and quality, assessed in comparison to other pitching suppliers
  • Evidenced successful delivery of similar projects
  • Evidence of working with data and reporting dashboards
  • Identification of risks and a plan to mitigate them
Cultural fit criteria
  • Can work with clients who have low technical expertise
  • Work as a team with our organisation and other suppliers
  • Works openly and transparently
  • Challenge the status quo
Payment approach
Time and materials
Additional assessment methods
  • Case study
  • Presentation
Evaluation weighting

Technical competence
40%
Cultural fit
20%
Price
40%

Questions asked by suppliers

1. Noting that a prototype has been developed, please could you confirm whether there is an incumbent supplier for this activity?
No incumbent supplier. The prototype was developed in-house by the IPA. We require skills outside the IPA for Beta development, which is why we are going out to the marketplace.
2. Is there a preferred technology stack for taking the prototype forward? If so, please could you describe it?
The technology choice for Beta is not yet set in stone, and can be suggested by the supplier. The technology stack should be based on Open Source programming languages and frameworks (such as NodeJS, Java, RoR) and deployed on AWS. We are planning to use PostgreSQL (AWS RDS) for the database and Tableau for the data dashboards. There may be further design influencers coming from cross-government technology groups.
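
Purely as a hedged illustration of the kind of stack described in this answer (an Open Source framework on AWS with PostgreSQL on RDS), the sketch below shows a minimal Node.js/TypeScript API reading benchmark records from PostgreSQL. The route, table and column names are assumptions, not a confirmed design, and deployment details are out of scope.

```typescript
// Minimal sketch of a Node.js/TypeScript API over PostgreSQL; names are hypothetical.
import express from "express";
import { Pool } from "pg";

const pool = new Pool(); // reads PG* environment variables, e.g. an AWS RDS endpoint
const app = express();

// Returns benchmark cost records for a given asset sector.
app.get("/api/benchmarks/:sector", async (req, res) => {
  const { rows } = await pool.query(
    "SELECT project_id, asset_class, outturn_cost FROM project_costs WHERE asset_sector = $1",
    [req.params.sector],
  );
  res.json(rows);
});

app.listen(3000);
```
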
3. If the solution is defined and accepted, has an independent review been undertaken to validate the solution against industry standards and best practices?
The IPA has set out a best practice approach to benchmarking through extensive research and testing of processes/products. The IPA is comfortable that we have a strong solution that meets or exceeds industry standards and best practices.
4. Noting that a prototype has been developed, please could you confirm whether there is a defined and accepted solution in place and as such this request is for expertise to implement said solution only, or is there scope for examining the prototype to determine possible optimisations and re-designs?
During Beta, we will be building on the prototype proof of concept and we are very open to optimisation and re-designs where this benefits the existing user needs being tested through the current visualisations
5. Will IPA be undergoing a GDS beta assessment?
Yes, we will, once we have procured the team to support the work.
6. Has IPA conducted an alpha assessment?
We are going for an Alpha assessment in the next 2 weeks. We are comfortable with our prototype and ready for this assessment.
7. You mention Power BI and Tableau – we presume you want only one product – do you have a preference please? Do you have a preferred cloud provider in mind to host this please?
The preference was for Power BI, based on a steer from user organisational-level insights, and some dashboards were built in Power BI during Alpha. However, this may need to change, dependent upon Cabinet Office architecture guidance. Our team is finalising the choice of data reporting tool.
Yes, the preferred cloud provider is AWS.
8. Is there any discovery/alpha documentation we can access?
This will be available to the winning bidder

The deadline for asking questions about this opportunity was Tuesday 8 June 2021.