Awarded to Gulp Digital

Start date: Wednesday 15 September 2021
Value: £134,000
Company size: SME
Infrastructure and Projects Authority

IPA Benchmarking Hub Service Alpha

5 Incomplete applications

5 SME, 0 large

22 Completed applications

19 SME, 3 large

Important dates

Published: Friday 16 July 2021
Deadline for asking questions: Friday 23 July 2021 at 11:59pm GMT
Closing date for applications: Friday 30 July 2021 at 11:59pm GMT


Off-payroll (IR35) determination
Contracted out service: the off-payroll rules do not apply
Summary of the work
The IPA Benchmarking team needs a supplier team to validate and enhance our Alpha findings. We want to conduct further user research and build prototypes as required.

We need:

- Interaction Designer
- User researcher
- BI Tool developer (ideally Tableau)
- Part-time Data Analyst, Prototype Developer and Content Designer
Latest start date
Tuesday 10 August 2021
Expected contract length
1-2 months, role dependent
No specific location, for example they can work remotely
Organisation the work is for
Infrastructure and Projects Authority
Budget range

About the work

Why the work is being done
The benchmarking hub will collect data from completed projects and give project leaders across government access to it to support project investment decisions, helping to leverage the UK Government's project portfolio data to shape future investment. The platform will collect data on both previously completed and future projects. The IPA would like to have a final product available by 31 March 2022
Problem to be solved
Infrastructure projects across government and ALBs continue to be delivered with a significant gap between estimated costs and actual costs.

Benchmark project cost data is often procured from private companies to support cost estimating activity, because good-quality, accessible and trusted data is otherwise lacking.

There is no clear or fully informed consensus on why some infrastructure projects cost what they do when compared to one another, or on the underlying drivers of inaccurate forecasts, which often result in implementation delays and overspend.

The ultimate outcomes that the IPA Benchmarking Service intends to achieve are:

(a) reduced or eliminated cost variances between business case cost forecasts/estimates and actual project outturn costs

(b) an improved relationship between government and ALB organisations and the supply chain, i.e. enabling stronger commercial decision-making through accessible and centrally managed cost data; this will be a major benefit for organisations when negotiating the cost of future projects ahead of and during formal procurement activity
Who the users are and what they need to do
The following user groups and segments have been identified through user research and analysis. One of the key findings is that many of the captured needs are common across all users:

User groups
Benchmarking Service Custodian (IPA - Head of Central Benchmarking Service)
Benchmarking Service Contributors (Government Departments & ALBs)
Benchmarking Service Consumers (External Parties)

User Segments:

Analysis and research identified that across the Contributor and Consumer user groups, there are common user needs.


1. As Head of Benchmarking

I require the ability to align to, access and explore a centrally managed and maintained benchmarking platform, which houses cross government and ALB project cost data for infrastructure and IT projects

2. As an SRO / Commissioner
3. As a Programme / Project Manager
4. As a Cost Estimator
5. As a Commercial Lead
6. As a Finance Lead

I require the ability to access credible, quality and centrally maintained / managed project cost data

I require the ability to explore project cost data by asset sector and class

I require the ability to explore project cost forecast data by asset sector and class

I require the ability to compare assets to one another, where common attributes are shared
Early market engagement
Any work that’s already been done
We have done some user research and prototyping and we are looking to continue this in Alpha.
Existing team
In place: a PM/Business Analyst and a Technical Solution Architect
Current phase

Work setup

Address where the work will take place
From home. There may be a need to come into London for meetings (expenses not covered by the IPA), subject to COVID-19 conditions
Working arrangements
Weekly Teams meetings/stand-ups.
Working from home.
May require meetings at the London HQ.
Travelling expenses not provided. No expectation of overnight stays.
Security clearance
A minimum of the Baseline Personnel Security Standard (BPSS) is required for all roles.

Additional information

Additional terms and conditions

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • Experience conducting user research with senior stakeholders, understanding the needs and motivations of each group
  • Demonstrate experience of running alpha phases that meet the GDS Digital Service Standard
  • Experience prototyping data-centric services, including data dashboards (PowerBI/Tableau)
  • Demonstrate their experience of working to GDS design principles
  • Experience of prototyping different solutions to check assumptions
Nice-to-have skills and experience
  • Demonstrate your experience of successfully passing GDS alpha assessments and delivering a successful alpha phase, describing the approach you took.
  • Demonstrate your experience of working towards policy goals, evidencing how these were achieved.

How suppliers will be evaluated

All suppliers will be asked to provide a written proposal.

How many suppliers to evaluate
Proposal criteria
  • Proposed supplier team structure, with evidence to demonstrate a strong team and how their skills and experience will deliver the required outcomes
  • Evidence that your proposed approach and methodology for delivery will successfully deliver working software to meet user needs and meet specified acceptance criteria.
  • Value for money and quality, assessed in comparison to other pitching suppliers.
  • Evidenced successful delivery of similar projects
  • Evidence of working with data and reporting dashboards
Cultural fit criteria
  • Can work with clients with low technical expertise
  • Work as a team with our organisation and other suppliers
  • Works openly and transparently
  • Challenge the status quo
Payment approach
Capped time and materials
Additional assessment methods
Case study
Evaluation weighting

Technical competence


Cultural fit




Questions asked by suppliers

1. Please could you say who the incumbent was for the user research / prototyping work?
It was done in house. The team consisted of a BA and a TSA.
2. Please could you say who the incumbent was for the user research / prototyping work?
It was done in house by the delivery team
3. What prototyping software is in use?
We use for the UI mock screens.
4. In what format is the prototyping work carried out by the in-house Business Analyst and Technical Solution Architect? I.e. visual mock-ups or digitised?
The prototypes consist of mock up screens and PowerBI/Tableau dashboards. No UI coding has been done yet.
5. Please could the client disclose why the Beta opportunity which was released last month was cancelled?
Because we did not pass the GDS Alpha assessment and have to repeat Alpha. Once passed (the objective of this procurement) we will then procure for Beta again
6. In the final product, will the Benchmarking Hub dashboards and analytics be presented using PowerBI or Tableau?
Tableau is the preferred option of GDS and we would envisage Tableau being the option for us (although subject to agreement)