This opportunity is closed for applications

The deadline was Friday 24 March 2017
Cabinet Office

Race disparity data across the public sector - beta

10 Incomplete applications

8 SME, 2 large

10 Completed applications

7 SME, 3 large

Important dates

Published
Friday 10 March 2017
Deadline for asking questions
Friday 17 March 2017 at 11:59pm GMT
Closing date for applications
Friday 24 March 2017 at 11:59pm GMT

Overview

Summary of the work
Produce a public beta of a service to present the levels of disparity that people from different races experience when using public services. The beta will include a public-facing service to present and visualise the data for several categories of users and internal tools for validating and transforming datasets.
Latest start date
10/04/2017
Expected contract length
5 months
Location
London
Organisation the work is for
Cabinet Office
Budget range
We are aiming at no more than £500,000 (excluding VAT), but are open to proposals from suppliers who may feel that extra resources are justifiable given the scope of the task.

About the work

Why the work is being done
This is a key No.10 initiative. We are working with Whitehall departments to identify and publish information showing how outcomes differ for people of different backgrounds, in a range of areas including health, education and employment. People will be able to check how race affects the way they experience public services, and transparency will help the government improve services and reduce disparity.
Problem to be solved
Race and ethnicity data is collected, measured and reported on across government - it can be found online in various publications (including official statistics) and can also be requested from departments. However, it is time-consuming and often hard to find, and exists in many different formats. This project aims to bring the data together, make it more easily available and usable by various audiences, and highlight where disparities exist to aid understanding and inform policy.
Who the users are and what they need to do
Our users include members of the public, policymakers, and third sector and academic researchers.
Their goals are very different. Researchers need the raw data to manipulate themselves, potentially linking it to other datasets, often with the aim of uncovering reasons for disparities and lobbying to improve situations.
Others want to lift graphs and data from the site to quote as evidence for their policies or positions. Those unused to statistics may look at topics that affect them personally, such as crime rates or school performance. They need headline figures and context to help them interpret the data.
Early market engagement
None - though the alpha was procured via the Digital Marketplace https://www.digitalmarketplace.service.gov.uk/digital-outcomes-and-specialists/opportunities/1492
Any work that’s already been done
Discovery ran from Nov 2016 - Jan 2017. During alpha we've run multiple user research lab sessions involving one-to-one interviews on a throw-away prototype front-end interface. The research tested how different types of users navigate a service, what features and information they need and how they interpret the data.
Alongside this, we've mapped out a plan for publishing the content and we are working with departments on how to collect the required data. We've also drafted a simple technical architecture plan for the service.
Existing team
The existing blended team comprises staff from Cabinet Office, ONS, GDS and Methods Digital. Methods Digital are contracted to deliver an alpha (as a DOS outcome) which covers development, data science, tech architecture, design and user research. Cabinet Office and ONS provide staff focused on strategy, insight, programme management and policy, as well as analysts and statisticians. The service manager and user researcher are provided by GDS.
Current phase
Alpha

Work setup

Address where the work will take place
1 Horse Guards, London, SW1A 2HU
Working arrangements
Our team will be based in Horse Guards. Some off-site working may be accepted, depending on the supplier proposal, but the supplier team will primarily be co-located onsite with the existing team. We work Mon-Fri with usual office hours. Occasional travel to other parts of the UK is expected for user research and testing.
Security clearance
Minimum of baseline clearance (BPSS) required. CTC is required to work at 1 Horse Guards.

Additional information

Additional terms and conditions

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • Have proven experience of working on agile projects as part of a multidisciplinary team, with regular user research, as outlined in the Government Service Design Manual and Digital Service Standard.
  • Have expertise in iteratively prototyping, designing and building services which meet both user needs and the GOV.UK guidelines for content design, service design and accessibility.
  • Technical expertise and past experience in building accessible front-end interfaces using GDS design patterns and responsive design.
  • Experience in developing accessible, easy-to-understand data visualisations for online services, including statistical data, using the GDS and ONS style guides.
  • Proven track record of working with and manipulating large, heterogeneous datasets and presenting data and insights to users (including the public) in a clear, meaningful way.
  • Awareness of existing open source or proprietary solutions and platforms to efficiently federate, store, link and present heterogeneous datasets.
  • Demonstrable experience of manipulating and transforming disparate datasets into common, machine readable formats.
  • Be able to architect robust, scalable systems and display knowledge about system security risks and how to pragmatically mitigate them.
  • Expertise developing scalable services using public cloud technologies, service-oriented architecture, micro services, security and integration.
Nice-to-have skills and experience
Knowledge of statistics, at least to a basic understanding of sampling theory and statistical tests
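For illustration only (not an evaluated requirement), the short Python sketch below shows the level of statistics this implies: a standard test of whether an outcome rate differs between two groups, run on placeholder aggregated counts rather than project data. Python and scipy are assumptions based on the tools used during alpha.

    # A two-sample comparison on aggregated counts, e.g. the share of people
    # from two groups achieving a given outcome. Figures are placeholders.
    from scipy.stats import chi2_contingency

    observed = [[420, 580],   # group A: achieved outcome, did not
                [310, 690]]   # group B: achieved outcome, did not

    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi-squared = {chi2:.2f}, p = {p_value:.4f}")
    # A small p-value suggests the difference in outcome rates between the two
    # groups is unlikely to be explained by sampling variation alone.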

How suppliers will be evaluated

How many suppliers to evaluate
3
Proposal criteria
  • High-level approach - how you will deliver a Minimum Viable Product to meet project aims and transition to the team that will take over the service after beta.
  • Team structure - covering infrastructure setup and maintenance, ongoing prototyping, software delivery, content publishing, and how the team will adapt over time according to changing priorities.
  • Data and content design - examples of good data visualisation (following ONS or GDS style guides) and how to deliver content coming from several departments, dealing with complex, sensitive subjects.
  • Demonstrate political awareness and cultural sensitivity in managing and presenting complex data around protected characteristics
Cultural fit criteria
  • Experience of successful collaborative working as part of a mixed supplier-client delivery team sharing knowledge within the team
  • Experience delivering in an open, collaborative, agile way according to the principles outlined in the Government Service Design Manual
  • Experience of working with clients / team members with low technical expertise
  • Experience of successful agile delivery in non-agile environments and bridging the two worlds
  • Demonstrate simplicity (e.g. do less but better, explain complex issues in a clear, simple way, break down and prioritise complex issues)
Payment approach
Capped time and materials
Assessment methods
Written proposal
Evaluation weighting
Technical competence
65%
Cultural fit
15%
Price
20%

Questions asked by suppliers

1. Any work that’s already been done: Can you make previous work available to bidders, please? The Beta will rely entirely on your previous work, hence an accurate bid can be made only in full knowledge of that previous work.
All shortlisted suppliers will be provided with background information which will include (1) findings from user research during the discovery and alpha (2) user personas (3) an overview of the data sets to be collected from departments (4) draft architecture, and (5) high level plan. We aim to provide this information by COB 27 March 2017.
2. Could we get a copy of the Discovery report so that we can better understand the needs of the users and the likely scope of the project?
Yes, this will be included amongst the information we provide to shortlisted suppliers by COB 27 March 2017.
3. Please can you tell us who is completing the current alpha phase?
We have a blended team from Cabinet Office, GDS, ONS and Methods Digital working together during alpha. Methods Digital were commissioned to deliver the digital workstream of the alpha: https://www.digitalmarketplace.service.gov.uk/digital-outcomes-and-specialists/opportunities/1492
4. Do we only have to provide a written proposal?
Only the written proposal will be evaluated, alongside the evidence you provide for essential/nice-to-have skills and experience and cultural fit criteria. However, we are also offering shortlisted suppliers a chance to meet face to face on Thu 30 and Fri 31 March. Any additional information provided during these sessions will be shared with all other shortlisted suppliers.
5. What are your timeframes for the evaluation?
We aim to provide background information to shortlisted suppliers: COB 27 March. Face to face meeting in London (optional and not evaluated): 30 or 31 March. Written proposals due: Tue 4 April (9am)
6. We note that Alpha delivered a throw-away prototype, however, has Alpha explored or prototyped any specific technologies relevant to Beta delivery? What candidate technologies do the Alpha outputs recommend? Which technologies, if any has Alpha discounted and rejected as being inappropriate?
We explored several options during alpha and are now recommending a simple public cloud hosted website comprising static HTML pages. We anticipate the pages will be generated by a custom CMS and publishing tool using a web framework.
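For illustration only, the sketch below shows one way such a publishing step could look in Python (which the team has used during alpha): rendering a measure page to a flat HTML file from templated content. Jinja2, the page structure and the names used here are assumptions for the example, not the project's chosen framework.

    # Render one static measure page from templated content and placeholder data.
    from pathlib import Path
    from jinja2 import Template

    PAGE_TEMPLATE = Template("""
    <h1>{{ title }}</h1>
    <p>{{ summary }}</p>
    <ul>
    {% for row in figures %}  <li>{{ row.group }}: {{ row.value }}%</li>
    {% endfor %}</ul>
    """)

    def publish_page(title, summary, figures, out_dir="public"):
        # A real publishing tool would loop over every measure approved for release.
        Path(out_dir).mkdir(exist_ok=True)
        html = PAGE_TEMPLATE.render(title=title, summary=summary, figures=figures)
        (Path(out_dir) / "example-measure.html").write_text(html)

    publish_page(
        title="Example measure",
        summary="Placeholder commentary, not project data.",
        figures=[{"group": "Group A", "value": 42.0}, {"group": "Group B", "value": 31.0}],
    )

Serving pre-rendered pages in this way also fits the non-functional requirements below: static files are straightforward to cache and to scale for peaks in traffic.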
7. What are the Non Functional Requirements of the service?
The public facing service will need to scale easily for large peaks in traffic and will need to be available 24/7. Non-functional requirements will also evolve during beta.
8. We note that the Alpha team incorporated development and data science roles. Are there any preferred technologies already in use by the existing team i.e. R, Python, Shiny, Tableau?
The existing team have used R and Python as part of alpha.
9. Has Alpha outputted a high-level product roadmap for the service? Can this be shared?
Yes - we will share this with shortlisted suppliers.
10. Can you please characterise the data sources relevant to the initial Minimum Viable Product i.e. In reference to volume, velocity and variance – how large are data sources / are these static or real time / what sort of formats are the different raw data sources?
The MVP will be taking data from a number of different government departments.
These will be data aggregated by ethnicity/geography/other dimensions (no record-level data), and a mix of admin and survey data.
They are static datasets (for beta) and mainly supplied to the project as xls/csv flat files.
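For illustration only, the sketch below shows the kind of transformation this implies in Python (used by the team during alpha): loading one departmental flat file and mapping it onto a shared schema so figures from different departments can be combined and presented consistently. The column names, labels and file names are assumptions, not the real departmental formats.

    # Map one departmental csv/xls flat file onto an assumed common schema.
    import pandas as pd

    COLUMN_MAP = {"Ethnic group": "ethnicity", "Region": "geography", "Rate (%)": "value"}

    def to_common_format(path: str) -> pd.DataFrame:
        df = pd.read_excel(path) if path.endswith((".xls", ".xlsx")) else pd.read_csv(path)
        df = df.rename(columns=COLUMN_MAP)
        df["ethnicity"] = df["ethnicity"].str.strip()  # tidy inconsistent labels
        return df[["ethnicity", "geography", "value"]]

    # e.g. combined = pd.concat(to_common_format(p) for p in ["dept_a.csv", "dept_b.xlsx"])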
11. Has Alpha outputted a recommended team profile for Beta and can this be shared?
Suppliers should propose the team structure that they feel best meets the outcome. We expect the team will include one or more of the following roles, and these may change over the course of beta depending on the phase and needs of the project during that time: user researcher, designer, content designer, product manager, delivery manager, developer, architect, web ops, performance analyst
12. Will the existing team members from Cabinet Office, ONS and GDS remain consistent and form part of the team during Beta? If not how is this expected to change during Beta and which roles will the combination of Cabinet Office, ONS and GDS fulfil?
Yes, the team members will remain the same from Cabinet Office, ONS and GDS with the following exceptions: the current GDS user researcher will be moving off to another project at the start of beta; and a designer and content designer are due to join our team imminently.
13. Did the data scientist during Alpha deliver any prototype analytics, if so can these be described at a high-level? Did Alpha consider a data test strategy to inform Beta, if so can this be shared or described?
No analytics were used on the prototype. A data test strategy is under development in these final few weeks of alpha.