Awarded to Kubrick Group

Start date: Thursday 22 July 2021
Value: £664,620
Company size: SME
Genomics England

Data Function Burst Capacity

10 Incomplete applications

7 SME, 3 large

14 Completed applications

10 SME, 4 large

Important dates

Published
Friday 21 May 2021
Deadline for asking questions
Friday 28 May 2021 at 11:59pm GMT
Closing date for applications
Friday 4 June 2021 at 11:59pm GMT


Off-payroll (IR35) determination
Summary of the work
We are looking for a provider of data engineering resource to meet existing BAU demand and to deliver technical spikes requiring increased capacity on a short-term basis.
Latest start date
Thursday 1 July 2021
Expected contract length

Location
No specific location, for example they can work remotely
Organisation the work is for
Genomics England
Budget range

About the work

Why the work is being done
New Spending Review obligations necessitate the expansion of the data capability within Genomics England, and the intake of additional flexible data resource.
Problem to be solved
The current data resource at Genomics England is not adequate to meet the growing demands of the business. We need to onboard a flexible resource with a grounding in all areas of technology and process that can contribute to the organisation's objectives in as short a time as possible.
Who the users are and what they need to do
As a product/service owner at Genomics England, I need to access burst data resource capacity so that I can meet existing business objectives within organisational timescales.
Early market engagement
Any work that’s already been done
We have been working with an incumbent supplier who has provided a number of data wranglers, analysts, and engineers on an initial consultancy basis.
Existing team
Director of Data Strategy, Head of Data Operations, and numerous data wranglers, analysts, and engineers
Current phase

Work setup

Address where the work will take place
Work will take place remotely
Working arrangements
Security clearance

Additional information

Additional terms and conditions

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • have high technical capability in data management and data processing
  • have high technical capability in business process analysis
  • have experience and capability in agile delivery
  • have experience and capability in data modelling
  • have experience with industry standard data tools (Tableau, Trifacta, SQL)
Nice-to-have skills and experience
  • have technical capability in cloud data management and cloud data processing
  • have data science capability

How suppliers will be evaluated

All suppliers will be asked to provide a written proposal.

How many suppliers to evaluate
Proposal criteria
  • Provide at least 10 FTEs for the duration of the contract - 7 data wranglers, 2 data managers, 1 data analyst
  • Provide short term (2 week) burst capacity of 20 FTEs for 4 planned technical spikes
  • Deliver technical training on data modelling, data management and data processing in line with agile delivery - trainers with at least 10 years' industry experience
  • Provide consultants with experience in project management, agile delivery, and user research
  • Provide consultants with experience in (several of) artificial intelligence, data architecture, data engineering, data modelling, data science, econometrics, machine learning, and natural language processing (NLP)
  • Evidence of training/capability of consultants to deliver an in-depth, technical, end-to-end project in a group setting
Cultural fit criteria
  • Provide external line management of data resource throughout the duration of the contract
  • Have a diverse approach to providing candidates from a range of backgrounds
  • Ensure financial security and a level playing field for all candidates regardless of the length of their placement with us
Payment approach
Capped time and materials
Additional assessment methods
Evaluation weighting

Technical competence


Cultural fit




Questions asked by suppliers

1. Could you confirm who the incumbent supplier is? Is this requirement to augment the capabilities that the incumbent is currently providing?
We are currently utilising some temporary resource provided by Kubrick, but this agreement will not overlap with the new requirement for burst capacity. We do not expect the two requirements to run concurrently.
2. The brief mentions providing 10 resources for the duration of the project. What is the duration of the project?
We anticipate a term of 9 months with the option to extend up to a year.
3. What is the budget of this project?
The budget is £1M for the whole project, including the technical spikes.
4. Where you have stated that the work will be delivered remotely, could you clarify if this will need to be fully UK-based or could this involve some elements of non-UK based working?
We anticipate the work will be conducted remotely so can be done from overseas, but will require any team members to be available during standard UK working hours.
5. Do you wish to be able to tap on / off resources as needed and if so, what utilisation could be guaranteed please?
We would be using 8–10 resources at any given time and require a burst capacity of 15–20 resources for a 2-week technical spike 4 times a year.
6. Please can you confirm whether the supplier experience will still be valued if aligned solely to Google Cloud Platform ML services and TensorFlow?
This is not aligned with our current technology stack unfortunately so we would not take this forward.
7. You have stated that you require “experience with industry standard data tools (Tableau, Trifacta, SQL)”. Do you have a preference for these tools or are other industry standard tools, such as Alteryx or Azure Data Factory, acceptable?
Alteryx experience would be acceptable but Azure Data Factory skills would need to be translatable to AWS Glue.
8. Does this opportunity fall outside IR35 regulations?
9. Please provide additional information on the expectations for training sessions (e.g. would the supplier be required to provide training material, environments, etc.)?
We expect the supplier to provide training to their staff on end-to-end data projects, including data ingestion, data cleansing, data management, metadata management, data transformation, and data visualisation. Preferably this will take the form of a structured group exercise in which candidates work together in an agile fashion to deliver a group project over a fixed term. We expect this training to be provided by the supplier to their candidates prior to onboarding.
10. We note that you want to start work on 1st July; can you please provide additional information on the next steps in the procurement?
We will be following the DOS 5 procedure, i.e. shortlisting then selecting a preferred supplier based on proposals. Once a supplier has been appointed, we would like to review proposed candidates alongside finalising the contract so that the 10 FTEs can start on 1st July.
11. Please provide additional information on the four planned technical spikes. When will they occur? What skills will be needed for the additional 20 FTE?
The spikes will run quarterly, at the end of each quarter, starting on September 16th. Each spike will last 2 weeks; the next would follow roughly 3 months later, adjusted for holidays given the 2-week duration, so likely December 2nd. Skills required would be the same as those listed for onboarded candidates, but possibly also including knowledge of machine learning/platform and/or cloud engineering. The spikes are intended to develop PoCs for large implementation pieces.
12. Please provide additional information on the size and structure of the existing team and how these additional resources/teams will fit into the existing team.
The current data resource within GEL, excluding bioinformatician data engineers and data architects, stands at 10 FTEs plus 6 contract resources from the incumbent supplier. All resources are spread across products in squads and teams across the organisation. New resources would also be deployed into product squads and teams.
13. Can you provide more details around the requirement to "have high technical capability in business process analysis"?
Candidates should have the ability to create process maps, perform SWOT and/or PESTLE analysis, and produce technical flow charts of current data/process flows.
14. Please provide additional information on the software/tools in use. Are Tableau, Trifacta, and SQL the software to use? What cloud stack and CSP (Cloud Service Provider) is being used by the organisation?
AWS, in a hybrid configuration connected to on-premises HPC, is our current preferred cloud supplier. Tableau, Trifacta, and SQL are the main tools currently in use.
15. With your question regarding experience with industry standard data tools (Tableau, Trifacta, SQL), is Power BI an acceptable replacement for Tableau?
Power BI experience would be an acceptable replacement for Tableau if candidates are able to produce similar basic dashboards in Tableau. We do not currently use Power BI, but we can test against Tableau capability before onboarding.