This opportunity is closed for applications

The deadline was Friday 2 June 2017
National Crime Agency (NCA)

Cyber Platform Implementation Team

6 Incomplete applications

4 SME, 2 large

19 Completed applications

6 SME, 13 large

Important dates

Published
Friday 19 May 2017
Deadline for asking questions
Friday 26 May 2017 at 11:59pm GMT
Closing date for applications
Friday 2 June 2017 at 11:59pm GMT

Overview

Summary of the work
Provide a scrum team to enhance a secure data platform on AWS. The NCA's National Cyber Crime Unit (NCCU) seeks professional services for the beta and production phases of its Data Platform. This will see the integration of multiple cyber data sources to support analysis and insights.
Latest start date
Monday 31 July 2017
Expected contract length
11 months
Location
London
Organisation the work is for
National Crime Agency (NCA)
Budget range

About the work

Why the work is being done
As part of the Cyber strategy, the Engineering team delivers data platforms and analysis tools to develop insight into cyber crime. A large number of systems are currently in operation; the project will pull these sources into a single logical source and improve analysts' tools and access to information by providing clear, accurate and interactive tools. It will develop data pipelines, consolidate currently held information and speed up analysis processes. The work is to be complete within 9 months.
Problem to be solved
Combine NCA and external data sources so they can be searched and analysed. This includes threat updates and current data repositories.
Who the users are and what they need to do
As a data scientist, I need to analyse multiple data sources at once so that I can quickly analyse and visualise my findings to make recommendations to the business. As an investigator, I want to reduce the time spent developing actionable insights. As a compliance manager, I need to be able to manage and oversee data to ensure activity is compliant with relevant legislation.
Early market engagement
Any work that’s already been done
From November 2016 to April 2017 the unit ran an ‘Alpha’ stage. The pilot successfully built a minimum viable product of the data platform, bringing a small number of data sources together in a centralised data repository and facilitating at-speed analysis. The system is based on AWS, including TitanDB, Spark and Elasticsearch, and is designed to run at OFFICIAL-SENSITIVE classification.
Existing team
The main contacts are in the NCCU capability development and engineering teams. The Alpha stage was staffed with 5 business analysts, architects and software engineers.
Current phase
Alpha

Work setup

Address where the work will take place
The NCA, Spring Gardens, Tinworth Street, Vauxhall, London
Working arrangements
The supplier is expected to provide individuals at different stages of the project based on different skill sets. Depending on the backlog, work will be required around business analysis, data engineering, web front-end/UI development and the creation of analysis tooling.
Team size is expected to be around 4 people, with expertise in the design, development and delivery of cyber data systems.
Work will be a mix of onsite and offsite working; technical phases are envisaged to be done remotely. 1-2 members of the team will probably be required onsite, working in an agile fashion with NCA teams. Travel expenses should be included in the cost proposal.
Security clearance
SC Clearance

Additional information

Additional terms and conditions
Freedom of Information (full clause will be issued on award of contract) and limiting the NCA's indemnity to £100,000

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • 2 or more projects working with clients on big data and data analysis products
  • Specific experience of the AWS ecosystem (EC2, Lambda, DynamoDB, API Gateway, SQS)
  • Knowledge of Graph Databases, Graph Analytics and linked data
  • Experience of Python, Scala, Javascript, CSS and HTML
  • Knowledge of data flows and acquisition (queues, APIs and ETL processes)
  • Demonstrable experience of Data taxonomies and ontologies including data modelling and data management
  • Subject expert knowledge in data analysis tools
  • Demonstrable knowledge of UI and UX
  • Excellent stakeholder engagement, to liaise with teams and to understand existing systems and data
  • Business analysis skills
  • Ability to work with client development teams following agile practices and iterative development projects
  • Demonstrate how to support and maintain the pipeline for the beta phase
Nice-to-have skills and experience
  • Cyber Crime/Cyber Security Knowledge including threat data
  • Experience of building cyber threat systems

How suppliers will be evaluated

How many suppliers to evaluate
3
Proposal criteria
  • Methodology or approach to delivery and support
  • Technical capability
  • Subject matter expertise
  • Timeframe proposal
  • Risks, assumptions and challenges identified
  • Added value experience in big data solutions
  • Knowledge of cyber threat arena
  • Estimated timeframes for the work
  • Knowledge transfer
  • Value for money
  • Team Structure
Cultural fit criteria
  • Share knowledge and skills
  • Be transparent and collaborative when making decisions
  • Take responsibility for their work
  • Subject matter expertise
  • Work in partnership with flexibility
Payment approach
Capped time and materials
Assessment methods
Written proposal
Evaluation weighting

Technical competence

50%

Cultural fit

15%

Price

35%

Questions asked by suppliers

1. What’s the scope of the work within the Engineering V lifecycle? As it is T&M, the specification doesn’t provide much detail on the specific aspects we’d work on within the system. Do you have a WBS?
The V model work is largely in the detailed design, implementation and verification stages (the bottom of the V). 3-4 months of BA work have been conducted, cataloguing requirements in JIRA; a supporting slide deck is available. The project is run in an Agile fashion, and the next 6 sprints have been scoped.
2. How are the beta and production phases delineated? What are the expected outputs from each?
Detailed in the PowerPoint: a mixture of functionality and size of user base.
3. Was a pilot report produced for the alpha version, and would we be able to see it?
Detailed in the PowerPoint.
4. What are the mechanics for remote working? Do you have a development domain we can access remotely so tools such as JIRA, Confluence and Bitbucket can be shared? Are you using these tools?
We will provide a VPN which will enable suppliers to connect in. From a security perspective we wish to limit access from a defined IP range where possible.
5. Is the platform Official Sensitive or just the data? Could we VDI from our machines into the development environment? Is anything hosted by the NCA or is it all on AWS?
We deem the generic infrastructure to be OFFICIAL, the analytics and collection techniques we deem to be OFFICIAL-SENSITIVE. Our data is OFFICIAL-SENSITIVE. The security controls we apply we also treat as OFFICIAL-SENSITIVE.

We have allowed VPN access previously, however this was a short-term fix. As we have dev, test and prod environments, it may be that we provide access to only one environment. The entire environment is reproducible using Terraform scripts, so it could potentially be stood up separately on AWS.
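
For illustration of the pattern described in that answer: a Terraform-defined environment is typically reproduced by pointing the same configuration at a separate account or workspace. The fragment below is a hypothetical sketch only; the variable names and profile are invented and do not reflect the NCA's actual scripts.

```
# Hypothetical fragment: an AWS provider block parameterised so the same
# Terraform configuration can be stood up as a separate copy, isolated
# from the existing dev/test/prod state.
provider "aws" {
  region  = var.region       # e.g. "eu-west-2"
  profile = var.aws_profile  # separate credentials for the stood-up copy
}
```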
6. Is the technology stack baselined following the alpha phase?
Largely, however the solution has not been proved at scale. As we have used open standards and tried to implement modular design principles, we believe key changes to technology should not present significant integration work.
7. What is the volume, velocity and variety of the data in question?
Velocity is not believed to be an issue. The data volumes are relatively small (a number of TBs raw). We have sought to reduce the variety by making a number of design choices and by having a technical roadmap which is cognisant of these matters.
8. What tools do analysts currently use?
The current system requires analytic tools to be integrated
9. What is the makeup of the rest of the team? Will the team responsible for the alpha phase continue?
The team responsible for the alpha phase was an external company who came to the end of their contract; they made up 70% of the team, with the remaining 20% staffed by the NCA, consisting of experienced software developers and infrastructure engineers. The NCA individuals will continue on this project.
10. Could the team support graduates?
Potentially. The concern would be around the quality and speed at which development work could be conducted. The NCA does not have the capacity to mentor external graduates. We could consider this approach if there was some form of mentoring and technical sign-off provided by the supplier in collaboration with the NCA.
11. Do you require any infrastructure or hardware installation and configuration or is this out of scope? E.g. Sys-admin and OS installs
This is out of scope, all infrastructure is within AWS. We are able to make changes to our desktop system within the NCCU.
12. Once in production, will you require on-going support?
This is not planned for this FY and will be handled by the NCA/NCCU support and infrastructure team.
13. How big is the user base for the system?
The total possible users is about 300, although day to day usage is anticipated to be 50% of this figure.
14. Would you consider an off the shelf product?
This is a system of systems; as such we will consider COTS against custom development and open source. We seek best of breed with regard to those subsystems.

We are using open standards and a decoupled design to avoid vendor lock in and increase the agility required to fight cyber crime.

We seek to reduce on going licensing costs where possible.
15. The question we were referring to in our previously submitted CQ was: “Demonstrate how to support and maintain the pipeline for the beta phase”.
This is in reference to being in a position to bring in new feeds of data. These are similar feeds, but we must be in a position to understand how datasets of greater scale will affect the overall quality and performance of the system.
16. Could you please clarify what you mean in Q5 by “data accusations”?
This is a mistake and should read as “data acquisition”.
17. PowerPoint
More detail can be provided in a PowerPoint presentation, which can be supplied on request.
18. PowerPoint presentation.
Unfortunately, as attachments can't be added via this web form, the PowerPoint presentation will be provided at the next stage.