Awarded to RPLUS ANALYTICS SOLUTIONS LIMITED

Start date: Tuesday 1 February 2022
Value: £71,820
Company size: SME
Department for Work and Pensions

Data Architect

1 Incomplete application

1 SME, 0 large

3 Completed applications

3 SME, 0 large

Important dates

Published
Thursday 16 December 2021
Deadline for asking questions
Thursday 23 December 2021 at 11:59pm GMT
Closing date for applications
Thursday 30 December 2021 at 11:59pm GMT

Overview

Specialist role
Data architect
Off-payroll (IR35) determination
Supply of resource: the off-payroll rules will apply to any workers engaged through a qualifying intermediary, such as their own limited company
Summary of the work
CAMLite is a workflow management tool used by internal DWP staff to update claimant information; the work is created for users as Tasks.

DWP are replacing this COTS product with an AWS Cloud-based service, using ARA components where possible.
Latest start date
Tuesday 1 February 2022
Expected contract length
Initial term will be 6 months with an optional 6 months extension
Location
No specific location, for example they can work remotely
Organisation the work is for
Department for Work and Pensions
Maximum day rate

About the work

Early market engagement
Who the specialist will work with
You will be comfortable working in a cross-functional team (including UX, analysts, statisticians, engineers, product owners, security risk, etc.).
What the specialist will work on
As a Big Data Architect, you will be able to demonstrate expertise in leading enterprise data architecture and design for large, complex businesses.

Your credible track record of designing for microservices, external data ingestion, real-time data analytics, domain-driven master data models, event-driven enterprise data sharing, authoritative data sources and an external API exchange will be utilised in a continuously improving, challenging and satisfying environment.

You’ll thrive creating value from large, diverse data volumes, e.g. by using Tableau, Qlik, C3, Hadoop/Spark/SQL/MongoDB Atlas/Python to create modelling features from underlying transactional data.
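
To make that last point concrete, here is a minimal, purely illustrative PySpark sketch of deriving per-claimant modelling features from underlying transactional data. The storage paths and column names (claimant_id, amount, event_date) are assumptions made for the example, not details taken from this listing.

    # Illustrative only: aggregate raw transactions into per-claimant
    # modelling features. Paths and column names are assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("feature-sketch").getOrCreate()

    # One row per claimant transaction (hypothetical layout).
    transactions = spark.read.parquet("s3://example-bucket/transactions/")

    features = (
        transactions
        .groupBy("claimant_id")
        .agg(
            F.count("*").alias("txn_count"),
            F.avg("amount").alias("avg_amount"),
            F.max("event_date").alias("last_event_date"),
        )
    )

    # Persist the feature table for downstream analytical models.
    features.write.mode("overwrite").parquet("s3://example-bucket/features/")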

Work setup

Address where the work will take place
The Services will be aligned to a DWP Technology Hub (including Manchester and Leeds). The majority of the Buyer’s Digital Workforce are currently working from home, and the Supplier Services will be delivered remotely; this is anticipated to continue.
Working arrangements
The Buyer’s Digital Workforce are currently working from home, and the Supplier Services will be delivered remotely; this is anticipated to continue. The team currently work 5 full days; working hours are flexible provided the supplier covers 8 hours between 08:00 and 18:00.
Security clearance
The Supplier shall comply with the Baseline Personnel Security Standard (BPSS)/Government Staff Vetting Procedures in respect of all persons who are employed or engaged by the Supplier in the provision of this Call-Off Contract, prior to each individual commencing work.

Additional information

Additional terms and conditions

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • Expert-level data modelling, conceptual through to physical: relational, object, analytical and NoSQL.
  • At least 10 years' data modelling experience in a software engineering environment.
  • Expert in domain-driven design and microservices.
  • Expert in designing analytical database models from underlying transactional data.
  • Hands-on experience with MongoDB, Hadoop, Tableau, Qlik and Spark.
  • Scripting/coding experience (e.g. Bash, Python, Perl).
  • Extensive experience with the Hadoop ecosystem (such as HDFS, YARN and/or Hive).
  • Strong experience with streaming and stream-processing frameworks (such as Spark, Storm, Flink, Kafka and/or Kinesis).
  • Good knowledge of at least one of the following programming languages: Python, Scala, Go, Kotlin, Java.
  • Experience with NoSQL databases (such as HBase, Cassandra and/or MongoDB).
  • Experience with public cloud-based technologies (such as Kubernetes, AWS, GCP, Azure and/or OpenStack).
  • Expert-level creation and maintenance of enterprise data artefacts.
  • Highly proficient in at least one query language for each of the following: relational, analytical and NoSQL.
  • Proven innovator with proficient software engineering skills to prototype relational, analytical and NoSQL solutions.
  • Highly motivated, with experience of selecting and working with data and data tools.
  • Experience of canonical modelling (see the sketch after this list).
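
Canonical modelling, the final essential skill above, can be illustrated briefly: records from differently shaped source systems are mapped onto a single canonical model so that downstream data sharing and APIs see one shape. The Python sketch below is hypothetical; every system, field and value is an illustrative assumption, not drawn from DWP systems.

    # Illustrative only: map two hypothetical source shapes onto one
    # canonical claimant record.
    from dataclasses import dataclass

    @dataclass
    class CanonicalClaimant:
        claimant_id: str
        full_name: str
        postcode: str

    def from_system_a(row: dict) -> CanonicalClaimant:
        # System A stores forename and surname separately.
        return CanonicalClaimant(
            claimant_id=row["id"],
            full_name=f"{row['forename']} {row['surname']}",
            postcode=row["postcode"],
        )

    def from_system_b(row: dict) -> CanonicalClaimant:
        # System B uses different keys and nests the postcode in an address.
        return CanonicalClaimant(
            claimant_id=row["claimantRef"],
            full_name=row["name"],
            postcode=row["address"]["postcode"],
        )

    # Both sources now share one canonical shape.
    print(from_system_a({"id": "C1", "forename": "Ada",
                         "surname": "Lovelace", "postcode": "LS1 1AA"}))
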
Nice-to-have skills and experience
  • Capable of engaging senior stakeholders.
  • Capable of developing and maintaining strong working relationships within the organisation and third party suppliers.
  • Able to manage their time effectively and work proactively across projects as well as BAU tasks.
  • Written and verbal communication skills that enable them to convey complex data and information problems and solutions.
  • Experience of developing data strategies, policies and standards.
  • Experience of working within DWP or a comparable organisation within the last 3 years.
  • TOGAF Practitioner.

How suppliers will be evaluated

All suppliers will be asked to provide a work history.

How many specialists to evaluate
5
Cultural fit criteria
  • Experience of working within agile teams remotely and collaboratively.
  • Experience of GDS Standards.
  • Proactive.
  • Comfortable working in a cross-functional team.
  • Ability to communicate effectively to express freedom of thought and innovation, positively challenge, and act as an advocate for change.
  • Evidence of working in a transparent and visible way that drives performance, ownership and trust, and eliminates surprise.
  • Evidence of a culture of continuous improvement.
Additional assessment methods
  • Reference
  • Interview
  • Scenario or test
Evaluation weighting

Technical competence
40%
Cultural fit
10%
Price
50%

Questions asked by suppliers

1. Is there any incumbent?
There is no incumbent Data Architect.
2. What is the budget for this requirement?
DWP decline to answer this question.