UK Research and Innovation: Science and Technology Facilities Council (STFC)

UK SBS DDaT19145 UKRI – STFC – Proper Analysis of Coherent Excitations

Incomplete applications: 4 (2 SME, 2 large)
Completed applications: 1 (0 SME, 1 large)

Important dates
Published Friday 16 August 2019
Deadline for asking questions Friday 23 August 2019 at 11:59pm GMT
Closing date for applications Friday 30 August 2019 at 11:59pm GMT

Overview

Summary of the work UKRI seek a team including a software development team leader and a software developer for the design and development of the core framework of a multi-dimensional data analysis application for parallel and distributed computing, to be used to analyse neutron scattering experiments.
Latest start date Friday 25 October 2019
Expected contract length 1 year
Location South West England
Organisation the work is for UK Research and Innovation: Science and Technology Facilities Council (STFC)
Budget range The total value of this requirement is up to £400,000 exclusive of VAT.

About the work

Why the work is being done The Excitations Group at STFC’s ISIS Neutron and Muon Source operates four world-class neutron instruments which are used for materials science experiments. To make best use of these expensive facilities, the PACE data analysis software will enable virtual experiments to be performed before and during neutron experiments, will maximise the information extracted from the experiments and will lower the barrier for users to analyse their data, leading to faster dissemination of scientific results and increased volume and quality of knowledge.

Production of the core data analysis framework of PACE constitutes this work package; it must be completed by August 2020.
Problem to be solved Recent improvements to instruments, experimental design and analysis workflows mean that data is now produced at a rate which current analysis software cannot analyse quickly or fully enough. The capabilities of the existing codes need to be re-implemented for parallel and distributed computing so that these larger data sets can be analysed, and with more flexibility. Moreover, straightforward and direct comparison of experimental data with CPU-intensive third-party materials modelling codes is required, together with optimisation of adjustable modelling parameters in those codes, with full account of instrument broadening. The manipulation, comparison and refinement need to be possible from a single work environment.
Who the users are and what they need to do The users of the software will be research scientists, some based full-time at ISIS, but the majority visiting from universities and other research institutes making short-term use of the instruments – typically on-site for a few days to a week. Their needs can be summarised as:
As a scientist, I need to be able to extract the scientific data from my neutron experiments, usually comparing with theoretical models, in a robust and timely manner so that I maximise the information gained from the limited and valuable time on the instruments for the benefit of the scientific community at large.
Early market engagement N/A
Any work that’s already been done Outline design work has started for implementation of the framework on distributed computing at STFC; the production of this core framework constitutes this work package. A considerable start has been made on interfacing with third-party modelling codes (lattice and magnetic vibrations) and on prototypes for faster convolution of models with instrument resolution functions.
PACE is based on existing un-parallelised software written at ISIS that provides a data analysis framework and optimises adjustable parameters in simple models of the data accounting for instrument broadening effects. The principles embodied in these codes are the template for PACE. See www.isis.stfc.ac.uk/Pages/Proper-analysis-of-coherent-excitations.aspx for more details.
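
To illustrate the analysis pattern the existing codes embody (a model of the scattering convolved with an instrument resolution function, then its adjustable parameters refined against data by least squares), the following is a minimal sketch only. All function and parameter names are hypothetical, the physics is a toy Gaussian peak rather than a real scattering model, and NumPy/SciPy stand in for whatever numerical back end PACE actually uses.

# Hedged sketch only: hypothetical toy model, not the PACE API or its real physics.
import numpy as np
from scipy.optimize import least_squares

def model(energy, gap, amplitude):
    # Toy one-dimensional excitation "model": a Gaussian peak centred at 'gap' (hypothetical).
    return amplitude * np.exp(-0.5 * ((energy - gap) / 1.0) ** 2)

def resolution_kernel(energy, width):
    # Toy Gaussian instrument resolution function of fixed width (hypothetical).
    kernel = np.exp(-0.5 * (energy / width) ** 2)
    return kernel / kernel.sum()

def broadened_model(params, energy, width):
    # Model convolved with the resolution function, i.e. with instrument broadening applied.
    gap, amplitude = params
    return np.convolve(model(energy, gap, amplitude),
                       resolution_kernel(energy, width), mode="same")

def residuals(params, energy, data, errors, width):
    # Weighted residuals fed to the least-squares optimiser.
    return (broadened_model(params, energy, width) - data) / errors

# Synthetic "measured" data so the example runs end to end.
energy = np.linspace(-10.0, 10.0, 401)
rng = np.random.default_rng(0)
data = broadened_model([3.0, 5.0], energy, width=0.8) + rng.normal(0.0, 0.05, energy.size)
errors = np.full_like(energy, 0.05)

# Refine the adjustable parameters (gap, amplitude) starting from a rough guess.
fit = least_squares(residuals, x0=[2.0, 4.0], args=(energy, data, errors, 0.8))
print("refined parameters:", fit.x)

In PACE itself the expensive steps would presumably be the model evaluations and the resolution convolution; those are the parts a parallel and distributed framework would accelerate, while the fitting loop remains a serial driver.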
Existing team The existing PACE team consists of a full-time software engineer and full time post-doctoral research scientist with substantial parallel programming experience; both are STFC staff. The supplier team will also work closely with 3-4 members of the ISIS Excitations Group and STFC Scientific Computing Department who have both scientific domain expertise and programming skills, and who work part time (10%-50%) on PACE. In addition, there will be interactions with the STFC Scientific Computing Department Systems Division, which maintains the hardware on which PACE will run.
Current phase Discovery

Work setup

Address where the work will take place ISIS Neutron and Muon Source, STFC Rutherford Appleton Laboratory, Harwell, Didcot OX11 0QX. Some travel to international partner sites may be required.
Working arrangements The supplier team will work on-site at ISIS and the STFC Scientific Computing Department, alongside the existing PACE team. The supplier should provide a flexible team: the skills mix required may need to be altered or complemented in the short term during the course of the project. ISIS will provide office accommodation, development software, hardware and infrastructure.
Security clearance Baseline Personnel Security Standard (BPSS) required

Additional information

Additional terms and conditions N/A

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • 1. Detail what experience your team has had developing distributed and parallel computing solutions and how this has satisfied the scientific requirements (4%)
  • 2. Outline your experience in developing/integrating profiling tools to compare how new solutions address performance bottlenecks in pre-existing codes (1%)
  • 3. Detail your experience in developing software in a scientific research environment using C/C++ that interfaces with MATLAB and/or Python. (3%)
  • 4. Outline your experience in developing scientific software for a cross-platform environment (Windows, Linux, Mac). (2%)
  • 5. Outline your skills and experience in undertaking agile software engineering leadership in a technologically complex, scientific research environment (3%)
  • 6. Outline your experience of producing unit, integration and acceptance tests to verify quality and correctness. (2%)
  • 7. Detail how you have integrated applications with algorithms to find the best-fit values of parameters in a scattering model through least-squares fitting. (2%)
Nice-to-have skills and experience
  • 1. Detail your experience in interfacing with third-party modelling codes, ideally materials modelling codes. (2%)
  • 2. Detail your experience of documenting projects and code for developers and future users (1%)

How suppliers will be evaluated

How many suppliers to evaluate 3
Proposal criteria
  • Demonstrate how your proposed team’s technical expertise, solution, approach and methodology will ensure the successful delivery of this project. (30%)
  • Demonstrate how your team’s approach and solution will meet user needs (5%)
  • Provide details of the key risks and a plan for mitigating them (5%)
Cultural fit criteria
  • Demonstrate how:
  • your team will work on complex scientific activities with the multi-disciplinary PACE team and the STFC Scientific Computing Department, and how this will result in successful delivery of this project. (5%)
  • you will take the lead in the design and delivery of the core data analysis framework alongside the work of PACE team members on the core framework and the other aspects of PACE. (4%)
  • you will share knowledge of architectural design and parallel computing expertise with PACE staff, and with staff recruited during the coming 3-9 months, so that the team becomes self-sufficient. (3%)
  • you can be flexible to the evolving needs of the project, satisfying the scientific requirements as it moves from discovery to construction. (3%)
  • your team's approach, solution and methodology will ensure continuity of service throughout the life of the contract. (5%)
Payment approach Capped time and materials
Assessment methods
  • Written proposal
  • Case study
  • Presentation
Evaluation weighting
Technical competence 60%
Cultural fit 20%
Price 20%

Questions asked by suppliers

Supplier question Buyer answer
1. Latest start date The commencement of the services is scheduled for 25th October 2019 in accordance with the published requirements. However, this is subject to change should there be any delays in the procurement process.
2. Procurement timeframe The anticipated procurement timeframes are as follows:
12th September 19 – Notification of shortlisting outcome
19th September 19 – Written proposal deadline
30th September / 1st October 19 – Presentations
16th October 19 – Notification
3. Technical Competence Criteria The technical competence section included within the above requirements has been allocated 20% of the overall evaluation weighting for the shortlisting stage.

The remaining 40% for technical competence will be allocated to the assessment stage. Further information on this will be shared with shortlisted bidders.
4. Discovery Outputs All outputs and relevant documents from the previous discovery / alpha phase will be shared with shortlisted bidders
5. Is the number of parallel threads required bigger than the number of threads running on a single batch computing node? If so, then how tightly coupled are the parallel processes? For example, does a solution require high-speed MPI interactions or is parallel batch processing required? The number of parallel threads is bigger than the number on a single computing node for a significant fraction of the expected computations. The parallel processes are not envisaged to be tightly coupled. The calculations from models for scattering are expected to be embarrassingly parallel. However, refinement of adjustable parameters in those models will require sufficient information to be returned to a master node to compute the values of those parameters in the next iteration of a least-squares fitting algorithm, for example.
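
To make the shape of that answer concrete, here is a minimal sketch of the loosely coupled pattern it describes: model evaluations over chunks of the data run as embarrassingly parallel tasks, and only the per-chunk residuals are returned to a master process, which feeds the combined residual vector into the next iteration of a least-squares fit. All names are hypothetical, and Python's multiprocessing pool merely stands in for whatever batch or MPI machinery is eventually chosen.

# Hedged sketch only: hypothetical names, not part of PACE.
import numpy as np
from multiprocessing import Pool
from scipy.optimize import least_squares

def chunk_residuals(task):
    # Residuals for one chunk of data; each call is independent of every other
    # (embarrassingly parallel), so it can run on any worker.
    params, energy_chunk, data_chunk = task
    gap, amplitude = params
    model = amplitude * np.exp(-0.5 * ((energy_chunk - gap) / 1.0) ** 2)
    return model - data_chunk

def residuals(params, energy_chunks, data_chunks, pool):
    # Master side: scatter the chunks to workers, gather the per-chunk residuals,
    # and hand the combined vector back to the optimiser for its next iteration.
    tasks = [(params, e, d) for e, d in zip(energy_chunks, data_chunks)]
    return np.concatenate(pool.map(chunk_residuals, tasks))

if __name__ == "__main__":
    # Synthetic data standing in for a measured spectrum.
    energy = np.linspace(-10.0, 10.0, 4000)
    rng = np.random.default_rng(1)
    data = 5.0 * np.exp(-0.5 * ((energy - 3.0) / 1.0) ** 2) + rng.normal(0.0, 0.05, energy.size)

    energy_chunks = np.array_split(energy, 8)
    data_chunks = np.array_split(data, 8)

    with Pool(processes=4) as pool:
        fit = least_squares(residuals, x0=[2.0, 4.0],
                            args=(energy_chunks, data_chunks, pool))
    print("refined parameters:", fit.x)

The only inter-process traffic in this sketch is the gathered residual vector, which matches the answer's statement that the parallel processes need not be tightly coupled.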