Met Office

Application Lifecycle Management (ALM) - Test Automation Skills Injection

Incomplete applications
14 (11 SME, 3 large)

Completed applications
24 (18 SME, 6 large)
Important dates
Published Friday 2 June 2017
Deadline for asking questions Friday 9 June 2017 at 11:59pm GMT
Closing date for applications Friday 16 June 2017 at 11:59pm GMT

Overview

Summary of the work The Met Office initiated an Application Lifecycle Management (ALM) project to improve the current delivery lifecycle. The project identified an opportunity to leverage industry best practice testing. The Met Office is looking for a supplier to help deliver the blueprint while at the same time up-skilling the current delivery teams.
Latest start date Saturday 1 July 2017
Expected contract length 12-18 months
Location South West England
Organisation the work is for Met Office
Budget range Between £500k and £900k, based upon understanding of the requirements from the strategy.

About the work

Why the work is being done The testing capability within the Met Office is being transformed from a Quality Assurance (QA) to a Total Quality Engineering (TQE) model. This involves introducing new practices such as Behaviour Driven Development (BDD) and Specification by Example (SBE). These practices consider quality at the outset and support Continuous Integration which enables a ‘learn fast, learn often’ culture. The current test automation approach was found to be lacking and unable to fully support a Total Quality Engineering / Continuous Integration (CI) model.
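For illustration, practices such as BDD and SBE express requirements as executable scenarios. A minimal sketch, assuming a Python toolchain with the behave library (an assumption - the toolset is still being finalised) and a toy warning rule in place of real Met Office logic:

    # Gherkin scenario (features/warnings.feature):
    #   Scenario: High winds trigger an amber warning
    #     Given a forecast wind gust of 60 mph
    #     When the warning level is assessed
    #     Then the warning level is "amber"
    #
    # Step definitions (features/steps/warnings_steps.py):
    from behave import given, when, then

    def assess_warning(gust_mph):
        # Toy domain rule; the real logic would live in the product code.
        return "amber" if gust_mph >= 50 else "none"

    @given("a forecast wind gust of {gust:d} mph")
    def step_given_gust(context, gust):
        context.gust = gust

    @when("the warning level is assessed")
    def step_assess(context):
        context.level = assess_warning(context.gust)

    @then('the warning level is "{level}"')
    def step_check(context, level):
        assert context.level == level

The scenario doubles as a readable specification and an automated regression check, which is what allows quality to be considered at the outset.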
Problem to be solved A recent review identified the following challenges:
• No common tooling strategy.
• Duplicate tooling (e.g. 9 x Test Management tools).
• Over-reliance on unscripted testing (~60%).
• Only ~20% of products have regression packs.
• Automation scripts not maintained post-delivery.
• Emphasis on GUI automation.

Addressing these challenges will:
• Increase velocity.
• Increase quality.
• Increase repeatability.
• Shorten feedback cycles (learn fast, learn often).
• Reduce the overall time to market.
• Reduce testing technical debt.
• Drive a continuous improvement culture.
• Link test artefacts and test execution results to requirements in a common tool (see the sketch below).
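As a rough sketch of that last point, assuming JIRA as the common tool (it appears in the buyer answers below) with placeholder credentials and issue keys, a test execution result could be attached to the linked requirement through JIRA's standard REST comment endpoint:

    import requests

    JIRA_BASE = "https://jira.example.test"   # placeholder JIRA instance
    AUTH = ("ci-bot", "api-token")            # placeholder credentials

    def post_result(issue_key, outcome):
        # Attach an automated test execution result to the requirement's ticket.
        url = "{0}/rest/api/2/issue/{1}/comment".format(JIRA_BASE, issue_key)
        response = requests.post(url, json={"body": "Automated test run: " + outcome},
                                 auth=AUTH, timeout=10)
        response.raise_for_status()

    post_result("MET-123", "PASSED")   # hypothetical requirement key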
Who the users are and what they need to do Solution Engineering requires the supplier to:
• Provide automation (CI) skills injection (coaching / consulting).
• Work with teams to deliver automated tests.
• Form part of the strategic team delivering the test automation strategy.
Early market engagement
Any work that’s already been done An automated testing strategy was delivered in partnership with an existing supplier. We have begun to deliver against that strategy but have struggled to secure the right resources to help up-skill the wider teams.

The strategy recommends engaging a 3rd party to help implement the strategy and up-skill the current Met Office teams.
Existing team • Delivery Teams – Project Manager, Release Manager, Technical Lead, Solution Engineers, Product Owner.
• As a Service – Solution Architects, Test Architects, Product Managers, Technical Testers, PMO.
Current phase Not started

Work setup

Address where the work will take place • Majority of engagements will be based in Exeter, UK.
• Option to work remotely as appropriate.
Working arrangements Core team to be based onsite, with the potential for other roles to work remotely. The supplier is to rotate the onsite core team as appropriate. The Met Office is open to onshore / nearshore solutions.
Security clearance Please consider when responding the necessity for all individuals to be willing and able to pass SC clearance.

Additional information

Additional terms and conditions

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • Experience of implementing a strategic vision for automation testing in a software development environment.
  • Management of the adoption and running of testing processes and frameworks.
  • Effective engagement across internal IT teams, including representation and championing of testing methodologies within the wider Enterprise, Solution Architect & IT Manager community.
  • Experience of supporting competency leadership from a testing perspective within internal IT teams.
  • Experience of working with various contributory groups to ensure that the ‘Toolset’ within Service Transition teams is both effective and efficient.
  • Provision of advice and guidance on the levels and types of skilled resource, full participation in recruitment campaigns, and help with selection decisions for individual candidates.
  • The ability to provide hands-on expertise as and when required to support the squads and Test Leads with implementing the automated engineering approach.
  • Implementation and execution of relevant automated tests.
  • Mentoring and coaching of permanent staff to ensure that knowledge persists.
  • Expertise and knowledge of CI pipelines and dependencies, and assistance with navigating challenges when they arise.
  • Working with appropriate delivery teams to ensure adoption of automated testing and any associated techniques.
Nice-to-have skills and experience

How suppliers will be evaluated

How many suppliers to evaluate 5
Proposal criteria
  • The proposed approach, methodology and experience.
  • How the approach or solution meets user needs.
  • The proposed technical solution.
  • The lead time to commence the engagement.
  • The value for money of the proposal.
Cultural fit criteria
  • Work as a team with our organisation and other suppliers.
  • Timeliness and effectiveness of communication.
  • Willingness to engage as a partner.
  • Openness and approach to knowledge transfer/training.
Payment approach Time and materials
Assessment methods
  • Written proposal
  • Case study
  • Work history
  • Presentation
Evaluation weighting
  • Technical competence: 60%
  • Cultural fit: 15%
  • Price: 25%

Questions asked by suppliers

Supplier question Buyer answer
1. We would like to know whether we can work on this project remotely. For example, initial discovery and set-up for the first month based onsite in Exeter, then operating remotely from our London office with travel to Exeter when required. Assuming it's an Agile environment, daily stand-ups by phone/WebEx/Skype. There can be some elements of offsite working; however, given the up-skilling element and the engagement with our teams, the MO preference is for more of an onsite presence. In addition, a remote working solution isn't in place for external companies within the MO network, so that will be an issue. We require a fixed percentage of staff on site, which provides stability while allowing some flexibility. The expectation is that you show how you would facilitate this in your presentation.
2. 1. When is your current phase due to start? 2. Are you using agile/DevOps as your framework? 3. What types of applications/platforms are in scope for testing? 4. Does the scope include non-functional testing, e.g. performance testing? 5. What test management tools are you using? 6. Is any development done in-house? 7. Is SC clearance required, or just a willingness to go through SC clearance? Will this be sponsored by the buyer? 8. Can you share any details of the automation strategy? 9. Have you implemented BDD or is this aspirational? 10. Are there business analysts within your existing team? 1. We have 6 key phases, of which one is already underway, but we would like resources on deck by July / August. 2. We are using Agile; our maturity level is low. 3. Testing types - Acceptance, System Integration / Regression, System (e2e), Component Integration, Component and Unit. 5. Test Management tools - JIRA and Zephyr; we are finalising our toolset as we move forward. 6. Yes, in-house development. 7. Yes. 8. Yes, see answer 5. 9. We are attempting to implement BDD (but again this is immature in the MO). 10. No - Product Owners gather and communicate requirements.
3. In the experience needed - "Experience of supporting competency leadership from a testing perspective within internal IT teams." - can you please explain the term "competency leadership". Competency is the term the MO use when referring to standardization, best practice and the overall level of skills required to deliver software projects.
4. Are there specific technologies that the Met Office is looking to target to be tested? Java / JavaScript / Python
5. "Emphasis on GUI automation" is marked as a problem, is this because there is a lack of unit and integration testing happening or for another reason? This is indeed due to a lack of Integration testing currently in place.
6. Are there other third parties engaged to drive non-testing elements of Continuous Integration or is this driven internally? No, driven internally
7. If there is a transition to a BDD approach, do the existing test cases/requirements support this or is there an exercise required to update existing test collateral? There will be a mixture
8. We understand the Met Office want to simplify their usage of tools but will the project be looking at Test Data Management (TDM) tools and Service Virtualisation tools which might support the move to Continuous Integration also? We will be looking at TDM internally and this will be driven by the Test Tech Leads
9. Can you expand on your definition of competency leadership in relation to the 4th skill/experience? Competency is the term the MO use when referring to standardization, best practice and the overall level of skills required to deliver software projects.
10. What is the current project delivery model? Is the methodology based on Agile Scrum? We currently use Agile Scrum and Kanban, and we also have a Waterfall approach to platform deliveries.
11. Please explain the currently implemented agile methodology maturity across the dev and test teams. This is low at the moment, which is why we are going through this process.
12. The problem statement mentions unscripted testing (60%) and regression packs (20%). What is the current state of the remaining 20% of test cases? The test cases do not exist for legacy systems.
13. How are the existing test cases categorised and tagged within the test management tool - Smoke, Acceptance and Regression; marked for automation; based on priority (P1, P2 and P3)? They are not, and this is a consideration of the work we will be doing.
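For illustration, such tagging might look like the following in a pytest suite (pytest and the custom markers are assumptions; the toolset is still being finalised):

    import pytest

    # Custom markers would be registered in pytest.ini, for example:
    #   [pytest]
    #   markers =
    #       smoke: quick confidence checks
    #       regression: part of the full regression pack
    #       jira(key): links the test to a requirement
    #       priority(level): P1 / P2 / P3

    @pytest.mark.smoke
    @pytest.mark.regression
    @pytest.mark.jira("MET-123")    # hypothetical requirement key
    @pytest.mark.priority("P1")
    def test_observation_feed_available():
        # Placeholder assertion; the real check would call the product under test.
        assert True

Running pytest -m smoke would then select just the smoke pack, and the marker metadata can be reported back to the test management tool.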
14. This RFP was published in November last year. Why was it withdrawn and published again now? Has the team progressed in terms of the required need since then? Impacts from IR35 and the loss of resource, which gave us time to re-think our strategy.
15. What is the current duration between each release cycle? This depends on the project's timeframe requirements and on what products and services are being delivered to the relevant platform. We would expect each project to have release cycles for each product being worked on.
16. How are the documentation process and repositories maintained in the Met Office: business requirements documents, test case documents and test data, other reference documents? We use JIRA as the basis for our documentation, but we also have a wide range of repositories in which information is stored.
17. What types of mobile apps does the Met Office develop - iOS or Android (native, web, hybrid)? The apps are developed externally and are both.
18. Does the proposed automation/CI framework need to be implemented over existing legacy applications, or will it be implemented over new feature development? This will give us fair insight into automation branches for different release cycles. It will be a blend of green-field work and navigation of legacy.
19. For performance testing, do we have performance benchmarks defined? Application performance is out of scope; CI capability is yet to be defined.
20. Is there any requirement for non-functional testing such as security, performance, usability or localisation? No.
21. Are there any desktop applications within the current product suite that need CI-based automation? No, all are browser based.
22. What is the current maturity of existing staff in terms of usage of tools and technologies? The MO teams have various levels of maturity, but the average is probably medium / low.
23. Does the Met Office use any test controller as a central driver for running various automated tests (GUI, API, mobile)? Not yet, but we expect to as part of this work.
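A minimal sketch of such a central driver, assuming marker-tagged pytest suites like those above (the suite names are hypothetical):

    import subprocess
    import sys

    # Marker expressions selecting each tagged suite.
    SUITES = ("gui", "api", "mobile")

    def run_suite(marker):
        # Run one suite in a subprocess and return pytest's exit code.
        return subprocess.call([sys.executable, "-m", "pytest", "-m", marker])

    if __name__ == "__main__":
        # Fail the controller if any suite fails, so a CI job can gate on it.
        sys.exit(max(run_suite(m) for m in SUITES))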
24. What is the underlying development technology used for the web and mobile applications (Java, C#, AngularJS)? Java / JavaScript / Python
25. What challenges did the Met Office face during the implementation of automation strategies, the design of the blueprint and the execution of automated test cases across different release cycles? None, as this is what we are looking to resolve with this work.
26. Will the testing team be utilised in deployment, environment creation or troubleshooting? Our delivery teams will be doing the environment creation, with support from test leads and release engineering.
27. In the current Met Office product portfolio, what is the percentage split by application type (web, mobile, API, desktop, database, other)? 40% web, 10% mobile, 40% API, 0% desktop, 10% database.
28. What challenges were faced during training and up-skilling (an example would be helpful)? The main challenges have been around perception of practices.
29. What challenges were faced across the project lifecycle and process implementation? Speed of delivery, quality, and the ability to deliver for independent products.
30. Please define the release exit criteria and respective SLAs. A new Definition of Done process is being worked on; there are hundreds of products and SLAs vary across these.
31. Is there a high-level resource plan in place which broadly sets out how many resources would be needed to work alongside the teams? Yes, this is a ratio of 1 to 2 teams across 12-18 months.
32. Is there going to be a test case gap analysis session where gaps in the existing test cases will be identified? No.
33. Are there any applications that are multi-tier based or include third-party interactions? Yes, we have a high density of products, most of which are multi-tiered.
34. Is there a ramp-up/ramp-down plan which has to be worked out in a phased manner? Please clarify; we expect this to run for 18 months, and it will be down to the supplier to propose the ramp-up and ramp-down approach.
35. Does the Met Office have any recommendations for specific roles and experience levels to fit the required need? No, we expect the supplier to provide suitable resources, especially those highly skilled in CI and in setting up test automation capability for legacy and green-field developments, along with the ability to understand how to phase in value-add MI.
36. What roles in the team are mandated for SC cleared resources? The general provision is that all roles which require access to Met Office IT systems must be capable of being SC Cleared. We do not mandate this clearance up-front.
37. Are you expecting any other security clearance besides SC-cleared roles, e.g. BPSS, Disclosure Scotland, etc.? No.
38. Are you looking for a flexible, low-cost model of delivery? Are you open to discussing ideas around economical delivery? Yes; cost accounts for 25% of the weighting.
39. Could the Authority please provide details of its travel and expense policy? For example would consultants be able to claim their travel expenses between their contractual work location and Exeter? The travel and expense policy will be sent to all shortlisted suppliers. Regrettably, it is not possible to attach documentation to the Digital Marketplace opportunity portal.
40. The documentation sets out the objectives of the Met Office, but then requests that evaluation is set against a proposal to carry out the work; however, there is nowhere to propose how our company would structure a proposition to do the work for the Met Office, which could be inherently different based on their strategic objectives and adding value over and above this. Where and how should this be proposed - is this to be done at the tender stage? The proposal mentioned will be requested from all shortlisted suppliers as a second phase for this opportunity.
41. Is this a PQQ aiming to shortlist to 5 Tenderers? If so, when will the shortlisted organisations receive and be expected to return the Tender Submissions? If not, is this an RFP? and if so, how and where are you expecting the pricing to be formatted and how will it be evaluated? The first phase of the DOS2 opportunity offers suppliers the ability to demonstrate how their skills match with the essential and desirable skills required for this role. The second phase of this opportunity is the RfP stage, which will be released to the shortlisted 5 suppliers once the skills evaluation has been completed.
42. As most of the questions to be answered are 20+ words long, answering in 100 words with a concise response that is powerful and demonstrative of capability is far too tight. Would it be possible to get an extension to a maximum of 1,000 words? Unfortunately not. The word limit for these opportunities is set by Crown Commercial Services, hence it is not possible for it to be amended by the Met Office.
43. Could you please clarify whether the incumbent vendor would continue to be a part of this engagement? What kind of transition is being planned for the new partner? There is no incumbent vendor, and the work hasn't started.
44. Could you please detail specific challenges that you faced while up-skilling the wider teams? The main challenges are maintaining velocity while skills are being built up, and the diversity of current skill levels.