Awarded to Red Eye International Ltd

Start date: Monday 2 October 2017
Value: £83,160
Company size: SME
The Money Advice Service (MAS)

A/B Test Programme Implementation, Support and Consultancy

10 Incomplete applications

6 SME, 4 large

4 Completed applications

4 SME, 0 large

Important dates

Published
Monday 3 July 2017
Deadline for asking questions
Monday 10 July 2017 at 11:59pm GMT
Closing date for applications
Monday 17 July 2017 at 11:59pm GMT

Overview

Summary of the work
A specialist A/B test partner using Optimizely to help MAS improve conversion rate optimisation on our core website. This includes consultancy, test implementation and test reporting.
Latest start date
Thursday 31 August 2017
Expected contract length
12 months with option of 12 month extension
Location
No specific location, eg they can work remotely
Organisation the work is for
The Money Advice Service (MAS)
Budget range

About the work

Why the work is being done
MAS require an A/B test partner using Optimizely to help us improve conversion rate optimisation on our core site.
Problem to be solved
MAS does not currently have the in-house resource capacity to deliver the desired A/B test programme at the required speed.
Who the users are and what they need to do
Not applicable
Early market engagement
Any work that’s already been done
MAS have been running an A/B test programme for 20 months.
Existing team
The partner would work in conjunction with the MAS digital team.
Current phase
Not applicable

Work setup

Address where the work will take place
The Money Advice Service
Holborn Centre
120 Holborn
London
EC1N 2TD
Working arrangements
Must be able to attend monthly review meetings at MAS offices, with the potential for ad hoc requests to attend at short notice.

Typical day-to-day contact will be via online communication channels such as Slack, Trello and Basecamp.
Security clearance

Additional information

Additional terms and conditions

Skills and experience

Buyers will use the essential and nice-to-have skills and experience to help them evaluate suppliers’ technical competence.

Essential skills and experience
  • Be experts in the creation and implementation of Optimizely tests
  • Be able to undertake front-end development for more complex tests and QA within Optimizely
  • Design tests with consideration for mobile, iPad and desktop devices
  • Comply with website accessibility standards
  • Be experts in creating new customer journey flows in Optimizely
  • Be experts in developing tests for SPA (single-page application) sites in Optimizely
Nice-to-have skills and experience
Demonstrable competence in account and contract management will be viewed favourably

How suppliers will be evaluated

How many suppliers to evaluate
3
Proposal criteria
  • Organisational Structure and Core Business Relevance
  • Communications and Contract Management
  • Quality and Fulfilment of requirements
  • Information Technology
  • Understanding of Requirements
  • Project Team
  • Approach and Methodology
Cultural fit criteria
  • Work as a team with our organisation and other suppliers
  • Be transparent and collaborative when making decisions
  • Have a no-blame culture and encourage people to learn from their mistakes
  • Take responsibility for their work
  • Share knowledge and experience with other team members
  • Challenge the status quo
Payment approach
Time and materials
Assessment methods
  • Written proposal
  • Case study
  • Work history
  • Reference
  • Presentation
Evaluation weighting

Technical competence

60%

Cultural fit

10%

Price

30%

Questions asked by suppliers

1. Has the existing A/B test programme been run in house or by an external supplier?
Our A/B test programme started as a fully managed service 18 months ago with an external supplier reporting into an internal Product Manager, but we now utilise both in-house and external partner resource to design and implement tests.
2. Can you provide an indication of budget?
On average, we are looking for between 30 and 50 hours per month of test implementation and support.
3. Does remote location mean UK only or can it be any international locations like India?
We would only consider partners in the same time zone.
4. What analytics packages are currently being used?
Onsite analytics services we use include:
- Google Analytics 360
- Decibel Insight (but under review)
- Optimizely
5. What are the conversion points for success to be measured against?
As we are a non-commercial site, this can vary on a test-by-test basis.
Typically we review success based on completion rate (did the user complete a tool's workflow) or interaction rate (did they interact with specific page elements on an article or other webpage).
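Interaction-rate metrics like these are typically wired up as custom events in Optimizely Web. As a minimal sketch only (the event name and the queue-bootstrap pattern are illustrative assumptions, not part of this brief):

```javascript
// Hypothetical sketch: recording an interaction as a custom Optimizely Web
// event so it can be used as a test metric. The event name "cta_click" is
// illustrative only.
// In a browser the command queue lives on window; fall back to globalThis
// so the sketch also runs outside a browser.
var w = typeof window !== "undefined" ? window : globalThis;

// Optimizely Web reads commands from an array-style queue named "optimizely".
w["optimizely"] = w["optimizely"] || [];

function trackInteraction(eventName) {
  // Push a custom event; the Optimizely snippet attributes it to the
  // visitor's active experiment variations.
  w["optimizely"].push({ type: "event", eventName: eventName });
}

// e.g. called from a click handler on the element being measured
trackInteraction("cta_click");
```

An event pushed this way would normally be bound to a click handler on the element being measured, then selected as a metric when configuring the experiment.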
6. Are KPIs set at a page-by-page level? If so, what is the MAS site's core objective?
Our corporate KPIs can be seen in chapter 6 of the attached MAS business plan. KPIs 3.3, 4.1 and 4.3 are the most relevant to digital: https://masassets.blob.core.windows.net/cms/files/000/000/659/original/Money_Advice_Service_2017_2018_Business_Plan_FINAL_PUBLIC_EDITION.PDF

We don't have KPIs at a page-by-page level, but we do continually monitor completion rates and engagement on tools and articles so we can identify opportunities to improve them in support of the overall corporate KPIs.
7. Once tests are successful what is the production process hereafter?
We either continue iterating variations within Optimizely, or we add it to the backlog for coded implementation on the website by the dev team.
8. Are there any website personalisation campaigns currently running or in scope currently? Or is this brief specifically for MVT testing solely?
Currently it is MVT testing, but we are looking into Optimizely X for personalisation.
9. Do MAS carry out any research beyond site analytics and on-site satisfaction surveys?
Yes
- Whatusersdo
- Lab testing
- Guerrilla testing
- Focus groups
- Connexity Hitwise
- We also have our own in-house Insight and Evaluation team - they are not part of the digital function but often commission bespoke research
10. The brief talks about capacity issues from a production and technical competence perspective, so how do they see this breaking down between strategic planning, project management, test implementation and analysis?
This could vary on a test-by-test basis, but as a guide:

- 20% strategic planning
- 15% project management
- 40% test implementation
- 25% analysis
11. Can MAS provide any volumetrics for the current and projected service?
The response to this question is based on the assumption that volumetrics refers to the number of hours of support required as opposed to anything to do with website usage/analytics.

We'd be looking at an average of 55 hours of support per month across test implementation, support and consultancy to match the current arrangement.