ATONAL CENTRAL DATA PLATFORM
Aggregation of disparate legacy and online data into a single multi-cloud data store for access, analysis, reporting and warehousing. Bulk and delta ingestion, plus integration of relational data. Secure deployment and service management within existing client cloud subscriptions. Intelligent matching and obfuscation tools for personally identifiable information (PII), in line with GDPR.
- remote access
- data integration
- data integrity validation
- data analysis and reporting
- bulk and delta data ingestion
- real-time data ingestion
- cloud platform agnostic
- legacy and modern data platform support
- host within client cloud subscription
- secure data warehousing and backup
- push and pull data from multiple remote data sources
- online transaction and batch file processing
- low maintenance cost cloud agnostic solution
- high security environment in UK cloud data centres
- highly scalable architecture (horizontal and vertical scaling)
- customisable queries and reporting into online and desktop tools
- PII identity matching
- row based data integration and fallout
- can handle relational and non-relational data sources
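The "row based data integration and fallout" feature above can be illustrated with a minimal sketch. Everything here — field names, validation rules, function names — is a hypothetical illustration, not the platform's actual implementation: each incoming row is validated independently, and rows that fail "fall out" to a retry queue rather than aborting the whole batch.

```python
# Hypothetical sketch of row-based ingestion with fallout.
# Field names and validation rules are illustrative only.

def ingest_rows(rows, required_fields=("id", "name")):
    """Validate each row independently; failures 'fall out' for retry."""
    ingested, fallout = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            # Row fails validation: queue it for later retry instead of
            # aborting the whole batch.
            fallout.append({"row": row, "errors": missing})
        else:
            ingested.append(row)
    return ingested, fallout

batch = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": ""},       # falls out: empty required field
    {"id": 3, "name": "Carol"},
]
ok, failed = ingest_rows(batch)
```

Counting the `ok` and `failed` lists also mirrors the service usage metrics listed later (rows ingested, ingestion failures, retry rows).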
£70,000 per unit
- Education pricing available
0203 034 4711
|Software add-on or extension||No|
|Cloud deployment model||
|Service constraints||Currently limited to Amazon Web Services or Azure|
|Email or online ticketing support||Email or online ticketing|
|Support response times||
Within 4 hours (normal working day Monday to Friday) outside of a warranty or maintenance support contract
As defined by SLA for maintenance and support contracted clients
|User can manage status and priority of support tickets||No|
|Phone support availability||9 to 5 (UK time), Monday to Friday|
|Web chat support||Yes, at an extra cost|
|Web chat support availability||9 to 5 (UK time), Monday to Friday|
|Web chat support accessibility standard||WCAG 2.1 A|
|Web chat accessibility testing||We use Slack which aims to be WCAG 2.1 AA by the compliance date of 1/1/21|
|Onsite support||Yes, at extra cost|
We provide tiered levels of support. Each support tier covers a different number of expected support call-outs, which are factored into the tier price. Additional call-outs are charged at a set hourly rate.
Call-outs cover Data Integrity intervention (which can be required due to poor data quality), infrastructure issues, security updates and patching, and the production of additional reports.
Additional support hours above the tiered support limits are charged at £150 per hour or part thereof. The minimum call-out period for an engineer under tiered support is 1 hour.
We offer both a Technical Account Manager and Cloud Support Engineers.
Silver Support: £99 ex VAT per month; email support only from our Technical Support team
Gold Support: £240 ex VAT per month; engineer time up to 2 hours, plus email/telephone support
Platinum Support: £500 ex VAT per month; engineer time up to 6 hours, proactive monitoring of data quality, plus email/telephone support
|Support available to third parties||Yes|
Onboarding and offboarding
We can provide training onsite, online, or through user documentation.
Documentation is provided free of charge; onsite and online training are provided at additional cost.
|End-of-contract data extraction||
The solution is designed to be hosted within a buyer's cloud subscription. No additional work is needed to extract the data, as the solution is owned entirely by the buyer in their own account.
If the service is terminated, the data can be extracted from the data storage accounts into large bulk data files.
The basic price includes up to 4 data sources with a limit of 400 data entities. Two reporting queries are included, as well as status-tracking administration queries. A single month of warranty support is included.
Support hours, additional data sources, additional data entities and additional reporting queries are excluded from the standard contract price and charged separately.
All cloud costs are excluded from the contract as they fall under the client's own subscription. Expected cloud infrastructure costs for the prescribed solution are in the region of £400 per calendar month ex VAT, subject to the size and frequency of data ingestion and reporting.
Using the service
|Web browser interface||Yes|
|Application to install||No|
|Designed for use on mobile devices||Yes|
|Differences between the mobile and desktop service||
The service is accessed through third-party tools such as PowerBI, so it can be used via an installed application (PowerBI Desktop), online, or on mobile devices.
The platform itself has no user interface, as it is a service rather than a website.
|What users can and can't do using the API||
The API allows users to run queries and reports against data sets. It allows the export of particular data sets.
The API allows the system to integrate via online API ingestion as well as bulk batch ingestion through file transport mechanisms such as SFTP, in both push and pull modes.
The system is designed to prevent user level interaction with the ingested data, as the system can constantly be integrating deltas. This restriction is to ensure integrity of the data.
Access levels, extracts, and reports are all customisable by the client.
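Since the listing says the system "can constantly be integrating deltas", a minimal sketch may help show what delta ingestion involves. This is an illustrative assumption about the general technique, not the platform's actual API or algorithm; the key name and snapshot shapes are hypothetical.

```python
# Hypothetical sketch of delta detection between two snapshots of a
# relational source, keyed by a primary key. Purely illustrative.

def compute_delta(previous, current, key="id"):
    """Classify rows as inserts, updates or deletes relative to 'previous'."""
    prev = {row[key]: row for row in previous}
    curr = {row[key]: row for row in current}
    inserts = [curr[k] for k in curr.keys() - prev.keys()]
    deletes = [prev[k] for k in prev.keys() - curr.keys()]
    updates = [curr[k] for k in curr.keys() & prev.keys() if curr[k] != prev[k]]
    return inserts, updates, deletes

prev_snapshot = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
curr_snapshot = [{"id": 1, "v": "a"}, {"id": 2, "v": "B"}, {"id": 3, "v": "c"}]
ins, upd, dels = compute_delta(prev_snapshot, curr_snapshot)
```

Only the changed rows (the delta) then need to be applied to the store, which is why user-level edits to ingested data are restricted: a manual change would be silently overwritten or conflict with the next delta.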
|API documentation formats|
|API sandbox or test environment||No|
|Description of customisation||
Since every client's data is different, customisation is necessary to accommodate ingestion of their various data sources, formats and schemas.
Customisation can be carried out by the supplier or by the buyer.
|Independence of resources||The system can be scaled both horizontally and vertically. User access is only through the data store, which can be scaled automatically to meet user demand on the system.|
|Service usage metrics||Yes|
Number of rows ingested.
Number of ingestion failures.
Number of retry rows currently in the system.
Transmission and Ingestion logs.
|Supplier type||Not a reseller|
|Staff security clearance||Other security clearance|
|Government security clearance||Up to Developed Vetting (DV)|
|Knowledge of data storage and processing locations||Yes|
|Data storage and processing locations||United Kingdom|
|User control over data storage and processing locations||Yes|
|Datacentre security standards||Managed by a third party|
|Penetration testing frequency||At least every 6 months|
|Penetration testing approach||In-house|
|Protecting data at rest||
|Data sanitisation process||Yes|
|Data sanitisation type||Deleted data can’t be directly accessed|
|Equipment disposal approach||A third-party destruction service|
Data importing and exporting
|Data export approach||
The data is accessed through secure SQL Queries.
Data can be exported through tools like PowerBI, in a variety of formats including file-based CSV, Excel or Google Sheets, or through visualisation tools.
The data can be analysed both offline and online, with obfuscation of PII.
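The combination of "PII identity matching" and "obfuscation of PII" mentioned above is commonly achieved with deterministic tokenisation: each PII value is replaced by a salted hash, so equal inputs still match without exposing the raw value. The sketch below is an assumption about the general technique, not the platform's actual tooling; the salt, field names and token length are all illustrative.

```python
import hashlib

# Hypothetical sketch: deterministic, salted obfuscation of PII fields.
# Equal inputs produce equal tokens (so identities can still be matched),
# but the raw value is not exposed. Salt handling is illustrative only.

SALT = b"example-deployment-salt"  # would be a managed secret in practice

def obfuscate(value: str) -> str:
    """Replace a PII value with a salted, deterministic token."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def obfuscate_row(row: dict, pii_fields=("name", "email")) -> dict:
    """Obfuscate only the designated PII fields; pass others through."""
    return {k: obfuscate(v) if k in pii_fields else v for k, v in row.items()}

a = obfuscate_row({"id": 1, "name": "Alice", "email": "a@example.com"})
b = obfuscate_row({"id": 2, "name": "Alice", "email": "b@example.com"})
```

Because the same name yields the same token in both rows, matching and analysis remain possible on the obfuscated export.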
|Data export formats||
|Other data export formats||
|Data import formats||
|Other data import formats||
|Data protection between buyer and supplier networks||
|Data protection within supplier network||TLS (version 1.2 or above)|
Availability and resilience
The system is deployed within a client-owned cloud subscription, and the ingestion of information is dependent on the quality of the incoming data files and streams. Fallout is expected: the system attempts to auto-correct data where possible, but errors requiring additional Data Integrity intervention should be anticipated. Data access for processing is covered by the cloud provider's own SLA, as native cloud data storage is used. The level of errors is directly proportional to the quality of the incoming data.
|Approach to resilience||System resilience is built into the platform through the use of Platform as a Service (PaaS) offerings from the different cloud platform providers. The infrastructure is designed to be immutable: it is routinely destroyed and re-created, which minimises system costs and improves security and system stability through regular patching and security updates. No element is a single point of failure, and because the system utilises PaaS capabilities it can be extended to global redundancy with multi-region failover.|
The system uses Azure Monitor (for Azure installs) and AWS CloudWatch (for AWS installs) to monitor performance of the supporting services and provide a dashboard to visibly see the state of the services. These provide a number of means of contact to demonstrate outages, these include but are not limited to, SMS and Email. It is possible to customise the environment at additional cost to link the alerts to third party services like PagerDuty or ServiceNow.
Issues with data during ingestion are notified by email. All logs, error reports and processing notifications are stored within the database and are accessible through third-party tools such as PowerBI Desktop, through the different cloud portals, and through other SQL-based third-party tools.
Identity and authentication
|User authentication needed||Yes|
|Access restrictions in management interfaces and support channels||
We have role based access controls on all the management and support consoles. These differentiate between Administrators and lesser roles and determine which controls / functions they are able to access.
Two factor authentication (including via Mobile Device Management) can be deployed using the native cloud provider platform. We can also use TLS client certificates to control access.
|Access restriction testing frequency||At least every 6 months|
|Management access authentication||
Audit information for users
|Access to user activity audit information||Users have access to real-time audit information|
|How long user audit data is stored for||User-defined|
|Access to supplier activity audit information||Users have access to real-time audit information|
|How long supplier audit data is stored for||User-defined|
|How long system logs are stored for||User-defined|
Standards and certifications
|ISO/IEC 27001 certification||No|
|ISO 28000:2007 certification||No|
|CSA STAR certification||No|
|Other security certifications||No|
|Named board-level person responsible for service security||Yes|
|Security governance certified||No|
|Security governance approach||We have documented processes and procedures under ISO9001:2015 and we are currently expecting to be certified under ISO27001 in 2019.|
|Information security policies and processes||
We ensure good governance through effective leadership, appointment of an Information Risk Owner, board level oversight of security compliance, security awareness training, risk management procedures and a culture of security amongst staff. Our security policy and processes are available to prospective clients on request.
Our architects follow national security guidelines and create highly secure environments with client subscriptions leveraging the cloud platform security facilities. We operate cloud first infrastructure so we are using AWS and Azure security protocols which can be adapted for client requirements i.e. MDM controlled 2FA.
Role based access control management, staff vetting and integrated and robust HR processes mitigate internal security risk.
|Configuration and change management standard||Supplier-defined controls|
|Configuration and change management approach||All components are stored within a secure source control environment. We utilise a number of different technologies depending on the platform chosen for cloud hosting, including Azure, GitHub and Bitbucket Pipelines (which can deploy to Azure and AWS). In addition, all cloud subscription environments are logged, capturing all relevant information pertaining to the cloud environment. All project-level tasks are created and stored within Atlassian Jira, and we use these, in conjunction with the source control tools, to produce all change management reports.|
|Vulnerability management type||Supplier-defined controls|
|Vulnerability management approach||
The cloud platforms supported, currently AWS and Azure, offer robust threat monitoring through their various platform-as-a-service security offerings. The system is built on the principle of immutable infrastructure, meaning it is rebuilt from a clean build each time it is deployed. The system also routinely destroys and rebuilds any non-PaaS components, such as virtual machines, where these are required for a particular communication method (for example, legacy OSI Layer 3 restrictions on IP-based communications).
All potential threats are shown through the management consoles of the various cloud providers as well as notifications through SMS and Email.
|Protective monitoring type||Supplier-defined controls|
|Protective monitoring approach||All ingestion of data and access to the data stores is protected through a number of restrictions. Any connection must originate from a restricted set of IP addresses, so a potential compromise can be cut off simply by removing the compromised address from the IP allow list. All data is backed up and can be rewound to remove any problematic ingestion, and all reporting access is read-only.|
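The IP-restriction mechanism described in the row above can be sketched in a few lines. This is a generic illustration of an allow-list check, not the platform's actual firewall configuration; the network ranges shown are documentation examples.

```python
import ipaddress

# Hypothetical sketch of the IP allow-list restriction: a connection is
# accepted only if its source address falls within an allowed network,
# and a compromised network can be revoked by removing it from the set.
# Network ranges are illustrative only.

ALLOWED = {
    ipaddress.ip_network("10.0.0.0/24"),
    ipaddress.ip_network("192.0.2.0/28"),
}

def is_allowed(source_ip: str) -> bool:
    """Accept a connection only if it originates inside an allowed network."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in ALLOWED)

before = is_allowed("10.0.0.42")
# A network that appears compromised is revoked by removing it:
ALLOWED.discard(ipaddress.ip_network("10.0.0.0/24"))
after = is_allowed("10.0.0.42")
```

In a real deployment this check would live in the cloud provider's network security rules (security groups, NSGs) rather than application code.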
|Incident management type||Supplier-defined controls|
|Incident management approach||The system is built to be auto-healing where possible; this includes not only the cloud providers' PaaS redundancy but also additional redundancy, system restarts and rebuilds built into the system itself. Error events are reported through SMS and email, and we also provide email addresses for direct reporting. Telephone support during normal working hours is also available. We provide system reports on a monthly schedule.|
|Approach to secure software development best practice||Conforms to a recognised standard, but self-assessed|
Public sector networks
|Connection to public sector networks||No|
|Price||£70,000 per unit|
|Discount for educational organisations||Yes|
|Free trial available||No|