The Nanowire data analysis platform uses artificial intelligence and machine learning to capture, extract, analyse and understand unstructured files and social media data. Users can search, cluster, classify and report on hidden patterns across their data sources. The system features a user-friendly front end and a scalable data processing pipeline.
- Data analysis platform for file and social media analysis
- Text analytics, text mining and Natural Language Processing
- Optical Character Recognition (OCR) and text extraction
- Machine learning model training, text and image classification
- Integrated with SharePoint and Office365
- Social media monitoring and insight (Twitter, Reddit, YouTube and Tumblr)
- Data visualisation, statistics and reporting
- Fraud and anomaly detection
- Open source intelligence (OSINT), e.g. Companies House API mining
- Data storage in graph database (Neo4j), Elasticsearch and MongoDB
- Uncover hidden connections and clusters across data
- Powerful search across millions of documents and files
- Extract data from Office files, PDFs, images and social media
- Scalable processing from 100s to millions of files
- Expandable through new data connectors and analysis plugins
- Docker containerised processing backed by Kubernetes
- Deployable to public cloud (AWS/Google/Azure) or in-house in secure environments
- Secure APIs to access processed data
- Train machine learning models using TensorFlow and neural networks
- Run as an isolated system with no outbound internet connection required
£160 to £900 per person per month
- Education pricing available
- Free trial available
|Software add-on or extension||No|
|Cloud deployment model||Hybrid cloud|
Nanowire is provided as a self-service system and deployed as SaaS on AWS, Azure or Google Cloud Platform. Nanowire can also be deployed in-house or to other cloud platforms.
A set amount of cloud compute and data volume is included within the monthly price, or can be charged separately. Users may therefore process large amounts of data within the licence agreement by using their own infrastructure or cloud provider.
Machine learning training requires a GPU-enabled server (Nvidia CUDA compatible); additional charges apply for hosted GPU servers. All other deployments can run without a GPU.
|Email or online ticketing support||Email or online ticketing|
|Support response times||
Email and telephone support is offered Monday to Friday from 09:00 – 17:00, excluding Bank Holidays.
Response times are staged depending upon the severity of the support required: immediate for catastrophic situations, 1 hour for critical incidents, 2 hours for moderate incidents and 8 hours for minimal-impact incidents.
|User can manage status and priority of support tickets||Yes|
|Online ticketing support accessibility||None or don’t know|
|Phone support availability||9 to 5 (UK time), Monday to Friday|
|Web chat support||Web chat|
|Web chat support availability||9 to 5 (UK time), Monday to Friday|
|Web chat support accessibility standard||None or don’t know|
|How the web chat support is accessible||Web chat is available via a dedicated Slack channel per customer.|
|Web chat accessibility testing||None so far.|
|Onsite support||Yes, at extra cost|
|Support levels||We provide layers of service support with a dedicated customer manager for each client. Initial support is via ticket through JIRA Service desk, further support includes phone, email and Skype and optional onsite visits if required.|
|Support available to third parties||Yes|
Onboarding and offboarding
Spotlight Data offers help and guidance to get users started and familiar with the service.
All new customers are invited to regular 'onboarding' webinars or group Skype calls. These are designed to familiarise users with the key functionality and demonstrate how data is analysed. They are interactive sessions with opportunities to raise questions. We offer these sessions every month throughout the contract if required.
A user manual is also provided which details the functionality of the system.
The setup fee includes a half-day onsite training workshop and a user requirements workshop, to ensure the system meets the customer's needs.
|End-of-contract data extraction||Up to 30 days after the end of the contract, all user data within the system can be extracted by raising a support ticket. Data can be provided as CSV, JSON or a database dump from Elasticsearch or Neo4j; other export formats may be available. Data can be provided to an individual user or to the organisation as a whole (only with authority from the customer contact point). All user data will be deleted 30 days after the contract has ended.|
30 days before the contract ends, Spotlight Data will work with the customer to ascertain data export requirements and complete a 'project end' process. This process is an opportunity for both us and the customer to review the project and includes lessons learnt, feedback on the system and the contract, next steps and a discussion of further opportunities.
Data export is included in the contract. We offer export in a variety of formats and may be able to export the data for use in other systems; however, data conversion may be chargeable.
Users can continue to use the system until the contract ends. 30 days after contract end, all user data will be deleted, overwritten and the deployment decommissioned to ensure security.
Using the service
|Web browser interface||Yes|
|Application to install||No|
|Designed for use on mobile devices||No|
|Accessibility standards||None or don’t know|
|Description of accessibility||We have not conducted accessibility testing yet.|
|Accessibility testing||We have not conducted accessibility testing yet.|
|What users can and can't do using the API||The data processing capability of the system can be accessed via the user web panel or via a secure API. The API allows users to submit files for processing and then query an endpoint to retrieve the data analysis results for that file, e.g. keyword analysis across a Word document or OCR of a scanned PDF.|
|API documentation formats||HTML|
|API sandbox or test environment||Yes|
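The submit-then-query flow described under "What users can and can't do using the API" can be sketched as below. This is a minimal illustration only: the base URL, endpoint paths, JSON fields and the bearer-token header are assumptions, not the documented API; consult the HTML API documentation for the real contract.

```python
# Hypothetical sketch of the Nanowire file-processing API flow.
# Endpoint paths and JSON field names are illustrative assumptions.
import json
import urllib.request

BASE_URL = "https://nanowire.example.com/api/v1"  # assumed base URL

def auth_headers(token):
    """Assumed JSON Web Token bearer header for the secure API."""
    return {"Authorization": f"Bearer {token}"}

def submit_file(path, token):
    """POST a file for processing; returns an assumed job identifier."""
    with open(path, "rb") as fh:
        req = urllib.request.Request(
            f"{BASE_URL}/files",
            data=fh.read(),
            headers=auth_headers(token),
            method="POST",
        )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]

def get_results(job_id, token):
    """Query the results endpoint, e.g. keyword analysis or OCR output."""
    req = urllib.request.Request(
        f"{BASE_URL}/results/{job_id}", headers=auth_headers(token)
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The sandbox environment noted above would be the natural place to verify the real endpoint shapes before wiring this into production code.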
|Description of customisation||
The SaaS user interface and data analysis methods are supplied "as-is"; however, new data analysis methods and data connectors can be added using 'plugins'. A plugin is a Docker container that takes data in and outputs it for use in the system and web panel. We provide Python and Node.js plugin libraries and documentation so external developers can develop their own custom plugins and extend the system. Support for custom plugin development and new user interface features is available via the Nanowire support agreement.
Spotlight Data can also be contracted to extend the capability of Nanowire through custom development sprints. This could be updates to the web panel style, new charts, new reporting capability or new data analysis methods (e.g. custom machine learning algorithms).
Customers can set requirements for where the service and their data is hosted, however this may incur additional costs.
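To make the plugin idea above concrete, here is a minimal sketch of what an analysis plugin's core logic might look like. The real contract is defined by Spotlight Data's Python and Node.js plugin libraries; the document-in, enriched-document-out shape shown here (and the `text`/`word_count` field names) are assumptions for illustration.

```python
# Sketch of a Nanowire-style analysis plugin: a containerised step that
# takes a document in and outputs an enriched document for the pipeline.
# The JSON contract shown here is an illustrative assumption.
import json

def analyse(document):
    """Toy analysis step: count the words in the extracted text."""
    text = document.get("text", "")
    document["word_count"] = len(text.split())
    return document

# Example input as it might arrive from an upstream extraction step;
# in a real container this would be read from the plugin library's
# input channel and written back out for the web panel to display.
doc = {"text": "Invoice dated 3 March, total due 1,200 GBP"}
print(json.dumps(analyse(doc)))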
|Independence of resources||
The system uses Kubernetes auto-scaling to scale analysis services and resources in line with the volume of data throughput and the number of data processing tasks queued in the message queues.
The resources are then scaled back when no longer required.
Customers are provided their own Kubernetes cluster which is hosted on shared infrastructure by default, but private infrastructure is available (e.g. dedicated AWS EC2 instances).
|Service usage metrics||Yes|
We provide logs of usage and data processing across users within that particular organisation. All processing, access requests and error messages are centrally logged within a Kubernetes service. Logs are archived outside the cluster in a secure UK AWS S3 bucket.
Service and usage metrics will be available via an administrative dashboard (Kibana/Grafana); for now they are available on request via a service ticket.
|Supplier type||Not a reseller|
|Staff security clearance||Staff screening not performed|
|Government security clearance||Up to Developed Vetting (DV)|
|Knowledge of data storage and processing locations||Yes|
|Data storage and processing locations||United Kingdom|
|User control over data storage and processing locations||Yes|
|Datacentre security standards||Managed by a third party|
|Penetration testing frequency||At least every 6 months|
|Penetration testing approach||In-house|
|Protecting data at rest||
|Data sanitisation process||Yes|
|Data sanitisation type||
|Equipment disposal approach||A third-party destruction service|
Data importing and exporting
|Data export approach||
The system is designed to analyse large amounts of data, so we provide export facilities:
Users can export data themselves from the web panel as a CSV for reuse in Excel.
A database dump of all the data can also be provided via a support ticket request. The data is stored in Elasticsearch as JSON and JSON-LD (Linked-data) and can be imported into another Elasticsearch database or converted for reuse elsewhere. We provide converters to CSV and other formats.
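The CSV converters mentioned above could look something like the following sketch, which flattens a dump of JSON documents into a CSV for reuse in Excel. The field names in the example are invented for illustration; a real export would carry Nanowire's own schema.

```python
# Sketch of a JSON-lines-to-CSV converter, of the kind used to turn an
# Elasticsearch dump into a spreadsheet-friendly file. Field names here
# are illustrative assumptions, not the real export schema.
import csv
import io
import json

def json_to_csv(json_lines):
    """Convert a sequence of JSON document strings to one CSV string."""
    docs = [json.loads(line) for line in json_lines]
    # Union of all keys, sorted, so every document fits one header row.
    fields = sorted({key for doc in docs for key in doc})
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, restval="")
    writer.writeheader()
    writer.writerows(docs)
    return out.getvalue()

dump = ['{"file": "a.pdf", "keywords": "fraud"}',
        '{"file": "b.docx", "keywords": "invoice"}']
print(json_to_csv(dump))
```

Taking the union of keys (with `restval=""` for gaps) keeps the converter robust when documents in the dump carry different analysis fields.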
|Data export formats||
|Other data export formats||
|Data import formats||
|Other data import formats||
|Data protection between buyer and supplier networks||
|Data protection within supplier network||TLS (version 1.2 or above)|
Availability and resilience
Guaranteed availability is 99.50% when we host the SaaS. We use AWS, Azure or Google Cloud Platform for hosting.
We offer a service credit if we do not meet the guaranteed level of availability: a pro-rata credit for loss of service from the day the customer reports the fault until the day it is resolved, calculated by dividing the annual service charge by the number of business days and multiplying by the number of days of lost service.
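The service-credit formula above can be shown as a worked example. The figures below are illustrative only, and the 260 business days per year is an assumption, not a contractual value.

```python
# Worked example of the SLA service-credit formula described above:
# credit = (annual service charge / business days per year) * days lost.
# The 260-business-day year and the charge figure are assumptions.
def service_credit(annual_charge, days_lost, business_days=260):
    """Pro-rata credit for a loss of service, per the SLA formula."""
    return annual_charge / business_days * days_lost

# e.g. a £10,400 annual charge with 2 business days of lost service:
print(round(service_credit(10_400, 2), 2))  # 80.0 (i.e. £80 credit)
```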
|Approach to resilience||
We use Kubernetes to orchestrate and manage the system and data processing, as it is designed for scalable data workloads. A Kubernetes cluster features self-healing, redundant server nodes and the ability to autoscale depending upon CPU and RAM load. We use a load balancer with health checks to ensure all server nodes are 'alive', with a Slack alert to notify us when a node does not respond.
Our deployments run in a single UK datacentre or UK cloud hosting provider (AWS/Azure/Google). Further resilience can be provided if required, but may attract additional costs.
|Outage reporting||Email and Slack alerts are automatically pushed to us and the customer if an outage is detected. We are in the process of developing a public status dashboard.|
Identity and authentication
|User authentication needed||Yes|
|Other user authentication||JSON Web Token for API usage|
|Access restrictions in management interfaces and support channels||Access is limited by the IP address of connecting devices and restricted to users with administrative access rights.|
|Access restriction testing frequency||At least every 6 months|
|Management access authentication||
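The JSON Web Token authentication listed above for API usage can be illustrated with a from-scratch HS256 sketch. This is for clarity only: a real client would use a library such as PyJWT, and the claim names and secret below are invented for the example.

```python
# Illustration of HS256 JSON Web Token signing, as used for API access.
# A production client would use an established JWT library; claim names
# and the shared secret here are assumptions for the example.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url encoding without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    """Build a signed HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

token = make_jwt({"sub": "api-user", "org": "example"}, b"shared-secret")
print(token.count("."))  # a JWT has three dot-separated parts, so 2
```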
Audit information for users
|Access to user activity audit information||Users contact the support team to get audit information|
|How long user audit data is stored for||User-defined|
|Access to supplier activity audit information||Users contact the support team to get audit information|
|How long supplier audit data is stored for||At least 12 months|
|How long system logs are stored for||At least 12 months|
Standards and certifications
|ISO/IEC 27001 certification||No|
|ISO 28000:2007 certification||No|
|CSA STAR certification||No|
|Other security certifications||Yes|
|Any other security certifications||
|Named board-level person responsible for service security||Yes|
|Security governance certified||No|
|Security governance approach||We hold Cyber Essentials and, at the time of submission, are starting Cyber Essentials Plus and ISO 27001/9001 accreditation. We document each security incident and are developing security policies in line with the ISO accreditation process.|
|Information security policies and processes||All security incidents are flagged to the Information Security Officer for review and action. We are in the process of updating our policy documents, which include: risk log, risk reporting, risk management, IT security and operational security manual.|
|Configuration and change management standard||Supplier-defined controls|
|Configuration and change management approach||Spotlight Data uses secure source control with a revision history of each change. All changes are beta tested, then pushed to a staging environment and verified before being deployed to the live environment. Customers are then notified of new feature releases and the product update page is updated.|
|Vulnerability management type||Supplier-defined controls|
|Vulnerability management approach||Spotlight Data operates a threat management policy which includes regular penetration testing and server updates. All servers we manage are set to apply security updates automatically; we virus-scan all code before deployment and monitor security updates for the software packages we use. If we discover a threat, we apply patches within 24 hours of notification. We also review syslog messages and flag user authentication errors.|
|Protective monitoring type||Supplier-defined controls|
|Protective monitoring approach||Spotlight Data actively monitors the system and identifies any suspicious behaviour outside normal usage. All network and Kubernetes log information is subject to regular audit, and we are working to flag anomalous behaviour with machine learning using our own Nanowire pipeline. This information is then used to identify threats and respond to incidents.|
|Incident management type||Supplier-defined controls|
|Incident management approach||We use Atlassian JIRA to assign tasks for mitigation and ensure security events are handled. Events such as high CPU or memory usage are monitored and then handled if there is an impact on the end user. All incidents are logged using an incident report template in Atlassian Confluence, including time, causes, resolution and mitigation. Common problems and fixes are recorded in a knowledge base to avoid rework and improve efficiency. This documentation can be shared with the customer if required.|
|Approach to secure software development best practice||Supplier-defined process|
Public sector networks
|Connection to public sector networks||No|
|Price||£160 to £900 per person per month|
|Discount for educational organisations||Yes|
|Free trial available||Yes|
|Description of free trial||A free trial is available on request by contacting us.|
|Pricing document||View uploaded document|
|Skills Framework for the Information Age rate card||View uploaded document|
|Service definition document||View uploaded document|
|Terms and conditions document||View uploaded document|