“Service providers are struggling with enormous amounts of data” has been a common refrain of the past five years. The refrain has lately taken on a new tone with the impending arrival of 5G technology for the delivery of connected services.
Be it IoT, the Connected Car, the consumer last mile, or the most common mobile over-the-top services, 5G promises to multiply the data deluge, even as it promises to improve the quality of service delivery, reduce data access latency, and enhance the efficiency of the mobile infrastructure.
5G is going to affect all forms of services in successive waves of change. Mobile service providers will be the first to acquire and adopt 5G. As 5G-capable devices and infrastructure equipment become common (and thus cheaper), the next wave of adoption will come from connected service providers such as cable and consumer access providers.
Next will be providers of software, storage, infrastructure, and everything else “as a service.” With each wave of the roll-out, customers will expect seamless service creation, acquisition, delivery, take-down, and re-launch – all at their own command and control.
How will service providers meet these customer expectations? To begin with, they must listen to what their customers are asking for. The ask will come in the form of advanced data analytics, with the expectation of pinpoint use of artificial intelligence, machine and deep learning, and inferencing, using as input all the data flowing over 5G networks.
Service providers need to start preparing today for this emerging era of 5G and start improving their analytics and learning abilities.
A large section of the service provider vertical is struggling with even the basics of analytic capability. The struggle is a direct result of the complexity of analytics software and of the underlying hardware and networking. Getting a single instance of an analytics software stack completely configured, tested, and delivered to users in production commonly takes weeks to months.
That estimate also assumes that the underlying server, networking, and storage infrastructure components are properly connected, configured, and ready. As a result, a production-ready solution instance is already obsolete by the time it is delivered, because the market has moved on to the next version of everything. Thus begins the long and iterative process of lifecycle management of the solution. The initial, large investment is followed by a long succession of smaller yet numerous investments throughout the lifecycle of the solution, which reduces the overall efficiency of this approach.
Dell EMC has created a “ready architecture” approach to address this issue. A Dell EMC ready architecture consists of several guides or books, created by our team of expert technical professionals after thorough experimentation in our global labs – all with the objective of “doing the heavy lifting” for our customers so they don’t have to.
A reference architecture guide gives our customers a precise bill of materials, an explanation of how the components fit together and why, and the options and alternatives available for each component of the architecture. As a whole, the reference architecture exemplifies small, medium, and large sizes of the solution, with guidelines for creating larger- or smaller-scale instances.
A deployment and validation guide provides a blueprint by which a service provider customer can create a complete data analytics solution stack in their own lab. How our team of experts did it in our labs is documented in full detail, and the battery of validation tests that were carried out is included in the documentation. Several Dell EMC partner companies worked with our experts to ensure that their software stacks, as well as their use-case configurations, work seamlessly on the reference architecture.
Equally important is the performance baseline guide included in the ready architecture. For each partner company’s software stack, use cases pertinent to telecom service providers were selected. Our team of experts then measured performance baselines while running those use cases with sample, representative data sets. The experimental setup and the Hadoop and software vendor optimizations used are documented in detail, and clear performance metrics are reported for each measurement. The data presented will give technical and business decision makers a sense of what to expect from the ready architecture, as well as a what-if analysis in case of scale-up or scale-down of the solution infrastructure.
In this series of blog posts, our team describes the elements of version one of Dell EMC’s Ready Architecture for Service Provider Analytics (SPA v1). We have chosen two significant Dell EMC partners to showcase for this ready architecture. They are:
- Zaloni, with their Data Lake platform offering and Customer 360 use-case
- Cardinality, with their Perception Analytics platform for the Operational Intelligence use-case
The Zaloni Data Lake simplifies processing of large amounts of data using Cloudera Hadoop: data ingestion is simplified, and time-to-results is significantly reduced. Future blog posts will show how the SPA v1 ready architecture with Zaloni can get you from thought to results faster than anyone else.
The Cardinality Perception platform uses a Docker and Kubernetes container ecosystem for ultra-fast processing of real-time data streams, with a unique approach to the ETL part of data analytics. Ingestion is followed by unique data annotation and labelling, which, in conjunction with real-time information gathered from a provider’s mobile infrastructure, yields significant operational insights. The Perception platform is able to scale linearly as your data grows, owing to the modern, container-based architecture of the platform.
Stay tuned for the next edition of this blog series to learn more about Dell EMC’s Service Provider Analytics ready architecture v1 (SPA v1).
For more information, please see the posts:
- Do You Have Deep Insights Yet?
- Service Provider Analytics – Unlocking the Value of Data