Stay Competitive With Next Generation OEM Solutions

An independent survey recently predicted that companies engaging in above-average levels of OEM partnership have the opportunity to accelerate sales growth and cost reductions by 35 and 45 percent respectively by 2025. On a more sobering note, the same survey also predicted that, by 2025, up to 50 percent of current businesses will cease to exist.

Digital dilemmas

Here’s the big question: which side of the wave of disruption do you want to be on? The wave can either propel you forward, leave you floating and looking for the next one to chase, or immerse you in the deep. We all know that staying relevant and competitive is tough. There is no end game. New capabilities are affecting every industry and show no sign of abating.

Customer experience is a key differentiator, yet expectations are rising and there’s the sheer pace of technology change to contend with. Take, for example, supply chain security. It’s no longer enough to worry about protecting your own IT systems; you now have to think about those of your critical suppliers. How much do you know about the third-party components in your solution and where they originate?

There are also challenges finding the right people – you need a slew of engineering talent, not just to develop your IP but also to manage integration, networking and the underlying platform. Business continues to get more complex.

Next gen solutions

What can you do differently to stay competitive and accelerate growth? In my book, you have to stay ahead of the curve and outpace disruption by innovating faster. I believe that as industries evolve through digital transformation, solution builders and their customers need a whole new generation of solutions to support their applications. We need horses for courses – the same old, same old isn’t going to cut it any more. A picture paints a thousand words, so let’s walk through a few industry examples.

Digital transformation in manufacturing

For example, take the typical manufacturing plant. In the beginning, we had manual production with humans doing the work. Smart machines then arrived, freeing workers up to do more value-add activities. Smart factories were the next evolution, leveraging inventory and analytical systems. Machine vision then arrived on the scene to bring higher levels of accuracy to assembly and product inspection. Fast forward to today. Welcome to the era of the smart, interconnected factory, capable of linking into core business systems and shaping customer demand to match the factory and supplier capabilities.

Increased appliance workload

All good, but here’s the crunch – with all these additional demands, the appliance workload is increasing all the time. Smart devices continue to feed even more data into the mix. In the smart, interconnected factory, your appliance now needs to be multi-functional, managing everything to do with production: modelling supply and demand, communicating with suppliers, prioritizing production schedules and ramping individual factories up or down.

Digital transformation is pervasive

Healthcare is another great example. Increasingly, technology is changing how patients and doctors interact, with artificial intelligence being used for rapid and accurate diagnostics. In the telecom world, we can predict network usage and optimize for customer demand before a problem occurs. In smart surveillance, AI can analyse live video to stop crime in real time, allowing a single set of eyes to cover many cameras at once. When suspicious activity is detected, the relevant camera is highlighted and security personnel are called only when needed.

Data deluge

Apart from offering sophisticated solutions and an improved customer experience, what do these four examples have in common? The answer is not just data, but real-time understanding of that data. However, this tidal wave of data is threatening to overwhelm and consume us. Having an architecture that can handle it is critical. Surviving the wave is no longer enough. It’s all about being prepared to ride and surf the data for the benefit of your business and your customers versus going under.

In tandem, of course, we’ve seen huge advances in compute power, moving from the traditional single-purpose operating systems to virtual environments and multi-functional appliances, all the way through to converged and hyper-converged solutions.

The age of containerization and offload

Well, get ready – Next Gen Computing is the new frontier! Think highly available compute power, extending from the Cloud to the Core to the Edge, elastic scalability and software-defined everything. By wrapping a function in a virtual machine or container, you can treat it as an atomic entity. The advantage is that you can independently roll out or replace virtual functions in a modular fashion without impacting other functions.

The modularity of these functions also allows you to build a multi-purpose solution. For example, in the case of the smart, interconnected factory, you can drop both a virtual machine-vision solution and a failure-prediction solution into a single appliance, size it and go. It really represents the best of every world.
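As a rough illustration of that modularity, here is a minimal sketch using the Docker SDK for Python to run two independent containerized functions side by side on one appliance. The image names, tags and restart policy are hypothetical placeholders, not references to any specific product.

```python
# Minimal sketch: run two independent containerized functions on one appliance.
# Each one can be rolled out, updated or replaced without touching the other.
# Image names below are hypothetical placeholders.
import docker

client = docker.from_env()

# Function 1: machine vision for assembly and product inspection.
client.containers.run(
    "example.registry/machine-vision:1.4",
    name="machine-vision",
    detach=True,
    restart_policy={"Name": "always"},
)

# Function 2: predictive failure analytics on the same hardware.
client.containers.run(
    "example.registry/failure-prediction:2.1",
    name="failure-prediction",
    detach=True,
    restart_policy={"Name": "always"},
)

# Rolling out a new vision model later only touches that one container:
#   old = client.containers.get("machine-vision")
#   old.stop(); old.remove()
#   ...then run the new image under the same name.
```

The point is simply that each function is its own unit of deployment, which is what lets one appliance serve several purposes without becoming monolithic.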

Of course, as individual appliances need to run faster, you’ll need to offload capabilities to pack more processing power into denser space. It’s all about mapping what your workload needs to do and choosing the right accelerator – be that FPGA, GPGPU or ASIC.

Data collection and analysis

Of course, compute is just one half of the solution. It’s also going to matter where compute is done. In terms of analysis, I believe that industry needs to move to the Edge for data collection and real-time analysis while continuing to avail of a centralized Cloud for overall infrastructure. It’s usually not an either/or – you probably need both.

It may sound like an obvious question, but what is the Edge? Where is it located? The short answer is that it can be wherever you want or need it to be. There’s no single definition. For example, in the telecom world, the Edge is considered anything not located in the core data center. In this instance, the Edge might be a micro data center placed in a sub-region of a large city.

Meanwhile, in a smart city environment, the Edge might be a smart traffic light or a video surveillance camera, processing video to count the number of pedestrians on a sidewalk so that only the information that’s necessary or interesting is sent to the cloud rather than all the raw video. In the transport or maritime industry, we might be talking about the back of a truck or a ship on the high seas, collecting engine data, watching for anomalies and uploading when bandwidth is available. In many cases, these are challenging environments, so rugged, highly available systems that can withstand extreme temperatures are critical.
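To make that concrete, here is a minimal sketch of the kind of edge-side video processing described above, using OpenCV’s stock HOG pedestrian detector in Python. The camera index and reporting interval are hypothetical, and a production system would more likely use a tuned deep-learning model, but the pattern is the same: analyse locally, send only the count.

```python
# Minimal sketch: count pedestrians at the edge and report only the count,
# so raw video never has to leave the camera. Camera index and the reporting
# function are placeholders for illustration.
import time
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def report_to_cloud(count: int) -> None:
    # Placeholder: in practice this would send a tiny JSON payload upstream.
    print(f"pedestrian_count={count}")

cap = cv2.VideoCapture(0)       # local camera attached to the edge device
last_report = 0.0

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Detect people in the frame; returns bounding boxes and confidence weights.
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    now = time.time()
    if now - last_report > 10:  # report a small summary every 10 seconds
        report_to_cloud(len(boxes))
        last_report = now

cap.release()
```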

Act on insights in real time

The advantages are clear. With the Edge, you can process data close to the source and act on insights in real time. As more and more data sets are generated, it’s simply not going to be possible to send all the data to the Cloud, at least not in real time, when it is most valuable. From a cost perspective, it makes sense to intelligently aggregate data at the Edge and send only what’s interesting or relevant out to the Cloud. In many workloads, the Edge can be used to make quick data-based decisions, with the Cloud notified only of the outcomes.
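As a simple way to picture that pattern, here is a minimal sketch of edge-side aggregation in Python: the appliance acts on anomalies immediately and forwards only a compact summary to the Cloud. The sensor source, threshold and upload call are all hypothetical placeholders.

```python
# Minimal sketch of edge-side aggregation: decide locally in real time,
# notify the Cloud of outcomes only. Sensor, threshold and upload are placeholders.
import random
import statistics
import time

TEMP_LIMIT_C = 85.0              # hypothetical threshold for immediate local action

def read_sensor() -> float:
    # Placeholder for a real sensor read (e.g. over Modbus or OPC UA).
    return random.gauss(70.0, 10.0)

def act_locally(value: float) -> None:
    print(f"EDGE ACTION: throttling line, temp={value:.1f}C")

def send_summary_to_cloud(summary: dict) -> None:
    # Placeholder: a small JSON message over whatever uplink is available.
    print("CLOUD SUMMARY:", summary)

window = []
for _ in range(600):             # e.g. ten minutes at one reading per second
    value = read_sensor()
    if value > TEMP_LIMIT_C:
        act_locally(value)       # the real-time decision stays at the Edge
    window.append(value)
    time.sleep(0.01)             # shortened interval for the sketch

# Only the aggregate leaves the Edge, not 600 raw readings.
send_summary_to_cloud({
    "mean_temp_c": round(statistics.mean(window), 2),
    "max_temp_c": round(max(window), 2),
    "overlimit_events": sum(v > TEMP_LIMIT_C for v in window),
})
```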

Take the telecom example. With 5G, the proliferation of mobile and IoT devices, and the growth of high-bandwidth content delivery services, edge computing is set to become increasingly relevant. Due to bandwidth, latency and security needs, not everything can or will go to the Cloud for routing or processing. I believe it will be necessary to analyse traffic or data at the Edge, act on the data if necessary, and then route the relevant data to the appropriate Cloud over the most cost-efficient uplink.

Time to re-engineer your architecture

In summary, IoT, AI, machine learning and analytics are merging. It will become impossible to meet new customer expectations using traditional server appliances and the Cloud alone. The workloads are too intense, the networking too complex, and the environments too diverse.

The key take-away is that hardware and software re-architecture is critical – the appliance and the way we conduct real-time data analysis has to evolve to address changing needs. The applications of the future will not run on the appliances of the past. The battle lines are already being drawn. Prepare now and get ready to ride the wave!

What has been your experience in the market? Are you planning to re-architect your hardware and software solutions? We welcome your comments and questions.

Learn more about next gen solutions from Dell EMC OEM: www.dellemc.com/next-gen-oem

Listen to the playback of our webinar on this topic here.

Join our LinkedIn OEM & IoT Solutions Showcase page here.

Keep in touch. Follow us on Twitter @dellemcoem

About the Author: Alan Brumley

Alan Brumley is Chief Technology Officer of Dell Technologies OEM Solutions, a specialist division helping customers to grow their businesses by leveraging Dell’s hardware, software and services as part of their own solutions, as well as helping organizations take advantage of the rapid evolution in the Internet of Things. With 20 years’ leadership experience in the IT industry, Alan liaises with Dell EMC’s global business units to act as the voice of the customer. He is responsible for leading engagement activities with OEM customers, channel partners and technology partners. Collaborating with his CTO peers in the partner network, Alan and his team evaluate emerging technologies, promote early adoption and integration of breakthrough solutions with customers, and design new procurement options.

Alan has over 18 years’ experience as an innovator with Dell EMC, previously designing Enterprise Solutions products in the Server Management Firmware group with a focus on modular enclosures that converge storage, networking and compute capabilities, as well as server management solutions for these products. With a hardware and BIOS background, he holds over ten U.S. patents for the company and has delivered many first offerings in the Enterprise product family, including designing the Dell EMC PowerEdge M1000 enclosure. Alan started his career in a technology startup and also enjoyed a five-year career as an innovator with Data Performance and Warehousing Solutions at NCR. He is excited that the Internet of Things has now merged his work focus with his personal passion for autonomous embedded systems and wireless communications, feeding data into Enterprise solutions for deep analytics.