By Dan Inbar, President, Engineering and Development, Infrastructure Solutions Group at Dell Technologies
Remember the scene in The Matrix where Trinity needs to learn how to fly a helicopter while under attack, and central command (aka Tank) downloads a program teaching her how to do it? The Matrix may have been the stuff of science fiction when it first hit the big screen more than 20 years ago, but ‘downloadable expertise’ could be possible within the next ten years.
That’s the power of software in a hyper-connected world. Everything from new features and updates to quick bug fixes can be pushed remotely and seamlessly to a device, a car and now even infrastructure.
As Dell Technologies’ Future of the Economy report noted, the 2018 Tesla update is a watershed example of a frictionless, down-the-line upgrade. Shortly after launch, Tesla’s long-awaited Model 3 was reported to have an unsafe braking distance. Tesla’s engineers swiftly released a software update to fix the problem. Never before had an automaker addressed a glaring flaw that threatened to sink its flagship car while bypassing the traditional costs of recalling and repairing vehicles mechanically. It may have been a first, but with the latest paradigms in technology, it certainly won’t be the last.
Taking R&D to The Edge
As compute rapidly shifts to the Edge in pursuit of data, innovation and development will also move into live production at greater speed. Updates will be pushed live through software, without requiring massive infrastructure overhauls. That’s a game-changer. Future-looking innovations can become present-day differentiators, with R&D sitting at the nexus of deployment, management, and support.
This transfer to the Edge will create a decentralized IT infrastructure environment that doesn’t sacrifice the agility of the operational model first demonstrated by public clouds. In fact, it will address some of the flaws associated with the public cloud, notably latency.
If a business is too far from its data center, it will struggle to deliver real-time experiences to both internal and external consumers. Meanwhile, Gartner predicts that by 2025, 75% of data will be created and processed outside the traditional data center or cloud. By moving to a flexible and agile IT environment at the Edge, R&D can untether itself from static infrastructure and deliver real-time benefits at every stage of a product lifecycle. Incremental updates can be tested in sub-production mode and then rolled into customer products, creating a live feedback loop for R&D.
By reducing latency and enhancing usable bandwidth, the Edge makes the exchange of data and response times fast enough to answer the needs of real-time test and optimization cases, such as autonomous vehicle training and inference. This, in turn, ensures that production vehicles can respond to data in an instant to avoid collisions and keep passengers safe. The same holds for more day-to-day use cases, such as no-checkout retail stores and in-store sensors that detect theft.
According to a Forrester Consulting study commissioned by Dell Technologies, 35 percent of businesses have already started increasing the number of proofs-of-concept (PoC) data uses at the Edge; 25 percent plan to do so in 1-3 years.
A new PB (personal best) in R&D
A confluence of other technology factors is also paving the way for software-defined R&D to break new ground.
For a runner, setting a new personal best (time) isn’t as easy as declaring, “I’ll just run faster”. They’ll need to improve their gait, foot placement and overall fitness, rest between workouts, hydrate more and recover properly. Many factors are involved. Businesses now have an opportunity to set a new PB in R&D, thanks to the Edge, as well as new generations of chips and silicon, common software and hardware, common APIs, the rise of alternative architectures, and simulation software.
For instance, with common software, you can tap into lessons learned by looking externally to open-source software. Companies open-source their discoveries principally because they know that everyone benefits if the underlying technology can be freely shared to drive human progress. This was Dell Technologies’ thinking when we ‘donated’ the Data Confidence Fabric, in recognition that the industry needs a model for rating data as trustworthy, or not.
Firms can also look internally if they’ve created something akin to a common software framework that encompasses the collection and curation of pre-tested building blocks and software components. By standardizing commonly used software modules, developers can save time and resources and avoid unnecessary failure.
Embedding a ‘build once, re-use many’ mindset to identify, vet and maintain worthwhile assets and projects will expedite a company’s R&D and set it on the path to success from day one.
In time, companies could be freed from having to develop code altogether, liberating them to focus on front-end innovation. Low-code/no-code practices are already abstracting away much of the development work. Soon, AI could be writing its own code.
R&D Is Digital and Digital Is R&D
We’ve heard it said time and time again that every business is a software business and software is eating the world. We’re seeing this reflected in the changing nature of commercial R&D. In the recent past, businesses couldn’t do R&D without digital technology. Now digital technology is the R&D.
Similarly, in the past R&D was sometimes treated as a sporadic side-project. Today, it’s software-defined and therefore continuous and frictionless, and it nearly always builds on previous, incremental learnings to achieve business outcomes, not just create nodes.