Can you imagine going under the knife only to find out afterwards you didn’t actually need surgery? I recently heard a story of a woman who had a lump removed from her breast that was benign. I was shocked. But after some research, I discovered it’s not uncommon.
Even today, with all our technological advancements, unnecessary surgeries still happen. About 90 percent of breast lump removals turn out to be unnecessary. The good news is that technology such as artificial intelligence (AI) now exists to help solve these problems. Doctors like Manisha Bahl at Massachusetts General Hospital have shown that machine learning could reduce these unnecessary surgeries by nearly one-third.
It’s true. AI is real and widespread.
AI innovation is happening everywhere, not just in healthcare. Approximately 71 percent of companies said they had already implemented AI or were planning to implement it in the next 12 months. The chart shows the responses from 2,106 data and analytics technology decision makers at global organizations.
We wanted to better understand how AI impacts companies, their lines of business, and the IT department. So, we commissioned Forrester Consulting to help us answer these questions with primary research. The details and findings formed the basis of a Forrester Consulting Thought Leadership Paper on the topic.
What’s clear is that the world is no longer waiting for Silicon Valley to develop an all-in-one solution to its problems. Individuals and organizations are using new foundational technology to innovate with AI on their own. Foundational technology such as 3D printing is democratizing manufacturing and improving product development. Connected IoT devices are now ubiquitous and generating vast amounts of raw data. Powerful new 4-socket servers with serial and parallel processing digest that data to compute insights in real time. These technologies are now accessible to everyone, not just huge research institutions or Fortune 50 companies.
Organizations are using AI to better understand their customers and develop better products. In the next 12 months, 54 percent of companies are planning to use AI to deliver a better customer experience. Across the globe, in every vertical, individuals and organizations are innovating with AI and its applications – machine learning and deep learning.
Could AI replace your doctor? Maybe. But AI isn’t replacing everything your doctor does, and the hospital of the future isn’t going to run itself. As in manufacturing, the advancements in healthcare appear focused on repetitive tasks and variability reduction. Let’s look at an example.
The human eye is fallible. Doctors have human eyes. Even a handful of the best doctors can look at the same medical image and come to different conclusions. The question is how often this happens. In 1959, a ground-breaking article reported that radiologists missed approximately 30% of positive findings. More than five decades later, numerous studies still confirm discrepancies in radiologic interpretation.
Enter the digital eye powered by AI. The digital eye is already better than the human eye in many areas. In fact, the accuracy of the digital eye is expected to further extend its lead over the human eye. These improvements will continue to advance precision in radiology, pathology, dermatology, and ophthalmology.
At Stanford University, researchers are using the digital eye. They have created an AI algorithm to identify skin cancer. Human skin is full of lesions that are non-cancerous. Lesions like freckles, moles, and skin tags are very common and benign, so it can be difficult to spot the cancerous ones. The Stanford researchers trained their deep learning algorithm with 130,000 images. What they found was that their algorithm’s accuracy at diagnosing skin cancer is comparable to a doctor’s.
When Henry Ford radically changed how cars were built, humans performed almost every task, sometimes with mind-numbing repetition. Today, robots perform many of these sequences in a pre-programmed path. They do it with precision and without complaints. But, there are still problems to solve and efficiencies to gain, leading some to believe the days of the “dumb” factory are numbered. The factory of the future will be smart and run itself with predictive maintenance, yield enhancements, and automated quality testing.
The benefits of AI go beyond manufacturing operations. AI used in manufacturing business processes can have dramatic impacts, too. In fact, McKinsey believes there are huge advantages to an AI-enhanced supply chain:
- Forecasting errors can be reduced by as much as 50 percent
- Lost sales from out-of-stock products can be reduced by up to 65 percent
- Inventory stockpiles can be reduced by 20 percent to 50 percent
In 2005, Ritz-Carlton launched a central system to help deliver flawless and memorable customer service. They named it “Mystique,” and it made staff observations about guests at one hotel available to all of its properties. Mystique solved a problem of information sharing across its 60 hotel properties. When hotel staff learned something new about a guest (like their preference for Diet Coke), they would enter it into Mystique. The company’s goal was to note five preferences about each guest. When the guest returned to any Ritz-Carlton, the hotel staff would satisfy at least three of these preferences.
More than a decade after the launch of “Mystique,” today’s traveler demands this across the hospitality industry. They expect hotels to know everything about them – what they like and what they don’t – and to accommodate those preferences and tailor messages around them. This goes beyond the size of the bed, preferred floor, and feather or foam pillows. Promotional emails no longer have to offer golf packages to those who don’t play golf. Customers want to see offers that appeal to their tastes, not the one-size-fits-all messages of the past. AI is helping do that by tapping into data sources like social media.
Is your company completely ready for AI? Probably not.
CIOs must emerge as the leaders on AI for the entire business. The AI research we commissioned Forrester to execute revealed that there are numerous uncoordinated AI projects happening across the company. Lines-of-business leaders initiated most AI deployment efforts. Often, they engage IT for support. But approximately 15% to 20% of the time, IT is completely in the dark. Why?
There is little doubt that IT is best positioned to lead all AI projects across the company. In fact, involving IT has compounding AI benefits. Companies that involve IT are 3x more likely to adopt machine learning platforms and 2x as likely to adopt deep learning platforms. On the other side of the equation, firms whose lines of business go it alone explore and adopt about half as many AI building blocks.
At least part of the reason why lines of business are bypassing IT is the lack of a modernized data center. IT owns the technology infrastructure, data, and software applications. IT is best positioned to serve as a hub for all AI initiatives: it can connect to outside data sources and interconnect internal data sources across business units. But the reality is that most data centers are not ready for AI initiatives. Respondents stated that some of their most challenging infrastructure issues for AI strategies were around server automation and security. Plus, 61% said they lacked servers with purpose-built processors like GPUs and FPGAs.
How to prepare for AI initiatives in the data center
We asked Forrester to put together a checklist to help CIOs lead AI in their company. It’s a great place to start because it includes strategic, tactical, and practical guidance backed by data. Some of the recommendations focus on the organization. Some focus on modernizing the infrastructure.
Modernized infrastructure to support AI usually starts with new servers. It’s critical that these new servers support GPUs and FPGAs. CPUs are great for serial processing; GPUs and FPGAs are great for parallel processing. When computational tasks can be performed in parallel, the server offloads them to the GPU or FPGA. This frees up the CPU and is the key to cutting learning times from days and weeks down to minutes and hours.
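The offload pattern described here can be sketched in miniature with Python’s standard library: a data-parallel task is split into independent chunks and dispatched to a pool of workers, much as a server hands such work to a GPU or FPGA. This is only an illustrative analogy using thread workers (the function names are hypothetical, not vendor tooling):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_dot(args):
    # Each chunk's partial dot product is independent of the others,
    # which is what makes the work parallelizable.
    xs, ys = args
    return sum(x * y for x, y in zip(xs, ys))

def parallel_dot(xs, ys, workers=4):
    # Split the vectors into independent chunks -- the "offload" step.
    size = max(1, len(xs) // workers)
    chunks = [(xs[i:i + size], ys[i:i + size])
              for i in range(0, len(xs), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # The main thread (the "CPU") is free while workers compute;
        # the partial results are combined at the end.
        return sum(pool.map(chunk_dot, chunks))

xs = list(range(1_000))
ys = list(range(1_000))
print(parallel_dot(xs, ys) == sum(x * y for x, y in zip(xs, ys)))  # True
```

Note that in CPython, threads demonstrate the dispatch-and-combine pattern but the interpreter lock prevents true CPU parallelism for pure-Python math; real AI workloads get their speedup from GPU kernels or other dedicated parallel hardware, which is exactly why accelerator support matters.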
Just a few short years ago, servers with specialized parallel processing capabilities were limited. These server platforms were expensive and so were the GPUs they needed. Today, things are different. The Dell EMC PowerEdge portfolio is packed with servers that are purpose-built to handle AI and machine learning. Late last year, we rolled out the PowerEdge C4140, which is an ultra-dense, accelerator optimized platform that can support 2 CPUs and 2 GPUs in a 1U space. Today, we further extend our commitment to AI with the announcement of two new 4-socket servers.
- The PowerEdge R840 is a dense, 2U platform with support for up to 4 Intel CPUs and up to 2 GPUs or up to 2 FPGAs. It’s geared to turbocharge data analytics with its flexible performance and capacity options including a 24 NVMe drive configuration and significant storage space.
- The PowerEdge R940xa is a 4U platform built for extreme acceleration. It supports a 1:1 CPU to GPU ratio with up to 4 Intel CPUs and up to 4 GPUs or up to 8 FPGAs. Large internal storage (with up to 32 drives) provides an alternative to rising cloud fees and security risks.