Wherever we go, we leave traces. Today’s technology allows us to capture those traces and analyze them. How does a visitor actually use your website? How long do they stay? What is the average amount of interaction a tweet from the company account gets? Do people behave differently at different times of the day or across different regions?
With the wealth of data available on each individual, customer knowledge can be more accurate than ever, leading to the widely desired ‘360° view’. This paves the way for customer segmentation to become extremely refined – so refined that we can act on a segment of one.
The Potential of Hyper-Targeting
Increased granularity of the customer typology can lead to more personalized offers and communication, changing the overall customer experience. Hyper-personalization is here and it’s real. A study from Accenture showed that 75% of consumers are more likely to buy if a brand shows it recognizes them as a unique individual. By contrast, a study from Marketo found that 63% of consumers are annoyed when they are pushed only generic ads.
As we can see, hyper-targeting is a tool with great potential for marketing and sales teams. The key issue is determining which correlations are meaningful enough to make a difference. For instance, consider a theoretical group of people that mostly consumes media on the go, interacts most with your channels in the morning and never clicks on banner ads. They may be more receptive to snackable content that pops up among the batch of morning updates.
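A segment like the one above can be expressed as a simple rule over behavioral signals. The sketch below is purely illustrative – the field names and thresholds are assumptions for the sake of the example, not a real product schema:

```python
from dataclasses import dataclass

# Hypothetical behavioral profile; field names and thresholds are
# illustrative assumptions, not a real customer-data schema.
@dataclass
class Visitor:
    mobile_share: float    # fraction of sessions on a mobile device
    morning_share: float   # fraction of interactions before noon
    banner_ctr: float      # click-through rate on banner ads

def matches_morning_mobile_segment(v: Visitor) -> bool:
    """The example segment: consumes media on the go, engages in the
    morning, and never clicks on banner ads."""
    return v.mobile_share > 0.7 and v.morning_share > 0.6 and v.banner_ctr < 0.01

visitors = [
    Visitor(mobile_share=0.9, morning_share=0.8, banner_ctr=0.0),
    Visitor(mobile_share=0.2, morning_share=0.3, banner_ctr=0.05),
]
segment = [v for v in visitors if matches_morning_mobile_segment(v)]
print(len(segment))  # → 1: only the first visitor falls into the segment
```

In practice these rules would be discovered by clustering and correlation analysis rather than hand-written, but the principle is the same: a segment is a predicate over observed behavior.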
Tracing the Customer Journey
Once this more refined customer segmentation is in place and supported by analytics that can pinpoint meaningful correlations in the data, the system will only improve as more data is fed into it and you learn from each action. It will shed light on the customer journey in unprecedented ways, aggregating all these very individual paths into a bigger picture.
Monitoring Big Data from a disparate array of sources helps marketeers and salespeople understand trends among their audiences. Here, it’s important to remember that people base decision-making on emotion as much as they do on cognition. Sentiment analysis can demonstrate the prevalence of emotional factors and indicate when and how people make certain decisions. Is the customer ready to buy? Are they hesitant? What do they need? All of that information is lurking in the data. These data streams can come from social media trends, tonal shifts in online communication, traditional press outlets or IoT data.
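To make the idea concrete, here is a deliberately minimal, lexicon-based sentiment sketch. The word lists are illustrative assumptions; production systems use trained models and far richer lexicons, but the principle – scoring the emotional tone of a text stream – is the same:

```python
# Minimal lexicon-based sentiment sketch. The word lists are illustrative
# assumptions; real sentiment analysis uses trained models.
POSITIVE = {"love", "great", "excellent", "happy", "recommend"}
NEGATIVE = {"hate", "bad", "terrible", "angry", "cancel"}

def sentiment_score(text: str) -> int:
    """Positive score suggests readiness to buy; negative suggests
    hesitation or frustration."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("I love this product great support"))    # → 2
print(sentiment_score("terrible experience want to cancel"))   # → -2
```

Aggregated over thousands of messages, even a crude score like this can surface tonal shifts; the real value comes from correlating those shifts with the decision points in the customer journey.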
The Holy Grail of Prediction
Eventually, combining analytics and insights on your audience and adding AI to the mix generates superior predictive analytics: based on similar circumstances, movements and contexts, it could suggest to your marketing teams which actions and campaigns to prepare, with greater accuracy than the current generation of predictive analytics. Marketing automation company Boomtrain says predictive marketeers are 1.8 times likelier than their traditional peers to exceed their corporate targets.
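At its core, predictive marketing maps behavioral signals to a probability of a future action, such as conversion. The toy logistic model below makes that mapping explicit; the feature names and weights are invented for illustration, whereas in practice they would be learned from historical conversion data:

```python
import math

# Toy predictive lead score. Feature names and weights are illustrative
# assumptions; in practice they are learned from historical data.
WEIGHTS = {"pages_viewed": 0.3, "email_opens": 0.5, "days_since_visit": -0.2}
BIAS = -2.0

def conversion_probability(lead: dict) -> float:
    """Logistic model: map behavioral signals to a 0..1 conversion score."""
    z = BIAS + sum(WEIGHTS[k] * lead.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

engaged = {"pages_viewed": 12, "email_opens": 5, "days_since_visit": 1}
cold = {"pages_viewed": 1, "email_opens": 0, "days_since_visit": 30}
print(conversion_probability(engaged) > conversion_probability(cold))  # → True
```

Ranking leads by such a score is what lets a marketing team focus campaigns on the prospects most likely to act – provided, as the next paragraph stresses, that the underlying data is trustworthy.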
However, for the predictions to work well, the data must be of high quality: it should be possible to trace all data sources and verify their reliability through one digitally transformed IT setup. Marketing organizations should be able to have the details that matter mapped on an individual level to offer a personal and relevant customer outreach program, combined with the right bird’s eye view on the overall market. Having only the detailed views leads to incoherence, while having only the bigger picture blunts the effectiveness of your campaigns.
Data Must Flow
Dell EMC is a pioneer in this field. Naturally, being a technology company, our marketing teams use data lakes – repositories that hold structured, semi-structured and unstructured data in one place – to build comprehensive customer analytics, 360° customer views and hyper-targeted customer engagement programs.
In addition to being built on the right kind of technology, a central tenet in the design of your data lake is the ‘single source of truth’: one overarching repository that all information flows into and is drawn from. However, only when the IT setup is aligned with the business objectives and vice versa can data truly flow through the organization unimpeded. Then, it can be brought together to form the basis of a coherent analysis and action plan.
The Risks of Hyper-Targeting
While a deep customer profile and refined segmentation are a blessing for marketing teams, the customer may be more cautious. Privacy concerns are high on the agenda in technology environments, and with the European Union’s General Data Protection Regulation (GDPR) entering into full force in May 2018, these concerns will only become more pressing.
As such, transformation initiatives that want to take advantage of Big Data are not just about technology and processes, but also about culture. Research from marketing agency Vieo Design indicates that 79% of consumers feel like bad ads “stalk” them – while 83% agree that “not all ads are bad”. A simple check Vieo proposes before launching a targeted campaign is this: “Would you do this in person, too?”
In summary, a system that works with Big Data to help generate better lead conversion through hyper-segmentation must meet the following conditions.
- Built on the right technology foundations, with a ‘single source of truth’.
- Able to sort, correlate and aggregate data according to relevance.
- Flexible enough to go from very detailed, individual profiles to the general trends.
- Geared for continuous improvement and AI involvement.
- Mindful of privacy and data protection concerns.
For marketing organizations, investment in the science of marketing is a top-of-mind issue. Another key first step for marketing organizations is to team up with their CIOs to enable innovation. Please share some of your experiences with hyper-targeting – how do you do it? What would you recommend to your peers?