Application Case 7.1
Netflix: Using Big Data to Drive Big Engagement: Unlocking the Power of Analytics to Drive
Content and Consumer Insight
The Problem
In theory, fostering more intimate consumer relationships becomes easier as new sources of
data emerge, data volumes continue their unprecedented growth, and technology becomes
more sophisticated. These developments should enable businesses to do a much better job of
personalizing marketing campaigns and generating precise content recommendations that drive
engagement, adoption, and value for subscribers.
Netflix, an undisputed leader and innovator in the over-the-top (OTT) content space,
understands this context better than most. It has staked its business and its brand on delivering
highly targeted, personalized experiences for every subscriber—and has even begun using its
remarkably detailed insights to change the way it buys, licenses, and develops content, causing
many throughout the Media and Entertainment industries to sit up and take notice.
To support these efforts, Netflix leverages Teradata as a critical component of its data and
analytics platform. More recently, the two companies partnered to transition Netflix to the
Teradata Cloud, which has given Netflix the power and flexibility it needs—and, in turn, the ability to
maintain its focus on the initiatives at the core of its business.
The Netflix story is a model for data-driven, direct-to-consumer, and subscriber-based companies
— and, in fact, for any business that needs engaged audiences to thrive in a rapidly changing
world.
After beginning as a mail-order DVD business, Netflix became the first prominent OTT content
provider and turned the media world on its head; witness recent decisions by other major media
companies to begin delivering OTT content.
One major element in Netflix’s success is the way it relentlessly tweaks its recommendation
engines, constantly adapting to meet each consumer’s preferred style. Most of the company’s
streaming activity emerges from its recommendations, which generate enormous consumer
engagement and loyalty. Every interaction a Netflix subscriber has with the service is based on
meticulously culled and analyzed interactions—no two experiences are the same.
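The case does not disclose how Netflix's recommendation engines actually work. Purely as a generic illustration of the item-based collaborative filtering idea behind many such engines, the sketch below scores unwatched titles by their similarity to what a subscriber has already watched; the viewing matrix and title names are hypothetical.

```python
import numpy as np

# Hypothetical viewing matrix: rows are subscribers, columns are titles.
# 1.0 means the subscriber watched the title; 0.0 means they have not.
titles = ["Title A", "Title B", "Title C", "Title D"]
views = np.array([
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 1.0, 0.0],
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
])

# Cosine similarity between title columns (item-based collaborative filtering).
norms = np.linalg.norm(views, axis=0)
similarity = (views.T @ views) / np.outer(norms, norms)

def recommend(user_row, top_n=2):
    """Score unwatched titles by their similarity to what the user has watched."""
    scores = similarity @ user_row
    scores[user_row > 0] = -np.inf  # never re-recommend a watched title
    best = np.argsort(scores)[::-1][:top_n]
    return [titles[i] for i in best]

print(recommend(views[0]))  # suggests titles similar to this subscriber's history
```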
In addition, as noted above, Netflix has applied its understanding of subscribers and potential
subscribers—as individuals and as groups—to make strategic purchasing, licensing, and content
development decisions. It has created two highly successful dramatic series—House of Cards
and Orange is the New Black—that are informed in part by the company’s extraordinary
understanding of its subscribers.
While those efforts and the business minds that drive them make up the heart of the company’s
business, the technology that supports these initiatives must be more powerful and reliable
than that of its competitors. The data and analytics platform must be able to:
• Rapidly and reliably handle staggering workloads; it must support insightful analysis of billions
of transactional events each day—every search, browse, stop, and start—in whatever data
format the events are recorded.
• Work with a variety of analytics approaches, including neural networks, Python, Pig, as well as
varied Business Intelligence tools, like MicroStrategy.
• Easily scale and contract as necessary with exceptional elasticity.
• Provide a safe and redundant repository for all of the company’s data.
• Fit within the company’s cost structure and desired profit margins.
With these considerations in mind, Netflix and Teradata teamed up to launch a successful
venture to bring Netflix’s Teradata Data Warehouse into the cloud.
Hybrid Analytical Ecosystems and a Unified Data Architecture
Netflix’s reliance on a hybrid analytical ecosystem that leverages Hadoop where appropriate but
refuses to compromise on speed and agility was the perfect fit for Teradata.
Netflix’s cloud environment relies on a Teradata-Hadoop connector that enables Netflix to
seamlessly move cloud-based data from another provider into the Teradata Cloud. The result is
that Netflix can do much of its analytics off a world-class data warehouse in the Teradata Cloud
that offers peace-of-mind redundancy, the ability to expand and contract in response to
changing business conditions, and a significantly reduced need for data movement. And,
Netflix’s no-holds-barred approach to allowing its analysts to use whatever analytical tools fit
the bill demanded a unique analytics platform that could accommodate them. Having a partner
that works efficiently with the full complement of analytical applications—both its own and
other leading software providers—was critical.
Teradata’s Unified Data Architecture (UDA) helps provide this by recognizing that most
companies need a safe, cost-effective collection of services, platforms, applications, and tools
for smarter data management, processing, and analytics. In turn, organizations can get the most
from all their data. The Teradata UDA includes:
• A powerful discovery platform that offers companies discovery analytics, rapidly unlocking
insights from all available data using a variety of techniques accessible to mainstream business
analysts.
• A data platform (e.g., Hadoop) that provides the means to economically gather, store, and refine all
of a company’s data and facilitate a type of discovery never before believed possible.
Netflix scrupulously adheres to a few simple and powerful metrics when evaluating the success
of its personalization capabilities: eyeballs. Are subscribers watching? Are they watching more?
Are they watching more of what interests them?
With engagement always top of mind, it’s no surprise that Netflix is among the world’s leaders
in personalizing content to successfully attract and retain profitable consumers. It has achieved
this standing by drawing on its understanding that in a rapidly changing business and technology
landscape, one key to success is constantly testing new ways of gathering and analyzing data to
deliver the most effective and targeted recommendations. Working with technology partners
that make such testing possible frees Netflix to focus on its core business. Moving ahead, Netflix
believes that making increased use of cloud-based technology will further empower its
customer engagement initiatives. By relying on technology partners that understand how to
tailor solutions and provide peace of mind about the redundancy of Netflix’s data, the company
expects to continue its organic growth and expand its capacity to respond nimbly to
technological change and the inevitable ebbs and flows of business.
1. What does Netflix do? How did it evolve into its current business model?
AMC Networks Is Using Analytics to Capture New Viewers, Predict Ratings, and Add Value for
Advertisers in a Multichannel World
Over the past 10 years, the cable television sector in the United States has enjoyed a period of
growth that has enabled unprecedented creativity in the creation of high-quality content. AMC
Networks has been at the forefront of this new golden age of television, producing a string of
successful, critically acclaimed shows such as Breaking Bad, Mad Men, and The Walking Dead.
Dedicated to producing quality programming and movie content for more than 30 years, AMC
Networks owns and operates several of the most popular and award-winning brands in cable
television, producing and delivering distinctive, compelling, and culturally relevant content that
engages audiences across multiple platforms.
Despite its success, AMC Networks has no plans to rest on its laurels. As Vitaly Tsivin, SVP
Business Intelligence, explains:
We have no interest in standing still. Although a large percentage of our business is still linear
cable TV, we need to appeal to a new generation of millennials who consume content in very
different ways.
TV has evolved into a multichannel, multistream business, and cable networks need to get
smarter about how they market to and connect with audiences across all of those streams.
Relying on traditional ratings data and third-party analytics providers is going to be a losing
strategy: you need to take ownership of your data, and use it to get a richer picture of who your
viewers are, what they want, and how you can keep their attention in an increasingly crowded
entertainment marketplace.
The challenge is that there is just so much information available—hundreds of billions of rows of
data from industry data-providers such as Nielsen and comScore, from channels such as AMC’s
TV Everywhere live Web streaming and video-on-demand service, from retail partners such as
iTunes and Amazon, and from third-party online video services such as Netflix and Hulu.
“We can’t rely on high-level summaries; we need to be able to analyze both
structured and unstructured data, minute-by-minute and viewer-by-viewer,” says Tsivin. “We
need to know who’s watching and why—and we need to know it quickly so that we can decide,
for example, whether to run an ad or a promo in a particular slot during tomorrow night’s
episode of Mad Men.”
AMC decided it needed to develop an industry-leading analytics capability in-house and focused
on delivering this capability as quickly as possible. Instead of conducting a prolonged and
expensive vendor and product selection process, AMC decided to leverage its existing
relationship with IBM as its trusted strategic technology partner. The time and money
traditionally spent on procurement were instead invested in realizing the solution, accelerating
AMC’s progress on its analytics roadmap by at least six months.
In the past, AMC’s research team spent a large portion of its time processing data. Today, thanks
to its new analytics tools, it is able to focus most of its energy on gaining actionable insights.
“By investing in big data analytics technology from IBM, we’ve been able to increase the pace
and detail of our research by an order of magnitude,” says Tsivin. “Analyses that used to take days
and weeks are now possible in minutes, or even seconds.” He added,
Bringing analytics in-house will provide major ongoing cost-savings. Instead of paying hundreds
of thousands of dollars to external vendors when we need some analysis done, we can do it
ourselves—more quickly, more accurately, and much more cost-effectively. We’re expecting to
see a rapid return on investment.
As more sources of potential insight become
available and analytics becomes more strategic to the business, an in-house approach is really
the only viable way forward for any network that truly wants to gain competitive advantage
from its data.
Many of the results delivered by this new analytics capability demonstrate a real transformation
in the way AMC operates. For example, the company’s business intelligence department has
been able to create sophisticated statistical models that help the company refine its marketing
strategies and make smarter decisions about how intensively it should promote each show.
With deeper insight into viewership, AMC’s direct marketing campaigns are also much more
successful than before. In one recent example, intelligent segmentation and look-alike modeling
helped the company target new and existing viewers so effectively that AMC video-on-demand
transactions were higher than would be expected otherwise.
This newfound ability to reach out to new viewers based on their individual needs and
preferences is not just valuable for AMC; it also has huge potential value for the company’s
advertising partners. AMC is currently working on providing access to its rich data sets and
analytics tools as a service for advertisers, helping them fine-tune their campaigns to appeal to
ever-larger audiences across both linear and digital channels.
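The case does not describe the internals of AMC's segmentation and look-alike models described above. As a rough sketch of the general look-alike idea, the example below trains a classifier on viewers the network already knows and scores prospects by how closely they resemble them; the features, data, and use of scikit-learn (rather than the IBM tooling named in the case) are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical audience features: [hours viewed per week, age, video-on-demand purchases]
known_audience = np.array([
    [12.0, 34, 5], [9.5, 29, 3], [15.0, 41, 7],   # confirmed viewers of the show
    [1.0, 52, 0], [0.5, 23, 1], [2.0, 60, 0],     # non-viewers
])
is_viewer = np.array([1, 1, 1, 0, 0, 0])

# Train a simple look-alike model on the known audience...
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(known_audience, is_viewer)

# ...then score prospects by how closely they resemble existing viewers.
prospects = np.array([[11.0, 38, 4], [1.5, 48, 0]])
lookalike_scores = model.predict_proba(prospects)[:, 1]
print(lookalike_scores)  # higher score = better target for the campaign
```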
Tsivin concludes, “Now that we can really harness the value of big data, we can build a much
more attractive proposition for both consumers and advertisers—creating even better content,
marketing it more effectively, and helping it reach a wider audience by taking full advantage of
our multichannel capabilities.”
1. What are the common challenges that broadcasting companies are facing today?
2. How can analytics help to alleviate these challenges?
3. How did AMC leverage analytics to enhance its business performance?
4. What were the types of text analytics and text mining solutions developed by AMC Networks? Can you think of other potential uses of text mining applications in the broadcasting industry?
Techniques for successfully detecting deception—that is, lies—have wide applicability. Law
enforcement can use decision support tools and techniques to investigate crimes, conduct
security screening in airports, and monitor communications of suspected terrorists. Human
resources professionals might use deception-detection tools to screen applicants. These tools
and techniques also have the potential to screen
e-mails to uncover fraud or other wrongdoings committed by corporate officers. Although some
people believe that they can readily identify those who are not being truthful, a summary of
deception research showed that, on average, people are only 54 percent accurate in making
veracity determinations (Bond & DePaulo, 2006). This figure may actually be worse when
humans try to detect deception in text.
Using a combination of text mining and data mining techniques, Fuller et al. (2008) analyzed
person-of-interest statements completed by people involved in crimes on military bases. In
these statements, suspects and witnesses are required to write their recollection of the event in
their own words. Military law enforcement personnel searched archival data for statements that
they could conclusively identify as being truthful or deceptive. These decisions were made on
the basis of corroborating evidence and case resolution. Once labeled as truthful or deceptive,
the law enforcement personnel removed identifying information and gave the statements to the
research team. In total, 371 usable statements were received for analysis. The text-based
deception-detection method used by Fuller et al. was based on a process known as message
feature mining, which relies on elements of data and text mining techniques. A simplified
depiction of the process is provided in Figure 7.3.
First, the researchers prepared the data for processing. The original handwritten statements had
to be transcribed into a word processing file. Second, features (i.e., cues) were identified. The
researchers identified 31 features representing categories or types of language that are
relatively independent of the text content and that can be readily analyzed by automated
means. For example, first-person pronouns such as I or me can be identified without analysis of
the surrounding text. Table 7.1 lists the categories and examples of features used in this study.
The features were extracted from the textual statements and input into a flat file for further
processing. Using several feature-selection methods along with 10-fold cross-validation, the
researchers compared the prediction accuracy of three popular data mining methods. Their
results indicated that neural network models performed the best, with 73.46 percent prediction
accuracy on test data samples; decision trees performed second best, with 71.60 percent
accuracy; and logistic regression was last, with 65.28 percent accuracy.
The results indicate that automated text-based deception detection has the potential to aid
those who must try to detect lies in text and can be successfully applied to real-world data. The
accuracy of these techniques exceeded the accuracy of most other deception-detection
techniques, even though it was limited to textual cues.
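The following is a minimal sketch of the message-feature-mining pipeline described above: extract simple, content-independent language cues from each statement, then compare the three model families with 10-fold cross-validation. It uses a handful of illustrative cues and scikit-learn in place of the study's actual 31 features and tools, and the placeholder statements stand in for the 371 labeled statements.

```python
import re
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

FIRST_PERSON = {"i", "me", "my", "mine", "we", "us", "our"}

def extract_cues(statement):
    """Turn one free-text statement into a small vector of language cues."""
    words = re.findall(r"[a-z']+", statement.lower())
    n = max(len(words), 1)
    return [
        len(words),                                  # statement length
        sum(w in FIRST_PERSON for w in words) / n,   # first-person pronoun rate
        sum(len(w) for w in words) / n,              # average word length
        statement.count(","),                        # crude sentence-complexity cue
    ]

# Placeholder corpus; the study used 371 person-of-interest statements labeled
# as truthful (0) or deceptive (1) by military law enforcement.
truthful = ["The car was parked by the barracks and I walked straight to the gate."] * 20
deceptive = ["I think someone else must have taken it, I was never near my locker."] * 20
X = np.array([extract_cues(s) for s in truthful + deceptive])
y = np.array([0] * 20 + [1] * 20)

# Compare the three model families from the study using 10-fold cross-validation.
models = {
    "neural network": MLPClassifier(max_iter=2000, random_state=0),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    accuracy = cross_val_score(model, X, y, cv=10).mean()
    print(f"{name}: {accuracy:.3f} mean accuracy")
```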
3. What do you think are the main challenges for such an automated system?
From ticket sales to starting lineups, the Orlando Magic have come a long way since their
inaugural season in 1989. There weren’t many wins in those early years, but the franchise has
weathered the ups and downs to compete at the highest levels of the NBA.
Smaller-market teams often struggle to build a big enough revenue base to compete against their larger market rivals.
By using SAS® Analytics and SAS® Data Management, the Orlando Magic are among the top
revenue earners in the NBA, despite being in the 20th-largest market.
The Magic accomplish this feat by studying the resale ticket market to price tickets better, to
predict season ticket holders at risk of defection (and lure them back), and to analyze concession
and product merchandise sales to make sure the organization has what the fans want every
time they enter the arena. The club has even used SAS to help coaches put together the best
lineup.
“Our biggest challenge is to customize the fan experience, and SAS helps us manage all that in a
robust way,” says Alex Martins, CEO of the Orlando Magic. Having been with the Magic since the
beginning (working his way up from PR Director to President to CEO), Martins has seen it all and
knows the value that analytics adds. Under Martins’ leadership, the season-ticket base has
grown as large as 14,200, and the corporate sales department has seen tremendous growth.
But like all professional sports teams, the Magic are constantly looking for new strategies that
will keep the seats filled at each of the 41 yearly home games. “Generating new revenue
streams in this day of escalating player salaries and escalating expenses is important,” says
Anthony Perez, vice president of Business Strategy. But with the advent of a robust online
secondary market for tickets, reaching the industry benchmark of 90 percent renewal of season
tickets has become more difficult.
“In the first year, we saw ticket revenue increase around 50 percent. Over the last three years—
for that period, we’ve seen it grow maybe 75 percent. It’s had a huge impact” said Anthony
Perez, vice president of Business Strategy, Orlando Magic.
Perez’s group takes a holistic approach by combining data from all revenue streams (concession,
merchandise, and ticket sales) with outside data (secondary ticket market) to develop models
that benefit the whole enterprise. “We’re like an in-house consulting group,” explains Perez.
In the case of season ticket holders, the team uses historical purchasing data and renewal
patterns to build decision tree models that place subscribers into three categories: most likely to
renew, least likely to renew, and fence sitters. The fence sitters then get the customer service
department’s attention come renewal time.
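The Magic built these models in SAS Enterprise Miner; the sketch below illustrates the same idea with a scikit-learn decision tree, bucketing accounts into the three categories by predicted renewal probability. The account features, data, and probability thresholds are assumptions for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical account history: [seasons held, share of games attended, share of unused tickets resold]
accounts = np.array([
    [8, 0.90, 0.05], [6, 0.85, 0.10], [1, 0.40, 0.60],
    [2, 0.55, 0.50], [5, 0.70, 0.20], [1, 0.30, 0.80],
])
renewed = np.array([1, 1, 0, 0, 1, 0])  # did the account renew last season?

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(accounts, renewed)

def renewal_bucket(account):
    """Place an account into one of the three categories by predicted renewal probability."""
    p = tree.predict_proba(account.reshape(1, -1))[0, 1]
    if p >= 0.75:
        return "most likely to renew"
    if p <= 0.25:
        return "least likely to renew"
    return "fence sitter"  # flagged for customer-service attention at renewal time

print(renewal_bucket(np.array([3, 0.60, 0.35])))
```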
“SAS has helped us grow our business. It is probably one of the greatest investments that we’ve
made as an organization over the last half-dozen years because we can point to top-line revenue
growth that SAS has helped us create through the specific messaging that we’re able to direct to
each one of our client groups.”
When analytics showed the team that 80 percent of revenue was from season ticket holders, it
decided to take a proactive approach to renewals and at-risk accounts. The Magic don’t have a
crystal ball, but they do have SAS® Enterprise Miner™, which allowed them to better understand
their data and develop analytic models that combine three pillars for predicting season ticket
holder renewals:
• Secondary market activity (were the unused tickets successfully sold on secondary sites?).
The data mining tools allowed the team to accomplish more accurate scoring that led to a
difference—and marked improvement—in the way it approached customer retention and
marketing.
Perez likes how easy it is to use SAS—it was a factor in opting to do the work in-house rather
than outsourcing it. Perez’s team has set up recurring processes and automated them. Data
manipulation is minimal, “allowing us more time to interpret rather than just manually
crunching the numbers.” Business users throughout the organization, including executives, have
instant access to information through SAS® Visual Analytics. “It’s not just that we’re using the
tools daily; we are using them throughout the day to make decisions,” Perez says.
Being Data-Driven
“We adopted an analytics approach years ago, and we're seeing it transform our entire
organization,” says Martins. “Analytics helps us understand customers better, helps in business
planning (ticket pricing, etc.), and provides game-to-game and year-to-year data on demand by
game and even by seat.”
“And analytics has helped transform the game. GMs and analytics teams look at every aspect of
the game, including movements of players on the court, to transform data to predict defense
against certain teams. We can now ask ourselves, ‘What are the most efficient lineups in a
game? Which team can produce more points vs. another lineup? Which team is better
defensively than another?’”
“We used to produce a series of reports manually, but now we can do it with five clicks of a
mouse (instead of five hours overnight in anticipation of tomorrow’s game). We can have
dozens of reports available to staff in minutes. Analytics has made us smarter,” says Martins.
What’s Next?
“Getting real-time data is the next step for us in our analytical growth process,” says Martins.
“On a game day, getting real-time data to track what tickets are available and how to maximize
yield of those tickets is critical. Additionally, you're going to see major technological changes and
acceptance of the technology on the bench to see how the games are played moving forward.
Maybe as soon as next season you’ll see our assistant coaches with iPad® tablets getting real-
time data, learning what the opponent is doing and what plays are working. It’ll be necessary in
the future.
“We’re setting ourselves up to be successful moving forward. And in the very near future, we’ll
be in a position again to compete for a conference championship and an NBA championship,”
says Martins. “All of the moves made this year and the ones to come in the future will be done
in order to build success on [and off] the court.”
1. According to the application case, what were the main challenges the Orlando Magic were
facing?
2. How did analytics help the Orlando Magic to overcome some of its most significant challenges
on and off the court?
3. Can you think of other uses of analytics in sports and especially in the case of the Orlando
Magic? You can search the Web to find some answers to this question.
Researchers conducting searches and reviews of relevant literature face an increasingly complex
and voluminous task. In extending the body of relevant knowledge, it has always been
important to work hard to gather, organize, analyze, and assimilate existing information from
the literature, particularly from one’s home discipline. With the increasing abundance of
potentially significant research being reported in related fields, and even in what are
traditionally deemed to be nonrelated fields of study, the researcher’s task is ever more
daunting if a thorough job is desired.
In new streams of research, the researcher’s task can be even more tedious and complex. Trying
to ferret out relevant work that others have reported can be difficult, at best, and perhaps even
nearly impossible if traditional, largely manual reviews of published literature are required. Even
with a legion of dedicated graduate students or helpful colleagues, trying to cover all potentially
relevant published work is problematic.
Many scholarly conferences take place every year. In addition to extending the body of
knowledge of the current focus of a conference, organizers often desire to offer additional
minitracks and workshops. In many cases, these additional events are intended to introduce
attendees to significant streams of research in related fields of study and to try to identify the
“next big thing” in terms of research interests and focus. Identifying reasonable candidate topics
for such minitracks and workshops is often subjective rather than derived objectively from the
existing and emerging research.
In a recent study, Delen and Crossland (2008) proposed a method to greatly assist and enhance
the efforts of the researchers by enabling a semiautomated analysis of large volumes of
published literature through the application of text mining. Using standard digital libraries and
online publication search engines, the authors downloaded and collected all the available
articles for the three major journals in the field of management information systems: MIS
Quarterly (MISQ), Information Systems Research (ISR), and the Journal of Management
Information Systems (JMIS). To maintain the same time interval for all three journals (for
potential comparative longitudinal studies), the journal with the most recent starting date for its
digital publication availability was used as the start time for this study (i.e., JMIS articles have
been digitally available since 1994). For each article, Delen and Crossland extracted the title,
abstract, author list, published keywords, volume, issue number, and year of publication. They
then loaded all the article data into a simple database file. Also included in the combined data
set was a field that designated the journal type of each article for likely discriminatory analysis.
Editorial notes, research notes, and executive overviews were omitted from the collection.
Table 7.2 shows how the data were presented in a tabular format.
In the analysis phase, the researchers chose to use only the abstract of an article as the source
of information extraction. They chose not to include the keywords listed with the publications
for two main reasons: (1) under normal circumstances, the abstract would already include the
listed keywords, and therefore inclusion of the listed keywords for the analysis would mean
repeating the same information and potentially giving them unmerited weight and (2) the listed
keywords could be terms that authors would like their article to be associated with (as opposed
to what is really contained in the article), therefore, potentially introducing unquantifiable bias
to the analysis of the content.
The first exploratory study was to look at the longitudinal perspective of the three journals (i.e.,
evolution of research topics over time). To conduct a longitudinal study, Delen and Crossland
divided the 12-year period (from 1994 to 2005) into four 3-year periods for each of the three
journals. This framework led to 12 text mining experiments with 12 mutually exclusive data sets.
At this point, for each of the 12 data sets, the researchers used text mining to extract the most
descriptive terms from these collections of articles represented by their abstracts. The results
were tabulated and examined for time-varying changes in the terms published in these three
journals.
As a second exploration, using the complete data set (including all three journals and all four
periods), Delen and Crossland conducted a clustering analysis. Clustering is arguably the most
commonly used text mining technique. Clustering was used in this study to identify the natural
groupings of the articles (by putting them into separate clusters) and then to list the most
descriptive terms that characterized those clusters. They used SVD to reduce the dimensionality
of the term-by-document matrix and then an expectation-maximization algorithm to create the
clusters. They conducted several experiments to identify the optimal number of clusters, which
turned out to be nine. After the construction of the nine clusters, they analyzed the content of
those clusters from two perspectives: (1) representation of the journal type (see Figure 7.8a)
and (2) representation of time (Figure 7.8b). The idea was to explore the potential differences
and/or commonalities among the three journals and potential changes in the emphasis on those
clusters; that is, to answer questions such as “Are there clusters that represent different
research themes specific to a single journal?” and “Is there a time-varying characterization of
those clusters?” The researchers discovered and discussed several interesting patterns using
tabular and graphical representation of their findings (for further information, see Delen and
Crossland, 2008).
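A minimal sketch of the clustering step described above, assuming scikit-learn in place of whatever specific software the authors used: build a TF-IDF term-by-document matrix, reduce it with truncated SVD, cluster the reduced vectors with a Gaussian mixture fit by expectation-maximization, and list each cluster's most descriptive terms. The abstracts are placeholders, and the component and cluster counts are illustrative rather than the study's.

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.mixture import GaussianMixture

# Placeholder abstracts; the study used all MISQ, ISR, and JMIS abstracts from 1994 to 2005.
abstracts = [
    "A field study of decision support systems and managerial decision quality.",
    "Trust and adoption in electronic commerce transactions between firms.",
    "Knowledge management systems and organizational learning in practice.",
    "Pricing strategies and consumer trust in online commerce platforms.",
]

# 1) Build the term-by-document matrix.
vectorizer = TfidfVectorizer(stop_words="english")
term_doc = vectorizer.fit_transform(abstracts)

# 2) Reduce dimensionality with SVD (two components here; the study tuned this).
svd = TruncatedSVD(n_components=2, random_state=0)
reduced = svd.fit_transform(term_doc)

# 3) Cluster with a Gaussian mixture, whose fit uses expectation-maximization
#    (the study settled on nine clusters after experimentation; two are used here).
gmm = GaussianMixture(n_components=2, random_state=0)
cluster_ids = gmm.fit_predict(reduced)

# 4) List the most descriptive terms of each cluster by mean TF-IDF weight.
terms = np.array(vectorizer.get_feature_names_out())
dense = term_doc.toarray()
for c in sorted(set(cluster_ids)):
    weights = dense[cluster_ids == c].mean(axis=0)
    top = terms[np.argsort(weights)[::-1][:3]]
    print(f"cluster {c}: {', '.join(top)}")
```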
1. How can text mining be used to ease the insurmountable task of literature review?
2. What are the common outcomes of a text mining project on a specific collection of journal
articles? Can you think of other potential outcomes not mentioned in this case?
Application Case 7.6
Creating a Unique Digital Experience to Capture Moments That Matter at Wimbledon
Known to millions of fans simply as “Wimbledon,” The Championships are the oldest of tennis’s
four Grand Slams, and one of the world’s highest-profile sporting events. Organized by the All
England Lawn Tennis Club (AELTC), it has been a global sporting and cultural institution since
1877.
The organizers of The Championships, Wimbledon, and AELTC have a simple objective: every
year, they want to host the best tennis championships in the world—in every way and by every
metric.
The motivation behind this commitment is not simply pride; it also has a commercial basis.
Wimbledon’s brand is built on its premier status; this is what attracts both fans and partners.
The world’s best media organizations and greatest corporations—IBM included—want to be
associated with Wimbledon precisely because of its reputation for excellence.
For this reason, maintaining the prestige of The Championships is one of AELTC’s top priorities,
but there are only two ways that the organization can directly control how the rest of the world
perceives The Championships.
The first, and most important, is to provide an outstanding experience for the players,
journalists, and spectators who are lucky enough to visit and watch the tennis courtside. AELTC
has vast experience in this area. Since 1877, it has delivered two weeks of memorable, exciting
competition in an idyllic setting: tennis in an English country garden.
The second is The Championships’ online presence, which is delivered via the wimbledon.com
Web site, mobile apps, and social media channels. The constant evolution of these digital
platforms is the result of a 26-year partnership between AELTC and IBM.
He adds, “Digital is different: it’s our platform, where we can speak directly to our fans—so it’s
vital that we give them the best possible experience. No sporting event or media channel has
the right to demand a viewer’s attention, so if we want to strengthen our brand, we need
people to see our digital experience as the number-one place to follow The Championships
online.”
To that end, AELTC set a target of attracting 70 million visits, 20 million unique devices, and 8
million social followers during the two weeks of The Championships in 2015. It was up to IBM
and AELTC to find a way to deliver.
IBM and AELTC embarked on a complete redesign of the digital platform, using their intimate
knowledge of The Championships’ audience to develop an experience tailor-made to attract and
retain tennis fans from across the globe.
“We recognized that while mobile is increasingly important, 80% of our visitors are using
desktop computers to access our Web site,” says Alexandra Willis, head of Digital and Content at
AELTC. She continued.
Our challenge for 2015 was how to update our digital properties to adapt to a mobile-first
world, while still offering the best possible desktop experience. We wanted our new site to take
maximum advantage of that large screen size and give desktop users the richest possible
experience in terms of high-definition visuals and video content—while also reacting and
adapting seamlessly to smaller tablet or mobile formats.
On the mobile side, the team recognized that the wider availability of high bandwidth 4G
connections meant that the mobile Web site would become more popular than ever—and
ensured that it would offer easy access to all rich media content. At the same time, The
Championships’ mobile apps were enhanced with real-time notifications of match scores and
events—and could even greet visitors as they passed through stations on the way to the
grounds.
The team also built a special set of Web sites for the most important tennis fans of all: the
players themselves. Using IBM Bluemix technology, it built a secure Web application that
provided players with a personalized view of their court bookings, transport, and on-court times, as
well as helping them review their performance with access to stats on every match they played.
To supply its digital platforms with the most compelling possible content, the team took
advantage of a unique opportunity: its access to real-time, shot-by-shot data on every match
played during The Championships. Over the course of the Wimbledon fortnight, 48 courtside
experts capture approximately 3.4 million data points, tracking the type of shot, strategies, and
outcome of each and every point.
These data are collected and analyzed in real time to produce statistics for TV commentators
and journalists—and for the digital platform’s own editorial team.
The system monitored streams of data coming in from all 19 courts, and whenever something
significant happened—such as Sam Groth hitting the second-fastest serve in Championships’ history—it
alerted the editorial team instantly. Within seconds, the team could bring that news to the digital
audience and share it on social media to drive even more traffic to the site.
Being able to capture the moments that matter and uncover the compelling narratives within the
data, faster than anyone else, was key. If you wanted to experience the emotions of The
Championships live, the next best thing to being there in person was to follow the action on
wimbledon.com.
Another new capability tried in 2015 was the use of IBM’s NLP technologies to help mine
AELTC’s huge library of tennis history for interesting contextual information. The team trained
IBM Watson™ Engagement Advisor to digest this rich unstructured data set and use it to answer
queries from the press desk.
The same NLP front-end was also connected to a comprehensive structured database of match
statistics, dating back to the first Championships in 1877—providing a one-stop shop for both
basic questions and more complex inquiries.
“The Watson trial showed a huge amount of potential. Next year, as part of our annual
innovation planning process, we will look at how we can use it more widely—ultimately in
pursuit of giving fans more access to this incredibly rich source of tennis knowledge,” says
Desmond.
IBM hosted the whole digital environment in its Hybrid Cloud. IBM used sophisticated modeling
techniques to predict peaks in demand based on the schedule, popularity of each player, time of
day, and many other factors—enabling it to dynamically allocate cloud resources appropriately
to each piece of digital content and ensure a seamless experience for millions of visitors around
the world.
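IBM's actual demand models are not described in the case. As a rough sketch of the idea, the example below fits a regression model to hypothetical historical traffic, using schedule-style features to predict peak load and size the cloud allocation with headroom; the features, figures, per-server capacity, and use of scikit-learn are all assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical history: [player popularity index, hour of day, 1 if a Centre Court match]
history = np.array([
    [95, 14, 1], [80, 16, 1], [40, 12, 0],
    [30, 11, 0], [90, 19, 1], [55, 13, 0],
])
peak_requests_per_min = np.array([120_000, 90_000, 20_000, 15_000, 110_000, 35_000])

model = GradientBoostingRegressor(random_state=0)
model.fit(history, peak_requests_per_min)

# Predict demand for tomorrow's schedule and size the allocation with 30% headroom,
# assuming each server instance handles roughly 10,000 requests per minute.
tomorrow = np.array([[88, 15, 1], [35, 12, 0]])
predicted = model.predict(tomorrow)
servers_needed = np.ceil(predicted * 1.3 / 10_000)
for demand, servers in zip(predicted, servers_needed):
    print(f"predicted {demand:,.0f} req/min -> allocate {int(servers)} servers")
```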
In addition to the powerful private cloud platform that has supported The Championships for
several years, IBM also used a separate SoftLayer cloud to host the Wimbledon Social Command
Centre and provide additional incremental capacity to supplement the main cloud environment
during times of peak demand.
The elasticity of the cloud environment is key because The Championships’ digital platforms
need to be able to scale efficiently by a factor of more than 100 within a matter of days as the
interest builds ahead of the first match on Centre Court.
Online security is a key concern today for all organizations. For major sporting events in
particular, brand reputation is everything—and while the world is watching, it is particularly
important to avoid becoming a high-profile victim of cyber crime. For these reasons, security has
a vital role to play in IBM’s partnership with AELTC.
Over the first five months of 2015, IBM security systems detected a 94 percent increase in
security events on the wimbledon.com infrastructure compared to the same period in 2014.
The success of the new digital platform for 2015— supported by IBM cloud, analytics, mobile,
social, and security technologies—was immediate and complete. Targets for total visits and
unique visitors were not only met but also exceeded. Achieving 71 million visits and 542 million
page views from 21.1 million unique devices demonstrates the platform’s success in attracting a
larger audience than ever before and keeping those viewers engaged throughout The
Championships.
“Overall, we had 13% more visits from 23% more devices than in 2014, and the growth in the
use of wimbledon.com on mobile was even more impressive,” says Willis. “We saw 125%
growth in unique devices on mobile, 98% growth in total visits, and 79% growth in total page
views.”
Desmond concludes, “The results show that in 2015, we won the battle for fans’ hearts and
minds. People may have favorite newspapers and sports Web sites that they visit for 50 weeks of
the year— but for two weeks, they came to us instead.”
He continued, “That’s a testament to the sheer quality of the experience we can provide—
harnessing our unique advantages to bring them closer to the action than any other media
channel. The ability to capture and communicate relevant content in real time helped our fans
experience The Championships more vividly than ever before.”
Background
Founded in 1894, Barbour is an English heritage and lifestyle brand renowned for its waterproof
outerwear—especially its classic waxed-cotton jacket. With more than 10,000 jackets ordered
and handmade each year, Barbour has held a strong position in the luxury goods industry for
more than a century, building a strong relationship with fashion-conscious men and women of
the British countryside. In 2000, Barbour broadened its product offering to include a full lifestyle
range of everyday clothes and accessories. Its major markets are the United Kingdom, the
United States, and Germany; however, Barbour holds a presence in more than 40 countries
worldwide, including Austria, New Zealand, and Japan. Using individualized insights derived with
the services and digital marketing capabilities of Teradata Interactive, Barbour ran a one-month
campaign that generated 49,700 new leads and 450,000 clicks to its Web site.
The Challenge: Taking Ownership of Customer Relationships
Barbour has experienced outstanding, consistent growth throughout its lifetime, and in August 2013,
it launched its first e-commerce site in a bid to gain a stronger online presence. However, being
a late starter in the e-commerce world, it was a challenge for Barbour to establish itself in the
saturated digital arena. Having previously sold its products only through wholesalers and
independent retail resellers, Barbour wanted to take ownership of the end-user relationship,
whole customer journey, and perception of the brand. While the brand is iconic and highly
respected around the world, Barbour was aware of the importance of establishing direct
relationships with its target audience—especially when encouraging users to engage with its
new e-commerce platform. It also understood that it needed to take more control of shaping
the customer journey. That way Barbour could create and maintain the same exceptional level
of quality in the user experience as that applied to the manufacturing of its products. To do this,
the company needed to develop its understanding of its target market’s online behavior. With
the goal of reaching its target audience in order to build meaningful customer relationships,
Barbour approached Teradata. Barbour’s marketing department needed Teradata Interactive to
offer a solution that would increase its knowledge of the unique characteristics and needs of its
individual customers, as well as support the launch of its new UK e-commerce Web site.
The increasing shift to global e-commerce and the growth in digital consumerism require brands
to hold a strong online presence. This also means that retailers have to implement strategies
that support their customers’ evolving wants and needs, online and offline. Barbour and
Teradata Interactive embarked on the design and construction of a Lead Nurture Program that
ran over a one-month period. The campaign objective was to not only raise awareness and
create demand for immediate sales activity but also to create a more long-term engagement
mechanism that would lead to more sales over a sustained period of time. It was clear from the
start that the strong relationship Barbour enjoys with its customers was a crucial factor that set
it apart from its luxury retail competitors. Teradata Interactive was keen to ensure this
relationship was respected through the lead generation process.
The execution of the campaign was unique to Barbour. Typical lead generation campaigns were
often executed as single registration events with a single sales promotion in mind. The data
were usually restricted to just e-mail addresses and basic profile fields, generated without
consideration of the registrant’s personal needs and imported to be used solely for generic
newsletter campaigns. This strategy often missed a huge opportunity for brands when learning
about their prospects, often resulting in poor sales conversions. Teradata Interactive understood
that the true value of lead generation is twofold. First, using the registration event to
gather as much information as possible develops an understanding of future buying intent and the
factors that affect it. Second, by making sure that the collated data are effectively
used to deliver valuable and individualized content, relevant sales opportunities are provided to
the customer when they are next in the market to buy. To make sure this strategy drove long-
term sales, Teradata Interactive built a customer lifecycle program which delivered content over
e-mail and online display.
The nurture program content was integrated with display advertising and encouraged social
media sharing. With Teradata Interactive’s smart tagging of nurture content, Barbour was able
to segment audiences according to their product preferences and launch display re-targeting
banners. Registrants were also invited to share content socially, which enabled Teradata
Interactive to identify “social propensity” and segment users for future loyalty schemes and
“Tell-a-Friend” activities. In addition to the focus of increasing Barbour’s newsletter base,
Teradata conducted a data audit to analyze all of the data collected and better understand what
factors would influence user engagement behavior.
Results
The strong collaboration between Teradata and Barbour meant that over the one-month
campaign period, Barbour was able to create new and innovative ways of communicating with
its customers. More than 49,700 leads were collected within the UK and DACH regions, and the
lead generation program showed open rates of up to 60 percent and click-through-rates of
between 4 and 11 percent. The campaign also generated 450,000+ clicks to Barbour’s Web site
and was so popular with fashion bloggers and national press that it was featured as a story in
The Daily Mirror. Though the campaign was only a month long, a key focus was to help shape
Barbour’s future marketing strategy. A preference center survey was incorporated into the
campaign design, which resulted in a 65 percent incentivized completion rate. User data
included:
• Device engagement
This deep level of insight has given Barbour a far greater capability to deliver personalized
content and offers to its user base.
1. What does Barbour do? What was the challenge Barbour was facing?
Founded nearly two decades ago, Tito’s credits the advent of social media with playing an
integral role in engaging fans and raising brand awareness. In an interview with Entrepreneur,
founder Bert “Tito” Beveridge credited social media for enabling Tito’s to compete for shelf
space with more established liquor brands. “Social media is a great platform for a word-of-mouth
brand, because it’s not just about who has the biggest megaphone,” Beveridge told
Entrepreneur.
As Tito’s has matured, the social team has remained true to the brand’s founding values and
actively uses Twitter and Instagram to have one-on-one conversations and connect with brand
enthusiasts. “We never viewed social media as another way to advertise,” said Katy Gelhausen,
Web & social media coordinator. “We’re on social so our customers can talk to us.”
To that end, Tito’s uses Sprout Social to understand the industry atmosphere, develop a
consistent social brand, and create a dialogue with its audience. As a result, Tito’s recently
organically grew its Twitter and Instagram communities by 43.5 percent and 12.6 percent,
respectively, within four months.
Tito’s quarterly cocktail program is a key part of the brand’s integrated marketing strategy. Each
quarter, a cocktail recipe is developed and distributed through Tito’s online and offline
marketing initiatives.
It is important for Tito’s to ensure that the recipe is aligned with the brand’s focus as well as the
larger industry direction. Therefore, Gelhausen uses Sprout’s Brand Keywords to monitor
industry trends and cocktail flavor profiles. “Sprout has been a really important tool for social
monitoring. The Inbox is a nice way to keep on top of hashtags and see general trends in one
stream,” she said.
The information learned is presented to Tito’s in-house mixology team and used to ensure that
the same quarterly recipe is communicated to the brand’s sales team and across marketing
channels. “Whether you’re drinking Tito’s at a bar, buying it from a liquor store or following us
on social media you’re getting the same quarterly cocktail,” said Gelhausen.
The program ensures that, at every consumer touch point, a person receives a consistent brand
experience—and that consistency is vital. In fact, according to an Infosys study on the
omnichannel shopping experience, 34 percent of consumers attribute cross-channel consistency
as a reason they spend more on a brand. Meanwhile, 39 percent cite inconsistency as reason
enough to spend less.
At Tito’s, gathering industry insights starts with social monitoring on Twitter and Instagram
through Sprout. But the brand’s social strategy does not stop there. Staying true to its roots,
Tito’s uses the platform on a daily basis to authentically connect with customers.
Sprout’s Smart Inbox displays Tito’s Twitter and Instagram accounts in a single, cohesive feed.
This helps Gelhausen manage inbound messages and quickly identify which require a response.
“Sprout allows us to stay on top of the conversations we’re having with our followers. I love how
you can easily interact with content from multiple accounts in one place,” she said.
Tito’s approach to Twitter is simple: engage in personal, one-on-one conversations with fans.
Dialogue is a driving force for the brand, and over the course of four months, 88 percent of
Tweets sent were replies to inbound messages.
Using Twitter as an open line of communication between Tito’s and its fans resulted in a 162.2
percent increase in engagement and a 43.5 percent gain in followers. Even more impressively,
Tito’s ended the quarter with 538,306 organic impressions—an 81 percent rise. A similar
strategy is applied to Instagram, which Tito’s uses to strengthen and foster a relationship with
fans by publishing photos and videos of new recipe ideas, brand events, and initiatives.
Using the Instagram Profiles Report, Tito’s has been able to measure the impact of its Instagram
marketing strategy and revise its approach accordingly. By utilizing the network as another way
to engage with fans, the brand has steadily grown its organic audience. In four months,
@TitosVodka saw a 12.6 percent rise in followers and a 37.1 percent increase in engagement.
On average, each piece of published content gained 534 interactions, and mentions of the
brand’s hashtag, #titoshandmadevodka, grew by 33 percent.
Social is an ongoing investment in time and attention. Tito’s will continue the momentum the
brand experienced by segmenting each quarter into its own campaign. “We’re always getting
smarter with our social strategies and making sure that what we’re posting is relevant and
resonates,” said Gelhausen. Using social to connect with fans in a consistent, genuine, and
memorable way will remain a cornerstone of the brand’s digital marketing efforts.
Using Sprout’s suite of social media management tools, Tito’s will continue to foster a
community of loyalists.
1. How can social media analytics be used in the consumer products industry?
2. What do you think are the key challenges, potential solutions, and probable results in
applying social media analytics in consumer products and services firms?