shop. That way, the attendants can offer
the right product for each
shopper. Facial recognition is useful for more than customer identification: the retail chain also uses Alipay's “Smile to Pay” facial recognition payment system for store checkout.
Facial recognition technology is now capable of detecting people's
feelings, too. AI algorithms can infer emotions by analyzing human
facial expressions in images, recorded videos, and live cameras. The
capability lets marketers understand how customers respond to their products and campaigns without the presence of a human observer.
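The inference step can be pictured as a simple scoring exercise: a face tracker reports the intensities of facial action units (individual muscle movements), and each emotion is scored as a weighted sum of those intensities. The action units, emotions, and weights below are purely illustrative assumptions, not any vendor's actual model.

```python
# Toy sketch: inferring an emotion label from facial action-unit
# intensities in [0, 1]. All names and weights are illustrative only.

ACTION_UNITS = ["brow_raise", "brow_furrow", "lip_corner_pull", "lip_press"]

# Hypothetical weights mapping action-unit intensities to emotion scores.
EMOTION_WEIGHTS = {
    "joy":      {"lip_corner_pull": 1.0, "brow_raise": 0.2},
    "surprise": {"brow_raise": 1.0},
    "anger":    {"brow_furrow": 1.0, "lip_press": 0.6},
}

def infer_emotion(intensities):
    """Return (best_label, all_scores) for a dict of action-unit intensities."""
    scores = {
        emotion: sum(weights.get(au, 0.0) * intensities.get(au, 0.0)
                     for au in ACTION_UNITS)
        for emotion, weights in EMOTION_WEIGHTS.items()
    }
    return max(scores, key=scores.get), scores

# A strong smile with slightly raised brows scores highest on "joy".
label, scores = infer_emotion({"lip_corner_pull": 0.9, "brow_raise": 0.3})
```

Production systems replace the hand-set weights with learned models, but the shape of the problem — facial measurements in, emotion scores out — is the same.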
Thus, emotion detection is used for product concept and ad testing in
online interviews and focus groups. Respondents who share access to
their webcams are asked to watch a picture
or a video and have their
facial reactions analyzed. For instance, Kellogg's used facial expression analysis from Affectiva to develop ads for Crunchy Nut, tracking viewers' amusement and engagement during the first viewing and on repeat viewings of the commercials.
Disney experimented with emotion detection by installing cameras
in cinemas showing its movies. Tracking millions of facial
expressions throughout the film,
Disney can learn how much
moviegoers enjoy every scene. These insights can help improve the filmmaking of future projects.
Because it works in real time, the same technology can be used to deliver responsive content that adapts to the audience's reactions. An obvious use case is dynamic ads on out-of-home (OOH) billboards. In the United Kingdom, the outdoor advertising company Ocean Outdoor installed billboards with cameras that detect the audience's mood, age, and gender to deliver targeted ads.
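In principle, the targeting logic behind such a billboard can be as simple as a rule table keyed on the detected attributes. The ad catalog, mood labels, and rules below are hypothetical, sketched only to show the idea:

```python
# Toy sketch: pick a billboard creative from detected audience attributes.
# The ad catalog and attribute names are hypothetical examples.

ADS = [
    {"name": "coffee",   "moods": {"tired"},            "age_range": (18, 99)},
    {"name": "sneakers", "moods": {"happy", "neutral"}, "age_range": (13, 35)},
    {"name": "default",  "moods": None,                 "age_range": (0, 120)},
]

def pick_ad(mood, age):
    """Return the first creative whose mood and age rules match."""
    for ad in ADS:
        lo, hi = ad["age_range"]
        mood_ok = ad["moods"] is None or mood in ad["moods"]
        if mood_ok and lo <= age <= hi:
            return ad["name"]
    return "default"
```

For example, a tired-looking 30-year-old would be shown the coffee creative, while a happy 25-year-old would see the sneakers ad.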
Another use case in development is for car drivers. A few automakers have begun testing facial recognition technology to enhance the driving experience. Upon recognizing the owner's face, a car can automatically unlock, start, and even play the owner's favorite playlist. And when the technology detects signs of fatigue on the driver's face, it can recommend that the driver take a rest.
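One widely used cue for fatigue is PERCLOS: the fraction of time within a window that the eyes are mostly closed. A toy sketch follows, assuming per-frame eye-openness scores from a face-tracking model; the thresholds are illustrative guesses, not any automaker's actual settings.

```python
# Toy sketch of a PERCLOS-style drowsiness check.
# eye_openness: one score per video frame, 0.0 = closed, 1.0 = wide open.

CLOSED_THRESHOLD = 0.2  # below this, the eye counts as closed (assumption)
ALERT_PERCLOS = 0.3     # warn if eyes are closed >30% of the window (assumption)

def is_drowsy(eye_openness):
    """Return True if the closed-eye fraction exceeds the alert level."""
    closed = sum(1 for o in eye_openness if o < CLOSED_THRESHOLD)
    return closed / len(eye_openness) > ALERT_PERCLOS
```

A real system would compute this over a sliding window of several seconds and combine it with other cues such as yawning or head pose.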
A related technology is the eye-tracking sensor. With it, companies can understand where a viewer focuses attention based on eye movements, for example, when viewing an ad or a video.
Marketers can essentially create a heatmap and learn which specific
areas of the ad generate more excitement and engagement. Palace Resorts utilized eye tracking in one of its marketing campaigns. The hospitality company created a microsite where visitors could take a video quiz and consent to the use of eye-tracking technology via their webcams. Visitors were asked to choose between pairs of videos featuring various combinations of holiday elements. Based on the direction of their gaze, the site recommended the company's resort that best fit the visitor's interests.
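The heatmap idea can be sketched by binning gaze fixations into a coarse grid over the creative. A minimal sketch, assuming the eye tracker reports gaze coordinates normalized to [0, 1) across the ad's width and height:

```python
# Toy sketch: turn recorded gaze points into a coarse attention heatmap.

def gaze_heatmap(points, grid=3):
    """Count gaze fixations per cell of a grid x grid overlay on the ad."""
    heat = [[0] * grid for _ in range(grid)]
    for x, y in points:
        col = min(int(x * grid), grid - 1)  # clamp x == 1.0 to the last cell
        row = min(int(y * grid), grid - 1)
        heat[row][col] += 1
    return heat

# Fixations clustered on the top-right of the ad (e.g. the product shot),
# with one stray glance at the bottom-left.
fixations = [(0.8, 0.1), (0.9, 0.2), (0.85, 0.15), (0.2, 0.7)]
heat = gaze_heatmap(fixations)
```

Real tools smooth the counts into a continuous color overlay, but the underlying data is the same: where the eyes landed, and how often.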
Voice is another way to recognize humans and trigger contextual
actions. AI can analyze the properties of vocal speech—speed, brief pauses, and tone—and detect the emotions embedded in it. The health
insurance company Humana uses voice analysis from Cogito in its
call centers to understand a caller's feelings and recommend a
conversational technique to the call center agent.
When the caller sounds annoyed, for example, the AI engine alerts the agent to change the approach, essentially coaching agents to build a better connection with callers in real time.
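Two of the vocal properties mentioned above, pacing and pauses, can be estimated from something as simple as per-frame loudness. A toy sketch follows; the frame length and silence threshold are illustrative assumptions, not Cogito's actual method.

```python
# Toy sketch: estimate pause behavior from per-frame loudness values
# (one value per 10 ms audio frame, normalized to [0, 1]).

FRAME_MS = 10
SILENCE_THRESHOLD = 0.1  # frames below this count as pauses (assumption)

def speech_stats(frame_loudness):
    """Return (pause_ratio, longest_pause_ms) for a loudness sequence."""
    pause_frames = [loud < SILENCE_THRESHOLD for loud in frame_loudness]
    pause_ratio = sum(pause_frames) / len(pause_frames)
    longest = run = 0
    for is_pause in pause_frames:
        run = run + 1 if is_pause else 0
        longest = max(longest, run)
    return pause_ratio, longest * FRAME_MS

ratio, longest_ms = speech_stats([0.5, 0.6, 0.05, 0.04, 0.03, 0.7, 0.02, 0.6])
```

A rising pause ratio or unusually long pauses mid-call could be one of many signals feeding an "annoyed caller" alert.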
British Airways has also experimented with understanding its passengers’ mood onboard the aircraft. It launched the “happiness blanket,”
which can change color based on a passenger's state of mind. The
blanket came with a headband that monitors brain waves and
determines if a passenger is anxious or relaxed. The experiment
helped the airline understand changes in mood across the customer
journey: when watching in-flight entertainment, during meal service,
or when sleeping.
Most importantly, the technology allows flight
attendants to quickly identify which passengers are unhappy and
make them feel more comfortable.
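A crude sketch of the relaxed-versus-anxious call: EEG analysis often compares power in the alpha band (roughly 8–12 Hz, associated with relaxation) against the beta band (13–30 Hz, associated with alertness and tension). The ratio threshold below is an illustrative guess, not the headband's actual logic.

```python
# Toy sketch: classify a passenger's state from EEG band power.
# alpha_power and beta_power are assumed to come from upstream
# spectral analysis of the headband signal.

RELAXED_RATIO = 1.0  # illustrative threshold, not a calibrated value

def mood_from_bands(alpha_power, beta_power):
    """Label the state 'relaxed' when alpha dominates beta."""
    return "relaxed" if alpha_power / beta_power > RELAXED_RATIO else "anxious"
```

The blanket then only has to map the two labels to two colors for the crew to read at a glance.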
Mood detection from facial expressions, eye movements, voice, and
neuro-signals is not yet mainstream in marketing applications. But it
will be key to the future of contextual marketing. Beyond basic demographic profiles, it is critical to understand the customer's state of mind.