For decades, sci-fi has promised tech that would converse with us naturally. From HAL in the 1968 classic 2001: A Space Odyssey to Marvel Comics’ super-villain Ultron, we’ve long imagined the day when computers would be truly fluent in human language. That future is slowly becoming reality with Siri, Alexa, and other voice assistants.

But how do we understand their impact? We measure, of course. That’s the new frontier of analytics, and Adobe is stepping into it. What does it all mean – in general, and for pharma marketers in particular? Let’s discuss …


Systems that use natural language have been held up as the apex of computing possibility since Alan Turing proposed the “imitation game” – now known as the Turing Test – in 1950: could a computer converse well enough to fool a human into thinking it was another human?

Today, as more and more of our world connects to the Internet, we can do many things using natural verbal interactions: control the refrigerator, music, television, lights, and heat at home; manage our schedules; set reminders; answer questions; and more. These assistants can’t quite pass the Turing Test yet, but they’re getting close.

With the rise in popularity of personal digital voice assistants such as Apple Siri, Amazon Alexa, Google Assistant, and Microsoft Cortana, brands looking to use them to create value for customers want to know how people actually use them.

To answer this, Adobe recently announced the addition of voice analytics to its services for digital marketers and advertisers. Adobe’s machine-learning and artificial-intelligence capabilities can now be used to better understand how people interact with voice assistants.

Skill-fully Crafted

Before going into how Adobe does the tracking, and how it can be used, it’s helpful to understand the basics of how voice assistants work. Their work is made up of “skills,” which have two parts: “intents” (or “actions”) and “entities” (or “parameters”). In a weather-forecast skill, like “Siri, tell me the weather in Kansas City,” “tell me the weather” is the intent or action, and “in Kansas City” is the entity or parameter. Likewise, “Alexa, play songs by the Beatles” has the action “play songs” and the parameter “by the Beatles.”
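To make the intent/entity split concrete, here’s a minimal sketch in Python. It’s purely illustrative: real platforms use trained natural-language models rather than regular expressions, and the intent names here are invented for the example.

```python
import re

# Hypothetical, simplified skill definition: each intent pairs a
# trigger phrase with a named entity slot.
SKILL_INTENTS = {
    "GetWeather": re.compile(r"tell me the weather in (?P<city>.+)", re.I),
    "PlayMusic": re.compile(r"play songs by (?P<artist>.+)", re.I),
}

def parse_utterance(utterance):
    """Return (intent_name, entities) for the first matching intent."""
    for intent, pattern in SKILL_INTENTS.items():
        match = pattern.search(utterance)
        if match:
            return intent, match.groupdict()
    return None, {}

intent, entities = parse_utterance("Siri, tell me the weather in Kansas City")
# intent == "GetWeather", entities == {"city": "Kansas City"}
```

The same function handles the music example: “play songs by the Beatles” resolves to the `PlayMusic` intent with `{"artist": "the Beatles"}`.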

At the heart of these assistants is a natural-language processing component. It parses what you say, and identifies the intent and entity. Then, additional information can be used by the skill to refine the response. This could include things like the location, the time, or information from past interactions with you.
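A hypothetical skill handler might combine the parsed entity with that extra context like this – again a sketch, with invented field names (`home_city`, `time`) standing in for whatever a real platform provides:

```python
from datetime import datetime

def handle_weather_intent(entities, context):
    """Combine a parsed entity with session context (location, time)
    to refine the response. Illustrative only, not a real skill API."""
    # Fall back to the user's known location if no city was spoken.
    city = entities.get("city") or context.get("home_city", "your area")
    hour = context.get("time", datetime.now()).hour
    part_of_day = "tonight" if hour >= 18 else "today"
    return f"Here's the forecast for {city} {part_of_day}."

reply = handle_weather_intent({}, {"home_city": "Kansas City",
                                   "time": datetime(2017, 7, 1, 20, 0)})
# reply == "Here's the forecast for Kansas City tonight."
```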

Adobe’s voice analytics works much like the analytics already familiar from websites and apps. In addition to intent and entity, it captures elements like:

  • User ID and number of users
  • Session length, number of sessions, frequency of use
  • Error rate and path (e.g., Did your question get you what you were looking for? Did you keep using the skill after that?)
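The elements above amount to a structured event per voice interaction. Here’s a hypothetical payload – the field names are illustrative, not Adobe’s actual schema:

```python
import time
import uuid

def voice_event(user_id, intent, entities, session_start, error=False):
    """Build one analytics record for a voice interaction.
    Illustrative field names only."""
    return {
        "event_id": str(uuid.uuid4()),   # unique per interaction
        "user_id": user_id,              # ties sessions to a user
        "intent": intent,
        "entities": entities,
        "session_length_s": round(time.time() - session_start, 1),
        "error": error,                  # did the query fail?
    }

event = voice_event("user-42", "GetWeather", {"city": "Kansas City"},
                    session_start=time.time() - 12.3)
```

Aggregating records like these over time yields the session counts, frequency-of-use, and error-rate metrics listed above.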

Why Does It Matter?

As voice assistants become more integrated into our lives, health-related queries will increase – just as happened with mobile devices, and with the web overall.

Understanding those frontiers became crucial to pharma marketing, and voice will be no different. Brands need to understand what users are seeking, and they want to be able to influence the results. Just as search engine optimization makes content relevant to web searches, and web analytics yields insights to tailor content to individuals’ needs, voice content optimization and personal-assistant analytics will do the same for voice-based interactions.

Today, the optimization might influence delivery of content (voice and visual with the release of Echo Show) for existing digital assets such as sales aids, doctor discussion guides, FAQs, news, etc. These can be indexed by natural-language engines such as IBM Watson, and paired with personal assistant skills to provide useful, relevant information in a conversational format that’s continuously refined with the help of insights gained through analytics.
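As a rough sketch of that indexing step, here’s a toy retrieval function that matches a spoken query against indexed brand content by word overlap. A real deployment would use a natural-language engine such as IBM Watson; the documents and scoring here are invented for illustration.

```python
# Hypothetical index of existing digital assets (FAQs, guides).
FAQ_INDEX = [
    {"title": "Dosing FAQ",
     "text": "how often should i take my medication"},
    {"title": "Side effects",
     "text": "what side effects should i expect"},
]

def best_match(query):
    """Return the title of the indexed document sharing the most
    words with the query, or None if nothing overlaps."""
    words = set(query.lower().split())
    scored = [(len(words & set(doc["text"].split())), doc)
              for doc in FAQ_INDEX]
    score, doc = max(scored, key=lambda pair: pair[0])
    return doc["title"] if score > 0 else None

best_match("what are the side effects")  # matches "Side effects"
```

Feeding the analytics from the previous section back into an index like this – promoting content that resolves queries, rewording content that produces errors – is the “continuously refined” loop described above.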

Eventually, a personal assistant’s influence could extend beyond basic information into areas like diagnosis, and we’ll see tools like Buoy Health become skills that refine themselves based on analysis of shared health information (through HealthKit, ResearchKit, and others). Someday, your assistant may not only guess why you’re sick, but “prescribe” a treatment personalized for your patient journey.

As an expert in cloud marketing platforms, including Adobe, Intouch can optimize your existing content and analytics to prepare you to enter this new voice-based world. Contact us today to schedule a brief, collaborative innovation workshop and start driving results.