Superhuman Listening with HP Autonomy

At some point, we’ve all thought about our preferred superpower – whether flight, invisibility or superhuman strength. Well, I’m here to make the case for superhuman listening.
Imagine walking into a giant stadium filled with people tweeting, taking photos and posting to Facebook. Now imagine that you could capture, analyze and comprehend all of those social conversations as they happen. 
This superhuman listening is possible with HP Autonomy ExploreCloud and IDOL (Intelligent Data Operating Layer) tools. ExploreCloud gathers the conversation and data points, while IDOL makes sense of it. They make a great team, and here’s a quick overview of how they work together:
  1. Listening: Enter a few “listening rules” (like keywords or platform parameters), and ExploreCloud begins pulling in conversations and metadata immediately. This semi-structured data is normalized so you can look at it holistically. 
  2. Analyzing: IDOL then reads the thousands of tweets, articles, posts, videos and photos that thunder by in real time. Context is key here. Using contextual and linguistic clues, IDOL can determine the topics, locations, dialects and sentiment of online posts. For example, the phrase "can't wait" usually has a negative connotation in an order-status email, but is nearly always positive in a tweet among friends.
  3. Sorting: ExploreCloud sorts all conversations into “Projects” based on the topic of focus. ExploreCloud customers use Projects in many different ways; a Project might focus on an individual product line or initiative. In this phase, ExploreCloud can also filter out profanity or unrelated posts that distract from the conversation.
  4. Visualizing: Possibly the best thing about IDOL and ExploreCloud is the technology’s ability to visualize the data. IDOL prioritizes the data based on the concept of the conversation, finding relationships and surfacing trends. ExploreCloud then lets you interact with the data in a variety of visual displays.
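ExploreCloud and IDOL are proprietary, so their actual APIs aren't shown here. Still, the four-step flow above can be sketched in miniature; every name below (`Post`, `ListeningRule`, the toy sentiment rule) is a hypothetical stand-in for illustration only, not the real product interface.

```python
# Hypothetical sketch of the listen -> analyze -> sort flow described above.
# None of these names come from the actual ExploreCloud/IDOL APIs.
from dataclasses import dataclass


@dataclass
class Post:
    platform: str
    text: str
    topic: str = "uncategorized"
    sentiment: str = "neutral"


@dataclass
class ListeningRule:
    keywords: list
    platforms: list

    def matches(self, post):
        # Step 1's "listening rules": keyword plus platform parameters.
        return post.platform in self.platforms and any(
            k.lower() in post.text.lower() for k in self.keywords
        )


def listen(stream, rule):
    """Step 1: pull in only the posts that match the listening rule."""
    return [p for p in stream if rule.matches(p)]


def analyze(posts):
    """Step 2: toy context-aware sentiment (IDOL uses far richer cues)."""
    for p in posts:
        if "can't wait" in p.text.lower():
            # In a tweet this phrase is enthusiasm; in an order-status
            # email it usually signals frustration.
            p.sentiment = "positive" if p.platform == "twitter" else "negative"
    return posts


def sort_into_projects(posts, project_topics):
    """Step 3: bucket posts into Projects by topic of focus."""
    projects = {t: [] for t in project_topics}
    for p in posts:
        for t in project_topics:
            if t in p.text.lower():
                p.topic = t
                projects[t].append(p)
    return projects
```

A usage pass over a handful of posts would then feed the resulting Projects into whatever visualization layer sits on top (step 4).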

Unlike other big data tools, HP Autonomy lets the data speak for itself. By using context to prioritize conversation concepts, ExploreCloud understands information much the way humans do, without relying on manual tagging, keywords or metadata. IDOL doesn’t just count words; it maps conversations by the concepts they express rather than by predetermined keywords. This conceptualization allows for the unexpected, so the data can reveal new insights rather than simply confirm preconceived notions. Here are three real-life examples of HP Autonomy in action:
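The contrast between keyword counting and concept mapping can be illustrated with a deliberately tiny approximation. IDOL's actual conceptual modeling is proprietary; the sketch below merely mimics the idea with bag-of-words cosine similarity, and the function names are invented for this example.

```python
# Toy contrast: literal keyword counting vs. concept-style grouping.
# This only approximates the idea; it is not how IDOL actually works.
from collections import Counter
from math import sqrt


def keyword_count(posts, keyword):
    """Keyword approach: count literal matches and nothing more."""
    return sum(keyword in p.lower() for p in posts)


def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def related(post, others, threshold=0.3):
    """Concept-style approach: surface posts that share vocabulary with
    this one even if they never mention the tracked keyword."""
    vec = Counter(post.lower().split())
    return [o for o in others
            if cosine(vec, Counter(o.lower().split())) >= threshold]
```

A post about "halftime puppies" never mentions "football", so a keyword counter misses it entirely, while similarity-based grouping still links it to other puppy chatter, which is roughly how an unexpected topic like the Puppy Bowl can surface.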

We recently used IDOL during football’s biggest game. Although the listening rules we set up focused on football and the two teams in question, IDOL quickly revealed an unexpected (and very cute) topic we hadn’t considered: the Puppy Bowl was also driving meaningful conversation and photos.

The word cloud, node map and video examples below were developed during the Sundance Film Festival using HP ExploreCloud.

We’ve also used HP ExploreCloud and IDOL to help our customers glean unexpected insights from their own customers. For example, NASCAR’s Fan and Media Engagement Center (FMEC), powered by HP Autonomy, can track conversations around live events and announcements, and help NASCAR communicators identify opportunities for engagement. As the video below explains, NASCAR uses the FMEC to inform real-time broadcast and reporting decisions:

With cuddly puppies, moving films and heart-pounding races, I always look forward to seeing HP Autonomy’s analysis. For more information or the latest from the team, follow HP Autonomy on Twitter or Facebook.