7 Oct 2019

Don't Fall into the Text Analytics Trap

Customer experience (CX) professionals need to be vigilant when they hear promises like "your call center call recordings can be transcribed in real time and processed through text analytics to derive insights." This refrain tends to be emphasized by CX platform vendors that rely exclusively on text analytics to process feedback from customers.

Sounds reasonable, right?

The following will give CX pros the information needed to understand why voice is fundamentally different from text. This awareness will enable you to effectively take advantage of the voices of your customers and agents to drive CX enhancements.

Learn more in our May 16 webinar, Text and Speech Analytics Are Not Created Equal, featuring our team of data science experts.

We Don't Talk and Write the Same Way

In 2015, cognitive scientists at Johns Hopkins University discovered that writing and speaking are supported by different parts of the brain. This affects not just the motor control for writing and the way you form words when you speak, but also word and phrase construction.

Some examples of the differences include:

[Table: CallMiner_Table_take2_640px.png, contrasting written and spoken language]

Text analytics resources, such as those frequently available from CX platforms, have been trained on text data. As such, they were built exclusively with the structure of that data in mind: punctuation, a single speaker, third-person formal, spell-checked text. The issue is that voice communication is dramatically different from text-based communication. If you go down the voice-transcript-to-text-analytics route, your results will be less than adequate. The primary reason is that it's not simply about what was spoken; it's how you say it and when you say it. Even what you don't say within a dialogue influences intent and sentiment in voice communication.

Speech analytics, on the other hand, has been trained on data derived from the different part of the brain, as noted above, that governs spoken dialogue. Speech analytics is tuned for the unstructured nature of voice communication. In addition, acoustics add another dimension, providing contextual accuracy and emphasis of meaning.

Speech analytics solutions are designed to handle the nuances of voice-based conversations. This has an enormous impact on the ability to surface insight that matters from this unstructured data. For example, predefined or user-configured searches and classifications can more accurately identify topics, behaviors, procedures, and other events in spoken conversations. Different speaking styles, connector words, fillers and silence, and disjointed flow are "meaning influencers" that text analytics systems relying on structure would likely miss. Search parameters allow for the concept of time, which is not available in text-based communication. And classifications extend substantially beyond keyword spotting, taking into account the myriad ways that concepts such as escalations or dissatisfaction are expressed in speech.

The bottom line is that a voice-transcription-to-text-analytics workflow will not accurately capture the voice of your customers and agents, and will not work well for CX pros. Results will be far less contextually relevant, making it difficult to gauge loyalty or the meaning behind sentiment. A more insidious concern is sending your root cause analysis in the wrong direction by completely missing a key element of unsolicited feedback or being misdirected by a false positive.
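The contrast between naive keyword spotting and a time-aware, disfluency-tolerant search can be sketched in a toy example. Everything below is illustrative and hypothetical, not any vendor's actual API: the transcript is a made-up list of (word, start-time) pairs of the kind a speech-to-text engine typically emits, and the two functions stand in for the text-analytics and speech-analytics approaches described above.

```python
# Hypothetical sketch: why exact-phrase matching on a raw voice transcript
# misses what a disfluency-aware, time-aware search can find.

# Toy transcript with word start times in seconds; fillers ("um", "uh")
# and a false start ("can-") are preserved, as real speech contains them.
transcript = [
    ("i", 0.0), ("um", 0.4), ("i", 0.9), ("wanna", 1.1), ("uh", 1.5),
    ("can-", 1.8), ("cancel", 2.2), ("my", 2.7), ("account", 2.9),
]

def keyword_spot(words, phrase):
    """Text-analytics-style match: exact phrase over the raw token stream."""
    text = " ".join(w for w, _ in words)
    return phrase in text

def speech_search(words, phrase, before=None):
    """Speech-analytics-style match: skip fillers and false starts, and
    optionally require the hit to occur within the opening seconds."""
    fillers = {"um", "uh", "like"}
    cleaned = [(w, t) for w, t in words
               if w not in fillers and not w.endswith("-")]
    tokens = phrase.split()
    for i in range(len(cleaned) - len(tokens) + 1):
        window = cleaned[i:i + len(tokens)]
        if [w for w, _ in window] == tokens:
            if before is None or window[0][1] <= before:
                return True
    return False

print(keyword_spot(transcript, "want to cancel my account"))   # False: fillers and "wanna" break it
print(speech_search(transcript, "cancel my account", before=30))  # True
```

The point is not the matching logic itself but the two capabilities the article names: tolerance for the unstructured flow of speech, and search constrained by time, neither of which a system built for punctuated, spell-checked text provides.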
