ProACT: A Solution for Contact Center Analytics
Presentation Transcript

  1. ProACT: A Solution for Contact Center Analytics. Shourya Roy <>, Unstructured Information Management Group, IBM India Research Lab. Behind the scenes: Raghu, Sree, Diwakar, Rahul, Shantanu, Sumeet, Venkat…

  2. Brief Intro • I have been associated with IBM Research since 2002, working in the area of text analytics. • Prior to that, between 2000-02, I was mostly to be seen in H1 Mess, the TT Room and the TV Room, sometimes in CSE Dept. classrooms, and rarely in my advisor Prof. Soumen Chakrabarti's office.

  3. Contact Center: An Application of Structured-Unstructured Data Integration • [Architecture diagram] Contact points between the enterprise and the customer: branch office, web, agent, self-service, call center IVR, products and services. • Unstructured data: call logs and transcripts, emails, surveys. • Structured data: customer/product transaction data, agent data. • Integrating and analyzing structured and unstructured data enables: automation of C-Sat analysis (customer preferences, dissatisfaction drivers, lifetime value management); analytics for agent performance (improve C-Sat, upsell rate); analysis of contact drivers (improve FAQs, web pages); instant market intelligence.

  4. I will be Talking About • Customer Satisfaction Analysis • Scenario and Importance • Solution and derived BI • Issues • Agent Performance Analysis • Analysis of telephonic transcriptions to identify scope of improvement in a contact center • Automatically Building Domain Models • Automatically building Domain Models from noisy telephonic transcriptions • Possible applications

  5. I will be Talking About • Customer Satisfaction Analysis • Agent Performance Analysis • Automatically Building Domain Models

  6. Customer Satisfaction (C-Sat) • Wikipedia says: "Customer satisfaction is a business term which is used to capture the idea of measuring how satisfied an enterprise's customers are with the organization's efforts in a marketplace." • In the BPO scenario, it is crucial from the client's point of view to monitor the QoS provided by the contact center • C-Sat analysis is mostly a part of the agreement between the contact center and the client • C-Sat is different from SLAs

  7. C-Sat Scenario • [Process diagram] A customer query reaches a domain knowledge specialist by email; an immediate and helpful response is sent; a feedback request follows; the customer's feedback is stored in the C-Sat DB, which analysts use to produce reports.

  8. Sample Verbatims with Labels

  9. Architecture • [Architecture diagram] Inputs: emails, surveys, application-specific data logs, transcribed speech, and self-service interactions (speech, web). • Annotators: language skill and Cust-Sat annotators; upsell/product sentiment annotators. • Structured and unstructured data are integrated and analyzed; a Business Intelligence layer supports explore/view/report. • Outputs: identification of cross-sell/up-sell opportunities, summaries of customer views on products, integrated views of agent performance, agent training and deployment, market intelligence for enterprise clients.

  10. Demonstration: ProACT BI Tool

  11. Challenges • Technical Challenges • Too many class labels (>35) and insufficient data (<10 cases for some of the categories). Examples: IncompleteResolution, CannedResponse, PolicyIssues. • Labels grouped into higher-level categories. Examples: Resolution, Communication, Uncontrollables. • Short, poorly written text. • Noisy data • No fixed rule for manual labelling, leading to inconsistencies. • Same/similar verbatims assigned different labels by human labellers. • Changing labels • Labels tend to change over time. • Business Challenges • Smooth transition from the existing manual C-Sat analysis process to a completely automated one.
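The higher-level grouping mentioned above amounts to a label map applied before (or after) classification. A minimal sketch, assuming a fine-to-coarse mapping (e.g., CannedResponse under Communication) that the slide does not spell out:

```python
# Hypothetical fine-to-coarse label map for sparse C-Sat categories.
# Only the category names come from the slide; which fine label falls
# under which coarse one is an illustrative assumption.
COARSE_OF = {
    "IncompleteResolution": "Resolution",
    "CannedResponse": "Communication",
    "PolicyIssues": "Uncontrollables",
}

def coarsen(label: str) -> str:
    """Collapse a fine-grained label into its higher-level category."""
    return COARSE_OF.get(label, "Other")

print(coarsen("CannedResponse"))  # Communication
```

Collapsing rare labels this way trades label granularity for enough training cases per class.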

  12. Summary • Accuracy • Subjective, as manual labelling is not accurate • Ballpark accuracy figures range from 60% to 75% • Going forward • Real-life deployment in different contact centers • Insightful Business Intelligence (BI) Tool • Can we introduce C-Sat analysis in a new process without requiring any training data?

  13. I will be Talking About • Customer Satisfaction Analysis • Agent Performance Analysis • Automatically Building Domain Models

  14. Call Analysis to Improve Sales and Agent Performance • Scenario • A car rental process outsourced to a call center; people call up to rent cars • Objective • Call centers want the maximum number of car bookings as well as car pick-ups • INCREASE agent conversion rate • Approach • Analyse transcriptions of telephonic conversations and find the key actionable and differentiating insights

  15. Application of TAKMI for CRM • [Workflow diagram] Application of TAKMI to Customer Relationship Management in a car rental process. • Customers call the contact center; agent calls are transcribed into textual data, alongside booking information (booked vs. unbooked). • The textual data is analyzed for agent language (choice of phrases, etc.), compliance with guidelines, and the reasons agents succeed in getting customers to rent cars. • Results feed customer models and segments (e.g., the AAA Member segment): evaluate effects across different segments, analyze reasons and retry unbooked customers, and investigate enhancements for other agents and customer segments.

  16. Highlights: Identifying Actionable Insights • Detect customer intent at the start of the call and suggest actions. • Weak start ("I would like to know the rate", "I just want to get a price on midsize car") • Strong start ("Hey, I would like to pick up a car", "I need to make a reservation please") • In the weak-start case, "pick up" is improved by mentioning discount phrases. • In the strong-start case, "pick up" is concretized by mentioning value-selling phrases and discount phrases. • Asking for a clean driving record decreases "pick up" in the strong-start case.
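The weak/strong-start detection described above boils down to matching cue phrases in the caller's opening utterance. A minimal sketch, where the cue phrases come from the slide but the function name and the simple substring rule are assumptions:

```python
# Toy intent detector for the opening utterance of a call.
# Cue phrases are taken from the slide's examples; real systems would
# use richer patterns or a trained classifier.
WEAK_CUES = ("like to know the rate", "get a price")
STRONG_CUES = ("pick up a car", "make a reservation")

def start_type(opening: str) -> str:
    """Classify the caller's opening as a strong, weak, or unknown start."""
    text = opening.lower()
    if any(cue in text for cue in STRONG_CUES):
        return "strong"
    if any(cue in text for cue in WEAK_CUES):
        return "weak"
    return "unknown"

print(start_type("I need to make a reservation please"))  # strong
```

Downstream, the detected start type would drive the suggested agent action (discount phrases for weak starts, value-selling phrases for strong starts).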

  17. Highlights: Call Flow and Compliance • Mandatory questions for call process checking • 1. Brand name in opening: "welcome to Alamo" • 2. Proper opening: "My name is" • 3. Confirms age 25: "age 25" • 4. Confirms check/debit card in their own name: "check card in your name" • 5. Confirms clean driving record and license: "you need clean driving record" • 6. Asks for future reservations: "anything else" • 7. Brand name in conclusion: "thank you for calling National" • In 137 reservation calls… • Agents are not confirming "age over 25" in 36% of calls. • Agents are not confirming "clean driving record" in 44% of calls. • In 936 calls in total… • Agents are not starting with the brand name in 11% of calls.
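The compliance audit above can be sketched as a phrase scan over transcripts: for each mandatory question, count the share of calls where its cue phrase never occurs. A toy illustration using cue phrases from the slide; the function and dictionary names are hypothetical:

```python
# Hypothetical compliance checker over call transcripts.
# Cue phrases are from the slide; matching is plain lowercase substring
# search, a simplification of what a real annotator would do.
MANDATORY = {
    "brand name in opening": "welcome to alamo",
    "agent introduces self": "my name is",
    "confirms age 25": "age 25",
    "confirms check/debit card": "check card in your name",
    "confirms driving record": "clean driving record",
    "asks for future reservations": "anything else",
}

def missing_rates(transcripts):
    """Fraction of calls in which each mandatory cue phrase is absent."""
    counts = {q: 0 for q in MANDATORY}
    for transcript in transcripts:
        text = transcript.lower()
        for question, cue in MANDATORY.items():
            if cue not in text:
                counts[question] += 1
    total = len(transcripts)
    return {q: c / total for q, c in counts.items()}
```

Run over the 137 reservation calls, this kind of scan yields exactly the slide's figures (e.g., "age 25" missing in 36% of calls).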

  18. Difference between Strong and Weak Start • [Chart] Outcomes by customer intent at the start of the call:

  Intent at start | pick up | information pick up | not picked up
  strong          | 65%     | 23%                 | 9%
  weak            | 35%     | 8%                  | 49%

  • Based on the customer's start, "not picked up (NS or CC)" is predictable.

  19. Difference between pick up and not picked up in Weak Start • [Table]

                                              | pick up | not picked up
  Number of calls                             | 13      | 21
  Calls containing discount-related phrases   | 10      | 9

  • pick up: 10/13 = 76.9%; not picked up: 9/21 = 42.9% • Discount-related phrases are mentioned by the agent more frequently in the "pick up" data.

  20. Detection of Improper Call Process (cont.) • How many mandatory questions are mentioned by the agent? • In 16 reservation calls, fewer than 3 of the mandatory questions are mentioned.

  21. Detection of Improper Call Process (cont.) • In these 16 reservation calls, 2 questions are not mentioned at all.

  22. I will be Talking About • Customer Satisfaction Analysis • Agent Performance Analysis • Automatically Building Domain Models

  23. Scenario • Call centers handle customer complaints and issues for domains ranging from computer sales to mobile phones to apparel • Typically, domains have manually created domain models which contain the types of problems solved in each category, a solutions library, typical question-answers, appropriate call opening and closing styles, etc. • Each instance in a domain requires a separate domain model • These models are dynamic in nature and change over time • Our objective is the "automatic generation of domain models from largely available noisy transcriptions of telephonic conversations between call center agents and customers"

  24. Example: Snippet of Automatic Transcription

  25. Domain Model • We define the Domain Model as a Topic Taxonomy where every node is characterized by • Topics • Typical Question-Answers (QAs) • Typical Actions • Call Statistics

  26. Block Diagram • [Block diagram] (1) Voice help-desk data is transcribed by ASR (e.g., "Can you access yahoo?", "Is modem on?"). (2) Feature engineering component: stopword removal and n-gram extraction. (3) Clusterer: produces clusters of different granularity. (4) Taxonomy builder. (5) Model builder: adds call statistics (database, archive, replicate).

  27. [Example cluster snippets and call statistics] • "are you using lotus notes six" • "do you have the lotus notes closed" • ………………….. • avg. transcription length = 1214.54 words • avg. call duration = 712.74 secs • "replicate error unable to find path to server" • "go to the workspace on left hand side" • "look under server icon" • ………..

  28. Conclusion • Huge amount of unstructured data is being produced everyday in contact centers • Analysis can help to improve customer satisfaction, agent productivity, call handling time • Opportunity to play with real “real-life data” • Learning experience • Importance of handling noise in unstructured data • Workshop on Analytics for Noisy Unstructured Text Data (at IJCAI 07) [] – deadline 25 Sep (day after tomorrow!!)

  29. Thanks!!

  30. BACKUP

  31. Bharti PoC • Demonstrate how text analytics can add value to the existing Complaint Management Systems and make it more efficient • Demonstration of the software • Possible ways to extend this work • Discussion


  33. Some Relevant Text Analytics Techniques • Cleaning up of data • Spelling correction • "irregular dial", "iregular dialtone", "irregular dt fone", "irrgular dt psl", "irregular dt so plss", "irregular dt due" are grouped together • Abbreviation expansion • …. • Annotators • Extracted problem areas such as Intermittent Dial Tone, Rosette Issue, etc. Hints taken from questions provided by Bharti • Address segmentation, such as Subhash Nagar, Bhopal, M.P., etc. • Sentiment, Product, Services • Application specific • ….
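The grouping of noisy spelling variants shown above could be approximated by string-similarity clustering. An illustrative sketch, not the PoC's actual method; the similarity threshold and greedy grouping strategy are assumptions:

```python
# Greedy grouping of noisy phrase variants by edit similarity, using the
# standard-library SequenceMatcher. A phrase joins the first group whose
# representative (first member) it resembles closely enough.
from difflib import SequenceMatcher

def group_variants(phrases, threshold=0.6):
    """Cluster spelling variants such as 'irregular dial'/'iregular dial'."""
    groups = []
    for phrase in phrases:
        for group in groups:
            if SequenceMatcher(None, phrase, group[0]).ratio() >= threshold:
                group.append(phrase)
                break
        else:
            groups.append([phrase])
    return groups
```

More robust cleanup would normalize tokens first (e.g., expanding "dt" to "dialtone") before measuring similarity.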

  34. Then? • Relevant structured data and extracted annotated data are loaded into a star schema, and a Business Intelligence (BI) application is developed on top of that • The BI application is capable of showing different views of the data through slice-and-dice, rollup-drilldown, association, comparison, etc. operations • Let's see the demo of the BI application developed on hard-faults data collected from M.P.
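The rollup and drilldown operations mentioned above can be illustrated on a toy fact table; the record fields and helper names below are hypothetical, not the PoC's actual schema:

```python
# Toy rollup/drilldown over a miniature hard-faults fact table.
# Field names ("region", "problem") are illustrative assumptions.
from collections import Counter

faults = [
    {"region": "Bhopal", "problem": "Intermittent Dial Tone"},
    {"region": "Bhopal", "problem": "Rosette Issue"},
    {"region": "Indore", "problem": "Intermittent Dial Tone"},
]

def rollup(records, dim):
    """Aggregate record counts along one dimension (e.g. region)."""
    return Counter(r[dim] for r in records)

def drilldown(records, dim, value, sub):
    """Restrict to one dimension value, then roll up a finer dimension."""
    return Counter(r[sub] for r in records if r[dim] == value)

print(rollup(faults, "region"))  # Counter({'Bhopal': 2, 'Indore': 1})
```

A real BI layer runs the same aggregations as OLAP queries over the star schema rather than in-memory loops.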