Recommendation 4.2.3: Consider integrating emotional intelligence into the chatbot
Benefits: Users, Cognitive
Phase 1: Gather & Organize
Read More
- An article on Artificial Emotional Intelligence
User Research
- Learn about the user’s emotional state while completing this task. Is the task content innately distressing, like death or debt?
- Observe people completing the task with human representatives. Do the representatives exhibit emotional intelligence when working with users?
- Ask the representatives how they recognize emotional distress and how they change the task process or communication style in response to detected distress.
Phase 2: Design & Implement
Design Question
- Could you detect the user’s emotional language to prompt a transfer to a human representative?
- Some users may get frustrated if they cannot complete the task with the chatbot, regardless of the chatbot’s overall success rate with the task. If human representatives are available, they may be better suited to helping users who are frustrated with the chatbot. Frustrated users often change their communication style to reflect their emotional state.
Examples
- If I’m having trouble navigating a phone system, I want the system to recognize my confusion and suggest an alternative, like speaking directly with a human operator. I don’t want the system to calmly send me in circles.
- If the details of my task are upsetting (e.g., unusually high fees owed, anything to do with death), I want the chatbot to offer the option to talk to a human.
- If I indicate in a survey that I was not satisfied with my interaction with the chatbot, I want someone to reach out and try to help me.
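One simple way to approach the design question above is keyword-based escalation: watch each message for frustration or distress cues, and hand off to a human when they accumulate. The sketch below is a minimal, hypothetical illustration (the cue lists, `should_escalate`, and the two-strike threshold are all assumptions for this example, not a production-ready classifier); real systems often use sentiment models instead of hand-written patterns.

```python
import re

# Hypothetical cue lists -- illustrative only, not exhaustive.
FRUSTRATION_CUES = [
    r"\bthis (isn'?t|is not) working\b",
    r"\b(useless|ridiculous)\b",
    r"\b(real person|human|agent|representative)\b",
    r"!{2,}",  # repeated exclamation marks
]
DISTRESS_CUES = [
    r"\b(died|death|passed away|funeral)\b",
    r"\bdebt\b",
]

def should_escalate(message: str, strikes: int = 0) -> tuple[bool, int]:
    """Return (escalate?, updated strike count) for one user message.

    Distressing topics escalate immediately; frustration cues accumulate
    as 'strikes' so a single sharp word doesn't bounce the user out.
    """
    text = message.lower()
    if any(re.search(p, text) for p in DISTRESS_CUES):
        return True, strikes
    if any(re.search(p, text) for p in FRUSTRATION_CUES):
        strikes += 1
    return strikes >= 2, strikes

# Frustration builds over two messages before the chatbot offers a handoff.
esc, strikes = should_escalate("This isn't working!!")             # (False, 1)
esc, strikes = should_escalate("Let me talk to a human", strikes)  # (True, 2)
```

The two-strike threshold reflects the trade-off in the design question: escalating on the first cue would transfer users the chatbot could still help, while never escalating calmly sends frustrated users in circles.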
Phase 3: Test & Evaluate
Ask the User
- Did the chatbot respond appropriately to the emotions you felt?
- This question is subjective; use a Likert scale. It can be asked mid-study or post-study.
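When summarizing the Likert responses, it helps to report the median and the response distribution rather than only a mean, since Likert data is ordinal. A minimal sketch (the sample responses here are made-up illustration data, not real study results):

```python
from collections import Counter
from statistics import median

# Hypothetical 5-point Likert responses to "Did the chatbot respond
# appropriately to the emotions you felt?"
# 1 = strongly disagree ... 5 = strongly agree. Illustrative data only.
responses = [5, 4, 2, 4, 3, 1, 4, 5, 2, 4]

distribution = Counter(responses)        # count per scale point
med = median(responses)                  # central tendency for ordinal data
# "Top-2-box": percentage of participants who agreed (4 or 5).
top_two_box = 100 * sum(r >= 4 for r in responses) / len(responses)

print(f"distribution: {dict(sorted(distribution.items()))}")
print(f"median: {med}, top-2-box: {top_two_box:.0f}%")
```

Comparing mid-study and post-study distributions can reveal whether emotional responses to the chatbot shifted over the course of the task.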