Contribution · Application — Customer support
AI for Chat Deflection and Knowledge Base Agents
The classic support economics case: 60-70% of tickets are repeatable questions already answered in the KB. A well-grounded RAG chat agent can deflect a big chunk of that volume, freeing human agents for genuinely complex cases. The reliable pattern: strict RAG grounding, clear citations, graceful escalation, and zero promises the agent can't keep. The failure modes are well-documented — Air Canada's famous chatbot case — so do it right.
Application facts
- Domain
- Customer support
- Subdomain
- Self-service
- Example stack
- Claude Sonnet 4.7 or GPT-5 for response generation · LlamaIndex / Haystack RAG over KB · pgvector or Pinecone for retrieval · Zendesk / Intercom / Salesforce for chat deployment · Escalation routing with transcript context
Data & infrastructure needs
- Public KB articles with tags and ownership
- Common intent taxonomy
- Escalation triggers and policies
- Consent and transcript retention policy
Risks & considerations
- Hallucinated policies — the Air Canada problem (courts held company liable)
- Failure to escalate anxious or vulnerable users
- Accessibility — chat as sole channel excludes some users
- Prompt injection via customer messages
- DPDPA/GDPR on chat transcripts and PII in user questions
Frequently asked questions
Is AI chat deflection safe?
With strict RAG grounding, disclosed identity, easy human escalation: yes. Never let the LLM make commitments on refunds, policies, or accounts without backing. The Air Canada case established: the company owns what the chatbot says.
What LLM is best for support chat?
Claude Sonnet 4.7 is cost-effective at scale; GPT-5 Turbo handles complex multi-turn well. Fine-tuned smaller models work for high-volume, narrow use cases. The grounding pipeline matters more than model choice.
Regulatory concerns?
India: Consumer Protection Act (binding promises), DPDPA. US: FTC on deceptive AI, state consumer protection, ADA for accessibility. EU: AI Act transparency (must disclose AI), Consumer Rights Directive, GDPR.
Sources
- Consumer Protection Act India 2019 — accessed 2026-04-20
- EU AI Act — transparency — accessed 2026-04-20