If you're putting a chatbot on your website, you're trusting it with customer conversations—including potentially sensitive information. That trust needs to be earned.
This guide covers the security questions to ask vendors, compliance requirements, and configuration best practices.
The 10 Security Questions
1. Where is data stored?
Why it matters: Data location affects legal jurisdiction and compliance requirements. EU customers may require EU data storage.
What to ask:
- Which cloud provider hosts the data?
- Which geographic region?
- Is data encrypted at rest?
Red flag: "We use multiple providers" without specifics.
2. Who processes the AI?
Why it matters: If the chatbot uses external AI (OpenAI, Claude, etc.), conversations go to that provider.
What to ask:
- Which AI models process conversations?
- Is data sent to third-party AI providers?
- Can data be used for training their models?
Red flag: Vague answers about "proprietary AI."
3. Who can access conversation data?
Why it matters: Every person with access is a potential security risk.
What to ask:
- Only my team?
- Vendor support team?
- Third-party subprocessors?
Red flag: No clear access control documentation.
4. How long is data retained?
Why it matters: Longer retention = more data = more liability.
What to ask:
- What's the default retention period?
- Can I configure shorter retention?
- What's the deletion process?
Red flag: "We keep everything forever for analytics."
5. What compliance certifications exist?
Why it matters: Certifications prove the vendor takes security seriously.
What to look for:
- SOC 2 Type II
- ISO 27001
- GDPR compliance documentation
- Industry-specific: HIPAA (healthcare), PCI DSS (payments)
Red flag: "We're working on SOC 2" for years.
6. Is there a Data Processing Agreement?
Why it matters: Required under GDPR. Defines responsibilities.
What to ask:
- Is a DPA available?
- What terms does it include?
- Who are the subprocessors?
Red flag: Never heard of a DPA.
7. How is data transferred?
Why it matters: Data in transit can be intercepted.
What to ask:
- Is HTTPS enforced?
- What TLS version?
- Are there additional encryption layers?
Red flag: Any mention of unencrypted transfer.
8. What happens if there's a breach?
Why it matters: Incidents happen. Response matters.
What to ask:
- What's the incident response plan?
- How quickly will you be notified?
- What support is provided during breaches?
Red flag: "That's never happened" isn't a plan.
9. Can I delete all my data?
Why it matters: You need to respond to customer deletion requests.
What to ask:
- Can I request full data deletion?
- How long does deletion take?
- Is it truly deleted or just deactivated?
Red flag: Complicated or lengthy deletion processes.
10. What happens when I cancel?
Why it matters: Your data shouldn't linger after you leave.
What to ask:
- How long is data retained after cancellation?
- Can I export everything before closing?
- Is there a formal offboarding process?
Red flag: Data kept indefinitely after cancellation.
Security Checklist
Vendor Evaluation
- Security certifications documented
- Data processing agreement available
- Subprocessor list disclosed
- Encryption confirmed (transit + rest)
- Incident response plan exists
- Clear answers to all 10 questions
Configuration
- HTTPS enforced on your site
- Strong admin passwords + 2FA
- Role-based access configured
- Session timeouts enabled
- Data retention limits set
Compliance (if applicable)
- Privacy policy updated to mention chatbot
- Cookie consent includes chat widget
- GDPR consent mechanism (EU users)
- HIPAA BAA signed (healthcare)
Ongoing
- Access audit scheduled quarterly
- Data review scheduled monthly
- Subprocessor changes monitored
- Security review annually
Industry-Specific Requirements
Healthcare (HIPAA)
If chatbot handles protected health information:
- Business Associate Agreement required with vendor
- PHI must be encrypted at rest and in transit
- Audit logs must be maintained
- Breach notification procedures must exist
Many consumer chatbot vendors cannot sign a BAA. Verify this upfront.
Finance (PCI DSS)
If payment information might be discussed:
- Never collect full card numbers in chat
- Mask any payment data that appears
- Redirect payment discussions to secure channels
- Regular security assessments required
E-commerce
For online stores:
- Order details may contain PII (addresses, phone numbers)
- Don't let chatbot store passwords
- Integrate with order systems rather than asking for details in chat
Security Risks to Understand
Data Exposure
Conversations might contain:
- Passwords (accidentally shared)
- Credit card numbers
- Personal health information
- Proprietary business information
Mitigation: Pattern detection to identify and redact sensitive data.
AI Hallucination
AI might generate incorrect information:
- Wrong prices
- Inaccurate policies
- Non-existent features
Mitigation: Ground AI responses in actual website content.
Prompt Injection
Malicious users might try to manipulate AI:
- Extract training data
- Make bot say inappropriate things
- Bypass restrictions
Mitigation: Input sanitization, output filtering, robust system prompts.
What Good Security Looks Like
Data Storage:
- Hosted on secure cloud infrastructure
- Data encrypted at rest with AES-256
- Regular encrypted backups
- Clear geographic location
AI Processing:
- Enterprise-grade AI providers with data protection
- No conversation data used for model training
- Processing logs retained only for debugging
Access Control:
- Role-based access for team members
- Two-factor authentication available
- Session management and automatic timeout
- Audit logs for admin actions
Compliance:
- GDPR-compliant data handling
- Data deletion on request (reasonable SLA)
- DPA available for all customers
- Transparent privacy policy
Retention:
- Configurable retention periods
- Full export available anytime
- Complete deletion on account closure
Breach Response:
- Notification commitment
- Dedicated security team
- Documented incident response procedure
The Bottom Line
Before deploying any chatbot:
- Ask the 10 questions — Get clear answers
- Verify certifications — Don't accept "working on it"
- Get the DPA signed — Required for GDPR
- Configure properly — Use all available security features
- Monitor ongoing — Security isn't set-and-forget
The convenience of AI chat isn't worth compromising customer trust.
Choose vendors who take security as seriously as you do.
Start free with Kya — Review our security documentation. We're happy to answer all 10 questions.


