Chatbots are great for summaries, drafts, and brainstorming. They are not a safe place for sensitive data. Assume anything you paste could be stored, reviewed, or leaked.

What not to share

  • Passwords, MFA codes, or recovery details.
  • Customer data, contracts, or internal documents.
  • Personal identifiers like SSNs, addresses, or bank details.

Verify outputs before you use them

Chatbots can be confidently wrong. Treat every output as a draft and verify facts, dates, and legal guidance against trusted sources.

Verification matters more than perfect prompting.

Safe ways to use chatbots

  • Ask for outlines, then fill in the details yourself.
  • Use generic examples instead of real data.
  • Redact sensitive details before pasting.

If you would not post it on a public forum, do not paste it into a chatbot.
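The redaction step above can be automated with a small script. This is a minimal sketch: the `redact` function and the patterns below are illustrative, not exhaustive, and would need to be tuned to the kinds of data you actually handle.

```python
import re

# Illustrative patterns only -- real redaction needs rules tuned to your data.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace sensitive substrings with placeholder tags before pasting."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact jane.doe@example.com or 555-867-5309, SSN 123-45-6789."))
# → Contact [EMAIL] or [PHONE], SSN [SSN].
```

A script like this catches only the formats you anticipate; review the text yourself before pasting, since names, project codenames, and free-form details will slip through any pattern list.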

Workplace best practices

Follow your company’s AI policy. If none exists, treat chatbots as external vendors and avoid sharing internal information with them.