AI and privacy: what small business owners need to know before using ChatGPT

You paste a customer complaint into ChatGPT to draft a reply. You upload a contract to Claude for a summary. You type client names, revenue numbers, and business details into AI tools every day. But have you thought about where that data goes?

This isn't a scare piece — AI tools are safe to use for most tasks. But there are a few things every small business owner should understand.

What happens to your data

When you type something into an AI tool, your input is sent to the company's servers to generate a response. The key question is: does the company use your input to train its AI models?

  • ChatGPT (free plan): By default, OpenAI may use your conversations to improve its models. You can opt out in Settings > Data Controls > "Improve the model for everyone." When you turn this off, your chats are still stored for abuse monitoring (up to 30 days) but not used for training.
  • ChatGPT (Plus/Team/Enterprise): Conversations are not used for model training by default.
  • Claude (free and Pro): Anthropic does not use your conversations to train models by default. You can choose to provide feedback that may be used for training, but this is opt-in.
  • Canva: AI features process your input to generate results, but Canva states it does not use your content to train AI models unless you opt in.
  • Grammarly: Text is processed for suggestions and deleted from servers shortly after. Grammarly states it does not use your text to train models.

What you should never put into AI tools

Regardless of the privacy policy, avoid typing these into any AI tool:

  • Social Security numbers, credit card numbers, or bank details — never
  • Passwords or API keys — never
  • Health records or patient information — if your business handles protected health information, this could violate HIPAA
  • Full client contracts with sensitive financial terms — summarize the key points instead of pasting the whole document
  • Employee personal data — addresses, salary details, performance reviews with names

Practical tips for safe AI use

  • Anonymize when possible. Instead of "Write a reply to John Smith who spent $5,000 on our platinum package," try "Write a reply to a customer who purchased our premium package." AI doesn't need real names or exact numbers to generate useful output.
  • Use business accounts. Paid plans typically offer stronger privacy protections and don't use your data for training by default.
  • Check the settings. Both ChatGPT and Claude have data privacy toggles; reviewing them takes two minutes.
  • Don't upload sensitive documents. If you need to analyze a contract, paste the relevant sections rather than uploading the entire file.
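If you regularly paste customer text into AI tools and are comfortable with a little scripting, you can automate the anonymization step. Here's a minimal Python sketch using the standard library's `re` module — the patterns are illustrative assumptions, not a complete redaction solution, and a real tool would need far broader coverage (names, addresses, account numbers, and so on):

```python
import re

# Illustrative patterns only -- real redaction needs much broader coverage.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "amount": re.compile(r"\$\d[\d,]*(?:\.\d{2})?"),
}

def anonymize(text: str) -> str:
    """Replace common identifiers with placeholders before pasting into an AI tool."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

print(anonymize("Refund $5,000.00 to john.smith@example.com, SSN 123-45-6789."))
# → Refund [AMOUNT] to [EMAIL], SSN [SSN].
```

Even a rough filter like this catches the obvious identifiers before they leave your machine; for anything sensitive, a quick manual read of the redacted text is still worth doing.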

What about customer data?

If you're using AI to draft emails to customers, generate reports that mention clients, or create content based on customer feedback — you should have a basic understanding of your obligations.

In general: don't paste personally identifiable customer information into AI tools unless the tool's privacy policy allows it and your customers would reasonably expect it. When in doubt, anonymize.

The bottom line

AI tools are safe for the vast majority of small business tasks — writing captions, drafting emails, brainstorming ideas, creating descriptions. The risk comes from pasting sensitive data you wouldn't want shared publicly.

A simple rule: if you wouldn't type it into a Google search, don't type it into an AI tool. Follow that, review your privacy settings, and you'll be fine.