What Not to Put Into ChatGPT at Work
10 things you should never paste into ChatGPT, Claude, or any AI tool at work. A practical checklist for keeping your job and your data safe.
The 10-Item "Don't Paste That" Checklist
We get it. AI tools are fast, useful, and sitting right there in your browser tab. But that convenience comes with a trap: it's very easy to paste something into ChatGPT, Claude, or Gemini that should never leave your company's walls.
Most people don't get fired for using AI. They get fired for what they put into it. Here are the ten things we see professionals paste into AI tools every day that could cost them their jobs, their clients, or both.
- Customer personal data (names, emails, phone numbers, addresses). Even if you're just asking AI to "clean up this spreadsheet," you may be sending PII to a third-party server. That's a potential GDPR, CCPA, or HIPAA violation depending on your industry.
- Employee records or HR documents. Performance reviews, salary data, disciplinary notes. Pasting these into a consumer AI tool means you've just shared confidential employee information with an external system. Your HR and legal teams would not be pleased.
- Source code from proprietary software. Your company's codebase is intellectual property. Consumer AI tools may use inputs for training. Even enterprise-tier tools carry risk if your engineering team hasn't approved them. Samsung learned this the hard way in 2023.
- Financial data before public disclosure. Revenue figures, earnings projections, M&A details. If it hasn't been publicly reported, pasting it into an AI tool could constitute a data breach. In regulated industries, it could trigger SEC scrutiny.
- Passwords, API keys, or access tokens. This sounds obvious, but it happens constantly. People paste config files, environment variables, or error logs that contain credentials. Once it's in the chat, you've lost control of it.
- Client contracts or legal agreements. The terms of your deals are confidential. Asking AI to "summarize this contract" means sending your client's proprietary terms to a third party. That may violate the very NDA you're trying to summarize.
- Internal strategy documents. Roadmaps, competitive analyses, board presentations. These contain your company's strategic thinking. Sharing them with an external AI tool is sharing them with a company that may serve your competitors.
- Medical or health information. If you work in healthcare, insurance, or benefits, pasting patient or member data into a non-HIPAA-compliant tool is a violation. Full stop. Even a single record counts.
- Customer support transcripts with identifying details. Chat logs, support tickets, and call notes often contain account numbers, purchase history, and personal details. Strip all of that before you paste.
- Anything marked "Confidential" or "Internal Only." If a document has a classification label on it, that label exists for a reason. Respect it. If you wouldn't email the content to a stranger, don't paste it into an AI tool.
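The credentials item above is the one easiest to guard with automation. As a rough sketch, here is a minimal pre-paste check in Python. The regexes are illustrative assumptions, not a complete secret scanner; real tooling (secret scanners built into CI, for example) covers far more formats:

```python
import re

# Rough signatures of common credential shapes (illustrative, not exhaustive).
# A match means: stop and redact before pasting anywhere external.
SECRET_PATTERNS = [
    ("AWS access key", re.compile(r"\bAKIA[0-9A-Z]{16}\b")),
    ("Bearer token", re.compile(r"\bBearer\s+[A-Za-z0-9._\-]{20,}")),
    ("Secret assignment", re.compile(
        r"(?i)(api[_-]?key|secret|token|password)\s*[:=]\s*\S+")),
]

def find_secrets(text: str) -> list[str]:
    """Return the names of any secret patterns found in text."""
    return [name for name, pattern in SECRET_PATTERNS if pattern.search(text)]

# Config files and error logs are where credentials hide most often.
config = 'DB_PASSWORD=hunter2\napi_key: "sk-demo-not-a-real-key"'
print(find_secrets(config))  # → ['Secret assignment']
```

If this returns anything, the text isn't safe to paste. An empty list is not a guarantee of safety, only the absence of the most obvious red flags.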
The Policy Test: Before you paste anything, ask yourself: "Does my company have an AI usage policy, and would this action comply with it?" If you don't know the policy, find it. If there isn't one, treat everything as restricted until someone tells you otherwise. The safest default is caution.
How to Use AI Safely With Sensitive Work
The answer isn't "never use AI." The answer is to sanitize your inputs. Strip out identifying details, replace real names with placeholders, and remove numbers that matter. Here's a prompt template you can copy and paste:
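One illustrative version of such a template (the placeholder names and wording here are examples; adapt them to your task):

```
I'm going to share a sanitized document. All names, companies, and
figures have been replaced with placeholders like [CLIENT_A],
[EMPLOYEE_1], and [AMOUNT_1]. Work with the placeholders as given;
do not ask for the real values.

Task: [describe what you want, e.g. "summarize the key obligations"]

Document:
[paste your sanitized text here]
```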
This takes 2 minutes of prep and eliminates most of the risk. Think of it like redacting a document before handing it to an outside consultant. Because that's exactly what you're doing.
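If you do this often, the placeholder step itself is easy to script. Here is a minimal sketch in Python, assuming regex-based matching; the patterns below catch only the most common email, phone, and SSN formats, and names still have to be replaced by hand:

```python
import re

# Illustrative patterns only; real PII detection needs far more than this.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\d{3}[-. ]?\d{3}[-. ]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def sanitize(text: str) -> str:
    """Replace matches with numbered placeholders like [EMAIL_1]."""
    for label, pattern in PATTERNS.items():
        counter = 0
        def repl(match):
            nonlocal counter
            counter += 1
            return f"[{label.upper()}_{counter}]"
        text = pattern.sub(repl, text)
    return text

# Note: "Jane" survives — names need manual placeholders.
print(sanitize("Contact Jane at jane.doe@example.com or 555-123-4567."))
# → Contact Jane at [EMAIL_1] or [PHONE_1].
```

Run the output past your own eyes before pasting; a script like this is a first pass, not a substitute for judgment.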
What IS Safe to Put Into AI Tools
Not everything is off-limits. Here's what you can generally use AI tools for without worry:
- Generic writing tasks. "Help me write a professional email declining a meeting" contains no sensitive data.
- Public information. Summarizing a published article, explaining a regulation, or researching a publicly known topic.
- Structural templates. "Give me a framework for a quarterly business review" reveals nothing about your business.
- Learning and explanation. "Explain the difference between FIFO and LIFO accounting" is just education.
- Sanitized data. Anything where you've replaced real details with placeholders before pasting.
The pattern is simple. If the content is generic, public, or thoroughly anonymized, you're fine. If it's specific, proprietary, or personally identifiable, stop and sanitize first.
A Note on "Enterprise" AI Tools
Enterprise versions of ChatGPT, Claude, and Gemini typically promise not to train on your data and offer stronger privacy protections. That's meaningful, but it doesn't mean anything goes. Your company's data classification policies still apply. Enterprise tools reduce the risk of data leaking into model training. They don't eliminate the risk of unauthorized data sharing with a third-party vendor.
Always check which tier your organization is paying for. The free version and the enterprise version often have very different data handling policies.
The bottom line: AI is a tool, not a confessional. The filter between your clipboard and the chat box is you. Sanitize first, paste second, and when in doubt, just don't.