Will AI Learn My Stuff? A Simple Guide to AI Privacy (Part 2 of 5)
Series: The Andraluma AI Primer, Part 2 of 5
The Andraluma Compass - By Marco Lam
In the first part of this series, we demystified what a Chat AI is. Now, we must address the single most important question that holds people back from using these powerful new tools: "If I type in a personal or business problem, is that information being learned, stored, and shared?"
It's a smart and vital question. As a CISSP-certified cybersecurity professional and business founder, I've spent my career managing sensitive data. Let me give you a simple, non-technical way to think about AI privacy, so you can interact with these tools confidently and safely.
The answer depends on whether you are in the "Public Library" or the "Private Library."
The Public Library (Free AI Tools)
Think of the free, publicly available versions of AI tools (like the standard ChatGPT) as a helpful librarian in a large public library. When you have a conversation, the librarian may use the general topics you discuss to become smarter and more helpful to everyone in the future. These systems are designed with privacy features intended to anonymise data and, over time, discard the specific details of your individual chat, but you should treat those safeguards as best-effort rather than a guarantee.
However, it's still a public space. You wouldn't discuss your secret business plan or your detailed personal financial information out loud in the middle of a library. The same logic applies here.
My professional advice is to treat these public tools like a postcard. Don't write anything on it that you wouldn't be comfortable with a stranger potentially seeing. For general knowledge, brainstorming, or creative writing, they are fantastic. For sensitive information, you need a different solution.
The Private Library (Paid or Business AI Tools)
Think of the paid, business-focused versions of AI (like Microsoft 365 Copilot or ChatGPT Enterprise) as hiring a private librarian for your company. This librarian has signed a strict, legally binding confidentiality agreement.
When you use these tools, your conversations and your company's data are kept within your secure, private environment. The AI company contractually commits that your information will not be used to train its public models. Your data remains your own. This is the secure option for working with confidential client information, internal strategy documents, or personal data.
You Are in Control
Understanding this distinction is the key to using AI safely and powerfully. Your responsibility is not to fear the tool, but to choose the right tool for the right task. For brainstorming with public information, the "Public Library" is a wonderful resource. For any work involving sensitive information, the "Private Library" is the only professional choice.
Now that we know how to interact safely, which AI should you choose? In Part 3 of this series, we'll compare the biggest names in the game: ChatGPT, Gemini, and Claude.
For Further Reading:
For those who wish to read the official policies and further guidance on AI and data privacy, these resources are the best starting points.
1. The Official Policy: OpenAI's Data Usage Explanation
Source: OpenAI (creators of ChatGPT)
Article: https://help.openai.com/en/articles/5722435-how-your-data-is-used-to-improve-model-performance
Connection: This is the primary source. It explains directly from the developer how user data is handled for their consumer service, providing the official details behind the "Public Library" analogy.
2. The Australian Privacy Watchdog: The OAIC
Source: The Office of the Australian Information Commissioner (OAIC)
Article: https://www.oaic.gov.au/newsroom/ai-and-privacy-what-you-need-to-know
Connection: The OAIC is the Australian Government's independent privacy authority. This resource provides the official Australian perspective on AI and privacy law, which is crucial context for local readers.
3. A Third-Party Perspective: TechCrunch
Source: TechCrunch (A leading technology publication)
Article: https://techcrunch.com/2023/04/05/chatgpt-privacy-controls/
Connection: This article from a respected tech publication provides an independent overview of the privacy controls available in tools like ChatGPT. It offers a helpful third-party perspective on how users can manage their own data.