Who Sees Your AI Chats? Unpacking ChatGPT & Gemini Privacy Risks


Artificial intelligence (AI) tools like ChatGPT and Google Gemini offer significant utility, but they also present notable privacy challenges. A core concern is that most AI assistants retain comprehensive records of user conversations. These records are not only accessible on your devices but are also frequently stored online, sometimes indefinitely, raising risks of exposure through system bugs or security breaches. Furthermore, some AI providers may permit human review of these chat logs.

These practices warrant consideration, particularly if you intend to share personal thoughts or sensitive information with AI tools. To enhance privacy, users can adjust settings, utilize private conversation modes, or opt for AI assistants designed with privacy as a default.

A review of the privacy settings and policies of major AI assistants reveals important details about how they handle user data and the options available for managing it.

ChatGPT (OpenAI)

  • Default Data Use: By default, ChatGPT uses user data for AI model training. OpenAI advises that this training data may inadvertently include personal information.

  • Human Review: OpenAI’s FAQ states that conversations may be reviewed by humans to improve its systems.

  • Disabling AI Training: Users can opt out of their data being used for AI training by navigating to Settings > Data controls > Improve the model for everyone.

  • Private Chat Mode: A “temporary chat” mode is available, which keeps conversations out of your history and prevents them from being used for AI training. This can be activated by clicking “Turn on temporary chat” in the top-right corner.

  • Chat Sharing: Conversations can be shared by generating a shareable link. (OpenAI previously launched and then removed a feature that allowed search engines to index shared chats).

  • Targeted Advertising: OpenAI’s privacy policy states that it does not sell or share personal data for cross-context behavioral advertising, does not process personal data for targeted advertising, and does not use sensitive personal data to infer characteristics about consumers.

  • Data Retention: Temporary and deleted chats are retained for up to 30 days, though some may be kept longer for security and legal obligations. All other data is stored indefinitely.

Google Gemini

  • Default Data Use: Gemini utilizes user data for AI model training by default.

  • Human Review: Yes, human reviewers may access your chats. Google advises against entering any data you would not want a reviewer to see. Data seen by a reviewer is retained for up to three years, even if you delete your chat history.

  • Disabling AI Training: To disable AI training, visit myactivity.google.com/product/gemini, click the “Turn off” drop-down menu, and select either “Turn off” or “Turn off and delete activity.”

  • Private Chat Mode: There is no dedicated private chat mode. However, turning off Gemini Apps Activity will hide your chat history from the sidebar. Note that re-enabling this feature without deleting past data will make your history reappear.

  • Chat Sharing: Conversations can be shared by generating a shareable link.

  • Targeted Advertising: Google states it does not use Gemini chats to show ads, although its broader privacy policy would allow it to do so; the company says it will communicate any change to this approach.

  • Data Retention: Data is stored indefinitely, unless auto-deletion is enabled within Gemini Apps Activity.

Anthropic Claude

  • Default Data Use: Anthropic does not use conversations to train its AI models unless the user manually reports a conversation or opts into testing new features.

  • Human Review: No, human review is not performed unless conversations are flagged for violating usage policies.

  • Disabling AI Training: Not applicable due to its default policy.

  • Private Chat Mode: There is no specific private chat mode. Users must manually delete past conversations to remove them from their history.

  • Chat Sharing: Conversations can be shared via a generated link.

  • Targeted Advertising: Anthropic does not use conversations for targeted advertising.

  • Data Retention: Data is retained for up to two years, or seven years for prompts flagged for trust and safety violations.

Microsoft Copilot

  • Default Data Use: Microsoft uses user data to train its AI models.

  • Human Review: Yes. Microsoft’s privacy policy indicates it employs both automated and manual (human) methods for processing personal data.

  • Disabling AI Training: This option is somewhat hidden. Users can disable model training by clicking their profile image, then their name, navigating to Privacy, and disabling “Model training on text.”

  • Private Chat Mode: No dedicated private chat mode. Chats must be deleted individually or by clearing the entire history from Microsoft’s account page.

  • Chat Sharing: Conversations can be shared by generating a link. Shared links cannot be unshared without deleting the chat itself.

  • Targeted Advertising: Microsoft uses user data for targeted ads and has indicated plans to integrate advertising into its AI products. To opt out, click your profile image, then your name, navigate to Privacy, and disable “Personalization and memory.” A separate link lets you disable all personalized ads across your Microsoft account.

  • Data Retention: Data is stored for 18 months unless manually deleted.

xAI Grok

  • Default Data Use: Grok uses user data to train its AI models.

  • Human Review: Yes. Grok’s FAQ states that a “limited number” of “authorized personnel” may review conversations for quality or safety purposes.

  • Disabling AI Training: This can be disabled by clicking your profile image, then navigating to Settings > Data Controls, and disabling “Improve the Model.”

  • Private Chat Mode: A “Private” button at the top right allows users to keep a chat out of their history and prevent it from being used for AI training.

  • Chat Sharing: Conversations can be shared by generating a link. Shared links cannot be unshared without deleting the chat.

  • Targeted Advertising: Grok’s privacy policy states it does not sell or share information for targeted advertising purposes.

  • Data Retention: Private chats and even deleted conversations are stored for 30 days. All other data is stored indefinitely.

Meta AI

  • Default Data Use: Meta AI uses user data to train its AI models.

  • Human Review: Yes. Meta’s privacy policy indicates manual review is used to “understand and enable creation” of AI content.

  • Disabling AI Training: There is no direct opt-out. U.S. users can submit a specific form, while users in the EU and U.K. can exercise their right to object.

  • Private Chat Mode: No dedicated private chat mode is available.

  • Chat Sharing: Yes, but shared chats automatically appear in a public feed and may also show up in other Meta apps.

  • Targeted Advertising: Meta’s privacy policy indicates it targets ads based on collected information, including interactions with AI.

  • Data Retention: Data is stored indefinitely.

Perplexity

  • Default Data Use: Perplexity uses user data to train its AI models.

  • Human Review: Perplexity’s privacy policy does not mention human review.

  • Disabling AI Training: This can be disabled by navigating to Account > Preferences and disabling “AI data retention.”

  • Private Chat Mode: Yes, an “Incognito” mode can be enabled by clicking your profile icon and selecting it beneath your account name.

  • Chat Sharing: Conversations can be shared by generating a link.

  • Targeted Advertising: Yes. Perplexity states it may share information with third-party advertising partners and may collect data from other sources (e.g., data brokers) to improve ad targeting.

  • Data Retention: Data is retained until the user deletes their account.

Duck.AI

  • Default Data Use: Duck.AI does not use user data to train its AI models, thanks to agreements with major providers.

  • Human Review: No.

  • Disabling AI Training: Not applicable due to its default privacy policy.

  • Private Chat Mode: No dedicated private chat mode. Users must delete previous chats individually or all at once via the sidebar.

  • Chat Sharing: No, sharing chats is not an option.

  • Targeted Advertising: No, chats are not used for targeted advertising.

  • Data Retention: Model providers retain anonymized data for up to 30 days, unless required for legal or safety reasons.

Proton Lumo

  • Default Data Use: Proton Lumo does not use user data to train its AI models.

  • Human Review: No.

  • Disabling AI Training: Not applicable due to its default privacy policy.

  • Private Chat Mode: Yes, a private mode can be activated by clicking the glasses icon at the top right.

  • Chat Sharing: No, sharing chats is not an option.

  • Targeted Advertising: No, chats are not used for targeted advertising.

  • Data Retention: Proton does not store logs of your chats.