Blogosfera Navas & Cusí


Artificial intelligence is revolutionizing the way we work, with tools such as ChatGPT and Claude becoming essential instruments for many professionals. However, when it comes to sensitive information and personal data, it is crucial to understand the privacy implications of their use. Many clients ask us about privacy and misuse, so let us look at the two most common providers.

Privacy Policies: OpenAI (ChatGPT) vs. Anthropic (Claude)

ChatGPT and data collection

OpenAI, the company behind ChatGPT, maintains a comprehensive personal data collection policy. In terms of account information, the system stores basic user data, including name, contact details, date of birth and payment information, along with all associated transaction history.

User content is another key aspect of data collection. This includes the totality of conversations held with the system, any files or images shared, and any additional information provided during the use of the service. In other words, unless we indicate otherwise, everything we type may be used to train the system, although the company commits to anonymizing the data.

Claude and its focus on privacy

Anthropic has taken a significantly different approach to user privacy. The most notable feature is its commitment not to use conversations for training future models, ensuring that both inputs and outputs remain strictly private. This policy is complemented by a more restrictive data retention system and greater transparency about the use of information. In practice, users do not hand over their data for training, regardless of whether they are paying users or not.

Use of data for training

The standard version of ChatGPT allows conversations to be reviewed by the OpenAI team and used to train future versions of the model. To address privacy concerns, OpenAI offers ChatGPT Team and Enterprise versions that do not use the data for training and provide greater control over the information shared.

Claude, on the other hand, guarantees by default that no conversations will be used for training, with no special subscription required. This confidentiality-by-design approach represents a significant advantage for users concerned about privacy and the handling of sensitive data.

Recommendations for professional use

The selection of an AI platform should be based on a careful assessment of the specific privacy needs of each organization.

The implementation of adequate protection measures is crucial. For ChatGPT users handling sensitive information, team subscription may be necessary, while Claude users can benefit from built-in privacy safeguards by default.
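One such protection measure, whatever platform is chosen, is to strip obvious personal identifiers from a prompt before it ever leaves the organization. The sketch below is a minimal illustration of that idea, not a complete anonymization solution: the patterns for emails, Spanish DNI numbers and phone numbers are simplified examples, and a real deployment would need far broader coverage (names, addresses, case references) plus legal review.

```python
import re

# Illustrative patterns only -- a real deployment needs broader coverage
# (names, addresses, case numbers, etc.) and legal review.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "DNI": re.compile(r"\b\d{8}[A-Za-z]\b"),   # Spanish national ID format
    "PHONE": re.compile(r"\b\d{9}\b"),         # 9-digit Spanish phone numbers
}

def redact(text: str) -> str:
    """Replace detected identifiers with placeholder tags before the
    text is sent to an external AI service."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Client with DNI 12345678Z (juan@example.com) reports a charge to 600123456."
print(redact(prompt))
# -> Client with DNI [DNI] ([EMAIL]) reports a charge to [PHONE].
```

Redacting locally in this way keeps the sensitive fields out of the provider's logs regardless of what its training policy says, which is a useful belt-and-braces complement to any subscription-level guarantee.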

However, we recommend caution in any situation, as black-box systems can spring unpleasant surprises through prompt hacking.

Conclusions

The fundamental difference between the two platforms lies in their approach to privacy. ChatGPT requires a special subscription (Team) or an explicit opt-out to guarantee that data is not used for training, while Claude offers these guarantees by default in all its versions.

It is necessary to strike a balance between taking advantage of the benefits of AI and maintaining information security. With the right measures and a privacy-conscious approach, these tools can be used securely in professional environments.

If you have any questions or feel that your content may have been affected by any of these policies, please do not hesitate to contact our firm. At Navas & Cusí, we have lawyers specialized in artificial intelligence, ready to advise you in this type of conflict.

Note: This article is based on the privacy policies in force in Europe as of November 2024. Any specific case should be analyzed individually.

Author
Navas & Cusí Abogados