Closing the Deal with Data Security: How Microsoft 365 Copilot’s Data Protection Practices Give It an Edge Over OpenAI

Companies are quickly learning that AI doesn't just scale solutions - it also scales risks. Take what happened at Samsung: multiple employees unintentionally leaked confidential source code and sensitive data to ChatGPT. That information could then be used to train the chatbot further and surface in its future replies to other users.

When customers come across cases like this, they naturally raise concerns about AI tools, questioning whether the benefits can be harnessed without compromising reputation, security, or compliance. Microsoft Partners hear these concerns when selling Microsoft 365 Copilot, and they often have to explain why it is a more secure and responsible AI tool for businesses than OpenAI's offerings. In this blog, we will guide you on how to address these customer concerns when selling Microsoft 365 Copilot, and how to win customers over with its strong focus on data security. Let's get going!

“Is My Personal and Company Data Used for Training Microsoft Copilot?”

No. Microsoft 365 Copilot does not use your personal or company data to train or improve Copilot or its AI features, nor is that data used to train third-party products or services, such as those from OpenAI. The only exception is if your tenant administrator has opted into optional data sharing, and that decision remains under your organization's control.

How to Win Them Over Compared to OpenAI

Samsung's employees learned this the hard way: by default, ChatGPT saves chat history and uses those conversations for further model training, whether it is used for personal or work-related matters. Personal questions, shared company files, email responses to clients, discussions about code errors - these chats could be about anything, which creates a real risk of unintentionally disclosing sensitive or private information, especially when employees use ChatGPT Free or ChatGPT Plus.

In response to these concerns, OpenAI recently introduced ChatGPT Team and ChatGPT Enterprise. These new versions are designed for organizations and come with enhanced security and privacy measures. The big change here is that user conversations or company data won’t be used for the AI learning process.

A key point to highlight for potential customers is that their employees may well continue to use ChatGPT Free or ChatGPT Plus, with all the associated risks. There is also still a 'but' with ChatGPT Team and ChatGPT Enterprise: even though these tiers do not use customer data to train OpenAI's models, they have not fully addressed concerns about third parties. OpenAI ensures that user data isn't shared with third parties for marketing purposes, but it is not entirely transparent about how that data is otherwise handled.

“Is Microsoft 365 Copilot GDPR Compliant?”

Yes. Microsoft takes GDPR compliance for Microsoft 365 Copilot seriously and is notably transparent about its data-processing practices.

GDPR requires openness about what data is collected, and where and how it is used. For Microsoft 365 Copilot, customers can find detailed explanations and diagrams of these practices on Microsoft Learn. Microsoft is open about this information and does not conceal it, even where it may not flatter the product. This level of transparency enables organizations to understand precisely how their data is handled and to assess any potential concerns. For example:

Microsoft 365 Copilot ensures that prompts, responses, and data accessed through Microsoft Graph aren't used to train large language models, so a company's confidential data will not end up in responses to other users. At the same time, Microsoft is transparent about the fact that when a user interacts with Copilot in Microsoft 365 apps such as Word, PowerPoint, Excel, OneNote, and Loop, it stores data related to these interactions: the user's prompt, Copilot's response, and the information used to support that response.

While prospects may question the necessity and extent of this data storage under GDPR, it is worth explaining that the data is processed and stored under the same contractual commitments that apply to the organization's other content in Microsoft 365. Microsoft 365 admins also have tools to manage this data, such as Microsoft Purview, which can set retention policies for Copilot interaction data. Microsoft is equally clear that companies retain full control over their data, which stays within their tenant boundary.
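For partners who want to show this control in practice, the Purview retention settings mentioned above can be scripted. The sketch below uses Security & Compliance PowerShell (the ExchangeOnlineManagement module) and assumes, per Microsoft's documentation at the time of writing, that Copilot interactions fall under the Teams chats retention location; the policy name, admin account, and 180-day duration are illustrative, not prescriptive.

```shell
# Connect to Security & Compliance PowerShell (requires the
# ExchangeOnlineManagement module and a tenant admin account).
Connect-IPPSSession -UserPrincipalName admin@contoso.com

# Create a retention policy for the Teams chats location, which
# Microsoft documents as covering Copilot interaction data.
New-RetentionCompliancePolicy -Name "Copilot interaction retention" `
    -TeamsChatLocation All -Enabled $true

# Attach a rule: retain interaction data for 180 days, then delete it.
New-RetentionComplianceRule -Policy "Copilot interaction retention" `
    -RetentionDuration 180 -RetentionComplianceAction KeepAndDelete
```

Admins can achieve the same result through the Microsoft Purview portal; the script is simply a repeatable way to demonstrate the control during a customer conversation.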

GDPR also grants the 'right to be forgotten'. Since Microsoft 365 Copilot does not use data for training, Microsoft can completely hard-delete a user's history of interactions with Copilot, including the user account. Once deleted, this data cannot be restored, and everything linked to the user account is permanently removed from the Microsoft cloud.

For European Union companies concerned about the potential transfer of their company data outside the EU, it's important to highlight that Microsoft Copilot offers additional safeguards to ensure GDPR compliance and adherence to Microsoft's EU Data Boundary. This means that the personal and company data of EU customers is securely stored and processed within the EU, alleviating any data residency concerns.

It is also worth mentioning that Microsoft takes organizational concerns about data protection seriously, stating: "We prioritize an open dialogue with our customers, partners, and regulatory authorities to better understand and address concerns, thereby fostering an environment of trust and cooperation."

How to Win Them Over Compared to OpenAI

Like many companies these days, OpenAI says it is GDPR compliant. While this sounds good on paper, it is important to point out to potential customers that OpenAI faces several ongoing investigations, primarily over the transparency and legality of its data-processing practices. These investigations highlight genuine concerns about OpenAI's GDPR compliance that should be taken into account.

GDPR requires companies to be open about how they collect and use personal data. However, OpenAI does not provide clear information about the rules it follows for gathering and processing data. Without a clear understanding of where and how their data is used, companies risk losing control over confidential or sensitive information.

Under GDPR, data may only be used as necessary, for specified, explicit, and legitimate purposes. With OpenAI, there is concern that data might be used for purposes beyond those it was collected for.

OpenAI also does not offer a full 'right to be forgotten.' Because its AI is continuously learning and evolving, protecting privacy or remedying issues such as the accidental disclosure of confidential company information becomes particularly complex. It is worth mentioning to potential customers that OpenAI acknowledges these technical intricacies and states that it is "not able to delete specific prompts from your history." Consequently, it advises users against sharing any sensitive or private information with the AI. Companies can train employees on safe usage, but the risk of an unintentional leak that cannot be undone remains.

OpenAI's recent decision to relocate its legal entity to Dublin, Ireland, as part of its commitment to GDPR compliance is indeed a noteworthy step. They aim to closely align with GDPR and enhance privacy oversight. With the updated Privacy Policy for users in the EEA, Switzerland, and the UK, effective from February 15, 2024, one significant change is the appointment of OpenAI Ireland Limited as the data controller. This change may give many customers the impression of GDPR compliance concerning data processing within the EU.

The winning point here is that parts of the privacy policy still undercut this change and raise eyebrows. The new policy plainly states, "We WILL transfer your data to recipients outside of the EEA, Switzerland, and the UK," and that the "third country may not offer the same level of data protection as your home country." This is not a mere possibility of data leaving the EU, but a stated practice that sits uneasily with GDPR's data protection requirements.

Discussions have also raised OpenAI's new Europe Terms of Use, in which OpenAI states, "If you do not agree to the changes, you must stop using our Services." This all-or-nothing choice may raise concerns about alignment with GDPR principles, including clear consent, data portability, data retention, and the lawful basis for data processing. It also suggests that OpenAI may be unwilling to engage in dialogue and address organizational concerns.

“Does Microsoft 365 Copilot Share My Company's Data with Third Parties?”

No. Microsoft 365 Copilot prioritizes the security and privacy of your company's data. Your data is not shared with any third party unless you explicitly grant permission to do so.

Companies can enhance Microsoft 365 Copilot's functionality with plugins that enable Copilot to access and utilize third-party applications and services. These third-party apps and services may have their own policies and practices for sharing data with third parties. Therefore, companies should be aware that when using such plugins, they may be subject to the privacy practices and policies of the specific third-party applications, which are separate from those of Microsoft 365 Copilot.

How to Win Them Over Compared to OpenAI

OpenAI's approach to data sharing varies between their different services. For ChatGPT Team and ChatGPT API, OpenAI mentions that they share data with third-party contractors under strict confidentiality and security obligations, primarily for the purpose of reviewing for abuse and misuse. However, they do not provide information on how they share data with third parties in ChatGPT Enterprise, which is tailored for business use. This lack of transparency in data sharing practices can be a point of concern for potential customers.

Microsoft Partners can also use OpenAI's privacy policy to win over customers concerned about third-party data sharing. The policy allows OpenAI to share data with third parties without notifying users, unless notification is legally required.

This means a company's sensitive data could be handled by others without the company ever knowing. GDPR, however, is strict on this point: users must be informed of third-party data sharing and must give clear consent to it, so OpenAI's policy does not fully align with the regulation.

Takeaways & Next Steps

By showcasing the advantages of Microsoft 365 Copilot and its commitment to data privacy and GDPR compliance, you present a compelling selling point to customers concerned about data protection. Microsoft Partners can also draw attention to OpenAI's data security concerns regarding transparency and GDPR compliance. By highlighting Microsoft 365 Copilot's edge in these areas, you can position it as a more secure and responsible AI tool for businesses.

So, go ahead, feel confident in showcasing Microsoft 365 Copilot's data protection capabilities, and seize the opportunity to close more deals. If you need help with pre-sales activities, learn more about how we can assist you with support, demos, presentations, and Proof of Concepts for your clients.

Want to learn more about Copilot and get other exclusive content straight to your inbox? Subscribe to our newsletter now and stay ahead of the curve!