Answers to Copilot questions you’ve always wondered about
With the arrival of Microsoft’s powerful AI tool, Copilot, we have been getting more questions about data security. If AI can gather all this information, what is it doing with it in the background?
Rest assured: your data is safe. It is protected by comprehensive, industry-leading compliance, security, and privacy controls. Copilot for Dynamics 365 follows a set of core security and privacy practices and is built around Microsoft’s Responsible AI Standard. Additionally, because Copilot runs in the Azure cloud, your data is secured and not shared with OpenAI.
Copilot and your data
To sum it up, you are in control of your data. Microsoft doesn’t share your data unless you’ve granted permission. Additionally, Copilot doesn’t use your customer data to train its AI features unless you consent to that use. Copilot adheres to your existing data permissions and policies. Microsoft Learn explains how your data is handled and how to control it: https://learn.microsoft.com/en-us/power-platform/faqs-copilot-data-security-privacy#copilot-in-dynamics-365-apps-and-power-platform
In short, your prompts (inputs) and Copilot’s responses (outputs or results):
- Are NOT available to other customers.
- Are NOT used to train or improve any third-party products or services (such as OpenAI models).
- Are NOT used to train or improve Microsoft AI models, unless your tenant admin opts into sharing data with them. Learn more at FAQ for optional data sharing for Copilot AI features in Dynamics 365 and Power Platform.
Should you opt in and share your data?
This is the Copilot question of the century. If you opt in and share, you will get better, faster, and more personalized results, but do you want to share your data? By default, the data sharing option is disabled. After you opt in, you can opt out at any time. To do this, you need access to the Power Platform admin center: simply toggle the “Data sharing for Dynamics 365 Copilot and Power Platform Copilot AI Features” setting on or off.
When you opt out, your data is no longer shared. You own your data, and Microsoft will only use it within the services you’ve agreed to. If you decide Copilot is not for you, your data will be deleted within 30 days of opting out.
Is Copilot always factual?
As with any generative AI, Copilot’s responses are not guaranteed to be 100 percent factual. They are educated guesses based on a machine’s understanding of human concepts. Copilot can process enormous amounts of information very quickly, but occasionally it misinterprets that information. As a bonus, the results are sometimes very entertaining and offer some insight into how our brains differ from computers.
Microsoft is committed to continuously improving the accuracy of responses, but as you might expect, we recommend that you thoroughly review any Copilot-generated information and use your best judgment before sharing it with others.
Microsoft’s development teams are actively addressing challenges such as misinformation, disinformation, content blocking, data safety, and the mitigation of harmful or discriminatory content, adhering strictly to responsible AI principles. There is plenty of guidance if you need it. For example, Microsoft Learn offers instructions, prompts, and reminders to review and edit responses, and to verify facts, data, and text for accuracy prior to use. Where possible, Copilot will cite its information sources, whether they are public or internal, so that you can confirm the accuracy of responses.
Ensuring responsible and safe AI usage
Copilot uses a robust content filtering system in the Azure OpenAI Service. It addresses harm categories such as Hate and Fairness, Sexual Content, Violence, Self-harm, and more. This filtering evaluates both the input prompts and the generated responses, using classification models to identify and block harmful content. While it can’t stop everything, it does a good job of ensuring that the results you get don’t fall into those categories. Here are more details on content filtering: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/content-filter?tabs=warning%2Cuser-prompt%2Cpython#harm-categories
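Curious what that filtering looks like under the hood? Below is a minimal sketch, assuming you call the Azure OpenAI Service directly with the openai Python SDK. The endpoint, API key, and deployment name are placeholders, and the content_filter_results field it inspects is part of Azure’s documented chat completions response, where each harm category reports whether it was filtered and a severity level.

```python
from openai import AzureOpenAI, BadRequestError

# Placeholder endpoint, key, and deployment name for illustration only.
client = AzureOpenAI(
    azure_endpoint="https://your-resource.openai.azure.com",
    api_key="YOUR_API_KEY",
    api_version="2024-06-01",
)

try:
    response = client.chat.completions.create(
        model="your-deployment-name",  # your Azure OpenAI deployment
        messages=[{"role": "user", "content": "Draft a follow-up email to a customer."}],
    )
    choice = response.choices[0]
    # Azure attaches per-category filter verdicts to each completion.
    # Fields beyond the base OpenAI schema are exposed via model_extra.
    filter_results = (choice.model_extra or {}).get("content_filter_results", {})
    for category, verdict in filter_results.items():
        # e.g. hate {'filtered': False, 'severity': 'safe'}
        print(category, verdict)
    print(choice.message.content)
except BadRequestError as err:
    # Prompts that trip the filter are rejected outright; the error
    # payload identifies the reason with the code "content_filter".
    print("Request blocked by content filtering:", err)
```

Copilot for Dynamics 365 applies this same service-side filtering for you automatically; the sketch simply shows the mechanism at work.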
If you have additional Copilot questions about how to best leverage the AI tool or about data privacy, Boyer & Associates has the answers. Give us a call and let’s discuss how we can help your business.