Microsoft Doesn’t Want Employees Sharing Sensitive Data With ChatGPT

Written by Staff
Microsoft may be going all-in on OpenAI tech and ChatGPT, but that doesn’t mean the company wants sensitive information shared with it.

Microsoft is rolling out ChatGPT across multiple products and has no objection to its own employees using the tech. However, the company wants to make sure no sensitive information is shared with the AI.

“Please don’t send sensitive data to an OpenAI endpoint, as they may use it for training future models,” a senior engineer wrote in an internal post that was reviewed by Business Insider.

The memo illustrates one of the biggest challenges of deploying large language models: controlling what information they have access to, and how that information will be used once it is shared.

ChatGPT is a conversational AI that learns from its interactions and from what people type into it. It’s not surprising, then, that Microsoft wants to keep sensitive information away from it, since the AI could end up reproducing that information in its responses to other users.
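For teams that want to enforce a rule like this in software, one common approach is to put a redaction filter between employees and any external endpoint. The Python sketch below is a minimal illustration of that idea, not Microsoft’s actual tooling; the patterns, the placeholder format, and the `scrub` helper are assumptions made for the example.

```python
import re

# Regexes for a few common kinds of sensitive data. Both the patterns and the
# placeholder policy are illustrative assumptions, not Microsoft's actual rules.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "api_key": re.compile(r"\b(?:sk|key)-[A-Za-z0-9]{16,}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace anything matching a sensitive pattern with a labeled
    placeholder before the prompt leaves the corporate network."""
    for label, pattern in SENSITIVE_PATTERNS.items():
        prompt = pattern.sub(f"[REDACTED {label.upper()}]", prompt)
    return prompt

if __name__ == "__main__":
    raw = "Our staging key is sk-abcdef1234567890abcd; email ops@example.com if it rotates."
    print(scrub(raw))
    # Our staging key is [REDACTED API_KEY]; email [REDACTED EMAIL] if it rotates.
```

A real deployment would pair pattern matching with data classification and human review, since regexes alone will miss plenty of sensitive content.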

“Human beings sign NDAs and consequently have incentives to be careful in how they share information. But large language models such as ChatGPT do not have the ability to reason about such issues, at least by default,” Vincent Conitzer, Carnegie Mellon University computer science professor and director of its AI lab, told Insider.

Microsoft’s caution is one other companies would do well to imitate.
