Here is how Microsoft described the security flow:
1. In the pre-processing stage, information flows from the Office applications into your internal Graph data.
2. Copilot builds a prompt and sends it to a large language model (e.g. ChatGPT).
(A prompt is the instruction telling the LLM what to do, the same as what you type into ChatGPT.)
3. The response comes back from the LLM through a "Grounding" stage, which saturates it with your internal data.
4. The response is sent back to the respective application.
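The four stages above can be sketched as a simple pipeline. This is only an illustrative model of the described flow; every function name and data structure here (pre_process, build_prompt, call_llm, ground, copilot_flow) is hypothetical and does not reflect Microsoft's actual API.

```python
# Illustrative sketch of the four-stage Copilot flow described above.
# All names are hypothetical placeholders, not Microsoft's real interfaces.

def pre_process(user_request, graph_data):
    """Stage 1: enrich the request with matching internal Graph data."""
    context = [item for item in graph_data if item["topic"] in user_request]
    return {"request": user_request, "context": context}

def build_prompt(processed):
    """Stage 2: Copilot turns the enriched request into an LLM prompt."""
    return f"{processed['request']} | context items: {len(processed['context'])}"

def call_llm(prompt):
    """Stand-in for the hosted LLM; returns a generic draft answer."""
    return f"draft answer for: {prompt}"

def ground(response, processed):
    """Stage 3: saturate the LLM response with your internal data."""
    names = ", ".join(item["name"] for item in processed["context"])
    return f"{response} [grounded with: {names}]" if names else response

def copilot_flow(user_request, graph_data):
    """Stage 4: run the whole pipeline and return the response to the app."""
    processed = pre_process(user_request, graph_data)
    draft = call_llm(build_prompt(processed))
    return ground(draft, processed)

graph = [{"topic": "sales", "name": "Q3-sales.xlsx"}]
print(copilot_flow("summarize sales", graph))
```

Note that in this sketch the raw internal data never leaves the grounding layer as part of the prompt; only the final response is saturated with it, mirroring the description above.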
There is no official Copilot release date yet, but judging by the rumors, we may see it become available sometime this summer.