Executive Summary
A user without a Microsoft 365 Copilot license can interact with a custom Agent built in Copilot Studio that uses both public and company data sources. However, their access to company data will be strictly governed by their existing Microsoft Entra ID permissions to those specific data sources (e.g., SharePoint, Dataverse, uploaded PDFs). If they lack the necessary permissions or if the agent’s authentication is not configured to allow their access to internal resources, the agent will effectively be “blocked” from retrieving or generating responses from that internal company data for them. Their results will then be limited to what can be generated from public data sources, provided the agent is designed to handle such limitations gracefully and the query can be fulfilled solely from public information. It is crucial to note that interactions by unlicensed users with Copilot Studio agents, especially those using generative answers or internal data, will incur costs against the organization’s Copilot Studio message capacity.
Introduction
The rapid evolution of AI capabilities within Microsoft’s ecosystem, particularly with Microsoft 365 Copilot and Copilot Studio, has introduced powerful new ways to interact with information. However, this advancement often brings complexities, especially concerning licensing and data access. A common point of confusion arises when organizations deploy custom AI Agents built using Microsoft Copilot Studio, which can leverage a mix of public internet data and sensitive internal company information (such as PDFs, SharePoint documents, or Dataverse records). The central question for IT professionals is whether users who do not possess a Microsoft 365 Copilot license will be able to utilize these Agents, and if so, what limitations apply, particularly regarding access to proprietary company data. This report aims to demystify these interactions, providing a clear, definitive guide to the interplay of licensing, data grounding, and authentication for Copilot Studio Agents.
Understanding the Copilot Ecosystem
The Microsoft Copilot ecosystem comprises several distinct but interconnected components, each with its own purpose and licensing model. Understanding these distinctions is fundamental to clarifying access rights.
Microsoft 365 Copilot: The Enterprise Productivity AI
Microsoft 365 Copilot represents an advanced AI-powered productivity tool deeply integrated across the suite of Microsoft 365 applications, including Word, Excel, PowerPoint, Outlook, and Teams. Its primary function is to enhance user productivity by orchestrating Large Language Models (LLMs) with a user’s organizational data residing within Microsoft Graph and the content generated and managed within Microsoft 365 applications.1 This powerful synergy enables it to generate responses, summarize extensive content, draft emails, create presentations, and analyze data, all within the rich context of a user’s specific work environment.
To fully leverage Microsoft 365 Copilot, users must satisfy specific licensing prerequisites. This includes possession of an eligible Microsoft 365 or Office 365 base license, such as Microsoft 365 E3, E5, A3, A5, Business Standard, Business Premium, or Office 365 E3, E5, A3, A5, F1, F3, E1, Business Basic.2 Beyond this foundational license, a separate Microsoft 365 Copilot add-on license is required, typically priced at $30 per user per month.3 This add-on license is not merely an optional feature; it is essential for unlocking the full spectrum of capabilities, particularly its seamless ability to access and ground responses in a user's shared enterprise data and individual data that is indexed via Microsoft Graph.1
A cornerstone of Microsoft 365 Copilot’s design is its robust Enterprise Data Protection (EDP) framework. It operates strictly within the Microsoft 365 service boundary, ensuring that all user prompts, retrieved data, and generated responses remain securely within the organization’s tenant.1 Critically, this data is not utilized to train the foundational LLMs that power Microsoft 365 Copilot. Furthermore, the system rigorously adheres to existing Microsoft 365 permission models. This means that Microsoft 365 Copilot will “only surface organizational data to which individual users have at least view permissions”.1 This semantic index inherently respects user identity-based access boundaries, preventing unauthorized data exposure. This design implies a fundamental level of trust where Microsoft 365 Copilot acts as an intelligent extension of the user’s existing access rights within the Microsoft 365 ecosystem. This broad, personalized access to all relevant Microsoft 365 data, coupled with built-in security and privacy controls that mirror existing access permissions, represents a core differentiation from a more basic custom agent built in Copilot Studio. Consequently, organizations planning to deploy Microsoft 365 Copilot must first ensure their Microsoft 365 permission structures are meticulously managed and robust. Without proper governance, Copilot could inadvertently expose data based on over-permissioned content or previously “dark data,” underscoring the necessity for a well-defined data access strategy.
Microsoft Copilot Studio: The Custom Agent Builder
In contrast to the integrated nature of Microsoft 365 Copilot, Microsoft Copilot Studio serves as a low-code platform specifically engineered for the creation of custom conversational AI agents. These agents, often referred to as “copilots” or “bots,” are designed to answer specific questions, perform defined tasks, and integrate with a diverse array of data sources and external systems.5 A key strength of Copilot Studio is its versatility in deployment; agents can be published across multiple channels, including public websites, Microsoft Teams, and can even be configured to extend the capabilities of Microsoft 365 Copilot itself.5 The platform empowers makers to define explicit instructions, curate specific knowledge sources, and program actions for their agents.6
The agents developed within Copilot Studio possess a standalone operational nature. They can function independently, establishing connections to various data repositories. These include public websites, documents uploaded directly (such as PDFs), content residing in SharePoint, data within Dataverse, and other enterprise systems accessible via a wide range of connectors.5 This independent operation distinguishes them from the deeply embedded functionality of Microsoft 365 Copilot.
Despite their standalone capability, Copilot Studio agents are also designed for seamless integration with Microsoft 365 Copilot. They can be purpose-built to "extend Microsoft 365 Copilot" 6, allowing organizations to develop highly specialized agents. These custom agents can then leverage the sophisticated orchestration engine of Microsoft 365 Copilot, incorporating bespoke domain knowledge or executing specific actions directly within the broader Microsoft 365 Copilot experience. This positions Copilot Studio as a controlled gateway for data access. Unlike Microsoft 365 Copilot, which provides broad access to a user's Microsoft Graph data based on their existing permissions, Copilot Studio explicitly enables makers to select and configure precise knowledge sources.7 This granular control over what information an agent can draw upon is critical for effective governance. It makes Copilot Studio particularly suitable for scenarios where only specific, curated datasets should be exposed via an AI agent, even if the user might possess broader permissions elsewhere within the Microsoft 365 environment. This capability allows organizations to create agents that offer targeted access to internal knowledge bases for a wider audience, potentially including users who do not possess Microsoft 365 Copilot licenses, without inadvertently exposing the full breadth of their Microsoft Graph data. However, achieving this requires meticulous configuration of the agent's knowledge sources and authentication mechanisms.
Licensing Models for Copilot Studio Agents
A frequent area of misunderstanding pertains to the distinct licensing model governing Copilot Studio Agents, which operates separately from the Microsoft 365 Copilot user license. This fundamental distinction means that the agent itself incurs costs based on its usage, regardless of whether the individual interacting with it holds a Microsoft 365 Copilot license.
Copilot Studio’s Independent Licensing
Copilot Studio offers flexible licensing models tailored to organizational needs. The Pay-as-you-go model allows organizations to pay solely for the actual number of messages consumed by their agents each month, eliminating the need for upfront commitment. This model is priced at $0.01 per message 8, providing inherent flexibility and scalability to accommodate fluctuating usage patterns.9 Alternatively, organizations can opt for Message Packs, which are subscription-based, with one message pack equating to 25,000 messages per month for a cost of $200 per tenant.8 Should an agent's usage exceed the purchased message pack capacity, the pay-as-you-go meter automatically activates to cover the excess.8 It is important to note that unused messages from a message pack do not roll over to the subsequent month.8
A critical understanding for any deployment is that all interactions with a Copilot Studio agent that result in a generated response—defined as a “message”—contribute to these billing models. This applies unless specific exceptions are met, such as interactions by users holding a Microsoft 365 Copilot license within Microsoft 365 services, as detailed below. Consequently, interactions initiated by users who do not possess a Microsoft 365 Copilot license will directly consume from the organization’s Copilot Studio message capacity and, therefore, incur costs.8 This represents a significant operational cost consideration that is often overlooked. Even when an unlicensed user interacts with a Copilot Studio agent to query seemingly “free” public data, the organization still bears a per-message cost for the Copilot Studio service itself. This necessitates a careful evaluation of the anticipated usage by unlicensed users and the integration of these Copilot Studio message costs into the overall budget. Such financial implications can significantly influence decisions regarding the broad exposure of certain agents versus prioritizing Microsoft 365 Copilot licensing for frequent users who require access to internal data, thereby leveraging the benefits of zero-rated usage.
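To put these figures in perspective, the following sketch (Python, purely illustrative, not a billing tool) estimates a monthly bill from the list prices quoted above: $200 per 25,000-message pack and $0.01 per pay-as-you-go message, with overflow billed to the meter and no rollover. It deliberately ignores the fact that some agent capabilities may consume messages at higher rates per response, so actual invoices may differ.

```python
def estimate_monthly_cost(billable_messages: int, message_packs: int = 0) -> float:
    """Rough monthly Copilot Studio cost (USD) from the list prices cited in this
    report: $200 per 25,000-message pack, $0.01 per pay-as-you-go message,
    overflow billed to the meter, no rollover of unused pack capacity."""
    PACK_PRICE = 200.0        # USD per message pack, per tenant, per month
    PACK_CAPACITY = 25_000    # messages included in one pack
    PAYG_RATE = 0.01          # USD per message on the pay-as-you-go meter

    overflow = max(0, billable_messages - message_packs * PACK_CAPACITY)
    return message_packs * PACK_PRICE + overflow * PAYG_RATE


# Example: 60,000 billable messages in a month (e.g., largely from unlicensed users).
print(estimate_monthly_cost(60_000))                    # pay-as-you-go only: 600.0
print(estimate_monthly_cost(60_000, message_packs=2))   # 2 packs + 10,000 overflow: 500.0
print(estimate_monthly_cost(60_000, message_packs=3))   # 3 packs, no overflow: 600.0
```

Even this simplified comparison shows why anticipated volume matters: at sustained volumes above a pack's capacity, prepaid packs undercut the meter, while low or sporadic usage favors pay-as-you-go.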
Copilot Studio Use Rights with Microsoft 365 Copilot
For users who are provisioned with a Microsoft 365 Copilot license, a distinct advantage emerges in their interactions with Copilot Studio agents. Their usage of agents specifically built in Copilot Studio for deployment within Microsoft Teams, SharePoint, and Microsoft 365 Copilot (such as Copilot Chat) is designated as “zero-rated”.8 This means that interactions by these licensed users, when occurring within the context of Microsoft 365 products, do not count against the organization’s Copilot Studio message pack or pay-as-you-go meter. This zero-rating applies to classic answers, generative answers, and tenant Microsoft Graph grounding.8
Beyond cost benefits, the Microsoft 365 Copilot license also confers specific use rights within Copilot Studio. These rights include the ability to “Create and publish your own agents and plugins to extend Microsoft 365 Copilot”.8 This capability underscores a symbiotic relationship: users with Microsoft 365 Copilot licenses gain enhanced functionality and significant cost efficiencies when interacting with custom agents that are integrated within the Microsoft 365 ecosystem. This contrast in billing models highlights a clear financial incentive. If a substantial volume of agent usage involves internal data or generative answers, and the users engaged in these interactions already possess Microsoft 365 Copilot licenses, the organization benefits from the zero-rated usage, potentially leading to considerable cost savings. Conversely, if a large proportion of users are unlicensed, every message generated by the Copilot Studio agent will incur a direct cost. This situation presents a strategic licensing decision point for organizations. A thorough analysis of the user base and agent usage patterns is advisable. If widespread access to internal data via AI agents is a strategic priority, investing in Microsoft 365 Copilot licenses for relevant users can substantially reduce or eliminate the Copilot Studio message costs for those specific interactions within Microsoft 365 applications. This tiered access and cost model is crucial for informing the overall AI strategy and budget allocation, distinguishing between basic, publicly-grounded agents (which still incur Copilot Studio message costs for unlicensed users) and agents providing deep internal data insights (which are more cost-effective when accessed by Microsoft 365 Copilot licensed users within the Microsoft 365 environment).
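Expressed as a rule, an agent interaction is zero-rated only when both conditions hold: the interacting user holds a Microsoft 365 Copilot license, and the agent is being used within a Microsoft 365 surface (Teams, SharePoint, or Microsoft 365 Copilot Chat). The short sketch below simply encodes that rule as described in this section; the channel labels are this report's own shorthand, not values from any Microsoft API.

```python
# Illustrative labels for the Microsoft 365 surfaces named above.
M365_SURFACES = {"teams", "sharepoint", "m365_copilot_chat"}

def is_zero_rated(user_has_m365_copilot_license: bool, channel: str) -> bool:
    """True if an interaction does not consume Copilot Studio message capacity,
    per the zero-rating rule described in this section."""
    return user_has_m365_copilot_license and channel.lower() in M365_SURFACES

print(is_zero_rated(True, "Teams"))     # True  -> zero-rated
print(is_zero_rated(True, "website"))   # False -> bills against message capacity
print(is_zero_rated(False, "Teams"))    # False -> bills against message capacity
```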
Data Grounding and Knowledge Sources in Copilot Studio
Copilot Studio agents derive their intelligence and ability to provide relevant information from “knowledge sources,” which are meticulously configured to provide the data necessary for generative answers. The specific type of knowledge source selected directly dictates its authentication requirements and, consequently, determines who can access the information presented by the agent.
Supported Knowledge Sources
Copilot Studio agents can be grounded in a diverse array of data sources, enabling them to provide rich, relevant information and insights.7 The supported knowledge sources, summarized in the illustrative sketch after this list, include:
- Public Websites: Agents can be configured to search and return results from specific, predefined public URLs. Additionally, they can perform a broader general web search, drawing information from public websites indexed by Bing.7 Crucially, no authentication is required for public websites to serve as a knowledge source.7
- Documents (Uploaded Files/PDFs): Agents can search the content of documents, including PDFs, that have been uploaded to Dataverse, and generate responses based on the information contained in those documents.7 These are considered internal organizational sources.
- SharePoint: Agents can establish connections to specified SharePoint URLs, utilizing Microsoft Graph Search capabilities to retrieve and return relevant results from the SharePoint environment.7 This is a common internal data source for many organizations.
- Dataverse: The agent can connect directly to the configured Dataverse environment, employing retrieval-augmented generative techniques within Dataverse to synthesize and return results from structured data.7 This is a powerful internal data source for business applications.
- Enterprise Data using Connectors: Copilot Studio agents can connect to a wide array of connectors that facilitate access to organizational data indexed by Microsoft Search or other external systems.5 The platform supports over 1,400 Power Platform connectors, enabling integration with a vast ecosystem of internal and third-party services.5 These are fundamental internal data sources.
- Real-time Connectors: For specific enterprise systems like ServiceNow or Zendesk, real-time connectors can be added.10 In these configurations, Microsoft primarily indexes metadata, such as table and column names, rather than the raw data itself. Access to the actual enterprise data remains strictly controlled by the user’s existing access permissions within that specific enterprise system.10
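The sketch below summarizes the source types above using the classification adopted in this report: public versus internal, and whether retrieval depends on the user signing in with Microsoft Entra ID. The type and field names are invented for illustration only; Copilot Studio itself is configured through its maker portal, not through code like this.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KnowledgeSource:
    name: str
    is_internal: bool        # internal organizational data vs. public content
    requires_entra_id: bool  # retrieval depends on the user's Microsoft Entra ID sign-in

# Classification follows this report's own framing; labels are descriptive, not
# identifiers from Copilot Studio.
KNOWLEDGE_SOURCES = [
    KnowledgeSource("Public website",          is_internal=False, requires_entra_id=False),
    KnowledgeSource("Uploaded document (PDF)", is_internal=True,  requires_entra_id=True),
    KnowledgeSource("SharePoint",              is_internal=True,  requires_entra_id=True),
    KnowledgeSource("Dataverse",               is_internal=True,  requires_entra_id=True),
    KnowledgeSource("Enterprise connector",    is_internal=True,  requires_entra_id=True),
    KnowledgeSource("Real-time connector",     is_internal=True,  requires_entra_id=True),
]

def reachable_sources(user_is_signed_in: bool) -> list[str]:
    """Which configured source types the agent could even attempt to query for this
    user; per-item permissions are still enforced by the source system itself."""
    return [s.name for s in KNOWLEDGE_SOURCES
            if user_is_signed_in or not s.requires_entra_id]

print(reachable_sources(False))  # ['Public website']
print(reachable_sources(True))   # all six source types
```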
The Role of Authentication
Authentication plays an indispensable role in controlling access for agents that interact with restricted resources or sensitive information.11 Copilot Studio provides several authentication options to meet varying security requirements, including “No authentication,” “Authenticate with Microsoft” (leveraging Microsoft Entra ID), or “Authenticate manually” with various OAuth2 identity providers such as Google or Facebook.11
The choice of authentication directly impacts data accessibility:
- Public Data Access: If an agent is configured with the “No authentication” option, it is inherently limited to accessing only public information and resources.11 This configuration allows public website grounding to function without requiring any user sign-in.
- Internal Data Access: For knowledge sources containing sensitive internal data, such as SharePoint, Dataverse, or enterprise data accessed via connectors, authentication is explicitly required.7 These internal sources typically rely on the “Agent user’s Microsoft Entra ID authentication”.7 This means that the user interacting with the agent must successfully sign in with their Microsoft Entra ID account. Once authenticated, their existing Microsoft Entra ID permissions to the underlying data source are meticulously honored.1
This principle of permission inheritance is foundational. Microsoft 365 Copilot, and by extension Copilot Studio agents configured to access Microsoft 365 data, will "only surface organizational data to which individual users have at least view permissions".1 This fundamental security control ensures that the AI agent cannot inadvertently or intentionally provide information to a user that they would not otherwise be authorized to access directly. The user's existing permission boundary is therefore the ultimate gatekeeper for data access. The most significant factor in "blocking" access to internal company data is not the Copilot Studio agent's configuration itself, but rather the underlying permission structure within Microsoft 365, encompassing SharePoint permissions, Dataverse security roles, and granular file-level permissions. If a user lacks the requisite permissions to a specific document, SharePoint site, or Dataverse record, the agent is inherently unable to retrieve or generate information from that source for them, irrespective of the agent's own capabilities. This reinforces the importance of robust data governance and diligent permission management within an organization. The deployment of AI agents amplifies the necessity for a "least privilege" approach to data access: any data exposure via an agent would be a symptom of pre-existing permission vulnerabilities rather than a flaw in the agent's inherent security model.
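The permission-inheritance principle can be reduced to a few lines of conceptual code. The sketch below is not how Copilot Studio is implemented or configured; the user identities, URL, and helper functions are hypothetical stand-ins for authorization checks that SharePoint, Dataverse, and other sources enforce themselves. It simply makes the "at least view permissions" gate explicit.

```python
from typing import Optional

# Hypothetical stand-ins for checks performed by the underlying data sources;
# none of these names come from a Microsoft API.
VIEW_PERMISSIONS = {
    ("alice@contoso.com", "https://contoso.sharepoint.com/sites/hr/Leave-Policy.pdf"),
}

def user_has_view_permission(user: str, resource_url: str) -> bool:
    return (user, resource_url) in VIEW_PERMISSIONS

def fetch_as_user(user: str, resource_url: str) -> str:
    return f"<contents of {resource_url}, retrieved in {user}'s security context>"

def retrieve_grounding_content(user: str, resource_url: str) -> Optional[str]:
    """Permission inheritance in miniature: retrieval runs in the interacting
    user's security context, so the agent can only ground its answer in content
    that user could already open directly."""
    if not user_has_view_permission(user, resource_url):
        return None  # denied at the source; the agent has nothing to ground on
    return fetch_as_user(user, resource_url)

print(retrieve_grounding_content("alice@contoso.com",
                                 "https://contoso.sharepoint.com/sites/hr/Leave-Policy.pdf"))
print(retrieve_grounding_content("bob@contoso.com",
                                 "https://contoso.sharepoint.com/sites/hr/Leave-Policy.pdf"))  # None
```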
The Core Question: Unlicensed Users and Mixed Data Agents
Addressing the user’s central query directly, the behavior of a Copilot Studio Agent configured with mixed data sources (combining company data and public websites) when accessed by users without a Microsoft 365 Copilot license is nuanced but can be clearly defined.
Access to Public Data
Users who do not possess a Microsoft 365 Copilot license can indeed access information derived from public websites through a Copilot Studio Agent. This functionality is feasible under specific conditions: the agent must be explicitly configured to utilize public website knowledge sources.7 Furthermore, the agent’s authentication setting can be configured as “No authentication” 11, allowing access without requiring user sign-in, provided the query can be resolved solely from public sources. It is important to remember that even in this scenario, the organization will incur Copilot Studio message costs for these interactions.8 The mechanism for this access involves the agent searching public websites indexed by Bing when Web Search is enabled, a process that can occur in parallel with searches of specific public website knowledge sources configured for the agent.7
Access to Company Data (PDFs, SharePoint, etc.)
Access to internal company data, such as PDFs uploaded to Dataverse, SharePoint content, or data retrieved from enterprise connectors, is fundamentally governed by the individual user's existing permissions to those specific data sources.1 This is the primary blocking mechanism. If an unlicensed user (or, for that matter, any user regardless of their licensing status) lacks the necessary view permissions to the underlying document, SharePoint site, or Dataverse record, the agent will not be able to retrieve or generate responses from that data for them. The agent operates strictly within the security context and permission boundaries of the interacting user.
For agents configured to access internal data sources like SharePoint, Dataverse, or enterprise connectors, authentication is an explicit requirement.7 This typically involves setting the agent's authentication to "Authenticate with Microsoft," which mandates that the user interacting with the agent must sign in with their Microsoft Entra ID account. Upon successful authentication, the user's existing Microsoft Entra ID permissions are rigorously checked against the internal data source.7
Therefore, for unlicensed users, the outcome is clear: they will be effectively "blocked" from accessing company data via the agent if they lack the necessary permissions to the underlying data source. Blocking also occurs if the agent's authentication configuration restricts their access (for example, if the agent is set to "Authenticate with Microsoft" and the user does not or cannot sign in) or if organizational policies prevent their access to internal resources. In scenarios where an agent is configured with mixed data sources and a query requires internal data for which the user is unauthorized, the agent's response will be limited to what can be generated from the public data sources. This is only possible if the agent has been specifically designed to gracefully handle such access denials and if the query can be adequately fulfilled using only public information. This graceful degradation is a critical aspect of agent design.
It is also important to understand the nuance regarding Microsoft Graph grounding and Enterprise Data Protection (EDP). While Copilot Studio agents can be configured to access specific SharePoint sites or Graph connectors 4, the broader, seamless access to a user's shared enterprise data, individual data, or external data indexed via Microsoft Graph connectors is a core capability fundamentally tied to the Microsoft 365 Copilot license.1 For users who do not possess a Microsoft 365 Copilot license, "Copilot Chat can't access the user's shared enterprise data, individual data, or external data indexed via Microsoft Graph connectors".4 This means that while a Copilot Studio agent could be configured to provide access to a specific, shared SharePoint site for an unlicensed user (provided authentication and permissions are met, and Copilot Studio metering is enabled), the unlicensed user will not experience the personalized, broad Graph-grounded capabilities that a Microsoft 365 Copilot licensed user would. Furthermore, the "Enhanced search results" feature within Copilot Studio, which leverages semantic search to improve the quality of results from SharePoint and connectors, also necessitates a Microsoft 365 Copilot license within the same tenant and requires the agent's authentication to be set to "Authenticate with Microsoft".7
The distinction between results being “limited” or “blocked” is crucial. While access to company data is generally “blocked” if permissions are not met, the results can be “limited” to public data if the agent is intelligently programmed to fall back to publicly available information. This highlights a critical imperative for agent design: developers building agents with mixed data sources must explicitly consider and implement how the agent behaves when internal data access is denied for a given user. This requires robust error handling and conditional logic within the agent’s design. If an agent attempts to access internal data for a user and is denied due to insufficient permissions, it should be programmed either to inform the user clearly about the access restriction or to attempt to answer the query using only the public data sources, if applicable and relevant. This proactive approach ensures a significantly better user experience, preventing hard failures or ambiguous responses. This extends beyond mere technical feasibility to encompass user experience design and effective governance, as an agent that silently fails or provides incomplete answers without explanation can lead to user frustration and erode trust in the system. Clear communication regarding data access limitations is therefore essential.
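In concrete terms, the design imperative described above is a fallback pattern. The sketch below is purely illustrative; in Copilot Studio this behavior is authored with topics, conditions, and generative answer nodes rather than code, and every name in the example is hypothetical. It shows the shape of the "inform the user or pivot to public sources" logic.

```python
from typing import Callable, Optional

def answer_query(query: str, user: str,
                 internal_lookup: Callable[[str, str], Optional[str]],
                 public_lookup: Callable[[str], Optional[str]]) -> str:
    """Fallback pattern for a mixed-source agent: prefer internal knowledge, but
    if the user is not authorized (or not signed in), either answer from public
    sources with an explicit caveat or state the restriction clearly, rather
    than failing silently."""
    internal = internal_lookup(query, user)  # None signals denied or unavailable
    if internal is not None:
        return internal

    public = public_lookup(query)
    if public is not None:
        return ("Note: this answer draws on public sources only; related "
                "internal content requires permissions you do not have.\n" + public)

    return ("I can't answer that: the relevant internal content requires permissions "
            "you do not currently have, and no public source covers the question.")


# Tiny demo with stub knowledge sources (all names here are hypothetical).
internal_kb = lambda q, u: ("Internal: travel policy v3 caps hotel rates at..."
                            if u == "alice@contoso.com" else None)
public_kb = lambda q: "Public: general guidance on corporate travel expenses..."

print(answer_query("What is our travel policy?", "alice@contoso.com", internal_kb, public_kb))
print(answer_query("What is our travel policy?", "bob@contoso.com", internal_kb, public_kb))
```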
The following table summarizes user access capabilities to mixed-data agents based on their Microsoft 365 Copilot license status:
| Feature/Scenario | User has Microsoft 365 Copilot License | User does NOT have Microsoft 365 Copilot License |
|---|---|---|
| Access to Public Website Data (via Copilot Studio Agent) | Yes (Agent configured for public sources) | Yes (Agent configured for public sources, “No authentication” option possible) |
| Access to Company Data (PDFs, SharePoint, Dataverse, etc. via Copilot Studio Agent) | Yes (Subject to user’s existing Microsoft Entra ID permissions to data sources; agent authentication required) | Yes (Subject to user’s existing Microsoft Entra ID permissions to data sources; agent authentication required; access is specific to configured Copilot Studio sources, not broad Graph access) |
| Seamless Access to User’s Shared Enterprise/Individual Data (via Microsoft Graph) | Yes (Core capability of M365 Copilot)1 | No (Copilot Chat cannot access user’s shared enterprise/individual data indexed via Microsoft Graph connectors)4 |
| “Enhanced Search Results” (Semantic Search for SharePoint/Connectors in Copilot Studio) | Yes (Requires M365 Copilot license in tenant & “Authenticate with Microsoft” for agent)7 | No (Feature requires M365 Copilot license in tenant)7 |
| Copilot Studio Message Billing for Agent Interactions | Zero-rated when used within Microsoft 365 products (Teams, SharePoint, Copilot Chat)8 | Incurs Copilot Studio message costs (Pay-as-you-go or Message Packs)8 |
| Ability to Extend Microsoft 365 Copilot with Custom Agents/Plugins | Included use rights with M365 Copilot license8 | Not included8 |
Conclusion and Recommendations
The analysis demonstrates that users without a Microsoft 365 Copilot license are not entirely excluded from interacting with custom Agents built in Copilot Studio that leverage both public and company data sources. However, their access is critically contingent upon several factors, primarily their existing permissions to internal data and the authentication configuration of the Copilot Studio agent. While public data can generally be accessed without explicit user authentication (though still incurring Copilot Studio message costs for the organization), access to internal company data is strictly governed by the user’s Microsoft Entra ID permissions. If these permissions are insufficient, the agent will effectively be prevented from retrieving that sensitive information for the user.
Organizations deploying Copilot Studio Agents with mixed data sources should consider the following recommendations to ensure optimal functionality, security, and cost management:
- Prioritize Robust Data Governance: The foundational security principle is that Copilot Studio Agents, like Microsoft 365 Copilot, honor existing user permissions. Therefore, a meticulous review and ongoing management of permissions on SharePoint sites, Dataverse environments, and other internal data sources are paramount. This proactive approach prevents unintended data exposure and ensures that agents only surface information to authorized individuals.1
- Implement Strategic Authentication: Configure Copilot Studio agent authentication settings carefully based on the data sources employed. For agents accessing internal company data, “Authenticate with Microsoft” should be enabled to leverage Microsoft Entra ID and enforce user-specific permissions. For agents relying solely on public information, “No authentication” can be used, but with an understanding of the associated Copilot Studio message costs.7
- Design Agents for Graceful Degradation: When developing agents that combine public and internal data sources, incorporate robust error handling and conditional logic. If an agent attempts to access internal data for an unauthorized user, it should be programmed to either clearly inform the user of the access restriction or intelligently pivot to providing information solely from public sources, if the query allows. This approach enhances the user experience and maintains trust in the agent’s capabilities.
- Manage Copilot Studio Costs Proactively: Interactions with a Copilot Studio agent consume messages that are billed against the organization's Copilot Studio capacity (Pay-as-you-go or Message Packs), except where the zero-rating for Microsoft 365 Copilot licensed users within Microsoft 365 products applies.8 Organizations should closely monitor message consumption, pay particular attention to usage by unlicensed users, and factor these costs into their budget.
- Leverage Microsoft 365 Copilot Licenses Strategically: For scenarios requiring extensive, personalized access to Microsoft Graph-grounded enterprise data via agents within Microsoft 365 applications (Teams, SharePoint, Copilot Chat), licensing users for Microsoft 365 Copilot offers significant benefits, including zero-rated Copilot Studio message usage for those interactions.8 This can lead to substantial cost optimization for high-usage internal data scenarios.
- Manage User Expectations: Clearly communicate to users what an agent can and cannot provide based on their licensing and permissions. Transparency helps manage expectations and reduces frustration when access to certain internal data is restricted.
By adhering to these recommendations, organizations can effectively deploy Copilot Studio Agents, maximizing their utility across diverse user groups while maintaining stringent control over data access and managing operational costs efficiently.
Works cited
- Data, Privacy, and Security for Microsoft 365 Copilot, accessed on July 3, 2025, https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy
- App and network requirements for Microsoft 365 Copilot admins, accessed on July 3, 2025, https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-requirements
- Microsoft 365 Copilot: Licensing, Pricing, ROI – SAMexpert, accessed on July 3, 2025, https://samexpert.com/microsoft-365-copilot-licensing/
- Frequently asked questions about Microsoft 365 Copilot Chat, accessed on July 3, 2025, https://learn.microsoft.com/en-us/copilot/faq
- Customize Copilot and Create Agents | Microsoft Copilot Studio, accessed on July 3, 2025, https://www.microsoft.com/en-us/microsoft-copilot/microsoft-copilot-studio
- Copilot Studio overview – Learn Microsoft, accessed on July 3, 2025, https://learn.microsoft.com/en-us/microsoft-copilot-studio/fundamentals-what-is-copilot-studio
- Knowledge sources overview – Microsoft Copilot Studio, accessed on July 3, 2025, https://learn.microsoft.com/en-us/microsoft-copilot-studio/knowledge-copilot-studio
- Copilot Studio licensing – Learn Microsoft, accessed on July 3, 2025, https://learn.microsoft.com/en-us/microsoft-copilot-studio/billing-licensing
- Billing rates and management – Microsoft Copilot Studio, accessed on July 3, 2025, https://learn.microsoft.com/en-us/microsoft-copilot-studio/requirements-messages-management
- Add real-time knowledge with connectors – Microsoft Copilot Studio, accessed on July 3, 2025, https://learn.microsoft.com/en-us/microsoft-copilot-studio/knowledge-real-time-connectors
- Configure user authentication in Copilot Studio – Learn Microsoft, accessed on July 3, 2025, https://learn.microsoft.com/en-us/microsoft-copilot-studio/configuration-end-user-authentication
- FAQ for Copilot data security and privacy for Dynamics 365 and Power Platform, accessed on July 3, 2025, https://learn.microsoft.com/en-us/power-platform/faqs-copilot-data-security-privacy
- Copilot Studio security and governance – Learn Microsoft, accessed on July 3, 2025, https://learn.microsoft.com/en-us/microsoft-copilot-studio/security-and-governance