Microsoft PL-900 Power Platform Fundamentals Exam Dumps and Practice Test Questions Set 15 Q211 — 225

Question 211

A company needs to create a chatbot that handles common HR questions about benefits and time off policies. Which Power Platform tool should be used?

A) Power Apps canvas app

B) Power Virtual Agents

C) Power Automate desktop flow

D) Power BI dashboard

Answer: B

Explanation:

Automating responses to frequently asked questions improves service efficiency and provides instant support availability. Conversational AI platforms enable creating intelligent chatbots without extensive coding.

Power Virtual Agents enables creating AI-powered chatbots that handle common inquiries through natural conversation. Users design conversation topics defining question variations and bot responses, configure authentication for secure information access, integrate with backend systems to retrieve personalized data, and deploy bots to websites, Microsoft Teams, or other channels. For HR scenarios, bots answer policy questions, guide employees through processes, and escalate complex issues to human agents.

Power Apps canvas app creates interactive applications for data entry and display but does not provide conversational AI capabilities. Canvas apps require users to navigate forms and controls rather than having natural language conversations.

Power Automate desktop flow automates repetitive desktop tasks but does not create conversational interfaces. Desktop flows perform robotic process automation rather than handling user inquiries through chat.

Power BI dashboard visualizes data through charts and reports but does not provide conversational interaction. Dashboards display information but do not answer questions through dialogue.

Power Virtual Agents implementation involves creating a bot in the Power Virtual Agents portal, designing topics that represent conversation subjects like benefits enrollment or PTO requests, adding trigger phrases that activate topics when users ask related questions, building conversation paths using message nodes, question nodes, and condition branches, and integrating actions that call Power Automate flows for data retrieval or process execution.
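The trigger-phrase idea above can be sketched in plain Python. This is an illustrative simplification only — the topic names and phrases are made up, and Power Virtual Agents uses its own natural language understanding rather than substring matching:

```python
# Hypothetical HR topics, each with trigger phrases that activate it.
TOPICS = {
    "benefits_enrollment": ["enroll in benefits", "benefits enrollment", "sign up for health insurance"],
    "pto_request": ["request time off", "pto request", "book vacation days"],
}

def match_topic(user_message: str) -> str:
    """Return the first topic whose trigger phrase appears in the message."""
    text = user_message.lower()
    for topic, phrases in TOPICS.items():
        if any(phrase in text for phrase in phrases):
            return topic
    return "fallback"  # unmatched questions escalate to a human agent

print(match_topic("How do I request time off next week?"))  # pto_request
print(match_topic("Tell me about parking"))                 # fallback
```

The fallback branch mirrors the fallback topic described below: anything the bot cannot match is handed to a person rather than answered incorrectly.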

Conversation design includes greeting messages welcoming users and explaining bot capabilities, question nodes collecting information from users, condition branches routing conversations based on user responses, entity extraction identifying key information like dates or employee IDs, and fallback topics handling questions the bot cannot answer by escalating to humans.

Bot capabilities include natural language understanding interpreting user intent from varied phrasings, context maintenance tracking conversation history, suggested actions presenting options to users, integration with authentication services for secure access to personal information, and analytics showing common questions and conversation success rates informing continuous improvement.

Question 212

An organization needs to automate the process of extracting text from scanned documents and storing it in Dataverse. Which AI Builder capability should be used?

A) Prediction model

B) Text recognition

C) Object detection

D) Category classification

Answer: B

Explanation:

Converting scanned documents and images to machine-readable text enables automation and searchability. Optical character recognition technology extracts text from visual formats.

Text recognition in AI Builder extracts printed and handwritten text from images and PDF documents using optical character recognition technology. The pre-built model requires no training and works immediately, processing images or documents uploaded from Power Automate flows or Power Apps. For document automation workflows, text recognition extracts text from scanned invoices, forms, contracts, or receipts, enabling downstream processing such as data extraction or searchable storage in Dataverse.

Prediction model creates custom machine learning models that forecast outcomes based on historical data but does not extract text from images. Prediction addresses forecasting scenarios rather than optical character recognition.

Object detection identifies and locates objects within images but does not extract text. Object detection recognizes physical items, people, or elements but does not perform character recognition.

Category classification assigns predefined categories to text or images but does not extract text from visual formats. Classification organizes content but does not convert images to text.

Text recognition implementation involves adding the AI Builder text recognition action to Power Automate flows, providing image or PDF input from email attachments, SharePoint files, or other sources, receiving extracted text as output including full text and structured data with bounding box coordinates, and processing text for storage in Dataverse, further analysis, or routing workflows.
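To make the output shape concrete, here is a hypothetical sketch of an OCR result (full text plus per-line bounding boxes) and a flow-style step that pulls out labeled values before storing them. The field names and structure are illustrative, not AI Builder's exact response schema:

```python
# Illustrative OCR output: full text plus lines with bounding boxes.
ocr_result = {
    "full_text": "Invoice 1042\nVendor: Contoso Ltd\nTotal: 250.00",
    "lines": [
        {"text": "Invoice 1042", "bounding_box": [10, 10, 200, 30]},
        {"text": "Vendor: Contoso Ltd", "bounding_box": [10, 40, 220, 60]},
        {"text": "Total: 250.00", "bounding_box": [10, 70, 180, 90]},
    ],
}

def extract_field(result: dict, label: str) -> str:
    """Find a line starting with a label and return the value after it."""
    for line in result["lines"]:
        if line["text"].startswith(label):
            return line["text"][len(label):].strip()
    return ""

print(extract_field(ocr_result, "Vendor:"))  # Contoso Ltd
print(extract_field(ocr_result, "Total:"))   # 250.00
```

In a real flow the extracted values would feed an "Add a new row" step writing to Dataverse.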

Recognition capabilities include printed text extraction from typed documents, handwritten text recognition from forms and notes, multi-language support for documents in various languages, table detection identifying structured data in documents, and bounding box information showing text location within images enabling targeted extraction.

Common use cases include invoice processing extracting vendor and amount information, expense report automation reading receipt details, contract management converting scanned agreements to searchable text, document archival making scanned documents text-searchable, and form digitization converting paper forms to digital data.

Question 213

A business user needs to create a flow that runs on a specific schedule every weekday at 9 AM. Which type of Power Automate trigger should be used?

A) Manual trigger

B) Automated trigger

C) Scheduled trigger

D) Instant trigger

Answer: C

Explanation:

Different automation scenarios require different initiation methods. Scheduled triggers enable time-based automation for recurring business processes.

Scheduled trigger in Power Automate executes flows at specified intervals or specific times. Users configure recurrence patterns including frequency like daily or weekly, specific times like 9 AM, and day selections like weekdays only. Scheduled triggers enable routine processes like daily report generation, morning data synchronization, periodic data cleanup, or scheduled notification sending. For weekday 9 AM execution, configure a weekly recurrence with the time set to 9:00 AM and the days limited to Monday through Friday.

Manual trigger requires users to explicitly start flows by clicking buttons in Power Automate mobile app or other interfaces. Manual triggers serve on-demand scenarios but do not execute automatically on schedules.

Automated trigger initiates flows when events occur in connected systems like new email arrival, record creation in Dataverse, or file upload to SharePoint. Automated triggers respond to events rather than running on time schedules.

Instant trigger is another term for manual trigger enabling on-demand flow execution. Instant triggers are user-initiated rather than time-based.

Scheduled trigger configuration involves selecting the Recurrence trigger when creating flows, setting the frequency to Week (day-of-week selection is only available at weekly frequency), specifying the time zone to ensure correct execution time, entering the start time as 9:00 AM, and choosing Monday through Friday in the "On these days" option while leaving the weekend days unchecked.
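The weekday-9-AM rule the recurrence trigger encodes can be expressed as a simple predicate over a datetime. This is a minimal sketch for clarity (time zone handling omitted), not how Power Automate evaluates schedules internally:

```python
from datetime import datetime

def should_run(now: datetime) -> bool:
    """True on Monday through Friday at exactly 9:00 AM."""
    return now.weekday() < 5 and now.hour == 9 and now.minute == 0

print(should_run(datetime(2024, 6, 3, 9, 0)))   # Monday 9:00  -> True
print(should_run(datetime(2024, 6, 8, 9, 0)))   # Saturday     -> False
print(should_run(datetime(2024, 6, 3, 10, 0)))  # wrong hour   -> False
```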

Scheduling considerations include time zone selection ensuring flows run in correct business time zones, execution window understanding flows may experience slight delays during high-load periods, concurrency settings controlling whether multiple instances can run simultaneously, and error handling implementing retry policies for transient failures.

Common scheduled flow scenarios include daily data imports synchronizing overnight changes from external systems, morning notifications sending reminders or summaries at work start, periodic cleanup removing old records or temporary files, weekly reports generating and distributing summary information, and monthly processes handling end-of-month accounting or maintenance tasks.

Question 214

An organization needs to ensure that specific fields in a Dataverse form are required only when certain conditions are met. Which feature implements conditional field requirements?

A) Security roles

B) Business rules

C) Workflows

D) Field-level security

Answer: B

Explanation:

Dynamic form behavior that adapts to data conditions improves data quality and user experience. Business rules provide declarative logic for implementing conditional requirements without coding.

Business rules in Dataverse implement conditional logic controlling field behavior on forms, including making fields required or optional based on other field values, showing or hiding fields, setting field values, showing error messages, and recommending values. For conditional requirements, business rules evaluate conditions such as "if opportunity stage equals proposal, then make projected close date required." Business rules execute on forms, providing immediate feedback, and on the server, ensuring validation regardless of how data is entered.

Security roles control who can access tables and records but do not implement conditional field requirements. Security roles address access permissions rather than dynamic form behavior.

Workflows automate background processes but do not provide real-time form behavior. Classic workflows run asynchronously after record operations complete rather than providing interactive form logic.

Field-level security restricts access to sensitive fields but does not make fields conditionally required. FLS provides read and write permissions but does not implement dynamic validation rules.

Business rule implementation involves opening the table in Dataverse, creating a new business rule, defining conditions that evaluate field values using IF statements, adding actions that execute when conditions are true such as set business required making fields mandatory, and configuring scope determining where rules apply like entity for server-side or all forms for form-based execution.
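The conditional-requirement logic a business rule expresses declaratively can be sketched as a validation function. This Python model is illustrative only — the field names mirror the proposal-stage example above, and real business rules are configured in the designer, not coded:

```python
def validate(record: dict) -> list:
    """Return validation errors; close date is required only at proposal stage."""
    errors = []
    if record.get("stage") == "proposal" and not record.get("projected_close_date"):
        errors.append("Projected Close Date is required when stage is Proposal.")
    return errors

print(validate({"stage": "proposal"}))   # one error: close date missing
print(validate({"stage": "qualify"}))    # [] - requirement does not apply
print(validate({"stage": "proposal", "projected_close_date": "2024-09-30"}))  # []
```

Entity-scoped rules behave like running this check on the server for every write, not just on forms.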

Business rule capabilities include conditions checking field values, comparing values, or evaluating multiple criteria with AND/OR logic, actions including set business required, set field value, show error message, set visibility, lock or unlock fields, and recommendations suggesting values without forcing them.

Business rule scope options include entity scope applying rules regardless of entry method ensuring server-side validation, all forms applying to all form types, and specific forms targeting individual form configurations. Entity scope provides strongest validation while form scope enables varied behavior across different forms.

Question 215

A company needs to create a Power Apps solution that different teams can customize independently without affecting each other’s changes. Which approach enables this isolation?

A) Single app shared by all teams

B) Separate solutions for each team

C) Manual code duplication

D) Shared Excel file

Answer: B

Explanation:

Managing application lifecycle across multiple teams requires isolation mechanisms preventing conflicting changes. Solutions provide packaging and deployment boundaries for independent team development.

Separate solutions for each team in Power Platform enable independent development by packaging each team’s components including apps, flows, and customizations in distinct solution containers. Teams work in their own solutions without interfering with other team changes, solutions can be exported and imported between environments independently, and dependencies are tracked ensuring required components are included. This approach supports parallel development with isolated change management.

Single app shared by all teams creates conflicts when multiple teams make simultaneous changes. Shared apps lack isolation making it difficult to test and deploy team-specific features without affecting other teams.

Manual code duplication copies components for each team but creates maintenance burden without proper dependency management. Duplication leads to drift between versions and difficulty propagating common changes.

Shared Excel file does not provide proper application lifecycle management or version control. Excel cannot package Power Platform components or provide deployment mechanisms.

Solution implementation involves creating separate managed or unmanaged solutions for each team, adding each team’s components to their solution, managing solution layers where components from multiple solutions interact, and using solution segmentation separating shared components from team-specific customizations.

Solution types include unmanaged solutions used during development allowing modification and deletion of components, and managed solutions for deployment providing lifecycle management, dependency tracking, and update rollback capabilities. Teams develop in unmanaged solutions and deploy as managed solutions to production.

ALM practices include using source control storing solution files in repositories like Azure DevOps or GitHub, implementing CI/CD pipelines automating solution deployment across environments, version numbering tracking solution versions, and environment strategy separating development, test, and production environments for each team.

Question 216

A business analyst needs to create a Power BI report that updates immediately when users select different filter values without querying the entire dataset. Which Power BI feature provides fast filtering?

A) Report-level filters

B) Drillthrough pages

C) Slicers with import mode

D) Visual interactions

Answer: C

Explanation:

Report performance and interactivity significantly impact user experience. Efficient filtering mechanisms enable responsive reports even with large datasets.

Slicers with import mode in Power BI provide fast filtering by pre-loading data into memory, enabling instant filter response without server queries. Import mode loads entire datasets or representative samples into Power BI Desktop or the Power BI service, where the in-memory engine provides sub-second query performance. Slicers filter visuals immediately as users make selections, providing responsive interactive experiences. This approach works well for datasets under capacity limits, enabling fast exploration.

Report-level filters apply to entire reports but do not specifically optimize filtering performance. Report filters provide scoping but performance depends on the underlying data connection mode and model size.

Drillthrough pages enable navigating to detailed pages based on selections but do not specifically optimize filtering performance. Drillthrough provides navigation rather than fast filtering capability.

Visual interactions control how selecting data in one visual filters other visuals but do not inherently optimize performance. Interactions define behavior relationships rather than addressing query performance.

Import mode implementation involves connecting Power BI to data sources, selecting import mode rather than DirectQuery when configuring connections, refreshing data on schedules to keep imported data current, creating measures and calculated columns in the data model, and designing reports with slicers and other interactive elements.

Import mode advantages include fastest query performance with in-memory processing, support for complex DAX calculations, offline report viewing after data is loaded, and reduced source system load since queries hit cached data rather than operational databases.

Import mode limitations include data freshness depending on refresh schedule rather than real-time, capacity constraints limiting dataset size to available memory, and refresh time requirements to update imported data. These trade-offs must be balanced against performance benefits.

Question 217

An organization needs to implement approval workflows where multiple approvers must approve in a specific sequence. Which Power Automate approval type should be used?

A) Everyone must approve (parallel)

B) First to respond

C) Sequential approval

D) Custom response options

Answer: C

Explanation:

Different business processes require different approval patterns. Sequential approvals enable hierarchical review where requests progress through approval chains in defined order.

Sequential approval in Power Automate sends approval requests to multiple approvers in specified order where each approver must approve before the request advances to the next approver. If any approver rejects, the sequence stops and the overall request is rejected. This pattern supports hierarchical approvals like manager approval followed by director approval, or multi-stage reviews where each stage must complete before proceeding. Sequential approvals ensure proper review order and enable each approver to see previous approval decisions.

Everyone must approve parallel sends requests to all approvers simultaneously requiring all to approve for overall approval but does not enforce sequence. Parallel approvals work when order does not matter but do not provide sequential processing.

First to respond sends to multiple approvers simultaneously and uses the first response received as the final decision. This pattern supports scenarios where any authorized approver can decide but does not provide sequential review.

Custom response options enable defining approval choices beyond approve/reject but do not determine approval ordering. Custom options affect available responses rather than approval sequence.

Sequential approval implementation involves adding the "Start and wait for an approval" action in Power Automate flows, selecting the sequential approval type from the approval types, configuring the assigned to field with multiple approvers in the desired order using semicolons or dynamic expressions, providing request details including title and description, and processing approval outcomes with conditional actions based on the response.
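The sequential semantics — each approver responds in order, and the first rejection stops the chain — can be simulated in a few lines. This is an illustrative model of the behavior, not Power Automate code; the approver names and decisions are made up:

```python
def run_sequential_approval(approvers, decide):
    """decide(approver) returns "Approve" or "Reject"; stop at first rejection."""
    history = []
    for approver in approvers:
        outcome = decide(approver)
        history.append((approver, outcome))
        if outcome == "Reject":
            return "Rejected", history  # later approvers are never asked
    return "Approved", history

decisions = {"manager": "Approve", "director": "Reject", "vp": "Approve"}
result, history = run_sequential_approval(["manager", "director", "vp"], decisions.get)
print(result)   # Rejected
print(history)  # [('manager', 'Approve'), ('director', 'Reject')]
```

Note that the VP never receives a request: the rejection at the director stage ends the run, which is exactly what distinguishes sequential from parallel approval.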

Sequential approval features include approval history showing which approvers have responded, comments from each approver providing context for decisions, ability to reassign requests if approvers are unavailable, timeout options automatically responding after specified periods, and notification customization determining how approvers are notified.

Common sequential approval scenarios include expense reports requiring manager then finance approval, purchase orders progressing through department head then procurement approval, vacation requests requiring supervisor then HR confirmation, and document publishing needing author approval then editor approval before release.

Question 218

A business user needs to create a Power Apps app that displays related records from multiple tables in a hierarchical view. Which control type is most appropriate?

A) Text input control

B) Gallery control with nested galleries

C) Button control

D) Label control

Answer: B

Explanation:

Displaying hierarchical relationships between data entities requires controls that support nested data structures. Gallery controls provide flexible layouts for parent-child data relationships.

Gallery control with nested galleries in Power Apps displays hierarchical data by placing child galleries inside parent gallery items. The parent gallery shows top-level records like accounts or orders, and nested galleries within each parent item display related child records like contacts or order lines. This pattern enables viewing master-detail relationships in compact hierarchical layouts. Galleries support scrolling, selection, and dynamic item rendering making them ideal for relationship visualization.

Text input control enables users to enter text but does not display collections of data or hierarchical relationships. Text inputs are for data entry rather than multi-record display.

Button control triggers actions when clicked but does not display data collections. Buttons initiate processes rather than showing hierarchical information.

Label control displays static or dynamic text but does not present multiple records or relationships. Labels show individual values rather than data collections.

Nested gallery implementation involves adding a parent gallery to screens, setting its Items property to top-level data source like accounts, adding child gallery controls within parent gallery items, setting child gallery Items property to filtered related records using Filter function with parent ID, and configuring gallery layouts, templates, and item spacing for readability.
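A Python analogue (not Power Fx) of the pattern: the outer loop plays the parent gallery, and the filtered inner list plays the nested gallery's Items property. Table and field names here are illustrative:

```python
accounts = [{"id": 1, "name": "Contoso"}, {"id": 2, "name": "Fabrikam"}]
contacts = [
    {"account_id": 1, "name": "Ana"},
    {"account_id": 1, "name": "Ben"},
    {"account_id": 2, "name": "Chen"},
]

def related_contacts(account_id):
    """The nested gallery's Items: contacts filtered to one parent record."""
    return [c["name"] for c in contacts if c["account_id"] == account_id]

for account in accounts:                 # parent gallery items
    print(account["name"], "->", related_contacts(account["id"]))
```

In Power Fx the inner filter would be something like `Filter(Contacts, AccountID = ThisItem.ID)`, which is delegable against Dataverse so large contact lists stay efficient.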

Hierarchical display patterns include parent items showing summary information with expand controls revealing child galleries, child galleries showing detail records with ability to add or edit, and delegation considerations ensuring queries perform efficiently with large datasets by using delegable filter expressions.

Common nested gallery scenarios include account contacts showing companies with their associated people, order line items displaying orders with product lines, project tasks presenting projects with their task lists, and organizational hierarchies showing departments with their employees.

Question 219

An organization needs to track changes made to specific Dataverse records for audit and compliance purposes. Which Dataverse feature captures record change history?

A) Duplicate detection

B) Auditing

C) Business rules

D) Calculated fields

Answer: B

Explanation:

Regulatory compliance and operational transparency often require detailed audit trails. Dataverse auditing captures who changed what data and when, providing comprehensive change history.

Auditing in Dataverse tracks record changes including field value modifications, record creation and deletion, access events, and metadata changes. Auditing is enabled at the environment level, configured per table to track specific entities, and optionally enabled for specific fields to track sensitive data. Audit logs capture user identity, timestamp, old and new values, and operation type. Audit data is retained for specified periods and can be exported for compliance reporting or security analysis.

Duplicate detection identifies similar records preventing duplicate entries but does not track change history. Duplicate detection addresses data quality rather than audit logging.

Business rules implement validation and automation logic but do not record change history. Business rules affect data behavior but do not provide audit trails.

Calculated fields automatically compute values based on formulas but do not track changes. Calculated fields define data transformations rather than capturing change history.

Auditing implementation involves enabling auditing at the environment level through Power Platform admin center, selecting tables to audit configuring which entities track changes, enabling field-level auditing for sensitive columns requiring detailed tracking, and configuring retention policies determining how long audit data is preserved.
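The essence of what an audit entry records when a field changes — who, when, which field, old value, new value — can be sketched as follows. The structure is illustrative, not the actual Dataverse audit schema:

```python
from datetime import datetime, timezone

audit_log = []

def update_field(record, field, new_value, user):
    """Apply a change and append an audit entry capturing old and new values."""
    old_value = record.get(field)
    if old_value != new_value:  # unchanged values produce no audit entry
        audit_log.append({
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "field": field,
            "old": old_value,
            "new": new_value,
        })
        record[field] = new_value

account = {"credit_limit": 1000}
update_field(account, "credit_limit", 5000, "alice@contoso.com")
print(audit_log[0]["old"], "->", audit_log[0]["new"])  # 1000 -> 5000
```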

Audit capabilities include tracking field changes recording old and new values for monitored fields, user attribution identifying who made changes, timestamp logging when changes occurred, relationship auditing tracking associations between records, and metadata tracking changes to customizations like form modifications.

Audit log access occurs through audit history views in model-driven apps showing change timeline for records, audit summary views providing overview of audit activity, and export functionality downloading audit data for external analysis or compliance reporting. Proper security roles are required to access audit information.

Question 220

A business analyst needs to create a Power BI report that shows sales data with ability to drill down from regions to cities to individual stores. Which Power BI feature enables this navigation?

A) Bookmarks

B) Drill down hierarchy

C) Buttons

D) Tooltips

Answer: B

Explanation:

Hierarchical data analysis requires navigation capabilities that allow exploration from summary to detail. Drill down functionality enables progressive data exploration through organizational hierarchies.

Drill down hierarchy in Power BI enables navigating through data levels in defined sequences by creating hierarchies in the data model that group related fields. For geographic analysis, a hierarchy might include Region, City, and Store levels. Users interact with visuals containing hierarchies by clicking drill down buttons or data points, progressively revealing more detailed data at each level. This navigation pattern supports exploring data from high-level summaries to granular details while maintaining context.

Bookmarks capture report states including filters, selections, and visible pages enabling navigation between predefined views but do not provide hierarchical drill down functionality. Bookmarks jump between report configurations rather than progressively navigating through data levels.

Buttons trigger actions like navigation or bookmark application but do not inherently provide hierarchy navigation. Buttons enable custom interactions but require additional configuration for drill down behavior.

Tooltips display additional information when hovering over visuals but do not change the main visual display or enable drill down navigation. Tooltips provide supplementary information rather than hierarchical exploration.

Hierarchy implementation involves creating hierarchies in the Power BI Desktop data model by right-clicking fields in the Fields pane, selecting "Create hierarchy", adding related fields to hierarchy levels, and reordering levels to define the drill down sequence. Hierarchies appear in the Fields pane with expandable tree structures.
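What drilling down a Region > City > Store hierarchy does to an aggregate can be shown with a small roll-up: the same sales rows grouped at progressively finer levels. The data is made up for illustration:

```python
from collections import defaultdict

sales = [
    {"region": "West", "city": "Seattle", "store": "S1", "amount": 100},
    {"region": "West", "city": "Seattle", "store": "S2", "amount": 150},
    {"region": "West", "city": "Portland", "store": "S3", "amount": 80},
    {"region": "East", "city": "Boston", "store": "S4", "amount": 200},
]

def rollup(rows, *levels):
    """Sum amounts grouped by the given hierarchy levels, coarse to fine."""
    totals = defaultdict(int)
    for row in rows:
        key = tuple(row[level] for level in levels)
        totals[key] += row["amount"]
    return dict(totals)

print(rollup(sales, "region"))           # {('West',): 330, ('East',): 200}
print(rollup(sales, "region", "city"))   # one level deeper: city totals
```

Each drill down step adds the next hierarchy level to the grouping key, which is why context from the coarser level is preserved as detail increases.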

Visual interaction with hierarchies includes drill down mode navigating to next hierarchy level for clicked data point, expand to next level showing additional detail while keeping current level visible, drill up returning to previous hierarchy level, and expand all revealing all hierarchy levels simultaneously.

Common hierarchy types include geographic hierarchies like Country, State, City, date hierarchies like Year, Quarter, Month, Day, organizational hierarchies like Department, Team, Employee, and product hierarchies like Category, Subcategory, Product enabling various analytical perspectives.

Question 221

An organization needs to send personalized emails to customers using data from Dataverse with dynamic content that changes based on customer type. Which Power Automate feature enables conditional email content?

A) Static email templates only

B) Condition actions with HTML email

C) Simple text emails

D) Approval emails

Answer: B

Explanation:

Email personalization improves engagement by delivering relevant content to different audience segments. Conditional logic enables dynamic email content adapting to recipient characteristics.

Condition actions with HTML email in Power Automate enable sending personalized emails by evaluating conditions based on data and constructing different email content for different scenarios. Flows retrieve customer data from Dataverse, use condition actions to check customer type or other attributes, and build HTML email bodies with dynamic content including field values, conditional sections, and formatting. This approach enables highly personalized communication adapting to customer segments, purchase history, or engagement levels.

Static email templates only send identical content to all recipients without personalization. Static templates improve consistency but lack the dynamic adaptation that conditional content provides.

Simple text emails send plain text without formatting or dynamic content. Text emails work for basic notifications but lack visual appeal and personalization capabilities that HTML provides.

Approval emails are specialized formats for approval requests and do not provide general purpose personalized email capabilities. Approval emails serve approval workflows rather than marketing or customer communications.

Conditional email implementation involves retrieving customer records from Dataverse, adding condition actions that evaluate customer properties like customer type equals premium, composing HTML email bodies within the different condition branches, including dynamic content expressions that insert field values, and sending emails using the "Send an email" action with an HTML body.
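The condition-branch pattern reduces to picking an HTML body per customer segment and splicing in field values. A minimal sketch, with hypothetical customer fields and tiers:

```python
def build_email_body(customer: dict) -> str:
    """Return HTML with a section that varies by customer type."""
    if customer["type"] == "premium":
        offer = "<p>As a <b>premium</b> member, enjoy early access to our sale.</p>"
    else:
        offer = "<p>Upgrade to premium for exclusive offers.</p>"
    return f"<html><body><p>Hi {customer['name']},</p>{offer}</body></html>"

print(build_email_body({"name": "Ana", "type": "premium"}))
print(build_email_body({"name": "Ben", "type": "standard"}))
```

In the flow, the f-string interpolation corresponds to dynamic content expressions, and the if/else corresponds to a Condition action with a different email body composed in each branch.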

HTML email capabilities include formatting with styles, colors, and fonts, images embedded or linked for visual content, tables organizing information in structured layouts, links directing recipients to websites or applications, and expressions inserting dynamic values like names, amounts, or dates.

Personalization patterns include segmented content showing different offers based on customer tier, contextual messaging referencing recent purchases or interactions, localized content adapting language or formatting to recipient locale, and behavioral triggers sending emails based on customer actions like abandoned carts or milestone achievements.

Question 222

A business user needs to create a canvas app that uses current GPS location to show nearby store locations on a map. Which Power Apps function retrieves device location?

A) Location function

B) Compass function

C) Acceleration function

D) Camera function

Answer: A

Explanation:

Mobile apps often leverage device sensors to provide location-aware functionality. Power Apps provides functions for accessing device hardware capabilities including GPS location services.

Location function in Power Apps retrieves device GPS coordinates including latitude, longitude, and optionally altitude. The function returns location as a record with Latitude and Longitude properties that can be used to show user position on maps, filter data by proximity, calculate distances to points of interest, or trigger location-based actions. Location services require user permission ensuring privacy consent before accessing position data.

Compass function provides device heading showing which direction the device is pointing but does not provide position coordinates. Compass returns bearing degrees rather than GPS location.

Acceleration function detects device movement and orientation through accelerometer sensors but does not provide location. Acceleration measures forces and tilt rather than geographic position.

Camera function captures photos and videos but does not retrieve location. Camera accesses imaging hardware rather than GPS sensors.

Location function usage involves adding formulas that reference Location with properties like Location.Latitude and Location.Longitude, displaying location on map controls by setting map center or adding pins, filtering data sources to nearby records using distance calculations, and handling location unavailability when GPS is disabled or permission denied.
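The proximity filter the Location signal enables is a distance calculation between the device and each store. This sketch uses the haversine formula; the store coordinates are made up, and `here` stands in for `Location.Latitude`/`Location.Longitude`:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

stores = [
    {"name": "Downtown", "lat": 47.61, "lon": -122.33},
    {"name": "Airport", "lat": 47.44, "lon": -122.30},
]
here = (47.62, -122.35)  # would come from the device GPS in a real app

nearby = [s for s in stores if haversine_km(*here, s["lat"], s["lon"]) < 5]
print([s["name"] for s in nearby])  # ['Downtown']
```

In a canvas app the equivalent filter would be written as a Power Fx formula over the data source, with the map control centered on the current position.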

Location-based app scenarios include store locator apps showing nearest retail locations based on current position, field service apps routing technicians to closest service calls, attendance tracking verifying employees are at correct work sites, and asset management locating equipment using GPS tags.

Privacy and permission considerations include requesting location permission through app configuration, informing users why location is needed building trust, offering functionality without location when users decline permission, and implementing location accuracy requirements balancing precision needs against battery consumption.

Question 223

An organization needs to implement a solution that automatically creates follow-up tasks in Planner when high-priority support cases remain unresolved for 48 hours. Which Power Automate capability should be used?

A) Manual button flow

B) Scheduled flow with condition

C) Instant flow

D) UI flow for desktop automation

Answer: B

Explanation:

Automated escalation processes ensure timely attention to critical issues. Combining scheduled execution with conditional logic enables proactive task management.

Scheduled flow with condition in Power Automate runs on a regular schedule like daily or hourly, queries for support cases meeting escalation criteria such as high priority and created more than 48 hours ago, evaluates each case using condition actions to determine if follow-up tasks should be created, and creates Planner tasks for qualifying cases. This approach provides automated monitoring and escalation without manual intervention.

Manual button flow requires users to explicitly trigger execution making it inappropriate for automatic escalation. Manual flows serve on-demand scenarios but do not provide proactive monitoring.

Instant flow is another term for manual trigger flow that executes on demand. Instant flows are user-initiated rather than running automatically on schedules.

UI flow for desktop automation performs robotic process automation on desktop applications but does not provide cloud-based scheduled monitoring. Desktop flows address different scenarios than cloud automation.

Implementation involves creating a scheduled flow with a recurrence trigger set to the desired frequency such as every 4 hours, adding a «List rows» action that queries Dataverse for high-priority cases, filtering for active cases created more than 48 hours ago, using «Apply to each» to process each case, adding a condition that checks whether a follow-up task already exists to prevent duplicates, and creating Planner tasks with case details when the conditions are met.
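The filtering and deduplication logic those flow actions perform can be sketched in Python. This is an illustration of the selection rule only, not flow configuration; the case records and field names are hypothetical stand-ins for Dataverse rows:

```python
from datetime import datetime, timedelta, timezone

def cases_to_escalate(cases, existing_task_case_ids, now, age_hours=48):
    """Return high-priority active cases older than `age_hours`
    that do not already have a follow-up task (deduplication)."""
    cutoff = now - timedelta(hours=age_hours)
    return [c for c in cases
            if c["priority"] == "High"
            and c["status"] == "Active"
            and c["created"] <= cutoff
            and c["id"] not in existing_task_case_ids]

now = datetime(2024, 6, 10, 12, 0, tzinfo=timezone.utc)
cases = [
    {"id": 1, "priority": "High", "status": "Active",
     "created": now - timedelta(hours=72)},   # qualifies for escalation
    {"id": 2, "priority": "High", "status": "Active",
     "created": now - timedelta(hours=12)},   # too recent
    {"id": 3, "priority": "Low", "status": "Active",
     "created": now - timedelta(hours=72)},   # not high priority
    {"id": 4, "priority": "High", "status": "Resolved",
     "created": now - timedelta(hours=72)},   # no longer active
]
escalate = cases_to_escalate(cases, existing_task_case_ids=set(), now=now)
# escalate contains only case 1
```

In the real flow, the age and priority filters would typically be expressed as an OData filter on the «List rows» action, and the deduplication check as a condition inside «Apply to each».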

Escalation workflow enhancements include sending notifications to case owners and managers alerting them to escalated cases, updating case records adding notes about automatic escalation, implementing progressive escalation creating different tasks based on case age, and logging escalation activities for reporting and analysis.

Best practices include deduplication checks preventing multiple tasks for the same case, configuring appropriate schedules that balance timely escalation against excessive flow runs, implementing error handling to gracefully manage failures, and monitoring flow execution by reviewing logs to ensure escalations occur correctly.

Question 224

A business analyst needs to create a Power BI report that consumers can interact with to ask questions and receive answers in natural language. Which Power BI feature should be configured?

A) Static report only

B) Q&A visual

C) Card visual

D) Slicer

Answer: B

Explanation:

Democratizing data access requires intuitive interfaces that non-technical users can understand. Natural language querying removes barriers, enabling business users to explore data through conversation.

Q&A visual in Power BI creates interactive question-answering experiences where users type natural language questions and receive visualizations as answers. The Q&A visual analyzes data models, understands entity and measure names, interprets questions like «show sales by product category,» and generates appropriate charts or tables. Q&A supports follow-up questions enabling exploration, learns from user interactions improving interpretations, and can be configured with synonyms teaching Q&A domain-specific terminology.

Static report only presents pre-built visualizations without interactive querying. Static reports show fixed content but do not respond to natural language questions.

Card visual displays single metric values like totals or KPIs but does not provide question-answering interface. Cards show specific measures but do not interpret natural language queries.

Slicer provides filtering interface for predefined dimensions but does not understand natural language questions. Slicers enable selection-based filtering rather than conversational querying.

Q&A visual implementation involves adding the Q&A visual to the report canvas, configuring it to use the report's data model, teaching Q&A synonyms that map business terms to data model elements, suggesting questions to provide users with starting points, and styling the visual's appearance to match the report theme.

Q&A configuration includes linguistic schema defining synonyms like mapping «revenue» to «sales amount» measure, phrasings teaching Q&A relationships like «products have categories,» and excluded terms preventing Q&A from suggesting certain fields in question prompts.
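To illustrate the idea behind synonym mapping (a simplified sketch, not Power BI's actual linguistic-schema engine), the lookup works roughly like this; the terms and field names below are hypothetical:

```python
# Hypothetical synonym map: business terms -> data model field names
synonyms = {
    "revenue": "Sales Amount",
    "income": "Sales Amount",
    "product line": "Product Category",
}

model_fields = ["Sales Amount", "Product Category", "Order Date"]

def resolve_term(term, synonyms, model_fields):
    """Resolve a user-typed term to a data model field,
    checking the synonym map first, then exact field names."""
    term = term.strip().lower()
    if term in synonyms:
        return synonyms[term]
    for field in model_fields:
        if field.lower() == term:
            return field
    return None  # unrecognized term; Q&A would prompt for clarification

# A question like "show revenue by product line" would resolve to
# the "Sales Amount" measure grouped by "Product Category".
```

The real linguistic schema also captures phrasings (relationships between entities) and excluded terms, which this sketch does not model.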

Q&A capabilities include question suggestions showing example questions users can ask, automatic visualization selection choosing appropriate chart types for answers, drill down enabling exploration from summary to detail, and question history allowing users to return to previous queries.

Question 225

An organization needs to implement data loss prevention policies preventing users from sharing sensitive data between business and non-business connectors. Which Power Platform admin feature configures DLP?

A) Security roles

B) Data Loss Prevention policies

C) Environment variables

D) Connection references

Answer: B

Explanation:

Protecting organizational data requires governance controls preventing unauthorized data sharing. Data loss prevention policies enforce boundaries between trusted and untrusted services.

Data Loss Prevention policies in Power Platform admin center define rules controlling which connectors can be used together in apps and flows. DLP policies classify connectors into business, non-business, and blocked groups, prevent mixing business and non-business connectors in the same app or flow, allow business-to-business and non-business-to-non-business combinations, and apply at tenant or environment level. This governance prevents users from accidentally or intentionally exfiltrating corporate data to unauthorized services.
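The grouping rule above can be sketched with a small Python check. The connector names and group assignments here are illustrative, not an actual tenant policy:

```python
# Hypothetical DLP policy: connectors classified into three groups
policy = {
    "business": {"Dataverse", "SharePoint", "Office 365 Outlook"},
    "non_business": {"Gmail", "Twitter"},
    "blocked": {"HTTP"},
}

def check_flow_connectors(connectors, policy):
    """Return (allowed, reason). A flow may not use a blocked connector,
    nor mix business and non-business connectors."""
    used = set(connectors)
    if used & policy["blocked"]:
        return False, "uses a blocked connector"
    uses_business = bool(used & policy["business"])
    uses_non_business = bool(used & policy["non_business"])
    if uses_business and uses_non_business:
        return False, "mixes business and non-business connectors"
    return True, "allowed"

# Dataverse + SharePoint: both business, allowed.
# Dataverse + Gmail: business mixed with non-business, prevented.
```

Actual DLP enforcement happens in the platform at app/flow save time; this sketch only shows the group-combination logic being enforced.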

Security roles control access to tables and features within environments but do not restrict connector usage. Security roles address access permissions rather than data loss prevention.

Environment variables store configuration values that change between environments but do not provide governance controls. Environment variables enable deployment flexibility rather than security policies.

Connection references enable updating connections during solution deployment but do not enforce data protection policies. Connection references address ALM rather than DLP.

DLP policy implementation involves accessing Power Platform admin center, navigating to data policies, creating new policy, naming and describing policy purpose, classifying connectors by dragging them to business, non-business, or blocked categories, configuring policy scope selecting environments to which policy applies, and reviewing connector patterns identifying allowed and prevented combinations.

DLP enforcement prevents users from creating or modifying apps and flows that violate policies, displays error messages explaining policy violations, requires users to remove conflicting connectors to proceed, and allows policy exceptions through custom connector patterns when business justification exists.

DLP policy design considerations include identifying business connectors representing corporate data sources like Dataverse and SharePoint, designating consumer services like Gmail and Twitter as non-business, blocking high-risk connectors entirely, and implementing multiple policies so that different environment types, such as production versus development, carry different restrictions.