Running anomaly detection in Power BI on large, unoptimized datasets can hurt both performance and accuracy. The feature works best on clean time series data, where a single measure is plotted over time. Follow these recommended practices to get reliable results and smooth performance:
1. Prepare aggregate data beforehand
For large datasets, aggregate the data before visualizing it rather than feeding raw rows into a line chart. Use Power Query or DAX to roll the data up to a relevant time interval (such as daily, weekly, or monthly) first. This simplifies the visual and improves accuracy, because anomaly detection can then highlight what is meaningful: the trends.
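As a minimal sketch of the aggregation step, a DAX calculated table can pre-summarize transactions to one row per day. The table name `Sales` and the columns `Sales[Order Date]` and `Sales[Amount]` are placeholder assumptions; substitute your own model's names:

```dax
-- Hypothetical daily roll-up: one row per date with the summed measure.
-- Anomaly detection then runs on far fewer points than the raw table.
Daily Sales =
SUMMARIZECOLUMNS (
    Sales[Order Date],
    "Total Amount", SUM ( Sales[Amount] )
)
```

The same roll-up can be done earlier in Power Query with a Group By step, which also reduces the model size rather than just the visual's query.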
2. Use filters and slicers to narrow the view
Don't analyze too much data at once. Narrow the analysis with filters, restricting it to specific regions, categories, or time periods. Slicers and report-level filters cut the data volume dynamically without rebuilding the visuals, and the algorithm detects anomalies more reliably within that focused context.
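When a fixed scope should be baked into the measure itself rather than left to slicers, a filtered measure is one option. This is a sketch only; the measure, table, and column names (`Total Sales`, `Sales[Region]`, the value "West") are assumptions, not names from your model:

```dax
-- Hypothetical scoped measure: restricts the calculation to one region,
-- so the line chart (and its anomaly detection) sees only that slice.
West Region Sales =
CALCULATE (
    [Total Sales],
    Sales[Region] = "West"
)
```

In most reports, though, an ordinary slicer on `Sales[Region]` achieves the same narrowing interactively without extra measures.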
3. Prepare your visual setup
Ensure the x-axis is configured as a date/time field and the y-axis carries a single numeric measure. Keep the visual to one or only a few lines or categories, since anomaly detection supports just one measure per line chart. Adjust the sensitivity setting in the Analytics pane to calibrate the detection thresholds to the scale and variation of your data.