---
title: Notebook contextual monitoring and debugging
description: Learn how to view Apache Spark job progress below the Notebook cell.
ms.reviewer: snehagunda
author: jejiang
ms.author: jejiang
ms.topic: how-to
ms.custom:
ms.date: 02/24/2023
ms.search.form: Monitor notebook all runs, monitor Spark jobs within a notebook
---

# Notebook contextual monitoring and debugging
The Microsoft Fabric notebook is a web-based interactive surface for developing Apache Spark jobs and conducting machine learning experiments. This article outlines how to monitor the progress of your Spark jobs, access Spark logs, receive advice within the notebook, and navigate to the Spark application detail view or Spark UI for more comprehensive monitoring information for the entire notebook.
A Spark job progress indicator is provided with a real-time progress bar that helps you monitor the job execution status for each notebook cell. You can view the status and tasks' progress across your Spark jobs and stages.
:::image type="content" source="media\spark-monitor-debug\spark-monitor-progress.png" alt-text="Screenshot showing Notebook cell and Spark job progress list." lightbox="media\spark-monitor-debug\spark-monitor-progress.png":::
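The progress bar is driven by task completion: each stage advances as its tasks finish. As a rough, plain-Python illustration of that calculation (the stage names and task counts below are hypothetical, not read from a live Spark application):

```python
# Illustrative only: how a per-stage progress percentage, like the one shown
# in the notebook's progress bar, can be derived from task counts.
def stage_progress(completed_tasks: int, total_tasks: int) -> float:
    """Return the completion percentage of a stage (0-100)."""
    if total_tasks == 0:
        return 100.0  # a stage with no tasks is trivially complete
    return 100.0 * completed_tasks / total_tasks

# Hypothetical stages: (completed tasks, total tasks)
stages = {"stage 0": (8, 8), "stage 1": (3, 12)}
for name, (done, total) in stages.items():
    print(f"{name}: {stage_progress(done, total):.0f}% ({done}/{total} tasks)")
```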
The executor usage graph visually displays the allocation of Spark job executors and their resource usage. This feature is currently available only for runtimes with Spark 3.4 and above. Select the **Resources** tab to display a line chart of the code cell's resource usage.
:::image type="content" source="media\spark-monitor-debug\resource.png" alt-text="Screenshot showing Notebook cell and resource usage of code cell." lightbox="media\spark-monitor-debug\resource.png":::
A built-in Spark advisor analyzes your notebook code and Spark executions in real-time to help optimize the running performance of your notebook and assist in debugging failures. There are three types of built-in advice: Info, Warning, and Error. The icons with numbers indicate the respective count of advice in each category (Info, Warning, and Error) generated by the Spark advisor for a particular notebook cell.
To view the advice, select the arrow at the beginning of the line to expand it and reveal the details.
:::image type="content" source="media\spark-monitor-debug\light-bulb.png" alt-text="Screenshot showing light bulb." lightbox="media\spark-monitor-debug\light-bulb.png":::
After expanding the advisor section, one or more pieces of advice become visible.
:::image type="content" source="media\spark-monitor-debug\light-bulb-to-expand-the-box.png" alt-text="Screenshot showing light bulb to expand the box." lightbox="media\spark-monitor-debug\light-bulb-to-expand-the-box.png":::
Data skew is a common issue. The Spark advisor supports skew detection; if skew is detected, a corresponding analysis is displayed below the cell.
:::image type="content" source="media\spark-monitor-debug\spark-advisor-skew-detection.png" alt-text="Screenshot showing Data Skew Analysis details." lightbox="media\spark-monitor-debug\spark-advisor-skew-detection.png":::
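A common mitigation for the skew the advisor flags is key salting: folding a small random salt into a hot key so its rows spread across more partitions. The sketch below (plain Python with made-up data and a stand-in partitioner; it is not the advisor's own logic) shows the effect:

```python
# Illustrative sketch of why key salting mitigates data skew.
import random
from collections import Counter

random.seed(0)  # deterministic salts for the illustration

NUM_PARTITIONS = 4
SALT_BUCKETS = 4

# Hypothetical records keyed by integers; key 7 is "hot" (90% of all rows),
# so plain hash partitioning sends almost everything to one partition.
records = [7] * 90 + list(range(10))

def partition(key: int) -> int:
    """Stand-in for a hash partitioner."""
    return (key * 31) % NUM_PARTITIONS

plain = Counter(partition(k) for k in records)

# Salting: fold a random salt into each key so the hot key's rows spread
# across several partitions. (In a real join, the other side must be
# replicated once per salt value so matches are preserved.)
salted = Counter(partition(k * SALT_BUCKETS + random.randrange(SALT_BUCKETS))
                 for k in records)

print("unsalted partition sizes:", dict(plain))
print("salted partition sizes:  ", dict(salted))
```

Without salting, the 90 hot rows pile into a single partition; with salting, they spread roughly evenly, so no one executor carries most of the work.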
Spark logs are essential for locating exceptions and diagnosing performance issues and failures. The contextual monitoring feature in the notebook surfaces the logs for the specific cell you're running, directly below that cell. You can search the logs or filter them by errors and warnings.
:::image type="content" source="media\spark-monitor-debug\real-time-logs.png" alt-text="Screenshot showing the real time logs under the code cell." lightbox="media\spark-monitor-debug\real-time-logs.png":::
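The same error/warning filtering can be applied to exported log text outside the notebook. A minimal sketch in plain Python (the sample log lines are made up for illustration):

```python
# Filter Spark driver-log text by level, similar to the error/warning
# filter in the notebook's log pane. SAMPLE_LOG is fabricated sample data.
SAMPLE_LOG = """\
23/02/24 10:15:01 INFO SparkContext: Running Spark version 3.3.1
23/02/24 10:15:09 WARN TaskSetManager: Stage 2 contains a task of very large size
23/02/24 10:15:12 ERROR Executor: Exception in task 0.0 in stage 2.0
23/02/24 10:15:13 INFO DAGScheduler: Job 1 finished
"""

def filter_by_level(log_text: str, levels: set[str]) -> list[str]:
    """Return log lines whose level field (third token) matches `levels`."""
    out = []
    for line in log_text.splitlines():
        parts = line.split()
        if len(parts) >= 3 and parts[2] in levels:
            out.append(line)
    return out

problems = filter_by_level(SAMPLE_LOG, {"WARN", "ERROR"})
for line in problems:
    print(line)
```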
If you want to access additional information about the Spark execution at the notebook level, you can navigate to the Spark application details page or Spark UI through the options available in the context menu.
:::image type="content" source="media\spark-monitor-debug\access-spark-ui-and-monitoring-detail-page.png" alt-text="Screenshot showing the access spark ui and monitoring detail page." lightbox="media\spark-monitor-debug\access-spark-ui-and-monitoring-detail-page.png":::