Known Limitations

Be aware of the following known limitations when using Jupyter notebooks.

  • There is a 60-minute timeout for Spark applications to start, that is, to transition to the RUNNING state. For example, if all resources in the cluster are in use and a notebook is run that creates a new Spark application, the application must transition to the RUNNING state within 60 minutes. If it does not, the timeout expires, the Spark application is killed, and a timeout error is displayed in the notebook (see the first sketch after this list).
  • The following limitations are related to the %%sql magic, which uses autovizwidgets to render various chart types.
    • Slow process - These charts are designed for interactive use only. When a chart type is clicked, the front end communicates with the back end over the same WebSocket that is used for cell execution. The back end generates the rendering code, which is transferred back to the UI and rendered. This whole process is slow, especially if other cells are executing in the notebook.
    • Re-rendering - When a notebook is saved after execution, the rendered charts and tables are not saved along with the notebook contents. Therefore, when a user reopens the notebook later or shares it with others, these charts and tables are not rendered again; instead, an Error displaying widget: model not found message is displayed (see the second sketch after this list).
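
To make the startup-timeout limitation concrete, the sketch below polls the state of the session backing a notebook's Spark application and gives up once the 60-minute window has passed. It assumes the notebook is backed by an Apache Livy endpoint; the URL, session ID, and polling interval are illustrative placeholders rather than values defined by this documentation.

    import time
    import requests

    LIVY_URL = "http://livy-host:8998"   # assumed Livy endpoint; replace with your cluster's
    SESSION_ID = 0                        # assumed ID of the session created for the notebook
    DEADLINE_SECONDS = 60 * 60            # mirrors the 60-minute startup timeout

    def wait_for_running(livy_url, session_id, deadline_seconds):
        """Poll the session state until it leaves 'starting' or the deadline passes."""
        start = time.monotonic()
        while time.monotonic() - start < deadline_seconds:
            resp = requests.get(f"{livy_url}/sessions/{session_id}/state")
            resp.raise_for_status()
            state = resp.json()["state"]
            if state != "starting":
                return state          # e.g. "idle" once the Spark application is up
            time.sleep(30)            # avoid hammering the endpoint
        return "timed_out"            # at this point the notebook would show a timeout error

    if __name__ == "__main__":
        print(wait_for_running(LIVY_URL, SESSION_ID, DEADLINE_SECONDS))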
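
One way to work around the re-rendering limitation is to avoid widget-based output for anything that must survive saving: copy the query result into a local pandas DataFrame and render it with a static plotting library such as matplotlib, whose image output is stored with the notebook. The sketch below assumes the sparkmagic %%sql -o option is available to copy results locally; the table and column names are hypothetical.

    # Cell 1 (in a sparkmagic notebook; -o copies the result into a local pandas
    # DataFrame named top_rows -- table and column names are hypothetical):
    #
    #   %%sql -o top_rows
    #   SELECT category, COUNT(*) AS cnt FROM events GROUP BY category
    #
    # Cell 2: render the local DataFrame with matplotlib. The static image is
    # saved with the notebook, so it still displays after reopening or sharing.
    import pandas as pd
    import matplotlib.pyplot as plt

    # Stand-in for the DataFrame that the %%sql -o cell above would produce.
    top_rows = pd.DataFrame({"category": ["a", "b", "c"], "cnt": [12, 7, 3]})

    ax = top_rows.plot(kind="bar", x="category", y="cnt", legend=False)
    ax.set_ylabel("row count")
    plt.tight_layout()
    plt.show()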