Run a Notebook

POST /api/v1.2/commands
Use this API to run a Spark notebook with optional parameters. Currently, this API does not support Presto notebooks. You can view the command's status or result, or cancel the command, using the same Command API calls that are used for other command types.
Note
You can run a Spark notebook only when it is associated with a Spark cluster. You must have permissions to run or schedule a notebook.
These are a few points to know about running a notebook through the API:
- Invoking the notebook API is useful when you want to run all of a notebook's paragraphs at once.
- If another user edits the notebook while an API call is running it, the API call may fail or may not return the expected result.
- If two users simultaneously invoke the API to run the same notebook, both runs succeed, but the paragraphs may not run in the same order.
- As with other command APIs, when a notebook is run through the API, you can see the Qubole logs and the App logs.
A Spark notebook can also be scheduled as a Spark command using the Scheduler API. scheduler-api describes how to create, edit, view, and list schedules.
Required Role
The following users can make this API call:
- Users who belong to the system-admin or system-user group.
- Users who belong to a group associated with a role that allows submitting a command. See Managing Groups and Managing Roles for more information.
Parameters
Note
Parameters marked in bold below are mandatory. Others are optional and have default values.
Parameter | Description |
---|---|
**command_type** | SparkCommand |
**note_id** | The ID of the notebook that you want to run. |
language | Set it to notebook. If it is not specified, it is added by default when a notebook ID is specified in the API call. |
**label** | One of the labels of the cluster that is associated with the notebook you want to run. |
name | A name for the command, which is useful when filtering commands in the command history. It does not accept the special characters & (ampersand), < (less than), > (greater than), " (double quote), and ' (single quote), or HTML tags. It can contain a maximum of 255 characters. |
tags | One or more tags that make the command easily identifiable and searchable in the Commands History; a tag can be used as a filter value while searching commands. A tag can contain a maximum of 255 characters, and a comma-separated list of tags can be associated with a single command. Enclose each tag value in square brackets, for example, `{"tags":["<tag-value>"]}`. |
arguments | Parameters for the run-notebook API call; they fill the notebook's dynamic forms with the given values. You can pass more than one variable. The syntax is `"arguments": {"key1":"value1", "key2":"value2", ..., "keyN":"valueN"}`, where each key is the name of a dynamic-form variable and each value is the value assigned to it. |
Request API Syntax
Here is the Request API syntax for running a Spark notebook.
curl -i -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"name" : "<name_command>", "command_type":"SparkCommand", "language":"notebook", "note_id":"<Notebook_Id>",
"tags":["<tag-value>"], "label":"<cluster-label>", "arguments": {"key1":"value1", "key2":"value2", ..., "keyN":"valueN"} }' \
"https://gcp.qubole.com/api/v1.2/commands"
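For readers scripting against this endpoint, the same request can be assembled in Python. This is a minimal sketch under the assumption that the payload fields match the parameter table above; the helper name and the suggestion of the requests library for sending are illustrative, not part of the Qubole API:

```python
import json

API_URL = "https://gcp.qubole.com/api/v1.2/commands"

def build_run_notebook_request(auth_token, note_id, label,
                               name=None, tags=None, arguments=None):
    """Assemble the headers and JSON body for the run-notebook call."""
    headers = {
        "X-AUTH-TOKEN": auth_token,
        "Content-Type": "application/json",
        "Accept": "application/json",
    }
    payload = {
        "command_type": "SparkCommand",
        "language": "notebook",  # added by default when note_id is specified
        "note_id": note_id,
        "label": label,
    }
    if name is not None:
        payload["name"] = name
    if tags is not None:
        payload["tags"] = tags            # a list of tag strings
    if arguments is not None:
        payload["arguments"] = arguments  # fills the notebook's dynamic forms
    return API_URL, headers, json.dumps(payload)

url, headers, body = build_run_notebook_request(
    "my-token", note_id="123", label="spark1",
    name="note_command", tags=["notes"])
# The request could then be sent with, for example:
#   requests.post(url, headers=headers, data=body)
```

Only command_type, language, note_id, and label are always included in the body; the optional fields are added when you provide them.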
Sample API Requests
Here is an example with a successful response.
curl -i -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"name" : "note_command", "command_type":"SparkCommand", "language":"notebook", "note_id":"123", "tags":["notes"],
"label":"spark1"}' \
"https://gcp.qubole.com/api/v1.2/commands"
Successful Response
{
"id": 363,
"path": "/tmp/2016-10-03/1/363",
"status": "waiting",
"created_at": "2016-10-03T07:14:39Z",
"command_type": "SparkCommand",
"progress": 0,
"qbol_session_id": 69,
"qlog": null,
"resolved_macros": null,
"pid": null,
"template": "generic",
"submit_time": 1475478879,
"start_time": null,
"end_time": null,
"can_notify": false,
"num_result_dir": 0,
"pool": null,
"timeout": null,
"name": "note_command",
"command_source": "API",
"account_id": 1,
"saved_query_mutable_id": null,
"user_id": 1,
"label": "spark1",
"meta_data": {
"results_resource": "commands/363/results",
"logs_resource": "commands/363/logs"
},
"uid": 1,
"perms": null,
"command": {
"cmdline": null,
"language": "notebook",
"note_id": 123,
"program": null,
"arguments": "",
"user_program_arguments": null,
"sql": null,
"md_cmd": true,
"app_id": null,
"retry": 0
},
"instance": null
}
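As noted earlier, the command's status and results are then fetched through the standard Command API. The sketch below pulls the relevant fields out of the response; the response text is abridged from the sample above, and the set of terminal status values is an assumption for illustration, not confirmed by this page:

```python
import json

# Abridged from the sample successful response above.
response_text = '''{
  "id": 363,
  "status": "waiting",
  "command_type": "SparkCommand",
  "meta_data": {
    "results_resource": "commands/363/results",
    "logs_resource": "commands/363/logs"
  }
}'''

cmd = json.loads(response_text)
command_id = cmd["id"]                                # 363
results_path = cmd["meta_data"]["results_resource"]   # relative path for the results call
logs_path = cmd["meta_data"]["logs_resource"]         # relative path for the logs call

# "waiting" means the run has not finished; keep polling the Command API.
# The terminal states listed here are an assumption, not documented on this page.
finished = cmd["status"] in ("done", "error", "cancelled")
```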
Here is an example with a failed response.
curl -i -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"name" : "note_command", "command_type":"SparkCommand", "language":"notebook", "note_id":"111", "tags":["notes"]}' \
"https://gcp.qubole.com/api/v1.2/commands"
Failed Response
{
"error": {
"error_code": 422,
"error_message": "Command type could not be created. Errors: There is no cluster associated with notebook with Id: 111"
}
}
Here is another example with a failed response.
curl -i -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"name" : "note_command", "command_type":"SparkCommand", "language":"notebook", "note_id":"12321", "tags":["notes"],
"label":"spark1"}' \
"https://gcp.qubole.com/api/v1.2/commands"
Failed Response
{
"error": {
"error_code": 422,
"error_message": "Command type could not be created. Errors: There is no spark notebook for account 54321 with Id: 3333"
}
}
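A failed call returns an error object like the two above instead of a command record. Here is a minimal sketch of telling the two apart, with field names taken from the sample responses; the helper name is illustrative:

```python
import json

def extract_error(response_text):
    """Return (error_code, error_message) for a failed response, or None on success."""
    body = json.loads(response_text)
    error = body.get("error")
    if error is None:
        return None
    return error.get("error_code"), error.get("error_message")

# Condensed from the first failed response above.
failed = ('{"error": {"error_code": 422, "error_message": '
          '"Command type could not be created. Errors: '
          'There is no cluster associated with notebook with Id: 111"}}')
result = extract_error(failed)  # (422, "Command type could not be created. ...")
```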
Here is a sample REST API call with optional parameters.
curl -i -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"name" : "note_command", "command_type":"SparkCommand", "language":"notebook", "note_id":"1000", "tags":["notes"],
"label":"spark2", "arguments":{"Name":"AssetNote", "Year":"2017"}}' \
"https://gcp.qubole.com/api/v1.2/commands"