Composing a Shell Command
Use the command composer on the Workbench page to compose a shell command.
See Running a Shell Command for more information.
Note
Hadoop 2 and Spark clusters support shell commands. See Mapping of Cluster and Command Types for more information. Some Cloud platforms do not support all cluster types.
Qubole does not recommend running a Spark application as a Bash command under the Shell command options, because automatic changes, such as an increase in the Application Coordinator memory based on the driver memory and the availability of debug options, do not occur. These automatic changes do occur when you run a Spark application through the Command Line option.
Perform the following steps to compose a shell command:
- Navigate to the Workbench page and click + Create New.
- Select Shell from the drop-down list near the center of the top of the page.
- Shell Script is selected by default from the drop-down list near the right side of the top of the page. You can also specify the location of a script in Cloud storage.
- To use a shell command, enter it in the text field. If you are using a script, specify any script parameters in the text field. (A sample script follows these steps.)
- In the Optional List of Files text field, optionally list files (separated by commas) to be copied from Cloud storage to the working directory where the command is run.
- In the Optional List of Archives text field, optionally list archive files (separated by commas) to be uncompressed in the working directory where the command is run.
- Click Run to execute the command.
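The following is a minimal sketch of an inline script you might enter in the text field, together with hypothetical entries for the optional fields; the bucket, file, and script names are illustrative only. Because listed files are copied to the working directory and listed archives are uncompressed there, the script can refer to them by name.

    #!/bin/bash
    # Minimal sketch of an inline shell script for the command composer.
    # Assumes these hypothetical entries in the optional fields:
    #   Optional List of Files:    s3://your-bucket/conf/settings.properties
    #   Optional List of Archives: s3://your-bucket/archives/tools.tar.gz
    set -e

    # The listed file is copied into the working directory before the command runs.
    cat settings.properties

    # The listed archive is uncompressed into the working directory; the tools/
    # directory and run_report.sh script depend on how the archive was built.
    ./tools/run_report.sh --date "$(date +%F)"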
You can see the result under the Results tab, and the logs under the Logs tab. For more information on how to download command results and logs, see Downloading Results and Logs.
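As an alternative to downloading them from the UI, you can also retrieve results and logs for a completed command over the Qubole REST API. The sketch below assumes the api.qubole.com endpoint and API version v1.2; the command ID and API token are placeholders.

    # Sketch: fetch results and logs for a completed command via the REST API.
    # Replace <command-id> with the ID shown in Workbench and set your API token.
    AUTH_TOKEN="your-api-token"
    COMMAND_ID="<command-id>"

    # Command results
    curl -s -H "X-AUTH-TOKEN: ${AUTH_TOKEN}" \
         -H "Accept: application/json" \
         "https://api.qubole.com/api/v1.2/commands/${COMMAND_ID}/results"

    # Command logs
    curl -s -H "X-AUTH-TOKEN: ${AUTH_TOKEN}" \
         "https://api.qubole.com/api/v1.2/commands/${COMMAND_ID}/logs"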