CREATE JOB

Description

Create a new Job.

Note that the SQL statement does not end with a semicolon (;).

Syntax

CREATE JOB <job_name>
TYPE = { 'JAR' | 'PYTHON' }
PARAMETERS = <array>
CLUSTER = <string>

Example

Example command for creating a JAR Job.

CREATE JOB count_transactions
TYPE = 'JAR'
PARAMETERS = ( '--class', 'com.example.MySparkApp', '/path/to/my-spark-app.jar', 'arg1', 'arg2' )
CLUSTER = 'onehouse_cluster_spark'
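
A hypothetical sketch of creating a Python Job follows; the bucket path, script name, Spark property, and arguments are placeholders, not values from this page. Note that Python Jobs do not require the --class parameter.

CREATE JOB daily_report
TYPE = 'PYTHON'
PARAMETERS = ( '--conf', 'spark.executor.memory=4g', 's3://my-bucket/scripts/daily_report.py', '--date', '2024-01-01' )
CLUSTER = 'onehouse_cluster_spark'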

Required parameters

  • <job_name>: Unique name to identify the Job (max 100 characters).
  • TYPE: Specify the type of Job: 'JAR' (for Java or Scala code) or 'PYTHON' (for a Python script).
  • PARAMETERS: Specify an array of strings to pass as parameters to the Job; these are used in a spark-submit invocation (see the Apache Spark docs and the sketch after this list). This should include the following:
    • [Required] For JAR Jobs, you must include the --class parameter.
    • [Required] Include the cloud storage bucket path containing the code for your Job. The Onehouse agent must have access to read this path.
    • [Optional] Include any other Spark properties you'd like the Job to use.
    • [Optional] Include any arguments you'd like to pass to the Job.
  • CLUSTER: Specify the name of an existing Onehouse Cluster of type Spark to run the Job.
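
For reference, the PARAMETERS array in the JAR example above maps onto a spark-submit command line in order. As a rough sketch (the exact invocation the Onehouse agent constructs may differ):

spark-submit --class com.example.MySparkApp /path/to/my-spark-app.jar arg1 arg2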

Status API

Status API response

  • An API_OPERATION_STATUS_SUCCESS response from the Status API indicates that the Job has been created.
  • An API_OPERATION_STATUS_FAILED response from the Status API does not necessarily mean the Job was not created. To confirm whether the Job was created, send a request to the DESCRIBE JOB API, as sketched below.
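
For example, to confirm the Job created above (a sketch, assuming DESCRIBE JOB accepts the Job name):

DESCRIBE JOB count_transactions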

Example Status API response

The Status API response for a successful Job creation.

{
  "apiStatus": "API_OPERATION_STATUS_SUCCESS",
  "apiResponse": {}
}