ALTER JOB

Description

Modify an existing Job.

Note that the SQL statement does not end with a semicolon (;).

Usage Notes:

  • You may modify only one parameter per ALTER JOB API command. To modify multiple parameters, issue multiple commands.
  • You cannot alter the Job's Type after creation.

Syntax

ALTER JOB <job_name>
[ SET CLUSTER = <string> ]
[ SET PARAMETERS = <array> ]

Example

Example commands for modifying a Job.

ALTER JOB count_transactions
SET CLUSTER = 'high_prio_cluster_spark'

ALTER JOB count_transactions
SET PARAMETERS = ( '--class', 'com.example.MySparkApp', '/path/to/my-spark-app.jar', 'arg1', 'arg2' )

Required parameters

  • <job_name>: Unique name to identify the Job (max 100 characters).

Optional parameters

Use one optional parameter

You must use exactly one of the optional parameters in the query.

  • CLUSTER: Specify the name of an existing Onehouse Cluster with type Spark to run the Job.
  • PARAMETERS: Specify an array of Strings to pass as parameters to the Job, which will be passed to spark-submit (see the Apache Spark docs). The array should include the following (see the sketch after this list):
    • For JAR Jobs, you must include the --class parameter.
    • Optionally, include any other Spark properties you'd like the Job to use.
    • Include the cloud storage bucket path containing the code for your Job. The Onehouse agent must have access to read this path.
    • Optionally, include any arguments you'd like to pass to the Job.
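For instance, a minimal sketch of a JAR Job's parameter array is shown below. The Spark property passed via --conf, the cloud storage bucket path, and the trailing application arguments are illustrative placeholders; substitute the values for your own Job.

ALTER JOB count_transactions
SET PARAMETERS = ( '--class', 'com.example.MySparkApp', '--conf', 'spark.executor.memory=4g', 's3://my-bucket/jars/my-spark-app.jar', 'arg1', 'arg2' )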

Status API

Status API response

  • API_OPERATION_STATUS_SUCCESS from the Status API indicates that the Job has been modified.
  • API_OPERATION_STATUS_FAILED from the Status API does not necessarily mean the Job modification failed. To confirm whether the Job was modified, send a request to the DESCRIBE JOB API and check the relevant fields in sparkJob (see the example after this list).
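For example, a minimal sketch of such a check is shown below, assuming DESCRIBE JOB accepts the Job name the same way ALTER JOB does; refer to the DESCRIBE JOB reference for the exact syntax. If the fields under sparkJob reflect the new cluster or parameters, the modification was applied.

DESCRIBE JOB count_transactions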

Example Status API response

The Status API response after a Job is successfully modified.

{
  "apiStatus": "API_OPERATION_STATUS_SUCCESS",
  "apiResponse": {}
}