livy.batch

class livy.batch.LivyBatch(url, batch_id, auth=None, verify=True, requests_session=None)[source]

Manages a remote Livy batch and high-level interactions with it.

Parameters
  • url (str) – The URL of the Livy server.

  • batch_id (int) – The ID of the Livy batch.

  • auth (Union[AuthBase, Tuple[str, str], None]) – A requests-compatible auth object to use when making requests.

  • verify (Union[bool, str]) – Either a boolean, in which case it controls whether we verify the server’s TLS certificate, or a string, in which case it must be a path to a CA bundle to use. Defaults to True.

  • requests_session (Optional[Session]) – A specific requests.Session to use, allowing advanced customisation. The caller is responsible for closing the session.
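As a sketch, attaching to an already-running batch looks like the following. The server URL, batch ID and credentials are illustrative placeholders, not real endpoints:

```python
def attach_to_batch(url, batch_id, username, password):
    """Attach to an existing Livy batch and return its current state.

    `url`, `batch_id` and the credentials are hypothetical; substitute
    values for your own Livy deployment.
    """
    from livy.batch import LivyBatch

    # HTTP basic auth can be passed as a (user, password) tuple, as
    # accepted by the requests library.
    batch = LivyBatch(url, batch_id, auth=(username, password), verify=True)
    return batch.state


# Example call (requires a running Livy server):
# attach_to_batch("http://livy.example.com:8998", 42, "user", "secret")
```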

classmethod create(url, file, auth=None, verify=True, requests_session=None, class_name=None, args=None, proxy_user=None, jars=None, py_files=None, files=None, driver_memory=None, driver_cores=None, executor_memory=None, executor_cores=None, num_executors=None, archives=None, queue=None, name=None, spark_conf=None)[source]

Create a new Livy batch session.

The py_files, files, jars and archives arguments are lists of URLs, e.g. ["s3://bucket/object", "hdfs://path/to/file", ...] and must be reachable by the Spark driver process. If the provided URL has no scheme, it's considered to be relative to the default file system configured in the Livy server.

URLs in the py_files argument are copied to a temporary staging area and inserted into Python’s sys.path ahead of the standard library paths. This allows you to import .py, .zip and .egg files in Python.

URLs for jars, py_files, files and archives arguments are all copied to the same working directory on the Spark cluster.

The driver_memory and executor_memory arguments have the same format as JVM memory strings with a size unit suffix ("k", "m", "g" or "t"), for example 512m or 2g.

See https://spark.apache.org/docs/latest/configuration.html for more information on Spark configuration properties.

Parameters
  • url (str) – The URL of the Livy server.

  • file (str) – File containing the application to execute.

  • auth (Union[AuthBase, Tuple[str, str], None]) – A requests-compatible auth object to use when making requests.

  • verify (Union[bool, str]) – Either a boolean, in which case it controls whether we verify the server’s TLS certificate, or a string, in which case it must be a path to a CA bundle to use. Defaults to True.

  • requests_session (Optional[Session]) – A specific requests.Session to use, allowing advanced customisation. The caller is responsible for closing the session.

  • class_name (Optional[str]) – Application Java/Spark main class.

  • args (Optional[List[str]]) – Command line arguments for the application.

  • proxy_user (Optional[str]) – User to impersonate when starting the session.

  • jars (Optional[List[str]]) – URLs of jars to be used in this session.

  • py_files (Optional[List[str]]) – URLs of Python files to be used in this session.

  • files (Optional[List[str]]) – URLs of files to be used in this session.

  • driver_memory (Optional[str]) – Amount of memory to use for the driver process (e.g. '512m').

  • driver_cores (Optional[int]) – Number of cores to use for the driver process.

  • executor_memory (Optional[str]) – Amount of memory to use per executor process (e.g. '512m').

  • executor_cores (Optional[int]) – Number of cores to use for each executor.

  • num_executors (Optional[int]) – Number of executors to launch for this session.

  • archives (Optional[List[str]]) – URLs of archives to be used in this session.

  • queue (Optional[str]) – The name of the YARN queue to which the session is submitted.

  • name (Optional[str]) – The name of this session.

  • spark_conf (Optional[Dict[str, Any]]) – Spark configuration properties.

Return type

LivyBatch
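Putting the parameters above together, a batch submission might look like this sketch. All resource URLs, sizes and names are hypothetical examples, not defaults:

```python
def submit_pyspark_batch(url):
    """Submit a PySpark application as a new Livy batch.

    The file URLs, argument values and resource sizes below are
    illustrative; only the parameter names come from LivyBatch.create.
    """
    from livy.batch import LivyBatch

    batch = LivyBatch.create(
        url,
        file="s3://bucket/jobs/etl.py",             # main application file
        py_files=["s3://bucket/libs/helpers.zip"],  # extra importable code
        args=["--date", "2024-01-01"],              # application arguments
        driver_memory="2g",                         # JVM-style size suffix
        executor_memory="4g",
        executor_cores=2,
        num_executors=4,
        name="nightly-etl",
        spark_conf={"spark.sql.shuffle.partitions": 64},
    )
    return batch
```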

wait()[source]

Wait for the batch session to finish.

Return type

SessionState

property state

The state of the managed Spark batch.

Return type

SessionState
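wait() blocks until the batch reaches a terminal state, while the state property performs a single non-blocking check. A minimal sketch, assuming SessionState is an enum whose member names mirror Livy's session states:

```python
def run_to_completion(batch):
    """Block until `batch` finishes and return the final state's name.

    `batch` is assumed to be a LivyBatch for an already-submitted
    application; member names such as "success" are Livy session states.
    """
    final_state = batch.wait()  # blocks, polling the server
    # Assuming SessionState is an Enum, `.name` gives a readable label.
    return final_state.name
```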

log(from_=None, size=None)[source]

Get logs for this Spark batch.

Parameters
  • from_ (Optional[int]) – The line number to start getting logs from.

  • size (Optional[int]) – The number of lines of logs to get.

Return type

List[str]
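The from_ and size arguments allow paging through the log. The helper below is a sketch of one paging strategy; the exact semantics of offsets near the end of the log depend on the Livy server:

```python
def print_logs(batch, page_size=100):
    """Print all currently available log lines for a batch, page by page.

    Paging stops once an empty page is returned; this termination
    condition is an assumption about the server's behaviour.
    """
    offset = 0
    while True:
        lines = batch.log(from_=offset, size=page_size)
        if not lines:
            break
        for line in lines:
            print(line)
        offset += len(lines)
```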

kill()[source]

Kill the managed Spark batch session.

Return type

None
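Since killing an already-finished batch is unnecessary, a caller might first check the state. The set of terminal state values below is an assumption based on Livy's documented session states:

```python
def kill_if_running(batch):
    """Kill a batch unless it already reached a terminal state.

    The terminal state strings here are assumed to match the values of
    the SessionState enum; adjust them for your Livy version.
    """
    terminal = {"success", "dead", "killed"}
    if batch.state.value not in terminal:
        batch.kill()  # returns None
```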