
data_push_job

Module to interact with data push jobs.

This module contains the class used to interact with a data push job in EMS data integration.

Typical usage example:
data_push_job = data_pool.create_data_push_job("TABLE_NAME", JobType.REPLACE)
data_push_job.add_file_chunk(file)
data_push_job.execute()

DataPushJob

Bases: DataPushJobBase

Data push job object to interact with data push job specific data integration endpoints.

client class-attribute

client: Client = Field(Ellipsis, exclude=True)

id class-attribute

id: str

data_pool_id class-attribute

data_pool_id: str

from_transport classmethod

from_transport(client, data_push_job_transport)

Creates a high-level data push job object from the given DataPushJobTransport.

Parameters:

  • client (Client) –

    Client to use to make API calls for given data push job.

  • data_push_job_transport (DataPushJobBase) –

    DataPushJobTransport object containing properties of data push job.

Returns:

  • DataPushJob

    A DataPushJob object with properties from transport and given client.

sync

sync()

Syncs data push job properties with EMS.

delete

delete()

Deletes data push job.

add_file_chunk

add_file_chunk(file)

Adds file chunk to data push job.

Parameters:

  • file (typing.Union[io.BytesIO, io.BufferedReader]) –

    File stream to be upserted within data push job.
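Both accepted stream types can be prepared with the standard library: an io.BytesIO wraps bytes already in memory, while opening a file on disk in binary mode yields an io.BufferedReader. A minimal sketch (the serialization format the endpoint expects, e.g. CSV or parquet, is not specified here and depends on your setup; the CSV payload below is purely illustrative):

```python
import io

# In-memory buffer: wrap raw bytes in a BytesIO stream.
csv_bytes = b"id,name\n1,alice\n2,bob\n"
in_memory = io.BytesIO(csv_bytes)

# On-disk file: opening in binary mode yields an io.BufferedReader.
with open("chunk.csv", "wb") as f:
    f.write(csv_bytes)
on_disk = open("chunk.csv", "rb")

# Either stream satisfies the typing.Union[io.BytesIO, io.BufferedReader]
# parameter type and could be passed to add_file_chunk.
print(type(in_memory).__name__, type(on_disk).__name__)
on_disk.close()
```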

add_data_frame

add_data_frame(df, chunk_size=100000)

Splits the data frame into chunks of chunk_size rows and adds each chunk to the data push job.

Parameters:

  • df (pd.DataFrame) –

    Data frame to push with given data push job.

  • chunk_size (int) –

    Number of rows for each chunk.
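The split itself is simple row slicing: a frame of n rows yields ceil(n / chunk_size) chunks, all of chunk_size rows except possibly the last. A local sketch of that arithmetic (split_into_chunks is a hypothetical helper illustrating the behavior, not the library's internal implementation):

```python
import math

import pandas as pd


def split_into_chunks(df: pd.DataFrame, chunk_size: int = 100_000):
    """Yield successive row slices of at most chunk_size rows each."""
    n_chunks = math.ceil(len(df) / chunk_size)
    for i in range(n_chunks):
        yield df.iloc[i * chunk_size : (i + 1) * chunk_size]


df = pd.DataFrame({"value": range(250)})
chunks = list(split_into_chunks(df, chunk_size=100))
print(len(chunks), [len(c) for c in chunks])  # → 3 [100, 100, 50]
```

With the default chunk_size of 100000, a frame smaller than that is pushed as a single chunk.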

get_chunks

get_chunks()

Gets all chunks of given data push job.

Returns:

execute

execute(wait=True)

Executes the given data push job.

Parameters:

  • wait (bool) –

    If True, the function returns only once the data push job has finished executing, and raises an error if the execution fails. If False, the function returns immediately after triggering the execution and does not raise an error if the execution fails.
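Conceptually, wait=True amounts to polling the job until it reaches a terminal state and raising if that state is a failure. A simplified stand-alone sketch of that pattern, with a stubbed status callable in place of real API calls (wait_for_job, get_status, and the status strings are all hypothetical names for illustration, not the library's API):

```python
import time


def wait_for_job(get_status, poll_interval=0.01, timeout=5.0):
    """Poll a status callable until the job reaches a terminal state."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status == "DONE":
            return status
        if status == "ERROR":
            raise RuntimeError("data push job execution failed")
        time.sleep(poll_interval)
    raise TimeoutError("data push job did not finish in time")


# Stubbed status sequence standing in for repeated API calls.
statuses = iter(["QUEUED", "RUNNING", "DONE"])
print(wait_for_job(lambda: next(statuses)))  # → DONE
```

With wait=False, by contrast, the caller only triggers the execution and must check the job's state later (for example via sync).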

Raises: