data_push_job
Module to interact with data push jobs.
This module contains the class to interact with a data push job in EMS data integration.
Typical usage example:
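A minimal sketch of the typical flow, assuming a connected Celonis instance and a data pool that exposes a `create_data_push_job` helper (the lookup and creation helpers are assumptions; `add_data_frame` and `execute` are documented below):

```python
import pandas as pd
from pycelonis import get_celonis

celonis = get_celonis()  # assumes URL and API token are configured, e.g. via environment variables

# Assumption: data pool lookup and push-job creation helpers; only DataPushJob itself is documented here.
data_pool = celonis.data_integration.get_data_pool("<data_pool_id>")
data_push_job = data_pool.create_data_push_job(target_name="MY_TABLE")

df = pd.DataFrame({"ID": [1, 2, 3], "VALUE": ["a", "b", "c"]})
data_push_job.add_data_frame(df)
data_push_job.execute(wait=True)
```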
DataPushJob ¶
Bases: DataPushJobBase
Data push job object for interacting with data push job-specific data integration endpoints.
target_name instance-attribute ¶
Name of table where data is pushed to.
connection_id instance-attribute ¶
Id of data connection where data is pushed to.
from_transport classmethod ¶
Creates high-level data push job object from given DataPushJobTransport.
Parameters:

- client (Client) – Client to use to make API calls for given data push job.
- data_push_job_transport (DataPushJobBase) – DataPushJobTransport object containing properties of data push job.
Returns:

- DataPushJob – A DataPushJob object with properties from transport and given client.
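A brief sketch, assuming a DataPushJobTransport obtained from a lower-level API call; the import path and the `client` attribute used here are assumptions:

```python
from pycelonis.ems import DataPushJob  # assumption: import path of DataPushJob

# `transport` is a DataPushJobTransport retrieved elsewhere, e.g. from a raw API response.
data_push_job = DataPushJob.from_transport(
    client=celonis.client,              # assumption: the client used for API calls
    data_push_job_transport=transport,
)
print(data_push_job.target_name, data_push_job.connection_id)
```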
add_file_chunk ¶
Adds file chunk to data push job.
Parameters:

- file (typing.Union[io.BytesIO, io.BufferedReader]) – File stream to be upserted within data push job.
Examples:
Create data push job to replace table:
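A minimal sketch, assuming a data pool helper `create_data_push_job` (not documented here) and a parquet file whose schema matches the target table:

```python
# Assumption: helper on a data pool that creates a replace-type push job for the target table.
data_push_job = data_pool.create_data_push_job(target_name="MY_TABLE")

# Upload the parquet file as one chunk (an io.BufferedReader), then execute the push.
with open("chunk.parquet", "rb") as file:
    data_push_job.add_file_chunk(file)
data_push_job.execute(wait=True)
```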
delete_file_chunk ¶
Deletes file chunk from data push job.
Parameters:

- file (typing.Union[io.BytesIO, io.BufferedReader]) – File stream to be deleted within data push job.
Examples:
Create data push job to delete table:
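A minimal sketch, assuming a delete-type data push job created via the assumed `create_data_push_job` helper and a parquet file identifying the rows to delete:

```python
# Assumption: helper on a data pool that creates a delete-type push job for the target table.
data_push_job = data_pool.create_data_push_job(target_name="MY_TABLE")

# Register the parquet file whose rows should be deleted, then execute the push.
with open("rows_to_delete.parquet", "rb") as file:
    data_push_job.delete_file_chunk(file)
data_push_job.execute(wait=True)
```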
add_data_frame ¶
Splits data frame into chunks of size chunk_size
and adds each chunk to data push job.
Parameters:

- df (pd.DataFrame) – Data frame to push with given data push job.
- chunk_size (int) – Number of rows for each chunk.
- index (typing.Optional[bool]) – Whether index is included in parquet file. See pandas documentation.
Examples:
Create data push job to replace table:
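A minimal sketch, assuming the same `create_data_push_job` helper as above; the data frame is split into parquet chunks and pushed:

```python
import pandas as pd

# Assumption: helper on a data pool that creates a replace-type push job for the target table.
data_push_job = data_pool.create_data_push_job(target_name="MY_TABLE")

df = pd.DataFrame({"ID": range(100_000), "VALUE": ["x"] * 100_000})

# Split into chunks of 10,000 rows; drop the data frame index from the parquet files.
data_push_job.add_data_frame(df, chunk_size=10_000, index=False)
data_push_job.execute(wait=True)
```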
delete_data_frame ¶
Splits data frame into chunks of size chunk_size
and deletes each chunk from data push job.
Parameters:

- df (pd.DataFrame) – Data frame to delete with given data push job.
- chunk_size (int) – Number of rows for each chunk.
- index (typing.Optional[bool]) – Whether index is included in parquet file. See pandas documentation.
Examples:
Create data push job to delete table:
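A minimal sketch, again assuming the `create_data_push_job` helper; the data frame identifies the rows to remove:

```python
import pandas as pd

# Assumption: helper on a data pool that creates a delete-type push job for the target table.
data_push_job = data_pool.create_data_push_job(target_name="MY_TABLE")

# Rows with these IDs should be removed from the target table.
rows_to_delete = pd.DataFrame({"ID": [42, 43, 44]})

data_push_job.delete_data_frame(rows_to_delete, chunk_size=10_000, index=False)
data_push_job.execute(wait=True)
```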
get_chunks ¶
Gets all chunks of given data push job.
Returns:

- CelonisCollection[typing.Optional[DataPushChunk]] – A list containing all chunks.
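A brief sketch, assuming a data push job to which chunks have already been added:

```python
# Inspect how many chunks are currently registered with the job.
chunks = data_push_job.get_chunks()
print(f"Data push job has {len(chunks)} chunks")
```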
execute ¶
Execute given data push job.
Parameters:

- wait (bool) – If True, the function returns only once the data push job has been executed and raises an error if the reload fails. If False, the function returns after triggering the execution and does not raise an error if it fails.
Raises:

- PyCelonisDataPushJobNotNew – Data push job has already been executed and can't be executed again.
- PyCelonisDataPushExecutionFailedError – Data push job execution failed. Only raised if wait=True.
Examples:
Create data push job to replace table:
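A minimal sketch, assuming the `create_data_push_job` helper and an assumed import path for the documented error class:

```python
import pandas as pd

from pycelonis.errors import PyCelonisDataPushExecutionFailedError  # assumption: import path

# Assumption: helper on a data pool that creates a replace-type push job for the target table.
data_push_job = data_pool.create_data_push_job(target_name="MY_TABLE")
data_push_job.add_data_frame(pd.DataFrame({"ID": [1, 2, 3]}))

try:
    # Blocks until the push has been processed; raises if the reload fails.
    data_push_job.execute(wait=True)
except PyCelonisDataPushExecutionFailedError:
    # An executed job cannot be executed again; inspect the failure and create a new job if needed.
    print("Data push job execution failed")
```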