 

Upload pipeline on Kubeflow

I am currently trying to set up a Kubeflow pipeline. My use case requires that the pipeline configuration be provided as a YAML/JSON structure. Looking into the documentation for submitting pipelines, I came across this paragraph:

Each pipeline is defined as a Python program. Before you can submit a pipeline to the Kubeflow Pipelines service, you must compile the pipeline to an intermediate representation. The intermediate representation takes the form of a YAML file compressed into a .tar.gz file.
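
For reference, a minimal compile step that produces that .tar.gz could look roughly like this (sketched with the KFP v1 SDK; the pipeline, container image, and file names below are placeholders):

 import kfp
 from kfp import dsl

 # Placeholder pipeline used only to illustrate the compile step
 @dsl.pipeline(name='example-pipeline', description='Illustrative pipeline')
 def example_pipeline():
     dsl.ContainerOp(
         name='echo',
         image='alpine:3.12',
         command=['echo', 'hello'],
     )

 # Compilation produces the intermediate representation: a YAML file
 # packed into a .tar.gz archive
 kfp.compiler.Compiler().compile(example_pipeline, 'example_pipeline.tar.gz')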

Is it possible to upload/submit a pipeline to Kubeflow as a JSON representation, or any representation other than a compressed file (.tar.gz)? Is there a way to bypass persisting the files (zips and .tar.gz) on the filesystem and instead store them in a database as a YAML/JSON representation?

LexByte asked Nov 01 '25

1 Answer

When you compile your Python pipeline code, the result is a compressed file containing a YAML file. You can extract the YAML file after decompressing the archive and store its contents in your database table.
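
As a rough sketch of that step (the archive name, table schema, and the choice of SQLite are only placeholders for your own storage), extracting the YAML and saving it could look like this:

 import tarfile
 import sqlite3

 # Pull the YAML out of the compiled archive
 with tarfile.open('example_pipeline.tar.gz', 'r:gz') as archive:
     member = archive.getmembers()[0]  # the archive holds a single YAML file
     yaml_text = archive.extractfile(member).read().decode('utf-8')

 # Store the raw YAML in a database table (schema is illustrative)
 conn = sqlite3.connect('pipelines.db')
 conn.execute('CREATE TABLE IF NOT EXISTS pipelines (name TEXT, spec TEXT)')
 conn.execute('INSERT INTO pipelines (name, spec) VALUES (?, ?)',
              ('Your Pipeline Name', yaml_text))
 conn.commit()
 conn.close()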

Later, when you want to upload it to Kubeflow, you can use the following code:

 import kfp

 # Path to the pipeline YAML (e.g. written back out from your database)
 pipeline_file_path = 'pipelines.yaml'
 pipeline_name = 'Your Pipeline Name'

 # Connect to the Kubeflow Pipelines API and upload the pipeline definition
 client = kfp.Client()
 pipeline = client.pipeline_uploads.upload_pipeline(
     pipeline_file_path, name=pipeline_name)
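
Since upload_pipeline expects a file path, one way to go from the database back to an upload (a sketch reusing the illustrative SQLite schema above; adjust to your own storage) is to write the YAML to a temporary file first:

 import tempfile
 import sqlite3
 import kfp

 # Fetch the stored YAML (table/column names match the illustrative schema)
 conn = sqlite3.connect('pipelines.db')
 yaml_text = conn.execute(
     'SELECT spec FROM pipelines WHERE name = ?',
     ('Your Pipeline Name',)).fetchone()[0]
 conn.close()

 # Write the YAML to a temporary file so the client has a path to upload
 with tempfile.NamedTemporaryFile(suffix='.yaml', mode='w', delete=False) as tmp:
     tmp.write(yaml_text)
     pipeline_file_path = tmp.name

 client = kfp.Client()
 pipeline = client.pipeline_uploads.upload_pipeline(
     pipeline_file_path, name='Your Pipeline Name')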
Dilip Sharma answered Nov 03 '25


