data_pipelines_cli.cli_commands package


data_pipelines_cli.cli_commands.clean module

clean() None[source]

Delete local working directories.
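The effect can be pictured with a minimal stdlib sketch; the directory names here are hypothetical placeholders, not necessarily the ones the tool actually removes:

```python
import shutil
from pathlib import Path
from typing import Iterable, List

def clean_working_dirs(
    project_dir: Path, dir_names: Iterable[str] = ("target", "build")
) -> List[str]:
    """Delete local working directories under *project_dir*; return the names removed."""
    removed = []
    for name in dir_names:
        candidate = project_dir / name
        if candidate.is_dir():
            shutil.rmtree(candidate)  # remove the whole directory tree
            removed.append(name)
    return removed
```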

data_pipelines_cli.cli_commands.compile module

compile_project(env: str, docker_build: bool = False) None[source]

Create local working directories and build artifacts.

  • env (str) – Name of the environment

  • docker_build (bool) – Whether to build a Docker image


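A compile step of this shape can be sketched as a sequence of shell commands; the exact dbt subcommands and the image tag below are illustrative assumptions, not the tool's actual invocation:

```python
from typing import List

def compile_steps(env: str, docker_build: bool = False) -> List[List[str]]:
    """Return the sequence of shell commands a compile step could run (sketch only)."""
    steps = [
        ["dbt", "deps"],                      # fetch package dependencies
        ["dbt", "compile", "--target", env],  # build artifacts into ./target
    ]
    if docker_build:
        # hypothetical image name; the real tool derives it from project config
        steps.append(["docker", "build", "-t", f"my-project:{env}", "."])
    return steps
```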

replace_image_settings(docker_args: data_pipelines_cli.data_structures.DockerArgs) None[source]

data_pipelines_cli.cli_commands.create module

create(project_path: str, template_path: Optional[str]) None[source]

Create a new project using a template.

  • project_path (str) – Path to a directory to create

  • template_path (Optional[str]) – Path or URI to the repository of the project template


DataPipelinesError – no template found in .dp.yml config file
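The template-resolution logic behind that error can be sketched as follows; `DataPipelinesError` here is a stand-in class, and picking the first saved template is a simplification (the real tool may prompt the user):

```python
from typing import Dict, Optional

class DataPipelinesError(Exception):
    """Stand-in for the tool's own error type."""

def resolve_template(
    template_path: Optional[str], config_templates: Dict[str, str]
) -> str:
    """Pick the template to use: an explicit path wins, otherwise fall back
    to templates saved in the config; raise when neither is available."""
    if template_path is not None:
        return template_path
    if not config_templates:
        raise DataPipelinesError("No template found in .dp.yml config file")
    # With several saved templates, just take the first one in this sketch.
    return next(iter(config_templates.values()))
```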

data_pipelines_cli.cli_commands.deploy module

class DeployCommand(env: str, docker_push: bool, dags_path: Optional[str], provider_kwargs_dict: Optional[Dict[str, Any]], datahub_ingest: bool)[source]

Bases: object

A class used to push and deploy the project to the remote machine.

blob_address_path: str

URI of the cloud storage to send build artifacts to

datahub_ingest: bool

Whether to ingest DataHub metadata

deploy() None[source]

Push and deploy the project to the remote machine.

docker_args: Optional[data_pipelines_cli.data_structures.DockerArgs]

Arguments required by Docker to push the image to the repository. If set to None, deploy() will not perform a push

provider_kwargs_dict: Dict[str, Any]

Dictionary of arguments required by a specific cloud storage provider, e.g. path to a token, username, password, etc.
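The interplay of these attributes can be sketched with a simplified stand-in class; the action strings are illustrative only, but it mirrors the documented behaviour that `docker_args=None` skips the Docker push:

```python
from typing import Any, Dict, List, Optional

class DeploySketch:
    """Simplified stand-in for DeployCommand showing the order of deploy actions."""

    def __init__(
        self,
        blob_address_path: str,
        docker_args: Optional[Dict[str, Any]] = None,
        datahub_ingest: bool = False,
    ) -> None:
        self.blob_address_path = blob_address_path
        self.docker_args = docker_args
        self.datahub_ingest = datahub_ingest

    def deploy(self) -> List[str]:
        actions = []
        if self.docker_args is not None:  # docker_args=None skips the push
            actions.append("docker push")
        actions.append(f"sync artifacts to {self.blob_address_path}")
        if self.datahub_ingest:
            actions.append("datahub ingest")
        return actions
```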

data_pipelines_cli.cli_commands.init module

init(config_path: Optional[str]) None[source]

Configure the tool for the first time.


config_path (Optional[str]) – URI of the repository with a template of the config file


DataPipelinesError – user does not want to overwrite the existing config file
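The guard behind that error can be sketched like this; `DataPipelinesError` and the function name are stand-ins, not the tool's actual API:

```python
from pathlib import Path

class DataPipelinesError(Exception):
    """Stand-in for the tool's own error type."""

def check_overwrite(config_file: Path, user_confirms: bool) -> None:
    """Abort configuration when the config file already exists and the user
    declines to overwrite it; do nothing otherwise."""
    if config_file.exists() and not user_confirms:
        raise DataPipelinesError("User does not want to overwrite the existing config file")
```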

data_pipelines_cli.cli_commands.prepare_env module

prepare_env(env: str) None[source]

Prepare local environment for use with dbt-related applications.

Prepare local environment for use with applications expecting a “traditional” dbt structure, such as plugins to VS Code. If in doubt, use dp run and dp test instead.


env (str) – Name of the environment
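A "traditional" dbt setup expects a profiles mapping keyed by profile name with per-target outputs. A minimal sketch of assembling such a mapping, with a hypothetical profile name:

```python
from typing import Any, Dict

def build_dbt_profile(env: str, target_config: Dict[str, Any]) -> Dict[str, Any]:
    """Assemble a profiles.yml-shaped mapping for a single environment (sketch only)."""
    return {
        "my_project": {  # hypothetical profile name
            "target": env,
            "outputs": {env: target_config},
        }
    }
```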

data_pipelines_cli.cli_commands.publish module

create_package() pathlib.Path[source]

Create a dbt package out of the built project.


DataPipelinesError – there are no models in the ‘manifest.json’ file.
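The manifest check behind that error can be sketched against the dbt manifest layout (model nodes under `"nodes"` with `"resource_type": "model"`); `DataPipelinesError` and the helper name are stand-ins:

```python
import json

class DataPipelinesError(Exception):
    """Stand-in for the tool's own error type."""

def first_model_package(manifest_text: str) -> str:
    """Return the dbt package name of the first model found in a manifest.json
    payload, raising when the manifest contains no models."""
    manifest = json.loads(manifest_text)
    models = [
        node for node in manifest.get("nodes", {}).values()
        if node.get("resource_type") == "model"
    ]
    if not models:
        raise DataPipelinesError("There are no models in the 'manifest.json' file.")
    return models[0]["package_name"]
```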

publish_package(package_path: pathlib.Path, key_path: str, env: str) None[source]

data_pipelines_cli.cli_commands.run module

run(env: str) None[source]

Run the project on the local machine.


env (str) – Name of the environment

data_pipelines_cli.cli_commands.template module

list_templates() None[source]

Print a list of all templates saved in the config file.

data_pipelines_cli.cli_commands.test module

test(env: str) None[source]

Run tests of the project on the local machine.


env (str) – Name of the environment

data_pipelines_cli.cli_commands.update module

update(project_path: str, vcs_ref: str) None[source]

Update an existing project from its template.

  • project_path (str) – Path to a directory to create

  • vcs_ref (str) – Git reference to checkout in project’s template