CLI Commands Reference

If you are looking for extensive information on a specific CLI command, this part of the documentation is for you.

dp

dp [OPTIONS] COMMAND [ARGS]...

Options

--version

Show the version and exit.

clean

Delete local working directories

dp clean [OPTIONS]

compile

Create local working directories and build artifacts

dp compile [OPTIONS]

Options

--env <env>

Required Name of the environment

Default

base

--docker-build

Whether to build a Docker image

--docker-tag <docker_tag>

Tag of the Docker image to create

--docker-args <docker_args>

Arguments (in JSON format) required to build the project
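
For example, to compile the project for the base environment and additionally build a Docker image (the image tag and the build arguments below are illustrative):

dp compile --env base --docker-build --docker-tag 1.2.3 --docker-args '{"ARG_NAME": "value"}'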

create

Create a new project using a template

dp create [OPTIONS] PROJECT_PATH [TEMPLATE_PATH]...

Arguments

PROJECT_PATH

Required argument

TEMPLATE_PATH

Optional argument(s)
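
For example, to create a new project in a local directory from a template (the project path and template location below are illustrative):

dp create ./my-project https://github.com/example/pipeline-template.git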

deploy

Push and deploy the project to the remote machine

dp deploy [OPTIONS]

Options

--env <env>

Name of the environment

Default

base

--dags-path <dags_path>

Remote storage URI

--blob-args <blob_args>

Path to a JSON or YAML file with arguments that should be passed to your bucket/blob provider

--docker-push

Whether to push image to the Docker repository

--datahub-ingest

Whether to ingest DataHub metadata
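
For example, to deploy the base environment, push the Docker image, and ingest DataHub metadata (the remote storage URI below is illustrative):

dp deploy --env base --dags-path gs://my-bucket/dags --docker-push --datahub-ingest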

docs-serve

Generate and serve dbt documentation

dp docs-serve [OPTIONS]

Options

--env <env>

Name of the environment

Default

local

--port <port>

Port to be used by the ‘dbt docs serve’ command

Default

9328
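
For example, to serve the documentation on a different port (the port number below is illustrative):

dp docs-serve --port 8080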

generate

Generate additional dbt files

dp generate [OPTIONS] COMMAND [ARGS]...

model-yaml

Generate schema YAML using codegen or dbt-profiler

dp generate model-yaml [OPTIONS] [MODEL_PATH]...

Options

--env <env>

Name of the environment

Default

local

--with-meta

Whether to generate dbt-profiler metadata

--overwrite

Whether to overwrite existing YAML files

Arguments

MODEL_PATH

Optional argument(s)
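
For example, to generate schema YAML files with dbt-profiler metadata for models in a given directory, overwriting existing files (the model path below is illustrative):

dp generate model-yaml --with-meta --overwrite models/staging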

source-sql

Generate SQL files that represent tables in a given dataset

dp generate source-sql [OPTIONS]

Options

--env <env>

Name of the environment

Default

local

--source-yaml-path <source_yaml_path>

Required Path to the ‘source.yml’ schema file

Default

<current working directory>/models/source/source.yml

--staging-path <staging_path>

Required Path to the ‘staging’ directory

Default

<current working directory>/models/staging

--overwrite

Whether to overwrite existing SQL files
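
For example, using paths relative to the project root (the paths below are illustrative):

dp generate source-sql --source-yaml-path models/source/source.yml --staging-path models/staging --overwrite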

source-yaml

Generate source YAML using codegen

dp generate source-yaml [OPTIONS] [SCHEMA_NAME]...

Options

--env <env>

Name of the environment

Default

local

--source-path <source_path>

Required Path to the ‘source’ directory

Default

<current working directory>/models/source

--overwrite

Whether to overwrite an existing YAML file

Arguments

SCHEMA_NAME

Optional argument(s)
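
For example, to generate a source YAML for a single schema (the schema name below is illustrative):

dp generate source-yaml --source-path models/source my_schema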

init

Configure the tool for the first time

dp init [OPTIONS] [CONFIG_PATH]...

Arguments

CONFIG_PATH

Optional argument(s)
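
For example (the configuration location below is illustrative; CONFIG_PATH is optional and may be omitted):

dp init https://github.com/example/dp-config.git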

prepare-env

Prepare the local environment for applications interfacing with dbt

dp prepare-env [OPTIONS]

Options

--env <env>

Name of the environment

publish

Create a dbt package out of the project

dp publish [OPTIONS]

Options

--key-path <key_path>

Required Path to the key with write access to the repository with published packages

--env <env>

Required Name of the environment

Default

base
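
For example (the key path below is illustrative):

dp publish --key-path /path/to/packages_repo_key --env base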

run

Run the project on the local machine

dp run [OPTIONS]

Options

--env <env>

Name of the environment

Default

local
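
For example, to run the project against a specific environment (the environment name below is illustrative):

dp run --env dev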

seed

Run ‘dbt seed’

dp seed [OPTIONS]

Options

--env <env>

Name of the environment

Default

local

template-list

Print a list of all templates saved in the config file

dp template-list [OPTIONS]

test

Run tests of the project on the local machine

dp test [OPTIONS]

Options

--env <env>

Name of the environment

Default

local

update

Update the project from its template

dp update [OPTIONS] [PROJECT_PATH]...

Options

--vcs-ref <vcs_ref>

Git reference to check out

Arguments

PROJECT_PATH

Optional argument(s)
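
For example, to update the project in the current directory to a specific Git reference (the reference below is illustrative):

dp update . --vcs-ref v1.0.0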