DataKitchen DataOps Documentation

Tool Documentation

Source & Sink Category

Teradata Data Sources and Sinks are of the database category.

Type Codes

Teradata
Type Code

Data Source

DKDataSource_Teradata

Data Sink

DKDataSink_Teradata

Connection Properties

Field Name
Scope
Input Type
Required?
Description

hostname

Source/Sink

HostInfo

Required

Teradata host name

username

Source/Sink

String

Required

The user to log in as

password

Source/Sink

Password

Required

Teradata password

Define at Kitchen Level

{
    "teradataconfig": {
            "hostname": "#{vault://teradata/hostname}",
            "username": "#{vault://teradata/username}",
            "password": "#{vault://teradata/password}"
    }
}

Expanded Syntax

{
    "name": "teradata_datasource",
    "type": "DKDataSource_Teradata",
    "hostname": "{{teradataconfig.hostname}}",
    "username": "{{teradataconfig.username}}",
    "password": "{{teradataconfig.password}}",
    "keys": {},
    "tests": {}
}
{
    "name": "teradata_datasink",
    "type": "DKDataSink_Teradata",
    "hostname": "{{teradataconfig.hostname}}",
    "username": "{{teradataconfig.username}}",
    "password": "{{teradataconfig.password}}",
    "keys": {},
    "tests": {}
}

Condensed Syntax

Check Your Syntax

Do not use quotes for your condensed connection configuration variables.

{
    "type": "DKDataSource_Teradata",
    "name": "teradata_datasource",
    "config": {{teradataconfig}},
    "keys": {},
    "tests": {}
}
{
    "type": "DKDataSink_Teradata",
    "name": "teradata_datasink",
    "config": {{teradataconfig}},
    "keys": {},
    "tests": {}
}

Additional Properties

Field Name
Scope / Field Type
Description

batch-size

key / integer

The number of rows to send to the server as a single packet. Default 5000. Used by freebcp query type only. Deprecated?

bcp-format-file

key / string

Used when query-type is freebcp.

bool-format

key / string

How a value stored as True or False in a SQL source should be written to a CSV file, and which strings from a CSV should be created as True or False in a SQL sink.

Default values are True and False, but users could specify Yes or 1 and No or 0 instead.

Usage in JSON files:
"bool-format": { "Yes": true, "No": false }

col-delimiter

key / string

Indicates the character(s) used to separate columns in a data file. Defaults to comma (,) or tab (\t) depending on query type.
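
A possible usage in JSON files, assuming a pipe-delimited file (the pipe character is an illustrative choice, not a default):
"col-delimiter": "|"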

date-format

key / datetime

Python datetime format for dates in the delimited file. Default value %Y-%m-%d might be written to a file as 2020-08-27. See https://strftime.org/ for syntax.

datetime-format

key / datetime

Python datetime format for date-time values in the delimited file. Default value %Y-%m-%d %H:%M:%S might be written to a file as 2020-08-27 14:30:22. See https://strftime.org/ for syntax.
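
A possible usage in JSON files; the values shown are simply the defaults listed above:
"date-format": "%Y-%m-%d",
"datetime-format": "%Y-%m-%d %H:%M:%S"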

decimal-format

key / string

Python string format for decimal numbers in the delimited file. Default value %f.

first-row-column-names

key / Boolean

When using CSV format as input, if true, the first row is used as column names, which can be either ignored or used as field names depending on the case.
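
A possible usage in JSON files, assuming a CSV input whose first row contains headers:
"first-row-column-names": true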

format

key / string

Specifies the format of the data being inserted.
Valid values:
csv : Exports the data as CSV
binary : Exports the data in an internal binary format. Useful when moving from SQL datasources to SQL data sinks.
json : Exports the data to JSON format

Binary format preserves data types better, so it is recommended when using DataMapper nodes to move data between different databases.
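
A possible usage in JSON files for a key that moves data between two SQL databases through a DataMapper node:
"format": "binary"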

infer-types

key / Boolean

When true, it attempts to infer the most appropriate data types from the input data. If false, the table-ddl must be specified.
Note that when using CSV input format, type detection may not be accurate. Deprecated?
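
A possible usage in JSON files when types should be declared explicitly rather than inferred (a table-ddl value would then also need to be supplied, as described below):
"infer-types": false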

insert-column-names

key / Boolean

Puts the column names in the output file.

insert-template

key / string

Specifies the DML template used to insert data.

int-format

key / string

Python string format for integers in the delimited file. Default value %d.

max-errors

key / integer

The maximum number of errors allowed on insert.

null-value

key / string

Specifies how null values from a SQL source should be written to a CSV file, and which strings from a CSV should be created as null in a SQL sink. Default value is an empty string, but users could specify NULL or None instead.
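
A possible usage in JSON files, assuming nulls should be written and read as the literal string NULL:
"null-value": "NULL"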

query-type

key / string

Required. Specifies the type of operation used to input data. Valid values:
execute_dml: The insert is performed using a DML statement template provided in the key.
bulk_insert: Inserts all the records in the file using automatically generated DML statements.
freebcp: Uses the BCP command-line tool to upload data. Requires a CSV file as input.

Each data sink type may also offer extra options.
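
As a sketch, a Teradata sink key performing a bulk insert might combine query-type with other keys from this table; the table name is borrowed from the examples further down this page:
"query-type": "bulk_insert",
"table-name": "test_data.td_testing",
"format": "csv"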

row-delimiter

key / string

When using CSV format, it specifies the row delimiter. Use C-style escaping for tab, newline, and carriage return characters (\t, \n, etc.). Note that backslashes must be escaped inside JSON ("row-delimiter": "\t").
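
A possible usage in JSON files for a carriage-return/newline delimiter, following the same escaping style as the \t example above:
"row-delimiter": "\r\n"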

table-ddl

key / string

Specifies the statement used to create the table.
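
A minimal sketch, assuming a simple two-column table; the column names and types are hypothetical, and the table name is borrowed from the examples further down this page:
"table-ddl": "CREATE TABLE test_data.td_testing (id INTEGER, name VARCHAR(100))"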

table-name

key / string

Indicates the table to insert data into, if not specified in the SQL command. Required for freebcp query type.

time-format

key / datetime

Python datetime format for time in the delimited file. Default value %H:%M:%S might be written to a file as 14:30:22. See https://strftime.org/ for syntax.

Built-in Runtime Variables

Built-in Runtime Variable
Scope
Description

key_count

Source/Sink

The count of all keys declared in a given Data Source or Data Sink.

column_count

Key

The count of columns for data associated with a Key. Only available when query-type is either execute_query or execute_row.

result

Key

The result when query-type is set to execute_scalar.

row_count

Key

The count of rows for data associated with a Key. Only available when query-type is either execute_query or execute_row. Table headers are NOT included in the row count value.

UTF-8 File Encoding Required

Files used with data sources and data sinks must be encoded in UTF-8 in order to avoid non-Unicode characters causing errors with row-count tests and problems with sinking data to database tables. For CSV and other delimited files, use the "save as" function in your application and select the proper encoding, or consider using a text editor with encoding options.

Data Sink Example

Example: Data Transfer from FTP Server to Teradata

{
    "type": "DKNode_DataMapper"
}
{
    "mappings": {
        "ftp_to_td": {
            "source-name": "ftp_source",
            "source-key": "ftp_to_td_source",
            "sink-name": "teradata_sink",
            "sink-key": "ftp_to_td_sink"
        }
    }
}
{
    "name": "source",
    "type": "DKDataSource_FTP",
    "config": {
        "username": "vault://ftp/username",
        "hostname": "vault://ftp/hostname",
        "port": "vault://ftp/port",
        "password": "vault://ftp/password",
        "passive": true
    },
    "keys": {
        "ftp_to_td_source": {
            "file-key": "example.csv",
            "use-only-file-key": true
        }
    }
}
{
    "name": "sink",
    "type": "DKDataSink_Teradata",
    "config": {
        "username": "vault://teradata/username",
        "password": "vault://teradata/password",
        "hostname": "vault://teradata/hostname"
    },
    "keys": {
        "ftp_to_td_sink": {
            "query-type": "bulk_insert",
            "table-name": "test_data.td_testing",
            "format": "csv"
        }
    }
}

Data Source Example

Example: Data Transfer from Teradata back to FTP Server

{
    "type": "DKNode_DataMapper"
}
{
    "mappings": {
        "mapping1": {
            "source-name": "teradata_source",
            "source-key": "mapping1_source",
            "sink-name": "ftp_sink",
            "sink-key": "mapping1_sink"
        }
    }
}
{
    "name": "source",
    "type": "DKDataSource_Teradata",
    "config": {
        "username": "vault://teradata/username",
        "password": "vault://teradata/password",
        "hostname": "vault://teradata/hostname"
    },
    "keys": {
        "mapping1_source": {
            "sql": "SELECT * FROM test_data.td_testing",
            "query-type": "execute_query",
            "format": "csv"
        }
    }
}
{
    "name": "sink",
    "type": "DKDataSink_FTP",
    "config": {
        "username": "vault://ftp/username",
        "hostname": "vault://ftp/hostname",
        "port": "vault://ftp/port",
        "password": "vault://ftp/password",
    },
    "keys": {
        "mapping1_sink": {
            "file-key": "output.csv",
            "use-only-file-key": true
        }
    }
}
