Data Lake Compute

Last updated: 2023-11-08 14:27:50

Versions

Flink Version | Description
1.11          | Unsupported
1.13          | Supported (use as sink)
1.14          | Unsupported
1.16          | Unsupported

Use cases

This connector can be used as a sink. It writes data to native tables (internal tables) managed in Data Lake Compute.

Defining a table in DDL

CREATE TABLE `eason_internal_test` (
    `name` STRING,
    `age` INT
) WITH (
    'connector' = 'dlc-inlong',
    'catalog-database' = 'test',
    'catalog-table' = 'eason_internal_test',
    'default-database' = 'test',
    'catalog-name' = 'HYBRIS',
    'catalog-impl' = 'org.apache.inlong.sort.iceberg.catalog.hybris.DlcWrappedHybrisCatalog',
    'qcloud.dlc.secret-id' = '12345asdfghASDFGH',
    'qcloud.dlc.secret-key' = '678910asdfghASDFGH',
    'qcloud.dlc.region' = 'ap-guangzhou',
    'qcloud.dlc.jdbc.url' = 'jdbc:dlc:dlc.internal.tencentcloudapi.com?task_type=SparkSQLTask&database_name=test&datasource_connection_name=DataLakeCatalog&region=ap-guangzhou&data_engine_name=dailai_test',
    'qcloud.dlc.managed.account.uid' = '100026378089',
    'request.identity.token' = '100026378089',
    'user.appid' = '1257058945',
    'uri' = 'dlc.internal.tencentcloudapi.com'
);
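
Once the sink table is defined, a job writes to it with a regular INSERT INTO statement. The following is a minimal sketch; the datagen source table `datagen_source` and its settings are illustrative assumptions and are not part of this connector.

-- Hypothetical source table used only to illustrate writing into the sink.
CREATE TABLE `datagen_source` (
    `name` STRING,
    `age` INT
) WITH (
    'connector' = 'datagen',
    'rows-per-second' = '1'
);

-- Write the generated rows into the Data Lake Compute internal table defined above.
INSERT INTO `eason_internal_test`
SELECT `name`, `age` FROM `datagen_source`;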

WITH parameters

Common parameters

Option | Required | Default Value | Description
connector | Yes | None | The connector to use. Here, it should be dlc-inlong.
catalog-database | Yes | None | The name of the database where the Data Lake Compute internal table resides.
catalog-table | Yes | None | The name of the Data Lake Compute internal table.
default-database | Yes | None | The name of the database where the Data Lake Compute internal table resides.
catalog-name | Yes | None | The name of the catalog. Here, it should be HYBRIS.
catalog-impl | Yes | None | The implementation class of the catalog. Here, it should be org.apache.inlong.sort.iceberg.catalog.hybris.DlcWrappedHybrisCatalog.
qcloud.dlc.managed.account.uid | Yes | None | The uid of the Data Lake Compute account. Here, it should be 100026378089.
qcloud.dlc.secret-id | Yes | None | The secretId of the Data Lake Compute user, which can be obtained via https://console.tencentcloud.com/cam/capi.
qcloud.dlc.secret-key | Yes | None | The secretKey of the Data Lake Compute user, which can be obtained via https://console.tencentcloud.com/cam/capi.
qcloud.dlc.region | Yes | None | The region where the Data Lake Compute instance resides, in the format ap-region, for example, ap-guangzhou.
qcloud.dlc.jdbc.url | Yes | None | The URL of the Data Lake Compute JDBC connection.
uri | Yes | None | The URI for the Data Lake Compute connection. Here, it should be dlc.internal.tencentcloudapi.com.
user.appid | Yes | None | The appid of the Data Lake Compute user.
request.identity.token | Yes | None | The token for connecting to the Data Lake Compute internal table. Here, it should be 100026378089.
sink.ignore.changelog | No | false | Whether to ignore delete data. Defaults to false. If this option is set to true, the append mode is enabled.
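
If the upstream data contains no updates or deletes, or they can safely be discarded, the sink can run in append mode by setting sink.ignore.changelog to true. The sketch below reuses the sample connection values from the DDL example above; the Flink table name eason_internal_test_append is an assumption for illustration, and only the last option differs.

CREATE TABLE `eason_internal_test_append` (
    `name` STRING,
    `age` INT
) WITH (
    'connector' = 'dlc-inlong',
    'catalog-database' = 'test',
    'catalog-table' = 'eason_internal_test',
    'default-database' = 'test',
    'catalog-name' = 'HYBRIS',
    'catalog-impl' = 'org.apache.inlong.sort.iceberg.catalog.hybris.DlcWrappedHybrisCatalog',
    'qcloud.dlc.secret-id' = '12345asdfghASDFGH',
    'qcloud.dlc.secret-key' = '678910asdfghASDFGH',
    'qcloud.dlc.region' = 'ap-guangzhou',
    'qcloud.dlc.jdbc.url' = 'jdbc:dlc:dlc.internal.tencentcloudapi.com?task_type=SparkSQLTask&database_name=test&datasource_connection_name=DataLakeCatalog&region=ap-guangzhou&data_engine_name=dailai_test',
    'qcloud.dlc.managed.account.uid' = '100026378089',
    'request.identity.token' = '100026378089',
    'user.appid' = '1257058945',
    'uri' = 'dlc.internal.tencentcloudapi.com',
    -- Assumption: ignore delete/update changelog records and write in append mode.
    'sink.ignore.changelog' = 'true'
);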

Configuring a Data Lake Compute table

Upsert mode
-- Statements to create a Data Lake Compute table
CREATE TABLE `bi_sensor` (
    `uuid` string,
    `id` string,
    `type` string,
    `project` string,
    `properties` string,
    `sensors_id` string,
    `time` int,
    `hour` int
) PARTITIONED BY (`time`);

-- Set the target table to Iceberg table format version 2 and enable upsert writes.
ALTER TABLE `bi_sensor` SET TBLPROPERTIES (
    'format-version' = '2',
    'write.metadata.delete-after-commit.enabled' = 'true',
    'write.metadata.previous-versions-max' = '100',
    'write.metadata.metrics.default' = 'full',
    'write.upsert.enabled' = 'true',
    'write.distribution-mode' = 'hash'
);

-- Stream Compute Service (Oceanus) sink DDL. The primary key and the partition field of the
-- Data Lake Compute table must both be included in the primary key defined in Flink.
create table bi_sensors (
    `uuid` STRING,
    `id` STRING,
    `type` STRING,
    `project` STRING,
    `properties` STRING,
    `sensors_id` STRING,
    `time` int,
    `hour` int,
    PRIMARY KEY (`uuid`, `time`) NOT ENFORCED
) with (...)
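
The with (...) clause above takes the same connector options described in the WITH parameters section. A complete sketch of the sink DDL and the write statement might look as follows; the sample credentials and endpoint values are reused from the earlier example, and the upstream table bi_sensor_source is an assumption for illustration only.

create table bi_sensors (
    `uuid` STRING,
    `id` STRING,
    `type` STRING,
    `project` STRING,
    `properties` STRING,
    `sensors_id` STRING,
    `time` int,
    `hour` int,
    PRIMARY KEY (`uuid`, `time`) NOT ENFORCED
) with (
    'connector' = 'dlc-inlong',
    'catalog-database' = 'test',
    'catalog-table' = 'bi_sensor',
    'default-database' = 'test',
    'catalog-name' = 'HYBRIS',
    'catalog-impl' = 'org.apache.inlong.sort.iceberg.catalog.hybris.DlcWrappedHybrisCatalog',
    'qcloud.dlc.secret-id' = '12345asdfghASDFGH',
    'qcloud.dlc.secret-key' = '678910asdfghASDFGH',
    'qcloud.dlc.region' = 'ap-guangzhou',
    'qcloud.dlc.jdbc.url' = 'jdbc:dlc:dlc.internal.tencentcloudapi.com?task_type=SparkSQLTask&database_name=test&datasource_connection_name=DataLakeCatalog&region=ap-guangzhou&data_engine_name=dailai_test',
    'qcloud.dlc.managed.account.uid' = '100026378089',
    'request.identity.token' = '100026378089',
    'user.appid' = '1257058945',
    'uri' = 'dlc.internal.tencentcloudapi.com'
);

-- Upsert rows from a hypothetical upstream table into the Data Lake Compute table.
insert into bi_sensors
select `uuid`, `id`, `type`, `project`, `properties`, `sensors_id`, `time`, `hour`
from bi_sensor_source;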

