Tencent Cloud

Data Lake Compute


Overview of SuperSQL Statement

Last updated: 2024-10-30 17:25:33
With a single set of standard SQL statements, DLC runs almost seamlessly on both the DLC Serverless Spark and DLC Serverless Presto engines. Metadata and analysis statements and functions are largely compatible with Hive and Spark syntax, and custom functions are supported. For the supported built-in system functions, see Unified Functions; for Presto built-in functions, see Presto Built-in Functions. When querying and analyzing data in external Iceberg tables on DLC, some statements may differ from those for native tables; for details, see Iceberg External Tables vs. Native Tables Statement Differences.
The statements supported by DLC are listed below:

DDL Statements

Database-Related Statements

Purpose
Statement
Create a database.
Display all databases defined in this metadata.
View database attributes.
Change database attributes.
Change database storage location.
Delete a database.
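As an illustration of the database operations above, the following is a minimal sketch in standard Spark SQL syntax; the database name demo_db and its properties are hypothetical, and the exact DLC statement forms are documented on the individual statement pages.

```sql
-- Hypothetical database name and properties; standard Spark SQL syntax.
CREATE DATABASE IF NOT EXISTS demo_db COMMENT 'sample database';
SHOW DATABASES;
DESCRIBE DATABASE EXTENDED demo_db;                       -- view database attributes
ALTER DATABASE demo_db SET DBPROPERTIES ('owner' = 'data-team');
DROP DATABASE IF EXISTS demo_db CASCADE;
```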

Data Table-Related Statements

Create a data table.
Update a table snapshot.
Query data table creation statements.
Query table attributes.
Query all tables in a database.
View column information and metadata of the data table.
Query column information of the data table.
Add a column to the data table.
Change a field name.
Delete a field from the data table.
Add a partition to the data table.
List table partitions.
Delete a partition from the data table.
Add a partition field to the Iceberg table.
Delete a partition field from the Iceberg table.
Change data table attributes.
Change the data table storage location.
Change the sorting method of data in the table.
Change the data distribution policy of a partitioned table.
Add the identifier fields attribute.
Delete the identifier fields attribute.
Update partition information.
Collect statistics on the data table.
Delete a metadata table.
Display the logical or physical plan for executing SQL statements (EXPLAIN).
Call a table-related stored procedure.
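A minimal sketch of common table operations, assuming a hypothetical table demo_db.sales and Spark SQL syntax; the declaration of the partition column inside the schema list follows the Iceberg-style form, and Hive-format tables declare partition columns separately.

```sql
-- Hypothetical table demo_db.sales; Spark SQL / Iceberg-style syntax.
CREATE TABLE demo_db.sales (
  id BIGINT,
  item STRING,
  price DOUBLE,
  sale_date DATE
) PARTITIONED BY (sale_date);

SHOW CREATE TABLE demo_db.sales;                          -- query the creation statement
ALTER TABLE demo_db.sales ADD COLUMNS (region STRING);    -- add a column
SHOW PARTITIONS demo_db.sales;                            -- list table partitions
EXPLAIN SELECT item, SUM(price) FROM demo_db.sales GROUP BY item;
```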

Statements Related to View

Create a view based on the results of a SELECT statement.
Query views in the database.
Check view attributes.
Display view creation statements.
Check view columns.
Change a view name.
Change view attributes.
Delete a view (DROP VIEW).
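The view lifecycle can be sketched as follows in standard Spark SQL syntax; the view and table names are hypothetical.

```sql
-- Hypothetical view over a hypothetical table; standard Spark SQL syntax.
CREATE VIEW demo_db.recent_sales AS
SELECT id, item, price FROM demo_db.sales WHERE sale_date >= DATE '2024-01-01';

SHOW CREATE TABLE demo_db.recent_sales;                   -- display the view creation statement
ALTER VIEW demo_db.recent_sales RENAME TO demo_db.latest_sales;
DROP VIEW IF EXISTS demo_db.latest_sales;
```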

Statements Related to Function

Create a function.
View function creation statements.
Delete a function.
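As an illustration, a custom function can be registered from a JAR in the Hive/Spark-style form below; the class name and COS path are hypothetical, and the packaging details are covered in the UDF Function Development Guide.

```sql
-- Hypothetical UDF class and JAR path; Hive/Spark-style CREATE FUNCTION syntax.
CREATE FUNCTION demo_db.str_len AS 'com.example.udf.StrLen'
  USING JAR 'cosn://example-bucket/udfs/str-len.jar';

DESCRIBE FUNCTION EXTENDED demo_db.str_len;               -- view function details
DROP FUNCTION IF EXISTS demo_db.str_len;
```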

DML Statements

Insert a row of data.
Replace a row of data.
Update data by row (replaces INSERT OVERWRITE).
Query Iceberg table metadata.
Insert query results into the data table.
Delete data from an Iceberg table.
Update the specified rows (UPDATE).
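The row-level DML above can be sketched as follows, assuming hypothetical Iceberg tables demo_db.sales and demo_db.sales_staging and Spark SQL syntax; MERGE INTO is the row-level alternative to rewriting partitions with INSERT OVERWRITE.

```sql
-- Hypothetical Iceberg tables; Spark SQL row-level DML.
INSERT INTO demo_db.sales VALUES (1, 'widget', 9.99, DATE '2024-06-01');
INSERT INTO demo_db.sales SELECT * FROM demo_db.sales_staging;   -- insert query results
UPDATE demo_db.sales SET price = 10.99 WHERE id = 1;             -- update specified rows
DELETE FROM demo_db.sales WHERE sale_date < DATE '2024-01-01';   -- delete data

-- Update or insert by row instead of rewriting with INSERT OVERWRITE.
MERGE INTO demo_db.sales AS t
USING demo_db.sales_staging AS s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```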

DQL Statements

Query data.
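A typical query, assuming the hypothetical demo_db.sales table used above:

```sql
-- Hypothetical table; a standard aggregate query.
SELECT item, COUNT(*) AS order_cnt, SUM(price) AS revenue
FROM demo_db.sales
WHERE sale_date >= DATE '2024-01-01'
GROUP BY item
ORDER BY revenue DESC
LIMIT 10;
```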
For related reserved words, see Reserved Words.

