Tencent Cloud

Data Lake Compute


Session Management

Last updated: 2025-03-21 12:22:27
The session management feature records and traces notebook interactive sessions submitted to the DLC engine through the API or Wedata. Through sessions, users can run SQL queries, process data, and train models.

Prerequisites

Prepare the Data Lake Compute (DLC) environment and enable the Tencent Cloud DLC engine service.
Creating a session requires purchasing a job-type engine, which can be either of the following:
SuperSQL job engine.
Standard engine: a Spark engine or a machine learning resource group.

Operation Steps

1. Log in to the DLC Console > Ops Management > Session Management and select the service region.
2. On the session management page, view all historical session records.
3. Filter and view records by engine type, status, Kind, engine name, Session ID, or Session Name.
4. Click a Session Name/ID to view the session details.
5. Click Kill to close a session from the console.
6. View the Spark UI of a session.
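The filtering in step 3 can also be done client-side over session records retrieved via the API. The sketch below is illustrative only: the record fields and values are assumptions for the example, not an official DLC API schema.

```python
# Hypothetical client-side filter over session records, mirroring the
# console's filter options (status, Kind, Session ID, and so on).
# Field names here are illustrative, not an official DLC API schema.

def filter_sessions(sessions, **criteria):
    """Return the sessions whose fields match every given criterion."""
    return [s for s in sessions
            if all(s.get(key) == value for key, value in criteria.items())]

sessions = [
    {"session_id": "livy-session-0001", "status": "idle", "kind": "sql"},
    {"session_id": "livy-session-0002", "status": "busy", "kind": "pyspark"},
    {"session_id": "livy-session-0003", "status": "dead", "kind": "sql"},
]

# Only the first record is both kind "sql" and status "idle".
matches = filter_sessions(sessions, kind="sql", status="idle")
print([s["session_id"] for s in matches])  # → ['livy-session-0001']
```

Matching is exact equality per field; a real console filter may also support partial matches on names.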

Session List

Session Name/ID
The unique identifier of the session.
Sessions created by the SuperSQL job engine have only a Session ID, which follows the rule livy-session-uuid.
Sessions created by the standard engine or Spark engine are named as follows:
User-submitted notebooks are prefixed with session_test.
User-submitted batch SQL is prefixed with temporary-rg.

Status
The state of the current session. Possible states:
not_started: The session has not started. The session request has been accepted, but the session has not yet started for some reason (for example, insufficient resources or a configuration problem). Check the related configuration or resource status to start the session.
starting: The session is starting. Livy is allocating resources and initializing the environment for a new Spark session.
idle: The session has started successfully and is idle. Spark jobs can now be submitted; the Livy session is ready to process requests.
busy: The session is processing one or more jobs. It cannot accept new job requests until the current jobs are completed.
shutting down: The session is shutting down. The user has requested to stop the session, and Livy is cleaning up and releasing resources. The session may stay in this state for a while until all running jobs are completed and resources are released.
error: The session encountered an error during startup or execution. The session usually cannot function normally, possibly due to insufficient resources, configuration errors, or other problems.
dead: The session has died and cannot be recovered.
killed: The session was forcefully terminated. The user actively terminated the session, possibly because it was no longer needed or its jobs had problems. A killed session cannot be recovered.
success: The session completed successfully. All jobs in the session were executed and completed; users can view the results or output.

Engine
The computing engine.

Kind
The session type: Spark, PySpark, SQL, Machine Learning, Python, or MLlib.

Creator
The user who created the session.

Validity period
The running duration of the session.
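The states listed for the Status field follow the Apache Livy session lifecycle. A small helper can classify them, for example when polling a session before submitting work. The grouping into terminal versus active states below is an assumption for illustration, not documented DLC behavior.

```python
# Session states as listed in the Status field above. Grouping them into
# terminal vs. active sets is an illustrative assumption, not an official
# DLC classification.
TERMINAL_STATES = {"error", "dead", "killed", "success"}
ACTIVE_STATES = {"not_started", "starting", "idle", "busy", "shutting down"}

def is_finished(state: str) -> bool:
    """True if the session has reached a state it cannot leave."""
    return state in TERMINAL_STATES

def can_accept_jobs(state: str) -> bool:
    """Per the descriptions above, only an idle session is ready
    to accept new job requests."""
    return state == "idle"

print(is_finished("killed"))      # → True
print(can_accept_jobs("busy"))    # → False
```

A polling client would typically wait while the state is in the active set and stop once a terminal state is reached.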

