+
+
+1. Open the **Import** page for your target TiDB instance.
+
+ 1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
+
+ > **Tip:**
+ >
+ > You can use the combo box in the upper-left corner to switch between organizations and instances.
+
+ 2. Click the name of your target TiDB instance to go to its overview page, and then click **Data** > **Import** in the left navigation pane.
+
+2. Click **Import data from Cloud Storage**.
+
+3. On the **Import Data from Cloud Storage** page, provide the following information:
+
+ - **Storage Provider**: select **Amazon S3**.
+ - **Source Files URI**:
+      - When importing one file, enter the source file URI in the following format: `s3://[bucket_name]/[data_source_folder]/[file_name].csv`. For example, `s3://sampledata/ingest/TableName.01.csv`.
+      - When importing multiple files, enter the source folder URI in the following format: `s3://[bucket_name]/[data_source_folder]/`. For example, `s3://sampledata/ingest/`.
+ - **Credential**: you can use either an AWS Role ARN or an AWS access key to access your bucket. For more information, see [Configure Amazon S3 access](/tidb-cloud/serverless-external-storage.md#configure-amazon-s3-access).
+        - **AWS Role ARN**: enter the AWS Role ARN value. If you need to create a new role, click **Click here to create a new one with AWS CloudFormation** and follow the guided steps to launch the provided template, acknowledge the IAM warning, create the stack, and copy the generated ARN back into this field.
+ - **AWS Access Key**: enter the AWS access key ID and AWS secret access key.
+ - **Test Bucket Access**: click this button after the credentials are in place to confirm that {{{ .premium }}} can reach the bucket.
+ - **Target Connection**: provide the TiDB username and password that will run the import. Optionally, click **Test Connection** to validate the credentials.
+
+4. Click **Next**.
+
+5. In the **Source Files Mapping** section, {{{ .premium }}} scans the bucket and proposes mappings between the source files and destination tables.
+
+ When a directory is specified in **Source Files URI**, the **Use [File naming conventions](/tidb-cloud/naming-conventions-for-data-import.md) for automatic mapping** option is selected by default.
+
+ > **Note:**
+ >
+ > When a single file is specified in **Source Files URI**, the **Use [File naming conventions](/tidb-cloud/naming-conventions-for-data-import.md) for automatic mapping** option is not displayed, and {{{ .premium }}} automatically populates the **Source** field with the file name. In this case, you only need to select the target database and table for data import.
+
+ - Leave automatic mapping enabled to apply the [file naming conventions](/tidb-cloud/naming-conventions-for-data-import.md) to your source files and target tables. Keep **CSV** selected as the data format.
+
+    - **Advanced options**: expand the panel to view the **Ignore compatibility checks (advanced)** toggle. Leave it disabled unless you intentionally want to bypass schema compatibility validation.
+
+ > **Note:**
+ >
+    > Manual mapping is coming soon. When it becomes available, clear the automatic mapping option and configure the mapping manually:
+ >
+ > - **Source**: enter a filename pattern such as `TableName.01.csv`. Wildcards `*` and `?` are supported (for example, `my-data*.csv`).
+ > - **Target Database** and **Target Table**: choose the destination objects for the matched files.
+
+6. {{{ .premium }}} automatically scans the source path. Review the scan results, check the data files found and corresponding target tables, and then click **Start Import**.
+
+7. When the import progress shows **Completed**, check the imported tables.
+
+</div>
+
+<div label="Alibaba Cloud OSS">
+
+1. Open the **Import** page for your target TiDB instance.
+
+    1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
+
+ > **Tip:**
+ >
+ > You can use the combo box in the upper-left corner to switch between organizations and instances.
+
+ 2. Click the name of your target TiDB instance to go to its overview page, and then click **Data** > **Import** in the left navigation pane.
+
+2. Click **Import data from Cloud Storage**.
+
+3. On the **Import Data from Cloud Storage** page, provide the following information:
+
+ - **Storage Provider**: select **Alibaba Cloud OSS**.
+ - **Source Files URI**:
+      - When importing one file, enter the source file URI in the following format: `oss://[bucket_name]/[data_source_folder]/[file_name].csv`. For example, `oss://sampledata/ingest/TableName.01.csv`.
+      - When importing multiple files, enter the source folder URI in the following format: `oss://[bucket_name]/[data_source_folder]/`. For example, `oss://sampledata/ingest/`.
+ - **Credential**: you can use an AccessKey pair to access your bucket. For more information, see [Configure Alibaba Cloud Object Storage Service (OSS) access](/tidb-cloud/serverless-external-storage.md#configure-alibaba-cloud-object-storage-service-oss-access).
+ - **Test Bucket Access**: click this button after the credentials are in place to confirm that {{{ .premium }}} can reach the bucket.
+ - **Target Connection**: provide the TiDB username and password that will run the import. Optionally, click **Test Connection** to validate the credentials.
+
+4. Click **Next**.
+
+5. In the **Source Files Mapping** section, {{{ .premium }}} scans the bucket and proposes mappings between the source files and destination tables.
+
+ When a directory is specified in **Source Files URI**, the **Use [File naming conventions](/tidb-cloud/naming-conventions-for-data-import.md) for automatic mapping** option is selected by default.
+
+ > **Note:**
+ >
+ > When a single file is specified in **Source Files URI**, the **Use [File naming conventions](/tidb-cloud/naming-conventions-for-data-import.md) for automatic mapping** option is not displayed, and {{{ .premium }}} automatically populates the **Source** field with the file name. In this case, you only need to select the target database and table for data import.
+
+ - Leave automatic mapping enabled to apply the [file naming conventions](/tidb-cloud/naming-conventions-for-data-import.md) to your source files and target tables. Keep **CSV** selected as the data format.
+
+    - **Advanced options**: expand the panel to view the **Ignore compatibility checks (advanced)** toggle. Leave it disabled unless you intentionally want to bypass schema compatibility validation.
+
+ > **Note:**
+ >
+    > Manual mapping is coming soon. When it becomes available, clear the automatic mapping option and configure the mapping manually:
+ >
+ > - **Source**: enter a filename pattern such as `TableName.01.csv`. Wildcards `*` and `?` are supported (for example, `my-data*.csv`).
+ > - **Target Database** and **Target Table**: choose the destination objects for the matched files.
+
+6. {{{ .premium }}} automatically scans the source path. Review the scan results, check the data files found and corresponding target tables, and then click **Start Import**.
+
+7. When the import progress shows **Completed**, check the imported tables.
+
+</div>
+
+</SimpleTab>
+
+When you run an import task, if any unsupported or invalid conversions are detected, {{{ .premium }}} terminates the import job automatically and reports an import error.
+
+If you get an import error, do the following:
+
+1. Drop the partially imported table (see the sketch after this list).
+2. Check the table schema file and correct any errors.
+3. Check the data types in the CSV files.
+4. Try the import task again.
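+
+For example, a minimal cleanup sketch in SQL, assuming the partially imported table is the hypothetical `test.products`:
+
+```sql
+-- Drop the partially imported table so that the retried import
+-- starts against an empty table again.
+DROP TABLE IF EXISTS test.products;
+```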
+
+## Troubleshooting
+
+### Resolve warnings during data import
+
+After clicking **Start Import**, if you see a warning message such as `can't find the corresponding source files`, resolve this by providing the correct source file, renaming the existing one according to [Naming Conventions for Data Import](/tidb-cloud/naming-conventions-for-data-import.md), or using **Advanced Settings** to make changes.
+
+After resolving these issues, you need to import the data again.
+
+### Zero rows in the imported tables
+
+After the import progress shows **Completed**, check the imported tables. If the number of rows is zero, it means no data files matched the **Source Files URI** that you entered. In this case, resolve this issue by providing the correct source file, renaming the existing one according to [Naming Conventions for Data Import](/tidb-cloud/naming-conventions-for-data-import.md), or using **Advanced Settings** to make changes. After that, import those tables again.
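+
+For example, a quick row-count check in SQL, assuming the hypothetical target table `test.products`:
+
+```sql
+-- A zero count here means no source files matched the Source Files URI.
+SELECT COUNT(*) FROM test.products;
+```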
diff --git a/tidb-cloud/premium/import-from-s3-premium.md b/tidb-cloud/premium/import-from-s3-premium.md
new file mode 100644
index 0000000000000..8d7e5f5a7f069
--- /dev/null
+++ b/tidb-cloud/premium/import-from-s3-premium.md
@@ -0,0 +1,78 @@
+---
+title: Import Data from Amazon S3 into {{{ .premium }}}
+summary: Learn how to import CSV files from Amazon S3 into {{{ .premium }}} instances using the console wizard.
+---
+
+# Import Data from Amazon S3 into {{{ .premium }}}
+
+This document describes how to import CSV files from Amazon Simple Storage Service (Amazon S3) into {{{ .premium }}} instances. The steps reflect the current private preview user interface and serve as an initial framework for the upcoming public preview launch.
+
+> **Warning:**
+>
+> {{{ .premium }}} is currently available in **private preview** in select AWS regions.
+>
+> If Premium is not yet enabled for your organization, or if you need access in another cloud provider or region, click **Support** in the lower-left corner of the [TiDB Cloud console](https://tidbcloud.com/), or submit a request through the [Contact Us](https://www.pingcap.com/contact-us) form on the website.
+
+> **Tip:**
+>
+> - For {{{ .starter }}} or Essential, see [Import CSV Files from Cloud Storage into {{{ .starter }}} or Essential](/tidb-cloud/import-csv-files-serverless.md).
+> - For {{{ .dedicated }}}, see [Import CSV Files from Cloud Storage into {{{ .dedicated }}}](/tidb-cloud/import-csv-files.md).
+
+## Limitations
+
+- To ensure data consistency, {{{ .premium }}} allows importing CSV files into empty tables only. If the target table already contains data, import into a staging table first and then copy the rows using the `INSERT ... SELECT` statement (see the sketch after this list).
+- During the private preview, the user interface supports Amazon S3 as the only storage provider. Support for additional providers will be added in future releases.
+- Each import job maps a single source pattern to one destination table.
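+
+For example, a minimal sketch of the staging-table approach, assuming hypothetical `products` and `products_staging` tables with identical schemas:
+
+```sql
+-- Import the CSV files into the empty staging table first, then
+-- append the staged rows to the target table that already has data.
+INSERT INTO products
+SELECT * FROM products_staging;
+
+-- Optionally drop the staging table afterwards.
+DROP TABLE products_staging;
+```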
+
+## Step 1. Prepare the CSV files
+
+1. If a CSV file is larger than 256 MiB, consider splitting it into smaller files of around 256 MiB each so that {{{ .premium }}} can process them in parallel (see the sketch after this list).
+2. Name your CSV files according to the Dumpling naming conventions:
+ - Full-table files: use the `${db_name}.${table_name}.csv` format.
+ - Sharded files: append numeric suffixes, such as `${db_name}.${table_name}.000001.csv`.
+ - Compressed files: use the `${db_name}.${table_name}.${suffix}.csv.${compress}` format.
+3. Optional schema files (`${db_name}-schema-create.sql`, `${db_name}.${table_name}-schema.sql`) help {{{ .premium }}} create databases and tables automatically.
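+
+For example, a minimal GNU coreutils sketch for splitting and renaming in one pass, assuming a hypothetical source file `mydb.products.csv`; if your CSV files carry a header row, strip it from the chunks first:
+
+```bash
+# Split at line boundaries into chunks of at most 256 MiB, and name the
+# chunks following the sharded convention (mydb.products.000001.csv, ...).
+split -C 256M --numeric-suffixes=1 -a 6 --additional-suffix=.csv \
+    mydb.products.csv mydb.products.
+```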
+
+## Step 2. Create target schemas (optional)
+
+If you want {{{ .premium }}} to create the databases and tables automatically, place the schema files generated by Dumpling in the same S3 directory. Otherwise, create the databases and tables manually in {{{ .premium }}} before running the import.
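+
+For example, a sketch of uploading the Dumpling-generated schema files next to the data files with the AWS CLI, reusing the hypothetical names from Step 1 and the example bucket used in the import wizard:
+
+```bash
+# Place the schema files in the same S3 directory as the CSV files so
+# that the import wizard can create the database and table automatically.
+aws s3 cp mydb-schema-create.sql s3://sampledata/ingest/
+aws s3 cp mydb.products-schema.sql s3://sampledata/ingest/
+```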
+
+## Step 3. Configure access to Amazon S3
+
+To allow {{{ .premium }}} to read your bucket, use either of the following methods:
+
+- Provide an AWS Role ARN that trusts TiDB Cloud and grants the `s3:GetObject` and `s3:ListBucket` permissions on the relevant paths.
+- Provide an AWS access key (access key ID and secret access key) with equivalent permissions.
+
+The wizard includes a helper link labeled **Click here to create a new one with AWS CloudFormation**. Follow this link if you need {{{ .premium }}} to pre-fill a CloudFormation stack that creates the role for you.
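+
+Before relying on **Test Bucket Access** in the wizard, you can sanity-check the permissions locally with the AWS CLI, assuming the same hypothetical bucket and file names as above:
+
+```bash
+# Listing the prefix requires s3:ListBucket on the bucket.
+aws s3 ls s3://sampledata/ingest/
+
+# Reading an object requires s3:GetObject; print the first rows only.
+aws s3 cp s3://sampledata/ingest/mydb.products.000001.csv - | head -n 3
+```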
+
+## Step 4. Import CSV files from Amazon S3
+
+1. In the [TiDB Cloud console](https://tidbcloud.com/), navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page, and then click the name of your TiDB instance.
+2. In the left navigation pane, click **Data** > **Import**, and choose **Import data from Cloud Storage**.
+3. In the **Source Connection** dialog:
+ - Set **Storage Provider** to **Amazon S3**.
+ - Enter the **Source Files URI** for a single file (`s3://bucket/path/file.csv`) or for a folder (`s3://bucket/path/`).
+ - Choose **AWS Role ARN** or **AWS Access Key** and provide the credentials.
+ - Click **Test Bucket Access** to validate connectivity.
+
+4. Click **Next** and provide the TiDB SQL username and password for the import job. Optionally, test the connection.
+5. Review the automatically generated source-to-target mapping. Disable automatic mapping if you need to define custom patterns and destination tables.
+6. Click **Next** to run the pre-check. Resolve any warnings about missing files or incompatible schemas.
+7. Click **Start Import** to launch the job group.
+8. Monitor the job statuses until they show **Completed**, then verify the imported data in TiDB Cloud.
+
+## Troubleshooting
+
+- If the pre-check reports zero files, verify the S3 path and IAM permissions.
+- If jobs remain in **Preparing**, ensure that the destination tables are empty and the required schema files exist.
+- Use the **Cancel** action to stop a job group if you need to adjust mappings or credentials.
+
+## Next steps
+
+- See [Import Data into {{{ .premium }}} using the MySQL Command-Line Client](/tidb-cloud/premium/import-with-mysql-cli-premium.md) for scripted imports.
+- See [Troubleshoot Access Denied Errors during Data Import from Amazon S3](/tidb-cloud/troubleshoot-import-access-denied-error.md) for IAM-related problems.
diff --git a/tidb-cloud/premium/import-with-mysql-cli-premium.md b/tidb-cloud/premium/import-with-mysql-cli-premium.md
new file mode 100644
index 0000000000000..39185d496ad95
--- /dev/null
+++ b/tidb-cloud/premium/import-with-mysql-cli-premium.md
@@ -0,0 +1,179 @@
+---
+title: Import Data into {{{ .premium }}} using the MySQL Command-Line Client
+summary: Learn how to import small CSV or SQL files into {{{ .premium }}} instances using the MySQL Command-Line Client (`mysql`).
+---
+
+# Import Data into {{{ .premium }}} using the MySQL Command-Line Client
+
+This document describes how to import data into {{{ .premium }}} using the [MySQL Command-Line Client](https://dev.mysql.com/doc/refman/8.0/en/mysql.html) (`mysql`). The following sections provide step-by-step instructions for importing data from SQL or CSV files. This process performs a logical import, where the MySQL Command-Line Client replays SQL statements from your local machine against TiDB Cloud.
+
+> **Warning:**
+>
+> {{{ .premium }}} is currently available in **private preview** in select AWS regions.
+>
+> If Premium is not yet enabled for your organization, or if you need access in another cloud provider or region, click **Support** in the lower-left corner of the [TiDB Cloud console](https://tidbcloud.com/), or submit a request through the [Contact Us](https://www.pingcap.com/contact-us) form on the website.
+
+> **Tip:**
+>
+> - Logical imports are best suited for relatively small SQL or CSV files. For faster, parallel imports from cloud storage or to process multiple files from [Dumpling](https://docs.pingcap.com/tidb/stable/dumpling-overview) exports, see [Import CSV Files from Cloud Storage into {{{ .premium }}}](/tidb-cloud/premium/import-csv-files-premium.md).
+> - For {{{ .starter }}} or Essential, see [Import Data into {{{ .starter }}} or Essential via MySQL CLI](/tidb-cloud/import-with-mysql-cli-serverless.md).
+> - For {{{ .dedicated }}}, see [Import Data into {{{ .dedicated }}} via MySQL CLI](/tidb-cloud/import-with-mysql-cli.md).
+
+## Prerequisites
+
+Before importing data into a {{{ .premium }}} instance via the MySQL Command-Line Client, make sure that:
+
+- You have access to your {{{ .premium }}} instance.
+- The MySQL Command-Line Client (`mysql`) is installed on your local computer.
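+
+To confirm that the client is available on your `PATH`, you can print its version (the exact output varies by platform and client distribution):
+
+```bash
+# Any recent MySQL (or compatible) command-line client works.
+mysql --version
+```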
+
+## Step 1. Connect to your {{{ .premium }}} instance
+
+Connect to your TiDB instance using the MySQL Command-Line Client. If this is your first time connecting, perform the following steps to configure the network connection and generate the TiDB SQL `root` user password:
+
+1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page. Then, click the name of your target instance to go to its overview page.
+
+2. Click **Connect** in the upper-right corner. A connection dialog is displayed.
+
+3. Ensure that the configurations in the connection dialog match your operating environment.
+
+ - **Connection Type** is set to `Public`.
+ - **Connect With** is set to `MySQL CLI`.
+ - **Operating System** matches your environment.
+
+ > **Note:**
+ >
+ > {{{ .premium }}} instances have the public endpoint disabled by default. If you do not see the `Public` option, enable the public endpoint on the instance details page (under the **Network** tab), or ask an organization admin to enable it before proceeding.
+
+4. Click **Generate Password** to create a random password. If you have already configured a password, reuse that credential or rotate it before proceeding.
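+
+After the dialog shows your connection parameters, you can verify connectivity with a quick probe. The angle-bracketed values are placeholders for the details shown in the dialog:
+
+```bash
+# A successful connection prints the TiDB server version string.
+mysql --comments --connect-timeout 150 \
+    -u '<username>' -h <host> -P 4000 \
+    --ssl-mode=VERIFY_IDENTITY --ssl-ca=<CA_path> \
+    -p<your_password> -e "SELECT VERSION();"
+```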
+
+## Step 2. Define the target database and table schema
+
+Before importing data, create the target table structure that matches your dataset.
+
+The following is an example SQL file (`products-schema.sql`) that creates a sample database and table. Update the database or table names to match your environment.
+
+```sql
+CREATE DATABASE IF NOT EXISTS test;
+USE test;
+
+CREATE TABLE products (
+ product_id INT PRIMARY KEY,
+ product_name VARCHAR(255),
+ price DECIMAL(10, 2)
+);
+```
+
+Run the schema file against your {{{ .premium }}} instance so the database and table exist before you load data in the next step.
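+
+For example, a sketch of applying the schema file with the same angle-bracketed connection placeholders used elsewhere in this document:
+
+```bash
+# Creates the test database and the products table defined above.
+mysql --comments --connect-timeout 150 \
+    -u '<username>' -h <host> -P 4000 \
+    --ssl-mode=VERIFY_IDENTITY --ssl-ca=<CA_path> \
+    -p<your_password> < products-schema.sql
+```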
+
+## Step 3. Import data from an SQL or CSV file
+
+Use the MySQL Command-Line Client to load data into the schema you created in Step 2. Replace the placeholders with your own file paths, credentials, and dataset as needed, then follow the workflow that matches your source format.
+
+<SimpleTab>
+
+<div label="From an SQL file">
+
+Do the following to import data from an SQL file:
+
+1. Provide an SQL file (for example, `products.sql`) that contains the data you want to import. This SQL file must include `INSERT` statements with data, similar to the following:
+
+ ```sql
+ INSERT INTO products (product_id, product_name, price) VALUES
+ (1, 'Laptop', 999.99),
+ (2, 'Smartphone', 499.99),
+ (3, 'Tablet', 299.99);
+ ```
+
+2. Use the following command to import data from the SQL file:
+
+ ```bash
+    mysql --comments --connect-timeout 150 \
+        -u '<username>' -h <host> -P 4000 -D test \
+        --ssl-mode=VERIFY_IDENTITY --ssl-ca=<CA_path> \
+        -p<your_password> < products.sql
+ ```
+
+    Replace the placeholder values (for example, `<username>`, `<host>`, `<CA_path>`, `<your_password>`, and the SQL file name) with your own connection details and file path.
+
+> **Note:**
+>
+> The sample schema creates a `test` database and the commands use `-D test`. Change both the schema file and the `-D` parameter if you plan to import into a different database.
+
+
+
+The SQL user you authenticate with must have the required privileges (for example, `CREATE` and `INSERT`) to define tables and load data into the target database.
+
+</div>
+
+<div label="From a CSV file">
+
+Do the following to import data from a CSV file:
+
+1. Ensure the target database and table exist in TiDB (for example, the `products` table you created in Step 2).
+
+2. Provide a sample CSV file (for example, `products.csv`) that contains the data you want to import. The following is an example:
+
+ **products.csv:**
+
+ ```csv
+ product_id,product_name,price
+ 1,Laptop,999.99
+ 2,Smartphone,499.99
+ 3,Tablet,299.99
+ ```
+
+3. Use the following command to import data from the CSV file:
+
+ ```bash
+    mysql --comments --connect-timeout 150 \
+        -u '<username>' -h <host> -P 4000 -D test \
+        --ssl-mode=VERIFY_IDENTITY --ssl-ca=<CA_path> \
+        -p<your_password> \
+        -e "LOAD DATA LOCAL INFILE '<file_path>' INTO TABLE products
+        FIELDS TERMINATED BY ','
+        LINES TERMINATED BY '\n'
+        IGNORE 1 LINES (product_id, product_name, price);"
+ ```
+
+    Replace the placeholder values (for example, `<username>`, `<host>`, `<CA_path>`, `<your_password>`, `<file_path>`, and the table name) with your own connection details and dataset paths.
+
+> **Note:**
+>
+> For more syntax details about `LOAD DATA LOCAL INFILE`, see [`LOAD DATA`](/sql-statements/sql-statement-load-data.md).
+
+</div>
+
+</SimpleTab>
+
+## Step 4. Validate the imported data
+
+After the import is complete, run basic queries to verify that the expected rows are present and the data is correct.
+
+Use the MySQL Command-Line Client to connect to the same database and run validation queries, such as counting rows and inspecting sample records:
+
+```bash
+mysql --comments --connect-timeout 150 \
+    -u '<username>' -h <host> -P 4000 -D test \
+    --ssl-mode=VERIFY_IDENTITY --ssl-ca=<CA_path> \
+    -p<your_password> \
+    -e "SELECT COUNT(*) FROM products; SELECT * FROM products LIMIT 5;"
+```