Data Engineering

Need to load data from Databricks into a Snowflake table with an auto-increment ID column

ms_221
New Contributor II

I want to load the data from a DataFrame df (say with 3 columns c1, c2, c3) into a Snowflake table, say test1, which has columns c1, c2, c3 plus an auto-increment ID column.

The df and the Snowflake table (test1) have the same column definitions and data types; the target table just has the additional ID column.

 

1 REPLY

Walter_C
Databricks Employee

To load data from a DataFrame into a Snowflake table with an autoincrement ID column, you can follow these steps:

  1. First, ensure that your Snowflake table (test1) is created with an autoincrement ID column:

    CREATE OR REPLACE TABLE test1 (
        ID INT AUTOINCREMENT START 1 INCREMENT 1,
        c1 VARCHAR,
        c2 VARCHAR,
        c3 VARCHAR
    );
  2. Then, use the write_pandas method from the Snowflake Connector for Python to load the data from your DataFrame into the Snowflake table. Here's how you can do it:
    import pandas as pd
    from snowflake.connector.pandas_tools import write_pandas
    
    # Assuming you have already established a connection to Snowflake
    # and have a cursor object named 'cursor'
    
    # Your DataFrame
    df = pd.DataFrame({
        'c1': ['value1', 'value2', 'value3'],
        'c2': ['value4', 'value5', 'value6'],
        'c3': ['value7', 'value8', 'value9']
    })
    
    # Write the DataFrame to Snowflake
    success, nchunks, nrows, _ = write_pandas(
        conn=cursor.connection,
        df=df,
        table_name='test1',
        database='your_database',
        schema='your_schema',
        quote_identifiers=False  # match the unquoted (uppercase) identifiers created in step 1
    )
    
    if success:
        print(f"Successfully loaded {nrows} rows in {nchunks} chunks")
    else:
        print("Data load failed")
