04-11-2024 02:18 PM
I'm running this code in a Databricks notebook, and I want the table created from the DataFrame in the catalog to have Change Data Feed (CDF) enabled. When I run the code, the table doesn't exist yet.
This code doesn't create the table with CDF enabled; it doesn't set:
delta.enableChangeDataFeed = true
df \
    .write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .option("delta.enableChangeDataFeed", "true") \
    .saveAsTable(my_table_name_in_catalog)
Or is it only possible to enable it via Spark SQL, after the table already exists?
Accepted Solutions
04-12-2024 08:41 AM - edited 04-12-2024 08:43 AM
Hello @drag7ter ,
I don't see anything wrong with your approach; check my repro:
Raphael Balogo
Sr. Technical Solutions Engineer
Databricks
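(The repro was posted as a screenshot and isn't reproduced in text here. As a rough sketch of the same check, with an illustrative table name, you can inspect the table's Delta properties after the write to confirm the option took effect:)

```sql
-- Illustrative table name; substitute your own catalog.schema.table.
-- After df.write ... .option("delta.enableChangeDataFeed", "true").saveAsTable(...),
-- the property should show up in the table metadata:
SHOW TBLPROPERTIES main.default.cdcTest;
-- If CDF was enabled at creation, the output should include a row
-- with key delta.enableChangeDataFeed and value true.
```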
04-13-2024 04:04 AM
Hi @raphaelblg, in your screenshot does the cdcTest table already exist in Unity Catalog, or is it being created for the first time by running your code? It's really strange, as I'm not able to enable it from the PySpark code; only ALTER TABLE works, after the table has been created.
"delta.enableChangeDataFeed", "true"
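(For reference, the ALTER TABLE route mentioned above looks like this in Spark SQL; the table name is illustrative:)

```sql
-- Enable Change Data Feed on an already-existing Delta table.
ALTER TABLE main.default.cdcTest
  SET TBLPROPERTIES (delta.enableChangeDataFeed = true);
```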

