06-04-2025 07:13 AM
Been trying to create an Iceberg table natively in Databricks on a cluster running Runtime 16.4. I also have the Iceberg JAR file for Spark 3.5.2.
Using a simple command such as:
%sql
CREATE OR REPLACE TABLE catalog1.default.iceberg (
  a INT
)
USING iceberg;
is running into the error: "Failed to find the data source: iceberg. Make sure the provider name is correct and the package is properly registered and compatible with your Spark version".
My question is: can we build these Iceberg tables natively in Databricks (assuming the private preview is also turned on and the JAR file is loaded), or do we have to use an external client to build them and then push them to Databricks somehow? Or are only specific formats (Parquet, etc.) allowed?
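As a point of comparison, a sketch of a quick sanity check (delta_check is just a throwaway table name): the same DDL with Databricks' default Delta provider needs no preview or extra JARs, so if it succeeds the failure is isolated to the iceberg provider not being registered.
%sql
-- Sketch of a sanity check: the default Delta provider needs no preview
-- or extra JARs, so success here suggests the original failure is about
-- the `iceberg` provider not being registered, not the DDL itself.
-- (delta_check is a hypothetical, throwaway table name.)
CREATE OR REPLACE TABLE catalog1.default.delta_check (
  a INT
)
USING delta;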
06-04-2025 07:29 AM
Creating an Iceberg table natively with
CREATE OR REPLACE TABLE <catalog>.<schema>.<table>
USING iceberg
requires the managed Iceberg private preview to be enabled; without it, errors such as "Failed to find the data source: iceberg" may arise. For optimal compatibility and functionality, users should follow Databricks' guidelines and the preview-specific configurations.
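Once the preview is enabled and the CREATE statement succeeds, a quick way to confirm the table's provider is standard Spark SQL metadata, sketched below against the table name from the question:
%sql
-- Assumes the managed Iceberg preview is enabled and the original
-- CREATE ... USING iceberg has succeeded; this lists the table's columns
-- plus detailed metadata, including the provider.
DESCRIBE EXTENDED catalog1.default.iceberg;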
06-04-2025 07:57 AM
Hey Lou,
That helps quite a bit and clears up the confusion on my end. Quick question: is the managed private preview enabled by Databricks (the account rep)? I'm assuming the answer is yes here, just wanted to make sure.
Thanks!
06-04-2025 08:52 AM
Hey Petergriffin1, I don't know the exact process, but your best bet is what you suggested above. Start with your AE; he/she may direct you to your SA, and they will take it from there. Hope this helps. Lou.