spark.sql("CREATE TABLE integrated.TrailingWeeks(ID bigint GENERATED BY DEFAULT AS IDENTITY (START WITH 0 increment by 1) ,Week_ID int NOT NULL) USING delta OPTIONS (path 'dbfs:/<Path in Azure datalake>/delta')")
Hi, when you define an identity column in Databricks with GENERATED BY DEFAULT AS IDENTITY (START WITH 0 INCREMENT BY 1), it is expected to start at 0 and increment by 1. However, due to Databricks' distributed architecture, the values may not be strictly consecutive: identity values are unique and increase in the direction of the step, but gaps can occur because each task reserves its own range of IDs.
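To see this in practice, here is a minimal sketch (table and column names are hypothetical) that creates a Delta table with an identity column and inserts a few rows; the generated IDs will increase but are not guaranteed to be consecutive:

spark.sql("""
CREATE TABLE IF NOT EXISTS demo_identity (
  ID BIGINT GENERATED BY DEFAULT AS IDENTITY (START WITH 0 INCREMENT BY 1),
  Week_ID INT NOT NULL
) USING delta
""")
spark.sql("INSERT INTO demo_identity (Week_ID) VALUES (1), (2), (3)")
# IDs increase monotonically but may skip values (e.g. 0, 1, 3),
# because each parallel task reserves its own range of identity values.
spark.sql("SELECT ID, Week_ID FROM demo_identity ORDER BY ID").show()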
While following the video lesson and executing notebook 4.2, I noticed that running the CREATE TABLE "users_jdbc" command generates an EXTERNAL table, while both the video and the notebook describe it as a managed table. Here are some print screens...
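For anyone hitting the same question: a quick way to check how a table was registered is DESCRIBE EXTENDED. Tables whose data Databricks does not manage itself (an explicit path, or an external source such as JDBC) are reported as EXTERNAL even without a LOCATION clause. A minimal sketch, with placeholder connection details:

spark.sql("""
CREATE TABLE users_jdbc
USING JDBC
OPTIONS (
  url = "jdbc:sqlite:/<path-to-db-file>",
  dbtable = "users"
)
""")
# 'Type' shows EXTERNAL: the data lives in the external JDBC source,
# so Databricks only stores the table metadata, not the data itself.
spark.sql("DESCRIBE EXTENDED users_jdbc").show(truncate=False)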
Hi @Tiago Barata, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
The problem: I've observed erratic behavior when I add a comment containing a trailing escape character (\) to a CREATE TABLE statement. For example, this query returns data (though it shouldn't):

CREATE TABLE example_table
SELECT 1
-- This comment has a trailing backslash \
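If you want to reproduce this from a notebook, a minimal sketch (per the report above; actual behavior may vary by Databricks Runtime version) is to submit the statement through spark.sql and inspect what comes back:

# The doubled backslash in the Python string yields a single trailing '\' in the SQL text.
query = """CREATE TABLE example_table
SELECT 1
-- This comment has a trailing backslash \\
"""
result = spark.sql(query)
# Per the report, this unexpectedly returns rows instead of just creating the table.
result.show()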
Hi everyone, I have connected to Cosmos using this tutorial: https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/cosmos/azure-cosmos-spark_3_2-12/Samples/DatabricksLiveContainerMigration. After creating a table using a simple SQL command: CREATE TA...
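For reference, the azure-cosmos-spark connector exposes a Spark catalog, so a table over a Cosmos container is typically created along these lines (a sketch with placeholder endpoint/key and hypothetical database/container names, following the connector's samples):

spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", "<cosmos-endpoint>")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", "<cosmos-key>")

spark.sql("""
CREATE TABLE IF NOT EXISTS cosmosCatalog.myDatabase.myContainer
USING cosmos.oltp
TBLPROPERTIES (partitionKeyPath = '/id', manualThroughput = '400')
""")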
Hey there @Joel iemma, hope all is well! Just wanted to check in: would you be happy to mark an answer as best for us, please? It would be really helpful for the other members too. Cheers!
On the Data tab in the workspace I have the "Create Table" button, which gives me the option to upload a local file as a data source. Can I upload an Excel file here? I'm not sure what kinds of files work for this.
Currently the file types supported there are CSV, JSON, and Avro. You could, however, upload the Excel file to the DBFS path under FileStore and write code in a notebook to parse it and persist it to a table.
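A minimal sketch of that approach, assuming the file was uploaded to /FileStore/tables/my_data.xlsx (hypothetical path) and that openpyxl is available for pandas' Excel support:

import pandas as pd

# pandas reads the uploaded file through the /dbfs fuse mount
pdf = pd.read_excel("/dbfs/FileStore/tables/my_data.xlsx")
# convert to a Spark DataFrame and persist as a managed Delta table
df = spark.createDataFrame(pdf)
df.write.format("delta").saveAsTable("my_excel_table")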