Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
I am new to Databricks, so please excuse my ignorance. My requirement is to convert the SQL query below into Databricks SQL. The query reads from the EventLog table and its output goes into EventSummary. These queries can be found here: CREATE TABL...
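Since the original query is cut off, here is only a minimal sketch of the usual pattern in Databricks SQL, with hypothetical EventLog columns and a placeholder aggregation rather than the poster's actual logic:

-- Sketch only: event_type and event_count are placeholder columns,
-- adjust to the real EventLog/EventSummary definitions
CREATE TABLE IF NOT EXISTS EventSummary (
  event_type  STRING,
  event_count BIGINT
);

INSERT INTO EventSummary
SELECT event_type, COUNT(*) AS event_count
FROM EventLog
GROUP BY event_type;

In Databricks SQL a CREATE TABLE ... AS SELECT (CTAS) would also work and avoids declaring the columns twice.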
You may explore the tools and services from Travinto Technologies. They have very good tools. We explored their tool for our code conversion from Informatica, DataStage and Ab Initio to Databricks/PySpark. We also used it for SQL queries, stored ...
When I use SQL code like "create table myTable (column1 string, column2 string) using csv options('delimiter' = ',', 'header' = 'true') location 'pathToCsv'" to create a table from a single CSV file stored in a folder within an Azure Data Lake contai...
Hi @andrew li, when you specify a path with the LOCATION keyword, Spark treats the table as an EXTERNAL table. So when you drop the table, the underlying data, if any, is not cleared. So in your case, as this is an external table, your folder s...
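To make the distinction concrete, here is a small sketch (the path below is a placeholder, not the poster's actual folder): dropping an external table leaves its files in place, while dropping a managed table also removes the data Spark manages.

-- External table: DROP TABLE only removes the metadata, the CSV files stay in the folder
CREATE TABLE myExternalTable (column1 STRING, column2 STRING)
USING CSV
OPTIONS ('delimiter' = ',', 'header' = 'true')
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/path/to/csv';  -- placeholder path

-- Managed table: no LOCATION, so DROP TABLE deletes both the metadata and the data
CREATE TABLE myManagedTable (column1 STRING, column2 STRING);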
I am testing some SQL code based on the book SQL Cookbook, Second Edition, available from https://downloads.yugabyte.com/marketing-assets/O-Reilly-SQL-Cookbook-2nd-Edition-Final.pdf. Based on Page 43, I am OK with the left join, as shown here: However, w...
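The post is cut off before the actual problem, so here is only a generic sketch of the left-join pattern that part of the cookbook works with, assuming its usual emp/dept sample tables; it is not the poster's exact query:

-- List every department and its employees; departments with no
-- matching employees still appear, with NULL in the employee columns
SELECT d.deptno, d.dname, e.ename
FROM dept d
LEFT JOIN emp e
  ON e.deptno = d.deptno;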
I'm new to Databricks and am trying to migrate some existing SQL code into a Databricks notebook. I have some constants declared and I can't find the right way to declare them similarly in my notebook. I tried:
%sql
DECLARE label_language CONSTANT VARCHAR(2...
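That DECLARE ... CONSTANT syntax is not supported in Databricks SQL. One option on recent runtimes is a session variable; this is a sketch assuming DBR 14.1+ (or current Databricks SQL), and not necessarily the approach suggested in the reply below. The 'EN' value and the table/column names are placeholders:

%sql
-- Sketch: session variable as a constant-like value (DBR 14.1+)
DECLARE OR REPLACE VARIABLE label_language STRING DEFAULT 'EN';

-- Reference the variable later in the notebook
SELECT *
FROM labels                      -- hypothetical table
WHERE language_code = label_language;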
Thank you @Hubert Dudek, I gave it a try and it does the job, both in SQL and in Python by the way. When the time comes to industrialize this, I'll have to figure out how to create/use/deal with configuration files (JSON, YAML or so). But for explo...