Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.
In today’s data-driven world, the role of a data engineer is critical in designing and maintaining the infrastructure that allows for the efficient collection, storage, and analysis of large volumes of data. Databricks certifications hold significan...
As an additional tip for those working towards both the Associate and Professional certifications, I recommend avoiding a long gap between the two exams to maintain your momentum. If possible, try to schedule them back-to-back with just a few days in...
I am attempting to append the results of a notebook query into an existing Databricks database table. By chance, would someone share an example of the SQL code with me?
Hi @wheersink, so let's say you created the following table with some sample values.
%sql
CREATE TABLE dev.default.employee (
  id INT,
  name STRING,
  age INT,
  department STRING
);

INSERT INTO dev.default.employee VALUES
(1, 'John Doe', 30, 'Financ...
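The snippet above is cut off, but the append step the original question asks about can be sketched roughly as follows, assuming the notebook query's results are exposed as a temporary view. The names query_results_view and dev.default.new_hires are hypothetical; the target is the employee table created above.
-- Minimal sketch, not the only way to do this: expose the notebook query's
-- results as a temporary view, then append them to the existing table.
-- `query_results_view` and `dev.default.new_hires` are hypothetical names.
CREATE OR REPLACE TEMPORARY VIEW query_results_view AS
SELECT id, name, age, department
FROM dev.default.new_hires;   -- the query whose results you want to append

-- Column order and types must line up with the target table's schema.
INSERT INTO dev.default.employee
SELECT id, name, age, department
FROM query_results_view;
If the results only exist as a DataFrame in the notebook, one option is to register it first with df.createOrReplaceTempView so the same SQL-only append works.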
Unlock the power of the For Each task in Databricks to seamlessly iterate over collections, whether it's a list of table names or any other set of values, and dynamically run tasks with specific parameter values. This powerful feature lets you automate repetitive pr...
Hi all, I am looking for collections of code patterns for Spark SQL, PySpark, and Scala. I'm sure there are at least a few repos on GitHub with snippets and will share them as I find them in this thread. If you come across any good collections, please post your...
AI/BI dashboards now support cross-filtering, which allows you to click on an element in one chart to filter and update related data in other charts. Cross-filtering allows users to interactively explore relationships and patterns across multiple visu...
Added support for SHOW CREATE TABLE for materialized views and streaming tables. This will show the complete CREATE command used at creation time, including properties and schedules. The Catalog Explorer "Overview" tab now shows the full CREATE command for...
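As a quick illustration of the new support (the materialized view name below is made up for the example):
-- Hypothetical materialized view; the statement returns the full CREATE
-- command recorded at creation time, including properties and the schedule.
SHOW CREATE TABLE dev.default.daily_sales_mv;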
LakeFlow Connect introduces simple ingestion connectors for databases, enterprise applications, and file sources. Set up efficient, low-maintenance pipelines in just a few clicks or via an API. Current sources include Salesforce, Workday, and SQL Ser...
We have tried to build connection test logic into our software to check the reachability of the SQL Warehouse, yet the connection parameters do not seem to function in the expected manner. When the SQL Warehouse is running, the connection test functions...
Hello,
To create the connection you would need an endpoint. I would suggest giving a serverless SQL warehouse a try so that you don't have to wait, and for the suggestion on the product you may also submit feedback and share the details of your use cas...
This is not a question, just the solution to a problem we encountered, in case someone from the community finds it useful. Recently our users' jobs started failing out of nowhere on the following command, with the...
AI/BI dashboards can now be managed through Terraform. Dashboard using the serialized_dashboard attribute:
data "databricks_sql_warehouse" "starter" {
  name = "Starter Warehouse"
}

resource "databricks_dashboard" "dashboard" {
  display_name = "...
Hello everyone! I hope everyone knows about the exciting new launch from Databricks: LakeFlow. This innovative tool makes data engineering simpler and more efficient. Let’s understand what LakeFlow offers and how it can help data teams...
AI/BI dashboards improvements: The assistant can now help you edit chart axis titles and control the visibility of chart data labels. You can now assign multiple measures to pivot tables. Added more relative datetime options, such as Last 10 seconds and...
Does anyone know how to set up a DevOps CI/CD pipeline to deploy changes to a Databricks Catalog? I've added a couple of tables in DEV and need to deploy them in Prod. Couldn't find any info on this anywhere.
Thanks @Retired_mod, the links mostly refer to deployments of workspace objects rather than changes to database objects, e.g. a new schema, a table, or a new column added to a table. I'll keep looking...
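For what it's worth, one common pattern (a sketch under assumptions, not an official recipe) is to keep catalog changes as versioned SQL scripts in the repo and have the CI/CD pipeline execute them against each environment; every catalog, schema, and table name below is illustrative:
-- Hypothetical migration script stored in the repo and run by the pipeline
-- against each environment's catalog (e.g. dev_catalog, prod_catalog).
CREATE SCHEMA IF NOT EXISTS prod_catalog.sales;

CREATE TABLE IF NOT EXISTS prod_catalog.sales.orders (
  order_id    BIGINT,
  customer_id BIGINT,
  order_ts    TIMESTAMP,
  amount      DECIMAL(10, 2)
);

-- Later changes (e.g. a new column) go in a separate, versioned script that
-- the pipeline runs once per environment.
ALTER TABLE prod_catalog.sales.orders ADD COLUMNS (order_status STRING);
If you prefer a fully declarative route, the Terraform provider discussed in the next post can also manage Unity Catalog objects such as catalogs and schemas.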
Databricks, a unified data analytics platform, is widely recognized for its ability to process big data and run complex algorithms. When it comes to infrastructure as code, Terraform stands out as a tool that can manage Databricks resources efficient...
Hi @Ajay-Pandey,
Thank you for sharing this comprehensive guide on managing Databricks resources with Terraform. The detailed explanation of the different authentication methods and examples is very helpful. I appreciate you taking the time to put t...