Databricks Clean Rooms: Now Generally Available on AWS and Azure

We’re thrilled to announce the General Availability (GA) of Databricks Clean Rooms on AWS and Azure, a significant step forward in enabling secure, privacy-centric data collaboration. Following the Public Preview launch in August, we've engaged close...

  • 102 Views
  • 0 replies
  • 1 kudos
Wednesday
Welcoming BladeBridge to Databricks: Accelerating Data Warehouse Migrations to Lakehouse

Databricks welcomes BladeBridge, a proven provider of AI-powered migration solutions for enterprise data warehouses. Together, Databricks and BladeBridge will help enterprises accelerate the work required to migrate legacy data warehouses like Oracle...

  • 114 Views
  • 0 replies
  • 0 kudos
Wednesday
Serverless Compute for Notebooks, Workflows and Pipelines is now Generally Available on Google Cloud

In the rapidly evolving landscape of data engineering and analytics, speed, scalability, and simplicity are invaluable. Serverless compute addresses these needs by eliminating the complexity of managing infrastructure, allowing you to focus on buildi...

  • 130 Views
  • 0 replies
  • 0 kudos
Wednesday
Securely share data, analytics and AI

Reduce the cost of sharing across platforms. Gartner predicts that CDOs who have successfully executed data-sharing initiatives are 1.7 times more effective in showing business value and ROI from their data analytics strategy. Databricks provides an ...

  • 515 Views
  • 0 replies
  • 3 kudos
a week ago
Check Out the Latest Videos on DatabricksTV

We are thrilled to introduce DatabricksTV – your go-to hub for all things Databricks! DatabricksTV is packed with insightful videos from a diverse range of creators, offering you the latest tips, tutorials, and deep dives into the Databricks Platform...

  • 1822 Views
  • 0 replies
  • 3 kudos
07-30-2024
Data Intelligence for Data Engineers

Join us to find out how a platform built on lakehouse architecture and enhanced with built-in data intelligence automates many of the tasks that bog down engineers. You’ll discover how the Databricks Data Intelligence Platform helps you build secure...

  • 1022 Views
  • 0 replies
  • 2 kudos
2 weeks ago

Community Activity

ohnomydata
by Visitor
  • 8 Views
  • 0 replies
  • 0 kudos

Accidentally deleted files via API

Hello, I’m hoping you might be able to help me. I have accidentally deleted some Workspace files via API (an Azure DevOps code deployment pipeline). I can’t see the files in my Trash folder – are they gone forever, or is it possible to recover them on ...

Somia
by New Contributor
  • 83 Views
  • 6 replies
  • 2 kudos

Resolved! SQL query is not returning _sqldf.

Notebooks in my workspace are not returning _sqldf when a SQL query is run. If I run this code, it gives an error in the second cell that _sqldf is not defined. First cell: %sql select * from some_table limit 10. Second cell: %sql select * from _sqldf. Howev...

Latest Reply
Somia
New Contributor
  • 2 kudos

Changing the notebook to default Python and all-purpose compute has fixed the issue. I am able to access _sqldf in subsequent SQL or Python cells.

5 More Replies
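The behaviour behind this thread can be sketched without the notebook magic. This is not Databricks' implementation, only an illustration: in a Python notebook, the result of a %sql cell is bound to the name `_sqldf`, and an explicit spark.sql(...) call is the equivalent that does not depend on that binding. Table and view names are hypothetical.

```python
# Sketch: capture a query result explicitly instead of relying on the
# implicit `_sqldf` binding. `spark` is a parameter here only so the sketch
# stays self-contained; in a notebook it is the ambient SparkSession.
def query_then_refine(spark):
    # Stands in for the first %sql cell: SELECT * FROM some_table LIMIT 10
    first = spark.sql("SELECT * FROM some_table LIMIT 10")
    # Register the intermediate result so a second query can build on it,
    # instead of writing `FROM _sqldf`.
    first.createOrReplaceTempView("first_result")
    return spark.sql("SELECT COUNT(*) AS n FROM first_result")
```

The explicit temp view also survives notebook-language and compute changes, which is why it is often preferred over `_sqldf` in multi-cell workflows.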
pradeepvatsvk
by New Contributor II
  • 155 Views
  • 2 replies
  • 0 kudos

Polars to natively read and write through ADLS

Hi everyone, is there a way Polars can directly read files from ADLS through the abfss protocol?

Latest Reply
jennifer986bloc
New Contributor II
  • 0 kudos

@pradeepvatsvk wrote: Hi everyone, is there a way Polars can directly read files from ADLS through the abfss protocol? Hello @pradeepvatsvk, yes, Polars can directly read files from Azure Data Lake Storage (ADLS) using the ABFS (Azure Blob Filesystem) prot...

1 More Replies
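A minimal sketch of the pattern discussed in this thread: Polars can read from ADLS Gen2 via the abfss:// scheme by passing credentials through the `storage_options` parameter of its read functions. All account, container, path, and credential values below are hypothetical placeholders, and the actual read is commented out because it needs Polars installed, network access, and real credentials.

```python
# Hypothetical service-principal credentials for ADLS Gen2; in practice these
# should come from a secret scope, not literals.
storage_options = {
    "account_name": "mystorageaccount",
    "tenant_id": "<tenant-id>",
    "client_id": "<client-id>",
    "client_secret": "<client-secret>",
}

# abfss://<container>@<account>.dfs.core.windows.net/<path>
uri = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw/events.parquet"

# The actual read, assuming a recent Polars version with cloud support:
# import polars as pl
# df = pl.read_parquet(uri, storage_options=storage_options)
```

This reads directly through Polars' own object-store backend, without going through Spark.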
hiryucodes
by Visitor
  • 81 Views
  • 1 replies
  • 0 kudos

ModuleNotFound when running DLT pipeline

My new DLT pipeline gives me a ModuleNotFound error when I try to request data from an API. For some more context, I develop in my local IDE and then deploy to databricks using asset bundles. The pipeline runs fine if I try to write a static datafram...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @hiryucodes, Ensure that the directory structure of your project is correctly set up. The module 'src' should be in a directory that is part of the Python path. For example, if your module is in a directory named 'src', the directory structure sho...

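The layout fix described in the reply can be sketched as follows. The bundle path and module names are hypothetical placeholders, not values from the thread.

```python
# Assumed project layout:
#   my_bundle/
#     src/
#       __init__.py
#       api_client.py      # the module the pipeline fails to import
#     pipelines/
#       my_pipeline.py     # does `import src.api_client`
import sys

# Inside the deployed pipeline, make the bundle root importable *before* the
# `import src.api_client` line runs.
bundle_root = "/Workspace/Users/you@example.com/.bundle/my_bundle/files"  # hypothetical
if bundle_root not in sys.path:
    sys.path.insert(0, bundle_root)

# import src.api_client  # resolves once the bundle root is on sys.path
```

The same effect can often be achieved declaratively by keeping the pipeline source inside the bundle so the deployed root is already the working directory.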
Rafael-Sousa
by Contributor II
  • 40 Views
  • 3 replies
  • 0 kudos

Managed Delta Table corrupted

Hey guys, recently we added some properties to our Delta table, and after that the table shows an error and we cannot do anything. The error is: (java.util.NoSuchElementException) key not found: spark.sql.statistics.totalSize. I think maybe this i...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Rafael-Sousa, could you please raise a support case so we can investigate this further? help@databricks.com

2 More Replies
samtech
by Visitor
  • 27 Views
  • 1 replies
  • 0 kudos

DAB multiple workspaces

Hi, we have 3 regional workspaces. Assume that we keep a separate folder for notebooks per region, say amer/xx, apac/xx, emea/xx, and separate job/pipeline configurations for each region in Git. How do we make sure that during deploy the appropriate jobs/pipelines are deployed in r...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @samtech, Define separate bundle configuration files for each region. These configuration files will specify the resources (notebooks, jobs, pipelines) and their respective paths. For example, you can have amer_bundle.yml, apac_bundle.yml, and eme...

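The per-region setup described in the reply can also be expressed as deployment targets inside a single Databricks Asset Bundle. This is a minimal sketch, not a complete configuration; the bundle name, workspace URLs, and file layout are hypothetical.

```yaml
# databricks.yml — one bundle, one deployment target per regional workspace
bundle:
  name: regional-jobs

include:
  - resources/*.yml   # shared job/pipeline definitions

targets:
  amer:
    workspace:
      host: https://adb-amer.azuredatabricks.net   # hypothetical URL
  apac:
    workspace:
      host: https://adb-apac.azuredatabricks.net   # hypothetical URL
  emea:
    workspace:
      host: https://adb-emea.azuredatabricks.net   # hypothetical URL
```

With this layout, "databricks bundle deploy -t amer" deploys only to the AMER workspace, and region-specific overrides (paths, cluster sizes) can live under each target.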
Phani1
by Valued Contributor II
  • 8 Views
  • 1 replies
  • 0 kudos

Databricks On-Premises or in Private Cloud

Hi all, is it possible to store/process the data on-premises or in a private cloud with Databricks? Will this choice affect costs and performance? Please advise, as the customer wants the data stored on-premises or in a private cloud for security reas...

Latest Reply
TakuyaOmi
Valued Contributor II
  • 0 kudos

@Phani1 Databricks does not provide a product that can be directly installed and self-managed on on-premises or private cloud environments. Instead, Databricks primarily operates as a managed service on public cloud platforms such as AWS, Azure, and ...

cpayne_vax
by New Contributor III
  • 16672 Views
  • 12 replies
  • 9 kudos

Resolved! Delta Live Tables: dynamic schema

Does anyone know if there's a way to specify an alternate Unity Catalog schema in a DLT workflow using the @dlt.table syntax? In my case, I'm looping through folders in Azure Data Lake Storage to ingest data. I'd like those folders to get created in different...

Latest Reply
kuldeep-in
Databricks Employee
  • 9 kudos

@user1234567899 Make sure to enable DPM (direct publishing mode) from the Previews page. Once enabled, you should be able to use the schema name in DLT.

11 More Replies
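The looping pattern asked about above can be sketched as below, assuming the DPM (direct publishing mode) preview mentioned in the reply, where @dlt.table accepts a fully qualified catalog.schema.table name. The catalog name, folder names, and storage path are hypothetical; `dlt` and `spark` are passed in as parameters only so the sketch stays self-contained, whereas in a real pipeline they are the ambient DLT module and SparkSession.

```python
# One source folder per target schema; each registered table lands in
# main.<folder>.raw_events.
FOLDERS = ["sales", "marketing", "finance"]

def register_tables(dlt, spark, base_path="abfss://lake@acct.dfs.core.windows.net"):
    for folder in FOLDERS:
        # The default argument pins the current folder for each generated table.
        @dlt.table(name=f"main.{folder}.raw_events")
        def raw_events(folder=folder):
            return spark.readStream.format("cloudFiles").load(f"{base_path}/{folder}")
```

Note the `folder=folder` default argument: without it, every generated function would close over the final loop value instead of its own folder.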
Phani1
by Valued Contributor II
  • 4 Views
  • 1 replies
  • 0 kudos

Multiple metastores in Unity Catalog

Hi all, can we have more than one metastore per region in Unity Catalog? Having a single metastore per region helps keep metadata management organized, but customers are asking for multiple metastores for different needs. Is it possible to have se...

Latest Reply
TakuyaOmi
Valued Contributor II
  • 0 kudos

@Phani1 When attempting to create multiple metastores in a specific region, you may encounter the following error: "This region already contains a metastore. Only a single metastore per region is allowed." Databricks recommends having only one metastore...

JamesD
by New Contributor III
  • 2 Views
  • 0 replies
  • 0 kudos

Generative AI Solution Development - error running lab

The lab "2.1 - Preparing Data for RAG" has an issue. Can you please ensure the remaining parts of the notebook also work?  

MohanaBasak
by Databricks Employee
  • 363 Views
  • 2 replies
  • 3 kudos

Unlocking Cost and Performance Insights in Databricks with Custom Dashboards

As organizations continue to scale their data infrastructure, efficient resource utilization, cost control, and operational transparency are paramount for success. With the growing adoption of Databricks, monitoring and optimizing compute usage and d...

Latest Reply
MohanaBasak
Databricks Employee
  • 3 kudos

A really good optimization guide: https://www.databricks.com/discover/pages/optimize-data-workloads-guide. Apart from this, if you see from the dashboard that for certain jobs both CPU and memory are underutilized, you can just use lower compute and...

1 More Replies
sachin_kanchan
by Visitor
  • 37 Views
  • 3 replies
  • 0 kudos

Unable to log in into Community Edition

So I just registered for the Databricks Community Edition and received an email for verification. When I click the link, I'm redirected to this website (image attached) where I am asked to input my email. And when I do that, it sends me a verification c...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Let's open a case with databricks-community@databricks.com.

2 More Replies
BriGuy
by New Contributor II
  • 30 Views
  • 2 replies
  • 0 kudos

Create a one-off job run using the Databricks SDK

I'm trying to build the job spec using objects. When I try to execute the job I get the following error. I'm somewhat new to Python and not sure what I'm doing wrong here. Is anyone able to help? Traceback (most recent call last): File "y:\My ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @BriGuy, Can you try importing this module first? from databricks.sdk.service.jobs import PermissionLevel

1 More Replies
melikaabedi
by Visitor
  • 20 Views
  • 1 replies
  • 0 kudos

Databricks Apps

Imagine I develop an app in Databricks with #databricks-apps. Is it possible for someone outside the organization to use it just by accessing a URL, without having a Databricks account? Thank you in advance for your help.

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @melikaabedi, no, only users in the account can access a Databricks app, the same way you would with AI/BI dashboards.

Ajay-Pandey
by Esteemed Contributor III
  • 676 Views
  • 1 replies
  • 1 kudos

📊 Simplifying CDC with Databricks Delta Live Tables & Snapshots 📊

In the world of data integration, synchronizing external relational databases (like Oracle, MySQL) with the Databricks platform can be complex, especially when Change Data Feed (CDF) streams aren’t available. Using snapshots is a powerful way to mana...

Latest Reply
BilalHaniff1
New Contributor
  • 1 kudos

Hi Ajay, can apply changes into snapshot handle re-processing of an older snapshot? Use case: the source has delivered data on day T, T1 and T2. Consumers realise there is an error in the day T data and make a correction in the source. The source redel...


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group