Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Chinu
by New Contributor III
  • 4941 Views
  • 3 replies
  • 0 kudos

Tableau Desktop connection error from Mac M1

Hi, I'm getting the error below while connecting to a SQL Warehouse from Tableau Desktop. I installed the latest ODBC driver (2.7.5), but I can confirm that the driver name is different. In the error message I see libsparkodbc_sbu.dylib, but on my lap...

Latest Reply
jefflipkowitz
Databricks Employee
  • 0 kudos

Have you referred to this document? https://help.tableau.com/current/pro/desktop/en-us/examples_databricks.html

2 More Replies
unity_Catalog
by New Contributor III
  • 9276 Views
  • 2 replies
  • 2 kudos

Resolved! Migrating dashboards from one workspace to another workspace

I'm exporting dashboard objects from an existing workspace to a new workspace, but after importing, the underlying dashboard data is not coming into the new workspace. I'm using the code below. Can anyone help?

import os
import requests
import json
import logging...

Latest Reply
romy
Databricks Employee
  • 2 kudos

Hi, you can use the workspace API to import the dashboard: https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/workspace-lakeview-api#step-3-import-a-dashboard A code example is available on this thread: https://community.databric...
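For reference, a minimal sketch of what that import call can look like (the host, token, and paths are hypothetical; it assumes a dashboard exported as an .lvdash.json file, sent base64-encoded to the Workspace API import endpoint described in the linked tutorial):

import base64
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
TOKEN = "<personal-access-token>"

# Read the exported dashboard definition and base64-encode it.
with open("my_dashboard.lvdash.json", "rb") as f:
    content = base64.b64encode(f.read()).decode()

# Import into the target workspace; AUTO format plus the .lvdash.json
# extension lets the API create it as a dashboard object.
resp = requests.post(
    f"{HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": "/Users/someone@example.com/my_dashboard.lvdash.json",
        "format": "AUTO",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()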

1 More Replies
argl1995dbks
by New Contributor III
  • 3335 Views
  • 5 replies
  • 1 kudos

To trigger a Databricks workflow on a defined frequency

Hi Databricks, I am trying to run a Databricks workflow on a scheduled basis (e.g., every five minutes). Here is the databricks.yaml file:

bundle:
  name: dab_demo
# include:
#   - resources/*.yml
variables:
  job_cluster_key:
    desc...
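For what it's worth, a five-minute frequency is usually expressed as a Quartz cron expression on the job's schedule. A minimal sketch using the Databricks Python SDK instead of the bundle YAML (the job ID is made up; assumes databricks-sdk is installed and authentication is configured):

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import CronSchedule, PauseStatus

w = WorkspaceClient()
job_id = 123456789  # hypothetical job ID

# Fetch the current settings, attach an every-five-minutes Quartz cron
# schedule, and write the settings back.
settings = w.jobs.get(job_id=job_id).settings
settings.schedule = CronSchedule(
    quartz_cron_expression="0 0/5 * * * ?",  # every 5 minutes
    timezone_id="UTC",
    pause_status=PauseStatus.UNPAUSED,
)
w.jobs.reset(job_id=job_id, new_settings=settings)

The same quartz_cron_expression field can also be set under the job's schedule block in the bundle YAML.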

Latest Reply
argl1995dbks
New Contributor III
  • 1 kudos

Hi, let me explain the current scenario: we have Databricks workflows with DS, DE, and MLOps tasks. The workflows are meant to be triggered on a specific frequency, i.e., monthly and quarterly, and the quarterly workflow depends on the monthly workf...

4 More Replies
mhasel
by New Contributor II
  • 2563 Views
  • 1 reply
  • 1 kudos

Cannot use Databricks VSCode extensions on repository with ".devcontainer" folder

Hi, I have a repository that contains a ".devcontainer" folder. I use VSCode and am trying to use the Databricks extension following this guide. When I run "Upload and Run File on Databricks" I get this error and then cannot run the Python script: Sync Error...

Latest Reply
mhasel
New Contributor II
  • 1 kudos

Hi @Retired_mod, the folder sits directly in the repository. When I log in to the Databricks web UI I can see that the folder (together with its contents) was copied correctly, but I still get the error message in VSCode.

elaya6
by New Contributor II
  • 968 Views
  • 1 reply
  • 0 kudos

Exam got suspended

Hi Team, this morning I was taking the exam when the proctor suddenly asked me to show the room and walls. I showed him, and then he suspended my exam; no one was there. I don't understand why he suspended the exam. Kindly reschedule the exam, I need the certificati...

Latest Reply
elaya6
New Contributor II
  • 0 kudos

Hi Team, could you please at least reschedule the exam? I will take it from the Prometric center. Thanks

araiho
by New Contributor II
  • 3335 Views
  • 2 replies
  • 1 kudos

R Package Installation Best Practices

Hello, we are new to Databricks and are wondering what the best practices are for R package installation. We currently have cluster spin-up wait times of more than 20 minutes with our init scripts. We have tried the following: 1. Libraries tab in the c...

Latest Reply
araiho
New Contributor II
  • 1 kudos

@Retired_mod Thank you for your detailed response! I think we would like to use Docker if we can, because we are not using RStudio but R directly in Databricks notebooks and workflows. So, any more information about R, Docker, and Databricks woul...

1 More Replies
ankris
by New Contributor III
  • 6773 Views
  • 6 replies
  • 2 kudos

Data quality check in data engineering

Can we use the deequ library with Azure Databricks? If yes, please provide some support material or examples. Is there any similar data quality library, or a suggestion for achieving automatic data quality checks during data engineering (Azure Databricks)? Thanks i...
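As one possibility, deequ's Python wrapper (PyDeequ) does run on Databricks clusters. A minimal sketch of a completeness/non-negativity check, assuming the pydeequ package and a deequ JAR matching the cluster's Spark version are installed (the sample table is just an illustration):

import os
os.environ["SPARK_VERSION"] = "3.3"  # pydeequ uses this to select the matching deequ build

from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationSuite, VerificationResult

df = spark.table("samples.nyctaxi.trips")  # any DataFrame you want to validate

check = Check(spark, CheckLevel.Error, "basic quality checks")
result = (
    VerificationSuite(spark)
    .onData(df)
    .addCheck(
        check.isComplete("tpep_pickup_datetime")  # column must have no nulls
             .isNonNegative("trip_distance")      # distances must be >= 0
    )
    .run()
)

# Inspect which constraints passed or failed.
VerificationResult.checkResultsAsDataFrame(spark, result).show(truncate=False)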

Latest Reply
joarobles
New Contributor III
  • 2 kudos

Hi there! You could also take a look at Rudol; it provides no-code data quality validations, so non-technical roles such as business analysts or data stewards can configure quality checks themselves.

5 More Replies
ArghaB
by New Contributor II
  • 4411 Views
  • 8 replies
  • 9 kudos

Resolved! Facing StorageContext Error while trying to access DBFS

This issue has hindered my practice for the whole day. I scoured the web and couldn't find anybody who has faced this particular error. The error I am getting is: DBFS file browser: StorageContext com.databricks.backend.storage.StorageContextType$DbfsR...

Latest Reply
mozartiano
New Contributor III
  • 9 kudos

Yeah, unable to save any file with rdd.saveAsTextFile and to upload any file using the workspace.

7 More Replies
DatabricksIssue
by New Contributor II
  • 4431 Views
  • 5 replies
  • 5 kudos

Resolved! Unable to upload files from DBFS

When I click on upload I see the below error: StorageContext com.databricks.backend.storage.StorageContextType$DbfsRoot$@2f3c3220 for workspace 1406865167171326 is not set in the CustomerStorageInfo.

Latest Reply
PB-Data
New Contributor III
  • 5 kudos

Same issue on my end since yesterday. Does anyone know the reason, and what needs to be done from our side (if anything) to fix this?

4 More Replies
Amit_Dass_Chmp
by New Contributor III
  • 959 Views
  • 0 replies
  • 0 kudos

DBCU plans are costlier vs. Job Compute Premium at $0.30 per DBU; please justify

Please help me understand the % of savings and how Databricks calculates the DBCU discount. They say that if I take the 12,500 DBCU plan, the discounted price will be 12,000, a 4% discount. That means if I consume 12,500 DBUs, I am paying $12,000 and 4% sa...
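A quick check of the quoted numbers (assuming a $1-per-DBU list price, so 12,500 DBUs list at $12,500):

# Hypothetical sanity check of the quoted 4% figure.
list_price = 12_500      # dollars: 12,500 DBUs at an assumed $1/DBU
commit_price = 12_000    # dollars paid for the pre-purchase plan
discount = (list_price - commit_price) / list_price
print(f"{discount:.0%}")  # prints 4%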

AnandBagate
by New Contributor II
  • 1653 Views
  • 2 replies
  • 2 kudos

Resolved! I am facing an issue with the DBFS file server

Hi, I am facing an issue with the DBFS file server. Can anyone guide me on how to resolve it? What steps should I take to resolve the storage issue?

(screenshot: AnandBagate_0-1721834812256.png)
Latest Reply
mozartiano
New Contributor III
  • 2 kudos

Looks good now. Thanks.

1 More Replies
Carlton
by Contributor II
  • 1900 Views
  • 3 replies
  • 3 kudos

Unable to Install Python Wheel Library

Hello Team, can someone let me know if there have been changes to Databricks Community Edition such that it's no longer possible to install Python wheel libraries? I was able to install Python wheel libraries as recently as a few days ago, but now...

(screenshot: Carlton_0-1721823396254.png)
Latest Reply
aalanis
New Contributor II
  • 3 kudos

Hi there, check your runtime version to confirm you are on a version that supports that. I think nowadays the recommended pattern (don't quote me) is to store your wheel files in the workspace tree.
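To illustrate that pattern, a hypothetical notebook cell installing a wheel from the workspace file tree (the path and filename are made up; workspace-file paths like this require a reasonably recent DBR):

# Install a wheel stored under /Workspace from a notebook cell.
%pip install /Workspace/Users/someone@example.com/wheels/my_package-0.1.0-py3-none-any.whl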

2 More Replies
zll_0091
by New Contributor III
  • 3031 Views
  • 3 replies
  • 4 kudos

Resolved! How does Auto Loader work when triggered via Azure Data Factory?

Hi, I am currently creating an Auto Loader job in Databricks and will be using ADF as the orchestrator. I am quite confused about how this will handle my data, so please clarify if I have misunderstood it. First, I will run my ADF pipeline, which includes an activity to c...

(screenshot: zll_0091_0-1721789223632.png)
Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 4 kudos

Auto Loader will process files incrementally. Let's say you have files in an existing directory called /input_files. The first time you run Auto Loader, it will read all files in that directory (unless you set the option includeExistingFiles to false, like y...
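A minimal sketch of that option in a stream definition (paths and table name are made up; assumes a Databricks notebook where spark is in scope):

# Auto Loader stream that skips files already in the directory when the
# stream first starts, picking up only newly arriving files.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.includeExistingFiles", "false")
    .option("cloudFiles.schemaLocation", "/tmp/schema_loc")
    .load("/input_files")
)

(
    df.writeStream
    .option("checkpointLocation", "/tmp/checkpoint_loc")
    .trigger(availableNow=True)  # drain the backlog, then stop; fits ADF-triggered runs
    .toTable("bronze.input_files")
)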

2 More Replies
zll_0091
by New Contributor III
  • 4326 Views
  • 3 replies
  • 2 kudos

How can I deduplicate data from my stream?

Hi, I'm new to Databricks and I'm trying to use a stream for my incremental data. This data has duplicates, which can be solved using a window function. Can you check where my code goes wrong?

1-------
# Using Auto Loader to read new files
schema = df1.sche...

(screenshot: zll_0091_1-1721743629228.png)
Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @zll_0091, change the output mode to update. Other than that, your code looks fine, but I would rename the variable microdf to windowSpec, because right now it's a little confusing.
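For comparison, a common alternative to a ranking window in Structured Streaming is watermark-plus-dropDuplicates; a minimal sketch (column names, paths, and table are made up):

# Deduplicate a stream by key while keeping state bounded with a watermark.
deduped = (
    df.withWatermark("event_time", "10 minutes")
      .dropDuplicates(["id", "event_time"])
)

(
    deduped.writeStream
    .outputMode("append")  # dropDuplicates works in append mode; aggregations need update
    .option("checkpointLocation", "/tmp/dedup_checkpoint")
    .toTable("silver.events")
)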

2 More Replies
ChristianRRL
by Valued Contributor III
  • 3968 Views
  • 3 replies
  • 0 kudos

Resolved! DLT Compute: "Ephemeral" Job Compute vs. All-purpose compute 2.0 ... WHY?

Hi there, this is a follow-up from a discussion I started last month: Solved: Re: DLT Compute: "Ephemeral" Job Compute vs. All-p... - Databricks Community - 71661. Based on what was discussed, I understand that it's not possible to use "All Purpose Clust...

Latest Reply
raphaelblg
Databricks Employee
  • 0 kudos

@ChristianRRL regarding why DLT doesn't allow you to use all-purpose clusters: 1. The DLT runtime is derived from the shared compute DBR; it's not the same runtime and has different features than the common all-purpose runtime. A DLT pipeline is n...

2 More Replies
