Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

araiho
by New Contributor II
  • 3054 Views
  • 2 replies
  • 1 kudos

R Package Installation Best Practices

Hello, we are new to Databricks and are wondering what the best practices are for R package installation. We currently have cluster spin-up wait times of more than 20 minutes with our init scripts. We have tried the following: 1. Libraries tab in the c...

Latest Reply
araiho
New Contributor II

@Retired_mod Thank you for your detailed response! I think we would like to use Docker if we can, because we are not using RStudio but R directly in the Databricks notebooks and workflows. So, any more information about R, Docker, and Databricks woul...

1 More Replies
ankris
by New Contributor III
  • 6463 Views
  • 6 replies
  • 2 kudos

Data quality check in data engineering

Can we use the deequ library with Azure Databricks? If yes, please provide some support material or examples. Is there any similar data quality library, or a suggestion for achieving automatic data quality checks during data engineering (Azure Databricks)? Thanks i...

Latest Reply
joarobles
New Contributor III

Hi there! You could also take a look at Rudol; it enables no-code data quality validations so that non-technical roles such as Business Analysts or Data Stewards can configure quality checks themselves.

5 More Replies
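
For reference on the deequ question in the thread above: deequ is a JVM library, but it can be used from PySpark on Azure Databricks through the pydeequ wrapper, provided the com.amazon.deequ Maven JAR (matching the cluster's Spark version) and the pydeequ PyPI package are installed and the SPARK_VERSION environment variable is set. A minimal sketch, with a hypothetical DataFrame `df` and column names:

# Minimal PyDeequ sketch; assumes `spark` and a DataFrame `df` with columns
# `id` and `amount` already exist (both are placeholders).
from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationSuite, VerificationResult

check = (Check(spark, CheckLevel.Error, "basic data quality checks")
         .isComplete("id")          # no nulls in id
         .isUnique("id")            # id values are unique
         .isNonNegative("amount"))  # amount >= 0

result = (VerificationSuite(spark)
          .onData(df)
          .addCheck(check)
          .run())

# Inspect per-constraint outcomes as a DataFrame.
VerificationResult.checkResultsAsDataFrame(spark, result).show(truncate=False)
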
ArghaB
by New Contributor II
  • 4132 Views
  • 8 replies
  • 9 kudos

Resolved! Facing StorageContext Error while trying to access DBFS

This issue has hindered my practice for the whole day. I scoured the web and couldn't find anybody who has faced this particular error. The error I am getting in the DBFS file browser is: StorageContext com.databricks.backend.storage.StorageContextType$DbfsR...

Latest Reply
mozartiano
New Contributor III

Yeah, I am unable to save any file with rdd.saveTextfile or to upload any file using the workspace.

7 More Replies
DatabricksIssue
by New Contributor II
  • 4223 Views
  • 5 replies
  • 5 kudos

Resolved! Unable to upload files from DBFS

When I click on upload, I see the error below: StorageContext com.databricks.backend.storage.StorageContextType$DbfsRoot$@2f3c3220 for workspace 1406865167171326 is not set in the CustomerStorageInfo.

Latest Reply
PB-Data
New Contributor III

Same issue at my end since yesterday. Does anyone know the reason, and what needs to be done from our side (if any) to fix this?

4 More Replies
Amit_Dass_Chmp
by New Contributor III
  • 880 Views
  • 0 replies
  • 0 kudos

DBCU plans are costlier than Job Compute Premium at $0.30 per DBU. Please justify.

Please help me understand the % of savings and how Databricks calculates DBCU. They say that if I take the DBCU 12500 plan, the price with the discount will be 12000, a 4% discount. That means if I consume 12500 DBU, I am paying $12000 for this and getting 4% sa...
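
A quick worked check of the discount arithmetic quoted above (interpreting the 12500 figure as the pay-as-you-go dollar value of the committed usage, which is an assumption about the quote):

# Worked example of the savings calculation described in the post above.
# Assumption: "DBCU 12500" means $12,500 of on-demand usage committed up front.
list_price = 12_500   # on-demand value of the committed usage, in USD
plan_price = 12_000   # discounted up-front price of the DBCU plan, in USD

savings_pct = (list_price - plan_price) / list_price * 100
print(f"Savings: {savings_pct:.1f}%")   # prints: Savings: 4.0%
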

AnandBagate
by New Contributor II
  • 1376 Views
  • 2 replies
  • 2 kudos

Resolved! I am facing issue with DBFS File server

Hi, I am facing an issue with the DBFS file server. Can anyone guide me on how to resolve it? What steps should I take to resolve the storage issue?

[Attached screenshot: AnandBagate_0-1721834812256.png]
Latest Reply
mozartiano
New Contributor III

Looks good now. Thanks.

1 More Replies
Carlton
by Contributor II
  • 1728 Views
  • 3 replies
  • 3 kudos

Unable to Install Python Wheel Library

Hello team, can someone let me know if there have been changes to Databricks Community Edition such that it's no longer possible to install Python wheel libraries? I was able to install Python wheel libraries as recently as a few days ago, but now...

[Attached screenshot: Carlton_0-1721823396254.png]
Latest Reply
aalanis
New Contributor II

Hi there, check your runtime version to confirm you are using one that supports that. I think the recommended pattern nowadays (don't quote me) is to store your wheel files in the workspace file tree.

2 More Replies
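
As a concrete illustration of the workspace-tree suggestion in the reply above: on recent Databricks runtimes, a wheel stored in workspace files can usually be installed into the notebook session with the %pip magic. The path, filename, and split into two cells below are illustrative assumptions, not the poster's setup:

# Cell 1: install the wheel from workspace files (path and filename are
# hypothetical; point this at wherever the .whl was uploaded).
%pip install /Workspace/Users/someone@example.com/wheels/my_package-0.1.0-py3-none-any.whl

# Cell 2: restart the Python process so the freshly installed package is importable.
dbutils.library.restartPython()
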
zll_0091
by New Contributor III
  • 2814 Views
  • 3 replies
  • 4 kudos

Resolved! How does Auto Loader work when triggered via Azure Data Factory?

Hi, I am currently creating an Auto Loader stream in Databricks and will be using ADF as an orchestrator. I am quite confused about how this will handle my data, so please clarify if I have misunderstood it. First, I will run my ADF pipeline, which includes an activity to c...

[Attached screenshot: zll_0091_0-1721789223632.png]
Latest Reply
szymon_dybczak
Esteemed Contributor III

Auto Loader will process files incrementally. Let's say you have files in an existing directory called /input_files. The first time you run Auto Loader, it will read all files in that directory (unless you set the option includeExistingFiles to false, like y...

2 More Replies
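
To make the incremental behaviour described in the reply above concrete, here is a minimal Auto Loader sketch (directory paths, file format, and target table name are placeholders). Each triggered run picks up only files not yet recorded in the checkpoint, and cloudFiles.includeExistingFiles set to false skips files that already existed before the first run:

# Minimal Auto Loader sketch; paths, format, and table name are hypothetical.
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "csv")
      .option("cloudFiles.schemaLocation", "/mnt/checkpoints/input_files/schema")
      # Optional: ignore files that were already present before the first run.
      .option("cloudFiles.includeExistingFiles", "false")
      .load("/input_files"))

(df.writeStream
   .option("checkpointLocation", "/mnt/checkpoints/input_files")
   .trigger(availableNow=True)   # process whatever is new, then stop (fits an ADF-triggered job)
   .toTable("bronze_input_files"))
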
zll_0091
by New Contributor III
  • 3997 Views
  • 3 replies
  • 2 kudos

How can I deduplicate data from my stream?

Hi, I'm new to Databricks and I'm trying to use a stream for my incremental data. This data has duplicates, which can be resolved using a window function. Can you check where my code goes wrong? 1 ------- # Using Auto Loader to read new files: schema = df1.sche...

[Attached screenshot: zll_0091_1-1721743629228.png]
Latest Reply
szymon_dybczak
Esteemed Contributor III

Hi @zll_0091, change the output mode to update. Other than that, your code looks fine, but I would rename the variable microdf to windowSpec, because right now it's a little confusing.

2 More Replies
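
As a reference point for the thread above, a common way to deduplicate a stream without a window function is a watermark plus dropDuplicates. This is only a sketch with hypothetical column names (`event_time`, `id`) and an assumed source DataFrame `stream_df`, not the poster's original code:

# Deduplicate a streaming DataFrame on a business key, keeping dedup state
# only for the watermark window. Column and table names are placeholders.
deduped = (stream_df
           .withWatermark("event_time", "1 hour")
           .dropDuplicates(["id", "event_time"]))

(deduped.writeStream
        .outputMode("append")
        .option("checkpointLocation", "/mnt/checkpoints/dedup")
        .toTable("silver_deduped"))
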
ChristianRRL
by Valued Contributor III
  • 3691 Views
  • 3 replies
  • 0 kudos

Resolved! DLT Compute: "Ephemeral" Job Compute vs. All-purpose compute 2.0 ... WHY?

Hi there, this is a follow-up to a discussion I started last month: Solved: Re: DLT Compute: "Ephemeral" Job Compute vs. All-p... - Databricks Community - 71661. Based on what was discussed, I understand that it's not possible to use "All Purpose Clust...

Latest Reply
raphaelblg
Databricks Employee

@ChristianRRL regarding why DLT doesn't allow you to use all-purpose clusters: 1. The DLT runtime is derived from the shared compute DBR; it's not the same runtime and has different features than the common all-purpose runtime. A DLT pipeline is n...

2 More Replies
Mudasirfiyaz20
by New Contributor II
  • 5329 Views
  • 1 replies
  • 0 kudos

Easy GIF Animator 7.4.8 Crack + License Key 2024

Easy GIF Animator 7.4.8 Crack + License Key 2024Easy GIF Animator Crack is popular ever for animation creation and editing purpose there are sufficient modules that works in moderate session and also upgrade the further styles which available here to...

Latest Reply
Mudasirfiyaz20
New Contributor II

Download Easy GIF Animator 

ani2409
by New Contributor II
  • 2838 Views
  • 3 replies
  • 0 kudos

Error Creating Primary Key Constraint in DLT

Hello there! Greetings!! I am getting the following error when trying to create a DLT table in my gold layer: com.databricks.sql.managedcatalog.PrimaryKeyColumnsNullableException: Cannot create the primary key `x_key` because its child column(s) `x_key...

Latest Reply
ani2409
New Contributor II

Thank you @szymon_dybczak for the response. As you can see, I have already defined the not-null constraint in my definition for the primary key x_key: CONSTRAINT pk_key_not_null EXPECT (x_key IS NOT NULL). But I am still getting the same error. Also, I chec...

2 More Replies
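
Context for the thread above: the exception complains about column nullability, and an expectation such as EXPECT (x_key IS NOT NULL) only validates rows; it does not make the column itself non-nullable in the table schema, so the primary key column still has to be declared NOT NULL. A hedged Python sketch of one way to do that, assuming a Unity Catalog target and that the DLT schema argument accepts constraint clauses (table, view, and column names are illustrative):

import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="gold_example",   # hypothetical table name
    schema="""
        x_key STRING NOT NULL,
        some_measure DOUBLE,
        CONSTRAINT pk_gold_example PRIMARY KEY (x_key)
    """,
)
def gold_example():
    # Drop null keys so every row satisfies the NOT NULL column declaration.
    return dlt.read("silver_example").filter(F.col("x_key").isNotNull())
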
arijit_sani
by New Contributor II
  • 5388 Views
  • 2 replies
  • 1 kudos

Oracle Data Warehouse Replacement with Databricks Using Delta Lake

I am new to Spark and Databricks and am exploring them to understand how to replace an Oracle data warehouse with Databricks (Delta Lake) and use Spark to improve the ELT/ETL performance of the existing DW. Now, I have done some lookups in Databricks blogs, Spark do...

Get Started Discussions
Tags: datawarehouse, oracle-databricks, Migration, data
Latest Reply
arijit_sani
New Contributor II

Hi, could you please advise on this?

1 More Replies
Rajaniesh
by New Contributor III
  • 25896 Views
  • 6 replies
  • 0 kudos

Error "Root storage credential for metastore does not exist"While creating the Databricks Volume in

Hi, I tried to create a Databricks volume in Unity Catalog, but it threw this error: Root storage credential for metastore XXXXXX does not exist. Please contact your Databricks representative or consider updating the metastore with a valid storage cr...

Latest Reply
KrzysztofPrzyso
New Contributor III

I had a similar issue that was resolved by assigning a storage credential to the metastore using a REST API call. - Make sure that you are a metastore or account admin. - Make sure that the storage credential is correctly configured and its access connector ide...

5 More Replies
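
For the fix described in the reply above, the usual shape of that call is a PATCH to the Unity Catalog metastores endpoint setting the root storage credential. The sketch below is an assumption about the current API shape (endpoint path and field name should be verified against the Unity Catalog REST reference), with placeholder host, token, and IDs:

import requests

# Placeholders: fill in your workspace URL, a token with metastore-admin rights,
# the metastore ID, and the ID of an already-created storage credential.
host = "https://<workspace-host>"
token = "<personal-access-token>"
metastore_id = "<metastore-id>"
credential_id = "<storage-credential-id>"

# Assumed endpoint and field name; verify against the Unity Catalog API docs.
resp = requests.patch(
    f"{host}/api/2.1/unity-catalog/metastores/{metastore_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"storage_root_credential_id": credential_id},
)
resp.raise_for_status()
print(resp.json())
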
