Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

PKD28
by New Contributor II
  • 327 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks Cluster job failure issue

Jobs within the all-purpose DB cluster are failing with "The Spark driver has stopped unexpectedly and is restarting. Your notebook will be automatically reattached". In the event log it says "Event_type=DRIVER_NOT_RESPONDING & MESSAGE= "Driver is up b...

Latest Reply
PKD28
New Contributor II
  • 0 kudos

Just now there is one cluster issue. Cluster error: "Driver is unresponsive likely due to GC". Cluster conf: worker: Standard_D8ads_v5, driver: Standard_E16d_v4. What do you suggest here?
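Driver GC pressure like this often comes from pulling large results onto the driver rather than from the workload itself. A minimal sketch of the usual mitigation, assuming `df` is a large DataFrame in a Databricks notebook (the grouping column and table name are hypothetical placeholders):

```python
# Sketch for a Databricks notebook, where `spark` and `df` already exist.
# Keep aggregation on the executors and write results out directly, so the
# driver JVM never has to materialize the data.

def summarize_to_table(df, out_table):
    """Aggregate on the cluster and persist, avoiding driver-side collect()."""
    agg = df.groupBy("event_date").count()  # hypothetical grouping column
    agg.write.mode("overwrite").saveAsTable(out_table)
    # Anti-patterns to avoid on big data: df.collect() / df.toPandas(),
    # which materialize everything in driver memory and trigger heavy GC.
```

If the driver still thrashes after removing driver-side collection, a larger driver type or fewer concurrent notebooks on the shared all-purpose cluster are the next things to try.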

2 More Replies
Zahid-CSA
by New Contributor
  • 345 Views
  • 1 reply
  • 0 kudos

PowerBi to Databricks SQL warehouse Inactivity error on refreshing

Hello Team, we are trying to refresh a dataset which has nearly 1 billion rows, and we have partitioned it to run periodically and in a parallel-distributed mechanism, but the refresh is failing after hours stating inactivity timeout errors are ...

Latest Reply
szymon_dybczak
Contributor
  • 0 kudos

Hi @Zahid-CSA, you can take a look at the thread below; it's a similar kind of problem. @SethParker suggested one possible solution, maybe worth a try: Re: Power BI Import Model Refresh from Databricks ... - Databricks Community - 51661

phanisaisrir
by New Contributor
  • 172 Views
  • 1 reply
  • 0 kudos

Accessing table in Unity Catalog

What is the preferred way of accessing a UC-enabled SQL warehouse table from a Databricks Spark cluster? My requirement is to fetch the data from a SQL warehouse table using complex queries, transform it using a PySpark notebook, and save the results. But t...

Latest Reply
filipniziol
New Contributor II
  • 0 kudos

Hi @phanisaisrir, use Spark SQL. This is the native and most integrated way to interact with data within Databricks.
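A minimal sketch of that suggestion in a Databricks notebook (`spark` is predefined there): run the complex query with `spark.sql` against the Unity Catalog three-level name, transform with PySpark, and save. The catalog, schema, and table names below are hypothetical placeholders.

```python
# Hypothetical UC table names; replace with your catalog.schema.table.
QUERY = """
    SELECT o.order_id, o.amount, c.region
    FROM main.sales.orders o
    JOIN main.sales.customers c ON o.customer_id = c.id
    WHERE o.order_date >= '2024-01-01'
"""

def transform_and_save(spark, target_table="main.sales.orders_by_region"):
    """Fetch via SQL, transform with PySpark, persist back to Unity Catalog."""
    df = spark.sql(QUERY)                        # complex query against UC tables
    result = df.groupBy("region").sum("amount")  # PySpark transformation
    result.write.mode("overwrite").saveAsTable(target_table)
```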

MM4
by New Contributor II
  • 162 Views
  • 2 replies
  • 1 kudos

"Data" is not available in the left hand side menu

Hi, "Data" is not showing up in the left-hand side menu. I have attached the cluster to the notebook; please find the attached snapshot for reference. Any idea how this can be resolved?

Latest Reply
MM4
New Contributor II
  • 1 kudos

Thanks for the response @szymon_dybczak. Actually, I am on Community Edition; it seems the Data pane is unavailable in Community Edition.

1 More Replies
jq2024
by New Contributor II
  • 552 Views
  • 2 replies
  • 1 kudos

Resolved! I cannot curl a URL in a notebook

I tried to curl the following URL in a notebook: %sh curl https://staging-api.newrelic.com/graphql -v But I got the following error message: { [5 bytes data] * TLSv1.2 (OUT), TLS header, Supplemental data (23): } [5 bytes data] * TLSv1.2 (IN), TLS he...

Latest Reply
jq2024
New Contributor II
  • 1 kudos

Thank you for the suggestion. It works.

1 More Replies
brady_tyson
by New Contributor
  • 159 Views
  • 0 replies
  • 0 kudos

Databricks Connect Vscode. Cannot find package installed on cluster

I am using Databricks Connect v2 to connect to a UC-enabled cluster. I have a package I built and installed as a wheel file on the cluster. When using VS Code to import and use the package, I get a module-not-found error when running cell by ce...

Community Platform Discussions
databricks-connect
package
python
VSCode
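With Databricks Connect, the import statement itself executes on the local machine, so a wheel that exists only on the cluster typically raises ModuleNotFoundError locally. A hedged sketch of the two things that usually need to line up (the wheel path and version support for `addArtifact` are assumptions, not confirmed details from this thread):

```python
# 1) The import runs locally, so the wheel must also be pip-installed into
#    the local environment VS Code uses, not only on the cluster.
# 2) If the package is called inside UDFs, newer Databricks Connect sessions
#    can ship a locally built wheel to the remote session with addArtifact
#    (assumption: your Databricks Connect / DBR version supports it).

def attach_local_wheel(spark, wheel_path="dist/mypkg-0.1.0-py3-none-any.whl"):
    """Upload a locally built wheel to the remote Spark session."""
    spark.addArtifact(wheel_path, pyfile=True)  # path is a hypothetical example
```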
reggie
by New Contributor II
  • 300 Views
  • 3 replies
  • 0 kudos

Resolved! Issue enabling mosaic

Hi, I am trying to install Mosaic on my cluster, but I get the following error once I use 'enable_mosaic': ImportError: cannot import name '_to_java_column' from 'pyspark.sql.functions' (/databricks/spark/python/pyspark/sql/functions/__init__.py) File <command-14...

Latest Reply
reggie
New Contributor II
  • 0 kudos

Thank you, I thought Mosaic required DBR at least 13.x since I didn't get a version error, but this fixed my problem.

2 More Replies
Reply_Domenico
by New Contributor II
  • 420 Views
  • 2 replies
  • 1 kudos

Resolved! UCX code migration

Hello Databricks Community, I'm currently in the process of migrating our codebase to Unity Catalog using UCX and would appreciate some advice. Our environment has a mix of jobs and tables running on both Unity Catalog and hive_metastore. After running...

Latest Reply
Brahmareddy
Valued Contributor II
  • 1 kudos

Hi @Reply_Domenico, how are you doing today? To filter jobs, try adjusting the UCX assessment query or use a script to exclude jobs already on Unity Catalog. Unfortunately, UCX doesn't yet have specific commands for automating code migration to Unity C...

1 More Replies
EirikMa
by New Contributor II
  • 1711 Views
  • 3 replies
  • 0 kudos

UTF-8 troubles in DLT

Issues with UTF-8 in DLT: I am having issues with UTF-8 in DLT. I have tried to set the Spark config on the cluster running the DLT pipeline. I have fixed this with normal compute under advanced settings like this: spark.conf.set("spark.driver.extraJava...
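For DLT, cluster-level Spark confs are set in the pipeline settings rather than on an interactive cluster's advanced options. A hedged sketch of the pipeline settings JSON, assuming the same JVM encoding flags the poster used on normal compute (pipeline name is a placeholder):

```json
{
  "name": "my-dlt-pipeline",
  "configuration": {
    "spark.driver.extraJavaOptions": "-Dfile.encoding=UTF-8",
    "spark.executor.extraJavaOptions": "-Dfile.encoding=UTF-8"
  }
}
```

The same key-value pairs can be entered under the pipeline's Configuration section in the DLT UI instead of editing the JSON directly.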

Latest Reply
EirikMA1
New Contributor II
  • 0 kudos

Hi @Kaniz_Fatma, DLT has updated its runtime, but I get a different error now. This is my code:

2 More Replies
Meghana_Vasavad
by New Contributor III
  • 290 Views
  • 1 reply
  • 0 kudos

Assistance Required for Enabling Unity Catalog in Databricks Workspace

Hi, I hope this message finds you well. I am reaching out regarding a concern with Databricks administrator privileges. I have an Azure subscription and I use Azure Databricks for my tutorials, but I currently do not have Global Administrator access, w...

Latest Reply
szymon_dybczak
Contributor
  • 0 kudos

Hi @Meghana_Vasavad, during the initial setup of Unity Catalog you need to find a person with the Global Administrator role on the Entra ID tenant. It's a one-time action, because they can then grant the necessary permissions to manage the catalog to your account, or even bet...

thibault
by Contributor II
  • 4715 Views
  • 7 replies
  • 0 kudos

Asset Bundles git branch per target

Hi, I am migrating from dbx to Databricks Asset Bundles (DAB) a deployment setup where I have specific parameters per environment. This was working well with dbx, and I am now trying to define those parameters by defining targets (3 targets: dev, uat, p...

Latest Reply
thibault
Contributor II
  • 0 kudos

Something must have changed in the meantime on the Databricks side. I have only updated the Databricks CLI to 016, and now, using a git / branch under each target deploys this setup, where feature-dab is the branch I want the job to pull sources from, I see t...
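A hedged sketch of what "a git / branch under each target" can look like in `databricks.yml`, assuming per-target `git` and `variables` mappings (all names and values below are placeholders, not taken from the thread):

```yaml
bundle:
  name: my_bundle

targets:
  dev:
    git:
      branch: feature-dab   # branch the deployed job pulls sources from
    variables:
      catalog: dev_catalog
  uat:
    git:
      branch: main
    variables:
      catalog: uat_catalog
  prod:
    git:
      branch: main
    variables:
      catalog: prod_catalog
```

Running `databricks bundle deploy -t uat` then picks up that target's branch and variable overrides.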

6 More Replies
sanjay
by Valued Contributor II
  • 497 Views
  • 5 replies
  • 0 kudos

Resolved! maxFilesPerTrigger not working while loading data from Unity Catalogue table

Hi, I am using streaming on Unity Catalog tables and trying to limit the number of records read in each batch. Here is my code, but it's not respecting maxFilesPerTrigger; instead it reads all available data. (spark.readStream.option("skipChangeCommits",...

Latest Reply
Witold
Contributor III
  • 0 kudos

I believe you misunderstand the fundamentals of delta tables. `maxFilesPerTrigger` has nothing to do with how many rows you will process at the same time. And if you really want to control the number of records per file, then you need to adapt the wr...
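To illustrate the distinction: `maxFilesPerTrigger` caps the number of *files* per micro-batch, so the row count per batch depends entirely on file sizes, and the option must be set before `.table()` is called. A minimal sketch for a Databricks notebook (`spark` is predefined there; table names and paths are hypothetical placeholders):

```python
STREAM_OPTIONS = {
    "skipChangeCommits": "true",
    "maxFilesPerTrigger": "16",    # caps files per batch, not rows
    # "maxBytesPerTrigger": "1g",  # alternative: soft-cap batch size by bytes
}

def start_stream(spark):
    """Rate-limited stream from one UC Delta table into another."""
    return (
        spark.readStream
        .options(**STREAM_OPTIONS)        # options must precede .table()
        .table("main.bronze.events")
        .writeStream
        .option("checkpointLocation", "/tmp/checkpoints/events")
        .toTable("main.silver.events")
    )
```

To bound rows per file at write time instead, the writer of the source table has to change, e.g. via target file size settings on that table.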

4 More Replies
