Get Started Discussions

Forum Posts

Surajv
by New Contributor III
  • 46 Views
  • 1 reply
  • 0 kudos

Getting Python version errors when using PySpark RDDs with Databricks Connect

Hi community, when I use PySpark RDD-related functions in my environment via Databricks Connect, I get the below error. Databricks cluster version: 12.2. `RuntimeError: Python in worker has different version 3.9 than that in driver 3.10, PySpark cannot...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Surajv, The error message you’re encountering indicates a Python version mismatch between the Spark worker and the Spark driver. To resolve this issue, follow these steps: Install Correct Python Version on Worker Node: Ensure that the correct Py...

  • 0 kudos
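A minimal sketch of the kind of version check the reply above suggests, assuming the cluster runs Databricks Runtime 12.2 LTS (which ships Python 3.9) and that you manage the local interpreter yourself; the expected version and environment variables here are illustrative, not an official fix:

```python
import os
import sys

# Databricks Runtime 12.2 LTS ships Python 3.9, so the local interpreter used by
# databricks-connect (the "driver" side) should match the workers' version.
# EXPECTED_VERSION is an assumption based on the cluster runtime mentioned above.
EXPECTED_VERSION = (3, 9)

if sys.version_info[:2] != EXPECTED_VERSION:
    raise RuntimeError(
        f"Local Python is {sys.version_info.major}.{sys.version_info.minor}, "
        f"but the cluster expects {EXPECTED_VERSION[0]}.{EXPECTED_VERSION[1]}. "
        "Create a matching environment (e.g. with conda or pyenv) and reinstall "
        "databricks-connect there."
    )

# Pinning these to the current interpreter avoids a stray system Python being
# picked up when PySpark spawns worker processes.
os.environ.setdefault("PYSPARK_PYTHON", sys.executable)
os.environ.setdefault("PYSPARK_DRIVER_PYTHON", sys.executable)
```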
jvk
by Visitor
  • 111 Views
  • 2 replies
  • 0 kudos

Can't create cluster: "AWS Authorization Failure:" ... not authorized to perform: sts:AssumeRole

Full error here: AWS Authorization Failure: Failure happened when talking to AWS. AWS API error code: AccessDenied. AWS error message: User: arn:aws:iam::414351767826:user/ConsolidatedManagerIAMUser-ConsolidatedManagerUser-VX02FYW0SSCY is not authorized...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @jvk, The “AccessDenied” error message you’re encountering in AWS indicates that the user “arn:aws:iam::414351767826:user/ConsolidatedManagerIAMUser-ConsolidatedManagerUser-VX02FYW0SSCY” does not have the necessary permissions to perform the “sts:...

  • 0 kudos
1 More Replies
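One way to narrow down an AccessDenied on sts:AssumeRole is to try the call directly with the same credentials Databricks is using. A rough sketch with boto3; the role ARN below is a placeholder for the cross-account role in your Databricks credential configuration:

```python
import boto3
from botocore.exceptions import ClientError

# Placeholder ARN -- substitute the cross-account role from your Databricks
# credential configuration.
ROLE_ARN = "arn:aws:iam::123456789012:role/databricks-cross-account-role"

sts = boto3.client("sts")
print("Calling identity:", sts.get_caller_identity()["Arn"])

try:
    # Succeeds only if the caller's IAM policy allows sts:AssumeRole on this role
    # AND the role's trust policy lists the caller (or its account) as a principal.
    resp = sts.assume_role(RoleArn=ROLE_ARN, RoleSessionName="databricks-access-check")
    print("AssumeRole OK, temporary credentials expire at:",
          resp["Credentials"]["Expiration"])
except ClientError as err:
    print("AssumeRole failed:", err.response["Error"]["Code"])
```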
chloeh
by New Contributor
  • 172 Views
  • 1 reply
  • 0 kudos

Using SQL for Structured Streaming

Hi! I'm new to Databricks. I'm trying to create a data pipeline with structured streaming. A minimal example data pipeline would look like: read from upstream Kafka source, do some data transformation, then write to downstream Kafka sink. I want to do...

Latest Reply
chloeh
New Contributor
  • 0 kudos

OK, I figured out why I was getting an error on the usage of `read_kafka`: my default cluster was set up with the wrong Databricks runtime.

  • 0 kudos
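For reference, a minimal sketch of the SQL-first pipeline discussed in this thread, assuming a runtime where the read_kafka table-valued function is available (DBR 13.x or later) and hypothetical broker, topic, and checkpoint names:

```python
# Read from Kafka with SQL, then write the transformed stream back to Kafka.
# Broker addresses, topic names, and the checkpoint path are placeholders.
events = spark.sql("""
    SELECT CAST(value AS STRING) AS payload
    FROM STREAM read_kafka(
        bootstrapServers => 'broker-1:9092',
        subscribe        => 'events_raw'
    )
""")

(events
    .selectExpr("payload AS value")          # Kafka sink expects a 'value' column
    .writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("topic", "events_clean")
    .option("checkpointLocation", "/tmp/checkpoints/events_clean")
    .start())
```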
Surajv
by New Contributor III
  • 137 Views
  • 1 reply
  • 0 kudos

Getting client.session.cache.size warning in pyspark code using databricks connect

Hi Community, I have set up a Jupyter notebook on a server and installed Databricks Connect in its kernel to leverage my Databricks cluster compute in the notebook and write PySpark code. Whenever I run my code it gives me the below warning: ```WARN Spark...

Latest Reply
Riyakh
New Contributor II
  • 0 kudos

The warning indicates that the client cache (used to manage connections between your local environment and the Databricks cluster) has reached its maximum size (20 sessions). When this limit is reached, the oldest session is closed to make room for a...

  • 0 kudos
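If the warning comes from building many sessions in the same process, a common mitigation is to create one session and reuse it. A small sketch assuming Databricks Connect v2 (the Spark Connect based client for DBR 13+); behaviour on older databricks-connect releases may differ:

```python
# Reuse a single DatabricksSession rather than constructing a new one per cell
# or per function call, so the client-side session cache (default limit 20)
# does not keep evicting older sessions.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()

df = spark.range(100).filter("id % 2 = 0")
print(df.count())

# Close the session explicitly when the long-running process is finished with it.
spark.stop()
```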
RakeshRakesh_De
by New Contributor III
  • 1940 Views
  • 3 replies
  • 0 kudos

Resolved! If a user has only 'SELECT' permission on a table in Unity Catalog but no permission on the external location

Hi, suppose a user has the 'SELECT' permission on a table but does not have any permission on the table's external location. Will the user be able to read the data from the table? If yes, how will the user be able to read the wh...

Latest Reply
RakeshRakesh_De
New Contributor III
  • 0 kudos

Hi @Kaniz, thanks for the response. Why is the hyperlinked command not showing in full?

  • 0 kudos
2 More Replies
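For context, a sketch of the Unity Catalog grants this question is about; catalog, schema, table, and principal names are placeholders. Reading through the table generally needs only the table and parent-object grants, while privileges on the external location matter for direct path access:

```python
# Grants a user table-level read access without any external-location privileges.
# Names are illustrative.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `user@example.com`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `user@example.com`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `user@example.com`")

# Only relevant if the user must read files directly by cloud path (not via the table):
# spark.sql("GRANT READ FILES ON EXTERNAL LOCATION `sales_landing` TO `user@example.com`")
```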
RobinK
by New Contributor III
  • 239 Views
  • 5 replies
  • 0 kudos

How to switch workspaces via the menu

Hello, In various webinars and videos featuring Databricks instructors, I have noticed that it is possible to switch between different workspaces using the top menu within a workspace. However, in our organization, we have three separate workspaces wi...

Latest Reply
Rajani
New Contributor III
  • 0 kudos

Hi @RobinK, looking at the screenshots provided, I can see you have access to different workspaces, but the dropdown is still not visible for you. I also checked whether there is a setting for this, but I didn't find one. You can raise a ticket to Databricks and ...

  • 0 kudos
4 More Replies
dustint121
by New Contributor
  • 345 Views
  • 1 reply
  • 1 kudos

Resolved! Issue with creating cluster on Community Edition

I have recently signed up for Databricks Community Edition and have yet to successfully create a cluster. I get this message when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the pro...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 1 kudos

Hi @dustint121, it's a Databricks internal issue; wait for some time and it will resolve itself.

  • 1 kudos
halox6000
by New Contributor II
  • 588 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks community edition down?

I am getting this error when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the problem persists. Node daemon fast failed and did not answer ping for instance"

Latest Reply
dustint121
New Contributor
  • 1 kudos

I still have this issue, and have yet to successfully create a cluster instance. Please advise on how this error was fixed.

  • 1 kudos
2 More Replies
Chinu
by New Contributor III
  • 722 Views
  • 3 replies
  • 0 kudos

System Tables - Billing schema

Hi Experts! We enabled UC and also the system table (Billing) to start monitoring usage and cost. We were able to create a dashboard where we can see the usage and cost for each workspace. The usage table in the billing schema has workspace_id but I'd...

Latest Reply
Kaizen
Contributor III
  • 0 kudos

@Kaniz I'm also not seeing the compute names logged in the system billing tables. Are these located elsewhere?

  • 0 kudos
2 More Replies
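A sketch of the kind of query this thread is after, joining usage rows to cluster metadata to recover compute names; it assumes the system.billing.usage and system.compute.clusters system tables are enabled in your metastore and that their schemas match current documentation:

```python
# system.compute.clusters may keep a row per configuration change, so you might
# want to de-duplicate to the latest row per cluster_id before joining.
usage_by_cluster = spark.sql("""
    SELECT
        u.workspace_id,
        c.cluster_name,
        SUM(u.usage_quantity) AS dbus
    FROM system.billing.usage AS u
    LEFT JOIN system.compute.clusters AS c
        ON u.usage_metadata.cluster_id = c.cluster_id
       AND u.workspace_id = c.workspace_id
    WHERE u.usage_date >= date_sub(current_date(), 30)
    GROUP BY u.workspace_id, c.cluster_name
    ORDER BY dbus DESC
""")
usage_by_cluster.show(truncate=False)
```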
phguk
by New Contributor II
  • 1238 Views
  • 2 replies
  • 0 kudos

Adding NFS storage as external volume (Unity)

Can anyone share experience (or point me to another reference) that describes how to configure Azure Blob storage which has NFS enabled as an external volume to Databricks? I've succeeded in adding SMB storage to Databricks but (if I understand prope...

Latest Reply
phguk
New Contributor II
  • 0 kudos

Apologies for the delay & many thanks for responding. Yes I've been able to mount my premium storage + NFS container as an external volume to Databricks.

  • 0 kudos
1 More Replies
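For anyone following along, a sketch of registering such a container as an external volume once a storage credential and external location already cover the path; the catalog, schema, volume, and storage account names are placeholders:

```python
# Assumes an existing Unity Catalog external location already grants access to this path.
spark.sql("""
    CREATE EXTERNAL VOLUME IF NOT EXISTS main.raw.nfs_landing
    LOCATION 'abfss://landing@mystorageaccount.dfs.core.windows.net/nfs'
""")

# Files in the volume are then addressable via the /Volumes path
# (dbutils is available in Databricks notebooks).
for f in dbutils.fs.ls("/Volumes/main/raw/nfs_landing"):
    print(f.path)
```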
kp12
by New Contributor II
  • 2898 Views
  • 4 replies
  • 1 kudos

column "id" is of type uuid but expression is of type character varying.

Hello, I'm trying to write to an Azure PostgreSQL Flexible Server database from Azure Databricks, using the PostgreSQL connector in Databricks Runtime 12.2 LTS. I'm using df.write.format("postgresql").save() to write to the PostgreSQL database, but getting the follow...

Latest Reply
Student-Learn
New Contributor
  • 1 kudos

Yes, this Stack Overflow post was my reference too, and adding the below option made the load go through with no error on the UUID data type in the Postgres column: .option("stringtype", "unspecified") https://stackoverflow.com/questions/409739...

  • 1 kudos
3 More Replies
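A sketch of the fix reported above, expressed with the generic JDBC writer, where stringtype=unspecified lets PostgreSQL cast string values into a uuid column; host, database, table, and credentials are placeholders (the thread applied the equivalent .option("stringtype", "unspecified") to the built-in postgresql format):

```python
# df is assumed to contain an "id" column holding UUID values as strings.
jdbc_url = (
    "jdbc:postgresql://myserver.postgres.database.azure.com:5432/mydb"
    "?stringtype=unspecified"  # let the server infer uuid from varchar literals
)

(df.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "public.my_table")
    .option("user", "dbuser")
    .option("password", "********")
    .mode("append")
    .save())
```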
faithlawrence98
by New Contributor
  • 204 Views
  • 1 reply
  • 0 kudos

Why am I getting QB Desktop Error 6000 recurrently?

Whenever I try to open my company file over a network or multi-user mode, I keep getting QB Desktop Error 6000 and something after that. The error messages on my screen vary every time I attempt to access the data file. I cannot understand the error,...

Latest Reply
judithphillips5
New Contributor II
  • 0 kudos

Hi @faithlawrence98, don't worry; we're here to help you. To open your company file over the network, you need to run the QB Database Server Manager on your server computer. If this tool isn't running, you will most probably face QB Desktop...

  • 0 kudos
Raja_fawadAhmed
by New Contributor
  • 195 Views
  • 1 reply
  • 0 kudos

Databricks job compute price w.r.t. running time

I have two workflows (jobs) in Databricks (AWS) with the below cluster specs (job cluster, NOT general purpose): Driver: i3.xlarge · Workers: i3.xlarge · 2-8 workers. Job 1 takes 10 min to complete; Job 2 takes 50 min to complete. Questions: DBU cost is sam...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Raja_fawadAhmed, DBU Cost for Both Jobs: Databricks pricing is based on DBUs (Databricks Units) consumed. The cost depends on the type of compute instances used and the specific workload. For your two jobs, the DBU cost would be calculated...

  • 0 kudos
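A back-of-the-envelope sketch of how runtime feeds into the bill for jobs compute; the per-node DBU rate and the $/DBU price below are placeholders, so substitute the published figures for i3.xlarge and your plan:

```python
DBU_PER_NODE_HOUR = 1.0   # placeholder rate for i3.xlarge jobs compute
USD_PER_DBU = 0.15        # placeholder list price for jobs compute

def job_cost_usd(runtime_minutes: float, node_count: int) -> float:
    """DBUs accrue per node-hour, so cost scales linearly with runtime."""
    return node_count * (runtime_minutes / 60) * DBU_PER_NODE_HOUR * USD_PER_DBU

# Same cluster shape (1 driver + 2 workers = 3 nodes), different runtimes:
print(job_cost_usd(10, 3))   # Job 1
print(job_cost_usd(50, 3))   # Job 2 -> roughly 5x Job 1 on identical hardware
```

On AWS, the EC2 instance charges are billed separately from the DBUs, so total cost also scales with how long the instances stay up.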
Prashanthkumar
by New Contributor II
  • 2183 Views
  • 5 replies
  • 0 kudos

Is it possible to view Databricks cluster metrics using the REST API?

I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system space using the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...

Latest Reply
Nandhini_Kumar
New Contributor II
  • 0 kudos

Is there any alternative way to get all performance metrics programmatically?

  • 0 kudos
4 More Replies
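For the programmatic angle, a sketch that queries the Clusters API with a personal access token; note that this endpoint returns configuration and state (cores, memory, autoscale range), not the live CPU or memory utilization asked about in the thread. The host, token, and cluster ID are placeholders:

```python
import requests

HOST = "https://my-workspace.cloud.databricks.com"   # placeholder workspace URL
TOKEN = "dapiXXXXXXXXXXXXXXXX"                       # placeholder access token
CLUSTER_ID = "0101-123456-abcdefgh"                  # placeholder cluster id

resp = requests.get(
    f"{HOST}/api/2.0/clusters/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"cluster_id": CLUSTER_ID},
    timeout=30,
)
resp.raise_for_status()
info = resp.json()

# Static sizing and state only -- utilization graphs live in the cluster Metrics UI.
print(info["state"], info.get("cluster_cores"), info.get("cluster_memory_mb"))
```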