Get Started Discussions

Forum Posts

chloeh
by New Contributor
  • 158 Views
  • 1 reply
  • 0 kudos

Using SQL for Structured Streaming

Hi! I'm new to Databricks. I'm trying to create a data pipeline with structured streaming. A minimal example data pipeline would look like: read from an upstream Kafka source, do some data transformation, then write to a downstream Kafka sink. I want to do...

Latest Reply
chloeh
New Contributor
  • 0 kudos

OK, I figured out why I was getting an error on the usage of `read_kafka`: my default cluster was set up with the wrong Databricks Runtime.
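
For context, a hedged sketch of the kind of pipeline described in the question, assuming a Databricks notebook on a runtime recent enough to provide the SQL `read_kafka` table-valued function (which is what the cluster fix above addresses); the broker address, topic names, and checkpoint path are placeholders:

```python
# Minimal sketch (not the poster's actual pipeline): read from Kafka via SQL,
# transform, write back to Kafka. Placeholder broker/topic/checkpoint values.
from pyspark.sql import functions as F

# Read the upstream topic as a streaming DataFrame using the SQL read_kafka TVF.
events = spark.sql("""
    SELECT timestamp, CAST(value AS STRING) AS value
    FROM STREAM read_kafka(
        bootstrapServers => 'broker1:9092',
        subscribe => 'upstream_topic'
    )
""")

# Example transformation: uppercase the payload.
transformed = events.withColumn("value", F.upper("value"))

# Write the result to the downstream topic.
(transformed
    .select("value")
    .writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("topic", "downstream_topic")
    .option("checkpointLocation", "/tmp/checkpoints/kafka_sql_demo")
    .start())
```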

Surajv
by New Contributor III
  • 109 Views
  • 1 reply
  • 0 kudos

Getting client.session.cache.size warning in PySpark code using Databricks Connect

Hi Community, I have set up a Jupyter notebook on a server and installed Databricks Connect in its kernel to leverage my Databricks cluster compute in the notebook and write PySpark code. Whenever I run my code it gives me the warning below: ```WARN Spark...

Latest Reply
Riyakh
New Contributor II
  • 0 kudos

The warning indicates that the client cache (used to manage connections between your local environment and the Databricks cluster) has reached its maximum size (20 sessions). When this limit is reached, the oldest session is closed to make room for a...
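
A hedged sketch of the usual mitigation, assuming Databricks Connect v2 (the databricks-connect package) with credentials supplied via environment variables or a default CLI profile; the point is to build one session and reuse it, rather than constructing a new session per cell or helper call, which is what fills the client-side cache and triggers the warning:

```python
# Sketch only: reuse one cached session instead of creating many.
from databricks.connect import DatabricksSession

# getOrCreate() returns the already-cached session when one exists for this config.
spark = DatabricksSession.builder.getOrCreate()

df = spark.range(10)   # executes on the remote Databricks cluster
print(df.count())
```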

RakeshRakesh_De
by New Contributor III
  • 1734 Views
  • 3 replies
  • 0 kudos

Resolved! If a user has only 'SELECT' permission on a table in Unity Catalog but no permission on the external location

Hi, suppose a user has 'SELECT' permission on a table but does not have any permission on the table's external location under 'External Locations'. Will the user be able to read the data from the table? If yes, how will the user be able to read the wh...
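
For context, a hedged sketch of the grants being asked about; the catalog, schema, table, and user names are hypothetical, and no privilege on the external location itself is granted here (that would only matter for direct path-based access to the storage):

```python
# Hypothetical object and principal names; run by a user with GRANT rights.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `analyst@example.com`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `analyst@example.com`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `analyst@example.com`")
```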

Latest Reply
RakeshRakesh_De
New Contributor III
  • 0 kudos

Hi @Kaniz, thanks for the response. Why is the hyperlink command not showing in full?

2 More Replies
RobinK
by New Contributor III
  • 220 Views
  • 5 replies
  • 0 kudos

How to switch workspaces via the menu

Hello, in various webinars and videos featuring Databricks instructors, I have noticed that it is possible to switch between different workspaces using the top menu within a workspace. However, in our organization, we have three separate workspaces wi...

Latest Reply
Rach
New Contributor III
  • 0 kudos

Hi @RobinK, looking at the screenshots provided I can see you have access to different workspaces, but the dropdown is still not visible for you. I also checked whether there is a setting for this, but I didn't find one. You can raise a ticket to Databricks and ...

4 More Replies
dustint121
by New Contributor
  • 217 Views
  • 1 reply
  • 0 kudos

Issue with creating cluster on Community Edition

I have recently signed up for Databricks Community Edition and have yet to successfully create a cluster. I get this message when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the pro...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 0 kudos

Hi @dustint121, it's a Databricks internal issue; wait for some time and it will resolve.

halox6000
by New Contributor II
  • 569 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks community edition down?

I am getting this error when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the problem persists. Node daemon fast failed and did not answer ping for instance"

Latest Reply
dustint121
New Contributor
  • 1 kudos

I still have this issue, and have yet to successfully create a cluster instance. Please advise on how this error was fixed.

2 More Replies
Chinu
by New Contributor III
  • 717 Views
  • 3 replies
  • 0 kudos

System Tables - Billing schema

Hi Experts! We enabled UC and also the system table (Billing) to start monitoring usage and cost. We were able to create a dashboard where we can see the usage and cost for each workspace. The usage table in the billing schema has workspace_id but I'd...

Latest Reply
Kaizen
Contributor III
  • 0 kudos

@Kaniz I'm also not seeing the compute names logged in the system billing tables. Is this information located elsewhere?
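
A hedged sketch of one common pattern, not a confirmed answer to the thread: billing usage rows carry IDs rather than display names, so cluster names are typically recovered by joining to the compute system tables. This assumes the compute system schema is enabled alongside billing; table and column names should be checked against the current system-tables documentation:

```python
# Join usage (per cluster ID) to cluster metadata to get names; columns assumed
# from the system tables docs and may vary by release.
usage_by_cluster = spark.sql("""
    WITH latest_clusters AS (
        SELECT cluster_id, cluster_name
        FROM system.compute.clusters
        QUALIFY ROW_NUMBER() OVER (PARTITION BY cluster_id ORDER BY change_time DESC) = 1
    )
    SELECT
        u.workspace_id,
        u.usage_metadata.cluster_id AS cluster_id,
        c.cluster_name,
        SUM(u.usage_quantity)       AS dbus
    FROM system.billing.usage AS u
    LEFT JOIN latest_clusters AS c
        ON u.usage_metadata.cluster_id = c.cluster_id
    GROUP BY u.workspace_id, u.usage_metadata.cluster_id, c.cluster_name
""")
display(usage_by_cluster)
```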

2 More Replies
phguk
by New Contributor II
  • 1232 Views
  • 2 replies
  • 0 kudos

Adding NFS storage as external volume (Unity)

Can anyone share experience (or point me to another reference) that describes how to configure Azure Blob storage with NFS enabled as an external volume in Databricks? I've succeeded in adding SMB storage to Databricks but (if I understand prope...

Latest Reply
phguk
New Contributor II
  • 0 kudos

Apologies for the delay & many thanks for responding. Yes, I've been able to mount my premium storage + NFS container as an external volume in Databricks.
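
A hedged sketch of the Unity Catalog side of this setup; the storage credential, external location, container, and volume names are placeholders, and the NFS/SMB specifics of the storage account are outside this snippet:

```python
# Placeholders throughout; the storage credential must already exist and the
# workspace must be able to reach the storage account.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS nfs_blob_loc
    URL 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/data'
    WITH (STORAGE CREDENTIAL my_storage_credential)
""")
spark.sql("""
    CREATE EXTERNAL VOLUME IF NOT EXISTS main.default.nfs_volume
    LOCATION 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/data'
""")
# Files then show up under /Volumes/main/default/nfs_volume/
```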

1 More Replies
kp12
by New Contributor II
  • 2862 Views
  • 4 replies
  • 1 kudos

column "id" is of type uuid but expression is of type character varying.

Hello, I'm trying to write to an Azure PostgreSQL Flexible Server database from Azure Databricks, using the PostgreSQL connector in Databricks Runtime 12.2 LTS. I'm using df.write.format("postgresql").save() to write to the PostgreSQL database, but getting the follow...

Latest Reply
Student-Learn
New Contributor
  • 1 kudos

Yes, this Stack Overflow post was my reference too, and adding the option below made the load go through with no error on the UUID data type in the Postgres column: .option("stringtype", "unspecified")
https://stackoverflow.com/questions/409739...
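
A hedged sketch reconstructing the fix above with the connector used in the question; host, database, table, and credential values are placeholders, and the stringtype=unspecified option is what lets PostgreSQL cast the incoming string into the uuid column:

```python
# Placeholder connection details; 'id' is the uuid column on the Postgres side.
df = spark.createDataFrame(
    [("123e4567-e89b-12d3-a456-426614174000", "example row")],
    "id string, name string",
)

(df.write
   .format("postgresql")
   .option("host", "myserver.postgres.database.azure.com")
   .option("port", "5432")
   .option("database", "mydb")
   .option("dbtable", "public.my_table")
   .option("user", "my_user")
   .option("password", "my_password")
   .option("stringtype", "unspecified")   # the fix reported in the reply above
   .mode("append")
   .save())
```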

3 More Replies
faithlawrence98
by New Contributor
  • 105 Views
  • 1 reply
  • 0 kudos

Why I am getting QB Desktop Error 6000 recurringly?

Whenever I try to open my company file over a network or multi-user mode, I keep getting QB Desktop Error 6000 and something after that. The error messages on my screen vary every time I attempt to access the data file. I cannot understand the error,...

Latest Reply
judithphillips5
New Contributor
  • 0 kudos

Hi @faithlawrence98, don't worry; we're here to help you. To open your company file over the network, you need to run the QB Database Server Manager on your server computer. If this tool isn't running, you will most probably face QB Desktop...

Raja_fawadAhmed
by New Contributor
  • 192 Views
  • 1 reply
  • 0 kudos

Databricks job compute price w.r.t. running time

I have two workflows (jobs) in Databricks (AWS) with the cluster specs below (job cluster, NOT general purpose): Driver: i3.xlarge · Workers: i3.xlarge · 2-8 workers. Job 1 takes 10 min to complete, Job 2 takes 50 min to complete. Questions: DBU cost is sam...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Raja_fawadAhmed, DBU cost for both jobs: Databricks pricing is based on DBUs (Databricks Units) consumed. The cost depends on the type of compute instances used and the specific workload. For your two jobs, the DBU cost would be calculated...
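
A hedged back-of-the-envelope illustration of the point above; the DBU rate and price per DBU are hypothetical placeholders, not Databricks list prices, and EC2 instance charges are billed separately by AWS:

```python
# Hypothetical numbers for illustration only.
dbu_per_hour = 9.0      # assumed combined DBU rate for driver + workers
price_per_dbu = 0.15    # assumed $/DBU for the Jobs Compute SKU

def job_cost(runtime_minutes: float) -> float:
    """Approximate DBU cost of one job run on this cluster."""
    return dbu_per_hour * (runtime_minutes / 60) * price_per_dbu

print(f"Job 1 (10 min): ${job_cost(10):.2f}")
print(f"Job 2 (50 min): ${job_cost(50):.2f}")   # ~5x Job 1 on the same cluster spec
```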

Prashanthkumar
by New Contributor II
  • 2170 Views
  • 5 replies
  • 0 kudos

Is it possible to view Databricks cluster metrics using REST API

I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system using the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...

(screenshot attached: Prashanthkumar_0-1705104529507.png)
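
A hedged sketch of the authenticated call pattern against a documented endpoint; the workspace URL, cluster ID, and the environment variable holding the token are placeholders. Note that the Clusters API returns configuration and state, not CPU or memory utilization, which that endpoint does not expose:

```python
# Sketch only: authenticated GET against the Clusters API with a bearer token.
import os
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder
token = os.environ["DATABRICKS_TOKEN"]                        # PAT assumed here

resp = requests.get(
    f"{host}/api/2.0/clusters/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"cluster_id": "0123-456789-abcdefgh"},             # placeholder
    timeout=30,
)
resp.raise_for_status()
print(resp.json().get("state"))   # configuration/state, not utilization metrics
```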
Latest Reply
Nandhini_Kumar
New Contributor II
  • 0 kudos

Is there any alternative way to get all performance metrics programmatically?

4 More Replies
Surajv
by New Contributor III
  • 230 Views
  • 1 reply
  • 0 kudos

Difference between delete token API and revoke token API Databricks

Hi Community, I am trying to understand the difference between the Delete token API (DELETE /api/2.0/token-management/tokens/{token_id}) and the Revoke token API (POST /api/2.0/token/delete), as when I create more than 600 tokens I am getting a QUOTA_EXCEEDED error....

Latest Reply
Surajv
New Contributor III
  • 0 kudos

Delete token API doc link: https://docs.databricks.com/api/workspace/tokenmanagement/delete
Revoke token API doc link: https://docs.databricks.com/api/workspace/tokens/revoketoken
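
A hedged sketch exercising the two endpoints linked above; the workspace URL and token IDs are placeholders and the caller's PAT is assumed to be in an environment variable. The token-management endpoint is the admin API for deleting any user's token by ID, while /api/2.0/token/delete revokes one of the caller's own tokens:

```python
# Sketch only: call both token endpoints with a bearer token.
import os
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Admin token-management API: delete a token by ID.
requests.delete(
    f"{host}/api/2.0/token-management/tokens/some-token-id",  # placeholder ID
    headers=headers,
    timeout=30,
).raise_for_status()

# Token API: revoke one of the caller's own tokens.
requests.post(
    f"{host}/api/2.0/token/delete",
    headers=headers,
    json={"token_id": "some-token-id"},                       # placeholder ID
    timeout=30,
).raise_for_status()
```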

NC
by New Contributor III
  • 166 Views
  • 1 reply
  • 0 kudos

Using libpostal in Databricks

Hi, I am trying to work on address parsing and would like to use libpostal in Databricks. I have used the official Python bindings: GitHub - openvenues/pypostal: Python bindings to libpostal for fast international address parsing/normalization. pip insta...

Latest Reply
NC
New Contributor III
  • 0 kudos

I managed to install pylibpostal via the cluster library, but I cannot seem to download the data needed to run it. Please help. Thank you.
