Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

RakeshRakesh_De
by New Contributor III
  • 4618 Views
  • 1 reply
  • 0 kudos

If a user has only SELECT permission on a Unity Catalog table but no permission on the external location

Hi, suppose a user has SELECT permission on a table but does not have any permission on the table's external location. Will the user be able to read data from the table? If yes, how will the user be able to read the wh...

Latest Reply
RakeshRakesh_De
New Contributor III
  • 0 kudos

Hi @Retired_mod, thanks for the response. Why is the hyperlinked command not showing in full?

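On the original question above: in Unity Catalog, a user with SELECT on a table reads it through the catalog, which mediates access to the underlying cloud files, so no separate grant on the external location is needed for table reads; external location permissions matter only for direct path-based access. A minimal sketch of the grants involved, using hypothetical catalog, schema, table, and user names, runnable in a notebook by the table owner:

    # Hypothetical names; SELECT (plus USE on catalog and schema)
    # is enough for table reads, with no external-location grant.
    spark.sql("GRANT USE CATALOG ON CATALOG main TO `analyst@example.com`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA main.demo TO `analyst@example.com`")
    spark.sql("GRANT SELECT ON TABLE main.demo.sales TO `analyst@example.com`")
    # analyst@example.com can now run spark.table("main.demo.sales"),
    # while direct path-based reads of the location's files stay denied.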
RobinK
by Contributor
  • 3357 Views
  • 5 replies
  • 0 kudos

How to switch workspaces via menu

Hello, in various webinars and videos featuring Databricks instructors, I have noticed that it is possible to switch between different workspaces using the top menu within a workspace. However, in our organization, we have three separate workspaces wi...

Latest Reply
Rajani
Contributor II
  • 0 kudos

Hi @RobinK, looking at the screenshots provided, I can see that you have access to different workspaces, but the dropdown is still not visible for you. I also checked whether there is a setting for this, but I didn't find one. You can raise a ticket with Databricks and ...

4 More Replies
dustint121
by New Contributor II
  • 3745 Views
  • 1 reply
  • 1 kudos

Resolved! Issue with creating cluster on Community Edition

I have recently signed up for Databricks Community Edition and have yet to successfully create a cluster. I get this message when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the pro...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 1 kudos

Hi @dustint121, it's an internal Databricks issue; wait for some time and it will resolve.

halox6000
by New Contributor III
  • 4055 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks Community Edition down?

I am getting this error when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the problem persists. Node daemon fast failed and did not answer ping for instance"

Latest Reply
dustint121
New Contributor II
  • 1 kudos

I still have this issue and have yet to successfully create a cluster instance. Please advise on how this error was fixed.

2 More Replies
anonymous_567
by New Contributor II
  • 2965 Views
  • 3 replies
  • 0 kudos

Autoloader update table when new changes are made

Hello, every day a new file with the same name gets sent to my storage account, with old and new data appended at the end. Columns may also be added during one of these file updates. This file does a complete overwrite of the previous file. Is it possibl...

Latest Reply
data-grassroots
New Contributor III
  • 0 kudos

This may be helpful - the bit on allowing overwrites: https://docs.databricks.com/en/ingestion/auto-loader/faq.html (a sketch of the option follows below).

2 More Replies
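Following up on the pointer in the reply above, a hedged sketch of an Auto Loader stream for a file that is overwritten in place and may gain columns over time; the paths, file format, and table name are hypothetical, and the behavior should be confirmed against the linked FAQ:

    # Hypothetical paths; cloudFiles.allowOverwrites re-ingests a file
    # when it is overwritten under the same name.
    df = (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "csv")
          .option("cloudFiles.allowOverwrites", "true")
          .option("cloudFiles.schemaLocation", "/tmp/schema")  # tracks added columns
          .load("/mnt/landing/daily/"))

    (df.writeStream
       .option("checkpointLocation", "/tmp/checkpoint")
       .option("mergeSchema", "true")  # let the target Delta table accept new columns
       .trigger(availableNow=True)
       .toTable("target_table"))

Note that with allowOverwrites the whole file is reprocessed on each overwrite, so downstream logic must deduplicate rows that were already ingested.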
Chinu
by New Contributor III
  • 3492 Views
  • 1 reply
  • 0 kudos

System Tables - Billing schema

Hi experts! We enabled UC and also the billing system tables to start monitoring usage and cost. We were able to create a dashboard where we can see the usage and cost for each workspace. The usage table in the billing schema has workspace_id, but I'd...

Latest Reply
Kaizen
Valued Contributor
  • 0 kudos

@Retired_mod I'm also not seeing the compute names logged in the system billing tables. Are they located elsewhere? (A join-based approach is sketched below.)

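On the compute-names question, one hedged approach: billing usage rows carry a cluster ID under usage_metadata, which can be joined to the compute system table that holds cluster names. A sketch, assuming the system.billing and system.compute schemas are enabled in the metastore:

    # Assumes system tables are enabled; check column names against current docs.
    usage_by_cluster = spark.sql("""
        SELECT c.cluster_name,
               u.workspace_id,
               SUM(u.usage_quantity) AS dbus
        FROM system.billing.usage u
        JOIN system.compute.clusters c
          ON u.usage_metadata.cluster_id = c.cluster_id
        GROUP BY c.cluster_name, u.workspace_id
    """)
    display(usage_by_cluster)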
Miguel_Grafana
by New Contributor
  • 1081 Views
  • 0 replies
  • 0 kudos

Azure OAuth Passthrough with the Go Driver

Can anyone point me towards some resources for achieving this? I already have the token. Trying with: dbsql.WithAccessToken(settings.Token). But I'm getting the following error: Unable to load OAuth Config: request error after 1 attempt(s): unexpected HT...

Hogan
by New Contributor II
  • 2471 Views
  • 1 reply
  • 0 kudos

Can browse external storage, but cannot create a table from there - VNet, ADLS Gen2

Hi there! I hope somebody here can help me. We have created a new Databricks account on Azure with the ARM template for VNet injection. We have all the subnets etc., Unity Catalog active, and the access connector for Databricks. I now want to create my first tab...

Latest Reply
Hogan
New Contributor II
  • 0 kudos

Hi, to solve this problem, the following Microsoft documentation can be used to configure the NCC (network connectivity configuration) to enable the connection between the private Azure storage and the serverless resources: https://learn.microsoft.com/en-us/azure/databricks/security/netwo...

sai_sathya
by New Contributor III
  • 5378 Views
  • 6 replies
  • 1 kudos

DataFrame-to-CSV write has issues due to multiple commas inside a row value

Hi all, I am working on converting data containing JSON fields with embedded commas into CSV format. I am facing challenges due to the commas within the JSON being misinterpreted as column delimiters during the conversion process. I tried several methods to modify...

[Screenshots attached: sai_sathya_0-1712850570456.png, sai_sathya_1-1712850991923.png]
Latest Reply
artsheiko
Databricks Employee
  • 1 kudos

Hi Sai, I assume that the problem comes not from PySpark but from Excel. I tried to reproduce the error and didn't find a way - that's a good thing, right? Please try the following (a fuller sketch follows below): df.write.format("csv").save("/Volumes/<my_catalog_name>/<m...

5 More Replies
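For readers hitting the same Excel-side misparse, a hedged sketch of writing a JSON-bearing column to CSV with explicit quoting and escaping; the quote/escape options are standard DataFrameWriter CSV options, while the sample data and volume path are hypothetical:

    # A JSON string column whose commas must not become column delimiters.
    df = spark.createDataFrame(
        [(1, '{"a": 1, "b": 2}')],
        ["id", "payload"],
    )

    (df.write
       .format("csv")
       .option("header", "true")
       .option("quote", '"')   # wrap fields containing the delimiter in quotes
       .option("escape", '"')  # escape embedded quotes the way Excel expects
       .save("/Volumes/my_catalog/my_schema/my_volume/out"))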
Nithya_r
by New Contributor II
  • 3565 Views
  • 1 reply
  • 0 kudos

Access Delta Sharing from Azure Data Factory

I recently got access to Delta Sharing and I am looking to access the data from the tables in the share through ADF. I used linked services such as REST API and HTTP and successfully established a connection using the credential file token and HTTP path, h...

Latest Reply
artsheiko
Databricks Employee
  • 0 kudos

Hey, I think you'll need to use a Databricks activity instead of Copy. See: https://learn.microsoft.com/en-us/azure/data-factory/connector-overview#integrate-with-more-data-stores and https://learn.microsoft.com/en-us/azure/data-factory/transform-data-dat... (a client-library sketch also follows below).

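If the Databricks-activity route is taken, the notebook it runs can read the share with the delta-sharing Python client rather than raw HTTP calls; a sketch, where the profile path and share/schema/table coordinates are hypothetical placeholders:

    # %pip install delta-sharing
    import delta_sharing

    # Hypothetical credential-file path and share coordinates.
    profile = "/dbfs/FileStore/config.share"
    table_url = profile + "#my_share.my_schema.my_table"

    # Load the shared table into a pandas DataFrame.
    df = delta_sharing.load_as_pandas(table_url)
    print(df.head())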
databird
by New Contributor II
  • 3116 Views
  • 4 replies
  • 1 kudos

Redefine ETL strategy with a PySpark approach

Hey everyone! I have some previous experience with data engineering, but I am totally new to Databricks and Delta tables. I am starting this thread hoping to ask some questions and ask for help on how to design a process. So I have essentially 2 Delta tables (sa...

Latest Reply
artsheiko
Databricks Employee
  • 1 kudos

Hi @databird, you can review the code of each demo by opening the content via "View the Notebooks" or by exploring the following repo: https://github.com/databricks-demos (you can try searching for "merge" to see all the occurrences, for example; a minimal merge sketch also follows below) T...

3 More Replies
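Since the reply points at the merge demos, a hedged sketch of the typical building block for this kind of process: a Delta MERGE from a staging table into a target. Both table names and the join key are hypothetical:

    from delta.tables import DeltaTable

    # Hypothetical names; both must already exist as Delta tables.
    target = DeltaTable.forName(spark, "target")
    updates = spark.table("source_updates")

    (target.alias("t")
     .merge(updates.alias("s"), "t.id = s.id")  # hypothetical key column
     .whenMatchedUpdateAll()     # refresh rows that already exist
     .whenNotMatchedInsertAll()  # add rows seen for the first time
     .execute())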
vinay076
by New Contributor III
  • 2159 Views
  • 2 replies
  • 0 kudos

There is no certification number in my Databricks certificate that I received after passing the

I enrolled for the Databricks data engineer certification recently, gave the exam a shot, and cleared it successfully. I received the certificate in the form of a PDF file along with a URL at which I can see my certificate and ba...

Latest Reply
Cert-Team
Databricks Employee
  • 0 kudos

Hi @vinay076, thanks for asking! Our support team can provide you with a credential ID. Please file a ticket with our support team, give them the email associated with your certification, and they can get you the credential ID.

1 More Reply
VabethRamirez
by New Contributor II
  • 8579 Views
  • 5 replies
  • 4 kudos

Resolved! How to obtain a list of workflows in Databricks?

I need to obtain a list of my Databricks workflows with their job IDs in a Databricks notebook.

Latest Reply
artsheiko
Databricks Employee
  • 4 kudos

Hi @VabethRamirez, also, instead of calling the API directly, you can use the Databricks Python SDK (a complete sketch follows below): %pip install databricks-sdk --upgrade, then dbutils.library.restartPython(), then from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); job_list = w.jobs...

4 More Replies
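A runnable sketch of the SDK approach from the reply above; the truncated snippet is completed here on the assumption that it ends in w.jobs.list(), the standard listing call in the databricks-sdk package:

    # In a notebook, first:
    # %pip install databricks-sdk --upgrade
    # dbutils.library.restartPython()

    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # picks up notebook authentication automatically

    # Print every workflow's job ID and name.
    for job in w.jobs.list():
        print(job.job_id, job.settings.name)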
RahulChaubey
by New Contributor III
  • 1746 Views
  • 1 reply
  • 0 kudos

Can the query history API /api/2.0/sql/history/queries return data older than 30 days?

I am using this API, but it is returning data for only the last 30 days. Can this API return data older than 30 days?

Latest Reply
artsheiko
Databricks Employee
  • 0 kudos

Hi @RahulChaubey, the query history system table was announced during the Q1 roadmap webinar (see the recording, 32:25). There is a chance that it will provide data with a horizon beyond 30 days. (A polling workaround is also sketched below.) Meanwhile, you can enable system tables - I hope some ...

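Until a system table covers query history, one hedged workaround is to poll the REST endpoint on a schedule and archive each page somewhere durable, so history accumulates past the 30-day window. A sketch with the requests library, where the host and token are placeholders and the available filter parameters should be checked against the REST reference:

    import requests

    HOST = "https://<workspace-host>"   # hypothetical
    TOKEN = "<personal-access-token>"   # hypothetical

    resp = requests.get(
        f"{HOST}/api/2.0/sql/history/queries",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"max_results": 100},
    )
    resp.raise_for_status()

    # Persist these rows (e.g., append to a Delta table) from a daily job
    # so queries remain queryable after they age out of the API's window.
    print(len(resp.json().get("res", [])))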
QPeiran
by New Contributor III
  • 2877 Views
  • 2 replies
  • 0 kudos

Can a Delta table be the source of streaming/Auto Loader?

Hi, since Auto Loader only accepts "append-only" data as the source, I am wondering if a Delta table can also be the source. Will VACUUM (deleting stale files) or _delta_log (creating nested files in a different format than Parquet) break A...

Latest Reply
artsheiko
Databricks Employee
  • 0 kudos

Hi @QPeiran, Auto Loader is a feature that allows you to integrate files into the data platform. Once your data is stored in a Delta table, you can rely on spark.readStream.table("<my_table_name>") to continuously read from the table (see the sketch below). Take a look at ...

1 More Reply
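A short sketch of the reply's suggestion, streaming from a Delta table instead of raw files; the table names and checkpoint path are hypothetical. skipChangeCommits is the documented reader option for tolerating non-append commits such as updates and deletes, and VACUUM on its own should not break a stream that is keeping up, since the reader follows the transaction log rather than listing files:

    # Stream from an existing Delta table rather than from files.
    stream = (spark.readStream
              .option("skipChangeCommits", "true")  # ignore update/delete commits
              .table("my_source_table"))

    (stream.writeStream
           .option("checkpointLocation", "/tmp/delta_stream_checkpoint")
           .trigger(availableNow=True)
           .toTable("my_sink_table"))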
