Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

_raman_
by New Contributor III
  • 965 Views
  • 0 replies
  • 2 kudos

Error while creating a cluster in Community Edition

For the past two hours I have been trying to create a cluster, but it keeps failing after 15-20 minutes. A red indicator appears next to the cluster name with the message: Bootstrap Timeout: Node daemon ping timeout in 780000 ms for instance i-05ff43bc...

Sid1805
by New Contributor II
  • 1681 Views
  • 2 replies
  • 0 kudos

Azure Databricks SQL Execution API Authentication

Hi Team, if a cloud application wants to read data from Databricks, we understand that Azure Databricks can expose its tables via a REST API. What is the most recommended authentication method for this? I see we can have a PAT token tagged to a Service Pr...

Latest Reply
Ramakrishnan83
New Contributor III
  • 0 kudos

Hi @Sid1805, I am in the same situation as you. I am looking for guidance on how to set up authentication for the REST API to connect with Databricks SQL. I started with a PAT and used "token" as the user ID and the PAT token as the password to connect with Databricks...

1 More Replies
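For context on the discussion above, a minimal sketch of calling the Databricks SQL Statement Execution API (POST /api/2.0/sql/statements) with a bearer token; the same request shape works whether the token is a PAT or an OAuth token issued to a service principal. The workspace URL, warehouse ID, and query below are placeholders rather than values from the thread.

import os
import requests

# Placeholders - substitute your own workspace URL and SQL warehouse ID.
WORKSPACE = "https://adb-1234567890123456.7.azuredatabricks.net"
WAREHOUSE_ID = "1234567890abcdef"
TOKEN = os.environ["DATABRICKS_TOKEN"]  # PAT or service-principal OAuth token

# Submit a SQL statement and wait up to 30 seconds for the result.
resp = requests.post(
    f"{WORKSPACE}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "warehouse_id": WAREHOUSE_ID,
        "statement": "SELECT 1 AS ok",
        "wait_timeout": "30s",
    },
    timeout=60,
)
resp.raise_for_status()
payload = resp.json()
print(payload["status"]["state"])                   # e.g. SUCCEEDED
print(payload.get("result", {}).get("data_array"))  # [['1']] for this query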
juanchochuy
by New Contributor
  • 3003 Views
  • 1 replies
  • 0 kudos

How to connect Power BI to a SQL endpoint with no public access (private only)?

 we are making configuration adjustments in the workspaces to disable public access. My concern is whether this will impact the Power BI dashboards that currently consume data from our workspace's SQL Data Warehouse. 

Community Platform Discussions
Endpoint SQL
PowerBI
Public Access
Latest Reply
gmiguel
Contributor
  • 0 kudos

If you're using Azure Databricks, you can create a VNet data gateway on Azure: https://learn.microsoft.com/en-us/data-integration/vnet/create-data-gateways and https://learn.microsoft.com/en-us/data-integration/vnet/high-availability-load-balancing. You need...

KVK
by New Contributor II
  • 688 Views
  • 0 replies
  • 0 kudos

Unable to reset my Community edition password

I created a new Databricks community account a few days ago. Unfortunately, I seem to have forgotten the password associated with it. Despite trying the "forgot password" option multiple times on the login page, I haven't received any emails to reset...

Ruby8376
by Valued Contributor
  • 939 Views
  • 1 replies
  • 0 kudos

Function/system id for tableau databricks connector

Currently, we are using a PAT token for authentication to generate Tableau reports from data in Databricks Delta tables using the Databricks Tableau connector. Would somebody know if a system/function ID can be used from Tableau to Databricks inste...

Latest Reply
Ruby8376
Valued Contributor
  • 0 kudos

@-werners- @Retired_mod can you help here, please?

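The thread above does not include an answer, but one alternative to a PAT that Databricks supports is OAuth machine-to-machine authentication with a service principal: the client exchanges its client ID and secret for a short-lived access token at the workspace's /oidc/v1/token endpoint and uses that token as the bearer credential. A rough sketch of that token request, with all identifiers as placeholders:

import requests

# Placeholders - substitute your workspace URL and service-principal credentials.
WORKSPACE = "https://adb-1234567890123456.7.azuredatabricks.net"
CLIENT_ID = "<service-principal-client-id>"
CLIENT_SECRET = "<service-principal-oauth-secret>"

# OAuth client-credentials flow against the workspace token endpoint.
resp = requests.post(
    f"{WORKSPACE}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
    timeout=30,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]
# The short-lived token can then be used as a bearer token in place of a PAT.
print(access_token[:16] + "...")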
Ruby8376
by Valued Contributor
  • 1984 Views
  • 3 replies
  • 1 kudos

Resolved! Can data be unified based on client profile (unified profile) in Databricks?

Hi All, my question is about how data in Salesforce Data Cloud gets unified based on client profiles. Can a similar action be done on data in Databricks? I believe Unity Catalog just provides a unified layer for security and governance. Is there a ...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

You want to identify actual persons based on one or more profiles (based on e-mail address, etc.). That is not available out of the box in Databricks. The 'unified' in Databricks means you have a single platform for several data top...

2 More Replies
DavidKxx
by Contributor
  • 1628 Views
  • 0 replies
  • 0 kudos

pandas .style is ugly in Databricks

Why does something like df.style.hide_index() turn out so ugly in Databricks? That command should render the DataFrame nicely as usual, just with the index column hidden. Instead, here's an image of what happens (displayi...

[Attached screenshot: Screenshot 2024-03-11 111410.png]
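One workaround (not taken from the thread above) is to render the pandas Styler to HTML yourself and pass it to the notebook's displayHTML, instead of relying on the default cell output; note that hide_index() has been superseded by hide(axis="index") in recent pandas releases. A minimal sketch, assuming a Databricks notebook where displayHTML is available:

import pandas as pd

df = pd.DataFrame({"country": ["DE", "FR"], "revenue": [1200.5, 980.0]})

# Build a Styler with the index hidden; hide(axis="index") replaces the
# deprecated hide_index() in recent pandas versions.
styler = df.style.hide(axis="index").format({"revenue": "{:.2f}"})

# Render the Styler's own HTML explicitly rather than relying on the
# notebook's default rendering of the Styler object.
displayHTML(styler.to_html())  # displayHTML is provided by Databricks notebooks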
Shravanshibu
by New Contributor III
  • 1449 Views
  • 0 replies
  • 1 kudos

Not able to move a file to a volume

I am trying to move a file from a repo local directory to Volumes, but I am getting a 'directory not found' error. Can someone guide me? I tried using dbfs (dbfs/:Volumes/folder/, /dbfs/Volumes/folder/) and without dbfs (/Volumes/folder/). None worked. @Reti...

[Attached screenshot: Shravanshibu_0-1710141545813.png]
Community Platform Discussions
unitycatalog
Volumes
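Not an answer from this thread, but a sketch of the path convention that typically resolves 'directory not found' here: Unity Catalog volumes are addressed as plain /Volumes/<catalog>/<schema>/<volume>/ paths (no dbfs:/ or /dbfs prefix), and repo files can be copied with ordinary Python file APIs. All names below are hypothetical.

import shutil
from pathlib import Path

# Hypothetical paths - adjust the repo, catalog, schema and volume names.
src = Path("/Workspace/Repos/some.user@example.com/my_repo/data/sample.csv")
dst = Path("/Volumes/my_catalog/my_schema/my_volume/sample.csv")

# Volume paths are plain /Volumes/... paths; prefixing them with dbfs:/ or
# /dbfs is a common cause of "directory not found" errors.
if not src.exists():
    raise FileNotFoundError(f"Source file not found: {src}")

shutil.copy(src, dst)
print(f"Copied {src} -> {dst}")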
vkuznetsov
by New Contributor III
  • 3692 Views
  • 5 replies
  • 5 kudos

Problem sharing a streaming table created in Delta Live Table via Delta Sharing

Hi all, I hope you can help me figure out what I am missing. I'm trying to do a simple thing: read the data from the data ingestion zone (CSV files saved to an Azure Storage Account) using the Delta Live Tables pipeline and share the resulting tab...

[Attached screenshots: vkuznetsov_0-1689259588838.png, 2023_07_13_16_48_52_Data_Explorer.png]
Latest Reply
jdog
New Contributor II
  • 5 kudos

I'm curious if Databricks plans to address this. We use Delta Live streaming tables extensively and also planned on using Delta Sharing to get our data from our production Unity Catalog (different region). Duplicating the data as a workaround is no...

4 More Replies
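For context on the setup described in this thread (reading CSV files from an Azure ingestion zone with a Delta Live Tables pipeline), a minimal DLT streaming-table sketch using Auto Loader; the storage path and table name are placeholders, and the Delta Sharing limitation discussed above is independent of this code.

import dlt
from pyspark.sql import functions as F

# Hypothetical landing path for the raw CSV files in the Azure storage account.
RAW_PATH = "abfss://landing@mystorageaccount.dfs.core.windows.net/ingestion/"

@dlt.table(name="raw_events", comment="Streaming ingest of CSV files via Auto Loader")
def raw_events():
    # `spark` is provided by the DLT pipeline runtime.
    return (
        spark.readStream.format("cloudFiles")           # Auto Loader
        .option("cloudFiles.format", "csv")
        .option("header", "true")
        .load(RAW_PATH)
        .withColumn("ingested_at", F.current_timestamp())
    )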
Madalian
by New Contributor III
  • 2401 Views
  • 1 replies
  • 0 kudos

How to create Delta Live Tables in the Silver layer

Hi DB Experts, I have some basic questions. I am working with the Medallion Architecture (Bronze, Silver, Gold) layers. In the Bronze layer I am getting Delta files (Parquet format) with log folders, one folder per table; multiple files are ge...

Latest Reply
Madalian
New Contributor III
  • 0 kudos

Dear Kaniz, thank you for addressing the question. I am getting the following error if I follow the above: pyspark.errors.exceptions.captured.IllegalArgumentException: Reading from a Delta table is not supported with this syntax. If you would like to consume data...

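The error quoted in the reply above typically appears when a Delta table location is read with a file-based loader (for example Auto Loader with cloudFiles). Since the Bronze layer already holds Delta tables, a Silver table would normally stream from them directly; a rough sketch, with the table names as placeholders:

import dlt
from pyspark.sql import functions as F

@dlt.table(name="silver_orders", comment="Cleaned orders derived from the Bronze Delta table")
def silver_orders():
    # Stream from the Bronze table as a Delta source instead of pointing a
    # file-based reader (cloudFiles/CSV) at its directory, which raises
    # "Reading from a Delta table is not supported with this syntax".
    # `spark` is provided by the DLT pipeline runtime.
    return (
        spark.readStream.table("bronze.orders")         # hypothetical Bronze table
        .where(F.col("order_id").isNotNull())
        .dropDuplicates(["order_id"])
    )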
Sujitha
by Databricks Employee
  • 7698 Views
  • 7 replies
  • 24 kudos

Big news: Our Community is now 100,000 members strong with over 50,000 posts🚀

Thanks to every one of you, the Databricks Community has reached an incredible milestone: 100,000 members and over 50,000 posts! Your dedication, expertise and passion have made this possible. Whether you're a seasoned data professional, a coding en...

Latest Reply
AshR
Contributor
  • 24 kudos

Wonderful!

6 More Replies
Paul1
by New Contributor
  • 4248 Views
  • 1 replies
  • 0 kudos

Error Spark reading CSV from DBFS MNT: incompatible format detected

I am trying to follow along with a training course, but I am consistently running into an error loading a CSV with Spark from DBFS. Specifically, I keep getting an "Incompatible format detected" error. Has anyone else encountered this and found a soluti...

Latest Reply
MichTalebzadeh
Valued Contributor
  • 0 kudos

Well your error message is telling you that Spark is encountering a Delta table conflict while trying to read a CSV file. The file path dbfs:/mnt/dbacademy... points to a CSV file. This is where the fun begins. Spark detects a Delta transaction log d...

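To illustrate the explanation in the reply above (paths are hypothetical, since the one in the post is truncated): if the target directory contains a _delta_log folder, Spark's Delta checks reject a plain CSV read of that location, so either read it as Delta or point the CSV reader at a path that is not inside a Delta table directory.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical training-course paths; the real path in the post is truncated.
base = "dbfs:/mnt/dbacademy/some-dataset"

# Option 1: the location really is a Delta table (it has a _delta_log folder),
# so read it as Delta rather than CSV.
delta_df = spark.read.format("delta").load(base)

# Option 2: the CSV lives outside any Delta table directory - point the CSV
# reader at that file or folder explicitly.
csv_df = (
    spark.read.format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load("dbfs:/mnt/dbacademy/raw/some-file.csv")      # hypothetical raw CSV path
)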
Pravin08
by New Contributor III
  • 6579 Views
  • 13 replies
  • 0 kudos

Resolved! Want to split JSON data into multiple rows

Hi, this is my sample JSON data, which is generated from an API response and is all coming in a single row. I want to split it into multiple rows and store it in a DataFrame. [{"transaction_id":"F6001EC5-528196D1","corrects_transaction_id":null,"transac...

Latest Reply
Pravin08
New Contributor III
  • 0 kudos

Yes indeed, it was a datatype issue. After changing it to LongType in the schema definition, it is working now. Thanks once again for all your input and time. Much appreciated!

12 More Replies
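A sketch of the approach this thread converges on: parse the JSON array with an explicit schema, using LongType for the large numeric field that caused the datatype issue in the accepted answer, then explode it into one row per transaction. Field names beyond the two visible in the excerpt are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import ArrayType, LongType, StringType, StructField, StructType

spark = SparkSession.builder.getOrCreate()

# The API response arrives as one string containing a JSON array.
raw = spark.createDataFrame(
    [('[{"transaction_id":"F6001EC5-528196D1","corrects_transaction_id":null,"amount":1234567890123}]',)],
    ["payload"],
)

# Explicit schema; LongType avoids the datatype issue mentioned in the accepted answer.
schema = ArrayType(
    StructType([
        StructField("transaction_id", StringType()),
        StructField("corrects_transaction_id", StringType()),
        StructField("amount", LongType()),          # hypothetical numeric field
    ])
)

rows = (
    raw.select(F.explode(F.from_json("payload", schema)).alias("txn"))
       .select("txn.*")                             # one row per transaction
)
rows.show(truncate=False)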

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group