Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

valefar
by New Contributor
  • 362 Views
  • 1 reply
  • 0 kudos

Unexpected response from server during a HTTP connection: authorize: cannot authorize peer.

Hi all, when attempting to connect to Databricks with Spark ODBC using the regular host IP and port, everything is successful. However, we need to send the connection through an internal proxy service that re-maps the server's endpoint to a local port...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @valefar, Firstly, ensure your connection settings correctly map the server's endpoint to 'localhost' and the appropriate port number through the proxy service. Double-check your connection string or configuration to align with Databricks workspac...
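As a sketch, the proxied connection settings the reply describes might look like the following DSN entries. Parameter names follow the Databricks (Simba Spark) ODBC driver; all values are placeholders for illustration, and the HTTP path should stay whatever the direct connection used:

```
[Databricks-via-proxy]
Host=localhost                ; the proxy's local endpoint, not the workspace host
Port=8443                     ; whichever local port the proxy maps to the server
HTTPPath=<same HTTP path as the direct connection>
SSL=1
AuthMech=3                    ; token auth: UID is the literal string "token"
UID=token
PWD=<personal access token>
```

The key point is that only Host and Port change when routing through the proxy; authentication and the HTTP path are unaffected.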

Shrinivas
by New Contributor
  • 363 Views
  • 1 reply
  • 0 kudos

Databricks/Terraform - Error while creating workspace

Hi - I have below code to create the credentials, storage and workspace through terraform script but only credentials and storage is created but failed to create the workspace with error.  Can someone please guide/suggest what's wrong with the code/l...

Latest Reply
szymon_dybczak
Contributor
  • 0 kudos

Hi @Shrinivas, could you share with us how you configured the Databricks provider?

nickneoners
by New Contributor II
  • 492 Views
  • 2 replies
  • 0 kudos

Variables in databricks.yml "include:" - Asset Bundles

Hi, we've got an app that we deploy to multiple customers' workspaces. We're looking to transition to asset bundles. We would like to structure our resources like: -src/ -resources/ |-- customer_1/ |-- job_1 |-- job_2 |-- customer_2/ |-- job_...

Latest Reply
p4pratikjain
Contributor
  • 0 kudos

Interesting use case! Ideally, having a separate bundle for each customer seems like a clean solution. But if you don't want that, you can just include all the YAML files in databricks.yml with include: - resources/*/*.yml. Inside the yaml files...
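As a sketch, the glob-include approach suggested in the reply might look like this in databricks.yml (the bundle name is hypothetical; the paths follow the directory tree in the question):

```yaml
# databricks.yml — hypothetical bundle name; paths follow the tree in the question
bundle:
  name: customer_jobs

include:
  - resources/*/*.yml   # picks up resources/customer_1/*.yml, resources/customer_2/*.yml, etc.
```

One glob pulls every customer's job definitions into the bundle without listing each file individually.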

1 More Replies
Bhavya21
by New Contributor II
  • 360 Views
  • 2 replies
  • 0 kudos

Databricks exam got suspended without any reason. Immediate assistance required

Hello Team, @Cert-Team @Cert-Bricks I had my exam yesterday and had a pathetic experience while attempting my first Databricks certification. Abruptly, the proctor asked me to show my desk; after showing it, he/she asked multiple times... before that somehow 2 ...

Latest Reply
Bhavya21
New Contributor II
  • 0 kudos

@Kaniz_Fatma @Cert-Team @Cert-Bricks Thanks so much for responding. But I have been waiting since the day before yesterday and have not yet received any response to the ticket. Can you please look into it? I appreciate it.

1 More Replies
adb_newbie
by New Contributor
  • 357 Views
  • 1 reply
  • 0 kudos

Creating a table in ADB SQL for multiple JSON files and selecting all the rows from all the files

Hi, I have multiple JSON files stored in my ADLS Gen2 and I want to create a table which will directly read all the data from ADLS without mounting the files. When I create the table, I cannot select all the data. How can I achieve this? ADLS path: /dwh/...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @adb_newbie, To get all the visit.id in a single row from all the files, you can use the LATERAL VIEW and EXPLODE functions in your SQL query.
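As a sketch of what that query could look like: the actual JSON schema isn't shown in the post, so assume each file holds a `visits` array of structs with an `id` field, and note the ADLS path is truncated in the original:

```sql
-- Hypothetical table over the JSON files (path truncated in the original post)
CREATE TABLE visits_raw
USING json
OPTIONS (path '/dwh/...');

-- One row per visit id, across all files
SELECT visit.id
FROM visits_raw
LATERAL VIEW explode(visits) exploded AS visit;
```

`explode` turns each element of the array into its own row, so the ids from every file come back as a single column.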

ram0021
by New Contributor II
  • 583 Views
  • 2 replies
  • 1 kudos

You haven't configured the CLI yet

I have coded a pyfunc model in a Databricks notebook and deployed and served the model through an endpoint. I tried to query the endpoint through the Databricks notebook itself using the code below, but I am getting a CLI error. Not sure why I am getting this error since I ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @ram0021, It looks like the Databricks CLI is not properly configured in your Databricks notebook environment. Open a terminal or command prompt on your local machine (not the Databricks notebook). Run the following command to configure the Databr...
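In outline, the configuration step the reply refers to looks like this (the workspace URL is a placeholder; the command prompts for a personal access token and writes the profile to ~/.databrickscfg):

```shell
databricks configure --host https://<your-workspace-url>
# prompted for: a personal access token for that workspace
```

Once the profile exists, CLI-backed calls from code can pick up the stored host and token instead of failing with a "not configured" error.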

1 More Replies
FaizH
by New Contributor III
  • 802 Views
  • 2 replies
  • 2 kudos

Resolved! Connecting Power BI to aws databricks using service principal

Hi, I am trying to connect AWS Databricks to Power BI using a service principal. Below are the steps I followed: 1. Created a Service Principal in Identity and Access. 2. I went to the Permission settings page under Settings > Advanced and added this new service pri...

Latest Reply
FaizH
New Contributor III
  • 2 kudos

Thanks for the reply, I got the solution. I was missing adding SP account in SQL Warehouse permission setting. After adding it, my PBI report is working fine.

1 More Replies
InquisitiveGeek
by New Contributor II
  • 344 Views
  • 2 replies
  • 1 kudos

How to get the JSON definition - "CREATE part" for a job using JOB ID or JOB Name

I want to get the JSON definition of the "create part" of the job. I have the job ID and job name. I am using a Databricks notebook for this. I can get the "GET" JSON API definition but am not able to get the "CREATE" part JSON definition, which I...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @InquisitiveGeek, To extract the "CREATE" part from the full JSON definition of a Databricks job, you can use the Databricks Jobs API to retrieve the job definition and then parse the relevant sections.
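In outline, the parsing step the reply describes could look like this. It assumes the Jobs API 2.1 response shape, where `GET /api/2.1/jobs/get` returns the job definition under a `settings` key alongside server-assigned metadata; the helper name and the sample payload are made up for illustration:

```python
def extract_create_payload(get_response: dict) -> dict:
    """Return the part of a jobs/get response that jobs/create accepts.

    Server-assigned fields like job_id and created_time are dropped;
    the "settings" object is the reusable job definition.
    """
    return get_response["settings"]

# Example with a minimal, made-up jobs/get response:
sample = {
    "job_id": 1234,
    "created_time": 1700000000000,
    "settings": {"name": "nightly_etl", "max_concurrent_runs": 1},
}
print(extract_create_payload(sample))  # {'name': 'nightly_etl', 'max_concurrent_runs': 1}
```

The extracted dictionary can then be posted back to the create endpoint (or saved as the job's definition-as-code).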

1 More Replies
himanmon
by New Contributor III
  • 454 Views
  • 2 replies
  • 0 kudos

Resolved! How can I increase the hard capacity of the master node?

I'm not sure if this is the right place to post my question. If not, please let me know where I should post it. I want to download large files from the web onto Databricks' master (driver) node. For example, I fetch a file over 150GB via API ...

Latest Reply
szymon_dybczak
Contributor
  • 0 kudos

Hi @himanmon, If you're 100% sure that you can't download this file to a storage account configured with Unity Catalog and you want it directly on the driver node's local storage, then why can't you just increase local disk space by choosing a larger instance ty...

1 More Replies
joseroca99
by New Contributor II
  • 685 Views
  • 5 replies
  • 0 kudos

Resolved! File found with %fs ls but not with spark.read

Code:
wikipediaDF = (spark.read
  .option("HEADER", True)
  .option("inferSchema", True)
  .csv("/databricks-datasets/wikipedia-datasets/data-001/pageviews/raw/pageviews_by_second.tsv"))
display(bostonDF)
Error: Failed to store the result. Try rerunning ...

Latest Reply
joseroca99
New Contributor II
  • 0 kudos

Update 1: Apparently the problem shows up when using display(); using show() or display(df.limit()) works fine. I also started using the premium pricing tier; I'm going to see what happens if I use the 14-day free trial pricing tier. Update 2: I trie...

4 More Replies
Edoardo
by New Contributor
  • 309 Views
  • 1 reply
  • 0 kudos

Retry Trigger for Specific Errors and Custom Error States in Workflow UI

Hello everyone, in a workflow, is it possible to trigger a retry only for a specific error on a single task? I want the workflow UI to show a run as failed for both managed and unmanaged errors, but I don't want to trigger the retry for managed error...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Edoardo,  You can implement custom retry logic directly in your workflow code. For instance, if you encounter a specific error that you want to retry, you can wrap your main workflow function with a conditional check. If the function returns that...
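A minimal sketch of that pattern: which exception types count as retryable is an assumption here (shown as TimeoutError/ConnectionError); adapt it to whatever distinguishes your unmanaged errors from managed ones.

```python
import time

# Assumption: these exception types represent transient ("unmanaged") errors.
RETRYABLE = (TimeoutError, ConnectionError)

def run_with_retries(fn, attempts=3, delay_s=1.0):
    """Run fn, retrying only retryable errors; anything else fails at once."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except RETRYABLE:
            if attempt == attempts:
                raise  # retries exhausted: let the task (and the run) fail
            time.sleep(delay_s)
        # Non-retryable ("managed") errors propagate immediately, so the
        # workflow UI still shows the run as failed on the first occurrence.
```

Wrapping the task body this way gives transient errors a bounded number of retries while managed errors surface to the workflow UI right away.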

trimethylpurine
by New Contributor II
  • 1650 Views
  • 3 replies
  • 2 kudos

Resolved! Gathering Data Off Of A PDF File

Hello everyone,I am developing an application that accepts pdf files and inserts the data into my database. The company in question that distributes this data to us only offers PDF files, which you can see attached below (I hid personal info for priv...

Latest Reply
NicholasGray
New Contributor II
  • 2 kudos

Thank you so much for the help.

2 More Replies
Aria
by New Contributor III
  • 1138 Views
  • 2 replies
  • 0 kudos

OAuth user-to-machine (U2M) authentication

I am trying to use OAuth user-to-machine (U2M) authentication from the Azure Databricks CLI. When I run databricks auth login --host, a web browser opens and I get an authentication successful message, and my profile also saves successfully with auth-type...

Latest Reply
Ayushi_Suthar
Honored Contributor
  • 0 kudos

Hi @Aria, good day! Which CLI version are you using here? Can you try updating the CLI to a newer version by referring to this document: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/install#--homebrew-update-for-linux-...

1 More Replies
sathya08
by New Contributor
  • 412 Views
  • 2 replies
  • 1 kudos

Databricks Asset Bundle Error 'KEY of the resource to run"

Hello Team, I am new to DAB and running it for the first time through the Databricks CLI. The bundle validation is successful, but running it errors out with error="expected a KEY of the resource to run". Can anyone help me with what to check to resolve t...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @sathya08, The error "expected a KEY of the resource to run" typically indicates an issue with how the Databricks Asset Bundle (DAB) is being run through the Databricks CLI. Ensure that the bundle you are trying to run is va...
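For reference, `databricks bundle run` expects as its argument the resource key defined under `resources:` in databricks.yml; the error above typically means that key was omitted. A sketch, with a hypothetical job key:

```shell
# databricks.yml defines e.g. resources.jobs.my_job  ->  "my_job" is the KEY
databricks bundle validate
databricks bundle run my_job
```

Validation passing without a key is expected, because the key is only required when selecting which resource to run.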

1 More Replies
hugodscarvalho
by New Contributor II
  • 2834 Views
  • 3 replies
  • 2 kudos

Resolved! Issue with Private PyPI Mirror Package Dependencies Installation

I'm encountering an issue with the installation of Python packages from a Private PyPI mirror, specifically when the package contains dependencies and the installation is on Databricks clusters - Cluster libraries | Databricks on AWS. Initially, ever...

Tags: Artifactory, Databricks clusters, Dependency resolution, Installation issues, Private PyPI mirror
Latest Reply
Adiga
New Contributor II
  • 2 kudos

Hi @hugodscarvalho, I am also at this point, where the transitive dependencies (available in JFrog) are not getting installed in my job cluster. Could you please elaborate a bit on what exactly needed to be changed in the JFrog setup for this to work...

2 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
