Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

TutorBees_Net
by New Contributor
  • 1650 Views
  • 2 replies
  • 0 kudos


We provide online tutoring for students from Grade 5 all the way up to professionals. You can find the best tutors for Maths, Biology, Physics, Chemistry, English, Social Sciences, and Urdu in the comfort of your home. You can also find professional ...

Latest Reply
frillow
New Contributor II
  • 0 kudos

The Academized review is not that clear. The company seems legitimate enough, but the anonymous profiles make customers and users doubt its legitimacy. While Academized does list the number of custom feedbacks it offers and the fields of specializati...

1 More Reply
Raj4
by New Contributor III
  • 3589 Views
  • 7 replies
  • 0 kudos

www.databricks.com

Hi Team, I have attended the virtual instructor-led training on 23-08-2022 (https://www.databricks.com/p/webinar/databricks-lakehouse-fundamentals-learning-plan). As per the steps mentioned, I have completed all of the steps for getting the voucher, but ...

Latest Reply
amit
New Contributor II
  • 0 kudos

Thanks @Nadia Elsayed for the quick response. I have booked my exam with the supplied coupon without any issue. Thanks, Amit

6 More Replies
sp334
by New Contributor II
  • 9473 Views
  • 4 replies
  • 3 kudos

Resolved! NPIP tunnel setup failed [WaitForNgrokTunnel]

Hello All, We have deployed a new Databricks instance in the Azure cloud. 1) The Databricks service is attached to a public subnet/private subnet (delegated to Microsoft.Databricks/workspaces). 2) I created a job with a cluster runtime (1 worker: Standard_DS3_v2, 7.3 LTS...

Latest Reply
fabienv
Databricks Employee
  • 3 kudos

In case others run into this in the future, here is something additional to check: Is your account/workspace enabled for the "compliance security profile"? If yes, you should see a little shield icon in the lower left-hand corner of the workspace. Once...

3 More Replies
tompile
by New Contributor III
  • 4579 Views
  • 6 replies
  • 9 kudos

Resolved! Is it possible to make use of pygit2 or GitPython packages to reference git repositories from within databricks?

I am making use of Repos in Databricks and am trying to reference the current git branch from within the notebook session. For example: from pygit2 import Repository; repo = Repository('/Workspace/Repos/user@domain/repository'). The code above throws an er...

Latest Reply
niburg123
New Contributor III
  • 9 kudos

You cannot use this as far as I know, but you can put a workaround in a notebook if you are calling code from your repo via a notebook: repo_path = "/Repos/xyz_repo_path/xyz_repo_name"; repo_path_fs = "/Workspace" + repo_path; repo_branch = "main"; def chec...
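
An alternative to the file-based workaround above is to ask the workspace which branch a repo has checked out. A minimal sketch, assuming the notebook can reach the Databricks Repos REST API (GET /api/2.0/repos); the workspace URL, personal access token, and repo path are placeholders:

import os
import requests

host = os.environ["DATABRICKS_HOST"]          # placeholder, e.g. "https://adb-1234567890.12.azuredatabricks.net"
token = os.environ["DATABRICKS_TOKEN"]        # placeholder personal access token
repo_path = "/Repos/user@domain/repository"   # placeholder repo path from the question

# List repos under the path prefix and read the branch of the matching one.
resp = requests.get(
    f"{host}/api/2.0/repos",
    headers={"Authorization": f"Bearer {token}"},
    params={"path_prefix": repo_path},
)
resp.raise_for_status()
for repo in resp.json().get("repos", []):
    if repo["path"] == repo_path:
        print(repo["branch"])  # currently checked-out branch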

5 More Replies
Sadiq
by New Contributor III
  • 3301 Views
  • 5 replies
  • 4 kudos

Fixed length file from Databricks notebook ( Spark SQL)

Hi, I need help writing data from an Azure Databricks notebook into a fixed-length .txt file. The dataset has 10 lakh (1 million) rows and 86 columns. Can anyone suggest an approach?
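
One common approach, sketched below under the assumption that the fixed width of every column is known: right-pad each value to its width, concatenate the columns into a single string, and write that string column with Spark's text writer. The source table, column names, and widths here are illustrative placeholders.

from pyspark.sql import functions as F

# Placeholder layout -- replace with the real 86-column widths.
widths = {"col1": 10, "col2": 20, "col3": 8}

df = spark.table("my_source_table")  # placeholder source

# Cast each field to string (nulls become empty strings) and right-pad it to its fixed width.
padded = [F.rpad(F.coalesce(F.col(c).cast("string"), F.lit("")), w, " ").alias(c)
          for c, w in widths.items()]

# Concatenate the padded fields into one "value" column, which the text writer expects.
fixed = df.select(F.concat(*padded).alias("value"))
fixed.write.mode("overwrite").text("/mnt/output/fixed_length_txt")  # placeholder output path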

Latest Reply
Vidula
Honored Contributor
  • 4 kudos

Hi @sadiq vali, hope all is well! Just wanted to check in to see if you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

4 More Replies
PrebenOlsen
by New Contributor III
  • 3090 Views
  • 4 replies
  • 1 kudos

GroupBy in delta live tables fails with error "RuntimeError: Query function must return either a Spark or Koalas DataFrame"

I have a delta live table that I'm trying to run GroupBy on, but I'm getting an error: "RuntimeError: Query function must return either a Spark or Koalas DataFrame". Here is my code: @dlt.table def groups_hierarchy(): df = dlt.read_stream("groups_h...
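
A common cause of this error is that the decorated function transforms the DataFrame but never returns it. Below is a minimal sketch of a grouped DLT table that does return the result; the table name, source, and aggregation are placeholders, and dlt.read is used instead of dlt.read_stream so the aggregation stays a simple non-streaming computation:

import dlt
from pyspark.sql import functions as F

@dlt.table
def groups_hierarchy_by_group():
    # Placeholder upstream table name taken from the post.
    df = dlt.read("groups_hierarchy")
    # Returning the aggregated DataFrame is what satisfies
    # "Query function must return either a Spark or Koalas DataFrame".
    return (
        df.groupBy("group_id")  # placeholder grouping column
          .agg(F.count("*").alias("row_count"))
    )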

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hi @Preben Olsen, does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

3 More Replies
190809
by Contributor
  • 1192 Views
  • 0 replies
  • 0 kudos

Pulling Data From Stripe to Databricks using the Webhook

I am doing some investigation into how to connect Databricks and Stripe. Stripe has really good documentation and I have decided to set up a webhook in Django as per their recommendation. This function handles events as they occur in Stripe: -----------...
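
For context, a minimal sketch of the kind of Django webhook view Stripe's documentation describes, assuming the stripe Python package and a webhook signing secret. save_event_for_databricks is a hypothetical helper standing in for however the verified event gets landed somewhere Databricks can ingest it (e.g. cloud storage watched by Auto Loader):

import stripe
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

STRIPE_WEBHOOK_SECRET = "whsec_..."  # placeholder signing secret from the Stripe dashboard

@csrf_exempt
def stripe_webhook(request):
    payload = request.body
    sig_header = request.META.get("HTTP_STRIPE_SIGNATURE", "")
    try:
        # Verify the signature so only genuine Stripe events are processed.
        event = stripe.Webhook.construct_event(payload, sig_header, STRIPE_WEBHOOK_SECRET)
    except (ValueError, stripe.error.SignatureVerificationError):
        return HttpResponse(status=400)

    # Hypothetical helper: persist the raw event JSON where Databricks can pick it up.
    save_event_for_databricks(payload.decode("utf-8"))

    return HttpResponse(status=200)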

Munni
by New Contributor II
  • 553 Views
  • 0 replies
  • 0 kudos

Hi, I need some help. I am reading a CSV file through PySpark, in which one field is encoded with double quotes; I should get that value along with the double quo...

Hi, I need some help. I am reading a CSV file through PySpark, in which one field is encoded with double quotes; I should get that value along with the double quotes. Spark version is 3.0.1.
INPUT:
col1,col2,col3
"A",""B,C"","D"
-----------
OUTPUT:
A , "B,C" , D
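
A minimal sketch of one way to approach this, assuming the file looks like the INPUT above: setting both the quote and escape options to the double-quote character makes Spark treat a doubled quote inside a quoted field as a literal character rather than a field delimiter (the exact options may need tuning against the real file):

# Read the CSV so that doubled quotes inside a field survive parsing.
df = (
    spark.read
        .option("header", True)
        .option("quote", '"')   # character that wraps quoted fields
        .option("escape", '"')  # "" inside a quoted field becomes a literal "
        .csv("/mnt/raw/input_with_quotes.csv")  # placeholder path
)
df.show(truncate=False)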

KrishZ
by Contributor
  • 2607 Views
  • 2 replies
  • 1 kudos

Where to report a bug with Databricks ?

I have an issue in pyspark.pandas to report. Is there a GitHub repo or some forum where I can register my issue? Here's the issue

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi @Krishna Zanwar, could you please raise a support case to report the bug? Please refer to https://docs.databricks.com/resources/support.html to engage with Databricks Support.

1 More Reply
Andrei_Radulesc
by Contributor III
  • 6604 Views
  • 1 reply
  • 2 kudos

Resolved! Error: cannot create mws credentials: Cannot complete request; user is unauthenticated

I am configuring databricks_mws_credentials through Terraform on AWS. This used to work up to a couple of days ago - now, I am getting "Error: cannot create mws credentials: Cannot complete request; user is unauthenticated". My user/pw/account credential...

Latest Reply
Andrei_Radulesc
Contributor III
  • 2 kudos

Update: after changing the account password, the error went away. There seems to have been a temporary glitch in Databricks preventing Terraform from working with the old password - because the old password was correctly set up. Anyhow, now I have a w...

RohitKulkarni
by Contributor II
  • 1861 Views
  • 0 replies
  • 2 kudos

Get file from SharePoint to copy into Azure blob storage

Hello Team, I am trying to copy the xlsx files from SharePoint and move them to Azure Blob Storage. USERNAME = app_config_client.get_configuration_setting(key='BIAppConfig:SharepointUsername', label='BIApp').value PASSWORD = app_config_client.get_configurat...
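
A rough sketch of one way to do this, assuming the Office365-REST-Python-Client and azure-storage-blob packages; the site URL, server-relative file path, container name, connection string, and credentials are placeholders (the post pulls USERNAME/PASSWORD from Azure App Configuration as shown above):

from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext
from office365.sharepoint.files.file import File
from azure.storage.blob import BlobServiceClient

SITE_URL = "https://contoso.sharepoint.com/sites/bi"        # placeholder
FILE_URL = "/sites/bi/Shared Documents/report.xlsx"         # placeholder server-relative path
BLOB_CONN_STR = "<storage-account-connection-string>"       # placeholder
USERNAME = "user@contoso.com"                               # placeholder; from App Configuration in the post
PASSWORD = "<password>"                                     # placeholder; from App Configuration in the post

# Authenticate against SharePoint with the username/password credentials.
ctx = ClientContext(SITE_URL).with_credentials(UserCredential(USERNAME, PASSWORD))

# Download the file content into memory.
response = File.open_binary(ctx, FILE_URL)

# Upload the bytes to Azure Blob Storage.
blob_service = BlobServiceClient.from_connection_string(BLOB_CONN_STR)
blob_client = blob_service.get_blob_client(container="raw", blob="report.xlsx")
blob_client.upload_blob(response.content, overwrite=True)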

Anonymous
by Not applicable
  • 1444 Views
  • 0 replies
  • 0 kudos

Data + AI World Tour ✈️ Data + AI World Tour brings the data lakehouse to the global data community. With content, customers and speakers tai...

Data + AI World Tour brings the data lakehouse to the global data community. With content, customers and speakers tailored to each region, the tour showcases how and why the data lakehouse is quickly becoming the cloud data archite...

ahuarte
by New Contributor III
  • 18367 Views
  • 17 replies
  • 3 kudos

Resolved! Getting Spark & Scala version in Cluster node initialization script

Hi there, I am developing a cluster node initialization script (https://docs.gcp.databricks.com/clusters/init-scripts.html#environment-variables) in order to install some custom libraries. Reading the Databricks docs, we can get some environment var...

Latest Reply
Lingesh
Databricks Employee
  • 3 kudos

We can infer the cluster DBR version using the env variable $DATABRICKS_RUNTIME_VERSION. (For the exact Spark/Scala version mapping, you can refer to the specific DBR release notes.) Sample usage inside an init script: DBR_10_4_VERSION="10.4" if [[ "$DATABRICKS_...

16 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group