Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Sadiq
by New Contributor III
  • 3607 Views
  • 5 replies
  • 4 kudos

Fixed length file from Databricks notebook ( Spark SQL)

Hi, I need help writing data from an Azure Databricks notebook into a fixed-length .txt file. The notebook output has 10 lakh (1 million) rows and 86 columns. Can anyone suggest an approach?

Latest Reply
Vidula
Honored Contributor
  • 4 kudos

Hi @sadiq vali​ Hope all is well! Just wanted to check in if you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

4 More Replies
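A common way to produce a fixed-length file from Spark is to pad every column to its specified width and concatenate the padded fields into a single text column (in PySpark, typically `F.rpad` plus `F.concat` before `df.write.text(...)`). The padding rule itself can be sketched in plain Python; the column widths below are made-up assumptions, not from the post:

```python
# Sketch of fixed-length formatting, assuming hypothetical column widths.
# In PySpark the same rule is usually applied column-wise with F.rpad and
# F.concat, then written out with df.write.text(...).

def to_fixed_width(row, widths):
    """Left-justify each field to its width (truncating overflow) and join."""
    return "".join(str(v).ljust(w)[:w] for v, w in zip(row, widths))

widths = [5, 10, 4]                 # assumed spec: id=5, name=10, age=4
line = to_fixed_width(("A1", "Alice", 30), widths)
print(repr(line))                   # every record is exactly sum(widths) chars
```

Because every record has the same byte width, a downstream fixed-length reader can slice columns by position.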
PrebenOlsen
by New Contributor III
  • 3556 Views
  • 4 replies
  • 1 kudos

GroupBy in delta live tables fails with error "RuntimeError: Query function must return either a Spark or Koalas DataFrame"

I have a delta live table that I'm trying to run GroupBy on, but I'm getting an error: "RuntimeError: Query function must return either a Spark or Koalas DataFrame". Here is my code:
@dlt.table
def groups_hierarchy():
    df = dlt.read_stream("groups_h...

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hi @Preben Olsen​ Does @Debayan Mukherjee​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

3 More Replies
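For context, DLT raises the RuntimeError in the title when the `@dlt.table`-decorated function does not return a Spark (or Koalas) DataFrame, e.g. when the `return` is missing or the result was converted with `.toPandas()`. A minimal illustrative fragment (only runnable inside a DLT pipeline; the source table name and columns are assumptions, not from the post):

```python
import dlt
from pyspark.sql import functions as F

@dlt.table
def groups_hierarchy():
    df = dlt.read_stream("groups_hierarchy_raw")  # assumed source name
    # The decorated function must return the Spark DataFrame itself;
    # returning None (or a pandas DataFrame) triggers the RuntimeError.
    return df.groupBy("group_id").agg(F.count("*").alias("member_count"))
```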
190809
by Contributor
  • 1388 Views
  • 0 replies
  • 0 kudos

Pulling Data From Stripe to Databricks using the Webhook

I am doing some investigation into how to connect Databricks and Stripe. Stripe has really good documentation, and I have decided to set up a webhook in Django as per their recommendation. This function handles events as they occur in Stripe:-----------...

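Since the post is about receiving Stripe webhook events, one step worth sketching is signature verification, which Stripe's Python SDK handles via `stripe.Webhook.construct_event`. The sketch below recomputes the `v1` HMAC-SHA256 signature by hand, assuming Stripe's documented scheme of signing `"{timestamp}.{body}"` with the endpoint secret; the secret, payload, and timestamp are made-up test values:

```python
# Hypothetical sketch of Stripe-style webhook signature verification,
# done without the stripe SDK. Secret/payload/timestamp are invented.
import hashlib
import hmac

def verify_signature(payload: bytes, timestamp: str, v1_sig: str, secret: str) -> bool:
    """Recompute the v1 signature over '{timestamp}.{payload}' and compare."""
    signed = f"{timestamp}.".encode() + payload
    expected = hmac.new(secret.encode(), signed, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, v1_sig)

secret = "whsec_test"
payload = b'{"type": "charge.succeeded"}'
ts = "1700000000"
good_sig = hmac.new(secret.encode(), f"{ts}.".encode() + payload,
                    hashlib.sha256).hexdigest()
```

In a real Django view, the raw request body, the `Stripe-Signature` header, and the endpoint secret would feed this check before the event is processed.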
Munni
by New Contributor II
  • 632 Views
  • 0 replies
  • 0 kudos

Hi, I need some help. I am reading a CSV file through PySpark in which one field is encoded with double quotes; I should get that value along with double quo...

Hi, I need some help. I am reading a CSV file through PySpark in which one field is encoded with double quotes, and I should get that value along with the double quotes. Spark version is 3.0.1.
Input:
col1,col2,col3
"A",""B,C"","D"
Expected output:
A , "B,C" , D

KrishZ
by Contributor
  • 3102 Views
  • 2 replies
  • 1 kudos

Where to report a bug with Databricks ?

I have an issue in pyspark.pandas to report. Is there a GitHub repo or some forum where I can register my issue? Here's the issue

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, @Krishna Zanwar​ Could you please raise a support case to report the bug? Please refer to https://docs.databricks.com/resources/support.html to engage with Databricks Support.

1 More Replies
Andrei_Radulesc
by Contributor III
  • 7104 Views
  • 1 replies
  • 2 kudos

Resolved! Error: cannot create mws credentials: Cannot complete request; user is unauthenticated

I am configuring databricks_mws_credentials through Terraform on AWS. This used to work up until a couple of days ago; now I am getting "Error: cannot create mws credentials: Cannot complete request; user is unauthenticated". My user/pw/account credential...

Latest Reply
Andrei_Radulesc
Contributor III
  • 2 kudos

Update: after changing the account password, the error went away. There seems to have been a temporary glitch in Databricks preventing Terraform from working with the old password, because the old password was correctly set up. Anyhow, now I have a w...

RohitKulkarni
by Contributor II
  • 2017 Views
  • 0 replies
  • 2 kudos

Get file from SharePoint to copy into Azure blob storage

Hello Team, I am trying to copy the xlsx files from SharePoint and move them to Azure Blob Storage.
USERNAME = app_config_client.get_configuration_setting(key='BIAppConfig:SharepointUsername', label='BIApp').value
PASSWORD = app_config_client.get_configurat...

Anonymous
by Not applicable
  • 1554 Views
  • 0 replies
  • 0 kudos

Data + AI World Tour ✈️

Data + AI World Tour brings the data lakehouse to the global data community. With content, customers and speakers tailored to each region, the tour showcases how and why the data lakehouse is quickly becoming the cloud data archite...

ahuarte
by New Contributor III
  • 21596 Views
  • 17 replies
  • 3 kudos

Resolved! Getting Spark & Scala version in Cluster node initialization script

Hi there, I am developing a cluster node initialization script (https://docs.gcp.databricks.com/clusters/init-scripts.html#environment-variables) in order to install some custom libraries. Reading the Databricks docs, we can get some environment var...

Latest Reply
Lingesh
Databricks Employee
  • 3 kudos

We can infer the cluster DBR version using the env variable $DATABRICKS_RUNTIME_VERSION. (For the exact Spark/Scala version mapping, you can refer to the specific DBR release notes.) Sample usage inside an init script:
DBR_10_4_VERSION="10.4"
if [[ "$DATABRICKS_...

16 More Replies
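The truncated snippet in the reply above follows a standard pattern: branch on $DATABRICKS_RUNTIME_VERSION inside the init script and install version-appropriate libraries. A hedged reconstruction (the echo messages and the fallback default are assumptions so the fragment can be exercised outside a cluster, where Databricks would export the variable for you):

```shell
#!/bin/bash
# DATABRICKS_RUNTIME_VERSION is exported by Databricks inside init scripts;
# the default here exists only so the snippet runs standalone.
DATABRICKS_RUNTIME_VERSION="${DATABRICKS_RUNTIME_VERSION:-10.4}"

DBR_10_4_VERSION="10.4"
if [[ "$DATABRICKS_RUNTIME_VERSION" == "$DBR_10_4_VERSION"* ]]; then
  echo "DBR ${DATABRICKS_RUNTIME_VERSION}: installing 10.4-compatible libraries"
else
  echo "DBR ${DATABRICKS_RUNTIME_VERSION}: installing default libraries"
fi
```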
nickagel
by New Contributor III
  • 5369 Views
  • 5 replies
  • 4 kudos

AWS Glue Catalog w/ Delta Tables Connected to Databricks SQL Engine - Incompatible format detected.

I've posted the same question on Stack Overflow to try to maximize reach here & potentially raise this issue to Databricks. I am trying to query delta tables from my AWS Glue Catalog on the Databricks SQL engine. They are stored in Delta Lake format. I ha...

Latest Reply
Vidula
Honored Contributor
  • 4 kudos

Hi @Nick Agel​ Hope all is well! Just wanted to check in if you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

4 More Replies
tariq
by New Contributor III
  • 8864 Views
  • 4 replies
  • 0 kudos

Importing python module

I'm not sure how a simple thing like importing a module in Python can be so broken in such a product. First, I was able to make it work using the following:
import sys
sys.path.append("/Workspace/Repos/Github Repo/sparkling-to-databricks/src")
from ut...

Latest Reply
KrishZ
Contributor
  • 0 kudos

I too wonder the same thing. How can importing a Python module be so difficult and not even documented, lol. No need for libraries. Here's what worked for me.
Step 1: Upload the module by first opening a notebook >> File >> Upload Data >> drag and drop ...

3 More Replies
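The sys.path approach described in the thread can be demonstrated end to end. The folder and module names below are hypothetical stand-ins for a /Workspace/Repos/.../src checkout:

```python
# Sketch of the sys.path import pattern. The temp folder stands in for a
# repo source directory such as /Workspace/Repos/<repo>/src.
import os
import sys
import tempfile

# Simulate a repo folder containing a module we want to import.
repo_src = tempfile.mkdtemp()
with open(os.path.join(repo_src, "my_utils.py"), "w") as f:
    f.write("def greet():\n    return 'hello from repo'\n")

# The pattern from the thread: append the folder, then import normally.
sys.path.append(repo_src)
from my_utils import greet

print(greet())
```

Appending the source folder (rather than the repo root) lets `from my_utils import greet` resolve without package prefixes.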
DineshNO
by New Contributor II
  • 2565 Views
  • 1 replies
  • 1 kudos

How to execute a spark-submit on a Databricks job for a Spring Boot jar built using Maven. Failing with error: Error: Failed to load class com.****.settlement.jobs.EntryPoint.

I have set up a Spring Boot application which works as expected as a standalone Spring Boot app. When I build the jar and try to set it up as a Databricks job, I am facing these issues. I am getting the same error locally as well. I have tried using maven-s...

Latest Reply
Atanu
Databricks Employee
  • 1 kudos

Could you please try with the Python terminal and see how that behaves? I am not 100% sure if this relates to your use case. @Dinesh L​

