Data Engineering

Forum Posts

Etyr
by Contributor
  • 1219 Views
  • 3 replies
  • 1 kudos

databricks.sql.exc.RequestError OpenSession error None

I'm trying to access a Databricks SQL Warehouse with Python. I'm able to connect with a token on a compute instance on Azure Machine Learning. It's a VM with conda installed; I create an env with Python 3.10. from databricks import sql as dbsql dbsq...

Latest Reply
Etyr
Contributor

The issue was that the new version of databricks-sql-connector (3.0.1) does not handle error messages well. So it gave a generic error and a timeout where it should have given me a 403 and an immediate error message, without the 900-second timeout. https://gith...
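A minimal sketch of the connection pattern discussed in this thread, assuming the databricks-sql-connector package; the hostname, warehouse ID, and token are placeholders. Chaining the exception keeps the underlying cause (e.g. a 403) visible rather than only the generic OpenSession timeout reported for 3.0.1.

```python
def warehouse_http_path(warehouse_id):
    # Build the HTTP path for a SQL Warehouse from its ID (placeholder ID).
    return f"/sql/1.0/warehouses/{warehouse_id}"

def open_session(server_hostname, warehouse_id, access_token):
    # Open a session, re-raising failures with their real cause attached.
    from databricks import sql as dbsql  # pip install databricks-sql-connector
    try:
        return dbsql.connect(
            server_hostname=server_hostname,
            http_path=warehouse_http_path(warehouse_id),
            access_token=access_token,
        )
    except Exception as exc:
        # Chaining via `from exc` surfaces the underlying auth error
        # instead of leaving only a generic OpenSession message.
        raise RuntimeError(f"OpenSession failed for {server_hostname}") from exc
```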

2 More Replies
menonshiji
by New Contributor
  • 344 Views
  • 1 reply
  • 0 kudos

#HelpPost for Azure Blob to Databricks connection.

Hi, there is a set of .csv/.txt files in a storage container, i.e. Azure Blob Storage / Azure Data Lake Storage Gen2. I would like to ingest the files into Databricks. Datasets and linked services were created on both ends. Also, an all-purpose cluster was created in Bric...

Latest Reply
Kaniz
Community Manager

Hi @menonshiji, The error in the Databricks Pipeline indicates that the Azure Blob Storage container is inaccessible due to the lack of proper credentials. To resolve this issue, make sure to include the necessary credentials in the Spark configurati...
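A hedged sketch of what such a Spark configuration can look like, assuming ADLS Gen2 with account-key authentication; the account, container, and secret-scope names are placeholders, and a service principal or SAS token would be preferable in production.

```python
def abfss_uri(container, account, path=""):
    # Build an abfss:// URI for an ADLS Gen2 container (placeholder names).
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

# Inside a notebook, with the real key kept in a secret scope (placeholders):
# account = "mystorageaccount"
# key = dbutils.secrets.get("my-scope", "storage-key")
# spark.conf.set(f"fs.azure.account.key.{account}.dfs.core.windows.net", key)
# df = spark.read.option("header", True).csv(abfss_uri("raw", account, "input/"))
```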

Rishabh264
by Honored Contributor II
  • 2151 Views
  • 3 replies
  • 3 kudos

www.linkedin.com

Woahhh, #Excel plug-in for #DeltaSharing. Now I can import Delta tables directly into my spreadsheet using Delta Sharing. It puts the power of #DeltaLake into the hands of millions of business users. What does this mean? Imagine a data provider delivering...

Latest Reply
udit02
New Contributor II

If you have any uncertainties, feel free to inquire here or connect with me on my LinkedIn profile for further assistance.https://whatsgbpro.org/

2 More Replies
ShankarReddy
by New Contributor II
  • 281 Views
  • 1 reply
  • 0 kudos

XML Unmarshalling using JAXB from JavaRDD

I have a JavaRDD with complex nested XML content that I want to unmarshal using JAXB to get the data into Java objects. Can anyone please help with how I can achieve this? Thanks

Data Engineering
java
java spark xml jaxb
jaxb
spark
XML
Latest Reply
ShankarReddy
New Contributor II

I hope this should work:

JavaPairRDD<String, PortableDataStream> jrdd = javaSparkContext.binaryFiles("<path_to_file>");
Map<String, PortableDataStream> mp = jrdd.collectAsMap();
OutputStream os = new FileOutputStream(f);
mp.values().forEach(pd -> { try...

JensH
by New Contributor III
  • 2590 Views
  • 3 replies
  • 2 kudos

Resolved! How to pass parameters to a "Job as Task" from code?

Hi, I would like to use the new "Job as Task" feature but I'm having trouble passing values. Scenario: I have a workflow job which contains 2 tasks. Task_A (type "Notebook"): read data from a table and, based on the contents, decide whether the workflow in ...

Data Engineering
job
parameters
workflow
Latest Reply
Walter_C
Valued Contributor II

I found the following information: value is the value for this task value’s key. This command must be able to represent the value internally in JSON format. The size of the JSON representation of the value cannot exceed 48 KiB. You can refer to https...
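A sketch of the pattern under discussion, assuming the dbutils.jobs.taskValues API available in notebook tasks; the task and key names are placeholders. The size guard mirrors the documented 48 KiB limit on the JSON representation.

```python
import json

KIB = 1024

def within_task_value_limit(value, limit_kib=48):
    # Check the documented 48 KiB JSON-size limit for a task value.
    return len(json.dumps(value).encode("utf-8")) <= limit_kib * KIB

# In Task_A (notebook task, placeholders throughout):
# decision = {"run_downstream": True}
# assert within_task_value_limit(decision)
# dbutils.jobs.taskValues.set(key="decision", value=decision)
#
# In a downstream task:
# decision = dbutils.jobs.taskValues.get(taskKey="Task_A", key="decision",
#                                        default={}, debugValue={})
```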

2 More Replies
Alessandro
by New Contributor
  • 392 Views
  • 1 reply
  • 0 kudos

Update jobs parameter, when running, from API

Hi, when a job is running, I would like to change its parameters with an API call. I know that I can set parameter values from the API when I start a job, and that I can update the default values if the job isn't running, but I didn't find an API c...

Latest Reply
Walter_C
Valued Contributor II

No, there is currently no option to change parameters while the job is running. From the UI you will be able to modify them, but it won't affect the current run; the change is applied to the new job runs you trigger.
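For completeness, a hedged sketch of updating a job's default parameters between runs via the Jobs API (POST /api/2.1/jobs/update); the host, token, job ID, and parameter names are placeholders, and a run already in flight is not affected.

```python
import json
from urllib import request

def update_job_params_payload(job_id, params):
    # Payload for POST /api/2.1/jobs/update: new default job parameters.
    return {
        "job_id": job_id,
        "new_settings": {
            "parameters": [{"name": k, "default": v} for k, v in params.items()]
        },
    }

# Placeholders: host, token.
# req = request.Request(
#     f"https://{host}/api/2.1/jobs/update",
#     data=json.dumps(update_job_params_payload(123, {"env": "prod"})).encode(),
#     headers={"Authorization": f"Bearer {token}",
#              "Content-Type": "application/json"},
#     method="POST")
# request.urlopen(req)
```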

User16826992185
by New Contributor II
  • 5679 Views
  • 2 replies
  • 3 kudos

Databricks Auto-Loader vs. Delta Live Tables

What is the difference between Databricks Auto-Loader and Delta Live Tables? Both seem to manage ETL for you but I'm confused on where to use one vs. the other.

Latest Reply
SteveL
New Contributor II

You say "...__would__ be a piece..." and "...DLT __would__ pick up...".Is DLT built upon AL?
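One way to see the relationship: Auto Loader is the ingestion mechanism (the cloudFiles stream source), while Delta Live Tables is a pipeline framework that can use it as a source. A hedged sketch, with paths and option values as placeholders:

```python
def autoloader_reader_options(file_format, schema_location):
    # The options that turn a plain stream read into an Auto Loader read.
    return {
        "cloudFiles.format": file_format,
        "cloudFiles.schemaLocation": schema_location,
    }

# Inside a Delta Live Tables pipeline, Auto Loader supplies the source:
# import dlt
# @dlt.table
# def raw_events():
#     opts = autoloader_reader_options("json", "/tmp/schemas/raw_events")
#     return (spark.readStream.format("cloudFiles")
#             .options(**opts)
#             .load("/data/events/"))
```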

1 More Replies
Shivam_Pawar
by New Contributor III
  • 8083 Views
  • 11 replies
  • 4 kudos

Databricks Lakehouse Fundamentals Badge

I have successfully passed the test after completing the course with 95%. But I haven't received any badge from your side as promised. I have been provided with a certificate which looks fake by itself. I need to post my credentials on LinkedIn wi...

Latest Reply
Shruti_Prajapat
New Contributor II

I'm facing a similar issue. I have completed the training and the quiz successfully and am able to download a course completion certificate, but the certificate doesn't have any ID and looks very generic and fake. I have signed up for the https://credentials.d...

10 More Replies
Maxi1693
by New Contributor II
  • 1009 Views
  • 1 reply
  • 0 kudos

Resolved! Error java.lang.NullPointerException using Autoloader

Hi! I am pulling data from Blob storage into Databricks using Auto Loader. This process works well for almost 10 resources, but for a specific one I am getting this error: java.lang.NullPointerException. It looks like the issue is when I connect to th...

Latest Reply
shan_chandra
Honored Contributor III

@Maxi1693 - The value for schemaEvolutionMode should be a string. Could you please try changing .option("cloudFiles.schemaEvolutionMode", None) to .option("cloudFiles.schemaEvolutionMode", "none") and let us know. Refe...
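A sketch of the corrected option, with a small guard that rejects non-string values like None; the mode names below are assumptions based on the Auto Loader docs, and the source path is a placeholder.

```python
VALID_MODES = {"addNewColumns", "rescue", "failOnNewColumns", "none"}

def schema_evolution_option(mode):
    # Return the Auto Loader option pair, rejecting values like Python None.
    if mode not in VALID_MODES:
        raise ValueError(f"schemaEvolutionMode must be one of {sorted(VALID_MODES)}")
    return ("cloudFiles.schemaEvolutionMode", mode)

# df = (spark.readStream.format("cloudFiles")
#       .option("cloudFiles.format", "csv")
#       .option(*schema_evolution_option("none"))  # the string "none", not None
#       .load(source_path))
```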

FurqanAmin
by New Contributor II
  • 896 Views
  • 5 replies
  • 1 kudos

Logs not coming up in the UI - while being written to DBFS

I have a few spark-submit jobs that are run via Databricks workflows. I have configured logging in DBFS and specified a location in my GCS bucket. The logs are present in that GCS bucket for the latest run, but whenever I try to view them from th...

Data Engineering
logging
LOGS
ui
Latest Reply
Lakshay
Esteemed Contributor

Yes, I meant to set it to None. Is the issue specific to any particular cluster? Or do you see the issue with all the clusters in your workspace?

4 More Replies
Noman_Q
by New Contributor II
  • 573 Views
  • 2 replies
  • 1 kudos

Error Running Delta Live Pipeline.

Hi guys, I am new to Delta pipelines. I have created a pipeline, and now when I try to run it I get the error message "PERMISSION_DENIED: You are not authorized to create clusters. Please contact your administrator", even though I can crea...

Latest Reply
Noman_Q
New Contributor II

Thank you for responding, @Palash01, and thanks for pointing me in the right direction. To get around it, I had to be granted the "unrestricted cluster creation" permission.

1 More Replies
rt-slowth
by Contributor
  • 545 Views
  • 3 replies
  • 0 kudos

Why is the userIdentity anonymous?

Do you know why the userIdentity is anonymous in AWS CloudTrail's logs even though I have specified an instance profile?

Latest Reply
Kaniz
Community Manager

Thank you for posting your question in our community! We are happy to assist you.To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?This...

2 More Replies
joeyslaptop
by New Contributor II
  • 1993 Views
  • 5 replies
  • 2 kudos

How to add a column to a new table containing the original source filenames in DataBricks.

If this isn't the right spot to post this, please move it or refer me to the right area. I recently learned about "_metadata.file_name". It's not quite what I need. I'm creating a new table in Databricks and want to add a USR_File_Name column cont...

Data Engineering
Databricks
filename
import
SharePoint
Upload
Latest Reply
Debayan
Esteemed Contributor III

Hi, Could you please elaborate more on the expectation here? 
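In case it helps the original question: the _metadata column can be materialized as a regular column at read time. A hedged sketch, assuming the files are read with Spark; paths and the table name are placeholders.

```python
import os

def usr_file_name(full_path):
    # Reduce a full path/URI to just the file name for a USR_File_Name column.
    return os.path.basename(full_path)

# Inside a notebook (placeholders throughout):
# from pyspark.sql import functions as F
# df = (spark.read.option("header", True).csv("/data/incoming/")
#       .withColumn("USR_File_Name", F.col("_metadata.file_name")))
# df.write.saveAsTable("my_new_table")
```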

4 More Replies
William_Scardua
by Valued Contributor
  • 236 Views
  • 1 reply
  • 0 kudos

Cluster types pricing

Hi guys, how can I get the pricing of cluster types (standard_D*, standard_E*, standard_F*, etc.)? I'm doing a study to decrease the price of my current cluster. Any ideas? Thank you!

Latest Reply
Lakshay
Esteemed Contributor

Hey, you can use the pricing calculator here: https://www.databricks.com/product/pricing/product-pricing/instance-types

JJ_LVS1
by New Contributor III
  • 1283 Views
  • 4 replies
  • 1 kudos

FiscalYear Start Period Is not Correct

Hi, I'm trying to create a calendar dimension including a fiscal year with a fiscal start of April 1. I'm using the fiscalyear library and am setting the start to month 4, but it insists on setting April to month 7. Runtime: 12.1. My code snippet is: start_...

Latest Reply
DataEnginner
New Contributor II

import fiscalyear
import datetime

def get_fiscal_date(year, month, day):
    fiscalyear.setup_fiscal_calendar(start_month=4)
    v_fiscal_month = fiscalyear.FiscalDateTime(year, month, day).fiscal_month  # To get the Fiscal Month
    v_fiscal_quarter = fiscalyea...

3 More Replies