Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Tim2407
by New Contributor
  • 1911 Views
  • 1 replies
  • 1 kudos

Connection Error DataGrip Databricks

When trying to connect DataGrip with Databricks SQL, I'm able to do it for multiple connections using a token. However, one specific connection is not working. We tried everything internally, but we are not able to connect. Below is the err...

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, it looks like the requested resource is forbidden. Could you please check the destination web server and recheck the configuration? Please tag @Debayan in your next comment so that I get notified. Thank you!

B_J_Innov
by New Contributor III
  • 7898 Views
  • 12 replies
  • 0 kudos

Resolved! Can't use job cluster for scheduled jobs ADD_NODES_FAILED : Failed to add 9 containers to the cluster. Will attempt retry: false. Reason: Azure Quota Exceeded Exception

Hi everyone, I've been using my all-purpose cluster for scheduled jobs, and I've been told that it's suboptimal and that using a job cluster for scheduled jobs cuts costs by half. Unfortunately, when I tried to switch clusters on my ex...

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@Bassem Jaber If you are seeing the same error, then you need to increase your quota. For that, your Azure plan should be changed from pay-as-you-go to another plan, as the pay-as-you-go Azure model has limitations on quota increases.

11 More Replies
ejloh
by New Contributor II
  • 3022 Views
  • 3 replies
  • 0 kudos

How to trigger alert for twice per day at set times?

I need to create a Databricks alert for 9:30am and 5pm every day... is there a way to do this with one alert? I can't use "Refresh every 1 day at time..." since this will only trigger once per day. I also can't use "Refresh every 12 hours at minute....
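A single schedule can't express both times, since 9:30 and 17:00 differ in the minutes field. One workaround (an assumption here, not a built-in alert feature) is to schedule the underlying check more frequently and gate the notification on the wall clock. A minimal sketch of the gating logic in plain Python:

```python
from datetime import datetime, time

# The two daily trigger times from the question: 9:30am and 5pm.
TRIGGER_TIMES = [time(9, 30), time(17, 0)]

def should_fire(now: datetime, tolerance_minutes: int = 5) -> bool:
    """Return True if `now` falls within `tolerance_minutes` after one
    of the configured daily trigger times. Run this from a job that is
    scheduled at least every `tolerance_minutes` minutes."""
    minutes_now = now.hour * 60 + now.minute
    for t in TRIGGER_TIMES:
        start = t.hour * 60 + t.minute
        if start <= minutes_now < start + tolerance_minutes:
            return True
    return False
```

The helper name and the 5-minute tolerance are illustrative choices; the simpler alternative is just creating two separate alerts, one per refresh schedule.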

Latest Reply
Mits
New Contributor II
  • 0 kudos

Did anyone find a solution for this?

2 More Replies
Ancil
by Contributor II
  • 5975 Views
  • 8 replies
  • 6 kudos

Job aborted due to stage failure: Task 1863 in stage 10.0 failed 4 times, most recent failure: Lost task 1863.3 in stage 10.0 (TID 2021) (10.0.4.7 executor 2): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed): Fatal Python erro

I am getting the below error sometimes when running my Databricks notebook from ADF. If there is one executor node it works fine; if it increases to 2 or more, it sometimes fails on the same data. Cluster detail: Standard_F4s_v2 · Workers: Standard_F4s_v2 · 1-8 wo...

Latest Reply
swethaNandan
Databricks Employee
  • 6 kudos

Hi @Ancil P A Can you paste the complete stack trace from the failed task (from failed stage 10.0) and the code snippet that you are trying to run in the notebook? Also, could you raise a Databricks support ticket for the same?

7 More Replies
Krish1
by New Contributor II
  • 2170 Views
  • 1 replies
  • 0 kudos

Loading multiple gz files from ADLS to Delta Lake/Delta table in ADB

I have several gz files (file.csv.gz) in an ADLS folder which are pretty big. All of these files are extracted from the same base table, so they have similar data but for different dates. How can I transfer them into a Delta Lake/Delta table? We wou...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, you can read GZ files through Spark: https://stackoverflow.com/questions/42761912/how-to-read-gz-compressed-file-by-pyspark Please let us know if this helps. Also, please tag @Debayan in your next comment so that I get notified. Thank you!
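For context: Spark reads gzip-compressed CSV transparently based on the `.gz` extension (e.g. `spark.read.csv("abfss://.../*.csv.gz")`, then `.write.format("delta")` to land it in a Delta table). The decompress-then-parse idea Spark applies per file can be sketched in plain Python (the sample data is made up):

```python
import csv
import gzip
import io

# A tiny in-memory stand-in for one of the file.csv.gz files in ADLS.
raw = gzip.compress(b"id,date,amount\n1,2023-01-01,10\n2,2023-01-02,20\n")

# gzip.open in text mode decompresses on the fly, which is what Spark
# does for each input file when it detects the .gz extension.
with gzip.open(io.BytesIO(raw), "rt") as f:
    rows = list(csv.DictReader(f))
```

Note that gzip is not splittable, so each .csv.gz file is read by a single task; for very large files it can help to decompress or repartition after reading.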

Dean_Lovelace
by New Contributor III
  • 4364 Views
  • 3 replies
  • 0 kudos

How to filter the Spark UI for a notebook

When running Spark under YARN, each script has its own self-contained set of logs. In Databricks, all I see is a list of jobs and stages that have been run on the cluster. From a support perspective this is a nightmare. How can notebook logs be grou...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Dean Lovelace Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...

2 More Replies
elgeo
by Valued Contributor II
  • 8912 Views
  • 4 replies
  • 0 kudos

Function returns UNSUPPORTED_CORRELATED_SCALAR_SUBQUERY

Hello experts. The below function in Databricks gives an UNSUPPORTED_CORRELATED_SCALAR_SUBQUERY error. We didn't have this issue in Oracle, though. Is this a limitation of Databricks? Just to note, the final result returns only one row. Thank you in advan...

Latest Reply
TheofilosSt
New Contributor II
  • 0 kudos

Hello @Suteja Kanuri, can we have any response on the above? Thank you.

3 More Replies
THIAM_HUATTAN
by Valued Contributor
  • 1291 Views
  • 1 replies
  • 0 kudos

community.cloud.databricks.com

https://community.cloud.databricks.com/ Last night I was still able to use it. This morning it broke down totally, and I could not log in. Please help.

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @THIAM HUAT TAN​ Thank you for reaching out, and we’re sorry to hear about this log-in issue! We have this Community Edition login troubleshooting post on Community. Please take a look, and follow the troubleshooting steps. If the steps do not res...

cnjrules
by New Contributor III
  • 3165 Views
  • 3 replies
  • 0 kudos

Resolved! Reference file name when using COPY INTO?

When using the COPY INTO statement, is it possible to reference the current file name in the select statement? A generic example is shown below; I'm hoping I can log the file name in the target table. COPY INTO my_table FROM (SELECT key, index, textData, ...

Latest Reply
cnjrules
New Contributor III
  • 0 kudos

Found the info I was looking for on the page below: https://docs.databricks.com/ingestion/file-metadata-column.html

2 More Replies
AT
by New Contributor III
  • 10650 Views
  • 5 replies
  • 4 kudos

Resolved! Databricks Certification Voucher Code not received

I have an exam to take for the Databricks Associate ML certification early this week. I have raised multiple tickets for this previously but didn't receive any reply. I attended the 3-day webinar on Data Engineering as m...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Avinash Tiwari, I hope you are doing great! I have forwarded your query to our Academic team. Soon the problem will be resolved. Please bear with us. Thanks and regards

4 More Replies
de-hru
by New Contributor III
  • 15983 Views
  • 3 replies
  • 1 kudos

Root cause analysis and workaround for "Error: AttributeError: type object 'Retry' has no attribute 'DEFAULT_METHOD_WHITELIST'"

Root cause analysis and workaround for "Error: AttributeError: type object 'Retry' has no attribute 'DEFAULT_METHOD_WHITELIST'". Problem: when using databricks-cli, starting from the 4th of May this error occurs: Er...

Latest Reply
apatel
New Contributor III
  • 1 kudos

For what it's worth, I can confirm the pinning works. I also pinned the requests package. See my details here, as I am using pipelines to handle deployment. I've back-linked this issue on the Databricks website to this item as well. https://github.co...

2 More Replies
karthik_p
by Esteemed Contributor
  • 618 Views
  • 0 replies
  • 1 kudos

How to restrict a Databricks account based on IP/subnet? To avoid security risks related to your employees accessing Databricks environments, we have...

How to restrict a Databricks account based on IP/subnet? To avoid security risks related to your employees accessing Databricks environments, there is a config that was published by Databricks: we can add IPs/subnets to the IP access list. When you enable fe...

Suresh_AWS
by New Contributor III
  • 5364 Views
  • 4 replies
  • 2 kudos

Resolved! Databricks fs ls does not work

I have configured the Databricks CLI using a token generated in Azure (databricks configure --aad-token). Configuration succeeds. When I use the command "databricks fs ls" I run into the error shown below. Error: AttributeError: type object 'Retry' ...

Latest Reply
abos
New Contributor II
  • 2 kudos

It seems the Databricks CLI needs an update. This issue is discussed on Stack Overflow. For the time being, installing an older version of `urllib3` should solve the issue. The attribute was removed in 2.x; the latest 1.x version is 1.26.15. You can ...
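Besides pinning `urllib3<2`, another stopgap (a sketch of an alternative technique, not something the CLI ships) is a small compatibility shim: urllib3 1.26 renamed `Retry.DEFAULT_METHOD_WHITELIST` to `Retry.DEFAULT_ALLOWED_METHODS` and 2.0 dropped the old name, so restoring the alias before the CLI imports it bridges the gap:

```python
from urllib3.util.retry import Retry

# urllib3 2.x removed Retry.DEFAULT_METHOD_WHITELIST (renamed to
# DEFAULT_ALLOWED_METHODS in 1.26); older databricks-cli releases still
# reference the old name. Restore it as an alias if it is missing.
if not hasattr(Retry, "DEFAULT_METHOD_WHITELIST"):
    Retry.DEFAULT_METHOD_WHITELIST = Retry.DEFAULT_ALLOWED_METHODS
```

This would have to run before the CLI code touches `Retry`, so pinning the package remains the simpler fix; upgrading to a CLI release that uses the new attribute name is the real solution.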

3 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 1390 Views
  • 1 replies
  • 4 kudos

Spark 3.4 and Databricks 13 introduce two new types of timestamps for handling time zone information: - TIMESTAMP WITH LOCAL TIME ZONE: This type assum...

Spark 3.4 and Databricks 13 introduce two new types of timestamps for handling time zone information: - TIMESTAMP WITH LOCAL TIME ZONE: This type assumes that the input data is in the session's local time zone and converts it to UTC before processing....
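The two semantics (the excerpt is truncated; the counterpart is presumably TIMESTAMP WITHOUT TIME ZONE, i.e. TIMESTAMP_NTZ) can be illustrated with plain Python datetimes rather than Spark itself. A fixed +02:00 offset stands in for the session's local time zone here:

```python
from datetime import datetime, timedelta, timezone

# Stand-in for the session's local time zone (an arbitrary +02:00).
session_tz = timezone(timedelta(hours=2))

# TIMESTAMP WITH LOCAL TIME ZONE style: the wall-clock input is
# interpreted in the session zone and normalized to UTC for storage.
wall_clock = datetime(2023, 6, 1, 12, 0, tzinfo=session_tz)
stored_utc = wall_clock.astimezone(timezone.utc)  # 2023-06-01 10:00 UTC

# TIMESTAMP WITHOUT TIME ZONE style: the wall-clock value is kept
# as-is, with no zone attached and no conversion applied.
no_tz = datetime(2023, 6, 1, 12, 0)
```

The practical upshot is the same as in the post: the first type gives one absolute instant regardless of reader time zone, while the second preserves the literal wall-clock reading.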

Latest Reply
Anonymous
Not applicable
  • 4 kudos

This is helpful! As we know, timestamps are often the reason business logic gets messed up.


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group