Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

robbie1
by New Contributor
  • 4986 Views
  • 2 replies
  • 2 kudos

Can't login anymore: Invalid email address or password

Since last Friday I cannot access Databricks Community anymore, which is quite annoying since my Bachelor's dissertation is due in a couple of weeks. I always get the message: "Invalid email address or password Note: Emails/usernames are case-sensiti...

Latest Reply
nnaincy
New Contributor III
  • 2 kudos

Hi Team, my Databricks Community Edition credentials are locked. I am working on a very important project; please help me resolve the issue, and please make sure it does not get locked again in future. Email used for login @Retired_mod @Sujitha I have sent an email to  commu...

1 More Replies
aicd_de
by New Contributor III
  • 3117 Views
  • 2 replies
  • 2 kudos

Unity Catalog - Writing PNG files to the cluster and then using dbutils.fs.cp to send them to Azure ADLS2

Hi all, looking to get some help. We are on Unity Catalog in Azure. We have a requirement to use Python to write out several PNG files via Matplotlib and then drop them into an ADLS2 container. With Unity Catalog, we can easily use dbutils.fs.cp or fs....

Latest Reply
aicd_de
New Contributor III
  • 2 kudos

Hmm, I read something different - someone else had this error because they used a shared cluster; apparently it does not happen on a single-user cluster. All those settings are already done and I am a full admin.

1 More Replies
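For the requirement in the thread above, a common pattern is to save the figures to cluster-local disk first and copy them out afterwards. Below is a minimal sketch, not the poster's actual code: the ADLS2 container/account names and the /tmp path are placeholder assumptions, and since dbutils only exists on Databricks, the copy step is shown as a comment.

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # headless backend; cluster nodes have no display
import matplotlib.pyplot as plt

def save_figures(local_dir=None, n=3):
    """Write n example PNGs to cluster-local disk and return their paths."""
    local_dir = local_dir or tempfile.mkdtemp()
    paths = []
    for i in range(n):
        fig, ax = plt.subplots()
        ax.plot([0, 1], [0, i])
        path = os.path.join(local_dir, f"plot_{i}.png")
        fig.savefig(path)
        plt.close(fig)  # free figure memory inside the loop
        paths.append(path)
    return paths

# On Databricks, copy each local file to ADLS2 afterwards (placeholder URI):
# for p in save_figures("/tmp/figures"):
#     dbutils.fs.cp(
#         f"file:{p}",
#         f"abfss://<container>@<account>.dfs.core.windows.net/figures/{os.path.basename(p)}",
#     )
```

As the reply above notes, behavior can differ by access mode: shared clusters restrict local-disk paths and dbutils file operations more than single-user clusters do.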
njglen
by New Contributor III
  • 3749 Views
  • 4 replies
  • 0 kudos

Resolved! How do you enable verbose logging from within Workspace Settings using Terraform?

I've searched in the databricks provider and online and couldn't find out if it is possible to set the `Verbose Audit Logs` to `enabled` using Terraform. Can anybody clarify if it is possible?

Latest Reply
qiaochu
New Contributor II
  • 0 kudos

The switch you're looking for is enableVerboseAuditLogs in the databricks_workspace_conf resource:

    resource "databricks_workspace_conf" "this" {
      custom_config = {
        "enableIpAccessLists"    = true
        "enableVerboseAuditLogs" = true
      }
    }

3 More Replies
ChingizK
by New Contributor III
  • 2209 Views
  • 0 replies
  • 0 kudos

Use Python code from a remote Git repository

I'm trying to create a task where the source is a Python script located in a remote GitLab repo. I'm following the instructions HERE and this is how I have the task set up: However, no matter what path I specify, all I get is the error below: Cannot read ...

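For a task sourced from a remote Git repo, the Jobs API payload needs a git_source block alongside the task definition. Here is a hedged sketch of such a payload builder; the repo URL, branch, task key, and file path are all placeholders, and note that python_file is commonly expected as a path relative to the repo root (a leading slash or workspace-style path is a frequent culprit in "cannot read" errors).

```python
def git_python_task(repo_url, branch, python_file):
    """Build an illustrative Jobs API 2.1 payload for a Git-sourced Python task.

    `python_file` should be relative to the repo root (no leading slash);
    an absolute or workspace-style path often triggers "cannot read" errors.
    """
    return {
        "git_source": {
            "git_url": repo_url,
            "git_provider": "gitLab",
            "git_branch": branch,
        },
        "tasks": [
            {
                "task_key": "run_script",
                "spark_python_task": {
                    "python_file": python_file,
                    "source": "GIT",
                },
            }
        ],
    }
```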
Ravikumashi
by Contributor
  • 2419 Views
  • 3 replies
  • 0 kudos

Resolved! Issue with Logging Spark Events to LogAnalytics after Upgrading to Databricks 11.3 LTS

We have recently been in the process of upgrading our Databricks clusters to version 11.3 LTS. As part of this upgrade, we have been working on integrating the logging of Spark events to LogAnalytics using the repository available at https://github.c...

Latest Reply
swethaNandan
Databricks Employee
  • 0 kudos

Hi Ravikumashi, can you please raise a ticket with us so that we can look deeper into the issue?

2 More Replies
Skr7
by New Contributor II
  • 2919 Views
  • 0 replies
  • 0 kudos

Scheduled job output export

Hi, I have a Databricks job that produces a dashboard after each run. I'm able to download the dashboard as HTML from the view job runs page, but I want to automate the process, so I tried using the Databricks API, but it says {"error_code":"INVALID_...

Data Engineering
data engineering
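For automating the dashboard download described above, the Jobs API exposes a runs/export endpoint that returns a run's views rendered as HTML. A minimal sketch using only the standard library; the host, token, and run ID are placeholders, and the response shape shown in the comment is an assumption.

```python
import json
import urllib.parse
import urllib.request

def build_export_url(host, run_id):
    # GET /api/2.0/jobs/runs/export returns a job run's views rendered
    # as HTML; views_to_export=DASHBOARDS keeps only dashboard views.
    query = urllib.parse.urlencode(
        {"run_id": run_id, "views_to_export": "DASHBOARDS"}
    )
    return f"https://{host}/api/2.0/jobs/runs/export?{query}"

def export_dashboard_html(host, token, run_id):
    req = urllib.request.Request(
        build_export_url(host, run_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # Assumed response shape: {"views": [{"content": "<html>...", ...}]}
    return [view["content"] for view in payload["views"]]
```

The INVALID_ error in the post is truncated, but a malformed parameter set on this endpoint is one plausible cause; checking the exact error message against the endpoint's required parameters is the first step.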
Manjula_Ganesap
by Contributor
  • 5247 Views
  • 2 replies
  • 1 kudos

Resolved! Delta Live Table pipeline failure - Table missing

Hi all, I set up a DLT pipeline to create 58 bronze tables and a subsequent DLT live table that joins the 58 bronze tables created in the first step. The pipeline runs successfully most times. My issue is that the pipeline fails once every 3-4 runs, say...

Latest Reply
Manjula_Ganesap
Contributor
  • 1 kudos

@jose_gonzalez @Retired_mod - I missed updating the group on the fix. I reached out to Databricks to understand, and it was identified that the threads call I was making was causing the issue. After I removed it, I don't see it happening.

1 More Replies
Manjula_Ganesap
by Contributor
  • 2123 Views
  • 2 replies
  • 1 kudos

Delta Live Table (DLT) Initialization fails frequently

With no change in code, I've noticed that my DLT initialization fails and then an automatic rerun succeeds. Can someone help me understand this behavior? Thank you.

Latest Reply
Manjula_Ganesap
Contributor
  • 1 kudos

@jose_gonzalez - I missed updating the group on the fix. I reached out to Databricks to understand, and it was identified that the threads call I was making was causing the issue. After I removed it, I don't see it happening.

1 More Replies
Kit
by New Contributor III
  • 5694 Views
  • 2 replies
  • 1 kudos

How to use checkpoint with change data feed

I have a scheduled job (running in continuous mode) with the following code:

```
(
    spark
    .readStream
    .option("checkpointLocation", databricks_checkpoint_location)
    .option("readChangeFeed", "true")
    .option("startingVersion", VERSION + 1)
    ...
```

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Kit Yam Tse, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers y...

1 More Replies
editter
by New Contributor II
  • 2572 Views
  • 1 replies
  • 1 kudos

Unable to open a file in dbfs. Trying to move files from Google Bucket to Azure Blob Storage

Background: I am attempting to download the Google Cloud SDK on Databricks. The end goal is to be able to use the SDK to transfer files from a Google Cloud bucket to Azure Blob Storage using Databricks. (If you have any other ideas for this transfer p...

Data Engineering
dbfs
Google Cloud SDK
pyspark
tarfile
Latest Reply
editter
New Contributor II
  • 1 kudos

Thank you for the response! Two questions: 1. How would you create a cluster with the custom requirements for the Google Cloud SDK? Is that still possible for a Unity Catalog-enabled cluster with Shared Access Mode? 2. Is a script action the same as a cl...

AMadan
by New Contributor III
  • 8852 Views
  • 1 replies
  • 1 kudos

Date difference in Months

Hi Team, I am working on a migration from SQL Server to the Databricks environment. I encountered a challenge where Databricks and SQL Server give different results for the date-difference function. Can you please help? --SQL SERVER SELECT DATEDIFF(MONTH , '2007-0...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

While I was pretty sure it has to do with T-SQL not following ANSI standards, I could not actually tell you what exactly the difference is. So I asked ChatGPT, and here we go: the difference between DATEDIFF(month, date1, date2) in T-SQL and ANSI SQL ...

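The behavioral gap discussed in this thread can be reproduced outside SQL: T-SQL's DATEDIFF(MONTH, ...) counts calendar-month boundaries crossed and ignores the day-of-month entirely, while Spark's months_between returns a fractional month count that factors in the days. A small sketch of the T-SQL rule:

```python
from datetime import date

def tsql_datediff_month(d1, d2):
    # T-SQL DATEDIFF(MONTH, d1, d2) counts calendar-month boundaries
    # crossed between the two dates; the day-of-month is ignored.
    return (d2.year - d1.year) * 12 + (d2.month - d1.month)

# One day apart, yet a month boundary is crossed, so T-SQL reports 1.
# Spark's months_between('2007-06-01', '2007-05-31') instead returns a
# fractional value of roughly 0.03, because it weighs the days as well.
print(tsql_datediff_month(date(2007, 5, 31), date(2007, 6, 1)))  # 1
```

This is why a migrated query can disagree with SQL Server for dates near month boundaries even though both functions are named "month difference".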
alvaro_databric
by New Contributor III
  • 2600 Views
  • 0 replies
  • 0 kudos

Azure Databricks Spot Cost

Hi all, I started using Azure Spot VMs by switching on the spot option when creating a cluster. However, in the Azure billing dashboard, after some months of using spot instances, I only see the OnDemand purchase type. Does anyone have a guess at what could be happ...

THIAM_HUATTAN
by Valued Contributor
  • 48478 Views
  • 8 replies
  • 2 kudos

Skip number of rows when reading CSV files

staticDataFrame = spark.read.format("csv") \
    .option("header", "true") \
    .option("inferSchema", "true") \
    .load("/FileStore/tables/Consumption_2019/*.csv")

Given the above, I need an option to skip, say, the first 4 lines of each CSV file. How do I do that?

Latest Reply
Michael_Appiah
Contributor
  • 2 kudos

The option... .option("skipRows", <number of rows to skip>) ...works for me as well. However, I am surprised that the official Spark doc does not list it as a CSV Data Source Option: https://spark.apache.org/docs/latest/sql-data-sources-csv.html#data...

7 More Replies
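As a pure-Python illustration of what the skipRows option suggested above does (this is not Spark itself, just the same line-skipping behavior): drop the first N physical lines, then parse the remainder with the next line as the header row.

```python
import csv
import io
from itertools import islice

RAW = (
    "junk line 1\n"
    "junk line 2\n"
    "junk line 3\n"
    "junk line 4\n"
    "header_a,header_b\n"
    "1,2\n"
    "3,4\n"
)

def read_csv_skipping(text, skip=4):
    # Drop the first `skip` physical lines, then parse what remains,
    # treating the next line as the header (like header=true after skipRows).
    remaining = islice(io.StringIO(text), skip, None)
    return list(csv.DictReader(remaining))

rows = read_csv_skipping(RAW)
```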
rsamant07
by New Contributor III
  • 1232 Views
  • 0 replies
  • 0 kudos

TLS Mutual Authentication for Databricks API

Hi, we are exploring the use of the Databricks Statement Execution API for sharing data with different consumer applications through an API. However, we have a security requirement to configure TLS mutual authentication to limit the consumer application t...

