Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

sheree
by New Contributor III
  • 2529 Views
  • 5 replies
  • 2 kudos

Resolved! I can't access my account.

I can't access my account. This account was created today (not the Community account; after the 14-day trial it becomes chargeable). When I try to access my account it gives me "Invalid email address or password. Note: Emails/usernames are case-sensitive." I tried to reset ...

Latest Reply
sheree
New Contributor III
  • 2 kudos

I got a reset link from the community. The problem was actually with my username: it did not recognize a character in my username, which was my email ID.

4 More Replies
oussamak
by New Contributor II
  • 2830 Views
  • 2 replies
  • 3 kudos

How to install JAR libraries from ADLS? I'm having an error

I mounted the ADLS to my Azure Databricks resource and I keep getting this error when I try to install a JAR from a container: Library installation attempted on the driver node of cluster 0331-121709-buk0nvsq and failed. Please refer to the followi...

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @Oussama KIASSI​, the error message says: "Failure to initialize configuration. Invalid configuration value detected for fs.azure.account.key." You can't use the storage account access key to access data using the abfss protocol. You need to provide ...
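The reply is cut off at the point where it explains what to provide instead of the account key. One commonly documented alternative for abfss access is OAuth with an Azure service principal; the sketch below is only illustrative, and the storage account, secret scope, application (client) ID, and tenant ID are all placeholders.

# Hedged sketch: authenticate to ADLS Gen2 over abfss with a service principal
# instead of the storage account access key. Angle-bracket values are placeholders.
service_credential = dbutils.secrets.get(scope="<secret-scope>", key="<sp-client-secret-key>")

spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net", "<application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net", service_credential)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")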

1 More Replies
kenldk
by New Contributor III
  • 3151 Views
  • 7 replies
  • 4 kudos

Resolved! When will the bills from Databricks arrive?

I am using Databricks for the first time, and after 3 months I haven't seen a single bill from Databricks. However, the accumulated usage has reached $180. Currently my workspace status is still running. Do I need to terminate my workspace to get billed...

Latest Reply
Kaniz_Fatma
Community Manager
  • 4 kudos

Hi @Ken Lei​, Just a friendly follow-up. Do you still need help? Please let us know.

6 More Replies
rk077y
by New Contributor
  • 706 Views
  • 2 replies
  • 0 kudos

Can anyone give me the code based on the attached commands? The source is a JSON file and the tables are in tabs in Excel

I want the code based on the attached commands file. The source file is the attached JSON. The tables are in tabs in an Excel sheet. Kindly give me the code for 3-5 tables for my understanding.

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @santhosh madduri​ , Just a friendly follow-up. Do you still need help?

1 More Replies
Taha_Hussain
by Valued Contributor II
  • 975 Views
  • 1 reply
  • 1 kudos

Databricks Office Hours Register for Office Hours to participate in a live Q&A session with Databricks experts! Our next events are scheduled for ...

Databricks Office Hours: Register for Office Hours to participate in a live Q&A session with Databricks experts! Our next events are scheduled for June 8th & June 22nd from 8:00 am - 9:00 am PT. This is your opportunity to connect directly with our experts...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Fantastic opportunity for the Community! Thank you @Taha Hussain​ for sharing this excellent post!

chandan_a_v
by Valued Contributor
  • 13458 Views
  • 7 replies
  • 6 kudos

Resolved! Spark Driver Out of Memory Issue

Hi, I am executing a simple job in Databricks for which I am getting the error below. I increased the driver size but still faced the same issue. Spark config: from pyspark.sql import SparkSession; spark_session = SparkSession.builder.appName("Demand Forecasting...
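The Spark config in the question is truncated. As a rough illustration only, the sketch below shows the driver-side settings that usually matter for this kind of out-of-memory error; on Databricks the preferred fix is a larger driver node type or setting these values in the cluster's Spark config rather than in the notebook.

from pyspark.sql import SparkSession

# Hedged sketch, not the poster's exact configuration.
spark_session = (
    SparkSession.builder.appName("Demand Forecasting")
    # Limit how much data collect()/toPandas() may pull back to the driver.
    .config("spark.driver.maxResultSize", "8g")
    # On self-managed Spark this raises the driver heap; on Databricks it is usually
    # set via the cluster configuration or by choosing a bigger driver node.
    .config("spark.driver.memory", "16g")
    .getOrCreate()
)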

Latest Reply
Kaniz_Fatma
Community Manager
  • 6 kudos

Hi @Chandan Angadi​, just a friendly follow-up. Do you still need help, or did @Hubert Dudek​'s and @Werner Stinckens​'s responses help you find the solution? Please let us know.

6 More Replies
William_Scardua
by Valued Contributor
  • 1966 Views
  • 1 reply
  • 2 kudos

Resolved! Best way to encrypt PII data

Hi guys, I have around 600 GB per load. In your opinion, what is the best way to encrypt PII data in terms of performance? (lib, cluster type, etc.) Thank you, William

Latest Reply
Prabakar
Esteemed Contributor III
  • 2 kudos

Hello @William Scardua​, please check if this blog helps you: https://databricks.com/blog/2020/11/20/enforcing-column-level-encryption-and-avoiding-data-duplication-with-pii.html
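The linked blog demonstrates column-level encryption with Fernet wrapped in a UDF. A minimal sketch of that pattern follows; the column name and inline key are placeholders, and in a real job the key would come from a secret scope rather than be generated inline.

from cryptography.fernet import Fernet
from pyspark.sql.functions import udf, col
from pyspark.sql.types import StringType

# Hedged sketch of Fernet-based column-level encryption of a PII column.
encryption_key = Fernet.generate_key()

def encrypt_value(clear_text, key=encryption_key):
    if clear_text is None:
        return None
    return Fernet(key).encrypt(clear_text.encode()).decode()

encrypt_udf = udf(encrypt_value, StringType())

# Tiny example DataFrame; 'email' stands in for any PII column.
df = spark.createDataFrame([("alice@example.com",), ("bob@example.com",)], ["email"])
df_encrypted = df.withColumn("email", encrypt_udf(col("email")))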

Devarsh
by Contributor
  • 7700 Views
  • 3 replies
  • 7 kudos

Resolved! Getting the error 'No such file or directory', when trying to access the json file

I am trying to write to my Google Sheet through Databricks, but when it comes to reading the JSON file containing the credentials, I am getting the error that no such file or directory exists. import gspread; gc = gspread.service_account(filename='...

Latest Reply
Noopur_Nigam
Valued Contributor II
  • 7 kudos

Hi @Devarsh Shah​, the issue is not with the JSON file but with the location you are specifying while reading. As suggested by @Werner Stinckens​, please start using the Spark API to read the JSON file as below: spark.read.format("json").load("testjson"). Please check ...
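The usual cause of this error on Databricks is that gspread is plain Python and reads from the driver's local filesystem, so a file stored in DBFS has to be addressed through the /dbfs FUSE mount, while spark.read takes DBFS paths directly. A short sketch with placeholder paths:

import gspread

# gspread reads from the driver's local filesystem, so prefix DBFS paths with /dbfs.
gc = gspread.service_account(filename="/dbfs/FileStore/credentials/service_account.json")

# Spark, as suggested in the reply, takes DBFS/cloud paths directly.
df = spark.read.format("json").load("dbfs:/FileStore/data/testjson")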

2 More Replies
palzor
by New Contributor III
  • 8786 Views
  • 5 replies
  • 4 kudos

Getting error when using CDC in delta live table

Hi, I am trying to use CDC for a Delta Live Table, and when I run the pipeline a second time I get an error: org.apache.spark.sql.streaming.StreamingQueryException: Query tbl_cdc [id = ***-xx-xx-bf7e-6cb8b0deb690, runId = ***-xxxx-4031-ba74-b4b22be05...

Latest Reply
jose_gonzalez
Moderator
  • 4 kudos

Hi @Palzor Lama​, A streaming live table can only process append queries; that is, queries where new rows are inserted into the source table. Processing updates from source tables, for example, merges and deletes, is not supported. To process updates,...
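For CDC sources, Delta Live Tables provides APPLY CHANGES instead of a plain streaming live table. A hedged Python sketch follows; the source table, key, and sequencing columns are placeholders, and the exact helper name (create_streaming_table versus the older create_target_table) has varied across DLT releases.

import dlt
from pyspark.sql.functions import col, expr

dlt.create_streaming_table("tbl_cdc")

dlt.apply_changes(
    target="tbl_cdc",                    # table that receives the merged changes
    source="cdc_events",                 # append-only stream of change records
    keys=["id"],                         # key used to match rows
    sequence_by=col("event_timestamp"),  # ordering column for late/out-of-order events
    apply_as_deletes=expr("operation = 'DELETE'"),
    except_column_list=["operation", "event_timestamp"],
)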

4 More Replies
JeromeB974
by New Contributor II
  • 6490 Views
  • 8 replies
  • 6 kudos

Can we use spark-xml with Delta Live Tables?

Hi, is there a way to use spark-xml with Delta Live Tables (Azure Databricks)? I've tried something like this without any success for the moment: CREATE LIVE TABLE df17 USING com.databricks.spark.xml AS SELECT * FROM cloud_files("/mnt/dev/bronze/xml/s432799...

Latest Reply
Zachary_Higgins
Contributor
  • 6 kudos

This is a tough one since the only magic command available is %pip, but spark-xml is a maven package. The only way I found to do this was to install the spark-xml jar from the maven repo using the databricks-cli. You can reference the cluster ID usin...
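Once the com.databricks:spark-xml Maven package is installed on the pipeline's cluster (for example via the CLI, as the reply describes), a Python DLT table can read the XML with a batch read; the cloud_files() attempt likely failed because Auto Loader did not support the XML format at the time. The path and rowTag below are placeholders.

import dlt

@dlt.table(name="df17")
def df17():
    # Assumes the spark-xml jar is already attached to the pipeline cluster.
    return (
        spark.read.format("com.databricks.spark.xml")
        .option("rowTag", "row")
        .load("/mnt/dev/bronze/xml/")
    )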

7 More Replies
thaipham
by New Contributor III
  • 1804 Views
  • 3 replies
  • 4 kudos

Resolved! How would I export the latest revision of a notebook?

I've been trying to export some notebooks from my Databricks workspace to my laptop. I can't use Git Repos because the company restricted access to external services from the control plane. However, it looks to me that I always exported the previous re...
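Without Repos, notebooks can still be pulled down with the Workspace API's export endpoint (the databricks CLI's workspace export command wraps the same call). The sketch below is illustrative only; host, token, and notebook path are placeholders.

import requests

host = "https://<your-workspace>.azuredatabricks.net"
token = "<personal-access-token>"

# Export the current (latest saved) revision of a notebook in SOURCE format.
resp = requests.get(
    f"{host}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {token}"},
    params={
        "path": "/Users/<me@example.com>/my_notebook",
        "format": "SOURCE",
        "direct_download": "true",
    },
)
resp.raise_for_status()

with open("my_notebook.py", "wb") as f:
    f.write(resp.content)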

Latest Reply
-werners-
Esteemed Contributor III
  • 4 kudos

Too bad you are not allowed to use Repos; it can be a life saver. Can you mark your answer as the best answer so the question is marked as solved?

2 More Replies
Soma
by Valued Contributor
  • 3385 Views
  • 2 replies
  • 2 kudos

Resolved! Spark Failure Error Unable to download spark docker Image

Cluster terminated. Reason: Spark Image Download Failure  "reason": { "code": "SPARK_IMAGE_DOWNLOAD_FAILURE", "type": "SERVICE_FAULT", "parameters": { "instance_id": "6565aa39b0ae4fe69c7fe6f313e3ca2a", "databricks_error_message": "Failed to set up th...

Latest Reply
Kaniz_Fatma
Community Manager
  • 2 kudos

Hi @somanath Sankaran​, have you enabled container services on your cluster? To use custom containers on your clusters, a workspace administrator must enable Databricks Container Services as follows: Go to the admin console. Click the Workspace Settings...

1 More Replies
Ruby8376
by Valued Contributor
  • 1948 Views
  • 2 replies
  • 0 kudos

Primary/Foreign key constraints on Delta tables?

Hi All! I am using Databricks in a data migration project. We need to transform the data before loading it to Salesforce. Can we define primary key/foreign key constraints on Databricks Delta tables?
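For context: on Databricks, primary and foreign key constraints on Delta tables are informational only (declared for modelling and BI tools but not enforced) and require Unity Catalog, while NOT NULL and CHECK constraints are enforced. A hedged sketch with placeholder catalog/schema/table names:

# Informational PK/FK constraints on Unity Catalog Delta tables (not enforced).
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.customers (
        customer_id BIGINT NOT NULL CONSTRAINT customers_pk PRIMARY KEY,
        name STRING
    )
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.orders (
        order_id BIGINT NOT NULL CONSTRAINT orders_pk PRIMARY KEY,
        customer_id BIGINT,
        CONSTRAINT orders_customers_fk FOREIGN KEY (customer_id)
            REFERENCES main.sales.customers (customer_id)
    )
""")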

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Ruby Rubi​, following up: did you get a chance to check @Werner Stinckens​'s previous comments, or do you need any further help on this?

1 More Replies
laurencewells
by New Contributor III
  • 3319 Views
  • 4 replies
  • 1 kudos

Resolved! Log4J Custom Filter Not Working

Hi All, hoping you can help. I am looking to set up a custom logging process that captures application ETL logs and streaming logs. I have set up multiple custom logging appenders using the guide here: https://kb.databricks.com/clusters/overwrite-log4...
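A common companion to custom log4j appenders is emitting application log lines through the cluster's log4j from the notebook so the custom filter/appender can route them. This is only a hedged sketch: the logger name is a placeholder that must match the appender configuration, and spark._jvm access is not available on all cluster access modes.

# Hedged sketch: log through the driver's log4j so custom appenders can pick it up.
log4j = spark._jvm.org.apache.log4j
etl_logger = log4j.LogManager.getLogger("myEtlLogger")  # placeholder logger name

etl_logger.info("ETL step started")
etl_logger.warn("Row count lower than expected")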

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hey there @Laurence Wells​, hope you are doing great. Does @Kaniz Fatma​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Thanks!

3 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group