Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

646901
by New Contributor II
  • 1787 Views
  • 2 replies
  • 0 kudos

Cloud storage - enabling object versioning?

I am going to keep this generic across all cloud provider storage options, as it is relevant across the board (GCS, S3 and Blob storage). Nothing is mentioned in the docs as far as I can see. Is there a use case against enabling object versioning in cloud ...
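For readers after the mechanics rather than the policy question: versioning is enabled on the bucket or container itself, not anywhere in Databricks. A minimal sketch for S3 using boto3, with the bucket name as a placeholder:

import boto3

# Minimal sketch: enable object versioning on an S3 bucket.
# "my-datalake-bucket" is a placeholder; credentials come from the environment.
s3 = boto3.client("s3")
s3.put_bucket_versioning(
    Bucket="my-datalake-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)

# Confirm the current status ("Enabled" or "Suspended").
print(s3.get_bucket_versioning(Bucket="my-datalake-bucket").get("Status"))

GCS (object versioning) and Azure Blob Storage (blob versioning) expose equivalent switches through their own SDKs and portals.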

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Matt User​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

1 More Reply
Meghala
by Valued Contributor II
  • 2747 Views
  • 4 replies
  • 0 kudos

I faced some problems while taking the Databricks exam

Hi team, good evening. Today I had a problem while taking the exam. My exam was at 11:30, but due to an audio problem it got rescheduled to 12:45. Then I faced another problem: questions would sometimes appear and sometimes not, so because of this I was not able to ta...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @S Meghala​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

3 More Replies
Larrio
by New Contributor III
  • 7961 Views
  • 6 replies
  • 3 kudos

Autoloader - understanding missing file after schema update.

Hello, concerning Auto Loader (based on https://docs.databricks.com/ingestion/auto-loader/schema.html), so far what I understand is that when it detects a schema update, the stream fails and I have to rerun it to make it work, which is fine. But once I rerun it, ...
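For context, a minimal Auto Loader sketch that tracks the schema in a schemaLocation so a restart after a schema change picks up the new columns; paths, format and the target table name are placeholders, not taken from the post:

# Minimal Auto Loader sketch with schema tracking. Paths, format and the
# target table name are placeholders.
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/autoloader/_schema")
    .option("cloudFiles.schemaEvolutionMode", "addNewColumns")
    .load("/tmp/autoloader/input")
)

# When a new column appears, the stream stops; restarting it resumes from the
# checkpoint with the evolved schema, without re-ingesting already-loaded files.
(
    df.writeStream
    .option("checkpointLocation", "/tmp/autoloader/_checkpoint")
    .trigger(availableNow=True)
    .toTable("bronze_events")
)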

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Lucien Arrio​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...

5 More Replies
Anonymous
by Not applicable
  • 3468 Views
  • 9 replies
  • 3 kudos

Dear Community members, We want to extend our sincere gratitude for attending the Community event - March series on March 31st 2023. Your presence mad...

Dear Community members,We want to extend our sincere gratitude for attending the Community event - March series on March 31st 2023. Your presence made the event a huge success, and we appreciate the time you took to join us. We were thrilled to hear ...

Latest Reply
pvignesh92
Honored Contributor
  • 3 kudos

@Suteja Kanuri​ Hi Suteja. Great initiative. Please plan a common timezone between India and UK/EUR/US so that we can also attend. BTW is there any recorded session that we can go through?

8 More Replies
chanansh
by Contributor
  • 1884 Views
  • 2 replies
  • 0 kudos

Delta table acceleration for group by on key columns using ZORDER does not work

What is the best practice for accelerating queries which look like the following? win = Window.partitionBy('key1','key2').orderBy('timestamp') df.select('timestamp', (F.col('col1') - F.lag('col1').over(win)).alias('col1_diff')) I have tried to use OP...
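For readability, here is the pattern from that snippet as a runnable sketch, assuming a Delta table named events with columns key1, key2, timestamp and col1; the OPTIMIZE/ZORDER step is included only because the post title refers to it:

from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Placeholder Delta table with columns key1, key2, timestamp, col1.
df = spark.table("events")

win = Window.partitionBy("key1", "key2").orderBy("timestamp")
diff = df.select(
    "timestamp",
    (F.col("col1") - F.lag("col1").over(win)).alias("col1_diff"),
)

# Layout tuning mentioned in the title. Note that ZORDER mainly improves data
# skipping on filtered reads; it does not remove the shuffle a full-table
# window function needs, which is consistent with the behaviour reported here.
spark.sql("OPTIMIZE events ZORDER BY (key1, key2)")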

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Hanan Shteingart​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answ...

1 More Reply
Kanna1706
by New Contributor III
  • 3660 Views
  • 3 replies
  • 4 kudos
Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Machireddy Nikitha​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best an...

2 More Replies
kll
by New Contributor III
  • 3439 Views
  • 1 reply
  • 0 kudos

Fatal error: The Python kernel is unresponsive when attempting to query data from AWS Redshift within Jupyter notebook

I am running a Jupyter notebook on a cluster with configuration: 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12), worker type: i3.xlarge (30.5 GB memory, 4 cores), min 2 and max 8 workers. cursor = conn.cursor()   cursor.execute( """ ...
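A common way to avoid exhausting driver memory (a frequent cause of the unresponsive-kernel error) is to read Redshift through Spark's JDBC reader instead of a DB-API cursor, so rows land in a distributed DataFrame; a minimal sketch with the host, credentials, driver class and query as placeholders:

# Placeholder host, credentials, driver class and query. The Redshift JDBC
# driver jar must be installed on the cluster.
jdbc_url = "jdbc:redshift://my-cluster.example.us-east-1.redshift.amazonaws.com:5439/dev"

df = (
    spark.read
    .format("jdbc")
    .option("url", jdbc_url)
    .option("query", "SELECT col_a, col_b FROM my_schema.my_table")
    .option("user", "my_user")
    .option("password", dbutils.secrets.get("my_scope", "redshift_password"))
    .option("driver", "com.amazon.redshift.jdbc42.Driver")
    .load()
)

display(df.limit(10))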

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you please confirm your cluster's resource usage while running this job? You can monitor the performance here with different metrics: https://docs.databricks.com/clusters/clusters-manage.html#monitor-performance. Also, please tag @Debayan​ with...

MaheshDR
by New Contributor II
  • 9435 Views
  • 6 replies
  • 1 kudos

Open firewall to Azure Databricks workspace from AWS RDS machine/EC2 machine

Hi all, as part of our solution approach, we need to connect to one of our AWS RDS Oracle databases from an Azure Databricks notebook. We need your help to understand which Azure Databricks IP range to whitelist on the AWS RDS security gro...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Mahesh D​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

5 More Replies
jakubk
by Contributor
  • 11764 Views
  • 13 replies
  • 9 kudos

dbt workflow job limitations - naming the target? where do docs go?

I'm on Unity Catalog. I'm trying to do a dbt run on a project that works locally, but the Databricks dbt workflow task seems to be ignoring the project.yml settings for schemas and catalogs, as well as those defined in the config block of individual model...

Latest Reply
Anonymous
Not applicable
  • 9 kudos

Hi @Jakub K​ I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest provid...

12 More Replies
SS2
by Valued Contributor
  • 2202 Views
  • 2 replies
  • 0 kudos

How can we read data from ADLS Gen2 using a bash (%sh) command (without mounting)?

Hi @Ananth Arunachalam/Team, can we read a file from ADLS Gen2 using a shell script (%%bash or %%sh) without mounting? Please let me know. Thank you.

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@S S​ You can access data in ADLS Gen2 in multiple ways; please check the article below. The easiest way is the storage account access key method: https://learn.microsoft.com/en-us/azure/databricks/storage/azure-storage
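A minimal sketch of the access key method described in the linked article, with account, container and secret names as placeholders; note that this configures the Spark/Hadoop filesystem layer, which plain %sh commands bypass:

# Placeholder account, container and secret names.
storage_account = "mystorageaccount"
access_key = dbutils.secrets.get("my_scope", "adls-access-key")

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    access_key,
)

path = f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/raw/data.csv"
df = spark.read.option("header", "true").csv(path)

# dbutils.fs honours the same configuration; %sh does not, because shell
# commands bypass the Spark/Hadoop filesystem layer.
display(dbutils.fs.ls(f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/raw/"))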

1 More Reply
blockee
by New Contributor II
  • 6862 Views
  • 3 replies
  • 0 kudos

DE 4.1 - DLT UI Walkthrough Error in Classroom Setup

Trying to follow along with the DLT videos in the academy. I get an error when running the setup script. Error trace below. It stems from running Classroom-Setup-04.1: DA = DBAcademyHelper(course_config=course_config, lesson_config=...

Latest Reply
blockee
New Contributor II
  • 0 kudos

I tried with Py4J versions 0.10.9.5, .3, and .1. None of those versions worked. I also tried upgrading the runtime to 13.0 and 12.1 and saw the same issue. The 13.0 runtime upgraded Py4J to 0.10.9.7 and that didn't resolve the issue. The error stayed...

2 More Replies
adrin
by New Contributor III
  • 40610 Views
  • 9 replies
  • 6 kudos

Resolved! How to access the result of a %sql cell from python

I see the way to move from Python to SQL is to create a temp view and then access that DataFrame from SQL in a SQL cell. Now the question is, how can I have a %sql cell with a select statement in it, and assign the result of that statement to ...

Latest Reply
dogwoodlx
New Contributor II
  • 6 kudos

Results from an SQL cell are available as a Python DataFrame. The Python DataFrame name is _sqldf. To save the DataFrame, run this code in a Python cell: df = _sqldf. Keep in mind that the value in _sqldf is held in memory and will be replaced with the m...
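A short illustration of the behaviour described above, with a placeholder table name:

# Previous cell (SQL), placeholder table name:
#   %sql
#   SELECT id, name FROM my_table LIMIT 10

# Next cell (Python): the implicit result of the last %sql cell is _sqldf.
# Copy it to a named variable before running another SQL cell, because each
# new %sql result overwrites _sqldf.
df = _sqldf
df.show()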

8 More Replies
shamly
by New Contributor III
  • 4832 Views
  • 4 replies
  • 4 kudos

Urgent - Use a Python variable in a shell command in a Databricks notebook

I am trying to read a CSV and do an activity on an Azure storage account using a Databricks shell script. I want to add this shell script into my larger Python code for other sources as well. I have created widgets for the file path in Python. I have created...

Latest Reply
SS2
Valued Contributor
  • 4 kudos

You can mount the storage account, set an environment-level variable, and then perform the operation you want.
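A minimal sketch of the environment variable half of that suggestion, assuming a widget named file_path (placeholder) was created earlier in the notebook:

import os

# Python cell: expose the widget value to shell cells via an environment
# variable. "file_path" is a placeholder widget name.
os.environ["FILE_PATH"] = dbutils.widgets.get("file_path")

# Shell cell:
#   %sh
#   echo "Processing: $FILE_PATH"
#   head -n 5 "$FILE_PATH"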

3 More Replies
KVNARK
by Honored Contributor II
  • 3563 Views
  • 9 replies
  • 5 kudos

It would be great if Databricks starts increasing the number of rewards, as the number of users in the community is increasing. When we want to redeem somethi...

It would be great if Databricks starts increasing the number of rewards, as the number of users in the community is increasing. When we want to redeem something, the limited goodies available in the community rewards portal are out of stock. So it's better to incr...

Latest Reply
pvignesh92
Honored Contributor
  • 5 kudos

@Kaniz Fatma​ @Vidula Khanna​ Hi. I just see the below rewards available to redeem. Is this different based on the location?

8 More Replies
