Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

kingmobil
by New Contributor
  • 314 Views
  • 0 replies
  • 0 kudos

Summit

Really enjoying my time at the summit!

Orthoscope
by New Contributor
  • 396 Views
  • 0 replies
  • 0 kudos

Epic Keynote

Wow, that was cool; so many of my workarounds have materialized into solutions!

abetogi
by New Contributor III
  • 411 Views
  • 0 replies
  • 0 kudos

AI

At Chevron we actively use Databricks to provide answers to business users. It was extremely interesting to see the LakeHouseIQ initiatives, as they can expedite how fast our users receive their answers/reports. Is there any documentation that I...

IS8
by New Contributor
  • 248 Views
  • 0 replies
  • 0 kudos

Summit

Great conference. Excited about the upcoming LakehouseIQ feature.

DiegoG
by New Contributor
  • 348 Views
  • 0 replies
  • 0 kudos

Lots of new learnings

The new way to optimize my notebook is something that I learned, and I will put it into practice soon.

tariq
by New Contributor III
  • 6722 Views
  • 2 replies
  • 5 kudos

Databricks reading from a zip file

I have mounted an Azure Blob Storage container in the Azure Databricks workspace filestore. The mounted container has zip files with CSV files in them. What is the best way to read the zipped files and write them into a Delta table? @sasikumar sagabala​

Latest Reply
Rishitha
New Contributor III
  • 5 kudos

Hello @Debayan, I recently came across a similar scenario. Is there a way to do this via Auto Loader? We have zip folders added daily to our AWS S3 bucket, and we want to be able to unzip and load the CSV files continuously (autoloading).

1 More Replies
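
For anyone landing on this thread, here is a minimal sketch of one common approach: unzip the archives on the driver with Python's zipfile module, then read the extracted CSVs with Spark and append them to a Delta table. It assumes a Databricks notebook (where spark and dbutils are available); the mount path, scratch directory, and target table name are placeholders, not values from the thread.

```python
import zipfile

src_dir = "dbfs:/mnt/raw-zips"        # mounted container (placeholder path)
extract_dir = "/dbfs/tmp/unzipped"    # scratch location, addressed via the /dbfs FUSE mount

# 1. Unzip on the driver -- fine for modestly sized archives.
for f in dbutils.fs.ls(src_dir):
    if f.name.endswith(".zip"):
        local_path = f.path.replace("dbfs:", "/dbfs", 1)   # FUSE path the zipfile module can open
        with zipfile.ZipFile(local_path, "r") as zf:
            zf.extractall(extract_dir)

# 2. Read the extracted CSVs with Spark and append to a Delta table.
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("dbfs:/tmp/unzipped"))

df.write.format("delta").mode("append").saveAsTable("bronze.zipped_csvs")  # placeholder table name
```

On the Auto Loader follow-up: as far as I know, Auto Loader does not unpack .zip archives itself, so the S3 scenario above would still need an extraction step like this (or a binaryFile stream plus an unzip step) before autoloading the CSVs.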
Zhudocode
by New Contributor II
  • 1256 Views
  • 2 replies
  • 2 kudos

Resolved! What's the purpose of data governance?

Forgive me for a nooby question, but what is the point of data governance if everyone is working at the same company? Is it for insider purposes?

Latest Reply
Datajoe
Contributor
  • 2 kudos

Hey Zhudocode, I actually answered this in person, but data governance is fundamentally about the appropriate, efficient, and effective use of data. Appropriate use has to do with ethical AI, use of personal information, and policy around confidential a...

1 More Replies
danieleads
by New Contributor
  • 1063 Views
  • 0 replies
  • 0 kudos

Parameters as query strings in the URL

Working with a query that was imported to Databricks from Redash. There are a couple of fields set as parameters on this query. In Redash, it was possible to pre-fill the values for those parameters in the URL. Here's an example showing a valid pre-fill f...

[Screenshot attachment: danieleads_0-1687972165051.png]
Kishore76
by New Contributor
  • 759 Views
  • 1 reply
  • 0 kudos

Does Databricks support Iceberg tables?

Does Databricks support Iceberg tables?

Latest Reply
Fangrou
New Contributor II
  • 0 kudos

It supports Iceberg tables, but with some requirements and limitations. You can find details in this doc: Incrementally clone Parquet and Iceberg tables to Delta Lake

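For reference, a hedged sketch of the CLONE statement that the linked doc describes, run from a notebook; the target table name and the Iceberg table path are placeholders.

```python
# Sketch based on the "Incrementally clone Parquet and Iceberg tables to Delta Lake" doc.
# Table name and source path are placeholders; check the doc for the current
# requirements and limitations (e.g., supported Iceberg versions).
spark.sql("""
    CREATE OR REPLACE TABLE main.bronze.events_delta
    CLONE iceberg.`abfss://lake@storageacct.dfs.core.windows.net/warehouse/events`
""")
```

Re-running the same statement later should incrementally sync new Iceberg data into the Delta clone, per that doc.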
amitca71
by Contributor II
  • 1906 Views
  • 2 replies
  • 1 kudos

Performance tool for Databricks SQL

Hi, I'm looking for a performance test tool. I saw that there was a post about JMeter: https://stackoverflow.com/questions/66913893/how-can-i-connect-jmeter-with-databricks-spark-cluster#comment118293766_66915965. However, the JDBC parameters are requesting...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Amit Cahanovich​, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.

1 More Replies
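
If the JMeter JDBC setup proves awkward, one lightweight alternative (not something settled in this thread) is to time queries directly against a SQL warehouse with the databricks-sql-connector Python package. A minimal sketch; the hostname, HTTP path, token, and test query are placeholders.

```python
# pip install databricks-sql-connector
import time
from statistics import mean
from databricks import sql

QUERY = "SELECT count(*) FROM samples.nyctaxi.trips"   # placeholder test query
RUNS = 10

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder
    access_token="dapiXXXXXXXXXXXXXXXX",                           # placeholder
) as conn:
    timings = []
    for _ in range(RUNS):
        start = time.perf_counter()
        with conn.cursor() as cursor:
            cursor.execute(QUERY)
            cursor.fetchall()        # pull the results so the full round trip is timed
        timings.append(time.perf_counter() - start)

print(f"avg {mean(timings):.2f}s  min {min(timings):.2f}s  max {max(timings):.2f}s")
```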
SubhaJ
by New Contributor III
  • 1038 Views
  • 2 replies
  • 1 kudos

Test Proctoring Issue

Hi Team, I took the Databricks Associate Engineer v2 test on 18th June. There were some proctoring issues during my test. They kept removing me from the test so they could thoroughly check the room. Even though they were satisfied, they removed me fr...

Latest Reply
SubhaJ
New Contributor III
  • 1 kudos

Hi team, hope you are doing well. It's been a week so far and I haven't received any response. It would be a great help if you could look into the issue as a priority. Thanks, Subha

1 More Replies
nicolamonaca
by New Contributor III
  • 1761 Views
  • 2 replies
  • 0 kudos

Resolved! IoT-based scenario

Hi, we're trying to design a new solution to collect IoT data in real time on Azure. Could you please suggest which Azure tools we should pick along with Databricks?

Latest Reply
lorenz
New Contributor III
  • 0 kudos

To build an IoT data ingestion and processing platform, you can consider using the following Azure tools and services: Azure IoT Hub: Azure IoT Hub is a fully managed service that enables reliable and secure communication between IoT devices and the c...

1 More Replies
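
For the Databricks side of such a pipeline, here is a minimal Structured Streaming sketch that reads from an Event Hubs-compatible endpoint (Azure IoT Hub exposes one) through Spark's Kafka source and lands the events in a Delta table. The namespace, hub name, connection string, checkpoint path, and table name are placeholders.

```python
# Assumes a Databricks notebook where `spark` is available.
from pyspark.sql.functions import col

EH_NAMESPACE = "my-iot-namespace"   # placeholder
EH_NAME = "my-iot-hub"              # placeholder
EH_CONN_STR = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=..."  # placeholder

kafka_options = {
    "kafka.bootstrap.servers": f"{EH_NAMESPACE}.servicebus.windows.net:9093",
    "subscribe": EH_NAME,
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "PLAIN",
    "kafka.sasl.jaas.config": (
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="$ConnectionString" password="{EH_CONN_STR}";'
    ),
}

raw = spark.readStream.format("kafka").options(**kafka_options).load()

(raw.select(col("value").cast("string").alias("payload"), col("timestamp"))
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/iot_bronze")   # placeholder path
    .toTable("bronze.iot_events"))                                 # placeholder table
```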
Srikanth_Gupta_
by Valued Contributor
  • 1553 Views
  • 2 replies
  • 1 kudos

How can I use data skipping with Delta Lake

How does data skipping work with Delta Lake? Can I run ANALYZE TABLE COMPUTE STATISTICS with Delta Lake, or is Z-Ordering going to solve these problems?

Latest Reply
Anonymous
New Contributor III
  • 1 kudos

You can use Z-Order indexes for data skipping. Data skipping information is collected automatically when you write to a Delta table. Delta Lake uses this information to provide faster queries. You don't need to configure anything for data skipping, as t...

1 More Replies
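
For reference, a short sketch of the commands touched on in this thread, run from a notebook; the table name, column, and property value are placeholders.

```python
# Co-locate data files by a frequently filtered column so the automatically
# collected min/max statistics prune more files at query time.
spark.sql("OPTIMIZE events ZORDER BY (event_date)")

# Optionally tune how many leading columns get data-skipping statistics
# (delta.dataSkippingNumIndexedCols defaults to 32).
spark.sql("""
    ALTER TABLE events
    SET TBLPROPERTIES ('delta.dataSkippingNumIndexedCols' = '8')
""")

# ANALYZE TABLE ... COMPUTE STATISTICS also works on Delta tables, but it feeds
# the query optimizer's cost estimates rather than file-level data skipping.
spark.sql("ANALYZE TABLE events COMPUTE STATISTICS")
```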
Geeks
by New Contributor
  • 618 Views
  • 1 reply
  • 0 kudos

Hello Databricks

I would love Unity Catalog; I'm eager to learn more.

Latest Reply
Jsarfraz
New Contributor II
  • 0 kudos

There is a Databricks education university; you should get an account, and you can also get certified in some exams for free.


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group