At Chevron we actively use Databricks to provide answers to business users. It was extremely interesting to see the LakehouseIQ initiative, as it could expedite how quickly our users receive their answers/reports. Is there any documentation that I...
I have mounted an Azure Blob Storage container in the Azure Databricks workspace filestore. The mounted container has zipped files with CSV files in them. What is the best way to read the zipped files and write them into a Delta table? @sasikumar sagabala
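Spark cannot read zip archives directly, so a common approach is to unzip first (for example with Python's standard-library `zipfile` on the driver) and then load the extracted CSVs into Delta. A minimal, standard-library-only sketch of the unzip step (the archive and file names here are hypothetical):

```python
import csv
import io
import zipfile

def csv_rows_from_zip(zip_bytes: bytes) -> dict:
    """Return {csv_name: rows} for every .csv member of the archive."""
    rows_by_file = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if name.lower().endswith(".csv"):
                with zf.open(name) as f:
                    text = io.TextIOWrapper(f, encoding="utf-8")
                    rows_by_file[name] = list(csv.reader(text))
    return rows_by_file

# Build a small in-memory zip just to demonstrate the helper.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("data.csv", "id,name\n1,alpha\n2,beta\n")

extracted = csv_rows_from_zip(buf.getvalue())
print(extracted["data.csv"][0])  # → ['id', 'name']
```

On Databricks you would typically extract the archives to a staging path (e.g. on DBFS) and then read the CSVs with `spark.read.csv(...)` followed by `.write.format("delta")` to land them in a Delta table.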
Hello @Debayan, I recently came across a similar scenario. Is there a way to do this via Auto Loader? We have zip folders added daily to our AWS S3 bucket, and we want to be able to unzip and load the CSV files continuously (autoloading).
Hey Zhudocode, I actually answered this in person, but data governance fundamentally is about the appropriate, efficient, and effective use of data. Appropriate use has to do with ethical AI, use of personal information, and policy around confidential a...
Working with a query that was imported to Databricks from Redash. There are a couple of fields set as parameters on this query. In Redash, it was possible to pre-fill the values for those parameters in the URL. Here's an example showing a valid pre-fill f...
Hi, I'm looking for a performance test tool. I saw that there was a post about JMeter (https://stackoverflow.com/questions/66913893/how-can-i-connect-jmeter-with-databricks-spark-cluster#comment118293766_66915965); however, the JDBC parameters are requesting...
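For the JDBC parameters, the Databricks JDBC driver typically takes a connection URL of roughly the shape below. The placeholders in angle brackets come from the cluster's JDBC/ODBC tab in the Databricks UI; treat this as a sketch to verify against the current driver documentation rather than an exact template:

```
jdbc:databricks://<server-hostname>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>
```

In JMeter's JDBC Connection Configuration you would paste this as the Database URL and point the driver class at the Databricks JDBC driver JAR.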
Hi @Amit Cahanovich​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
Hi Team, I took the Databricks Associate Engineer v2 test on 18th June. There were some proctoring issues during my test: they kept removing me from the test so they could thoroughly check the room. Even though they were satisfied, they removed me fr...
Hi team, hope you are doing well. It's been a week or more and I haven't received any response. It would be a great help if you could look into the issue as a priority. Thanks, Subha
Hi, we're trying to design a new solution to collect IoT data in real time on Azure. Could you please suggest which Azure tools we should pick alongside Databricks?
To build an IoT data ingestion and processing platform, you can consider using the following Azure tools and services:

- Azure IoT Hub: Azure IoT Hub is a fully managed service that enables reliable and secure communication between IoT devices and the c...
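As a toy illustration of the kind of per-device processing the streaming layer (e.g. Databricks Structured Streaming downstream of IoT Hub) would perform at scale, here is a plain-Python sketch that parses IoT-Hub-style JSON telemetry and averages a reading per device. The message shape and device names are assumptions for illustration:

```python
import json
from collections import defaultdict

# Hypothetical telemetry payloads as they might arrive from Azure IoT Hub.
messages = [
    '{"deviceId": "sensor-1", "temperature": 21.5}',
    '{"deviceId": "sensor-2", "temperature": 19.0}',
    '{"deviceId": "sensor-1", "temperature": 22.5}',
]

# Mean temperature per device -- the same shape of computation a Spark
# Structured Streaming groupBy/agg would express over a live stream.
totals = defaultdict(lambda: [0.0, 0])  # deviceId -> [sum, count]
for raw in messages:
    msg = json.loads(raw)
    totals[msg["deviceId"]][0] += msg["temperature"]
    totals[msg["deviceId"]][1] += 1

averages = {dev: s / n for dev, (s, n) in totals.items()}
print(averages)  # → {'sensor-1': 22.0, 'sensor-2': 19.0}
```

In a real deployment the messages would arrive continuously from IoT Hub's Event-Hubs-compatible endpoint, and the aggregation would be a streaming query writing results to a Delta table.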
You can use Z-ordering for data skipping. Data skipping information is collected automatically when you write to a Delta table, and Delta Lake uses this information to provide faster queries. You don't need to configure anything for data skipping, as t...
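Beyond the automatic file statistics, Z-ordering is applied explicitly with the `OPTIMIZE` command on columns frequently used in filter predicates. A sketch in Delta SQL, with hypothetical table and column names:

```sql
-- Co-locate related rows in the same files so data skipping
-- can prune more files for filters on these columns.
OPTIMIZE events
ZORDER BY (eventType, eventDate);
```

Z-ordering is most useful for high-cardinality columns that appear in `WHERE` clauses; re-run it periodically as new data arrives, since newly written files are not automatically Z-ordered.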