Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
Hi Databricks Team, we would appreciate it if you could explain the situations in which Column Masking, Row-Level Filtering, and Attribute-Based Masking should be used, as well as the recommended technique for handling large data volumes cont...
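For context, here is a minimal sketch of how the first two of these (row filters and column masks) are attached to a table in Unity Catalog; the table, column, and function names below are placeholders, not anything from the original question.

```python
# Hypothetical table/column/function names for illustration only.
# Row filter: non-admins only see EMEA rows.
spark.sql("""
CREATE OR REPLACE FUNCTION region_filter(region STRING)
RETURNS BOOLEAN
RETURN is_account_group_member('admins') OR region = 'EMEA'
""")
spark.sql("ALTER TABLE sales SET ROW FILTER region_filter ON (region)")

# Column mask: only members of the hr group see the raw value.
spark.sql("""
CREATE OR REPLACE FUNCTION mask_ssn(ssn STRING)
RETURNS STRING
RETURN CASE WHEN is_account_group_member('hr') THEN ssn ELSE '***-**-****' END
""")
spark.sql("ALTER TABLE sales ALTER COLUMN ssn SET MASK mask_ssn")
```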
I’m exploring how Databricks can support financial auditing and compliance, especially with platforms like aauditing. Has anyone here used Databricks for data analysis or reporting in the context of auditing services? Any insights on workflows or bes...
I need GDAL for my course work. After reading this post, I used an init script as follows to install GDAL into Runtime 12.2 LTS:
dbutils.fs.put("/databricks/scripts/gdal_install.sh", """
#!/bin/bash
sudo add-apt-repository ppa:ubuntugis/ppa
sudo apt-get up...
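For reference, a rough sketch of what the complete init script could look like; the package list (gdal-bin, libgdal-dev, python3-gdal) is an assumption, not necessarily the poster's exact script.

```python
# Sketch only: writes a cluster init script that installs GDAL from the ubuntugis PPA.
# The package names below are assumptions.
dbutils.fs.put("/databricks/scripts/gdal_install.sh", """
#!/bin/bash
sudo add-apt-repository -y ppa:ubuntugis/ppa
sudo apt-get update
sudo apt-get install -y gdal-bin libgdal-dev python3-gdal
""", True)
```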
Hi, in case anyone is still struggling here: I found I could not get the init-script approach to work, but if I just run a shell command to install GDAL at the start of my notebook, it works fine. Note, however, that this installs GDAL versi...
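A sketch of that notebook-level approach, run as the first cell of the notebook; the package names are assumptions.

```
%sh
# Sketch only: installs GDAL at notebook start instead of via an init script.
# Package names are assumptions.
sudo add-apt-repository -y ppa:ubuntugis/ppa
sudo apt-get update
sudo apt-get install -y gdal-bin libgdal-dev python3-gdal
```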
Hello everyone, I am currently facing several challenges related to big data solutions, particularly with Databricks. As many of you may know, Databricks is a powerful platform for data engineering and analytics, but I have encountered some signif...
Your problem statement is too generic. If your company is facing this, you can reach out to your SA; they will help you. If it's a personal project, then describe in detail what you are trying: cluster size, what you are trying to integrate with, ...
The billing system table provides cost by notebook, job, and cluster. If a catalog-to-job/cluster/notebook mapping is maintained, then catalog-based usage can be derived.
https://docs.databricks.com/en/admin/system-tables/billing.html
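A minimal sketch of that approach, assuming the system.billing.usage schema described in the linked page (column names such as usage_metadata.job_id should be verified against your workspace):

```python
# Sketch: DBUs over the last 30 days broken down by job and cluster.
# Column names follow the documented system.billing.usage schema; verify before use.
usage_by_job = spark.sql("""
    SELECT
        usage_metadata.job_id     AS job_id,
        usage_metadata.cluster_id AS cluster_id,
        sku_name,
        SUM(usage_quantity)       AS dbus
    FROM system.billing.usage
    WHERE usage_date >= DATE_SUB(current_date(), 30)
    GROUP BY ALL
""")
display(usage_by_job)
```

Joining the result to your own job/cluster-to-catalog mapping then gives an approximate per-catalog cost.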
Hello, I have registered for the Databricks Certified Data Engineer Associate exam. One of the requirements to take the exam is the secure browser installation (PSI Secure Bridge browser). The exam is set for Sunday, 6th October 2024, but the browser installation does not work. Reac...
Hi @hetrasol, I'm a Windows user. After installation, I only got the LockDown Browser OEM instead of the PSI browser you mentioned above. Could you explain again how to install these browsers?
Hi, I am using the latest version of PySpark and I am trying to connect to a remote cluster with Runtime 13.3. My questions are: do I need Databricks Unity Catalog enabled? My cluster is already set to Shared access mode, so what other configur...
Hi, is your workspace already Unity Catalog enabled? Also, did you go through the considerations for enabling a workspace for Unity Catalog? https://docs.databricks.com/en/data-governance/unity-catalog/enable-workspaces.html#considerations-before-yo...
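If the goal is to run a local PySpark session against that Runtime 13.3 cluster, this is typically done with Databricks Connect; a minimal sketch, where the host, token, and cluster ID are placeholders:

```python
# Sketch: requires `pip install "databricks-connect==13.3.*"` locally.
# Host, token, and cluster_id are placeholders.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://<workspace-url>",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

print(spark.range(10).count())
```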
Hi all. I am no longer able to install my custom wheel in my DLT pipeline. No matter what configuration I try, I cannot get it to work: parameterized or just hard-coding the path to the wheel. If I run the hard-coded cell with an all-purpose cluster t...
I managed to fix the issue. The problem was that my wheel was built for Databricks Runtime 14.3 LTS and I was using the PREVIEW channel rather than the CURRENT channel. At the time of writing: CURRENT (default): Databricks Runtime 14.1 --> Python 3.10.12; P...
Hi All, I have a situation where I'm receiving various CSV files in a storage location. The issue I'm facing is that I'm using Databricks Autoloader, but some files might arrive later than expected. In this case, we need to notify the relevant team ab...
Well, Autoloader could work nicely with the file-arrival notification events. You could probably specify a window duration for your "on-time" arrivals and use that as your baseline check. As files arrive they go to their window, and whe...
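A rough sketch of that idea, capturing when each file was written and when it was ingested so a downstream check can compare against the expected delivery window; the paths, schema location, and column names are placeholders:

```python
import pyspark.sql.functions as F

# Sketch: Autoloader stream that records file write time and ingestion time.
# Landing path and schema location are placeholders.
raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/landing")
    .load("/mnt/landing/")
    .select(
        "*",
        F.col("_metadata.file_path").alias("source_file"),
        F.col("_metadata.file_modification_time").alias("file_written_at"),
        F.current_timestamp().alias("ingested_at"),
    )
)
```

A scheduled job could then query the target table and alert the team for any expected feed that has no row with ingested_at inside its window.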
Hi, can you please share some more details on what you are looking for? If you are trying to share data to/from Databricks, you can use Delta Sharing or the Clean Rooms option - these provide data sharing with strong security & governance. Or if yo...
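For the Delta Sharing route, a minimal sketch of a recipient reading a shared table with the open-source delta-sharing connector; the profile path and share/schema/table names are placeholders:

```python
import delta_sharing

# Sketch: the .share profile file and the share/schema/table names are placeholders.
profile = "/dbfs/FileStore/config.share"
df = delta_sharing.load_as_pandas(profile + "#my_share.my_schema.my_table")
print(df.head())
```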
Hello! We are trying to use Fivetran to ingest different sources into the data lake, so we will have multiple connectors. We would like to know the recommendations for selecting SQL warehouses. Since the new serverless SQL warehouses...
Hi, to understand Databricks SQL Serverless cost, see https://www.databricks.com/product/pricing/databricks-sql. In terms of comparison, Databricks is said to be the most cost-efficient and highest-performing option in the market amongst it...
Hi, as far as I am aware, for security scanning/monitoring at the Databricks account level, we have the below:
SAT - https://github.com/databricks-industry-solutions/security-analysis-tool
https://www.databricks.com/trust/trust
https://learn.microsoft.com/en-us/a...
Hi All, how can we handle CDC for unstructured data in Databricks? What are some best practices we should follow to make this work effectively? Regards, Phani
Hi @Phani1, handling CDC for unstructured data (such as audio, images, or video files) in Databricks involves efficiently detecting and processing changes to these files as they occur. Here's how you can approach this: Use Databricks Autoloader: Autoload...
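A minimal sketch of that Autoloader approach for unstructured files, assuming binaryFile ingestion into a bronze table; the paths and table name are placeholders:

```python
import pyspark.sql.functions as F

# Sketch: incrementally ingest new or changed media files with Autoloader.
# Source path, checkpoint location, and table name are placeholders.
files = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "binaryFile")
    .load("/mnt/raw/media/")
    .select(
        "path",
        "modificationTime",
        "length",
        "content",  # raw file bytes
        F.current_timestamp().alias("ingested_at"),
    )
)

(
    files.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/media_bronze")
    .trigger(availableNow=True)
    .toTable("bronze.media_files")
)
```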
Hi All, for an application that we are building, we need an encoding detector / UTF-8 enforcer. For this, we used the Python library chardet in combination with "with open". We open a file from a mounted ADLS location (we use a legacy Hive metastore). When...
Hi @mathijs-fish @Ayushi_Suthar - I am having the same issue with a shared cluster. I can see the list of PDF files on the mount using dbutils.fs.ls(mount_point), but when I try to read the PDF files using PyPDF, I get FileNotFoundError...
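One workaround sketch, assuming the failure comes from the local file API on the shared (USER_ISOLATION) cluster: load the PDF bytes through Spark's binaryFile reader and parse them in memory with the pypdf package; the mount path is a placeholder:

```python
import io
from pypdf import PdfReader  # assumes the pypdf package is installed

# Sketch: read PDF bytes via Spark instead of the local file API, then parse in memory.
# The mount path is a placeholder.
pdfs = (
    spark.read.format("binaryFile")
    .option("pathGlobFilter", "*.pdf")
    .load("/mnt/my_mount/pdfs/")
    .select("path", "content")
)

for row in pdfs.limit(5).collect():
    reader = PdfReader(io.BytesIO(row["content"]))
    print(row["path"], len(reader.pages))
```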
If I run the following code on a cluster in Single Node mode it works fine, but if I run the exact same cell on a Multi Node cluster it throws: SparkConnectGrpcException: (java.sql.SQLTransientConnectionException) Could not connect to address=(host=HOST...
"data_security_mode": "NONE": this is a non-Unity Catalog cluster; no governance is enforced.
"data_security_mode": "USER_ISOLATION": this is a UC shared compute cluster that has certain limitations when accessing low-level APIs, RDDs, and dbfs/data bri...
Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.
If there isn’t a group near you, start one and help create a community that brings people together.