Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.
I'm executing a notebook and it failed with this error. Sometimes, when I execute certain functions in Spark, they also fail with the error 'this class is not whitelisted'. Could anyone help me check on this? Thanks for your help!
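For context, a hedged illustration of what typically triggers this: on clusters with shared access mode or table access control enabled, Py4J security blocks JVM classes and methods that are not allowlisted, so calls that drop below the DataFrame API often fail. A sketch under that assumption:

# Hypothetical repro: the RDD API goes through JVM methods that are not
# allowlisted on table-ACL / shared-access-mode clusters.
df = spark.range(10)
rows = df.rdd.collect()  # may raise py4j.security.Py4JSecurityException: ... is not whitelisted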
Greetings everyone, we are trying to implement a series of visualizations. All these visualizations have queries assigned to them in the form of "Select * from test table where timestamp between :Days start and :Days end". There is also a filter applie...
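As an aside, a minimal sketch of the named-parameter pattern such queries normally use (the table, column, and parameter names here are hypothetical); note that a parameter marker like :days_start has to be a single identifier without spaces:

# Hypothetical parameterized query (named parameter markers, Spark 3.4+ / DBR 13+).
df = spark.sql(
    "SELECT * FROM test_table WHERE event_ts BETWEEN :days_start AND :days_end",
    args={"days_start": "2024-01-01", "days_end": "2024-01-31"},
)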
We know that Databricks with VNET injection (our own VNET) allows us to connect to blob storage/ADLS Gen2 over private endpoints and peering. This is what we typically do. We have a client who created Databricks with EnableNoPublicIP=No (secure clust...
Hi, I'm trying to create a custom Docker image with some R packages pre-installed. However, when I try to use it in a notebook, it can't seem to find the installed packages. The build runs fine.

FROM databricksruntime/rbase:14.3-LTS

## update system li...
Hi, I'm creating a DLT pipeline which uses DLT CDC to implement SCD Type 1, taking the latest record using a datetime column. This works with no issues:

@dlt.view
def users():
    return spark.readStream.table("source_table")

dlt.create_streaming_table(...
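For reference, a minimal sketch of how this pattern usually continues with dlt.apply_changes; the key and datetime columns (user_id, updated_at) and the target name are hypothetical:

import dlt
from pyspark.sql import functions as F

@dlt.view
def users():
    return spark.readStream.table("source_table")

# Target table that will hold the SCD Type 1 output
dlt.create_streaming_table("users_scd1")

dlt.apply_changes(
    target="users_scd1",
    source="users",
    keys=["user_id"],                 # hypothetical primary key
    sequence_by=F.col("updated_at"),  # hypothetical datetime ordering column
    stored_as_scd_type=1,
)

With stored_as_scd_type=1, only the latest row per key (as ordered by sequence_by) is kept.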
Hey Databricks, why did you take away the job IDs from the parallel runs? We use those to identify which output goes with which run. Please put them back.

Benedetta
Hi, we are trying to ingest zip files into Azure Databricks Delta Lake using the COPY INTO command. There are 100+ zip files with an average size of ~300MB each. Cluster configuration: 1 driver (56GB, 16 cores); 2-8 workers (32GB, 8 cores each), autoscaling enab...
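For context, a hedged sketch of the COPY INTO pattern being described, with placeholder catalog/schema/path names; note that ZIP archives are not a compression codec COPY INTO reads natively, so the files generally need to be extracted first:

# Hypothetical ingestion sketch; all names are placeholders.
spark.sql("""
    COPY INTO main.default.raw_events
    FROM '/Volumes/main/default/landing/extracted/'
    FILEFORMAT = CSV
    FORMAT_OPTIONS ('header' = 'true')
    COPY_OPTIONS ('mergeSchema' = 'true')
""")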
Although we were able to copy the zip files onto the Databricks volume, we were not able to share them with any system outside of the Databricks environment. I guess Delta Sharing does not support sharing files that are on UC volumes.
Hi folks, I'm working on a project with Databricks using Unity Catalog and a connection to SSIS (SQL Server Integration Services). My team is trying to access data registered in Unity Catalog using the Simba ODBC driver version 2.8.0.1002. They mentioned ...
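For reference, a typical shape for a Simba/Databricks ODBC connection string (all values below are placeholders; AuthMech=3 means personal-access-token authentication):

# Hypothetical connection values; replace host, HTTP path, and token.
conn_str = (
    "Driver=Simba Spark ODBC Driver;"
    "Host=adb-1234567890123456.7.azuredatabricks.net;"
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/abcdef1234567890;"
    "SSL=1;ThriftTransport=2;AuthMech=3;"
    "UID=token;PWD=<personal-access-token>;"
    "Catalog=main;Schema=default"
)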
After downloading a file using `wget`, I'm attempting to read it with spark.read.json. I am getting the error: PATH_NOT_FOUND - Path does not exist: dbfs:/tmp/data.json. SQLSTATE: 42K03, File <command-3327713714752929>, line 2. I have checked that the file does exist...
Hi @sharma_kamal, good day!
Could you please try the code below, suggested by @ThomazRossito; it should help.
Also, please refer to the document below to work with files on Databricks:
https://docs.databricks.com/en/files/index.html
Please l...
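The usual cause of this particular error is that wget writes to the driver's local filesystem, while a bare path in spark.read resolves to dbfs:/. A minimal sketch of the distinction (the path is taken from the error message):

# wget saved the file on the driver's local disk, i.e. file:/tmp/data.json,
# but spark.read.json("/tmp/data.json") resolves to dbfs:/tmp/data.json.
df = spark.read.json("file:/tmp/data.json")   # read from the local filesystem

# Alternatively, copy it into DBFS (or a UC volume) first:
dbutils.fs.cp("file:/tmp/data.json", "dbfs:/tmp/data.json")
df = spark.read.json("dbfs:/tmp/data.json")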
I set up a model serving endpoint and created a monitoring dashboard to monitor its performance. The problem is my inference table doesn't get updated by the model serving endpoint. To test the endpoint I use the following code:

import random
import time
...
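For context, a hedged sketch of the kind of request loop such a test usually performs; the endpoint name, workspace host, and payload shape are assumptions:

import os
import random
import time
import requests

# Hypothetical endpoint and workspace values.
url = "https://<workspace-host>/serving-endpoints/my-endpoint/invocations"
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

for _ in range(10):
    payload = {"dataframe_records": [{"feature": random.random()}]}
    resp = requests.post(url, headers=headers, json=payload)
    print(resp.status_code)
    time.sleep(1)

Note that inference tables are populated asynchronously, so new rows can lag well behind the requests.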
Hello, previously I was able to run the following command in Databricks to see a list of the mount points, but it seems the system does not accept this anymore, as I get the following error. Any thoughts on how to get a list of the mount points? Thank you.

d...
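Presumably the command in question was dbutils.fs.mounts(); a minimal sketch, noting that this call is restricted on some Unity Catalog shared-access-mode clusters, which may be what the error reflects:

# Lists all mount points configured in the workspace.
display(dbutils.fs.mounts())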
I have a very strange thing happening. I'm importing a CSV file, and nulls and blanks are being interpreted correctly. What is strange is that a column that regularly has a single-space character value is having the single space converted to null. I'...
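A hedged sketch of the CSV reader options that commonly affect whitespace handling; the path is a placeholder, and whether these defaults apply here is an assumption to verify:

# Hypothetical read; these options control whether surrounding whitespace
# is stripped, which can turn a lone-space value into an empty string/null.
df = (spark.read.format("csv")
      .option("header", "true")
      .option("ignoreLeadingWhiteSpace", "false")
      .option("ignoreTrailingWhiteSpace", "false")
      .load("/Volumes/main/default/landing/file.csv"))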
For AWS Databricks, I have configured 1 worker and 1 driver node with the same node type. In the AWS console, all details are the same for the two instances; only the instance IDs differ. How can I identify which instance ID is for the worker and which one is for the d...
Hi @Retired_mod, thanks for your response. There was no instance pool ID configured for the cluster, so how will I be able to differentiate them? Could you give an alternative way of finding the driver instance ID and worker instance ID in the AWS con...
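One hedged way to tell the two apart from inside the cluster: print the driver's address from the Spark conf and match it against the private IP column in the EC2 console:

# The instance whose private IP matches this value is the driver;
# the remaining instance is the worker.
print(spark.conf.get("spark.driver.host"))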
Hi, in a particular Workflows job, I am trying to add some data checks between tasks by using an If/else statement. I used the following statement in a notebook to pass a parameter into the if/else condition logic: {"job_id": XXXXX, "notebook_params": ...
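For reference, a hedged sketch of the task-values pattern usually used to feed an If/else condition task; the key, value, and task names are hypothetical:

# In the upstream notebook task: publish a value for the condition to read.
dbutils.jobs.taskValues.set(key="row_count_ok", value="true")

# The If/else condition task would then compare something like:
#   {{tasks.check_task.values.row_count_ok}} == "true"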
Hi, I have started using Databricks recently, and I'm not able to find the right solution in the documentation. I have linked multiple repos in my Databricks workspace in the Repos folder, and I want to update the repos with the remote Azure DevOps reposit...
Hi @Ha2001, good day!
The Databricks API has a limit of 10 requests per second for the combined /repos/* requests in a workspace. You can check the documentation below for the API limit:
https://docs.databricks.com/en/resources/limits.html#:~:text=Git%20fold...
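For reference, a hedged sketch of updating a linked repo to a branch through the Repos API (workspace host, token, and repo ID are placeholders); these calls count toward the rate limit above:

import requests

# Hypothetical workspace URL, token, and repo ID.
resp = requests.patch(
    "https://<workspace-host>/api/2.0/repos/<repo-id>",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={"branch": "main"},
)
print(resp.status_code, resp.json())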
Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.
If there isn’t a group near you, start one and help create a community that brings people together.