- 934 Views
- 2 replies
- 0 kudos
Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on class
I'm executing a notebook and it failed with this error. Sometimes, when I execute certain functions in Spark, they also fail with a 'this class is not whitelisted' error. Could anyone help me check on this? Thanks for your help!
- 0 kudos
Thanks for your feedback. My cluster was actually a shared cluster; after I changed to a single-user cluster, I was able to run that method.
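In case it helps others hitting the same message: on clusters with shared access mode, many JVM-backed methods are not whitelisted for Py4J calls, which is what the error above refers to. A minimal sketch of the call involved and a possible SQL fallback (whether the fallback is allowed depends on the runtime version, so treat it as an assumption):

    # Runs in a Databricks notebook, where `spark` is predefined.
    spark.catalog.clearCache()   # raises the "not whitelisted" error on shared access mode clusters

    # Possible fallback to try on a shared cluster; availability depends on the DBR version.
    spark.sql("CLEAR CACHE")

On a single-user (assigned) access mode cluster, as noted above, the restriction does not apply.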
- 1357 Views
- 0 replies
- 0 kudos
Lakeview dashboard dynamically change filter values
Greetings everyone, we are trying to implement a series of visualizations. All these visualizations have queries assigned to them in the form of “Select * from test table where timestamp between :Days start and :Days end”. There is also a filter applie...
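The excerpt is cut off, but for testing the same named-parameter pattern outside the dashboard, spark.sql accepts an args mapping on DBR 13+ / Spark 3.4+. A minimal sketch with assumed names (test_table, days_start, days_end); in a Lakeview dashboard itself, the :parameters are bound through the dashboard's filter widgets instead:

    # Sketch only: table and parameter names are assumptions.
    df = spark.sql(
        "SELECT * FROM test_table WHERE timestamp BETWEEN :days_start AND :days_end",
        args={"days_start": "2024-01-01", "days_end": "2024-01-31"},
    )
    display(df)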
- 4213 Views
- 1 replies
- 0 kudos
How to access storage with private endpoint
We know that Databricks with VNET injection (our own VNET) allows us to connect to Blob Storage/ADLS Gen2 over private endpoints and peering. This is what we typically do. We have a client who created Databricks with EnableNoPublicIP=No (secure clust...
- 0 kudos
Hey @jx1226, were you able to solve this at the customer? I am currently struggling with the same issues here.
- 2494 Views
- 0 replies
- 0 kudos
Installing R packages for a custom Docker container for compute
Hi, I'm trying to create a custom Docker image with some R packages pre-installed. However, when I try to use it in a notebook, it can't seem to find the installed packages. The build runs fine. FROM databricksruntime/rbase:14.3-LTS ## update system li...
- 1001 Views
- 0 replies
- 0 kudos
DLT CDC/SCD - Taking the latest ID per day
Hi, I'm creating a DLT pipeline which uses DLT CDC to implement SCD Type 1, taking the latest record using a datetime column, and it works with no issues: @dlt.view def users(): return spark.readStream.table("source_table") dlt.create_streaming_table(...
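The pipeline code is truncated above, so the following is only a minimal sketch of the SCD Type 1 flow being described, with assumed table and column names (users_scd1, user_id, updated_at):

    # Sketch of the flow described above; names are assumptions.
    import dlt
    from pyspark.sql import functions as F

    @dlt.view
    def users():
        return spark.readStream.table("source_table")

    dlt.create_streaming_table("users_scd1")

    dlt.apply_changes(
        target="users_scd1",
        source="users",
        keys=["user_id"],
        sequence_by=F.col("updated_at"),  # highest value wins per key
        stored_as_scd_type=1,
    )

For the "latest ID per day" requirement in the title, one option is to make the day part of the keys or of the sequence_by expression, depending on whether one row per day should be kept or only the single latest row overall.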
- 2099 Views
- 0 replies
- 0 kudos
What happened to the JobIds in the parallel runs (again)????
Hey Databricks, why did you take away the job IDs from the parallel runs? We use those to identify which output goes with which run. Please put them back. Benedetta
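Not an answer to the UI change itself, but a workaround sketch: the run and job IDs can be passed into each parallel run explicitly as task parameters using the dynamic value references {{job.id}} and {{job.run_id}}, then logged from the notebook so output can still be matched to a run. The widget names below are assumptions:

    # In the job/task configuration, add parameters such as
    #   job_id = {{job.id}}   and   run_id = {{job.run_id}}
    # then read and log them inside the notebook:
    job_id = dbutils.widgets.get("job_id")
    run_id = dbutils.widgets.get("run_id")
    print(f"job_id={job_id} run_id={run_id}")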
- 1907 Views
- 3 replies
- 0 kudos
Error ingesting zip files: ExecutorLostFailure Reason: Command exited with code 50
Hi, we are trying to ingest zip files into Azure Databricks Delta Lake using the COPY INTO command. There are 100+ zip files with an average size of ~300 MB each. Cluster configuration: 1 driver (56 GB, 16 cores); 2-8 workers (32 GB, 8 cores each). Autoscaling enab...
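The COPY INTO statement itself is cut off above. Exit code 50 with ExecutorLostFailure usually points at executors running out of memory; since each archive has to be handled as a whole, one pattern worth trying is landing the archives as binary files in a Delta table and unpacking them afterwards. A sketch with assumed paths and table name:

    # Sketch only: source path and target table are assumptions.
    # Each zip is read as a single row (path, modificationTime, length, content),
    # so executors need headroom for ~300 MB per file in flight.
    df = (spark.read.format("binaryFile")
          .option("pathGlobFilter", "*.zip")
          .load("abfss://<container>@<account>.dfs.core.windows.net/raw/zips/"))

    (df.select("path", "length", "modificationTime", "content")
       .write.mode("append")
       .saveAsTable("bronze.zip_files_raw"))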
- 0 kudos
Although we were able to copy the zip files onto the Databricks volume, we were not able to share them with any system outside of the Databricks environment. I guess Delta Sharing does not support sharing files that are on UC volumes.
- 1440 Views
- 0 replies
- 0 kudos
Not able to access data registered in Unity Catalog using Simba ODBC driver
Hi folks, I'm working on a project with Databricks using Unity Catalog and a connection to SSIS (SQL Server Integration Services). My team is trying to access data registered in Unity Catalog using Simba ODBC driver version 2.8.0.1002. They mentioned ...
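The connection details are cut off above. For reference, a DSN-less connection through the Simba Spark ODBC driver that targets a Unity Catalog namespace typically combines the SQL warehouse's HTTP path with the Catalog and Schema properties. A pyodbc sketch with placeholder values, offered as an assumption rather than a verified SSIS configuration:

    # All values are placeholders; adjust to however SSIS stores the connection.
    import pyodbc

    conn = pyodbc.connect(
        "Driver=Simba Spark ODBC Driver;"
        "Host=<workspace-host>;Port=443;"
        "HTTPPath=/sql/1.0/warehouses/<warehouse-id>;"
        "SSL=1;ThriftTransport=2;AuthMech=3;"
        "UID=token;PWD=<personal-access-token>;"
        "Catalog=<catalog>;Schema=<schema>;",
        autocommit=True,
    )
    print(conn.cursor().execute("SELECT current_catalog(), current_schema()").fetchone())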
- 4226 Views
- 3 replies
- 4 kudos
Resolved! Unable to read available file after downloading
After downloading a file using `wget`, I'm attempting to read it with spark.read.json. I am getting the error: PATH_NOT_FOUND - Path does not exist: dbfs:/tmp/data.json. SQLSTATE: 42K03 File <command-3327713714752929>, line 2. I have checked that the file does exist...
- 4 kudos
Hi @sharma_kamal, good day! Could you please try the code below, suggested by @ThomazRossito; it should help. Also, please refer to the following document for working with files on Databricks: https://docs.databricks.com/en/files/index.html Please l...
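The suggested code isn't included in the excerpt. For this particular error, the usual cause is that wget writes to the driver's local filesystem while spark.read looks under dbfs:/, so here is a sketch of the two common fixes (paths taken from the error message above):

    # Read the local copy directly...
    df = spark.read.json("file:/tmp/data.json")

    # ...or copy it into DBFS (or a UC volume) first and read it from there.
    dbutils.fs.cp("file:/tmp/data.json", "dbfs:/tmp/data.json")
    df = spark.read.json("dbfs:/tmp/data.json")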
- 1972 Views
- 0 replies
- 0 kudos
The inference table doesn't get updated
I set up a model serving endpoint and created a monitoring dashboard to monitor its performance. The problem is that my inference table doesn't get updated by the model serving endpoint. To test the endpoint I use the following code: import random import time ...
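The test code is truncated, so as a hedged sketch only: it can be worth confirming that requests actually reach the endpoint over the standard invocations REST API, and then allowing for the fact that inference tables are populated asynchronously and can lag behind the traffic. The host, token, endpoint name, and payload columns below are assumptions:

    # Sketch: send a test request to the endpoint, then check the inference table later.
    import requests

    host = "https://<workspace-host>"
    token = "<personal-access-token>"
    endpoint = "<endpoint-name>"

    resp = requests.post(
        f"{host}/serving-endpoints/{endpoint}/invocations",
        headers={"Authorization": f"Bearer {token}"},
        json={"dataframe_records": [{"feature_1": 1.0, "feature_2": 2.0}]},
        timeout=60,
    )
    print(resp.status_code, resp.text[:200])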
- 5828 Views
- 2 replies
- 0 kudos
reading mount points
Hello, previously I was able to run the following command in Databricks to see a list of the mount points, but it seems the system does not accept this anymore, as I get the following error. Any thoughts on how to get a list of the mount points? Thank you. d...
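The command and the error are both cut off above. The usual call for listing mounts is dbutils.fs.mounts(); if it now fails, the cluster's access mode and Unity Catalog restrictions are worth checking, though that is a guess without the full error text:

    # List the mount points (run in a Databricks notebook).
    display(dbutils.fs.mounts())

    # Or without display():
    for m in dbutils.fs.mounts():
        print(m.mountPoint, "->", m.source)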
- 2609 Views
- 1 replies
- 0 kudos
Informatica ETLs
I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...
- 0 kudos
@Retired_mod Do you know if Informatica Cloud Modernization can convert mappings into Delta Live Tables? Do we have to use Informatica Cloud for this, or can we use it as a one-time migration and maintain the artifacts in Databricks? Alternatively, we...
- 2534 Views
- 0 replies
- 0 kudos
Spark Handling White Space as NULL
I have a very strange thing happening. I'm importing a CSV file, and nulls and blanks are being interpreted correctly. What is strange is that a column that regularly has a single space character as its value is having the single space converted to null. I'...
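The reader options in use are cut off above, so this is only a guess: if whitespace trimming is enabled, a lone space becomes an empty string, and an empty string can then be mapped to null. A sketch of the CSV options worth pinning explicitly (defaults differ between reading and writing and can vary across DBR versions):

    # Sketch: the path is a placeholder and the nullValue token is an assumption.
    df = (spark.read
          .option("header", True)
          .option("ignoreLeadingWhiteSpace", False)   # keep " " from being trimmed to ""
          .option("ignoreTrailingWhiteSpace", False)
          .option("nullValue", "NULL")                # default nullValue is ""; an explicit token is one knob to check
          .csv("/path/to/file.csv"))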
- 1600 Views
- 0 replies
- 0 kudos
How is the scale-up process done in a Databricks cluster?
For my AWS Databricks cluster, I configured shared compute with 1 minimum worker node and 3 maximum worker nodes; initially, only one worker node and one driver node instance are created in the AWS console. Is there any rule set by Databricks for scaling up the next ...
- 1045 Views
- 1 replies
- 0 kudos
How to identify the worker and driver instances in the AWS console for a Databricks cluster?
For AWS Databricks, I have configured 1 worker and 1 driver node with the same node type. In the AWS console, all details are the same for the two instances; only the instance IDs differ. How do I identify which instance ID is for the worker and which one is for the d...
- 0 kudos
Hi @Retired_mod, thanks for your response. There is no instance pool ID configured for the cluster, so how will I be able to differentiate them? Could you give an alternative way of finding the driver instance ID and worker instance ID in the AWS con...
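One practical approach that doesn't rely on instance pools, offered as a hedged sketch: read the driver's private address from the Spark configuration inside a notebook on that cluster, then match it against the private IPv4 addresses shown for the cluster's EC2 instances in the AWS console; the remaining instance of the pair is the worker.

    # Run in a notebook attached to the cluster in question.
    driver_host = spark.conf.get("spark.driver.host")  # driver's private address
    print("driver:", driver_host)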