- 821 Views, 1 reply, 0 kudos
Hi, can you clarify what your aim is? Maybe there is no need to use the DB SDK at all?
- 1949 Views, 3 replies, 0 kudos
Unity Catalog view access in Azure Storage account
Hi, I have my Unity Catalog in an Azure Storage account and I am able to access table objects, but I couldn't find my views that were created on top of those tables. 1. I can access Delta tables & related views via Databricks SQL and also find the tab...
Hi, a couple of options are possible: use Databricks to do the complex SQL queries (joins, unions, etc.) and write to a staging Delta table, then use DataFlow to read from that staged table. Orchestrate all of this using ADF or even Databricks Workflo...
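As a rough illustration of the staging pattern suggested above, a Databricks job could run the complex SQL and persist the result as a Delta table for DataFlow to pick up; the catalog, schema, and table names below are hypothetical.

```python
# Minimal sketch of the staging pattern: run the complex SQL in Databricks and
# persist the result as a Delta table that a downstream tool (e.g. ADF Data Flow)
# can read. All table names here are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

staged = spark.sql("""
    SELECT o.order_id, o.order_ts, c.customer_name
    FROM   catalog.sales.orders o
    JOIN   catalog.sales.customers c ON o.customer_id = c.customer_id
""")

# Overwrite the staging table on every run so the downstream reader
# always sees one consistent snapshot.
(staged.write
       .format("delta")
       .mode("overwrite")
       .saveAsTable("catalog.staging.orders_enriched"))
```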
- 1932 Views, 2 replies, 1 kudos
PowerBI Tips
Does anyone have any tips for using PowerBI on top of Databricks? Any best practices you know of, or roadblocks you have run into that should be avoided? Thanks.
Hey, use Partner Connect to establish a connection to PBI. Consider using Databricks SQL Serverless warehouses for the best user experience and performance (see Intelligent Workload Management, aka auto-scaling and query queuing, remote result cache, ...
- 1045 Views, 1 reply, 0 kudos
Connecting to Databricks SQL warehouse from .NET
Hi, how can I connect to a Databricks SQL warehouse from a .NET application? Kr
Hey, please take a look at the Statement Execution API. Best,
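For reference, the Statement Execution API is a plain REST endpoint, so a .NET application can call it with HttpClient; the sketch below shows the same request in Python, with placeholder values for the workspace URL, token, and warehouse ID.

```python
# Rough sketch of calling the SQL Statement Execution API
# (POST /api/2.0/sql/statements). The host, token and warehouse ID are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
TOKEN = "<personal-access-token>"
WAREHOUSE_ID = "<sql-warehouse-id>"

resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "warehouse_id": WAREHOUSE_ID,
        "statement": "SELECT current_date() AS today",
        "wait_timeout": "30s",   # wait synchronously for up to 30 seconds
    },
    timeout=60,
)
resp.raise_for_status()
result = resp.json()
print(result["status"]["state"])                    # e.g. SUCCEEDED
print(result.get("result", {}).get("data_array"))   # rows, if the statement finished
```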
- 1139 Views, 1 reply, 1 kudos
Resolved! Can we get SQL Serverless warehouse monitoring data using APIs or logs?
I am looking for a possible way to get the autoscaling history data for SQL Serverless warehouses using an API or logs. I want something like what we see in the monitoring UI.
Hi Rahul, you need to perform two actions: enable the system tables schema named "compute" (see the how-to page; it's quite possible you'll find other schemas useful too), then explore the system.compute.warehouse_events table. Hope this helps. B...
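A minimal sketch of reading that table from a notebook, assuming the "compute" system schema is already enabled; the warehouse ID is a placeholder and the time window is arbitrary.

```python
# `spark` is the SparkSession available in a Databricks notebook.
# Pulls the last week's scaling events for one SQL warehouse from the
# system.compute.warehouse_events system table.
history = spark.sql("""
    SELECT event_time, event_type, cluster_count
    FROM   system.compute.warehouse_events
    WHERE  warehouse_id = '<sql-warehouse-id>'
      AND  event_time >= current_timestamp() - INTERVAL 7 DAYS
    ORDER BY event_time
""")
history.show(truncate=False)
```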
- 1810 Views, 1 reply, 3 kudos
Regional Group Request for Istanbul
Hello, I kindly request the formation of a regional group for Istanbul/Turkey. I would appreciate your assistance in this matter. Thank you, Can
@kankotan Happy to help set it up for you. I have dropped an email for more information!
- 895 Views, 0 replies, 0 kudos
Usage of SparkMetric_CL, SparkListenerEvent_CL and SparkLoggingEvent_CL
I am wondering if I can retrieve any information from Azure Log Analytics custom tables (already set up) for Azure Databricks. I would like to retrieve information about query and data performance for a SQL Warehouse cluster. I am not sure if I can get it fro...
- 1169 Views, 2 replies, 0 kudos
Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on clas
I'm executing a notebook and it failed with this error. Sometimes, when I execute certain functions in Spark, it also fails with the error 'this class is not whitelisted'. Could anyone help me check on this? Thanks for your help!
Thanks for your feedback. My cluster was actually a shared cluster; after I changed it to a single-user cluster, I could run that method.
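For context, the error in the title comes from calling an internal Spark class (CatalogImpl) directly; the sketch below uses the public PySpark API for the same operation. If even that is blocked on a shared access mode cluster, switching to a single-user cluster, as the reply describes, is the usual workaround.

```python
# `spark` is the SparkSession available in a Databricks notebook.
df = spark.range(1_000_000).cache()   # cache something for illustration
df.count()                            # materialise the cache

# Public API equivalent of CatalogImpl.clearCache(): drops all cached tables/DataFrames.
spark.catalog.clearCache()
```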
- 1964 Views, 0 replies, 0 kudos
Lakeview dashboard dynamically change filter values
Greetings everyone, we are trying to implement a series of visualizations. All these visualizations have queries assigned to them in the form of “Select * from test table where timestamp between :Days start and :Days end”. There is also a filter applie...
- 4805 Views, 1 reply, 0 kudos
How to access storage with private endpoint
We know that Databricks with VNET injection (our own VNET) allows us to connect to blob storage / ADLS Gen2 over private endpoints and peering. This is what we typically do. We have a client who created Databricks with EnableNoPublicIP=No (secure clust...
Hey @jx1226, were you able to solve this at the customer? I am currently struggling with the same issues here.
- 3012 Views, 0 replies, 0 kudos
Installing R packages in a custom Docker container for compute
Hi, I'm trying to create a custom Docker image with some R packages pre-installed. However, when I try to use it in a notebook, it can't seem to find the installed packages. The build runs fine.
FROM databricksruntime/rbase:14.3-LTS
## update system li...
- 1128 Views, 0 replies, 0 kudos
DLT CDC/SCD - Taking the latest ID per day
Hi, I'm creating a DLT pipeline which uses DLT CDC to implement SCD Type 1, taking the latest record using a datetime column, which works with no issues:
@dlt.view
def users():
    return spark.readStream.table("source_table")
dlt.create_streaming_table(...
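For readers unfamiliar with the pattern, a minimal sketch of the SCD Type 1 flow the question describes is shown below; the key column ("user_id") and sequence column ("updated_at") are assumptions for illustration.

```python
# Sketch of an SCD Type 1 flow with dlt.apply_changes, sequenced by a datetime
# column. Source table, key column and sequence column names are hypothetical.
import dlt
from pyspark.sql.functions import col

@dlt.view
def users():
    return spark.readStream.table("source_table")

dlt.create_streaming_table("users_scd1")

dlt.apply_changes(
    target="users_scd1",
    source="users",
    keys=["user_id"],
    sequence_by=col("updated_at"),   # the latest record per key wins
    stored_as_scd_type=1,
)
```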
- 2200 Views, 0 replies, 0 kudos
What happened to the JobIds in the parallel runs (again)????
Hey Databricks, why did you take away the JobIds from the parallel runs? We use those to identify which output goes with which run. Please put them back. Benedetta
- 2266 Views, 3 replies, 0 kudos
Error ingesting zip files: ExecutorLostFailure Reason: Command exited with code 50
Hi, we are trying to ingest zip files into the Azure Databricks delta lake using the COPY INTO command. There are 100+ zip files with an average size of ~300 MB each. Cluster configuration: 1 driver: 56 GB, 16 cores; 2-8 workers: 32 GB, 8 cores (each). Autoscaling enab...
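For context, the COPY INTO pattern for ingesting files as binary data generally looks like the sketch below; the target table and source path are hypothetical.

```python
# Sketch of ingesting archives as binary files with COPY INTO.
# Target table name and storage path are placeholders.
spark.sql("CREATE TABLE IF NOT EXISTS catalog.raw.zip_files")  # schema filled in by the first COPY INTO

spark.sql("""
    COPY INTO catalog.raw.zip_files
    FROM 'abfss://landing@storageaccount.dfs.core.windows.net/zips/'
    FILEFORMAT = BINARYFILE
    COPY_OPTIONS ('mergeSchema' = 'true')
""")
```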
Although we were able to copy the zip files onto the DB volume, we were not able to share them with any system outside of the Databricks environment. I guess Delta Sharing does not support sharing files that are on UC volumes.
- 1647 Views, 0 replies, 0 kudos
Not able to access data registered in Unity Catalog using Simba ODBC driver
Hi folks, I'm working on a project with Databricks using Unity Catalog and a connection to SSIS (SQL Server Integration Services). My team is trying to access data registered in Unity Catalog using the Simba ODBC driver version 2.8.0.1002. They mentioned ...
Connect with Databricks Users in Your Area
Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.
If there isn’t a group near you, start one and help create a community that brings people together.
Request a New Group
Labels: AI Summit (4), Azure (2), Azure databricks (2), Bi (1), Certification (1), Certification Voucher (2), Community (7), Community Edition (3), Community Members (1), Community Social (1), Contest (1), Data + AI Summit (1), Data Engineering (1), Databricks Certification (1), Databricks Cluster (1), Databricks Community (8), Databricks community edition (3), Databricks Community Rewards Store (3), Databricks Lakehouse Platform (5), Databricks notebook (1), Databricks Office Hours (1), Databricks Runtime (1), Databricks SQL (4), Databricks-connect (1), DBFS (1), Dear Community (1), Delta (9), Delta Live Tables (1), Documentation (1), Exam (1), Featured Member Interview (1), HIPAA (1), Integration (1), LLM (1), Machine Learning (1), Notebook (1), Onboarding Trainings (1), Python (2), Rest API (10), Rewards Store (2), Serverless (1), Social Group (1), Spark (1), SQL (8), Summit22 (1), Summit23 (5), Training (1), Unity Catalog (3), Version (1), VOUCHER (1), WAVICLE (1), Weekly Release Notes (2), weeklyreleasenotesrecap (2), Workspace (1)