- 1233 Views
- 3 replies
- 0 kudos
Can a Delta table be the source for streaming/Auto Loader?
Hi, since Auto Loader only accepts "append-only" data as the source, I am wondering if a Delta table can also be the source. Will VACUUM (deleting stale files) or _delta_log (creating nested files in a format other than Parquet) break A...
- 0 kudos
Hi @QPeiran, Auto Loader is a feature that lets you ingest files into the Data Platform. Once your data is stored in a Delta table, you can rely on spark.readStream.table("<my_table_name>") to continuously read from the table. Take a look at ...
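A minimal sketch of that pattern, assuming a hypothetical append-only Delta table bronze.events and a scratch checkpoint path; if the source table also receives updates or deletes, Delta streaming options such as skipChangeCommits may be needed:

```python
# Minimal sketch: stream from a Delta table (names and paths are hypothetical).
# The stream is driven by the Delta transaction log, so the nested _delta_log
# directory is metadata for the reader, not data to ingest.
query = (
    spark.readStream
         .table("bronze.events")                                   # Delta table as streaming source
         .writeStream
         .option("checkpointLocation", "/tmp/_checkpoints/bronze_events")
         .trigger(availableNow=True)                               # process what is available, then stop
         .toTable("silver.events")                                 # hypothetical target table
)
```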
- 534 Views
- 1 replies
- 0 kudos
Handling large volumes of streamed transactional data using DLT
We have a data stream from Event Hub with approximately 10 million rows per day (into one table); these records are insert-only (no updates). We are trying to find a solution to aggregate/group the data by multiple data points, and our requ...
- 0 kudos
Hi, please find below a set of resources I believe are relevant for you. Success stories: you can find the success stories of companies leveraging streaming on Databricks here. Videos: Introduction to Data Streaming on the Lakehouse: Structured Stream...
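For the aggregation itself, here is a minimal Delta Live Tables sketch (source table, column names, and window size are hypothetical) of a windowed streaming group-by over an insert-only source:

```python
import dlt
from pyspark.sql import functions as F

# Hypothetical names: "raw_events" source table with event_time/event_type columns.
@dlt.table(comment="Hourly counts per event type over the insert-only stream")
def hourly_event_counts():
    return (
        dlt.read_stream("raw_events")                      # insert-only streaming source
           .withWatermark("event_time", "1 hour")          # bound the aggregation state
           .groupBy(F.window("event_time", "1 hour"), "event_type")
           .agg(F.count("*").alias("event_count"))
    )
```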
- 2154 Views
- 3 replies
- 1 kudos
Resolved! Rearrange tasks in a Databricks workflow
Hello, is there any way to rearrange tasks in a Databricks workflow? I would like the line that joins the two marked tasks not to pass behind the other tasks. Is it possible to route this line along one side? Thanks.
- 1 kudos
Hi @chemajar, take a look at Databricks Asset Bundles. They allow you to streamline the development of complex workflows using a YAML definition. If you need to change the task dependencies, you can rearrange the flow as you need; just change the ...
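A minimal bundle sketch (job, task, and notebook names are hypothetical) showing how task dependencies are declared; rearranging the flow amounts to editing the depends_on entries:

```yaml
# Hypothetical snippet of a Databricks Asset Bundles job definition.
resources:
  jobs:
    my_workflow:
      name: my_workflow
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest
        - task_key: transform
          depends_on:
            - task_key: ingest            # runs after ingest
          notebook_task:
            notebook_path: ./notebooks/transform
        - task_key: publish
          depends_on:
            - task_key: transform         # runs after transform
          notebook_task:
            notebook_path: ./notebooks/publish
```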
- 1271 Views
- 3 replies
- 0 kudos
Do we pay just for query run duration when using Databricks Serverless SQL?
When using Databricks Serverless SQL to run queries, do we only pay for the compute resources during the run duration of the query?
- 0 kudos
Hi @RahulChaubey , When using Databricks Serverless SQL, the pricing model is designed to be pay-as-you-go and is based on Databricks Units (DBUs). Let me break it down for you: Serverless SQL allows you to run SQL queries for BI reporting,...
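As a purely illustrative back-of-the-envelope calculation (all rates below are hypothetical placeholders, not actual Databricks prices; check the pricing page for your cloud and region), cost scales with the time the serverless warehouse spends serving your queries:

```python
# Hypothetical numbers only, to show the shape of the DBU-based calculation.
dbu_per_hour = 12          # assumed DBU consumption rate of a small serverless warehouse
price_per_dbu = 0.70       # assumed $/DBU for your cloud/region
active_hours = 10 / 60     # 10 minutes of query execution (plus any auto-stop idle window)

estimated_cost = dbu_per_hour * active_hours * price_per_dbu
print(f"Estimated cost: ${estimated_cost:.2f}")   # about $1.40 in this example
```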
- 554 Views
- 1 replies
- 0 kudos
- 0 kudos
Hi, can you clarify what your aim is? Maybe there is no need to use the DB SDK at all?
- 1263 Views
- 3 replies
- 0 kudos
Unity Catalog view access in Azure Storage account
Hi, I have my Unity Catalog in an Azure Storage account, and I am able to access table objects, but I couldn't find the views that were created on top of those tables. 1. I can access Delta tables & related views via Databricks SQL and also find the tab...
- 0 kudos
Hi, a couple of options are possible: use Databricks to do the complex SQL queries (joins, unions, etc.) and write to a staging Delta table, then use DataFlow to read from that staged table. Orchestrate all of this using ADF or even Databricks Workflo...
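A minimal sketch of that first option (catalog, schema, query, and table names are hypothetical): materialize the complex join in Databricks and land it in a staging Delta table for the downstream tool to read:

```python
# Hypothetical example: run the complex SQL in Databricks, then write the
# result to a staging Delta table that ADF/DataFlow (or any reader) can pick up.
staged_df = spark.sql("""
    SELECT o.order_id, o.order_date, c.customer_name, SUM(o.amount) AS total_amount
    FROM   sales.orders o
    JOIN   sales.customers c ON c.customer_id = o.customer_id
    GROUP  BY o.order_id, o.order_date, c.customer_name
""")

(staged_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("staging.order_summary"))    # hypothetical staging table
```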
- 1454 Views
- 2 replies
- 1 kudos
PowerBI Tips
Does anyone have any tips for using PowerBI on top of Databricks? Any best practices you know of, or roadblocks you have run into that should be avoided? Thanks.
- 1 kudos
Hey, use Partner Connect to establish a connection to PBI. Consider using Databricks SQL Serverless warehouses for the best user experience and performance (see Intelligent Workload Management, aka auto-scaling and query queuing, remote result cache, ...
- 3254 Views
- 4 replies
- 0 kudos
Concurrent Update to Delta - Throws error
Team, I get a "ConcurrentAppendException: Files were added to the root of the table by a concurrent update" when trying to update a table that executes via jobs with a ForEach activity in ADF. I tried Databricks Runtime 14.x and set the delete vect...
- 0 kudos
Hey, this issue happens whenever two or more jobs try to write to the same partition of a table. This exception is often thrown during concurrent DELETE, UPDATE, or MERGE operations. While the concurrent operations may be physically updating differe...
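One common mitigation (table name, partition column, and predicate below are hypothetical) is to partition the table and make each concurrent job's condition explicit about the partition it touches, so the conflicting operations become disjoint:

```python
# Hypothetical sketch: each concurrent job (e.g. one ADF ForEach iteration)
# updates only its own partition, with the partition column in the predicate.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "ops.transactions")    # hypothetical partitioned table

target.update(
    condition="region = 'EU' AND status = 'pending'",     # partition column narrows the conflict scope
    set={"status": "'processed'"},
)
```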
- 728 Views
- 1 replies
- 0 kudos
Connecting to a Databricks SQL warehouse from .NET
Hi, how can I connect to a Databricks SQL warehouse from a .NET application? Kr
- 0 kudos
Hey, please take a look at the Statement Execution API. Best,
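The Statement Execution API is a plain REST endpoint, so any .NET HTTP client (e.g. HttpClient) can call it; the sketch below uses Python only to illustrate the request shape, and the host, token, and warehouse ID are placeholders:

```python
# Illustrative only: submit a SQL statement to the Statement Execution API.
import requests

host = "https://<workspace-host>"            # placeholder
token = "<personal-access-token>"            # placeholder

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "<warehouse-id>",    # placeholder
        "statement": "SELECT 1 AS ok",
        "wait_timeout": "30s",
    },
)
print(resp.json())   # statement_id, status, and the result once it is ready
```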
- 714 Views
- 1 replies
- 1 kudos
Resolved! Can we get SQL Serverless warehouses monitoring data using APIs or logs?
I am looking for a possible way to get the autoscaling history data for SQL Serverless warehouses using APIs or logs. I want something like what we see in the monitoring UI.
- 1 kudos
Hi Rahul, you need to perform two actions: enable the system tables schema named "compute" (how-to: take a look at the page; it's quite possible you'll find other schemas useful too), then explore the system.compute.warehouse_events table. Hope this helps. B...
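A minimal query sketch against that system table (column names as documented at the time of writing; adjust to what your workspace exposes):

```python
# Scaling/start/stop history for SQL warehouses over the last 7 days.
events = spark.sql("""
    SELECT warehouse_id, event_type, cluster_count, event_time
    FROM   system.compute.warehouse_events
    WHERE  event_time >= date_sub(current_date(), 7)
    ORDER  BY event_time DESC
""")
display(events)
```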
- 1400 Views
- 1 replies
- 3 kudos
Regional Group Request for Istanbul
Hello, I kindly request the formation of a regional group for Istanbul/Turkey. I would appreciate your assistance in this matter. Thank you, Can
- 3 kudos
@kankotan Happy to help set it up for you. I have dropped an email for more information!
- 811 Views
- 2 replies
- 0 kudos
Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on clas
I'm executing a notebook and it failed with this error. Sometimes, when I execute some function in Spark, it also fails with the error 'this class is not whitelisted'. Could anyone help me check on this? Thanks for your help!
- 0 kudos
Thanks for your feedback. Actually, my cluster was a shared cluster; I changed to a single-user cluster and then I could run that method.
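For reference, the call in question; on shared access mode clusters some internal Catalog methods like this are blocked by the allowlist, while a single-user (assigned) cluster runs it fine:

```python
# Clears the Spark SQL cache; raises the "not whitelisted" error on shared
# access mode clusters, works on single-user (assigned) access mode.
spark.catalog.clearCache()
```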
- 3461 Views
- 2 replies
- 0 kudos
How to access storage with private endpoint
We know that Databricks with VNET injection (our own VNET) allows us to connect to Blob Storage/ADLS Gen2 over private endpoints and peering. This is what we typically do. We have a client who created Databricks with EnableNoPublicIP=No (secure clust...
- 0 kudos
Hey @jx1226, were you able to solve this at the customer? I am currently struggling with the same issues here.
- 1561 Views
- 4 replies
- 0 kudos
Error ingesting zip files: ExecutorLostFailure Reason: Command exited with code 50
Hi, we are trying to ingest zip files into the Azure Databricks delta lake using the COPY INTO command. There are 100+ zip files with an average size of ~300 MB each. Cluster configuration: 1 driver: 56 GB, 16 cores; 2-8 workers: 32 GB, 8 cores (each). Autoscaling enab...
- 0 kudos
Although we were able to copy the zip files onto the Databricks volume, we were not able to share them with any system outside of the Databricks environment. I guess Delta Sharing does not support sharing files that are on UC volumes.
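For context, a minimal sketch of the copy step itself (storage account, container, and volume path are hypothetical); since the zip contents are not directly queryable, the raw files end up being landed on a Unity Catalog volume first:

```python
# Hypothetical paths: land the raw zip files on a Unity Catalog volume.
dbutils.fs.cp(
    "abfss://landing@mystorageacct.dfs.core.windows.net/zips/",   # hypothetical source container
    "/Volumes/main/raw/zip_files/",                               # hypothetical UC volume path
    recurse=True,
)
```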
- 1596 Views
- 1 replies
- 0 kudos
The inference table doesn't get updated
I set up a model serving endpoint and created a monitoring dashboard to monitor its performance. The problem is that my inference table doesn't get updated by the model serving endpoint. To test the endpoint I use the following code: import random, import time ...
- 0 kudos
Hi @MohsenJ, The log shows several reconfiguration errors related to the logger configuration. These errors are likely due to missing or incorrect configuration settings. Here are some steps to troubleshoot: Check Log Configuration: Verify that the...
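For generating test traffic that should eventually appear in the inference (payload) table, a hedged sketch along these lines can help; the host, token, endpoint name, and payload schema are placeholders, and payload logging into the inference table is not instantaneous:

```python
# Placeholder host/token/endpoint: send a few scoring requests so the
# inference table has requests to log.
import random
import time

import requests

host = "https://<workspace-host>"        # placeholder
token = "<personal-access-token>"        # placeholder
endpoint = "my-model-endpoint"           # placeholder endpoint name

for _ in range(5):
    payload = {"dataframe_records": [{"feature_1": random.random(), "feature_2": random.random()}]}
    resp = requests.post(
        f"{host}/serving-endpoints/{endpoint}/invocations",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    print(resp.status_code)
    time.sleep(1)
```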