- 3089 Views
- 5 replies
- 5 kudos
Problem sharing a streaming table created in Delta Live Table via Delta Sharing
Hi all, I hope you can help me figure out what I am missing. I'm trying to do a simple thing: read the data from the data ingestion zone (CSV files saved to an Azure Storage Account) using a Delta Live Tables pipeline and share the resulting tab...
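A minimal sketch of the kind of streaming table the question describes, assuming Auto Loader over the CSV landing zone (the storage path and table name are illustrative, not from the post); the next step would then be adding this table to a Delta Sharing share:

```python
import dlt

# A sketch of ingesting landing-zone CSV files into a DLT streaming table with
# Auto Loader. Path and names are hypothetical placeholders.
@dlt.table(name="ingested_events", comment="Streaming table built from landing-zone CSV files")
def ingested_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("header", "true")
        .load("abfss://landing@examplestorage.dfs.core.windows.net/ingest/")  # hypothetical path
    )
```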
- 5 kudos
I'm curious if Databricks plans to address this. We use Delta Live streaming tables extensively and also planned on using Delta Sharing to get our data from our production Unity Catalog (in a different region). Duplicating the data as a workaround is no...
- 2035 Views
- 1 replies
- 0 kudos
How to create Delta live tables in Silver layer
Hi DB Experts, I have some basic questions. I am working on the Medallion Architecture (Bronze, Silver, Gold) layers. In Bronze I am getting Delta files (Parquet format) with log folders. One folder per table; multiple files are ge...
- 0 kudos
Dear Kaniz, thank you for addressing the question. I am getting the following error if I follow the above: pyspark.errors.exceptions.captured.IllegalArgumentException: Reading from a Delta table is not supported with this syntax. If you would like to consume data...
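That IllegalArgumentException typically means the reader being used (for example Auto Loader/cloudFiles) was pointed at a path that already holds a Delta table; Delta sources are read with the delta format or by table name instead. A hedged sketch of a Silver table reading the Bronze Delta output directly (path and names are illustrative, not from the thread):

```python
import dlt
from pyspark.sql import functions as F

# A sketch, assuming the Bronze output is itself a Delta table: read it as a
# streaming Delta source rather than through cloudFiles. Path and names are
# hypothetical.
@dlt.table(name="silver_orders")
def silver_orders():
    bronze = spark.readStream.format("delta").load(
        "abfss://bronze@examplestorage.dfs.core.windows.net/orders"  # hypothetical path
    )
    return bronze.withColumn("processed_at", F.current_timestamp())
```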
- 2138 Views
- 1 replies
- 0 kudos
I was charged by a free trial
Hello Databricks community, I took a Databricks course to prepare for the certification exam and requested a 14-day free trial on February 13 at 4:51 PM. So February 27 at 4:51 PM should have been the end of the free trial, but it ended one day before. Additional...
- 7363 Views
- 7 replies
- 24 kudos
Big news: Our Community is now 100,000 members strong with over 50,000 posts🚀
Thanks to every one of you, the Databricks Community has reached an incredible milestone: 100,000 members and over 50,000 posts! Your dedication, expertise and passion have made this possible. Whether you're a seasoned data professional, a coding en...
- 3259 Views
- 1 replies
- 0 kudos
Error Spark reading CSV from DBFS MNT: incompatible format detected
I am trying to follow along with a training course, but I consistently run into an error loading a CSV with Spark from DBFS. Specifically, I keep getting an "Incompatible format detected" error. Has anyone else encountered this and found a soluti...
- 0 kudos
Well, your error message is telling you that Spark is encountering a Delta table conflict while trying to read a CSV file. The file path dbfs:/mnt/dbacademy... points to a CSV file. This is where the fun begins: Spark detects a Delta transaction log d...
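A hedged sketch of the two usual ways around that check; the paths below are placeholders, not the truncated dbfs:/mnt/dbacademy... path from the post:

```python
# Option 1: if the directory really is a Delta table (it has a _delta_log),
# read it as Delta instead of CSV.
df_delta = spark.read.format("delta").load("dbfs:/mnt/training/example_dataset")  # hypothetical path

# Option 2: if the CSV genuinely lives elsewhere, point the reader at a path
# that is not inside a Delta table directory.
df_csv = (
    spark.read.format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load("dbfs:/mnt/training/raw/example.csv")  # hypothetical path
)
```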
- 4839 Views
- 13 replies
- 0 kudos
Resolved! Want to split JSON data into multiple rows
Hi, this is my sample JSON data, which is generated from an API response and all comes in a single row. I want to split it into multiple rows and store it in a DataFrame. [{"transaction_id":"F6001EC5-528196D1","corrects_transaction_id":null,"transac...
- 0 kudos
Yes indeed, it was a datatype issue. After changing it to LongType in the schema definition, it is working now. Thanks once again for all your inputs and time. Much appreciated!
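For anyone landing here later, a minimal sketch of the pattern discussed: parse the single-row JSON array with an explicit schema and explode it into rows. Only transaction_id and corrects_transaction_id come from the sample; the numeric field that needed LongType is an assumption.

```python
from pyspark.sql import functions as F
from pyspark.sql.types import ArrayType, LongType, StringType, StructField, StructType

# Schema for one element of the JSON array; "amount_cents" stands in for the
# numeric field that only worked once it was declared as LongType.
element_schema = StructType([
    StructField("transaction_id", StringType()),
    StructField("corrects_transaction_id", StringType()),
    StructField("amount_cents", LongType()),  # hypothetical field
])

raw = spark.createDataFrame(
    [('[{"transaction_id":"F6001EC5-528196D1","corrects_transaction_id":null,'
      '"amount_cents":1234567890123}]',)],
    ["payload"],
)

rows = (
    raw.withColumn("parsed", F.from_json("payload", ArrayType(element_schema)))
       .withColumn("record", F.explode("parsed"))   # one output row per array element
       .select("record.*")
)
rows.show(truncate=False)
```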
- 1198 Views
- 2 replies
- 0 kudos
Bug with show tblproperties directly returning redacted in the result set where "userid" in value
When I use show tblproperties on a view/table to see the metadata, it will redact any value that has "userid" anywhere in it. And it is not just through the visual interface; when I query it through Python directly, it contains the redacted va...
- 0 kudos
I understand that yours is a view. In my case it's a table, so I could use `desc detail <schema_name>.<table_name>` to get the table properties that are not redacted, in the `properties` column of the `desc detail` output.
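A minimal sketch of that workaround run from Python; the table name is a placeholder:

```python
# DESCRIBE DETAIL returns table metadata, including a `properties` map that,
# per the reply above, is not subject to the SHOW TBLPROPERTIES redaction.
detail = spark.sql("DESCRIBE DETAIL my_schema.my_table")  # hypothetical name
props = detail.select("properties").first()["properties"]
print(props)
```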
- 2680 Views
- 4 replies
- 0 kudos
My exam got suspended ; Need help immediately (10/09/2023)
Hello Team, I had a pathetic experience while attempting my Databricks Data Engineer certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple times, wasted my time, and then suspended my exam. I want to file ...
- 0 kudos
Hello @sirishavemula20, it's general practice for a proctor to ask the test taker to pan the room (as part of security measures), and it's the responsibility of the test taker to make sure the surroundings are clear of any other objects whilst attempt...
- 3922 Views
- 8 replies
- 6 kudos
Error at model serving for quantised models using bitsandbytes library
Hello, I've been trying to serve registered MLflow models on a GPU Model Serving endpoint, which works except for models using the bitsandbytes library. The library is used to quantise LLM models to 4-bit/8-bit (e.g. Mistral-7B); however, it runs...
- 6 kudos
@phi_alpaca We solved it by providing a conda_env.yaml when we log the model; all we needed was to add cudatoolkit=11.8 to the dependencies.
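A hedged sketch of logging a model with an explicit conda environment that pins cudatoolkit, as described in the reply; the wrapper class and package versions are illustrative, not the actual model from the thread:

```python
import mlflow
import mlflow.pyfunc


class QuantisedModelWrapper(mlflow.pyfunc.PythonModel):
    """Placeholder pyfunc wrapper standing in for the real bitsandbytes model."""

    def predict(self, context, model_input):
        return model_input


# Conda environment with the CUDA runtime pinned, per the reply above.
# Package list and versions are assumptions.
conda_env = {
    "name": "bnb-serving-env",
    "channels": ["conda-forge"],
    "dependencies": [
        "python=3.10",
        "cudatoolkit=11.8",  # the dependency that fixed GPU serving
        {"pip": ["mlflow", "torch", "transformers", "accelerate", "bitsandbytes"]},
    ],
}

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="model",
        python_model=QuantisedModelWrapper(),
        conda_env=conda_env,
    )
```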
- 1804 Views
- 2 replies
- 0 kudos
Databricks Job cost(AWS)
Hi Databricks Community, I am looking for a formula/way to calculate the estimated cost of a job run, for which I have a few questions: 1. Is there any formula to calculate the cost of a job, e.g. [(EC2 per-hour cost) * (total time the job ran)], and when...
- 0 kudos
This looks a little bit confusing to me; I'm looking for a more straightforward answer, more like a simple formula. Thanks though for your reply.
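A rough sketch of the kind of formula being asked for: the AWS side (instance-hours times the EC2 rate) plus the Databricks side (DBUs consumed times the DBU rate). All rates below are placeholders; the real numbers depend on instance type, region, and the compute tier (e.g. Jobs Compute).

```python
def estimate_job_cost(runtime_hours: float,
                      num_instances: int,
                      ec2_rate_per_hour: float,            # AWS charge per instance-hour
                      dbus_per_hour_per_instance: float,   # DBU rating of the instance type
                      dbu_rate: float) -> float:           # Databricks charge per DBU
    """Rough estimate: AWS instance cost plus Databricks DBU cost."""
    aws_cost = runtime_hours * num_instances * ec2_rate_per_hour
    dbu_cost = runtime_hours * num_instances * dbus_per_hour_per_instance * dbu_rate
    return aws_cost + dbu_cost


# Example with made-up numbers: a 2-hour job on 4 nodes.
print(estimate_job_cost(2.0, 4, 0.40, 1.5, 0.15))
```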
- 1117 Views
- 0 replies
- 0 kudos
sparklyr::spark_read_csv forbidden 403 error
Hi, I am trying to read a CSV file into a Spark DataFrame using sparklyr::spark_read_csv, and I am receiving a 403 access denied error. I have stored my AWS credentials as environment variables and can successfully read the file as an R dataframe using ar...
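No replies yet, but one thing worth checking in this setup is whether the credentials actually reach the Spark/Hadoop S3A layer that sparklyr calls into, rather than only the R process. A hedged PySpark-flavoured sketch of the equivalent session-level settings (the same fs.s3a.* keys apply to the session sparklyr connects to); the secret scope and bucket path are placeholders:

```python
# A sketch of passing S3A credentials to the Spark session explicitly,
# assuming key-based access. Scope, key names, and path are hypothetical.
access_key = dbutils.secrets.get(scope="aws", key="access_key")
secret_key = dbutils.secrets.get(scope="aws", key="secret_key")

spark.conf.set("fs.s3a.access.key", access_key)
spark.conf.set("fs.s3a.secret.key", secret_key)

df = spark.read.option("header", "true").csv("s3a://my-bucket/path/file.csv")  # hypothetical path
```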
- 980 Views
- 0 replies
- 0 kudos
Left Outer Join returns an Inner Join in Delta Live Tables
In our Delta Live Tables pipeline I am simply joining two streaming tables into a new streaming table. We use the following code: @dlt.create_table() def fact_event_faults(): events = dlt.read_stream('event_list').withWatermark('TimeStamp', '4 hours'...
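No replies yet, but for stream-stream joins in Structured Streaming a left outer join only emits the unmatched (NULL-padded) rows when both inputs carry a watermark and the join condition bounds the two event-time columns; otherwise the output can look like an inner join. A hedged sketch along the lines of the code in the post (the second table, its columns, and the time bounds are assumptions):

```python
import dlt
from pyspark.sql import functions as F

# A sketch of a stream-stream LEFT OUTER join in DLT with watermarks on both
# sides and an event-time range in the join condition. Names beyond
# 'event_list', 'TimeStamp', and the 4-hour watermark are hypothetical.
@dlt.table()
def fact_event_faults():
    events = (
        dlt.read_stream("event_list")
        .withWatermark("TimeStamp", "4 hours")
        .alias("events")
    )
    faults = (
        dlt.read_stream("fault_list")            # hypothetical second table
        .withWatermark("FaultTime", "4 hours")
        .alias("faults")
    )
    return events.join(
        faults,
        F.expr("""
            events.EventId = faults.EventId AND
            faults.FaultTime BETWEEN events.TimeStamp - INTERVAL 1 HOUR
                                 AND events.TimeStamp + INTERVAL 1 HOUR
        """),
        "left_outer",
    )
```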
- 3890 Views
- 0 replies
- 0 kudos
[Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server.
I am using the Databricks JDBC driver to run a certain app. It runs fine for a few minutes to hours, and then I get the error: [Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 502,...
- 838 Views
- 0 replies
- 0 kudos
Integrate databrick cluster with aws using cloudformation
Hello community, I am trying to launch a Databricks cluster using an AWS CloudFormation template. I have checked the public extension option in CloudFormation, and the Databricks cluster extension is also available as a third-party extension; however, I'm not getting how t...
- 1657 Views
- 1 replies
- 2 kudos
Hide widgets logic
Hello, we have recently created a notebook to allow users to insert/update values in specific tables. The logic behind the update statements is included in a separate notebook to which users don't have access. However, we would like to know if ...
- 2 kudos
When you want users to perform some write action (for example, change parameters, etc.), it is usually easiest to build a small app in Azure PowerApps, save those values, and extract them to the table in Delta Lake (so your notebooks will take values...
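A hedged sketch of the "save those values and pick them up from a Delta table" part of this suggestion, so the user-facing layer never needs to contain (or read) the update logic; the table and column names are made up:

```python
from pyspark.sql import functions as F

# The user-facing side only records the requested change in a parameter table;
# a restricted notebook or job (run by someone with access) reads this table
# and applies the actual updates. Names here are illustrative.
requested = (
    spark.createDataFrame(
        [("customers", "region", "EMEA")],
        ["target_table", "column_name", "new_value"],
    )
    .withColumn("requested_at", F.current_timestamp())
)

requested.write.mode("append").saveAsTable("ops.pending_updates")  # hypothetical table

# The hidden notebook would then do something like:
# pending = spark.read.table("ops.pending_updates")
# ...apply the updates and mark the rows as processed...
```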