- 1260 Views
- 1 replies
- 0 kudos
DLT: Only STREAMING tables can have multiple queries.
I am trying to do a one-time back-fill on a DLT table following the example here: @dlt.table() def test(): # providing a starting version return (spark.readStream.format("delta") .option("readChangeFeed", "true") .option("...
I should also add that when I drop the `backfill` function, validation happens successfully and we get a valid pipeline DAG.
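The error in the title is raised because only streaming tables (created with dlt.create_streaming_table) can be the target of more than one query. A minimal sketch of that pattern, with illustrative table and source names: an ongoing flow plus a one-time back-fill flow that is dropped once it has completed:

```python
import dlt

# Streaming target table that may receive multiple append flows; names are illustrative.
dlt.create_streaming_table("events")

@dlt.append_flow(target="events")
def ongoing():
    # Ongoing ingestion from a live source table.
    return spark.readStream.table("catalog.schema.live_source")

@dlt.append_flow(target="events")
def backfill():
    # One-time back-fill from the Change Data Feed of a historical Delta table.
    # Remove this flow from the pipeline once the back-fill has completed.
    return (
        spark.readStream.format("delta")
        .option("readChangeFeed", "true")
        .option("startingVersion", 0)  # assumed starting version
        .table("catalog.schema.history_source")
    )
```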
- 12371 Views
- 1 replies
- 1 kudos
Introducing AI Model Sharing with Databricks!
Today, we're excited to announce that AI model sharing is available in both Databricks Delta Sharing and on the Databricks Marketplace. With Delta Sharing you can now easily share and serve AI models securely within your organization or externally ac...
I'm eager to dive in and leverage these new features to elevate my AI game with Databricks. This is Johnson from KBS Technologies. Thanks for your update.
- 9382 Views
- 2 replies
- 0 kudos
Resolved! Show Existing Header From CSV In External Table
Hello, is there a way to load csv data into an external table without the _c0, _c1 columns showing?
My question was answered in a separate thread here.
- 2236 Views
- 3 replies
- 0 kudos
Resolved! Unable to load CSV data with correct header values in external tables
Hello, is there a way to load CSV data into an external table without the _c0, _c1 columns showing? I've tried using the options within the SQL statement, but that does not appear to work and results in the same table.
You need to set "USING data_source" in the CREATE TABLE statement. See: https://community.databricks.com/t5/data-engineering/create-external-table-using-multiple-paths-locations/td-p/44042
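A minimal sketch of that suggestion, with an illustrative table name and storage location; the header option makes the first CSV row supply the column names instead of _c0, _c1:

```python
# Illustrative names and path; run where `spark` is available (e.g. a notebook).
spark.sql("""
    CREATE TABLE IF NOT EXISTS my_catalog.my_schema.my_csv_table
    USING CSV
    OPTIONS (header "true", inferSchema "true")
    LOCATION 's3://my-bucket/path/to/csv/'
""")
```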
- 1564 Views
- 1 replies
- 1 kudos
Resolved! New Regional Group Request
Hello! How may I request and/or create a new Regional Group for the DMV Area (DC, Maryland, Virginia)? Thank you, Anton @DB_Paul @Sujitha
- 2369 Views
- 2 replies
- 0 kudos
Python Logging can't save log in DBFS
Hi! I am trying to integrate logging into my project. I got the library and logs to work, but I can't log the file into DBFS directly. Have any of you been able to save and append the log file directly to DBFS? From what I came across online, the best way to...
You can use azure_storage_logging. See: "Set Python Logging to Azure Blob, but Can not Find Log File there" on Stack Overflow.
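Another common workaround is to write the log to local driver storage and copy it to DBFS at the end of the run, since the /dbfs FUSE mount does not support the appends that logging.FileHandler performs. A minimal sketch with illustrative paths:

```python
import logging
import os
import shutil

local_log = "/tmp/my_job.log"  # local driver disk supports appends
dbfs_log = "/dbfs/FileStore/logs/my_job.log"  # illustrative DBFS destination

logging.basicConfig(
    filename=local_log,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
logging.info("job started")
# ... run the job, logging as you go ...
logging.shutdown()  # flush handlers before copying

os.makedirs("/dbfs/FileStore/logs", exist_ok=True)
shutil.copy(local_log, dbfs_log)  # one-shot copy; re-copy to "append" updates
```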
- 3755 Views
- 2 replies
- 1 kudos
Databricks notebook: how to stop truncating numbers when exporting query results to CSV
I use a Databricks notebook to query databases and export/download results to CSV. A pop-up window appeared asking whether to truncate the numbers, and I accidentally chose "yes" and "don't ask again". Now all my long-digit numbers are trunca...
@Jasonh202222 - Kindly check the navigation path below: User Settings -> Account Settings -> Display -> Download and Export. Under Download and Export, enable the checkbox "Prompt for formatting large numbers when downloading or exporting" and cl...
- 1818 Views
- 0 replies
- 0 kudos
Using streaming data received from a Pub/Sub topic
I have a notebook in Databricks in which I am streaming a Pub/Sub topic. The code for this looks like the following: %pip install --upgrade google-cloud-pubsub[pandas] from pyspark.sql import SparkSession authOptions = {"clientId": "123", "clientEmail"...
- 1978 Views
- 2 replies
- 0 kudos
Resolved! Using Python RPA Library on Databricks
Hi, I didn't see any conversations regarding using the Python RPA package on Databricks clusters. Is anyone doing this, or has anyone gotten it to work successfully on the clusters? I ran into the following errors: 1) Initially I was getting the error below rega...
If you want to capture a browser screenshot, you can use Playwright:
%sh
pip install playwright
playwright install
sudo apt-get update
playwright install-deps
from playwright.async_api import async_playwright
async with async_playwright() as p: ...
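A more complete sketch of that Playwright approach (illustrative URL and output path; assumes the %sh dependency steps above have run):

```python
from playwright.async_api import async_playwright

async def capture(url: str, out_path: str) -> None:
    async with async_playwright() as p:
        browser = await p.chromium.launch(headless=True)  # no display on a cluster
        page = await browser.new_page()
        await page.goto(url)
        await page.screenshot(path=out_path, full_page=True)
        await browser.close()

# Notebook cells generally allow top-level await:
# await capture("https://example.com", "/tmp/screenshot.png")
```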
- 1481 Views
- 1 replies
- 0 kudos
How to create a volume using Databricks CLI commands
I am new to using volumes on Databricks. Is there a way to create a volume using CLI commands? On a similar note, is there a way to create DBFS directories and subdirectories using a single command? For example: I want to copy a file here: dbfs:/FileStore/T...
Creates a new volume. The user could create either an external volume or a managed volume. An external volume will be created in the specified external location, while a managed volume will be located in the default location, which is specified by the...
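A minimal sketch of the matching CLI calls (Databricks CLI v0.205+ assumed; catalog, schema, and paths are illustrative), wrapped in Python so the commands are easy to see:

```python
import subprocess

def run(args):
    # Thin wrapper that raises if the CLI call fails.
    subprocess.run(args, check=True)

# Create a managed volume: positional args are catalog, schema, name, volume type.
run(["databricks", "volumes", "create", "main", "my_schema", "my_volume", "MANAGED"])

# Create nested DBFS directories in one command (parent dirs are created as needed).
run(["databricks", "fs", "mkdirs", "dbfs:/FileStore/Tables/nested/dir"])

# Copy a local file into the new directory.
run(["databricks", "fs", "cp", "local_file.csv", "dbfs:/FileStore/Tables/nested/dir/"])
```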
- 1284 Views
- 1 replies
- 0 kudos
Can I update a table comment using REST API?
https://docs.databricks.com/api/workspace/tables It seems I can only list/delete tables. Is there a way to update a table's metadata, like the comment or detail fields, via the REST API?
Hi @al2co33, we don't currently provide any APIs for updating table comments; however, you can use the SQL Statement Execution API to do it. You can use the following tutorial for ALTER TABLE/COLUMN COMMENT: https://learn.microsoft.com/en-us/azure...
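A minimal sketch of that approach, with illustrative host, token, warehouse ID, and table names:

```python
# Update a table comment via the SQL Statement Execution API
# (POST /api/2.0/sql/statements). All values below are illustrative.
import requests

host = "https://<workspace-host>"
token = "<personal-access-token>"

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "COMMENT ON TABLE main.my_schema.my_table IS 'Updated via API'",
    },
)
resp.raise_for_status()
print(resp.json()["status"])  # e.g. SUCCEEDED once the statement finishes
```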
- 3173 Views
- 1 replies
- 0 kudos
No space left on device and IllegalStateException: Have already allocated a maximum of 8192 pages
Hello, I'm writing to bring to your attention an issue that we have encountered while working with Databricks, and to seek your assistance in resolving it. Context of the error: when a SQL query (1,700 lines) is run, the corresponding Databricks job is faili...
Are you processing Parquet files, or what is the format of your tables? Can you split your SQL query instead of having a huge query with 1,700 lines?
- 1756 Views
- 2 replies
- 2 kudos
Resolved! Regarding cloning my git repo under workspace/Users/user_name
Hi all, I recently started using Databricks. I want to clone my git repo under the workspace/Users/user_name path, which I haven't been able to do. By default, I can only clone under the Repos directory. Can anyone please advise me regarding this? Thank you.
- 2469 Views
- 4 replies
- 0 kudos
Connect my Spark code running in AWS ECS to a Databricks cluster
Hi team, I wanted to know if there is a way to connect a piece of my PySpark code running in ECS to a Databricks cluster and leverage the Databricks compute using Databricks Connect. I see Databricks Connect is for connecting local IDE code to Databrick...
Noted @Retired_mod @RonDeFreitas. I am currently using Databricks Runtime v12.2 (which is < v13.0). I followed this doc (Databricks Connect for Databricks Runtime 12.2 LTS and below), connected my local terminal to the Databricks cluster, and was able ...
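For Databricks Runtime 13.0 and above, the Spark Connect-based Databricks Connect client is not limited to a local IDE and can run from any remote environment, including an ECS task. A minimal sketch, with illustrative workspace host, token, and cluster ID:

```python
# Requires `pip install databricks-connect>=13.0`; all connection values are illustrative.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://<workspace-host>",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

# DataFrame operations below execute on the Databricks cluster, not in ECS.
df = spark.range(10)
print(df.count())
```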
- 634 Views
- 0 replies
- 0 kudos
Databricks & BigQuery
Databricks packages an old version of the BigQuery jar (Databricks also repackaged it and created a fat jar), and our application needs the latest jar. The latest jar depends on the spark-bigquery-connector.properties file for the property scala.binary.vers...