Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

Miasu
by New Contributor II
  • 1157 Views
  • 2 replies
  • 0 kudos

FileAlreadyExistsException error while analyzing table in Notebook

Databricks experts, I'm new to Databricks and ran into an issue with the ANALYZE TABLE command in a notebook. I created two tables, nyc_taxi and nyc_taxi2, from one CSV file. When executing the following command in the notebook, analyze table nyc_taxi2...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Miasu, To investigate and resolve the issue at hand, there are several steps that can be taken. Firstly, it is important to check for any existing resources that may already have the same name as "nyc_taxi2" in the given path, which is "/users/my...

1 More Replies
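For readers who hit the same error: a FileAlreadyExistsException during ANALYZE TABLE often indicates leftover files or another object at the table's storage location, rather than a problem with the command itself. For reference, the Spark SQL syntax looks like this (the column names in the second statement are made up for illustration):

```sql
-- Compute table-level statistics
ANALYZE TABLE nyc_taxi2 COMPUTE STATISTICS;

-- Optionally compute per-column statistics (column names are illustrative)
ANALYZE TABLE nyc_taxi2 COMPUTE STATISTICS FOR COLUMNS fare_amount, trip_distance;
```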
Akira
by New Contributor II
  • 717 Views
  • 3 replies
  • 0 kudos

"PutWithBucketOwnerFullControl" privilege missing for storage configuration

Hi. I've been unable to create workspaces manually for a while now. The error I get is "MALFORMED_REQUEST: Failed storage configuration validation checks: List,Put,PutWithBucketOwnerFullControl,Delete".  The storage configuration is on a bucket that ...

Latest Reply
Akira
New Contributor II
  • 0 kudos

> Yes, it does look like the bucket permissions are not properly set up, but ...To avoid potential misunderstanding: I mean yes, the error message does make it sound like the bucket permissions are wrong. I don't mean I found a problem with the ones ...

2 More Replies
coltonflowers
by New Contributor III
  • 576 Views
  • 1 reply
  • 0 kudos

DLT: Only STREAMING tables can have multiple queries.

I am trying to do a one-time back-fill on a DLT table following the example here: @dlt.table() def test(): # providing a starting version return (spark.readStream.format("delta") .option("readChangeFeed", "true") .option("...

Latest Reply
coltonflowers
New Contributor III
  • 0 kudos

I should also add that when I drop the `backfill` function, validation happens successfully and we get the following pipeline DAG:

Sujitha
by Community Manager
  • 8385 Views
  • 1 reply
  • 1 kudos

Introducing AI Model Sharing with Databricks!

Today, we're excited to announce that AI model sharing is available in both Databricks Delta Sharing and on the Databricks Marketplace. With Delta Sharing you can now easily share and serve AI models securely within your organization or externally ac...

Latest Reply
johnsonit
New Contributor II
  • 1 kudos

I'm eager to dive in and leverage these new features to elevate my AI game with Databricks. This is Johnson from KBS Technologies. Thanks for your update.

Frantz
by New Contributor III
  • 8090 Views
  • 4 replies
  • 0 kudos

Resolved! Show Existing Header From CSV In External Table

Hello, is there a way to load csv data into an external table without the _c0, _c1 columns showing?

Latest Reply
Frantz
New Contributor III
  • 0 kudos

My question was answered in a separate thread here.

3 More Replies
Frantz
by New Contributor III
  • 1189 Views
  • 3 replies
  • 0 kudos

Resolved! Unable to load csv data with correct header values in External tables

Hello, is there a way to load CSV data into an external table without the _c0, _c1 columns showing? I've tried using the options within the SQL statement, but that does not appear to work. Which results in this table

Community Discussions
External Tables
Unity Catalog
Latest Reply
feiyun0112
Contributor III
  • 0 kudos

You need to set "USING data_source". See: https://community.databricks.com/t5/data-engineering/create-external-table-using-multiple-paths-locations/td-p/44042

2 More Replies
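A sketch of what the accepted suggestion can look like in practice. The USING CSV clause plus the header option is what makes Spark keep the file's own column names instead of generating _c0, _c1; the catalog, schema, table name, and path below are placeholders:

```sql
CREATE TABLE my_catalog.my_schema.my_csv_table
USING CSV
OPTIONS (
  header = "true",
  inferSchema = "true"
)
LOCATION 's3://my-bucket/path/to/csv/';
```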
Anton_Lagergren
by Contributor
  • 911 Views
  • 2 replies
  • 3 kudos

Resolved! New Regional Group Request

Hello! How may I request and/or create a new Regional Group for the DMV Area (DC, Maryland, Virginia)? Thank you, —Anton @DB_Paul @Sujitha

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @Anton_Lagergren, Thank you for reaching out and expressing interest in starting a new Regional Group for the DMV Area (DC, Maryland, Virginia). This sounds like a fantastic initiative, and we're excited to help you get started! Please check out ...

1 More Replies
Kaizen
by Valued Contributor
  • 1035 Views
  • 2 replies
  • 0 kudos

Python Logging can't save log in DBFS

Hi! I am trying to integrate logging into my project. Got the library and logs to work, but can't log the file into DBFS directly. Have any of you been able to save and append the log file directly to DBFS? From what I came across online, the best way to...

Latest Reply
feiyun0112
Contributor III
  • 0 kudos

You can use azure_storage_logging. See: "Set Python Logging to Azure Blob, but Can not Find Log File there" on Stack Overflow.

1 More Replies
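A common pattern, independent of the azure_storage_logging suggestion above, is to write the log to the driver's local disk first and copy it to DBFS afterwards, since Python's FileHandler needs random-access writes that the DBFS FUSE mount historically did not support. A minimal local sketch; the dbutils.fs.cp step at the end is the Databricks-specific part and is shown as a comment:

```python
import logging
import os
import tempfile

# Write the log to local disk first; appending directly to DBFS can fail
# because the FUSE mount does not support random writes.
local_log = os.path.join(tempfile.gettempdir(), "job.log")

logger = logging.getLogger("my_job")
logger.setLevel(logging.INFO)
handler = logging.FileHandler(local_log, mode="a")
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.info("pipeline started")
handler.flush()

# On Databricks, copy the finished file to DBFS when the job ends:
# dbutils.fs.cp(f"file:{local_log}", "dbfs:/FileStore/logs/job.log")
```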
Jasonh202222
by New Contributor II
  • 1919 Views
  • 3 replies
  • 2 kudos

Databricks notebook how to stop truncating numbers when export the query result to csv

I use a Databricks notebook to query databases and export/download the result to CSV. I accidentally closed a pop-up window asking whether to truncate the numbers; I chose yes and "don't ask again". Now all my long-digit numbers are trunca...

Latest Reply
Kaniz_Fatma
Community Manager
  • 2 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

2 More Replies
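Beyond restoring the pop-up preference, a robust workaround is to cast long-digit columns to strings before exporting, so neither the download dialog nor spreadsheet software mangles them. A minimal sketch with the standard csv module (the column names and values are made up):

```python
import csv
import io

# Keep long-digit identifiers as strings end to end; 18-digit values
# exceed what spreadsheets store exactly as floating-point numbers.
rows = [
    {"id": "900300000012345678", "amount": "12.50"},
    {"id": "900300000012345679", "amount": "7.25"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "amount"])
writer.writeheader()
writer.writerows(rows)

exported = buf.getvalue()
```

In a notebook, the analogous step (hedged as the PySpark equivalent) would be casting before display/download, e.g. `df.withColumn("id", df["id"].cast("string"))`.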
SaiCharan
by New Contributor
  • 1625 Views
  • 3 replies
  • 0 kudos

No space left on device and IllegalStateException: Have already allocated a maximum of 8192 pages

Hello, I'm writing to bring to your attention an issue that we have encountered while working with Databricks and to seek your assistance in resolving it. Context of the error: when a SQL query (1700 lines) is run, the corresponding Databricks job is faili...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

2 More Replies
gb
by New Contributor
  • 947 Views
  • 2 replies
  • 0 kudos

Write stream to Kafka topic with DLT

Hi, is it possible to write a stream to a Kafka topic with Delta Live Tables? I would like to do something like this: @dlt.view(name="kafka_pub", comment="Publish to kafka") def kafka_pub(): return (dlt.readStream("source_table").selectExpr("to_json (struct (*)...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

1 More Replies
Patrick-Data
by New Contributor II
  • 3293 Views
  • 4 replies
  • 0 kudos

Connecting live google sheets data to Databricks

Hi! So we have live Google Sheets data that gets updated on an hourly/daily basis, and we want to bring it into Databricks as a live/scheduled connection for further analysis, together with other tables and views present there. Do you have any sugges...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

3 More Replies
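One lightweight approach often suggested for this: publish the sheet and read its CSV export URL from a scheduled Databricks job. The sheet ID and tab gid below are placeholders; the URL format is Google's standard CSV export endpoint:

```python
def sheet_csv_url(sheet_id: str, gid: str = "0") -> str:
    """Build the CSV export URL for a published Google Sheet tab."""
    return (
        f"https://docs.google.com/spreadsheets/d/{sheet_id}"
        f"/export?format=csv&gid={gid}"
    )

# Placeholder sheet ID for illustration
url = sheet_csv_url("1AbcDEFghIJkLmNoPQ", gid="0")

# On Databricks, a scheduled job could then load it, e.g. via pandas:
# import pandas as pd
# df = spark.createDataFrame(pd.read_csv(url))
```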
BabuMahesh
by New Contributor
  • 309 Views
  • 1 reply
  • 0 kudos

Databricks & Bigquery

Databricks is packaging an old version of the BigQuery jar (Databricks also repackaged and created a fat jar), and our application needs the latest jar. The latest jar depends on the spark-bigquery-connector.properties file for the property scala.binary.vers...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @BabuMahesh, It seems like you're dealing with a complex issue related to jar conflicts in Databricks. Here are a few suggestions that might help: Uninstall the old jars from the cluster: Continually push new changes in subsequent versions wi...

VGS777
by New Contributor III
  • 1055 Views
  • 3 replies
  • 2 kudos

Resolved! Regarding cloning my gitrepo under workspace/Users/user_name

Hi all, I recently started using Databricks. I want to clone my git repo under the workspace/Users/user_name path, which I haven't been able to do. By default I can only clone under the Repos directory. Can anyone please advise me regarding this? Thank you

Latest Reply
Kaniz_Fatma
Community Manager
  • 2 kudos

I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution. 

2 More Replies
pshuk
by New Contributor III
  • 786 Views
  • 2 replies
  • 0 kudos

how to create volume using databricks cli commands

I am new to using volumes on Databricks. Is there a way to create a volume using CLI commands? On a similar note, is there a way to create DBFS directories and subdirectories with a single command? For example: I want to copy a file here dbfs:/FileStore/T...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

1 More Replies
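For anyone landing here later: recent versions of the Databricks CLI expose a volumes command group, and `databricks fs` can copy files into Unity Catalog volume paths. Exact subcommands and argument order vary by CLI version, so treat this as a sketch rather than authoritative syntax; the catalog, schema, volume, and paths are placeholders:

```shell
# Create a managed Unity Catalog volume (newer CLI versions; the SQL
# equivalent is: CREATE VOLUME my_catalog.my_schema.my_volume;)
databricks volumes create my_catalog my_schema my_volume MANAGED

# Create a directory and copy a local file into the volume in one go
databricks fs mkdir dbfs:/Volumes/my_catalog/my_schema/my_volume/raw
databricks fs cp ./local_file.csv dbfs:/Volumes/my_catalog/my_schema/my_volume/raw/
```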