Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

sanjay
by Valued Contributor II
  • 1136 Views
  • 4 replies
  • 2 kudos

Resolved! stop autoloader with continuous trigger programmatically

Hi, I am running Auto Loader with a continuous trigger. How can I stop this trigger at specific times, but only if no data is pending and the current batch has completed? How can I check how many records are pending in the queue, and the current state? Regards, Sanjay

Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hi @sanjay, looking to effectively manage your Auto Loader's continuous trigger? Follow these steps for seamless execution: Pausing the trigger at specific times: if you need to halt the continuous trigger during certain hours, consider switching t...
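One common pattern for this kind of graceful stop (a sketch only, not taken from the reply above — the helper function and polling loop are hypothetical, though `numInputRows` is a real field of Structured Streaming's `StreamingQuery.lastProgress`) is to poll the query's progress and stop once a completed micro-batch has read zero rows:

```python
# Sketch: decide when an Auto Loader stream can be stopped safely.
# `progress` mirrors the shape of StreamingQuery.lastProgress;
# the helper itself is hypothetical.

def should_stop(progress):
    """True when the last completed micro-batch read no rows,
    i.e. the queue is drained and the current batch is finished."""
    if progress is None:  # no micro-batch has completed yet
        return False
    return progress.get("numInputRows", 0) == 0

# In a notebook you would poll something like:
#   import time
#   while not should_stop(query.lastProgress):
#       time.sleep(60)
#   query.stop()
```

The same idea can often be avoided entirely by running the stream with an `availableNow`-style trigger on a schedule instead of a continuous trigger.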

3 More Replies
Sujitha
by Community Manager
  • 5101 Views
  • 0 replies
  • 1 kudos

Calling all innovators and visionaries! The 2024 Data Team Awards are open for nominations

Each year, we celebrate the amazing customers who rely on Databricks to innovate and transform their organizations — and the world — with the power of data and AI. The nomination form is now open. Nominations will close on Marc...

RobsonNLPT
by Contributor
  • 1051 Views
  • 5 replies
  • 0 kudos

Databricks XML - Bypassing rootTag and rowTag

I see the current conversion of DataFrame to XML needs to be improved. My DataFrame schema is a perfectly nested schema based on structs, but when I create an XML file I have the following issues: 1) I can't add elements to the root; 2) rootTag and rowTag are required. In ...

Latest Reply
sandip_a
New Contributor II
  • 0 kudos

Here is one of the ways to use the struct field name as rowTag:

import org.apache.spark.sql.types._
val schema = new StructType().add("Record", new StructType().add("age", IntegerType).add("name", StringType))
val data = Seq(Row(Row(18, "John ...

4 More Replies
Israel_H
by New Contributor III
  • 746 Views
  • 3 replies
  • 0 kudos

The risks of code execution by default on widget change

Speaking from my experience, the default behavior of widgets triggering code execution upon value change poses risks that outweigh the convenience in certain scenarios. While this feature may seem advantageous in some cases, it can lead to unintended con...

Latest Reply
Kayla
Contributor
  • 0 kudos

I definitely have to agree with the original point: if you have a notebook that you import and you touch any widget value, you're running code, most likely accidentally. I'd love to see a workspace or user-type option where you can change the default...

2 More Replies
RobsonNLPT
by Contributor
  • 679 Views
  • 2 replies
  • 1 kudos

Databricks Spark XML Writer

Hi. I'm trying to generate XML as output based on my nested DataFrame. Everything is OK except that I don't know how to add elements to the rootTag. I can add elements from the rowTag but not to the rootTag. Same problem adding attributes to the root: <books  version = "...

Latest Reply
Ayushi_Suthar
Honored Contributor
  • 1 kudos

Hi @RobsonNLPT, thanks for bringing up your concerns — always happy to help. Can you please refer to the document below to read and write XML files? https://docs.databricks.com/en/query/formats/xml.html Please let me know if this helps and leave a...

1 More Replies
Miasu
by New Contributor II
  • 745 Views
  • 2 replies
  • 0 kudos

FileAlreadyExistsException error while analyzing table in Notebook

Databricks experts, I'm new to Databricks and encountered an issue with the ANALYZE TABLE command in the Notebook. I created two tables, nyc_taxi and nyc_taxi2, from one CSV file. When executing the following command in the Notebook, analyze table nyc_taxi2...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Miasu, to investigate and resolve this issue, there are several steps you can take. First, check whether an existing resource already has the same name as "nyc_taxi2" in the given path, "/users/my...

1 More Replies
Akira
by New Contributor II
  • 518 Views
  • 3 replies
  • 0 kudos

"PutWithBucketOwnerFullControl" privilege missing for storage configuration

Hi. I've been unable to create workspaces manually for a while now. The error I get is "MALFORMED_REQUEST: Failed storage configuration validation checks: List,Put,PutWithBucketOwnerFullControl,Delete".  The storage configuration is on a bucket that ...

Latest Reply
Akira
New Contributor II
  • 0 kudos

> Yes, it does look like the bucket permissions are not properly set up, but ... To avoid potential misunderstanding: I mean that yes, the error message does make it sound like the bucket permissions are wrong. I don't mean I found a problem with the ones ...

2 More Replies
coltonflowers
by New Contributor III
  • 414 Views
  • 1 reply
  • 0 kudos

DLT: Only STREAMING tables can have multiple queries.

I am trying to do a one-time back-fill on a DLT table following the example here:

@dlt.table()
def test():
    # providing a starting version
    return (spark.readStream.format("delta")
        .option("readChangeFeed", "true")
        .option("...

Latest Reply
coltonflowers
New Contributor III
  • 0 kudos

I should also add that when I drop the `backfill` function, validation happens successfully and we get the following pipeline DAG:

Sujitha
by Community Manager
  • 1437 Views
  • 1 reply
  • 1 kudos

Introducing AI Model Sharing with Databricks!

Today, we're excited to announce that AI model sharing is available in both Databricks Delta Sharing and on the Databricks Marketplace. With Delta Sharing you can now easily share and serve AI models securely within your organization or externally ac...

Latest Reply
johnsonit
New Contributor II
  • 1 kudos

I'm eager to dive in and leverage these new features to elevate my AI game with Databricks. This is Johnson from KBS Technologies. Thanks for your update.

Frantz
by New Contributor III
  • 1415 Views
  • 4 replies
  • 0 kudos

Resolved! Show Existing Header From CSV In External Table

Hello, is there a way to load csv data into an external table without the _c0, _c1 columns showing?

Latest Reply
Frantz
New Contributor III
  • 0 kudos

My question was answered in a separate thread here.

3 More Replies
Frantz
by New Contributor III
  • 829 Views
  • 3 replies
  • 0 kudos

Resolved! Unable to load csv data with correct header values in External tables

Hello, is there a way to load CSV data into an external table without the _c0, _c1 columns showing? I've tried using the options within the SQL statement, but that does not appear to work, which results in this table.

Community Discussions
External Tables
Unity Catalog
Latest Reply
feiyun0112
Contributor III
  • 0 kudos

You need to set "USING data_source": https://community.databricks.com/t5/data-engineering/create-external-table-using-multiple-paths-locations/td-p/44042
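As a rough illustration of that advice (the catalog, schema, table name, and location below are placeholders, not from the thread), the external-table DDL with a `USING CSV` clause and a header option could look like:

```python
# Hypothetical DDL showing `USING CSV` with a header option;
# all names and the path are made up for illustration.
ddl = (
    "CREATE TABLE main.default.sales_ext\n"
    "USING CSV\n"
    "OPTIONS (header 'true', inferSchema 'true')\n"
    "LOCATION 's3://my-bucket/sales/'"
)

# In a notebook you would run: spark.sql(ddl)
print(ddl)
```

With `header 'true'`, the first line of each CSV file supplies the column names instead of the auto-generated _c0, _c1 labels.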

2 More Replies
Anton_Lagergren
by New Contributor III
  • 652 Views
  • 2 replies
  • 3 kudos

Resolved! New Regional Group Request

Hello! How may I request and/or create a new Regional Group for the DMV Area (DC, Maryland, Virginia)? Thank you, —Anton @DB_Paul @Sujitha

Latest Reply
Kaniz
Community Manager
  • 3 kudos

Hi @Anton_Lagergren, thank you for reaching out and expressing interest in starting a new Regional Group for the DMV Area (DC, Maryland, Virginia). This sounds like a fantastic initiative, and we're excited to help you get started! Please check out ...

1 More Replies
shubbansal27
by New Contributor II
  • 4842 Views
  • 7 replies
  • 1 kudos

How to disable a task in a Databricks job workflow

I don't see any option to disable a task. I have created one workflow which contains multiple tasks. Sometimes I need to disable a task for my experiments, but there is no straightforward way to do this. I feel a "disable" button is a must-have option ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @shubbansal27, we haven't heard from you since the last response from @Siebert_Looije, and I was checking back to see if their suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be helpful to ot...

6 More Replies
Kaizen
by Contributor III
  • 621 Views
  • 2 replies
  • 0 kudos

Python logging can't save log file in DBFS

Hi! I am trying to integrate logging into my project. I got the library and logs to work, but I can't write the log file to DBFS directly. Have any of you been able to save and append the log file directly to DBFS? From what I came across online, the best way to...

Latest Reply
feiyun0112
Contributor III
  • 0 kudos

You can use azure_storage_logging — see "Set Python Logging to Azure Blob, but Can not Find Log File there" on Stack Overflow.
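Another workaround often suggested (a sketch under the assumption that DBFS FUSE paths don't support the append/seek calls that `logging.FileHandler` makes — the DBFS target path below is a placeholder) is to log to local disk first and copy the finished file over afterwards:

```python
import logging
import os
import tempfile

# 1) Log to local disk, where append/seek work normally.
local_log = os.path.join(tempfile.gettempdir(), "job.log")

logger = logging.getLogger("my_job")
logger.setLevel(logging.INFO)
handler = logging.FileHandler(local_log)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.info("batch started")
handler.flush()

# 2) After the run, copy the file to DBFS (placeholder path):
#    import shutil
#    shutil.copy(local_log, "/dbfs/logs/job.log")
```

The trade-off is that log lines only appear in DBFS after the copy step, so for live tailing you would need to repeat the copy periodically.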

1 More Replies
Jasonh202222
by New Contributor II
  • 1446 Views
  • 3 replies
  • 2 kudos

Databricks notebook: how to stop truncating numbers when exporting query results to CSV

I use a Databricks notebook to query databases and export/download results to CSV. A pop-up window appeared asking whether to truncate the numbers; I accidentally chose "yes" and "don't ask again". Now all my long-digit numbers are trunca...
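Independent of the notebook's download dialog (this sketch is not the dialog fix itself), one defensive approach is to keep long-digit identifiers as strings end to end, since any value pushed through a 64-bit float loses precision past roughly 15-16 significant digits:

```python
import csv
import io

# Keep long identifiers as strings; a float round-trip corrupts
# digits past the ~15-16 that a 64-bit float can represent.
rows = [{"id": "12345678901234567890", "amount": "42.50"}]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "amount"])
writer.writeheader()
writer.writerows(rows)

print(buf.getvalue())  # the full 20-digit id survives
```

In SQL, the equivalent is casting the column (e.g. `CAST(id AS STRING)`) before exporting.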

Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

2 More Replies