Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Kaizen
by Valued Contributor
  • 4446 Views
  • 2 replies
  • 0 kudos

Python logging can't save log in DBFS

Hi! I am trying to integrate logging into my project. I got the library and logs to work, but I can't log the file into DBFS directly. Have any of you been able to save and append the log file directly to DBFS? From what I came across online, the best way to...

Latest Reply
feiyun0112
Honored Contributor
  • 0 kudos

You can use azure_storage_logging: "Set Python Logging to Azure Blob, but Can not Find Log File there" - Stack Overflow

1 More Replies
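
For reference, a minimal sketch of the azure_storage_logging approach suggested in the reply above; the storage account, key, container, and logger names are placeholders, and the handler signature may vary across package versions:

import logging
from azure_storage_logging.handlers import BlobStorageRotatingFileHandler

# Assumption: an existing storage account and a container named "logs".
handler = BlobStorageRotatingFileHandler(
    filename="app.log",
    account_name="<storage-account>",      # placeholder
    account_key="<account-key>",           # placeholder
    maxBytes=5 * 1024 * 1024,              # roll over to a new blob around 5 MB
    container="logs",                      # placeholder container name
)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("my_app")       # hypothetical logger name
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info("This record is persisted to Azure Blob storage rather than DBFS")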
Jasonh202222
by New Contributor II
  • 6271 Views
  • 2 replies
  • 1 kudos

Databricks notebook: how to stop truncating numbers when exporting query results to CSV

I use a Databricks notebook to query databases and export/download results to CSV. I accidentally closed a pop-up window asking whether to truncate the numbers; I accidentally chose "yes" and "don't ask again". Now all my long-digit numbers are trunca...

Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

@Jasonh202222 - Kindly check the navigation path below: User Settings -> Account Settings -> Display -> Download and Export. Under Download and Export, enable the checkbox "Prompt for formatting large numbers when downloading or exporting" and cl...

1 More Replies
edmundsecho
by New Contributor II
  • 6088 Views
  • 2 replies
  • 1 kudos

Resolved! Difference between username and account_id

I have a web app that can read files from a person's cloud-based drive (e.g., OneDrive, Google Drive, Dropbox).  The app gets access to the files using OAuth2. The app only ever has access to the files for that user.  Part of the configuration requir...

Latest Reply
edmundsecho
New Contributor II
  • 1 kudos

The provided links were helpful. The takeaways:
  • Usernames are "globally" unique to an individual; the username is the person's email.
  • A username can be associated with up to 50 accounts; account_ids track the resources available to the user.
This cl...

1 More Replies
sumitdesai
by New Contributor II
  • 3345 Views
  • 0 replies
  • 0 kudos

Using streaming data received from Pub/sub topic

I have a notebook in Databricks in which I am streaming from a Pub/Sub topic. The code for this looks like the following:
%pip install --upgrade google-cloud-pubsub[pandas]
from pyspark.sql import SparkSession
authOptions = {"clientId": "123", "clientEmail"...

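
Since the code in the post is cut off, here is a minimal sketch of Databricks' Pub/Sub Structured Streaming source (available on DBR 13.1+); the project, topic, subscription IDs, and credential values are placeholders:

# Service-account credentials for Pub/Sub; all values are placeholders.
authOptions = {
    "clientId": "<client-id>",
    "clientEmail": "<service-account-email>",
    "privateKey": "<private-key>",
    "privateKeyId": "<private-key-id>",
}

# Read the subscription as a streaming DataFrame.
df = (spark.readStream
      .format("pubsub")
      .option("subscriptionId", "<subscription-id>")
      .option("topicId", "<topic-id>")
      .option("projectId", "<gcp-project-id>")
      .options(**authOptions)
      .load())

# Persist the raw stream to a Delta table for inspection.
(df.writeStream
   .option("checkpointLocation", "/tmp/pubsub_checkpoint")  # placeholder path
   .toTable("main.default.pubsub_raw"))                     # placeholder table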
AbdurRehman
by New Contributor II
  • 921 Views
  • 0 replies
  • 1 kudos

Error Signing Up for Databricks Community Edition

@Retired_mod I've been trying to sign up for Databricks Community Edition using different email addresses over the past 24 hours, but I keep getting the error message: "An error has occurred. Please try again later." Can anyone help? Tags: #Databricks...

Kaizen
by Valued Contributor
  • 4701 Views
  • 2 replies
  • 0 kudos

Resolved! Using Python RPA Library on Databricks

Hi! I didn't see any conversations about using the Python RPA package on Databricks clusters. Is anyone doing this, or has anyone gotten it to work successfully on the clusters? I ran into the following errors: 1) Initially I was getting the error below rega...

Latest Reply
feiyun0112
Honored Contributor
  • 0 kudos

If you want to capture a browser screenshot, you can use Playwright:
%sh
pip install playwright
playwright install
sudo apt-get update
playwright install-deps

from playwright.async_api import async_playwright
async with async_playwright() as p:
    ...

1 More Replies
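
A minimal sketch completing the truncated Playwright snippet above; the URL and output path are placeholders, and it assumes the %sh install steps already ran on the driver:

from playwright.async_api import async_playwright

async def capture(url: str, path: str = "/tmp/screenshot.png"):
    async with async_playwright() as p:
        browser = await p.chromium.launch()   # headless Chromium by default
        page = await browser.new_page()
        await page.goto(url)
        await page.screenshot(path=path)      # write the screenshot to local disk
        await browser.close()

# Notebook cells on recent DBR generally allow top-level await.
await capture("https://example.com")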
ChristianRRL
by Valued Contributor III
  • 2353 Views
  • 0 replies
  • 0 kudos

Unity Catalog: Databricks *Specific* Features

Good day! Deceptively simple question: are there any "Databricks only" features that Unity Catalog offers? I understand that, generally speaking, enabling UC offers some of the following:
  • Data Discovery and Lineage
  • Auditing and Monitoring
  • Access C...

CMA
by New Contributor II
  • 3877 Views
  • 2 replies
  • 0 kudos

Problem logging in

Hello all! I'm new to this platform. I signed up, validated my email, and created my password; everything was fine, but when I try to log in, a message comes up. I created a new password, but the same thing happened again! It worked a few times, I think like 3 times...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Can you confirm the username was created in all lower case? Login is case-sensitive, so you need to make sure the username is entered exactly as it was added in the console or workspace.

1 More Replies
pshuk
by New Contributor III
  • 2684 Views
  • 1 reply
  • 0 kudos

How to create a volume using Databricks CLI commands

I am new to using volumes on Databricks. Is there a way to create a volume using CLI commands? On a similar note, is there a way to create DBFS directories and subdirectories using a single command? For example: I want to copy a file here dbfs:/FileStore/T...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Creates a new volume. The user can create either an external volume or a managed volume. An external volume will be created in the specified external location, while a managed volume will be located in the default location, which is specified by the...

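
For reference, a minimal sketch of the relevant CLI commands, assuming a recent Databricks CLI (v0.200+) already authenticated to the workspace; the catalog, schema, and path names below are placeholders:

# Managed volume (stored in the schema's default location):
databricks volumes create main my_schema my_volume MANAGED

# External volume (assumption: the external location is already configured):
databricks volumes create main my_schema my_ext_volume EXTERNAL \
  --storage-location "abfss://container@account.dfs.core.windows.net/path"

# DBFS: mkdirs creates nested directories in one command; then copy a file:
databricks fs mkdirs dbfs:/FileStore/Trial/sub1/sub2
databricks fs cp ./local_file.txt dbfs:/FileStore/Trial/sub1/sub2/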
ChristianRRL
by Valued Contributor III
  • 8851 Views
  • 3 replies
  • 3 kudos

Resolved! DLT Job Clusters: Continuous vs Triggered Cluster Start Times

Hi there! I'm curious if anyone is able to definitively help me answer how DLT Job Clusters operate/run. For example, the following is my baseline understanding of DLT Job Clusters: if I run a Triggered DLT Pipeline (e.g. daily), the job cluster takes m...

Latest Reply
melbourne
Contributor
  • 3 kudos

Ideally one would expect clusters used for a DLT pipeline to terminate after the pipeline execution has finished. However, while running in the `development` environment, you'll notice it doesn't terminate on its own, whereas in `production` it terminates ...

2 More Replies
al2co33
by New Contributor
  • 2398 Views
  • 1 reply
  • 0 kudos

Can I update a table comment using REST API?

https://docs.databricks.com/api/workspace/tables
It seems I can only list/delete tables; is there a way to update a table's metadata, like the comment or detail fields, via the REST API?

Latest Reply
Ayushi_Suthar
Databricks Employee
  • 0 kudos

Hi @al2co33, we don't currently provide any APIs for updating table comments; however, you can utilize the SQL Statement Execution API to do it. You can use the following tutorial to ALTER TABLE/COLUMN COMMENT: https://learn.microsoft.com/en-us/azure...

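
A minimal sketch of the suggested workaround, calling the SQL Statement Execution API to set a table comment; the workspace host, token, warehouse ID, and table name are placeholders:

import requests

host = "https://<workspace-host>"
resp = requests.post(
    f"{host}/api/2.0/sql/statements/",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "COMMENT ON TABLE main.default.my_table IS 'Updated via REST API'",
        "wait_timeout": "30s",  # wait synchronously up to 30 seconds
    },
)
resp.raise_for_status()
print(resp.json()["status"]["state"])  # e.g. SUCCEEDED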
Databricks_Java
by New Contributor
  • 3629 Views
  • 1 reply
  • 0 kudos

Databricks Java - Create Jar in Java 11

I am trying to run a simple "print" Java program, which is not working; I am getting compilation version issues even though I changed the environment variable to point to Java 11. Can you please help me? Can we create Java code with a Spark session and execute it as a ja...

Get Started Discussions
Databricks
env
jar
java
spark
Latest Reply
arpit
Databricks Employee
  • 0 kudos

@Databricks_Java You can run a command like this:
spark-submit --class com.test.Main example.jar
Also, make sure to check the Java version and match it with the DBR compatibility.

mathijs-fish
by New Contributor III
  • 2108 Views
  • 1 reply
  • 0 kudos

Disable personal compute with the Databricks API or UI

For a production environment, I want to disable the personal compute policy, because I do not want all users to be able to create personal compute clusters in production. Unfortunately, I am not able to access the account console, so I want to revoke perm...

Get Started Discussions
compute
permissions
policies
Latest Reply
arpit
Databricks Employee
  • 0 kudos

@mathijs-fish You need to be an admin to disable a policy.

SaiCharan
by New Contributor
  • 4469 Views
  • 1 reply
  • 0 kudos

No space left on device and IllegalStateException: Have already allocated a maximum of 8192 pages

Hello, I'm writing to bring to your attention an issue that we have encountered while working with Databricks, and to seek your assistance in resolving it. Context of the error: when a SQL query (1,700 lines) is run, the corresponding Databricks job is faili...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Are you processing Parquet files, or what is the format of your tables? Can you split your SQL query instead of having a huge query with 1,700 lines?

Phani1
by Valued Contributor II
  • 2262 Views
  • 3 replies
  • 0 kudos

Autoloader file latency

Hi Team, I would like to understand whether there is a metadata table for Auto Loader in Databricks that captures information about file arrival and processing. The reason we are experiencing data issues is that our table A receives hundreds of files ...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Check with the cloud_files_state() API. You can find examples here: https://docs.databricks.com/en/ingestion/auto-loader/production.html#querying-files-discovered-by-auto-loader

2 More Replies
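
A minimal sketch of the cloud_files_state() query mentioned in the reply; the checkpoint path is a placeholder for the Auto Loader stream's checkpoint location:

# Each row describes a file discovered by Auto Loader, including
# discovery and commit timestamps useful for latency investigations.
files = spark.sql(
    "SELECT * FROM cloud_files_state('/path/to/autoloader/checkpoint')"
)
display(files)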
