Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

by New Contributor
  • 2933 Views
  • 2 replies
  • 0 kudos

What's going wrong in my attempt to start with Databricks?

I'm trying to get going with Databricks for the first time. It has told me to create a workspace, which takes me to AWS (I'm also new to AWS). Following the instructions there gets it to start creating something, but then it just gets stuck on ...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, this looks like a workspace creation failure. Could you share more details about the error? Thanks!

1 More Replies
drii_cavalcanti
by New Contributor III
  • 1110 Views
  • 0 replies
  • 0 kudos

Hive Metastore permission on DBX 10.4

I've been working on creating a schema in the Hive Metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}'). The schema or database is successfully created, but I encountered an issue where it's only accessible for m...

Labels: Get Started Discussions, clusters, hive_metastore, legacy, permission
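A possible direction for this one, sketched below as a hedged example rather than a confirmed fix: on the legacy hive_metastore with table access control enabled, a schema created by one user is not automatically visible to other principals, so explicit grants are usually needed. The database name and the `analysts` group are placeholders, and an active `spark` session (as in a Databricks notebook) is assumed.

```python
# Hedged sketch: make a legacy hive_metastore schema usable by another group.
# Assumes table access control is enabled; names are placeholders.
database = "my_database"

spark.sql(f"CREATE DATABASE IF NOT EXISTS {database}")

# Allow the group to reference the schema and read its tables.
spark.sql(f"GRANT USAGE ON DATABASE {database} TO `analysts`")
spark.sql(f"GRANT SELECT ON DATABASE {database} TO `analysts`")
```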
andresalvati
by New Contributor II
  • 2028 Views
  • 0 replies
  • 0 kudos

Tags to run S3 lifecycle rules

Hello, is it possible to utilize S3 tags when writing a DataFrame with PySpark? Or is the only option to write the DataFrame and then use boto3 to tag all the files? More information about S3 object tagging is here: Amazon S3 Object Tagging. Thank you.

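Since the question itself suggests the boto3 route, here is a minimal sketch of that approach, assuming an active `spark` session and AWS credentials available to boto3; the bucket, prefix, DataFrame, and tag values are placeholders.

```python
# Hedged sketch: write with PySpark, then tag every written object with boto3 so an
# S3 lifecycle rule scoped to that tag can pick the files up. Names are placeholders.
import boto3

bucket = "my-bucket"
prefix = "exports/my_table/"
df = spark.range(10)  # placeholder DataFrame

df.write.mode("overwrite").parquet(f"s3://{bucket}/{prefix}")

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        s3.put_object_tagging(
            Bucket=bucket,
            Key=obj["Key"],
            Tagging={"TagSet": [{"Key": "lifecycle", "Value": "expire-30d"}]},
        )
```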
jamescw
by New Contributor II
  • 2375 Views
  • 0 replies
  • 1 kudos

VS code 2023

Do I need to save the data locally and run the plotting locally as well, or does anyone have a smart solution to this?

mehdilamranikpl
by New Contributor
  • 1585 Views
  • 0 replies
  • 0 kudos

Change the Admin Owner Email Account for Databricks cloud standard account?

As stated, my company changed its name and the email address has migrated. I need to change it to the new one, and there is no way to open a support ticket to address that from what I saw (Standard Plan). Please do not tell me to contact AWS, as they have ...

APKS
by New Contributor
  • 1044 Views
  • 0 replies
  • 0 kudos

Plotting using Databricks in VS code

Hi, I am quite new to working with Databricks in VS Code. I am trying to figure out the best way to plot my data when running on a cluster. I would like to be able to zoom and move the plot as I can when plotting locally with Matplotlib...

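For both of the VS Code plotting questions above, one common workaround is to bring an aggregated subset of the data back as pandas and plot it with a local Matplotlib backend, which keeps interactive zoom and pan. A hedged sketch, assuming databricks-connect (or any setup where `spark` is available) and a placeholder table:

```python
# Hedged sketch: aggregate on the cluster, plot locally with Matplotlib.
import matplotlib.pyplot as plt

pdf = (
    spark.table("samples.nyctaxi.trips")   # placeholder table name
    .groupBy("pickup_zip")
    .count()
    .limit(1000)                           # keep the local result small
    .toPandas()
)

pdf.plot(x="pickup_zip", y="count", kind="scatter")
plt.show()
```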
ashik
by New Contributor II
  • 11642 Views
  • 6 replies
  • 3 kudos

Resolved! Databricks voucher code error

Hi team, I am getting an error that the voucher code is invalid when trying to register for the Databricks Certified Data Engineer Associate exam. I got this issue after the page was reloaded due to slow internet before checkout, and the vouch...

Latest Reply
Priyag1
Honored Contributor II
  • 3 kudos

No worries. Contact the help and support team, and also raise a ticket.

5 More Replies
gilo12
by New Contributor III
  • 5551 Views
  • 2 replies
  • 1 kudos

Change default catalog

It seems that when I am connecting to a Databricks SQL Warehouse, it is using the default catalog, which is hive_metastore. Is there a way to define a Unity Catalog catalog as the default? I know I can run the query USE CATALOG MAIN, and then the current session will ...

Latest Reply
Jon-ton
New Contributor II
  • 1 kudos

Thanks Brian2. Is there an equivalent config parameter for a SQL Warehouse?

1 More Replies
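For the session-scoped part of this thread, a minimal sketch of doing the same thing from Python (the catalog name `main` comes from the question; whether an account- or warehouse-level default can be changed is not settled here):

```python
# Session-scoped only: equivalent of USE CATALOG for the current session.
spark.sql("USE CATALOG main")

# On newer runtimes the catalog API exposes the same switch:
spark.catalog.setCurrentCatalog("main")
print(spark.catalog.currentCatalog())   # -> main
```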
fazlu_don23
by New Contributor III
  • 1161 Views
  • 0 replies
  • 0 kudos

ronaldo is back

create table SalesReport (TerritoryName NVARCHAR(50), ProductName NVARCHAR(100), TotalSales DECIMAL(10,2), PreviousYearSales DECIMAL(10,2), GrowthRate DECIMAL(10,2)); create table ErrorLog (ErrorID int, ErrorMessage nvarchar(max), ErrorDate datetime);...

alesventus
by Contributor
  • 1743 Views
  • 0 replies
  • 0 kudos

Save dataframe to the same variable

I would like to know if there is any difference between saving the DataFrame during a transformation back to itself, as in the first example, or to a new DataFrame, as in the second example. Thanks. log_df = log_df.withColumn("process_timestamp", from_utc_timestamp(lit(current_timestamp()), "E...

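A short sketch of the two styles being compared. Spark DataFrames are immutable and transformations are lazy, so both build the same plan; the practical difference is only that reassigning to the same name drops your reference to the pre-transformation DataFrame. The sample data and timezone string below are placeholders (the timezone is truncated in the post).

```python
# Hedged sketch comparing the two assignment styles from the question.
from pyspark.sql.functions import current_timestamp, from_utc_timestamp, lit

log_df = spark.createDataFrame([(1, "start")], ["id", "event"])  # placeholder data
tz = "Europe/Prague"  # placeholder timezone

# Style 1: reassign to the same variable.
log_df = log_df.withColumn(
    "process_timestamp", from_utc_timestamp(lit(current_timestamp()), tz)
)

# Style 2: assign to a new variable (the previous DataFrame stays reachable).
enriched_df = log_df.withColumn(
    "process_timestamp", from_utc_timestamp(lit(current_timestamp()), tz)
)
```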
Mohsen
by New Contributor
  • 2397 Views
  • 0 replies
  • 0 kudos

iceberg

Hi fellas, I am working on Databricks using Iceberg. At first I configured my notebook as below: spark.conf.set("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkCatalog") spark.conf.set("spark.sql.catalog.spark_catalog.type", "hadoop") s...

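For reference, a hedged sketch of a typical Iceberg Hadoop-catalog session configuration along the lines described above; the warehouse path is a placeholder, and on Databricks the Iceberg Spark runtime JAR (plus the matching spark.sql.extensions setting, which is cluster-level) still has to be configured on the cluster.

```python
# Hedged sketch of an Iceberg Hadoop-catalog configuration; warehouse path is a placeholder.
spark.conf.set("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkCatalog")
spark.conf.set("spark.sql.catalog.spark_catalog.type", "hadoop")
spark.conf.set("spark.sql.catalog.spark_catalog.warehouse", "dbfs:/tmp/iceberg_warehouse")
```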
olegmir
by New Contributor III
  • 2342 Views
  • 1 reply
  • 1 kudos

Resolved! threads leakage when getConnection fails

Hi, we are using the Databricks JDBC driver https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.33. It seems there is a thread leakage when getConnection fails. Could anyone advise? It can be reproduced with @Test void databricksThreads() {...

Latest Reply
olegmir
New Contributor III
  • 1 kudos

Hi, none of the above suggestions will work. We already contacted the Databricks JDBC team; the thread leakage was confirmed and fixed in version 2.6.34: https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.34. This leakage still exists if...

kashy
by New Contributor III
  • 16538 Views
  • 2 replies
  • 0 kudos

Resolved! Access Foreign Catalog using Python in Notebook

Hello - I have a foreign catalog which I can access fine in SQL. However, I can't access it from a Python notebook. I.e., this works just fine in a notebook using a Pro SQL Warehouse: %sql USE CATALOG <my_foreign_catalog_name>; USE SCHEMA public; S...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, are you using this in a single user cluster? Also, please tag @Debayan in your next response so that I get notified.

1 More Replies
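A minimal sketch of the Python equivalent of the SQL in that question, assuming the notebook runs on Unity-Catalog-enabled compute (foreign catalogs are not visible otherwise); the catalog, schema, and table names are placeholders.

```python
# Hedged sketch: query a foreign catalog from Python via spark.sql.
spark.sql("USE CATALOG my_foreign_catalog")   # placeholder catalog name
spark.sql("USE SCHEMA public")

df = spark.sql("SELECT * FROM my_table LIMIT 10")  # placeholder table name
display(df)
```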
Policepatil
by New Contributor III
  • 1358 Views
  • 0 replies
  • 0 kudos

Missing records while using limit in multithreading

Hi, I need to process nearly 30 files from different locations and insert records into RDS. I am using multi-threading to process these files in parallel, like below. Test data: I have a configuration like below based on column 4: If column 4=0:...

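For context on the multithreading pattern described, here is a hedged sketch using a thread pool; the file paths and per-file handler are placeholders. Note also that DataFrame.limit() on unordered data returns an arbitrary subset of rows, so if specific records are expected, an explicit filter or orderBy should come first.

```python
# Hedged sketch: process several files in parallel threads on one Spark session.
from concurrent.futures import ThreadPoolExecutor

file_paths = [  # placeholder paths
    "s3://bucket/path/file1.csv",
    "s3://bucket/path/file2.csv",
]

def process_file(path: str) -> int:
    df = spark.read.option("header", "true").csv(path)
    # ... apply the column-4 rules and write to RDS here (placeholder) ...
    return df.count()

with ThreadPoolExecutor(max_workers=8) as pool:
    counts = list(pool.map(process_file, file_paths))

print(dict(zip(file_paths, counts)))
```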
Kratik
by New Contributor III
  • 3259 Views
  • 1 reply
  • 0 kudos

--files in spark submit task

Regarding the --files option in the spark-submit task of Databricks jobs, I would like to understand how it works and what the syntax is to pass multiple files to --files. I tried using --files and --py-files, and my understanding is that it should make available t...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you please check if this helps: https://docs.databricks.com/en/files/index.html. Also, please tag @Debayan in your next response, which will notify me. Thank you!

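To add a hedged note on the syntax question: with spark-submit, multiple files are passed to --files as a comma-separated list, and each distributed file is then reachable by its base name, for example via SparkFiles. The paths and file name below are placeholders.

```python
# Hedged sketch: read a config file that was shipped with
#   --files /dbfs/path/conf1.json,/dbfs/path/conf2.json   (placeholder paths)
from pyspark import SparkFiles

conf_path = SparkFiles.get("conf1.json")  # placeholder file name
with open(conf_path) as f:
    config_text = f.read()
```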
