Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

JohnJustus
by New Contributor III
  • 13094 Views
  • 3 replies
  • 0 kudos

Space in Column names when writing to Hive

All, I have the following code: df_Warehouse_Utilization = ( spark.table("hive_metastore.dev_ork.bin_item_detail") .join(df_DIM_Bins, col('bin_tag') == df_DIM_Bins.BinKey, 'right') .groupby(col('BinKey')) .agg(count_distinct(when(col('serial_lo...

Latest Reply
KandyKad
New Contributor III
  • 0 kudos

Hi, I have faced this issue a few times. When we overwrite DataFrames to the Hive catalog in Databricks, it doesn't natively allow column names to have spaces or special characters. However, you can add an option statement to bypass that ru...
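For reference, two common ways this is handled, shown only as sketches (the target table name is illustrative, and this may not be the exact option the reply has in mind):

    import re

    # 1) Sanitise the column names before writing, replacing characters Hive rejects.
    safe_df = df_Warehouse_Utilization.toDF(
        *[re.sub(r"[ ,;{}()\n\t=]", "_", c) for c in df_Warehouse_Utilization.columns]
    )
    safe_df.write.mode("overwrite").saveAsTable("hive_metastore.dev_ork.warehouse_utilization")

    # 2) Or enable Delta column mapping on the target table so names may keep spaces.
    spark.sql("""
        ALTER TABLE hive_metastore.dev_ork.warehouse_utilization SET TBLPROPERTIES (
            'delta.minReaderVersion' = '2',
            'delta.minWriterVersion' = '5',
            'delta.columnMapping.mode' = 'name')
    """)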

2 More Replies
EvanMarth
by New Contributor III
  • 14024 Views
  • 11 replies
  • 1 kudos

Cannot create an account to try Community Edition

Hi, Whenever I try to sign up for an account, I keep getting the following message - "an error has occurred. please try again later" - when I click the "get started with databricks community edition" button. Could you please let me know why this could...

Latest Reply
senthur123
New Contributor II
  • 1 kudos

I got the same problem when I tried to register or log in through the Community Edition link. But when I clicked the "Try Databricks" button in the top-right corner of the https://www.databricks.com/ home page, I was able to register and log in successfully j...

10 More Replies
jes
by New Contributor II
  • 938 Views
  • 2 replies
  • 0 kudos

spark_partition_id() - User does not have permission SELECT on anonymous function

I'm trying to verify the partitions assigned to rows. I'm running something like this: from pyspark.sql.functions import spark_partition_id df = spark.read.table("some.uc.table").limit(10) df = df.repartition(2) df = df.withColumn("partitionid", spar...
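For context, a complete version of the pattern being described might look like this sketch (the table name is the placeholder from the post):

    from pyspark.sql.functions import spark_partition_id

    # Tag each row with the partition it landed in after repartitioning, then count rows per partition.
    df = spark.read.table("some.uc.table").limit(10)
    df = df.repartition(2)
    df = df.withColumn("partitionid", spark_partition_id())
    df.groupBy("partitionid").count().show()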

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @jes, I have validated your failure internally and found that there is already an internal request to address this behavior. Are you using a shared access mode cluster? This behavior does not appear to be observed when using single access mode...

1 More Replies
KristiLogos
by Contributor
  • 1256 Views
  • 4 replies
  • 1 kudos

Connection type 'SALESFORCE' is not enabled. Please enable the connection to use it.

I'm trying to connect to Salesforce in Databricks, following this: https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud#sql-1 and when I run the "Create Catalog..." statement I see this error. How would I enable salesforc...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

The reason they're getting this error is that the workspace is not enabled for the LakeFlow Connect preview. Could you please file a ticket with us, as we might require additional details. Please refer to: https://docs.databricks.com/en/resources/s...

3 More Replies
MattRoger
by New Contributor
  • 1278 Views
  • 1 reply
  • 0 kudos

Databricks User Group

Are there any Databricks User Group meetups in the UK?

Latest Reply
twole
Databricks Employee
  • 0 kudos

You can find some of the groups in EMEA here: https://community.databricks.com/t5/europe-middle-east-and-africa/ct-p/EMEA

MonuDatabricks
by New Contributor II
  • 2204 Views
  • 1 reply
  • 0 kudos

Resolved! Using Autoloader with merge

Hi everyone, I have been trying to use Auto Loader with foreachBatch so that I can use MERGE INTO in Databricks, but I have been getting the error below. Error: Found error inside foreachBatch Python process. My code: from delta.tables import ...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

It seems the columns of your join condition are not found. Are they in the dataframes/table? Also try putting the whole join condition in a single string: "s.JeHeaderId = t.JeHeaderId and s.JeLineId = t.JeLineId"
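Putting that suggestion together, a minimal Auto Loader + foreachBatch MERGE sketch might look like this (source path, checkpoint location, and target table name are placeholders, not from the thread):

    from delta.tables import DeltaTable

    def upsert_batch(batch_df, batch_id):
        # Placeholder target table; note the join condition is a single string, as suggested above.
        target = DeltaTable.forName(spark, "dev.finance.journal_entries")
        (target.alias("t")
            .merge(batch_df.alias("s"),
                   "s.JeHeaderId = t.JeHeaderId and s.JeLineId = t.JeLineId")
            .whenMatchedUpdateAll()
            .whenNotMatchedInsertAll()
            .execute())

    (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")                                  # placeholder format
        .option("cloudFiles.schemaLocation", "/Volumes/dev/chk/je/_schema")   # placeholder path
        .load("/Volumes/dev/raw/journal_entries")                             # placeholder path
        .writeStream
        .foreachBatch(upsert_batch)
        .option("checkpointLocation", "/Volumes/dev/chk/je")                  # placeholder path
        .trigger(availableNow=True)
        .start())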

NagarajuT
by New Contributor
  • 2090 Views
  • 1 reply
  • 0 kudos

Connect to SQL Developer using Custom JDBC

Hello, I'm trying to connect Databricks SQL to SQL Developer using a custom JDBC URL, and I'm getting an error with: jdbc:databricks:<server>:443;HttpPath=<HttpPath>;UID=token;PWD=<password> Regards, Naga
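For comparison, the Databricks JDBC driver documents a URL of roughly this shape (the placeholders are kept from the post; AuthMech=3 selects personal access token authentication):

    jdbc:databricks://<server-hostname>:443;httpPath=<HttpPath>;AuthMech=3;UID=token;PWD=<personal-access-token>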

Latest Reply
ShaliniC
New Contributor II
  • 0 kudos

Hi, We are trying to test whether we can connect SQL Developer to Databricks. Did it work for you? Regards, Shalini

tyorisoo
by New Contributor III
  • 1772 Views
  • 6 replies
  • 0 kudos

Unity Catalog About Metastore

Registered on 2024/10 from the AWS Marketplace. We have created a customer-managed VPC and manually created the workspace. No specific metastore settings were made when the workspace was created. In the catalog screen of the account console, unity catalog...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @tyorisoo, I hope you are doing well! The metastore manages metadata: catalog information, schema information, table information, function information, access control information, etc. In the current state, the metastore configuration is not done...

5 More Replies
Surajv
by New Contributor III
  • 2396 Views
  • 2 replies
  • 1 kudos

What is the quota limit for using create user token api?

Hi Community, I was going through this doc: https://docs.databricks.com/api/workspace/tokens/create and got to know that there is a quota limit on how many tokens one can generate using the API POST /api/2.0/token/create; having breached the thre...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hello @Surajv, Q1: What is the quota limit and how do you find it? The quota limit for creating user tokens via the API (POST /api/2.0/token/create) is essential for managing token usage. Each user can have multiple personal access tokens in a Databricks wo...
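As a sketch, calling that endpoint looks roughly like this (the workspace host and credential are placeholders):

    import requests

    # Create a short-lived personal access token via the Tokens API.
    resp = requests.post(
        "https://<workspace-host>/api/2.0/token/create",
        headers={"Authorization": "Bearer <existing-PAT>"},
        json={"lifetime_seconds": 3600, "comment": "short-lived automation token"},
    )
    resp.raise_for_status()
    print(resp.json()["token_info"]["token_id"])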

1 More Replies
DBricksNewbie
by New Contributor III
  • 1926 Views
  • 2 replies
  • 0 kudos

Resolved! Can I give different git branches in the same repo for different tasks in a data bricks workflow

I have 2 tasks (T1 & T2) that run in branch B1 of Repo1. I have created a new task (depending on T2) which points to a different branch B2 of the same Repo1. Is it possible to run them in the same workflow pipeline? When I tried to set this up, databricks c...

Latest Reply
DBricksNewbie
New Contributor III
  • 0 kudos

I was able to find a workaround: I created separate jobs for the tasks that need to be in a different branch (the testing tasks) and then ran all of them from a new job.
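Expressed against the Jobs API 2.1, that workaround is roughly a parent job whose tasks trigger the branch-specific jobs via run_job_task (the host, credential, and job IDs below are hypothetical):

    import requests

    # Parent job that runs the two branch-pinned jobs in sequence.
    payload = {
        "name": "orchestrate-branch-jobs",
        "tasks": [
            {"task_key": "run_b1_job", "run_job_task": {"job_id": 111}},
            {"task_key": "run_b2_job",
             "depends_on": [{"task_key": "run_b1_job"}],
             "run_job_task": {"job_id": 222}},
        ],
    }
    requests.post(
        "https://<workspace-host>/api/2.1/jobs/create",
        headers={"Authorization": "Bearer <PAT>"},
        json=payload,
    )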

1 More Replies
MKTexas13
by New Contributor III
  • 1725 Views
  • 1 reply
  • 0 kudos

Resolved! Setting a preset list of values in a task parameter in databricks job

I want to be able to have a user select from a preset list of values for a task parameter when they kick off a job with the "Run now with different parameters" option. In a notebook I am able to use dbutils.widgets.dropdown() to set the list of value...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Unfortunately, providing a dropdown list for job parameters is not currently available. You can always do a "Run now with different parameters", but the user will have to change the values manually rather than pick from a predefined list.
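Inside the notebook itself, the widget-based approach the question mentions looks like this sketch (the widget name and values are illustrative):

    # Constrain a notebook parameter to a preset list with a dropdown widget.
    dbutils.widgets.dropdown("environment", "dev", ["dev", "qa", "prod"], "Environment")
    env = dbutils.widgets.get("environment")
    print(f"Selected environment: {env}")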

colinhoad
by New Contributor
  • 721 Views
  • 1 reply
  • 0 kudos

New icon for SQL Editor looks like a broken image

Hey - I may be showing my age here, but I felt compelled to point out that at a glance, the new icon for a SQL Editor tab in the Databricks UI looks an awful lot like a broken image link icon, from the days of Internet Explorer. This, subconsciously,...

[Attachments: 87338258.png, colinhoad_1-1728893941310.png]
Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Is this still showing a broken image? Is it only happening in Internet Explorer? If you try Chrome, for example, does it work? Can you share a screenshot of your workspace so we can better understand how it displays?

benito
by New Contributor
  • 829 Views
  • 1 reply
  • 0 kudos

Databricks Initial Costs AWS

I have a new premium account. I set up a cost dashboard (see attached) after I created a new workspace using AWS Quickstart, and I see some costs there. Why do I have these costs if I am not using Databricks at all? How can I reduce them?

[Attachment: databricks.png]
Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Are you seeing this data from the Usage tab in the Account console? Does it allow you to filter it by SKU?

Rafael-Sousa
by Contributor II
  • 1547 Views
  • 2 replies
  • 1 kudos

Resolved! Internal Error with MERGE Command in Spark SQL

I'm trying to perform a MERGE between two tables (customers and customers_update) using Spark SQL, but I’m encountering an internal error during the planning phase. The error message suggests it might be a bug in Spark or one of the plugins in use. He...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

The issue you encountered with the MERGE statement in Spark SQL, which was resolved by specifying the database and metastore, is likely related to how Spark handles table references during the planning phase. The internal error you faced suggests a b...
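In practice the fix amounts to fully qualifying both tables in the MERGE, roughly like this sketch (the catalog and schema names are placeholders):

    # MERGE with fully qualified catalog.schema.table references on both sides.
    spark.sql("""
        MERGE INTO main.sales.customers AS t
        USING main.sales.customers_update AS s
        ON t.customer_id = s.customer_id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)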

1 More Replies
