Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

DB_learn
by New Contributor II
  • 1831 Views
  • 2 replies
  • 0 kudos

Mosaic library not supported in db13 and unity catalog

I want to use geospatial functions that the Mosaic library provides. How can I use them without the Mosaic library, relying only on functions that are available to import? Example: st_aswkt

Latest Reply
sean_owen
Databricks Employee
  • 0 kudos

You do indeed have a syntax error there: an extra parenthesis.

1 More Replies
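Since the question above is truncated, here is a hedged sketch of one way to get WKT output without Mosaic: build the WKT text in plain Python and, in a notebook, wrap it as a Spark UDF. `as_wkt_point` and `as_wkt_linestring` are hypothetical helpers for illustration, not Databricks or Mosaic APIs.

```python
# Minimal stand-in for Mosaic's st_aswkt: render coordinates as WKT text.
# These are hypothetical helpers; in a notebook they could be registered
# as Spark UDFs and applied to geometry columns.

def as_wkt_point(lon: float, lat: float) -> str:
    """Render a single (lon, lat) pair as a WKT POINT string."""
    return f"POINT ({lon:g} {lat:g})"

def as_wkt_linestring(coords) -> str:
    """Render a sequence of (lon, lat) pairs as a WKT LINESTRING string."""
    inner = ", ".join(f"{lon:g} {lat:g}" for lon, lat in coords)
    return f"LINESTRING ({inner})"
```

In a notebook this could be wrapped with `pyspark.sql.functions.udf` and used in a `select`; for heavy workloads a dedicated library (e.g. Apache Sedona) is likely the better route.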
dbx_687_3__1b3Q
by New Contributor III
  • 6337 Views
  • 1 replies
  • 2 kudos

Using Azure Event Grid for structured streaming

Can anyone point me to any Databricks documentation (or other resources) for configuring Structured Streaming to use Azure Event Grid as a source/sink? I found examples for Kafka and Event Hubs, but Azure Event Grid is different from Azure Event Hubs....

Latest Reply
dbx_687_3__1b3Q
New Contributor III
  • 2 kudos

I must be missing something. I don't see how the examples in the referenced document can be applied to Azure Event Grid. Is there another example that shows how to subscribe to an Event Grid topic?

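For context: Event Grid is not itself a Structured Streaming source. On Azure, the closest built-in pattern is Auto Loader's file-notification mode, which (per the Auto Loader documentation) provisions an Event Grid subscription and a storage queue behind the scenes. A sketch of the relevant reader options, expressed as a plain dict so the option names can be checked against the current docs before use:

```python
# Sketch only: Auto Loader (cloudFiles) option names as commonly documented.
# Treat these as assumptions to verify against current Databricks docs.
autoloader_options = {
    "cloudFiles.format": "json",
    # File-notification mode: on Azure, Databricks provisions an Event Grid
    # subscription and a storage queue behind the scenes.
    "cloudFiles.useNotifications": "true",
}

def build_reader_conf(base_path: str, options: dict) -> dict:
    """Assemble the arguments a spark.readStream.format("cloudFiles") call
    would receive (path is a placeholder)."""
    return {"path": base_path, **options}

conf = build_reader_conf(
    "abfss://container@account.dfs.core.windows.net/in", autoloader_options
)
```

If the goal is to consume arbitrary Event Grid events (not file-arrival events), routing them through Event Hubs or a storage queue first is the usual workaround.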
190857
by New Contributor II
  • 1127 Views
  • 1 replies
  • 0 kudos

SQL endpoint and JDBC driver

When we try to connect to a SQL warehouse endpoint with the Databricks JDBC driver, our query fails if we use first_value(). We've rewritten the query to use LIMIT 1, but we would like to understand whether this is a gap in the Simba/Databricks driv...

Latest Reply
190857
New Contributor II
  • 0 kudos

A sample error message when using first_value() is: An error occurred while calling o132.csv. [Databricks][JDBC](10140) Error converting value to BigDecimal.

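For readers unfamiliar with the function: FIRST_VALUE() returns the value from the first row of an ordered window, which is why LIMIT 1 over the same ordering is a workable substitute. The error here looks like a driver-side type-mapping issue rather than a SQL problem; an explicit cast in the query (e.g. `CAST(col AS DOUBLE)`) may sidestep the BigDecimal conversion, though that is an assumption to test. A pure-Python emulation of the semantics (`first_value` below is a hypothetical helper, not the driver API):

```python
def first_value(rows, value_key, order_key):
    """Emulate SQL FIRST_VALUE(value_key) OVER (ORDER BY order_key):
    return the value from the first row of the ordered input."""
    return min(rows, key=lambda r: r[order_key])[value_key]

# Toy data standing in for a result set.
rows = [
    {"amount": "19.99", "ts": 3},
    {"amount": "5.00", "ts": 1},
    {"amount": "7.25", "ts": 2},
]
```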
sirishavemula20
by New Contributor III
  • 1257 Views
  • 1 replies
  • 0 kudos

Databricks Certified Data Engineer Associate exam got suspended, need immediate help (10/09/2023)

Hi team, I had scheduled my exam for 10th September 2023 at 15:15 hrs Asia/Calcutta time, but the proctor suspended it, stating there was no proper environment. Please look into this issue; I didn't even attempt a single question. This is not fair, as I l...

Latest Reply
APadmanabhan
Databricks Employee
  • 0 kudos

Hi @sirishavemula20, we have addressed the case. If the issue is not resolved, please reply to the case notes.

drii_cavalcanti
by New Contributor III
  • 625 Views
  • 0 replies
  • 0 kudos

Hive Metastore permission on DBX 10.4

I've been working on creating a schema in the Hive Metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}'). The schema (database) is created successfully, but I encountered an issue where it's only accessible for m...

Community Platform Discussions
clusters
hive_metastore
legacy
permission
gilo12
by New Contributor III
  • 3517 Views
  • 2 replies
  • 1 kudos

Change default catalog

It seems that when I am connecting to Databricks Warehouse, it uses the default catalog, which is hive_metastore. Is there a way to make Unity Catalog the default? I know I can run the query USE CATALOG main, and then the current session will ...

Latest Reply
Jon-ton
New Contributor II
  • 1 kudos

Thanks Brian2. Is there an equivalent config parameter for a SQL Warehouse?

1 More Replies
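For clusters, the Spark conf key sketched below is commonly cited as setting the session's initial catalog; for SQL Warehouses, the workspace admin "default catalog" setting is the usual lever. Both are assumptions to verify against current Databricks documentation. A minimal sketch of the cluster-level approach, modeled as a plain dict:

```python
# Assumption: this conf key is the commonly cited cluster-level setting for
# the initial catalog; confirm against current Databricks docs before use.
cluster_spark_conf = {
    "spark.databricks.sql.initial.catalog.name": "main",
}

def initial_catalog(conf: dict, default: str = "hive_metastore") -> str:
    """Resolve the session's starting catalog from a cluster Spark conf,
    falling back to hive_metastore when nothing is configured."""
    return conf.get("spark.databricks.sql.initial.catalog.name", default)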
fazlu_don23
by New Contributor III
  • 522 Views
  • 0 replies
  • 0 kudos

ronaldo is back

create table SalesReport (TerritoryName NVARCHAR(50), ProductName NVARCHAR(100), TotalSales DECIMAL(10,2), PreviousYearSales DECIMAL(10,2), GrowthRate DECIMAL(10,2)); create table ErrorLog (ErrorID int, ErrorMessage nvarchar(max), ErrorDate datetime);...

alesventus
by Contributor
  • 1001 Views
  • 0 replies
  • 0 kudos

Save dataframe to the same variable

I would like to know whether there is any difference between saving a dataframe during transformation back to itself (first example) or to a new dataframe (second example). Thanks. log_df = log_df.withColumn("process_timestamp",from_utc_timestamp(lit(current_timestamp()),"E...

Mohsen
by New Contributor
  • 1962 Views
  • 0 replies
  • 0 kudos

iceberg

Hi fellas, I am working on Databricks using Iceberg. At first I configured my notebook as below: spark.conf.set("spark.sql.catalog.spark_catalog","org.apache.iceberg.spark.SparkCatalog") spark.conf.set("spark.sql.catalog.spark_catalog.type", "hadoop") s...

olegmir
by New Contributor III
  • 1715 Views
  • 1 replies
  • 1 kudos

Resolved! Thread leakage when getConnection fails

Hi, we are using the Databricks JDBC driver https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.33. It seems there is a thread leak when getConnection fails. Could anyone advise? It can be reproduced with: @Test void databricksThreads() {...

Latest Reply
olegmir
New Contributor III
  • 1 kudos

Hi, none of the above suggestions works. We already contacted the Databricks JDBC team; the thread leak was confirmed and fixed in version 2.6.34 (https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.34). This leak still exists if...

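The detection pattern behind the original report, counting live threads before and after repeated failed connection attempts, is easy to sketch in pure Python. `leaky_connect` below is a hypothetical stand-in for the failing driver call, not the JDBC API:

```python
import threading

def leaky_connect():
    """Stand-in for a getConnection() that fails but leaves a worker
    thread behind (daemon here so the demo exits cleanly)."""
    t = threading.Thread(target=threading.Event().wait, daemon=True)
    t.start()
    raise ConnectionError("connect failed")

def leaked_threads(attempts: int) -> int:
    """Count how many live threads the failed attempts left behind."""
    before = len(threading.enumerate())
    for _ in range(attempts):
        try:
            leaky_connect()
        except ConnectionError:
            pass
    return len(threading.enumerate()) - before
```

A healthy client would return 0 here; a leaky one grows by one thread per failed attempt, which matches the behavior the reply says was fixed in 2.6.34.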
Policepatil
by New Contributor III
  • 1028 Views
  • 0 replies
  • 0 kudos

Missing records while using limit in multithreading

Hi, I need to process nearly 30 files from different locations and insert their records into RDS. I am using multi-threading to process these files in parallel, like below. Test data: I have configuration like below based on column 4: If column 4=0:...

priyakant1
by New Contributor II
  • 948 Views
  • 1 replies
  • 0 kudos

Suspension of Data Engineer Professional exam

Hi Databricks Team, I had scheduled my exam for 6th Sep 2023. During the exam, a pop-up appeared stating that I was looking in some other direction. I told them that my laptop mouse was not working properly, so I was looking at it. But they still suspended ...

Latest Reply
sirishavemula20
New Contributor III
  • 0 kudos

Hi @priyakant1, have you received any response from the team? Did they reschedule your exam?

sirishavemula20
by New Contributor III
  • 2426 Views
  • 1 replies
  • 0 kudos

My exam was suspended, need help urgently (21/08/2023)

Hello Team, I had a terrible experience while attempting my first Databricks certification. The proctor abruptly asked me to show my desk and, after I did, asked multiple more times, wasting my time, and then suspended my exam. I want to file a complain...

Latest Reply
sirishavemula20
New Contributor III
  • 0 kudos

Sub: My Databricks Data Engineer Associate exam got suspended, need immediate help please (10/09/2023). I had a terrible experience while attempting my Databricks Data Engineer certification. The proctor abruptly asked me to show my desk and, after showin...

Policepatil
by New Contributor III
  • 2866 Views
  • 1 replies
  • 1 kudos

Resolved! Records are missing while filtering the dataframe in multithreading

Hi, I need to process nearly 30 files from different locations and insert their records into RDS. I am using multi-threading to process these files in parallel, like below. Test data: I have configuration like below based on column 4: If colum...

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

Looks like you are comparing to strings like "1", not values like 1, in your filter condition. It's hard to say; some details are missing, like the rest of the code, the DataFrame schema, and the output you are observing.

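The type mismatch in that reply is easy to reproduce in plain Python: filtering against the string "1" silently drops rows whose column holds the integer 1, and vice versa. The toy rows below are illustrative, not the poster's data:

```python
# Rows whose column 4 mixes int and string values, as the reply suspects.
rows = [{"col4": 1}, {"col4": "1"}, {"col4": 0}]

# Comparing against the string "1" matches only the string-typed row...
as_string = [r for r in rows if r["col4"] == "1"]
# ...while comparing against the integer 1 matches only the int-typed row.
as_int = [r for r in rows if r["col4"] == 1]
```

In Spark the same mismatch typically shows up as a filter like `col("col4") == "1"` against an integer column, which casts or simply fails to match and so "loses" records.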