Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

RC
by Contributor
  • 4170 Views
  • 1 reply
  • 0 kudos

Resolved! Connecting to Hive metastore as well as Glue Catalog at the same time

Hi, is there any way we can connect to the Glue Catalog as well as to the Hive metastore in the same warehouse? I can create a single instance profile and provide all the required access for buckets or for the Glue Catalog. I tried with the below configuration: spark.s...

Community Platform Discussions
Databricks
glue catalog
hive metastore
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @RC, based on the provided information, you cannot dynamically switch between the Glue Catalog and a Hive metastore in the same warehouse. As per the limitations mentioned in the AWS Glue metastore documentation, you must restart the cluster for new ...
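
For reference, a minimal sketch of the cluster setting involved (an assumption based on the AWS Glue metastore integration the reply refers to; it is a cluster-level Spark config, not something you can toggle at runtime):

```python
# Cluster Spark config (set in the cluster UI, not in the notebook):
#   spark.databricks.hive.metastore.glueCatalog.enabled true
# The cluster uses either Glue or the Hive metastore, not both, and changing
# the setting requires a cluster restart. From a notebook you can check it:
print(spark.conf.get("spark.databricks.hive.metastore.glueCatalog.enabled", "false"))
```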

Shwang
by New Contributor
  • 1037 Views
  • 1 reply
  • 0 kudos

Optuna results change when rerun on Databricks

The best trial results seem to change every time the same study is rerun. On Microsoft Azure this can be fixed by setting the sampler seed. However, this solution doesn't seem to work on Databricks. Does anyone know why that is the case and how to mak...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Shwang, It's not uncommon for results to vary when re-running experiments, especially when using complex machine learning or optimization techniques. To better understand your situation, please provide more details about the changes you observed ...
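
For anyone hitting the same issue, a minimal sketch of seeding Optuna's sampler (the objective and trial count are placeholders). Note that a fixed seed only makes a sequential study repeatable; parallel or distributed trial execution can still change the best trial:

```python
import optuna

def objective(trial):
    # Toy objective used only to illustrate seeding; replace with your own.
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2

study = optuna.create_study(direction="minimize",
                            sampler=optuna.samplers.TPESampler(seed=42))
study.optimize(objective, n_trials=20)
print(study.best_params)
```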

daniel23
by New Contributor II
  • 6519 Views
  • 3 replies
  • 3 kudos

Resolved! How to properly import spark functions?

I have the following command that runs in my Databricks notebook: spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup"). I have wrapped this command into a function (simplified): def get_info(): return spark.conf.get("spark.databri...

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @daniel23, the behaviour you're experiencing is related to how the spark object is scoped and available within different contexts in Databricks. When you define and run code directly in a Databricks notebook, the spark object is automatically av...
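
A small sketch of the usual workaround when the function lives in a module rather than in the notebook itself (the config key is the one from the question; getActiveSession is standard PySpark):

```python
from pyspark.sql import SparkSession

def get_info():
    # Outside the notebook's global scope the `spark` variable is not injected,
    # so look up the active session explicitly (or pass it in as an argument).
    spark = SparkSession.getActiveSession()
    return spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")
```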

2 More Replies
Erik_L
by Contributor II
  • 474 Views
  • 0 replies
  • 0 kudos

Structured Streaming from TimescaleDB

I realize that the best practice would be to integrate our service with Kafka as a streaming source for Databricks, but given that the service already stores data into TimescaleDB, how can I stream data from TimescaleDB into DBX? Debezium doesn't wor...

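
One possible workaround, sketched under assumptions (Spark has no built-in JDBC streaming source; the host, table, and timestamp column below are hypothetical): poll TimescaleDB over JDBC on a schedule and append only the rows newer than the last watermark.

```python
# Hypothetical incremental pull from TimescaleDB via JDBC; `last_ts` would be
# persisted (e.g. in a checkpoint table) between runs of a scheduled job.
last_ts = "2023-01-01 00:00:00"

df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://<host>:5432/<database>")
      .option("dbtable", f"(SELECT * FROM metrics WHERE time > '{last_ts}') AS t")
      .option("user", "<user>")
      .option("password", "<password>")
      .load())

df.write.mode("append").saveAsTable("bronze.timescale_metrics")
```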
ncouture
by Contributor
  • 1509 Views
  • 1 reply
  • 0 kudos

No results were found: the query results may no longer be available or you may not have permissions

When anyone (admins included) clicks on an alert task in a job run, we see the error `No results were found: the query results may no longer be available or you may not have permissions`. Should we be seeing something else, or is this a matter of a poor ...

Latest Reply
JunYang
New Contributor III
  • 0 kudos

The error message "No results were found: the query results may no longer be available or you may not have permissions" is designed to address a range of potential situations. This includes instances where data might not be accessible due to reasons ...

Murat_Aykit
by New Contributor
  • 712 Views
  • 1 reply
  • 0 kudos

Cannot create an account to try Community Edition

Hi, whenever I try to sign up for an account, I keep getting the following message in the first step: "an error has occurred. please try again later". Could you please let me know why this could be? I tried multiple emails and seem to be having the same i...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Is it the same in all the browsers? 

Chris_Shehu
by Valued Contributor III
  • 1574 Views
  • 2 replies
  • 0 kudos

Databricks Assistant HIPAA? Future Cost?

With the Public Preview of Databricks Assistant, I have a few questions. 1) If the Azure tenant is HIPAA compliant, does that compliance also include the Databricks Assistant features? 2) Right now the product is free, but what will the cost be? Will we...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Chris_Shehu, in answer to your questions - If the Azure tenant is HIPAA compliant, does that compliance also include the Databricks Assistant features? New features go through a compliance assessment. If it is not a significant feature, after maybe ...

1 More Reply
saberw
by New Contributor
  • 3084 Views
  • 1 reply
  • 1 kudos

Cron Schedule like 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI * does not work

When we use this cron schedule: 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI *, so far only the 58th minute will run, but not the 28th minute (30 minutes after the 58th minute). Is there some kind of bug in the cron scheduler? Reference: h...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @saberw, the cron schedule you provided is 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI *. This schedule specifies that a task should run on weekdays (Monday to Friday) between 6 AM and 5 PM. The task should start at the 58th ...
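
To make the arithmetic concrete: in Quartz, 58/30 in the minutes field means "start at minute 58, then every 30 minutes within the same hour", so the only valid match is :58. A small illustration, plus an alternative expression that is an assumption on my part:

```python
# Quartz "58/30" in the minutes field: start at 58, step by 30 within the hour.
matching_minutes = list(range(58, 60, 30))
print(matching_minutes)  # [58] -- minute 88 would be next, but minutes stop at 59

# To also fire at :28, listing the minutes explicitly should work (assumed fix):
#   0 28,58 6-17 ? * MON-FRI *
```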

jomt
by New Contributor III
  • 4555 Views
  • 1 reply
  • 0 kudos

Resolved! How do you properly read database-files (.db) with Spark in Python after the JDBC update?

I have a set of database files (.db) which I need to read into my Python notebook in Databricks. I managed to do this fairly simply up until July, when an update to the SQLite JDBC library was introduced. Up until now I have read the files in question with...

Latest Reply
jomt
New Contributor III
  • 0 kudos

When the numbers in the table are really big (millions and billions) or really low (e.g. 1e-15), SQLite JDBC may struggle to import the correct values. To combat this, a good idea could be to use customSchema in options to define the schema using Dec...
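
A minimal sketch of what that could look like (file path, table, and column names are placeholders; customSchema is a standard Spark JDBC read option):

```python
# Force wide DECIMAL types so very large or very small values survive the
# JDBC type mapping; adjust precision and scale to the data.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlite:/dbfs/tmp/example.db")
      .option("dbtable", "measurements")
      .option("customSchema", "big_value DECIMAL(38,18), small_value DECIMAL(38,18)")
      .load())
df.printSchema()
```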

Ludo
by New Contributor III
  • 3146 Views
  • 2 replies
  • 2 kudos

[DeltaTable] Usage with Unity Catalog (ParseException)

Hi, I'm migrating my workspaces to Unity Catalog and the application to use three-level notation (catalog.database.table). See: Tutorial: Delta Lake | Databricks on AWS. I'm getting the following exception when trying to use DeltaTable.forName(string name...

Latest Reply
Ludo
New Contributor III
  • 2 kudos

Thank you for the quick feedback, @saipujari_spark. Indeed, it's working great within a notebook with Databricks Runtime 13.2, which most likely has custom behavior for Unity Catalog. It's not working in my Scala application running locally with dire...
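
For readers landing here, a minimal notebook-side sketch of the three-level call that works on recent runtimes (the catalog, schema, and table names are placeholders):

```python
from delta.tables import DeltaTable

# Unity Catalog three-level name: catalog.schema.table (example names only).
dt = DeltaTable.forName(spark, "main.sales.orders")
dt.toDF().show(5)
```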

1 More Reply
hukel
by Contributor
  • 2396 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks Add-on for Splunk v1.2 - Error in 'databricksquery' command

Is anyone else using the new v1.2 of the Databricks Add-on for Splunk? We upgraded to 1.2 and now get this error for all queries. Running process: /opt/splunk/bin/nsjail-wrapper /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-Databricks/bin/datab...

Latest Reply
hukel
Contributor
  • 1 kudos

There is a new mandatory parameter for databricksquery called account_name. This breaking change is not documented in the Splunkbase release notes, but it does appear in the docs within the Splunk app: databricksquery cluster="<cluster_name>" query="<S...
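
Based on the parameters mentioned above, an invocation would look roughly like this (a sketch only; other optional parameters may exist in the add-on's documentation):

```
databricksquery cluster="<cluster_name>" query="<SQL query>" account_name="<configured_account>"
```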

4 More Replies
yjiao
by New Contributor
  • 646 Views
  • 0 replies
  • 0 kudos

Use Databricks migration tool to export queries

Dear all, I tried to use the Databricks migration tool (https://github.com/databrickslabs/migrate) to migrate objects from one Databricks instance to another. I realized that notebooks, clusters, and jobs can be done, but queries cannot be migrated by this to...

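
One possible workaround sketch, not part of the migrate tool itself: pull query definitions from the legacy SQL Queries REST API and recreate them in the target workspace. The endpoint, host, and token below are assumptions/placeholders.

```python
import requests

resp = requests.get(
    "https://<workspace-host>/api/2.0/preview/sql/queries",
    headers={"Authorization": "Bearer <personal-access-token>"},
    params={"page_size": 50},
)
for q in resp.json().get("results", []):
    # Each entry carries the query name and SQL text, which can be re-created
    # in the destination workspace via the same API.
    print(q["name"], q["query"])
```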
GeKo
by New Contributor III
  • 807 Views
  • 0 replies
  • 0 kudos

Global init script from workspace file?

Hi Community, based on the announced change on Sep 1st disabling cluster-scoped init scripts in DBFS, I have questions re *global* init scripts. I am creating global init scripts via Terraform "databricks_global_init_script" resources. Where do those ...

Community Platform Discussions
databricks_global_init_script
init script
workspace file
shanmukh_b
by New Contributor
  • 15205 Views
  • 1 reply
  • 0 kudos

Convert string date to date after changing format

Hi, I am using Databricks SQL and came across a scenario. I have a date field whose dates are in the format 'YYYY-MM-DD'. I changed their format to 'MM/DD/YYYY' using the DATE_FORMAT() function. EFF_DT = 2000-01-14   EFF_DT_2 = DATE_FORMAT(EFF_DT, 'MM/d...

Community Platform Discussions
Databricks SQL
date
sql
string
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

If you use to_date, you will get a date column, as mentioned above. If you want to use the format MM/dd/yyyy you can use date_format, but this will return a string column. In order to use Spark date functions, the date string should comply with Spark DateTyp...
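
A minimal PySpark sketch of the difference (the column names follow the question; the sample value is the one given above):

```python
from pyspark.sql import functions as F

df = spark.createDataFrame([("2000-01-14",)], ["EFF_DT"])
df = (df.withColumn("EFF_DT_DATE", F.to_date("EFF_DT", "yyyy-MM-dd"))         # DateType
        .withColumn("EFF_DT_2", F.date_format("EFF_DT_DATE", "MM/dd/yyyy")))  # StringType
df.printSchema()
```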

