- 4170 Views
- 1 replies
- 0 kudos
Resolved! Connecting to hive metastore as well as glue catalog at the same time
Hi, is there any way we can connect to the Glue catalog as well as to the Hive metastore in the same warehouse? I can create a single instance profile and provide all the required access for buckets or for the Glue catalog. I tried with the below configuration, spark.s...
Hi @RC, Based on the provided information, you cannot dynamically switch between Glue Catalog and a Hive metastore in the same warehouse. As per the limitations mentioned in the AWS Glue metastore documentation, you must restart the cluster for new ...
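A minimal sketch of the cluster-level setting this refers to (the property comes from the AWS Glue metastore documentation; because it is read at cluster start-up, switching between Glue and the Hive metastore requires a restart rather than a runtime conf change):

```
# Cluster Spark config to use AWS Glue as the metastore (set before start-up):
spark.databricks.hive.metastore.glueCatalog.enabled true

# Omit the property (or set it to false) and restart the cluster to fall back
# to the default Hive metastore; it cannot be toggled on a running cluster.
```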
- 1037 Views
- 1 replies
- 0 kudos
Optuna results change when rerun on Databricks
The best trial results seem to change every time the same study is rerun. On Microsoft Azure this can be fixed by setting the sampler seed. However, this solution doesn't seem to work on Databricks. Does anyone know why that is the case and how to mak...
Hi @Shwang, It's not uncommon for results to vary when re-running experiments, especially when using complex machine learning or optimization techniques. To better understand your situation, please provide more details about the changes you observed ...
- 6519 Views
- 3 replies
- 3 kudos
Resolved! How to properly import spark functions?
I have the following command that runs in my databricks notebook. spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup") I have wrapped this command into a function (simplified). def get_info(): return spark.conf.get("spark.databri...
Hi @daniel23 , The behaviour you're experiencing is related to how the spark object is scoped and available within different contexts in Databricks. When you define and run code directly in a Databricks notebook, the spark object is automatically av...
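A common way to make such a helper work both in a notebook and in an imported module is to resolve the session explicitly instead of relying on the notebook-injected `spark` global. A minimal sketch under that assumption (the conf key is the one from the question; `getActiveSession` is available in PySpark 3.x):

```python
def get_info():
    # Imported modules don't receive the notebook's injected `spark` global,
    # so look up the active SparkSession explicitly.
    from pyspark.sql import SparkSession

    spark = SparkSession.getActiveSession() or SparkSession.builder.getOrCreate()
    return spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")
```

The import sits inside the function so the module itself can be imported without a Spark runtime present.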
- 474 Views
- 0 replies
- 0 kudos
Structured Streaming from TimescaleDB
I realize that the best practice would be to integrate our service with Kafka as a streaming source for Databricks, but given that the service already stores data into TimescaleDB, how can I stream data from TimescaleDB into DBX? Debezium doesn't wor...
- 1509 Views
- 1 replies
- 0 kudos
No results were found: the query results may no longer be available or you may not have permissions
When anyone (admins included) clicks on an alert task in a job run, we see the error `No results were found: the query results may no longer be available or you may not have permissions`. Should we be seeing something else, or is this a matter of a poor ...
The error message "No results were found: the query results may no longer be available or you may not have permissions" is designed to address a range of potential situations. This includes instances where data might not be accessible due to reasons ...
- 712 Views
- 1 replies
- 0 kudos
Cannot create an account to try Community Edition
Hi, whenever I try to sign up for an account, I keep getting the following message in the first step - "an error has occurred. please try again later". Could you please let me know why this could be? I tried multiple emails and seem to be having the same i...
- 807 Views
- 0 replies
- 0 kudos
Struct type limitation: possible hidden limit for parquet tables
Recently I discovered an issue when creating a PARQUET table that contains a column of type STRUCT with more than 350 string subfields. Such a table can be successfully created via a standard DDL script; nevertheless, each subsequent attempt to work wi...
- 1574 Views
- 2 replies
- 0 kudos
Databricks Assistant HIPAA? Future Cost?
With the Public Preview of Databricks Assistant, I have a few questions. 1) If the Azure tenant is HIPAA compliant, does that compliance also include the Databricks Assistant features? 2) Right now the product is free, but what will the cost be? Will we...
Hi @Chris_Shehu, in answer to your question - If the Azure tenant is HIPAA compliant, does that compliance also include the Databricks Assistant features? New features go through compliance assessment. If it is not a significant feature, after maybe ...
- 3084 Views
- 1 replies
- 1 kudos
Cron Schedule like 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI * does not work
When we use this cron schedule: 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI * so far only the 58th minute will run, but not the 28th minute (30 minutes after the 58th minute). Is there some kind of bug in the cron scheduler? Reference: h...
Hi @saberw , The cron schedule you provided is 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI *. This schedule specifies that a task should run on weekdays (Monday to Friday) between 6 AM and 5 PM. The task should start at the 58th ...
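The behaviour follows from Quartz step semantics: `58/30` in the minutes field means "start at minute 58, then every 30 minutes within the same hour", and 58 + 30 = 88 is past minute 59, so only :58 ever fires. A small illustration of that expansion (plain Python, not the Quartz implementation):

```python
def quartz_step_minutes(field):
    """Expand a Quartz-style start/step minutes field such as '58/30'."""
    start, step = (int(part) for part in field.split("/"))
    # Quartz steps do not wrap around past the end of the minute range (0-59).
    return list(range(start, 60, step))

# '58/30' fires only at :58 in each scheduled hour. To also fire at :28,
# list both minutes explicitly in the expression, e.g. '28,58'.
```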
- 4555 Views
- 1 replies
- 0 kudos
Resolved! How do you properly read database-files (.db) with Spark in Python after the JDBC update?
I have a set of database files (.db) which I need to read into my Python notebook in Databricks. I managed to do this fairly simply up until July, when an update to the SQLite JDBC library was introduced. Up until now I have read the files in question with...
When the numbers in the table are really big (millions and billions) or really low (e.g. 1e-15), SQLite JDBC may struggle to import the correct values. To combat this, a good idea could be to use customSchema in options to define the schema using Dec...
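A sketch of what that could look like when reading over JDBC (the table and column names here are hypothetical; `customSchema` maps the named columns to Spark DECIMAL types instead of whatever the driver infers):

```python
# Hypothetical columns forced to high-precision decimals to avoid
# value corruption on very large or very small numbers.
CUSTOM_SCHEMA = "id DECIMAL(38, 0), amount DECIMAL(38, 15)"

def read_sqlite_table(db_path, table):
    # Import here: this function needs a Spark runtime, the module does not.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    return (
        spark.read.format("jdbc")
        .option("url", f"jdbc:sqlite:{db_path}")
        .option("dbtable", table)
        .option("customSchema", CUSTOM_SCHEMA)
        .load()
    )
```

Columns not listed in `customSchema` keep their inferred types, so only the problematic numeric columns need to be spelled out.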
- 3146 Views
- 2 replies
- 2 kudos
[DeltaTable] Usage with Unity Catalog (ParseException)
Hi, I'm migrating my workspaces to Unity Catalog and the application to use three-level notation (catalog.database.table). See: Tutorial: Delta Lake | Databricks on AWS. I'm having the following exception when trying to use DeltaTable.forName(string name...
Thank you for the quick feedback @saipujari_spark. Indeed, it's working great within a notebook with Databricks Runtime 13.2, which most likely has custom behavior for Unity Catalog. It's not working in my Scala application running locally with dire...
- 2396 Views
- 5 replies
- 1 kudos
Resolved! Databricks Add-on for Splunk v1.2 - Error in 'databricksquery' command
Is anyone else using the new v1.2 of the Databricks Add-on for Splunk? We upgraded to 1.2 and now get this error for all queries. Running process: /opt/splunk/bin/nsjail-wrapper /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-Databricks/bin/datab...
There is a new mandatory parameter for databricksquery called account_name. This breaking change is not documented in the Splunkbase release notes, but it does appear in the docs within the Splunk app. databricksquery cluster="<cluster_name>" query="<S...
- 646 Views
- 0 replies
- 0 kudos
Use DataBricks migration tool to export query
Dear all, I tried to use the Databricks migration tool (https://github.com/databrickslabs/migrate) to migrate objects from one Databricks instance to another. I realized that notebooks, clusters, and jobs can be done, but queries cannot be migrated by this to...
- 807 Views
- 0 replies
- 0 kudos
global init script from workspace file ?
Hi Community, based on the announced change on Sep 1st, disabling cluster-scoped init scripts in DBFS, I have questions re *global* init scripts. I am creating global init scripts via terraform "databricks_global_init_script" resources. Where do those ...
- 15205 Views
- 1 replies
- 0 kudos
Convert string date to date after changing format
Hi, I am using Databricks SQL and came across a scenario. I have a date field whose dates are in the format 'YYYY-MM-DD'. I changed their format into 'MM/DD/YYYY' using the DATE_FORMAT() function. EFF_DT = 2000-01-14 EFF_DT_2 = DATE_FORMAT(EFF_DT, 'MM/d...
If you use to_date, you will get a date column as mentioned above. If you want to use the format MM/dd/yyyy, you can use date_format, but this will return a string column. In order to use Spark date functions, the date string should comply with Spark DateTyp...
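The round trip described above can be mirrored outside Spark with plain Python: formatting produces a string, and parsing that string back with the matching pattern recovers a real date value (in Spark SQL this would be `to_date(EFF_DT_2, 'MM/dd/yyyy')`):

```python
from datetime import date, datetime

eff_dt = date(2000, 1, 14)              # EFF_DT as a real date value
eff_dt_2 = eff_dt.strftime("%m/%d/%Y")  # like date_format(...): a *string* "01/14/2000"

# Like to_date(EFF_DT_2, 'MM/dd/yyyy'): parse with the matching pattern
# to get a date value back, on which date functions work again.
back = datetime.strptime(eff_dt_2, "%m/%d/%Y").date()
```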