Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

Chris_Shehu
by Valued Contributor III
  • 1623 Views
  • 2 replies
  • 0 kudos

Databricks Assistant HIPAA? Future Cost?

With the Public Preview of Databricks Assistant, I have a few questions. 1) If the Azure tenant is HIPAA compliant, does that compliance also cover the Databricks Assistant features? 2) Right now the product is free, but what will the cost be? Will we...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Chris_Shehu, in answer to your question "If the Azure tenant is HIPAA compliant, does that compliance also include the Databricks Assistant features?": new features go through compliance assessment. If it is not a significant feature, after maybe ...

1 More Reply
saberw
by New Contributor
  • 3287 Views
  • 1 reply
  • 1 kudos

Cron Schedule like 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI * does not work

When we use this cron schedule: 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI *, so far only the 58th minute will run, but not the 28th minute (30 minutes after the 58th minute). Is there some kind of bug in the cron scheduler? Reference: h...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @saberw, the cron schedule you provided is 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI *. This schedule specifies that a task should run on weekdays (Monday to Friday) between 6 AM and 5 PM. The task should start at the 58th ...

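To make the reset behavior concrete: in both Quartz cron (which Databricks jobs use) and standard cron, a step expression like 58/30 in the minutes field restarts each hour, and 58 + 30 overflows the 0-59 minute range, so only :58 ever fires. A minimal sketch, assuming the third-party croniter package and five-field standard cron syntax:

```python
# A minimal sketch, assuming the croniter package (pip install croniter).
# "58/30" in the minutes field means "start at :58, step by 30 within the
# same hour"; 58 + 30 = 88 > 59, so only :58 ever matches.
from datetime import datetime
from croniter import croniter

base = datetime(2023, 8, 1, 6, 0)  # a Tuesday, 06:00

broken = croniter("58/30 6-17 * * 1-5", base)
print([broken.get_next(datetime).strftime("%H:%M") for _ in range(3)])
# ['06:58', '07:58', '08:58'] -- the :28 run never fires

# Listing the minutes explicitly fires at :28 and :58 of each hour:
fixed = croniter("28,58 6-17 * * 1-5", base)
print([fixed.get_next(datetime).strftime("%H:%M") for _ in range(3)])
# ['06:28', '06:58', '07:28']
```

Note that the explicit 28,58 list also fires at 06:28; if the first run must be 06:58, the 6 AM hour needs its own schedule.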
jomt
by New Contributor III
  • 4751 Views
  • 1 reply
  • 0 kudos

Resolved! How do you properly read database-files (.db) with Spark in Python after the JDBC update?

I have a set of database files (.db) which I need to read into my Python notebook in Databricks. I managed to do this fairly simply up until July, when an update to the SQLite JDBC library was introduced. Up until now I have read the files in question with...

Latest Reply
jomt
New Contributor III
  • 0 kudos

When the numbers in the table are really big (millions and billions) or really low (e.g. 1e-15), SQLite JDBC may struggle to import the correct values. To combat this, a good idea could be to use customSchema in options to define the schema using Dec...

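A minimal sketch of the customSchema approach the reply points at; the JDBC URL, table, and column names are hypothetical, and the DECIMAL precision should match the actual data:

```python
# A minimal sketch: customSchema forces the JDBC reader to exact Spark types,
# avoiding precision loss on very large or very small SQLite numbers.
# The URL, table, and column names below are hypothetical.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlite:/dbfs/tmp/example.db")
      .option("dbtable", "measurements")
      .option("customSchema", "big_count DECIMAL(38,0), tiny_value DOUBLE")
      .load())
df.printSchema()
```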
Ludo
by New Contributor III
  • 3291 Views
  • 2 replies
  • 2 kudos

[DeltaTable] Usage with Unity Catalog (ParseException)

Hi, I'm migrating my workspaces to Unity Catalog and the application to use three-level notation (catalog.database.table). See: Tutorial: Delta Lake | Databricks on AWS. I'm having the following exception when trying to use DeltaTable.forName(string name...

Latest Reply
Ludo
New Contributor III
  • 2 kudos

Thank you for the quick feedback @saipujari_spark. Indeed, it's working great within a notebook on Databricks Runtime 13.2, which most likely has custom behavior for Unity Catalog. It's not working in my Scala application running locally with dire...

1 More Reply
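For reference, a minimal sketch of the three-level call that works once the session is configured for Unity Catalog; the catalog, schema, and table names are hypothetical:

```python
# A minimal sketch; "main.analytics.events" below is a hypothetical name.
# Outside Databricks runtimes, the SparkSession must be built with the
# Delta and Unity Catalog extensions for three-level names to parse.
from delta.tables import DeltaTable

dt = DeltaTable.forName(spark, "main.analytics.events")
dt.toDF().show(5)
```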
hukel
by Contributor
  • 2512 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks Add-on for Splunk v1.2 - Error in 'databricksquery' command

Is anyone else using the new v1.2 of the Databricks Add-on for Splunk? We upgraded to 1.2 and now get this error for all queries. Running process: /opt/splunk/bin/nsjail-wrapper /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-Databricks/bin/datab...

Latest Reply
hukel
Contributor
  • 1 kudos

There is a new mandatory parameter for databricksquery called account_name. This breaking change is not documented in the Splunkbase release notes, but it does appear in the docs within the Splunk app. databricksquery cluster="<cluster_name>" query="<S...

4 More Replies
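A hedged sketch of the full command shape after v1.2, in SPL; every placeholder is hypothetical, and the exact argument list should be confirmed against the docs bundled with the app:

```
| databricksquery account_name="<account_name>" cluster="<cluster_name>"
    query="SELECT 1"
```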
yjiao
by New Contributor
  • 683 Views
  • 0 replies
  • 0 kudos

Use Databricks migration tool to export queries

Dear all, I tried to use the Databricks migration tool (https://github.com/databrickslabs/migrate) to migrate objects from one Databricks instance to another. I realized that notebooks, clusters, and jobs can be done, but queries cannot be migrated by this to...

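Since the thread has no reply, one hedged workaround: Databricks SQL query definitions were exposed through a preview REST endpoint, so they can be exported directly. The endpoint path, response fields, host, and token below are assumptions to verify against your workspace's API docs:

```python
# A hedged sketch: export Databricks SQL query definitions over REST.
# The endpoint and response fields follow the legacy preview API and may
# differ on newer workspaces; host and token are hypothetical placeholders.
import requests

HOST = "https://<workspace-host>"
TOKEN = "<personal-access-token>"

resp = requests.get(
    f"{HOST}/api/2.0/preview/sql/queries",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"page_size": 50},
)
resp.raise_for_status()
for q in resp.json().get("results", []):
    print(q.get("id"), q.get("name"))
```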
GeKo
by New Contributor III
  • 856 Views
  • 0 replies
  • 0 kudos

global init script from workspace file?

Hi Community, based on the announced change on Sep 1st disabling cluster-scoped init scripts in DBFS, I have questions re *global* init scripts. I am creating global init scripts via Terraform "databricks_global_init_script" resources. Where do those ...

Community Platform Discussions
databricks_global_init_script
init script
workspace file
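For context, a minimal sketch of the resource the post describes, assuming the databricks_global_init_script resource of the Databricks Terraform provider (attribute names should be verified against the provider docs; the script path is hypothetical):

```hcl
# A minimal sketch; verify attributes against the Databricks provider docs.
resource "databricks_global_init_script" "example" {
  name    = "example-global-init"
  source  = "${path.module}/scripts/init.sh" # hypothetical local path
  enabled = true
}
```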
shanmukh_b
by New Contributor
  • 15802 Views
  • 1 reply
  • 0 kudos

Convert string date to date after changing format

Hi, I am using Databricks SQL and came across a scenario. I have a date field whose dates are in the format 'YYYY-MM-DD'. I changed their format to 'MM/DD/YYYY' using the DATE_FORMAT() function. EFF_DT = 2000-01-14; EFF_DT_2 = DATE_FORMAT(EFF_DT, 'MM/d...

Community Platform Discussions
Databricks SQL
date
sql
string
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

If you use to_date, you will get a date column as mentioned above. If you want to use the format MM/dd/yyyy you can use date_format, but this will return a string column. In order to use Spark date functions, the date string should comply with Spark DateTyp...

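A minimal sketch of the distinction the reply draws, using the column name from the question:

```python
# A minimal sketch: to_date yields a DateType column usable with Spark date
# functions; date_format yields a display string (StringType).
from pyspark.sql import functions as F

df = spark.createDataFrame([("2000-01-14",)], ["EFF_DT"])
df = (df
      .withColumn("EFF_DT_date", F.to_date("EFF_DT"))                      # DateType
      .withColumn("EFF_DT_2", F.date_format("EFF_DT_date", "MM/dd/yyyy")))  # StringType
df.printSchema()
df.show()
```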
DineshKumar
by New Contributor III
  • 1118 Views
  • 1 reply
  • 0 kudos

How to install an AWS .pem file on a Databricks cluster to make a DB connection to MySQL RDS

I am trying to make a connection between AWS MySQL RDS and Databricks. I am using the code below to establish the connection, but it failed because the certificate is not installed. I have the .pem file with me. Could anyone help on how to install this in D...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, could you please provide the error code or the full error stack? Please tag @Debayan in your next comment, which will notify me. Thank you!

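One hedged approach to the question above: MySQL Connector/J trusts a Java keystore rather than a raw .pem, so the certificate is usually imported into a truststore first and then referenced in the JDBC URL. The host, paths, and credentials below are hypothetical:

```python
# A hedged sketch; host, paths, and credentials are hypothetical.
# First import the RDS CA .pem into a JKS truststore (e.g. in an init script):
#   keytool -importcert -alias rds-ca -file /dbfs/certs/rds-ca.pem \
#           -keystore /dbfs/certs/rds-truststore.jks \
#           -storepass changeit -noprompt
df = (spark.read.format("jdbc")
      .option("url",
              "jdbc:mysql://<rds-host>:3306/<db>"
              "?sslMode=VERIFY_CA"
              "&trustCertificateKeyStoreUrl=file:/dbfs/certs/rds-truststore.jks"
              "&trustCertificateKeyStorePassword=changeit")
      .option("dbtable", "<table>")
      .option("user", "<user>")
      .option("password", "<password>")
      .load())
```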
FutureLegend
by New Contributor III
  • 3691 Views
  • 3 replies
  • 2 kudos

Resolved! Download Dolly model on local machine

Hi~ I am new to LLM engineering and am trying to download the Dolly-v2-7b model to my local machine, so I don't need to connect to the internet each time I run Dolly-v2-7b. Is it possible to do that? Thanks a lot!

Latest Reply
FutureLegend
New Contributor III
  • 2 kudos

Hi Kaniz and Sean, thanks for your responses and time. I was trying Kaniz's method, but got a reply from Sean, so I tried that too. I downloaded the file from the link Sean provided and saved it on my local machine, then used the code for Dolly-v2 (htt...

2 More Replies
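A minimal sketch of the usual Hugging Face route for a local copy, assuming the transformers package and enough disk space; the local path is hypothetical:

```python
# A minimal sketch, assuming the transformers package (pip install transformers).
# Downloads once, saves locally, then loads offline; the path is hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "databricks/dolly-v2-7b"
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name)

tok.save_pretrained("./dolly-v2-7b")
model.save_pretrained("./dolly-v2-7b")

# Later, fully offline:
tok = AutoTokenizer.from_pretrained("./dolly-v2-7b")
model = AutoModelForCausalLM.from_pretrained("./dolly-v2-7b")
```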
TalY
by New Contributor II
  • 5868 Views
  • 7 replies
  • 0 kudos

Python notebook crashes with "The Python kernel is unresponsive"

While using a Python notebook that works on my machine, it crashes at the same point with the errors "The Python kernel is unresponsive" and "The Python process exited with exit code 134 (SIGABRT: Aborted).", but with no stack trace for debugging the ...

Latest Reply
TalY
New Contributor II
  • 0 kudos

I am using DBR 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12). Fatal error: The Python kernel is unresponsive. The Python process exited with exit code 134 (S...

6 More Replies
Hani4hanuman
by New Contributor II
  • 1893 Views
  • 2 replies
  • 1 kudos

Databricks notebook issue

Hi, I'm trying to run an ADF pipeline. However, it fails at the Notebook activity with the error below. Error: NoSuchMethodError: com.microsoft.sqlserver.jdbc.SQLServerBulkCopy.writeToServer(Lcom/microsoft/sqlserver/jdbc/ISQLServerBulkRecord;)V I think i...

Latest Reply
Hani4hanuman
New Contributor II
  • 1 kudos

@shan_chandra Thanks for your reply. As per your suggestion, I changed the Databricks Runtime version from 9.1 LTS to 12.2 LTS. But after this change, when I check the library you provided (i.e. com.microsoft.azure:spark-mssql-connector_2.12:1.3.0) under Maven, it is not...

1 More Reply
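A hedged note on the thread above: the Spark 3.x artifacts of this connector are published on Maven Central under coordinates that may differ from a bare 1.3.0 (for example BETA-suffixed versions); verify the exact coordinate there. Once the library is attached to the cluster, usage looks roughly like this sketch, with hypothetical server, table, and credential values:

```python
# A hedged sketch; server, database, table, and credentials are hypothetical.
# Requires the spark-mssql-connector library attached to the cluster
# (verify the exact Maven coordinate for your Spark version).
df = spark.range(10).withColumnRenamed("id", "value")

(df.write
   .format("com.microsoft.sqlserver.jdbc.spark")
   .mode("append")
   .option("url", "jdbc:sqlserver://<server>:1433;databaseName=<db>")
   .option("dbtable", "dbo.example_table")
   .option("user", "<user>")
   .option("password", "<password>")
   .save())
```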
lightningStrike
by New Contributor III
  • 2532 Views
  • 3 replies
  • 0 kudos

Unable to install pymqi in Azure Databricks

Hi, I am trying to install pymqi via the command pip install pymqi. However, I am getting the error message below: Python interpreter will be restarted. Collecting pymqi Using cached pymqi-1.12.10.tar.gz (91 kB) Installing build dependencies: started Inst...

Latest Reply
sean_owen
Honored Contributor II
  • 0 kudos

I don't think so, because it won't be specific to Databricks: this is all a property of the third-party packages. And there are billions of possible library conflicts. But this is not an example of a package conflict. It's an example of not complet...

2 More Replies
alejandrofm
by Valued Contributor
  • 3450 Views
  • 1 reply
  • 1 kudos

Resolved! Configure job to use one cluster instance for multiple jobs

Hi! I have several tiny jobs that run in parallel and I want them to run on the same cluster:
- Task type Python Script: I send the parameters this way to run the PySpark scripts.
- Job compute cluster created as (copied JSON from the Databricks Job UI)
Ho...

Community Platform Discussions
cluster
job
job cluster
Latest Reply
KoenZandvliet
New Contributor III
  • 1 kudos

Unfortunately, running multiple jobs in parallel using a single job cluster is not supported (yet). New in Databricks is the possibility to create a job that orchestrates multiple jobs. These jobs will, however, still use their own cluster (configurati...

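What does work within a single job is sharing one job cluster across parallel tasks via job_cluster_key. A hedged sketch of the Jobs JSON shape; the node type, script paths, and names are hypothetical:

```json
{
  "name": "parallel-tiny-tasks",
  "job_clusters": [
    {
      "job_cluster_key": "shared",
      "new_cluster": {
        "spark_version": "12.2.x-scala2.12",
        "node_type_id": "<node-type>",
        "num_workers": 1
      }
    }
  ],
  "tasks": [
    {
      "task_key": "a",
      "job_cluster_key": "shared",
      "spark_python_task": { "python_file": "dbfs:/scripts/a.py" }
    },
    {
      "task_key": "b",
      "job_cluster_key": "shared",
      "spark_python_task": { "python_file": "dbfs:/scripts/b.py" }
    }
  ]
}
```

Tasks with no depends_on between them run in parallel on the shared cluster.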
