Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

hukel
by Contributor
  • 4156 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks Add-on for Splunk v1.2 - Error in 'databricksquery' command

Is anyone else using the new v1.2 of the Databricks Add-on for Splunk? We upgraded to 1.2 and now get this error for all queries. Running process: /opt/splunk/bin/nsjail-wrapper /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-Databricks/bin/datab...

Latest Reply
hukel
Contributor
  • 1 kudos

There is a new mandatory parameter for databricksquery called account_name. This breaking change is not documented in the Splunkbase release notes, but it does appear in the docs within the Splunk app. databricksquery cluster="<cluster_name>" query="<S...

4 More Replies
GeKo
by Contributor
  • 1594 Views
  • 0 replies
  • 0 kudos

Global init script from workspace file?

Hi Community, based on the announced change on Sep 1st disabling cluster-scoped init scripts in DBFS, I have questions re *global* init scripts. I am creating global init scripts via terraform "databricks_global_init_script" resources. Where do those ...

databricks_global_init_script
init script
workspace file
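For context, a minimal sketch of the resource this question is about (the name and script path are placeholders; see the Databricks terraform provider docs for the full attribute set):

```hcl
resource "databricks_global_init_script" "example" {
  name    = "example-global-init"            # display name shown in the admin console
  source  = "${path.module}/scripts/init.sh" # local file whose contents are uploaded
  enabled = true
}
```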
luna675
by New Contributor
  • 3350 Views
  • 2 replies
  • 0 kudos

Azure Databricks notebook can't be renamed

Hi all, I'm new to Databricks and just started using it for my work project. I've been trying to create test notebooks for practice purposes, but when I tried to rename one, either by clicking the title or by clicking edit from file, it showed "R...

Latest Reply
youssefmrini
Databricks Employee
  • 0 kudos

The issue you are encountering with renaming or moving notebooks could be due to a permissions issue in Databricks. Here are a few things you can check: Check your workspace permissions: Ensure that you have the appropriate permission to edit and move ...

1 More Replies
shanmukh_b
by New Contributor
  • 25387 Views
  • 1 reply
  • 0 kudos

Convert string date to date after changing format

Hi, I am using Databricks SQL and came across a scenario. I have a date field whose dates are in the format 'YYYY-MM-DD'. I changed their format to 'MM/DD/YYYY' using the DATE_FORMAT() function. EFF_DT = 2000-01-14   EFF_DT_2 = DATE_FORMAT(EFF_DT, 'MM/d...

Databricks SQL
date
sql
string
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

If you use to_date, you will get a date column as mentioned above. If you want to use the format MM/dd/yyyy you can use date_format, but this will return a string column. In order to use Spark date functions, the date string should comply with Spark DateTyp...

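To illustrate the distinction outside Spark, here is a minimal plain-Python sketch of the same idea (parse the string into a real date value, then format it back into a string; the EFF_DT value is taken from the question):

```python
from datetime import date, datetime

eff_dt = "2000-01-14"  # string in yyyy-MM-dd, as in the question

# Equivalent of Spark's to_date(): yields a real date value
parsed = datetime.strptime(eff_dt, "%Y-%m-%d").date()

# Equivalent of Spark's date_format(): yields a plain string again
formatted = parsed.strftime("%m/%d/%Y")

print(parsed, type(parsed).__name__)        # 2000-01-14 date
print(formatted, type(formatted).__name__)  # 01/14/2000 str
```

The takeaway mirrors the reply: keep the value as a date for date arithmetic, and only format it to MM/dd/yyyy at display time.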
priyanka08
by New Contributor II
  • 7499 Views
  • 0 replies
  • 0 kudos

Workspace region

ERROR: Your workspace region is not yet supported for model serving; please see https://docs.databricks.com/machine-learning/model-serving/index.html#region-availability for a list of supported regions. The account is in ap-south-1. I can see there is...

DineshKumar
by New Contributor III
  • 2157 Views
  • 1 reply
  • 0 kudos

How to install an AWS .pem file in a Databricks cluster to make a DB connection to MySQL RDS

I am trying to make a connection between AWS MySQL RDS and Databricks. I am using the below code to establish the connection, but it fails because the certificate is not installed. I have the .pem file with me. Could anyone help on how to install this in D...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you please provide the error code or the full error stack? Please tag @Debayan in your next comment, which will notify me. Thank you!

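Since the thread does not show a fix, here is a hedged sketch of one common approach: upload the RDS CA .pem to DBFS and point the JDBC driver at it. All host, database, and path names below are hypothetical, and the serverSslCert URL option assumes a MariaDB/MySQL JDBC driver that accepts a PEM path (verify against the driver version on your cluster):

```python
# Hypothetical paths and hostnames - adjust for your workspace.
# Step 1 (one-time): upload the RDS CA bundle to DBFS, e.g. via the UI or
#   dbutils.fs.cp("file:/tmp/rds-ca.pem", "dbfs:/FileStore/certs/rds-ca.pem")
pem_path = "/dbfs/FileStore/certs/rds-ca.pem"  # DBFS FUSE path visible to the JDBC driver

jdbc_url = (
    "jdbc:mysql://my-rds.ap-south-1.rds.amazonaws.com:3306/mydb"
    "?useSSL=true&serverSslCert=" + pem_path
)

# Step 2: read over JDBC inside a notebook (sketch; `spark` and `dbutils`
# only exist there, so this part is commented out):
# df = (spark.read.format("jdbc")
#         .option("url", jdbc_url)
#         .option("dbtable", "my_table")
#         .option("user", "my_user")
#         .option("password", dbutils.secrets.get("scope", "rds-password"))
#         .load())
```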
FutureLegend
by New Contributor III
  • 6796 Views
  • 2 replies
  • 1 kudos

Resolved! Download Dolly model on local machine

Hi~ I am new to LLM engineering and am trying to download the Dolly-v2-7b model onto my local machine, so I don't need to connect to the internet each time I run Dolly-v2-7b. Is it possible to do that? Thanks a lot!

Latest Reply
FutureLegend
New Contributor III
  • 1 kudos

Hi Kaniz and Sean, thanks for your responses and time. I was trying Kaniz's method, but got a reply from Sean, so I tried that too. I downloaded the file from the link Sean provided and saved it on my local machine, then used the code for Dolly v2 (htt...

1 More Replies
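For readers landing here, a hedged sketch of the usual Hugging Face pattern behind this thread: download databricks/dolly-v2-7b once while online, save it to a local folder, and load from that folder afterwards. The local directory name is arbitrary, and the download/load calls are commented out because they fetch several GB:

```python
# Assumes the `transformers` package is installed; calls below are commented
# out because they download several GB of weights.
# from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "databricks/dolly-v2-7b"  # Hugging Face Hub id of the model
local_dir = "./dolly-v2-7b"          # arbitrary local folder

# First run (online): download, then persist to disk.
# tokenizer = AutoTokenizer.from_pretrained(model_id)
# model = AutoModelForCausalLM.from_pretrained(model_id)
# tokenizer.save_pretrained(local_dir)
# model.save_pretrained(local_dir)

# Later runs (offline): load from the local folder only.
# tokenizer = AutoTokenizer.from_pretrained(local_dir)
# model = AutoModelForCausalLM.from_pretrained(local_dir)
```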
TalY
by New Contributor II
  • 13546 Views
  • 5 replies
  • 0 kudos

Python notebook crashes with "The Python kernel is unresponsive"

While using a Python notebook that works on my machine, it crashes at the same point with the errors "The Python kernel is unresponsive" and "The Python process exited with exit code 134 (SIGABRT: Aborted).", but with no stack trace for debugging the ...

Latest Reply
TalY
New Contributor II
  • 0 kudos

I am using the following: DBR 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12). Fatal error: The Python kernel is unresponsive. The Python process exited with exit code 134 (S...

4 More Replies
yogu
by Honored Contributor III
  • 4294 Views
  • 2 replies
  • 2 kudos

Resolved! Databricks community reward store is not working

Hi guys, does anybody know when the Databricks community reward store portal will open? I see it's still under construction. @Retired_mod @Sujitha

Latest Reply
yogu
Honored Contributor III
  • 2 kudos

thanks @Sujitha 

1 More Replies
Hani4hanuman
by New Contributor II
  • 2864 Views
  • 2 replies
  • 1 kudos

Databricks notebook issue

Hi, I'm trying to run an ADF pipeline. However, it fails at the Notebook activity with the below error. Error: NoSuchMethodError: com.microsoft.sqlserver.jdbc.SQLServerBulkCopy.writeToServer(Lcom/microsoft/sqlserver/jdbc/ISQLServerBulkRecord;)V I think i...

Latest Reply
Hani4hanuman
New Contributor II
  • 1 kudos

@shan_chandra Thanks for your reply. As per your suggestion I changed the Databricks runtime version from 9.1 LTS to 12.2 LTS, but after this change, when I check the library you provided (i.e. com.microsoft.azure:spark-mssql-connector_2.12:1.3.0) under Maven, it is not...

1 More Replies
DeveshKaushik
by New Contributor II
  • 3946 Views
  • 6 replies
  • 0 kudos

How to close a GKE Databricks cluster

I have a GKE Databricks cluster with multiple nodes running. I want to shut down the nodes when not in use, to reduce cost. Is there any way to pause a GKE cluster on demand?

Latest Reply
Ajay-Pandey
Databricks MVP
  • 0 kudos

Hi @DeveshKaushik, if this is the case, then please create a support ticket with Databricks.

5 More Replies
lightningStrike
by New Contributor III
  • 4198 Views
  • 3 replies
  • 0 kudos

Unable to install pymqi in Azure Databricks

Hi, I am trying to install pymqi via the below command: pip install pymqi. However, I am getting the below error message: Python interpreter will be restarted. Collecting pymqi Using cached pymqi-1.12.10.tar.gz (91 kB) Installing build dependencies: started Inst...

Latest Reply
sean_owen
Databricks Employee
  • 0 kudos

I don't think so, because it won't be specific to Databricks; this is all a property of the third-party packages. And there are billions of possible library conflicts. But this is not an example of a package conflict. It's an example of not complet...

2 More Replies
alejandrofm
by Valued Contributor
  • 5921 Views
  • 1 reply
  • 1 kudos

Resolved! Configure job to use one cluster instance to multiple jobs

Hi! I have several tiny jobs that run in parallel and I want them to run on the same cluster: - Tasks of type Python Script: I send the parameters this way to run the PySpark scripts. - Job compute cluster created as (copied JSON from the Databricks Job UI). Ho...

cluster
job
job cluster
Latest Reply
KoenZandvliet
New Contributor III
  • 1 kudos

Unfortunately, running multiple jobs in parallel using a single job cluster is not supported (yet). New in Databricks is the possibility to create a job that orchestrates multiple jobs. These jobs will, however, still use their own cluster (configurati...

div19882021
by New Contributor
  • 1403 Views
  • 1 reply
  • 1 kudos

Is there a way to display the worker types based on Spark version selection using an API?

Is there a solution that allows us to display the worker types or driver types based on the selected Spark version using an API?

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

Can you clarify what you mean? Worker and driver types are not related to Spark version.

pabloanzorenac
by New Contributor II
  • 3035 Views
  • 2 replies
  • 2 kudos

Resolved! Reduce EBS Default Volumes

By default Databricks creates 2 volumes: one with 30GB and the other with 150GB. We have a lot of nodes in our pools, and so many terabytes of volumes, but we are not making any use of them in the jobs. Is there any way to reduce the volumes? ...

Latest Reply
sean_owen
Databricks Employee
  • 2 kudos

Yes, EBS volumes are essential for shuffle spill, for example. You are probably using them!

1 More Replies
