Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

daniel23
by New Contributor II
  • 10477 Views
  • 1 replies
  • 1 kudos

How to properly import spark functions?

I have the following command that runs in my Databricks notebook: `spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")`. I have wrapped this command into a function (simplified): `def get_info(): return spark.conf.get("spark.databri...`

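A common fix, sketched below as an illustration rather than the thread's own answer: inside a notebook, `spark` is a predefined global, but inside an imported module or standalone function it is not, so fetch the active session explicitly. The error message text is my own.

```python
def get_info():
    # In a notebook, `spark` is a predefined global; in an imported
    # module it is not, so fetch the active session at call time.
    # Importing inside the function keeps the module importable even
    # where pyspark is not on the path.
    from pyspark.sql import SparkSession

    spark = SparkSession.getActiveSession()
    if spark is None:
        raise RuntimeError("No active Spark session; run this on a cluster.")
    return spark.conf.get(
        "spark.databricks.clusterUsageTags.managedResourceGroup"
    )
```

Column functions follow the same rule: `from pyspark.sql import functions as F` works anywhere, since those imports do not need a live session.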
Murat_Aykit
by New Contributor
  • 1297 Views
  • 1 replies
  • 0 kudos

Cannot create an account to try Community Edition

Hi, whenever I try to sign up for an account, I keep getting the following message in the first step: "an error has occurred. please try again later". Could you please let me know why this could be? I tried multiple emails and seem to be having the same i...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Is it the same in all the browsers? 

jgrycz
by New Contributor III
  • 12662 Views
  • 5 replies
  • 0 kudos

Response for list job runs API request doesn't have next/prev_page_token field

Hi, when I make a GET request to list job runs using `/api/2.1/jobs/runs/list`, there are no `prev_page_token` or `next_page_token` fields in the response, despite it having `has_more: True`.

Latest Reply
jgrycz
New Contributor III
  • 0 kudos

@Hubert-Dudek @Retired_mod, could you confirm that this behavior is an issue? If yes, can I report it anywhere? Otherwise, can I report it as a feature request (to get all runs)? If yes, do you have any service to report it?

4 More Replies
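A workaround sketch for collecting every run, under the assumption that some responses (like the one in this thread) return `has_more` without the token fields: follow `next_page_token` when the workspace provides it, and fall back to offset/limit paging otherwise. Host, token, and limit are placeholders.

```python
import json
import urllib.parse
import urllib.request

def list_all_job_runs(host, token, extra_params=None):
    """Collect every run from /api/2.1/jobs/runs/list.

    Follows `next_page_token` when the response includes it; otherwise
    falls back to offset/limit paging, which responses that only carry
    `has_more` (as in this thread) still support.
    """
    runs = []
    params = dict(extra_params or {})
    params.setdefault("limit", 25)
    offset = 0
    while True:
        qs = urllib.parse.urlencode(params)
        req = urllib.request.Request(
            f"{host}/api/2.1/jobs/runs/list?{qs}",
            headers={"Authorization": f"Bearer {token}"},
        )
        with urllib.request.urlopen(req) as resp:
            payload = json.load(resp)
        page = payload.get("runs", [])
        if not page:  # defensive: avoid looping forever on an empty page
            break
        runs.extend(page)
        if payload.get("next_page_token"):
            params["page_token"] = payload["next_page_token"]
        elif payload.get("has_more"):
            offset += len(page)
            params["offset"] = offset
        else:
            break
    return runs
```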
Hubert-Dudek
by Esteemed Contributor III
  • 1710 Views
  • 0 replies
  • 2 kudos

Revolutionizing Data Interaction

The new way of working with Databricks allows users to easily connect their notebook to a SQL serverless warehouse, write code in plain English, and generate output with the press of Space-Ctrl-Shift.

jlmontie
by New Contributor II
  • 2555 Views
  • 1 replies
  • 0 kudos

No database name field for PostgreSQL connection

I'm trying to get my workspace set up by connecting to my PostgreSQL database. I am following this article. The instructions under "Create a connection" are failing because my connection requires a database name. However, the database name is not a su...

Latest Reply
jlmontie
New Contributor II
  • 0 kudos

This is the error: `[CANNOT_ESTABLISH_CONNECTION] Cannot establish connection to remote POSTGRESQL database. Please check connection information and credentials e.g. host, port, user and password options. ** If you believe the information is correct, p...`

KVNARK
by Honored Contributor II
  • 13517 Views
  • 6 replies
  • 4 kudos

Databricks Rewards Portal and the points credited in it

Hi @Sujitha, could you please let us know when we will be able to see the Databricks rewards portal, and confirm that the points credited there will remain the same? Please update us on these two points.

Latest Reply
yogu
Honored Contributor III
  • 4 kudos

@Sujitha Could you please let us know when we will be able to see the Databricks rewards portal? I see it's still under construction.

5 More Replies
jomt
by New Contributor III
  • 7614 Views
  • 1 replies
  • 0 kudos

Resolved! How do you properly read database-files (.db) with Spark in Python after the JDBC update?

I have a set of database files (.db) which I need to read into my Python notebook in Databricks. I managed to do this fairly simply up until July, when an update to the SQLite JDBC library was introduced. Up until now I have read the files in question with...

Latest Reply
jomt
New Contributor III
  • 0 kudos

When the numbers in the table are really big (millions and billions) or really small (e.g. 1e-15), the SQLite JDBC driver may struggle to import the correct values. To combat this, a good idea could be to use customSchema in the options to define the schema using Dec...

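The customSchema workaround can be sketched like this; the file path, table, and column names are hypothetical, and it assumes the SQLite JDBC driver is attached to the cluster.

```python
def read_sqlite_table(spark, db_path, table):
    """Read a SQLite table over JDBC, forcing exact numeric types.

    customSchema overrides the driver's inferred types so very large or
    very small numerics land in DECIMAL columns instead of lossy floats.
    """
    return (
        spark.read.format("jdbc")
        .option("url", f"jdbc:sqlite:{db_path}")
        .option("driver", "org.sqlite.JDBC")
        .option("dbtable", table)
        .option("customSchema",
                "big_col DECIMAL(38,0), small_col DECIMAL(38,18)")
        .load()
    )
```

Only the columns named in `customSchema` are overridden; the rest keep their inferred types.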
saberw
by New Contributor
  • 5820 Views
  • 0 replies
  • 0 kudos

Cron Schedule like 0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI * does not work

When we use this cron schedule: `0 58/30 6,7,8,9,10,11,12,13,14,15,16,17 ? * MON,TUE,WED,THU,FRI *`, so far only the 58th minute runs, but not the 28th minute (30 minutes after the 58th minute). Is there some kind of bug in the cron scheduler? Reference: h...

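This matches how Quartz cron (which Databricks job schedules use) expands a `start/increment` field: the increment applies only within the field's own range and never wraps into the next hour. A small sketch of that expansion (my own illustration, not scheduler code):

```python
def expand_quartz_increment(field, lo, hi):
    """Expand a Quartz 'start/step' minute-style field over [lo, hi]."""
    if "/" in field:
        start, step = (int(part) for part in field.split("/"))
        # Values beyond `hi` are simply dropped -- no wraparound.
        return list(range(start, hi + 1, step))
    return sorted(int(v) for v in field.split(","))

# Minute field '58/30' over 0-59: the candidate after 58 is 88, which is
# out of range, so only :58 ever fires -- :28 of the next hour never does.
assert expand_quartz_increment("58/30", 0, 59) == [58]
# To fire at both :28 and :58 each hour, list the minutes explicitly:
assert expand_quartz_increment("28,58", 0, 59) == [28, 58]
```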
hukel
by Contributor
  • 3938 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks Add-on for Splunk v1.2 - Error in 'databricksquery' command

Is anyone else using the new v1.2 of the Databricks Add-on for Splunk? We upgraded to 1.2 and now get this error for all queries. Running process: /opt/splunk/bin/nsjail-wrapper /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-Databricks/bin/datab...

Latest Reply
hukel
Contributor
  • 1 kudos

There is a new mandatory parameter for databricksquery called account_name. This breaking change is not documented in the Splunkbase release notes, but it does appear in the docs within the Splunk app: `databricksquery cluster="<cluster_name>" query="<S...`

4 More Replies
GeKo
by Contributor
  • 1513 Views
  • 0 replies
  • 0 kudos

global init script from workspace file ?

Hi Community, based on the announced change on Sep 1st disabling cluster-scoped init scripts in DBFS, I have questions regarding *global* init scripts. I am creating global init scripts via terraform "databricks_global_init_script" resources. Where do those ...

Get Started Discussions
databricks_global_init_script
init script
workspace file
luna675
by New Contributor
  • 3215 Views
  • 2 replies
  • 0 kudos

Azure Databricks notebook can't be renamed

Hi all, I'm new to Databricks and just started using it for my work project. I've been creating test notebooks for practice purposes, but when I tried to rename one, either by clicking the title or by clicking Edit from the File menu, it showed "R...

Latest Reply
youssefmrini
Databricks Employee
  • 0 kudos

The issue you are encountering with renaming or moving notebooks could be due to a permissions issue in Databricks. Here are a few things you can check. Check your workspace permissions: ensure that you have the appropriate permission to edit and move ...

1 More Replies
shanmukh_b
by New Contributor
  • 24597 Views
  • 1 replies
  • 0 kudos

Convert string date to date after changing format

Hi, I am using Databricks SQL and came across a scenario. I have a date field whose dates are in the format 'YYYY-MM-DD'. I changed their format to 'MM/DD/YYYY' using the DATE_FORMAT() function. EFF_DT = 2000-01-14, EFF_DT_2 = DATE_FORMAT(EFF_DT, 'MM/d...

Get Started Discussions
Databricks SQL
date
sql
string
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

If you use to_date, you will get a date column, as mentioned above. If you want to use the format MM/dd/yyyy you can use date_format, but this will return a string column. In order to use Spark date functions, the date string should comply with Spark DateTyp...

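The distinction is easiest to see with plain Python's equivalents of the two Spark functions: `to_date` parses a string into a real date value, while `date_format` renders a date back out as a string (in Spark, the combined pattern would be `date_format(to_date(EFF_DT), 'MM/dd/yyyy')`).

```python
from datetime import date, datetime

# to_date: parse the 'yyyy-MM-dd' string into an actual date value.
eff_dt = datetime.strptime("2000-01-14", "%Y-%m-%d").date()
assert eff_dt == date(2000, 1, 14)

# date_format: render the date back out -- the result is a *string*,
# so it sorts and compares lexically, not chronologically.
eff_dt_2 = eff_dt.strftime("%m/%d/%Y")
assert eff_dt_2 == "01/14/2000"
```

If you need to keep sorting and date arithmetic working, store the date column and apply the MM/dd/yyyy formatting only for display.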
priyanka08
by New Contributor II
  • 7410 Views
  • 0 replies
  • 0 kudos

Workspace region

ERROR: Your workspace region is not yet supported for model serving; please see https://docs.databricks.com/machine-learning/model-serving/index.html#region-availability for a list of supported regions. The account is in ap-south-1. I can see there is...

DineshKumar
by New Contributor III
  • 2057 Views
  • 1 replies
  • 0 kudos

How to install AWS .pem file in databricks cluster to make a db connection to MySql RDS

I am trying to make a connection between AWS MySQL RDS and Databricks. I am using the below code to establish the connection, but it fails because the certificate is not installed. I have the .pem file with me. Could anyone help with how to install this in D...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, Could you please provide the error code or the full error stack? Please tag @Debayan with your next comment which will notify me. Thank you!

