Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

haraldh
by New Contributor II
  • 1744 Views
  • 1 replies
  • 2 kudos

Databricks JDBC driver connection pooling support

When using Camel JDBC with the Databricks JDBC driver I get an error: Caused by: java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature. Is there any means to work around this limitation?

Latest Reply
swethaNandan
Databricks Employee
  • 2 kudos

Tools like SDI can connect to a generic JDBC source such as Databricks SQL Warehouse via the SDI Camel JDBC adapter. Can you see if these help: https://help.sap.com/docs/HANA_SMART_DATA_INTEGRATION/7952ef28a6914997abc01745fef1b607/1247c9518...

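Editor's note: since the driver reports the pooling feature as unsupported, one possible workaround is to manage a small pool yourself around whatever connection factory you use. The sketch below is illustrative only, using a stub factory; in real code the factory might be `databricks.sql.connect(...)` from the `databricks-sql-connector` package (an assumption, not something confirmed in the thread).

```python
import queue

class SimpleConnectionPool:
    """Minimal blocking pool around any zero-argument connection factory.

    Swap the stub factory for a real one (e.g. a lambda calling
    databricks.sql.connect) in actual use.
    """

    def __init__(self, connect, size=4):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(connect())

    def acquire(self, timeout=30):
        # Blocks until a connection is free, so callers never
        # exceed the configured pool size.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        self._pool.put(conn)

# Demo with a stub connection class (no network involved):
class StubConn:
    pass

pool = SimpleConnectionPool(StubConn, size=2)
c1 = pool.acquire()
c2 = pool.acquire()
pool.release(c1)
c3 = pool.acquire()  # reuses the released connection instead of opening a new one
```

This sidesteps the driver's missing `javax.sql` pooling support because the driver only ever sees plain connections.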
System1999
by New Contributor III
  • 5766 Views
  • 7 replies
  • 0 kudos

My 'Data' menu item shows 'No Options' for Databases. How can I fix this?

Hi, I'm new to Databricks and I've signed up for the Community Edition. First, I've noticed that I cannot return to a previously created cluster, as I get a message telling me that restarting a cluster is not available to me. Ok, inconvenient, but I...

Latest Reply
System1999
New Contributor III
  • 0 kudos

Hi @Suteja Kanuri, I get the error message under Data before I've created a cluster. Then I still get it when I've created a cluster and a notebook (having attached the notebook to the cluster). Thanks.

6 More Replies
Student185
by New Contributor III
  • 9170 Views
  • 7 replies
  • 5 kudos

Resolved! Is the long-term free version for students still available now?

Dear sir/madam, I've tried lots of methods in order to access the long-term Databricks free version (the community version for students). Also, I followed the instructions - Introduction to Databricks - on Coursera step by step: https://www.coursera.org/l...

Latest Reply
shreeves
New Contributor II
  • 5 kudos

Look for the "Community Edition" in small print below the button

6 More Replies
Anonymous
by Not applicable
  • 628 Views
  • 1 replies
  • 2 kudos

www.databricks.com

Dear Community - @Youssef Mrini will answer all your questions on April 19, 2023 from 9:00am to 10:00am GMT during the Databricks EMEA Office Hours. Make sure to join this amazing 'Ask Me Anything' session by Databricks - https://www.databricks.com/r...

Latest Reply
youssefmrini
Databricks Employee
  • 2 kudos

It was a successful office hours session. Make sure to join the next one.

youssefmrini
by Databricks Employee
  • 1537 Views
  • 1 replies
  • 0 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 0 kudos

Make sure to watch the following video: https://www.youtube.com/watch?v=DkzwFTC7WWs. This section lists the requirements for Databricks Connect. Only Databricks Runtime 13.0 ML and Databricks Runtime 13.0 are supported. Only clusters that are compatible w...

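Editor's note: the reply above states a hard runtime floor (13.0 / 13.0 ML) for Databricks Connect. A small helper like the one below, a sketch not part of any Databricks API, can gate connection attempts on a cluster's runtime string before trying to attach:

```python
def meets_dbconnect_minimum(runtime_version: str, minimum=(13, 0)) -> bool:
    """Return True if a runtime string like '13.0.x-scala2.12' or
    '13.0 ML' is at or above the required (major, minor) version.
    Illustrative helper; the version strings are the usual DBR formats."""
    # Take the leading 'major.minor' portion of the version string.
    head = runtime_version.split()[0].split("-")[0]
    parts = head.split(".")
    major, minor = int(parts[0]), int(parts[1])
    return (major, minor) >= minimum

print(meets_dbconnect_minimum("13.0 ML"))           # supported
print(meets_dbconnect_minimum("13.0.x-scala2.12"))  # supported
print(meets_dbconnect_minimum("12.2 LTS"))          # below the floor
```

Checking this up front gives a clear local error instead of an opaque handshake failure against an incompatible cluster.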
Hubert-Dudek
by Esteemed Contributor III
  • 1327 Views
  • 2 replies
  • 8 kudos

Databricks has recently introduced a new SQL function allowing easy integration of LLM (Language Model) models with Databricks. This exciting new feat...

Databricks has recently introduced a new SQL function allowing easy integration of LLM (Language Model) models with Databricks. This exciting new feature simplifies calling LLM models, making them more accessible and user-friendly. To try it out, che...

Latest Reply
Vartika
Databricks Employee
  • 8 kudos

Hi @Hubert Dudek, I wanted to take a moment to express our gratitude for sharing your valuable insights and information with us. We truly appreciate your contribution. You are awesome! Cheer...

1 More Replies
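Editor's note: the SQL function referred to here is presumably `ai_query`, which applies a model serving endpoint from SQL. A hedged sketch of how one might assemble such a query from Python is below; the endpoint, table, and column names are hypothetical placeholders:

```python
def build_ai_query_sql(endpoint: str, prompt_column: str, table: str) -> str:
    """Build a SQL statement that applies an LLM serving endpoint to a
    column via the ai_query SQL function. All names passed in are
    placeholders supplied by the caller."""
    return (
        f"SELECT {prompt_column}, "
        f"ai_query('{endpoint}', {prompt_column}) AS llm_response "
        f"FROM {table}"
    )

sql = build_ai_query_sql("my-llm-endpoint", "prompt", "main.demo.prompts")
print(sql)
# In a notebook you would then run: spark.sql(sql)
```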
JLSy
by New Contributor III
  • 14352 Views
  • 5 replies
  • 6 kudos

Cannot convert Parquet type INT64 to Photon type string

I am receiving an error similar to the post in this link: https://community.databricks.com/s/question/0D58Y00009d8h4tSAA/cannot-convert-parquet-type-int64-to-photon-type-doubleHowever, instead of type double the error message states that the type can...

Latest Reply
Anonymous
Not applicable
  • 6 kudos

@John Laurence Sy: It sounds like you are encountering a schema conversion error when trying to read in a Parquet file that contains an INT64 column that cannot be converted to a string type. This error can occur when the Parquet file has a schema t...

4 More Replies
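Editor's note: a common way out of this class of mismatch is to read the file with its real Parquet types and cast explicitly afterwards, rather than letting the reader coerce. The sketch below builds `selectExpr` cast strings; the column names and types are hypothetical, not taken from the thread:

```python
def cast_exprs(schema: dict, overrides: dict) -> list:
    """Build selectExpr strings that cast selected columns explicitly,
    e.g. an INT64 (bigint) Parquet column that the table schema expects
    as a string. `schema` maps column -> actual type; `overrides` maps
    column -> desired type. Names here are placeholders."""
    exprs = []
    for col, dtype in schema.items():
        target = overrides.get(col)
        exprs.append(f"CAST({col} AS {target}) AS {col}" if target else col)
    return exprs

exprs = cast_exprs(
    {"id": "bigint", "name": "string"},
    {"id": "string"},  # read id with its real bigint type, then cast
)
print(exprs)
# In a notebook: spark.read.parquet(path).selectExpr(*exprs)
```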
Aakash_Bhandari
by New Contributor III
  • 6573 Views
  • 6 replies
  • 2 kudos

Resolved! Accessing a FastAPI endpoint using Personal Access Token (PAT)

Hello Community, I have a FastAPI endpoint on a cluster with address 0.0.0.0:8084/predict. And I want to send a request to this endpoint from a React App which is locally hosted on my computer. I have a Personal Access Token for the workspace but don't ...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Aakash Bhandari: To send a request from a React App to a FastAPI endpoint on a Databricks cluster using a Personal Access Token (PAT), you can use the requests module in Python to make HTTP requests. Here's an example of how to use requests to send ...

5 More Replies
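Editor's note: a minimal sketch of the call with only the standard library. The base URL is deliberately a placeholder: a PAT is sent as a Bearer token, but a 0.0.0.0:8084 address is only reachable from inside the cluster, so a locally hosted React app would need a publicly routable URL (e.g. the cluster's driver proxy) and CORS enabled on the FastAPI side; those details are assumptions, not confirmed in the thread.

```python
import json
import urllib.request

def databricks_headers(pat: str) -> dict:
    """Headers a Databricks personal access token is sent with."""
    return {"Authorization": f"Bearer {pat}", "Content-Type": "application/json"}

def predict(base_url: str, pat: str, payload: dict) -> dict:
    """POST to a FastAPI /predict route at base_url (a placeholder)."""
    req = urllib.request.Request(
        f"{base_url}/predict",
        data=json.dumps(payload).encode(),
        headers=databricks_headers(pat),
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # network call, not executed here
        return json.load(resp)

print(databricks_headers("dapi123")["Authorization"])
```

From React the same request would be a `fetch` with the identical `Authorization` header.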
Anonymous
by Not applicable
  • 2518 Views
  • 1 replies
  • 1 kudos

"[PARSE_SYNTAX_ERROR] Syntax error at or near 'ROW'(line 2, pos 4)".

ALTER TABLE <TABLE_NAME> SET ROW FILTER <func_name> ON (COLUMN). Got the error below while running this code: "[PARSE_SYNTAX_ERROR] Syntax error at or near 'ROW' (line 2, pos 4)". Please help with this issue. We tried this code as part of an access polic...

Latest Reply
Rajeev45
Databricks Employee
  • 1 kudos

Hello, please can you confirm which DBR version you are using, and whether you use Unity Catalog?

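Editor's note: the reply's questions matter because row filters require Unity Catalog and a sufficiently recent DBR; on older runtimes the parser rejects `ROW FILTER` with exactly this PARSE_SYNTAX_ERROR. A small helper sketch for assembling the DDL (all table, function, and column names below are placeholders):

```python
def row_filter_ddl(table: str, func: str, columns: list) -> str:
    """Build an ALTER TABLE ... SET ROW FILTER statement binding the
    filter function's arguments to the listed columns. Names are
    caller-supplied placeholders."""
    cols = ", ".join(columns)
    return f"ALTER TABLE {table} SET ROW FILTER {func} ON ({cols})"

ddl = row_filter_ddl("main.sales.orders", "main.sales.region_filter", ["region"])
print(ddl)
# In a notebook: spark.sql(ddl)
```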
AEW
by New Contributor II
  • 16640 Views
  • 8 replies
  • 2 kudos

Resolved! Help with Parameters in Databricks SQL

I am following the documentation on query parameters (https://docs.databricks.com/sql/user/queries/query-parameters.html) and it's not working for me. I can't get a parameter inserted at the text caret and I can't get cmd + p to work. I've changed sp...

Latest Reply
labromb
Contributor
  • 2 kudos

I don't see the need for spaces around the parameter name... I have just been typing {{param_name}} and it appears automatically in the SQL editor.

7 More Replies
arw1070
by New Contributor II
  • 2285 Views
  • 2 replies
  • 0 kudos

Downstream delta live table is unable to read data frame from upstream table

I have been trying to implement Delta Live Tables in a pre-existing workflow. Currently trying to create two tables: appointments_raw and notes_raw, where notes_raw is "downstream" of appointments_raw. Following this as a reference, I'm at...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Anna Wuest​ : Could you please send me the code snippet here? Thanks.

1 More Replies
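Editor's note: for reference, the usual pattern for a downstream table in a Python DLT pipeline is to read the upstream table through `dlt.read` (or `LIVE.<table>` in SQL) rather than a plain table name, so the dependency is recorded. The sketch below only runs inside a Delta Live Tables pipeline; the source path and column names are placeholders, with only the two table names taken from the question.

```python
import dlt  # available only inside a Delta Live Tables pipeline

@dlt.table(name="appointments_raw")
def appointments_raw():
    # Upstream table: ingest from a source location (path is a placeholder).
    return spark.read.format("json").load("/mnt/source/appointments")

@dlt.table(name="notes_raw")
def notes_raw():
    # Downstream table: reference the upstream DLT table via dlt.read,
    # which lets the pipeline resolve and order the dependency.
    return dlt.read("appointments_raw").select("appointment_id", "notes")
```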
jose_gonzalez
by Databricks Employee
  • 3123 Views
  • 2 replies
  • 3 kudos

NoSuchObjectException(message:There is no database named global_temp)

I can see the following error message in my driver logs. What does it mean, and how do I solve it? ERROR RetryingHMSHandler: NoSuchObjectException(message:There is no database named global_temp)

Latest Reply
source2sea
Contributor
  • 3 kudos

Should one create it in the workspace manually via the UI? Would it get overwritten if the workspace is created via Terraform? I use the 10.4 LTS runtime.

1 More Replies
akanksha_gupta
by New Contributor II
  • 2338 Views
  • 2 replies
  • 0 kudos

ERROR: Failure starting repl. Try detaching and re-attaching the notebook. Getting this error when running any Python command on a 10.4 LTS cluster configured with https://github.com/mspnp/spark-monitoring to send Databricks Spark logs to Log Analytics.

Error description: java.lang.Exception: Cannot run program "/local_disk0/pythonVirtualEnvDirs/virtualEnv-5acc1ea9-d03f-4de3-b76b-203d42614000/bin/python" (in directory "."): error=2, No such file or directory at java.lang.ProcessB...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Akanksha Gupta: The error message suggests that the Python executable file specified in the configuration of the Databricks cluster cannot be found or accessed. Specifically, it seems that the Python executable file at the path "/local_disk0/python...

1 More Replies
Raghu1216
by New Contributor II
  • 1927 Views
  • 3 replies
  • 0 kudos

Issue with passing parameters to queries in a Spark SQL temporary function

I have created a function like below: create function test(location STRING, designation STRING, name STRING) RETURNS TABLE (cnt INT) RETURN (SELECT CASE WHEN location = 'INDIA' THEN (SELECT COUNT(*) FROM tbl_customers where job_role = design...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Raghu Dandu: The error message suggests that the column "designation" does not exist in the table "tbl_customers". There could be several reasons for this error, such as a typo in the column name, a missing or deleted column, or a difference in the...

2 More Replies
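Editor's note: one frequent cause of this resolution error is a function parameter sharing a name with (or shadowing) a column. A minimal sketch of a table function with distinct `p_*` parameter names and table-qualified columns is below; the table and column names are hypothetical, loosely mirroring the question:

```python
def create_table_function_sql() -> str:
    """Build a CREATE FUNCTION statement whose parameter names (p_*)
    cannot collide with column names, and whose columns are qualified
    with the table alias. Table/column names are placeholders."""
    return (
        "CREATE OR REPLACE FUNCTION count_by_role("
        "p_location STRING, p_role STRING) "
        "RETURNS TABLE (cnt INT) "
        "RETURN SELECT COUNT(*) AS cnt FROM tbl_customers AS c "
        "WHERE c.location = p_location AND c.job_role = p_role"
    )

sql = create_table_function_sql()
print(sql)
# In a notebook: spark.sql(sql), then
# spark.sql("SELECT * FROM count_by_role('INDIA', 'engineer')")
```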

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group