Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

ankris
by New Contributor III
  • 5831 Views
  • 2 replies
  • 0 kudos

Could you please guide us on connecting to ServiceNow data in Databricks?

Would like to extract data like ticket info, resolve time, etc., from ServiceNow into Databricks. Not finding much information in the community and would appreciate your guidance on the same.

Latest Reply
crannow
New Contributor II
  • 0 kudos

ServiceNow offers API capabilities. You can consume the ServiceNow API within a Databricks notebook to extract data from ServiceNow. Following is a suggested prompt to use with ChatGPT for example Python code to connect to ServiceNow's API. PROMPT: ...
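For reference, here is a minimal sketch of that kind of connection, assuming the ServiceNow Table API with basic authentication; the instance URL, credentials, and field list below are placeholders, not values from this thread:

import requests

# Placeholder ServiceNow instance and credentials (store real ones in a secret scope)
instance = "https://your-instance.service-now.com"
auth = ("api_user", "api_password")

# Pull incident records (ticket info, resolve time, etc.) via the Table API
resp = requests.get(
    f"{instance}/api/now/table/incident",
    auth=auth,
    headers={"Accept": "application/json"},
    params={"sysparm_limit": 1000,
            "sysparm_fields": "number,short_description,opened_at,resolved_at"},
)
resp.raise_for_status()
records = resp.json()["result"]

# Land the records in a Spark DataFrame for further processing in Databricks
df = spark.createDataFrame(records)
display(df)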

1 More Replies
Neha_1688
by New Contributor II
  • 2240 Views
  • 2 replies
  • 3 kudos

Resolved! DLT pipeline that reads data from JDBC source

Could you please guide me on how to create a DLT pipeline that directly reads data from JDBC. When I created the DLT pipeline it gave me an error at "Setting up table"; if I run it interactively in notebooks it runs successfully, but in non-interactive mode...

Latest Reply
-werners-
Esteemed Contributor III
  • 3 kudos

What you are trying to do is not possible. DLT uses Auto Loader, not JDBC, and no jars (DLT is SQL/Python only). I'd skip DLT for this scenario and use an ordinary notebook, nothing wrong with that.
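For illustration, a minimal sketch of that plain-notebook approach; the JDBC URL, table, secret scope, and target table names are placeholders:

# Read from the JDBC source in an ordinary notebook...
jdbc_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://your-host:1433;databaseName=your_db")  # placeholder URL
    .option("dbtable", "dbo.your_table")                                    # placeholder table
    .option("user", dbutils.secrets.get("your-scope", "jdbc-user"))
    .option("password", dbutils.secrets.get("your-scope", "jdbc-password"))
    .load()
)

# ...and persist it as a Delta table; DLT is not required for this
jdbc_df.write.format("delta").mode("overwrite").saveAsTable("bronze.your_table")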

1 More Replies
Chilangdon
by New Contributor
  • 7719 Views
  • 3 replies
  • 2 kudos

How to connect to a Delta table that lives in blob storage to display in a web app?

Hi, can somebody help me connect a Delta table to a web app? I looked at the delta-rs library but I couldn't manage to make the connection.

Latest Reply
etsyal1e2r3
Honored Contributor
  • 2 kudos

Without downloading the files directly every time, you have to create a SQL warehouse cluster and connect to it via a JDBC connection. This way you just use the requests library in Python (or an equivalent in another language, like axios for JavaScript) ...
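As a sketch of that pattern, the web app could call a SQL warehouse through the Databricks SQL Statement Execution REST API with requests; the workspace URL, token, warehouse ID, and table name below are placeholders:

import requests

host = "https://your-workspace.cloud.databricks.com"  # placeholder workspace URL
token = "<personal-access-token>"                      # placeholder token
warehouse_id = "<sql-warehouse-id>"                    # placeholder warehouse ID

# Run a query against the Delta table through the SQL warehouse
resp = requests.post(
    f"{host}/api/2.0/sql/statements/",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": warehouse_id,
        "statement": "SELECT * FROM main.default.my_delta_table LIMIT 100",
        "wait_timeout": "30s",
    },
)
resp.raise_for_status()
rows = resp.json().get("result", {}).get("data_array", [])  # rows as lists of strings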

2 More Replies
pablociu
by New Contributor
  • 1323 Views
  • 2 replies
  • 0 kudos

How to define a write option in DLT using Python?

In a normal notebook I would save metadata to my Delta table using the following code: (df.write.format("delta").mode("overwrite").option("userMetadata", user_meta_data).saveAsTable("my_table")). But I couldn't find online how c...

Latest Reply
United_Communit
New Contributor II
  • 0 kudos

In Delta Lake you can set up user metadata, so I will give you some tips: from delta import DeltaTable # Create or load your Delta table: delta_table = DeltaTable.forPath(spark, "path_to_delta_table") # Define your user metadata: user_meta_data = {"ke...
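For context, a minimal sketch of the two standard ways to attach userMetadata to a Delta commit in a plain notebook; the table name and metadata are placeholders, and whether either carries over cleanly to a DLT-managed table is not confirmed here:

import json

user_meta_data = json.dumps({"source": "pipeline_x", "run_id": "123"})  # placeholder metadata

# Option 1: set it per write
(
    df.write.format("delta")
    .mode("overwrite")
    .option("userMetadata", user_meta_data)
    .saveAsTable("my_table")
)

# Option 2: set it at session level; it is then recorded on every Delta commit in the session
spark.conf.set("spark.databricks.delta.commitInfo.userMetadata", user_meta_data)
df.write.format("delta").mode("append").saveAsTable("my_table")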

1 More Replies
Marvin_T
by New Contributor III
  • 16556 Views
  • 2 replies
  • 2 kudos

Resolved! Disabling query caching for SQL Warehouse

Hello everybody, I am currently trying to run some performance tests on queries in Databricks on Azure. For my tests, I am using a Classic SQL Warehouse in the SQL Editor. I have created two views that contain the same data but have different structur...

Latest Reply
Marvin_T
New Contributor III
  • 2 kudos

They are probably executing the same query plan, now that you say it. And yes, restarting the warehouse does theoretically work, but it isn't a nice solution. I guess I will do some restarting and build averages to have a good comparison for now.

1 More Replies
Gk
by New Contributor III
  • 5548 Views
  • 6 replies
  • 0 kudos

Resolved! Databricks

Hi Ayapanday, how can I check the source path in Databricks?

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Govardhana Reddy, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feed...

5 More Replies
GS2312
by New Contributor II
  • 5407 Views
  • 6 replies
  • 5 kudos

KeyProviderException when trying to create external table on databricks

Hi there, I have been trying to create an external table on Azure Databricks with the below statement: df.write.partitionBy("year", "month", "day").format('org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat').option("path", sourcepath).mod...
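A KeyProviderException at write time usually means the cluster has no credentials configured for the ADLS path; as a hedged sketch only (account-key authentication is assumed, and the storage account, container, secret scope, and table names are placeholders, not the poster's actual setup):

# Assumed account-key auth to ADLS Gen2; all names below are placeholders
storage_account = "yourstorageaccount"
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get("your-scope", "storage-account-key"),
)

sourcepath = f"abfss://your-container@{storage_account}.dfs.core.windows.net/external/my_table"

(
    df.write.partitionBy("year", "month", "day")
    .format("parquet")           # plain "parquet" instead of the internal ParquetFileFormat class
    .option("path", sourcepath)
    .mode("overwrite")
    .saveAsTable("my_db.my_external_table")
)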

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Gaurishankar Sakhare, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best ...

5 More Replies
PK225
by New Contributor III
  • 3039 Views
  • 4 replies
  • 4 kudos
Latest Reply
-werners-
Esteemed Contributor III
  • 4 kudos

If you mean a stream-static join, yes, that is possible: https://learn.microsoft.com/en-us/azure/databricks/delta-live-tables/transform#--stream-static-joins If not, what exactly do you mean?
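For illustration, a minimal stream-static join in a DLT Python pipeline; the table and column names are placeholders:

import dlt

@dlt.table
def enriched_events():
    # Streaming source joined against a static dimension table (placeholder names)
    events = dlt.read_stream("raw_events")
    customers = dlt.read("dim_customers")
    return events.join(customers, on="customer_id", how="left")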

3 More Replies
William_Scardua
by Valued Contributor
  • 1553 Views
  • 1 replies
  • 0 kudos

REPOS change my notebook format

Hi guys, I have some notebooks in Repos, but I noticed that Repos changed my notebook format to .py; because of this my Azure Data Factory no longer recognizes the notebook (.py). Any idea how to convert that .py to Databricks notebook format?

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

That is odd. Repos is merely another location (linked to Git). You can copy/paste the code inside the .py file into a notebook, or convert them using online tools or Python libraries (like py2ipynb).

naveenprabhun
by New Contributor III
  • 4882 Views
  • 2 replies
  • 3 kudos

Resolved! Unable to read data from ElasticSearch using Databricks (AWS) Cannot detect ES version - Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [IP:PORT]

I am trying to read data from ElasticSearch (ES version 8.5.2) using PySpark on Databricks (13.0, which includes Apache Spark 3.4.0 and Scala 2.12). The ecosystem is on AWS. I am able to run a curl command on the Databricks notebook to the ES ip:port and fetch...
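For context, a minimal sketch of reading from Elasticsearch with the elasticsearch-spark connector, assuming the connector library (e.g. org.elasticsearch:elasticsearch-spark-30_2.12) is installed on the cluster; the host, credentials, and index are placeholders, and es.nodes.wan.only is only a common workaround for this "cannot detect ES version" error, not a confirmed fix here:

es_df = (
    spark.read.format("org.elasticsearch.spark.sql")
    .option("es.nodes", "10.0.0.1")              # placeholder host
    .option("es.port", "9200")                   # placeholder port
    .option("es.nodes.wan.only", "true")         # often needed when data nodes are not directly reachable
    .option("es.net.ssl", "true")
    .option("es.net.http.auth.user", "elastic")  # placeholder credentials
    .option("es.net.http.auth.pass", dbutils.secrets.get("your-scope", "es-password"))
    .load("your-index")                          # placeholder index name
)
display(es_df)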

Latest Reply
Hoviedo
New Contributor III
  • 3 kudos

I have the same problem, did you find any solution? thanks

1 More Replies
Anonymous
by Not applicable
  • 949 Views
  • 0 replies
  • 0 kudos

 Dear Community-  Get ready to mark your calendars for the upcoming Databricks Community Social event! Happening on June 16th, 2023, this event promis...

Dear Community- Get ready to mark your calendars for the upcoming Databricks Community Social event! Happening on June 16th, 2023, this event promises to be the ultimate monthly gathering for everyone in the Databricks Community. Join us for an hour ...

community_social
Anonymous
by Not applicable
  • 655 Views
  • 0 replies
  • 0 kudos

Dear Community, have you enrolled in the New Large Language Model Courses with edX yet? As Large Language Model (LLM) applications disrupt countless...

Dear Community, have you enrolled in the New Large Language Model Courses with edX yet? As Large Language Model (LLM) applications disrupt countless industries, generative AI is becoming an important foundational technology. The demand for LLM-based...

Image
MohamedThanveer
by New Contributor II
  • 1156 Views
  • 1 replies
  • 0 kudos

Databricks Certified Associate Developer for Apache Spark 3.0 - Python Cancellation

I had scheduled an examination for 1st June 2023 and, due to personal reasons, I cancelled the examination on 26th May 2023 (more than 72 hours in advance), but I am yet to receive the refund amount. In the auto-generated mail it is mentioned that the refund ...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Adding @Suteja Kanuri and @Vidula Khanna for visibility.

Ovi
by New Contributor III
  • 2136 Views
  • 1 replies
  • 0 kudos

Spark Dataframe write to Delta format doesn't create a _delta_log

Hello everyone, I have an intermittent issue when trying to create a Delta table for the first time in Databricks: all the data gets converted into parquet at the specified location but the _delta_log is not created or, if created, it's left empty, t...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Can you list (display) the folder location "deltaLocation"? What files do you see there? Have you tried using a new location for testing? Do you get the same behavior?
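For reference, a minimal sketch of that check, where deltaLocation is the path variable from the original post:

# List the target location and check whether a _delta_log directory exists
files = dbutils.fs.ls(deltaLocation)
display(files)

has_delta_log = any(f.name.rstrip("/") == "_delta_log" for f in files)
print(f"_delta_log present: {has_delta_log}")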

Torlynet
by New Contributor III
  • 1579 Views
  • 1 replies
  • 1 kudos

Can't access databricks

I was updating some scripts when all of a sudden I got a few "internal server errors". I refreshed the webpage a couple of times and now I am unable to log in to Databricks. When I try to sign in it thinks for a few seconds and then I am rerouted back...

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

What date and time did this issue happen? Are you still unable to access your workspace?


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
