Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

Ruby8376
by Valued Contributor
  • 678 Views
  • 2 replies
  • 1 kudos

Databricks sql API <- Delta tables

We have an integration flow where we want to expose Databricks data for querying through OData (web app). For this piece (Databricks SQL API <- Delta tables), 2 questions here: 1. Can you share a link/documentation on how we can integrate databricks <-delta ...

Latest Reply
shan_chandra
Esteemed Contributor
  • 1 kudos

Hi @Ruby8376 - can you please review the similar posts where the resolution is provided: https://community.databricks.com/t5/warehousing-analytics/databricks-sql-restful-api-to-query-delta-table/td-p/8617 https://www.databricks.com/blog/2023/03/07/da...
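As a rough sketch of the approach those links describe, a Delta table can be queried over REST via the SQL Statement Execution API (`POST /api/2.0/sql/statements`). The host, token, warehouse ID, and table name below are placeholders, not values from the thread:

```python
import json
import urllib.request

def build_statement_request(host, token, warehouse_id, sql):
    """Build a request for the SQL Statement Execution API
    (POST /api/2.0/sql/statements)."""
    url = f"https://{host}/api/2.0/sql/statements"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {
        "warehouse_id": warehouse_id,
        "statement": sql,
        "wait_timeout": "30s",  # wait synchronously up to 30 seconds
    }
    return urllib.request.Request(url, data=json.dumps(body).encode(), headers=headers)

# Hypothetical values for illustration only; urlopen(req) would return
# a JSON result set (not executed here).
req = build_statement_request(
    "adb-1234567890.0.azuredatabricks.net",
    "dapiXXXX",
    "abc123",
    "SELECT * FROM my_catalog.my_schema.my_delta_table LIMIT 10",
)
```

Statements that outlive `wait_timeout` can be polled via `GET /api/2.0/sql/statements/{statement_id}`.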

1 More Replies
ChristopherQ1
by New Contributor
  • 630 Views
  • 1 reply
  • 0 kudos

Can we share Delta table data with Salesforce using OData?

Hello! I'm seeking recommendations for streaming on-demand data from Databricks Delta tables to Salesforce. Is OData a viable choice? Thanks.

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @ChristopherQ1, OData (Open Data Protocol) is a standard for building and consuming RESTful APIs. It provides a consistent way to expose and consume data over the web. While OData can be used for data integration, it’s essential to evaluate whethe...

dvmentalmadess
by Valued Contributor
  • 25063 Views
  • 8 replies
  • 3 kudos

Resolved! Private PyPI repos on DBR 13+

We use a private PyPI repo (AWS CodeArtifact) to publish custom python libraries. We make the private repo available to DBR 12.2 clusters using an init-script as prescribed here in the Databricks KB.  When we tried to upgrade to 13.2 this stopped wor...

Latest Reply
dvmentalmadess
Valued Contributor
  • 3 kudos

I'm coming back to provide an updated solution that doesn't rely on the implementation detail of the user name (e.g., libraries) - which is not considered a contract and could potentially change and break in the future. The key is to use the --global ...
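For readers landing here, the shape of that fix can be sketched as follows. The endpoint pattern matches AWS's documented CodeArtifact pip layout (`…/pypi/<repo>/simple/`), but the domain, account ID, and repo names are hypothetical:

```python
def codeartifact_index_url(domain, owner, region, repo, token):
    """Assemble a pip index URL for an AWS CodeArtifact repository.
    Pattern: https://aws:TOKEN@DOMAIN-OWNER.d.codeartifact.REGION.amazonaws.com/pypi/REPO/simple/
    The token comes from `aws codeartifact get-authorization-token`."""
    return (
        f"https://aws:{token}@{domain}-{owner}.d.codeartifact."
        f"{region}.amazonaws.com/pypi/{repo}/simple/"
    )

# Setting this index URL globally (e.g. in a cluster init script) is the idea
# behind the reply's --global approach:
#   pip config set --global global.index-url "<url>"
url = codeartifact_index_url("mydomain", "123456789012", "us-east-1", "my-repo", "TOKEN")
```

Because the config is global rather than tied to a particular user's home directory, it does not depend on which user name the cluster libraries install under.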

7 More Replies
_raman_
by New Contributor II
  • 420 Views
  • 1 reply
  • 0 kudos

Internal Error : report

I'm getting this error while running any cell in a notebook. At the top middle it appears like this: "Uncaught TypeError: Cannot redefine property: googletag. Reload the page and try again. If the error persists, contact support. Reference error code: 7...

Latest Reply
NandiniN
Honored Contributor
  • 0 kudos

Hi @_raman_, Which DBR version are you facing this issue on? Most likely the issue is related to this: https://github.com/shadcn-ui/ui/issues/2837 If you are having this issue, it might be because of some browser extension. A quick test to confirm this theory is to...

amit_jbs
by New Contributor II
  • 935 Views
  • 2 replies
  • 1 kudos

In databricks deployment .py files getting converted to notebooks

A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, w...

Latest Reply
Dazza
New Contributor II
  • 1 kudos

Experiencing a similar issue that we are looking to resolve, except the files are .sql. We have a process that has 1 orchestration notebook calling multiple .sql files. These .sql files are being converted to regular Databricks notebooks when deploy...
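One likely culprit here (an assumption, not confirmed in the thread) is the import format: the Workspace Import API (`POST /api/2.0/workspace/import`) with `format=AUTO` treats a file whose first line is the `# Databricks notebook source` marker as a notebook, and some CI/CD tooling adds or preserves that marker. A minimal sketch of checking for the marker and building the import payload:

```python
import base64

NOTEBOOK_MARKER = "# Databricks notebook source"

def is_notebook_source(source_text: str) -> bool:
    """True if the text begins with the marker that makes Databricks
    treat an imported file as a notebook rather than a plain file."""
    return source_text.lstrip().startswith(NOTEBOOK_MARKER)

def build_workspace_import(path: str, source_text: str) -> dict:
    """Payload for POST /api/2.0/workspace/import; with format=AUTO the
    marker (or its absence) decides notebook vs. plain workspace file."""
    return {
        "path": path,
        "format": "AUTO",
        "overwrite": True,
        "content": base64.b64encode(source_text.encode()).decode(),
    }
```

Stripping the marker from files you want kept as plain workspace files, or deploying them via the Workspace Files / Repos mechanisms instead of notebook import, is one direction to investigate.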

1 More Replies
jdm
by New Contributor II
  • 805 Views
  • 1 reply
  • 0 kudos

Can't setup dbt with streaming tables

Hey community, I'm struggling to integrate Delta Live Tables and dbt with one another. Basically I'm trying to complete this tutorial: https://www.databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks Some further information: Crea...

Latest Reply
jdm
New Contributor II
  • 0 kudos

I forgot to add this further description, sorry. I added the linked GitHub repo to my Databricks workspace, successfully ran the helper notebook, and created a job which runs a dbt task based on the dbt project contained in the GitHub repo. This task complete...

dhanshri
by New Contributor
  • 1239 Views
  • 2 replies
  • 0 kudos

Tracking File Arrivals in Nested Folders Using Databricks File Arrival Trigger

Hi Team, I'm currently exploring a file arrival trigger with Databricks, but my data is organized into nested folders representing various sources. For instance: source1  |-- file1       |-- file.csv  |-- file2       |-- file.csv   My goal is to dete...
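One way to attribute an arriving file to its source (a sketch; the landing root is hypothetical, the layout mirrors the example above) is to parse the file path the trigger hands to your job:

```python
from pathlib import PurePosixPath

def source_of(file_path: str, landing_root: str = "/mnt/landing"):
    """Map an arrived file's path to (source, subfolder), e.g.
    /mnt/landing/source1/file1/file.csv -> ('source1', 'file1')."""
    rel = PurePosixPath(file_path).relative_to(landing_root)
    return rel.parts[0], rel.parts[1]

# A file arrival trigger fires for any new file under the monitored location;
# the triggered notebook can then route processing per source:
print(source_of("/mnt/landing/source1/file2/file.csv"))  # ('source1', 'file2')
```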

Latest Reply
adriennn
Contributor
  • 0 kudos

@Kaniz_Fatma did an LLM bot write the above response for you? You link to a Stack Overflow post which uses Azure Data Factory, and your text contains concepts which do not apply to Databricks ("Use a lookup activity or a Get Metadata Activity to fetch t...

1 More Replies
dataVaughan
by New Contributor II
  • 1552 Views
  • 3 replies
  • 0 kudos

Notebook Dashboard to html to pdf issues

I have created a dashboard using the notebook dashboard interface, rather than the SQL warehouse dashboards. This means that the tables and visualizations on the dashboard, as well as the dashboard itself, are directly tied to a notebook and the outp...

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

@dataVaughan - you can use the Lakeview dashboard, which can provide a URL that is shareable outside of the Databricks workspace. https://www.databricks.com/blog/announcing-public-preview-lakeview-dashboards In your current scenario, you can clone ...

2 More Replies
sai_sathya
by New Contributor III
  • 816 Views
  • 1 reply
  • 0 kudos

Fetching metadata for tables in a database stored in Unity Catalog

Hi everyone, I am trying to fetch the metadata of every column from a table, and every table from the database under a catalog. For that I am trying to use the samples catalog provided by Databricks and get details for the tpch database that provi...

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

@sai_sathya - you can use the DESCRIBE EXTENDED command to get the metadata of a given table. Also, you can query information_schema.columns within your UC catalog to check the column details of a given table.
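The `information_schema` route can be sketched like this; the catalog and schema names mirror the `samples`/`tpch` example from the question:

```python
def column_metadata_query(catalog: str, schema: str) -> str:
    """SQL listing column metadata for every table in a schema, via the
    catalog's information_schema (standard columns include table_name,
    column_name, data_type, is_nullable, ordinal_position)."""
    return (
        f"SELECT table_name, column_name, data_type, is_nullable "
        f"FROM {catalog}.information_schema.columns "
        f"WHERE table_schema = '{schema}' "
        f"ORDER BY table_name, ordinal_position"
    )

# In a notebook you would run (not executed here):
# display(spark.sql(column_metadata_query("samples", "tpch")))
```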

UdayPatel
by New Contributor III
  • 2498 Views
  • 5 replies
  • 1 kudos

Resolved! Can't run .py file using workflows anymore

Dear all, Greetings! I have been trying to run a workflow job which runs successfully when a task is created using a Notebook file from a folder present in the Workspace, but when the same task's type is changed to python script and a .py file is select...

Latest Reply
UdayPatel
New Contributor III
  • 1 kudos

Hi, I have found the solution. It was due to the following option being enabled under the Feature Enablement tab under Databricks_Account_Console --> Settings. Thank you for all your help! Regards, Uday

4 More Replies
Mustafa_Kamal
by New Contributor II
  • 1016 Views
  • 4 replies
  • 0 kudos

Parameterizing DLT Pipelines

Hi Everyone, I have a DLT pipeline which I need to execute for different source systems. Need advice on how to parameterize this. I have gone through many articles on the web, but it seems there is no accurate information available. Can anyone please hel...

Latest Reply
Mustafa_Kamal
New Contributor II
  • 0 kudos

Thank you @AmanSehgal, I have done that and was able to execute the pipeline successfully. But I need to change the parameter value at run time, so that the same pipeline can be used for multiple sources. Can we pass parameters from Job to DLT Pipeli...

3 More Replies
databrciks
by New Contributor II
  • 1400 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks: failure logs

Hello Team, I am new to Databricks. Generally, where are all the logs stored in Databricks? I see if any job fails, below the command I could see some error messages. Otherwise, in real time, how do I check the log files/error messages in the Databricks UI? T...

Latest Reply
databrciks
New Contributor II
  • 0 kudos

Thanks for the response. This helped.

1 More Replies
NhanNguyen
by Contributor II
  • 1623 Views
  • 2 replies
  • 1 kudos

Resolved! Cannot create delta location with mount path

Hi all, I'm trying to create a table but cannot use a predefined mount path like '/mnt/silver/'; if I use the full path of the Azure blob container it will create successfully, like this: `CREATE TABLE IF NOT EXISTS nhan_databricks.f1_processed.circuits (...

Latest Reply
NhanNguyen
Contributor II
  • 1 kudos

Oh thanks for your answer, actually I'm using Unity Catalog

1 More Replies
Hetnon
by New Contributor II
  • 800 Views
  • 2 replies
  • 0 kudos

Can't Create a Workspace using Google Cloud

Trying to create my first workspace. I hit 'create workspace' and I see 3 buckets being created in my GCP, but nothing shows up in the actual 'workspaces' in my Databricks console. The only thing is the 'create workspace' button. Also, there is no erro...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Hetnon, Creating a Databricks workspace can indeed be a bit tricky at times, but let’s troubleshoot this together! Here are some steps you can take to address the issue: Check Resource Provider Registration: Ensure that the Microsoft.Databri...

1 More Replies
anonymous_567
by New Contributor II
  • 710 Views
  • 1 reply
  • 0 kudos

Ingesting Non-Incremental Data into Delta

Hello, I have non-incremental data landing in a storage account. This data contains old data from before as well as new data. I would like to avoid doing a complete table deletion and table creation just to upload the data from storage and have an upd...

Latest Reply
AmanSehgal
Honored Contributor III
  • 0 kudos

Well, if you know the conditions to separate new data from old data, then while reading the data into your dataframe, use a filter or where clause to select the new data and ingest it into your Delta table. This is how you can do it in general. But if you ha...
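The filter-then-ingest idea can be shown generically, with plain Python lists standing in for a DataFrame (the column name and watermark value are made up for illustration):

```python
rows = [
    {"id": 1, "updated_at": "2024-04-01"},  # already loaded in a previous run
    {"id": 2, "updated_at": "2024-04-15"},  # arrived since then
]
last_loaded = "2024-04-10"  # high-water mark recorded after the previous ingest

# Keep only rows newer than the watermark (ISO dates compare correctly as strings)
new_rows = [r for r in rows if r["updated_at"] > last_loaded]
print(new_rows)  # [{'id': 2, 'updated_at': '2024-04-15'}]

# The PySpark equivalent is df.filter(col("updated_at") > last_loaded),
# typically followed by a MERGE INTO the Delta table so reruns stay idempotent.
```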

