Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

Phani1
by Valued Contributor II
  • 1338 Views
  • 1 reply
  • 0 kudos

Temporary tables or dataframes?

We have to generate over 70 intermediate tables. Should we use temporary tables or dataframes, or should we create delta tables and truncate and reload? Having too many temporary tables could lead to memory problems. In this situation, what is the mo...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi Phani1, it would be a use-case-specific answer, so if possible I would suggest working with the Solution Architect on this, or sharing some more insights for better guidance. When I say that, I just want to understand: would we really ne...
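For illustration, a minimal sketch of the two options being weighed, with hypothetical schema and table names: a temporary view keeps the intermediate result in-session and unmaterialized, while a Delta table persists it and can simply be overwritten on each run.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

# Hypothetical intermediate result
intermediate_df = spark.table("raw.orders").filter("order_status = 'OPEN'")

# Option A: temporary view -- nothing is written out; recomputed when queried
intermediate_df.createOrReplaceTempView("stage_open_orders")

# Option B: persist as a Delta table, overwriting the previous run's output
(intermediate_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("staging.stage_open_orders"))
```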

Ruby8376
by Valued Contributor
  • 949 Views
  • 2 replies
  • 1 kudos

Databricks SQL API <- Delta tables

We have an integration flow where we want to expose Databricks data for querying through OData (web app). For this piece (Databricks SQL API <- Delta tables), 2 questions here: 1. Can you share a link/documentation on how we can integrate Databricks <- Delta ...

Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

Hi @Ruby8376 - can you please review the similar posts where the resolution is provided: https://community.databricks.com/t5/warehousing-analytics/databricks-sql-restful-api-to-query-delta-table/td-p/8617 https://www.databricks.com/blog/2023/03/07/da...
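As a rough sketch of the pattern those links describe, an external app can query a Delta table over REST through the SQL Statement Execution API; the host, warehouse ID, and token below are placeholders, and the sample table is one Databricks ships in the samples catalog.

```python
import requests

HOST = "https://<workspace-host>"      # placeholder workspace URL
WAREHOUSE_ID = "<sql-warehouse-id>"    # placeholder SQL warehouse ID
TOKEN = "<access-token>"               # placeholder personal access / OAuth token

resp = requests.post(
    f"{HOST}/api/2.0/sql/statements/",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "warehouse_id": WAREHOUSE_ID,
        "statement": "SELECT * FROM samples.nyctaxi.trips LIMIT 10",
        "wait_timeout": "30s",
    },
    timeout=60,
)
resp.raise_for_status()
payload = resp.json()
# When the statement finishes within the wait window, rows come back inline
if payload["status"]["state"] == "SUCCEEDED":
    print(payload["result"]["data_array"])
```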

1 More Replies
dvmentalmadess
by Valued Contributor
  • 30201 Views
  • 7 replies
  • 3 kudos

Resolved! Private PyPI repos on DBR 13+

We use a private PyPI repo (AWS CodeArtifact) to publish custom python libraries. We make the private repo available to DBR 12.2 clusters using an init-script as prescribed here in the Databricks KB.  When we tried to upgrade to 13.2 this stopped wor...

Latest Reply
dvmentalmadess
Valued Contributor
  • 3 kudos

I'm coming back to provide an updated solution that doesn't rely on the implementation detail of the user name (e.g., libraries) - which is not considered a contract and could potentially change and break in the future. The key is to use the --global ...

6 More Replies
_raman_
by New Contributor II
  • 616 Views
  • 1 reply
  • 0 kudos

Internal Error: report

I'm getting this error while running any cell in a notebook. At the top middle it appears like this: "Uncaught TypeError: Cannot redefine property: googletag". Reload the page and try again. If the error persists, contact support. Reference error code: 7...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi @_raman_ , which DBR version are you facing this issue on? Most likely the issue is related to this: https://github.com/shadcn-ui/ui/issues/2837 If you are having this issue it might be because of some browser extension. A quick test to confirm this theory is to...

amit_jbs
by New Contributor II
  • 1295 Views
  • 2 replies
  • 1 kudos

In Databricks deployment, .py files are getting converted to notebooks

A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, w...

Latest Reply
Dazza
New Contributor II
  • 1 kudos

Experiencing a similar issue that we are looking to resolve, except the files are .sql. We have a process that has 1 orchestration notebook, calling multiple .sql files. These .sql files are being converted to regular Databricks notebooks when deploy...

1 More Replies
jdm
by New Contributor II
  • 1080 Views
  • 1 reply
  • 0 kudos

Can't set up dbt with streaming tables

Hey community, I'm struggling to integrate Delta Live Tables and dbt with one another. Basically I'm trying to complete this tutorial: https://www.databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks Some further information: Crea...

Latest Reply
jdm
New Contributor II
  • 0 kudos

I forgot to add this further description, sorry. I added the linked GitHub repo to my Databricks workspace, successfully ran the helper notebook, and created a job which runs a dbt task based on the dbt project contained in the GitHub repo. This task complete...

dhanshri
by New Contributor
  • 1579 Views
  • 1 reply
  • 0 kudos

Tracking File Arrivals in Nested Folders Using Databricks File Arrival Trigger

Hi Team, I'm currently exploring a file arrival trigger with Databricks, but my data is organized into nested folders representing various sources. For instance: source1 |-- file1 |-- file.csv |-- file2 |-- file.csv. My goal is to dete...

Community Platform Discussions
Azure Databricks
Databricks
Latest Reply
adriennn
Contributor II
  • 0 kudos

@Retired_mod did an LLM bot write the above response for you? You link to a Stack Overflow post which uses Azure Data Factory, and your text contains concepts which do not apply to Databricks ("Use a lookup activity or a Get Metadata Activity to fetch t...
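For the nested per-source folder layout in the original question, one commonly suggested Databricks-native sketch is to let Auto Loader scan the folder tree with a glob and record which file each row came from; the paths and options below are illustrative assumptions, not a drop-in answer.

```python
from pyspark.sql import functions as F

# `spark` is the SparkSession a Databricks notebook provides.
# Hypothetical landing layout: /mnt/landing/source1/file1/file.csv, /mnt/landing/source1/file2/file.csv, ...
landing_path = "/mnt/landing/"

arrivals = (
    spark.readStream
    .format("cloudFiles")                                          # Auto Loader
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/mnt/landing/_schema/")  # required for schema inference
    .load(f"{landing_path}*/*/")                                   # glob across source/file subfolders
    .withColumn("source_file", F.col("_metadata.file_path"))       # track which nested file arrived
)
```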

dataVaughan
by New Contributor II
  • 2570 Views
  • 3 replies
  • 0 kudos

Notebook dashboard to HTML to PDF issues

I have created a dashboard using the notebook dashboard interface, rather than the SQL warehouse dashboards. This means that the tables and visualizations on the dashboard, as well as the dashboard itself, are directly tied to a notebook and the outp...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@dataVaughan  - you can use the Lakeview dashboard, which can provide a URL that is shareable outside of the Databricks workspace: https://www.databricks.com/blog/announcing-public-preview-lakeview-dashboards In your current scenario, you can clone ...

2 More Replies
sai_sathya
by New Contributor III
  • 1348 Views
  • 1 reply
  • 0 kudos

Fetching metadata for tables in a database stored in Unity Catalog

Hi everyone, I am trying to fetch the metadata of every column from a table and every table from the database under a catalog. For that I am trying to use the samples catalog provided by Databricks and get details for the tpch database that provi...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@sai_sathya  - you can use DESCRIBE EXTENDED command to get the metadata of the given table. Also, you can query the information_schema.columns within your UC catalog to check the column details of a given table.
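A small sketch of both suggestions, run from a notebook; `samples.tpch.customer` is a table in the Databricks-provided samples catalog, while `my_catalog`/`my_schema` are placeholders for your own Unity Catalog names.

```python
# Column-level metadata for a single table
spark.sql("DESCRIBE EXTENDED samples.tpch.customer").show(truncate=False)

# Column metadata for every table in a schema, via the information schema
cols = spark.sql("""
    SELECT table_name, column_name, data_type, comment
    FROM my_catalog.information_schema.columns
    WHERE table_schema = 'my_schema'
    ORDER BY table_name, ordinal_position
""")
cols.show(truncate=False)
```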

UdayPatel
by New Contributor III
  • 3630 Views
  • 5 replies
  • 1 kudos

Resolved! Can't run .py file using workflows anymore

Dear all, greetings! I have been trying to run a workflow job which runs successfully when a task is created using a notebook file from a folder present in the workspace, but when the same task's type is changed to Python script and a .py file is select...

Latest Reply
UdayPatel
New Contributor III
  • 1 kudos

Hi, I have found the solution. It was due to the following option being enabled under the Feature Enablement tab under Databricks Account Console --> Settings. Thank you for all your help and for giving it a try! Regards, Uday

4 More Replies
Mustafa_Kamal
by New Contributor II
  • 1352 Views
  • 4 replies
  • 0 kudos

Parameterizing DLT Pipelines

Hi everyone, I have a DLT pipeline which I need to execute for different source systems. Need advice on how to parameterize this. I have gone through many articles on the web, but it seems there is no accurate information available. Can anyone please hel...

Latest Reply
Mustafa_Kamal
New Contributor II
  • 0 kudos

Thank you @AmanSehgal, I have done that and was able to execute the pipeline successfully. But I need to change the parameter value at run time, so that the same pipeline can be used for multiple sources. Can we pass parameters from a Job to a DLT Pipeli...
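As a sketch of the usual pattern: put a key in the DLT pipeline's Configuration (a job that triggers the pipeline can also set it), then read it with spark.conf.get inside the pipeline code; the key, paths, and table name below are made up for illustration.

```python
import dlt
from pyspark.sql import functions as F

# Hypothetical configuration key, e.g. "source_system": "sap" in the pipeline settings
source_system = spark.conf.get("source_system", "default_source")

@dlt.table(name=f"bronze_{source_system}_orders")
def bronze_orders():
    # Hypothetical raw path parameterized by the source system
    return (
        spark.read.format("json")
        .load(f"/mnt/raw/{source_system}/orders/")
        .withColumn("source_system", F.lit(source_system))
    )
```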

3 More Replies
databrciks
by New Contributor II
  • 1628 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks: failure logs

Hello Team, I am new to Databricks. Generally, where are all the logs stored in Databricks? I see that if any job fails, I can see some error messages below the command. Otherwise, in real time, how can I check the log files/error messages in the Databricks UI? T...

Latest Reply
databrciks
New Contributor II
  • 0 kudos

Thanks for the response. This helped.

1 More Replies
