- 1338 Views
- 1 replies
- 0 kudos
Temporary tables or DataFrames?
We have to generate over 70 intermediate tables. Should we use temporary tables or dataframes, or should we create delta tables and truncate and reload? Having too many temporary tables could lead to memory problems. In this situation, what is the mo...
Hi Phani1, This would be a use-case-specific answer, so if possible I would suggest working with your Solution Architect on this, or sharing some more details for better guidance. That said, I just want to understand whether we really ne...
- 586 Views
- 0 replies
- 0 kudos
Databricks bundles - good practice for multiprocessing envs
I'm seeking advice regarding Databricks bundles. In my scenario, I have multiple production environments where I aim to execute the same DLT. To simplify, let's assume the DLT reads data from 'eventhub-region-name,' with this being the only differing...
- 949 Views
- 2 replies
- 1 kudos
Databricks SQL API <- Delta tables
We have an integration flow where we want to expose Databricks data for querying through OData (web app). For this piece: Databricks SQL API <- Delta tables. Two questions here: 1. Can you share a link/documentation on how we can integrate databricks <- delta ...
Hi @Ruby8376 - can you please review these similar posts, where a resolution is provided: https://community.databricks.com/t5/warehousing-analytics/databricks-sql-restful-api-to-query-delta-table/td-p/8617 https://www.databricks.com/blog/2023/03/07/da...
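For the "query Delta tables over REST" part of the question, the posts linked above describe the Databricks SQL Statement Execution API (`POST /api/2.0/sql/statements`). A minimal sketch of the request body follows; the warehouse ID and table name are placeholders, not values from this thread.

```python
import json

def build_statement_request(warehouse_id: str, sql: str) -> dict:
    """Build the JSON body for POST /api/2.0/sql/statements."""
    return {
        "warehouse_id": warehouse_id,   # the SQL warehouse that runs the query
        "statement": sql,
        "wait_timeout": "30s",          # block up to 30s for small queries
        "format": "JSON_ARRAY",         # rows come back as JSON arrays
    }

body = build_statement_request(
    "abc123", "SELECT * FROM main.sales.orders LIMIT 10"
)
print(json.dumps(body, indent=2))
```

The OData web app could POST this body to `https://<workspace-host>/api/2.0/sql/statements` with a bearer token and translate the JSON rows into its own response format.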
- 569 Views
- 0 replies
- 0 kudos
Databricks Dashboard state not cleared when logging in as another user
Hi all, I am using Databricks: I created a notebook and run it from a Dashboard, and it works correctly. I shared the Dashboard with another user, UserA, with "Can Run" permission. When I log in as UserA and access the Dashboard, then does a...
- 30201 Views
- 7 replies
- 3 kudos
Resolved! Private PyPI repos on DBR 13+
We use a private PyPI repo (AWS CodeArtifact) to publish custom python libraries. We make the private repo available to DBR 12.2 clusters using an init-script as prescribed here in the Databricks KB. When we tried to upgrade to 13.2 this stopped wor...
I'm coming back to provide an updated solution that doesn't rely on the implementation detail of the user name (e.g., libraries), which is not considered a contract and could potentially change and break in the future. The key is to use the --global ...
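The reply is truncated, but pip does have a `--global` scope for `pip config` that writes to the system-wide config file rather than a per-user one. A hedged sketch of what such an init script might look like, assuming the `--global` flag refers to `pip config`; the CodeArtifact domain, account ID, region, and repo name are placeholders:

```shell
#!/bin/bash
# Sketch of a cluster init script (all CodeArtifact names are placeholders).
# Fetch a short-lived CodeArtifact authorization token.
TOKEN=$(aws codeartifact get-authorization-token \
  --domain my-domain --domain-owner 123456789012 \
  --query authorizationToken --output text)

# Write the index URL to pip's system-wide ("global") config so every
# user the runtime installs libraries under picks it up, regardless of
# which user name DBR happens to use internally.
pip config set --global global.index-url \
  "https://aws:${TOKEN}@my-domain-123456789012.d.codeartifact.us-east-1.amazonaws.com/pypi/my-repo/simple/"
```

Because the token is short-lived, the script must run at cluster start so each cluster gets a fresh one.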
- 616 Views
- 1 replies
- 0 kudos
Internal Error: report
I'm getting this error while running any cell in a notebook. At the top middle it appears like this: "Uncaught TypeError: Cannot redefine property: googletag. Reload the page and try again. If the error persists, contact support. Reference error code: 7...
Hi @_raman_ , on which DBR are you facing this issue? Most likely the issue is related to this: https://github.com/shadcn-ui/ui/issues/2837 If you are having this issue, it might be because of a browser extension. A quick test to confirm this theory is to...
- 1295 Views
- 2 replies
- 1 kudos
In Databricks deployment, .py files are getting converted to notebooks
A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, w...
Experiencing a similar issue that we are looking to resolve, except the files are .sql. We have a process with one orchestration notebook calling multiple .sql files. These .sql files are being converted to regular Databricks notebooks when deploy...
- 1080 Views
- 1 replies
- 0 kudos
Can't set up dbt with streaming tables
Hey community, I'm struggling to integrate Delta Live Tables and dbt with one another. Basically I'm trying to complete this tutorial: https://www.databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks Some further information: Crea...
I forgot to add this further description, sorry. I added the linked GitHub repo to my Databricks workspace, successfully ran the helper notebook, and created a job which runs a dbt task based on the dbt project contained in the GitHub repo. This task complete...
- 1579 Views
- 1 replies
- 0 kudos
Tracking File Arrivals in Nested Folders Using Databricks File Arrival Trigger
Hi Team, I'm currently exploring a file arrival trigger with Databricks, but my data is organized into nested folders representing various sources. For instance:

source1
|-- file1
|   |-- file.csv
|-- file2
|   |-- file.csv

My goal is to dete...
@Retired_mod did an LLM bot write the above response for you? You link to a Stack Overflow post which uses Azure Data Factory, and your text contains concepts which do not apply to Databricks ("Use a lookup activity or a Get Metadata Activity to fetch t...
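For the original question, one Databricks-native pattern is to point the file arrival trigger (or Auto Loader) at the root folder and derive the source from each arriving file's path, e.g. via the `_metadata.file_path` column Auto Loader exposes. A minimal sketch of the path-parsing step, with an illustrative root path (not one from the thread):

```python
from pathlib import PurePosixPath

def source_of(file_path: str, root: str = "/mnt/landing") -> str:
    """Return the top-level source folder a newly arrived file belongs to.

    Intended for the file-path string Auto Loader exposes via
    _metadata.file_path; the root is an assumed mount point.
    """
    rel = PurePosixPath(file_path).relative_to(root)
    return rel.parts[0]  # first folder under the root, e.g. 'source1'

print(source_of("/mnt/landing/source1/file1/file.csv"))  # -> source1
```

Wrapped in a UDF (or applied with string functions in SQL), this lets one trigger serve all sources while downstream logic still knows which source each file came from.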
- 1104 Views
- 0 replies
- 0 kudos
Create Databricks model serving endpoint in Azure DevOps yaml
Hello, I need to create and destroy a model endpoint as part of CI/CD. I tried mlflow deployments create-endpoint, giving databricks as --target; however it errors saying that --endpoint is not a known argument, when clearly --endpoint is required....
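One way to sidestep the CLI flag mismatch in an Azure DevOps step is to call the Databricks serving-endpoints REST API (`POST /api/2.0/serving-endpoints`) directly. A hedged sketch of building the request body; the endpoint name, model name, and workload size are placeholders:

```python
import json

def serving_endpoint_payload(name: str, model_name: str, model_version: str) -> dict:
    """Build the body for POST /api/2.0/serving-endpoints.

    All names and sizes below are illustrative, not values from the thread.
    """
    return {
        "name": name,
        "config": {
            "served_entities": [{
                "entity_name": model_name,        # e.g. a UC-registered model
                "entity_version": model_version,
                "workload_size": "Small",
                "scale_to_zero_enabled": True,
            }]
        },
    }

payload = serving_endpoint_payload("my-endpoint", "main.models.churn", "1")
print(json.dumps(payload))
```

In the pipeline yaml, a script task could `curl -X POST` this JSON with a bearer token, and `DELETE /api/2.0/serving-endpoints/{name}` to destroy the endpoint afterwards.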
- 2570 Views
- 3 replies
- 0 kudos
Notebook Dashboard to html to pdf issues
I have created a dashboard using the notebook dashboard interface, rather than the SQL warehouse dashboards. This means that the tables and visualizations on the dashboard, as well as the dashboard itself, are directly tied to a notebook and the outp...
@dataVaughan - you can use a Lakeview dashboard, which provides a URL that is shareable outside of the Databricks workspace: https://www.databricks.com/blog/announcing-public-preview-lakeview-dashboards In your current scenario, you can clone ...
- 1348 Views
- 1 replies
- 0 kudos
Fetching metadata for tables in a database stored in Unity Catalog
Hi everyone, I am trying to fetch the metadata of every column from a table, and of every table from the database under a catalog. For that I am trying to use the samples catalog provided by Databricks and get details for the tpch database that provi...
@sai_sathya - you can use DESCRIBE EXTENDED command to get the metadata of the given table. Also, you can query the information_schema.columns within your UC catalog to check the column details of a given table.
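To make the `information_schema` route concrete, here is a small helper that builds such a query for all tables in a schema or one specific table; the catalog and schema names match the `samples`/`tpch` example from the question, assuming the catalog exposes an `information_schema` (Unity Catalog catalogs generally do). The result string would be run with `spark.sql(...)` or in a SQL editor.

```python
def columns_query(catalog: str, schema: str, table=None) -> str:
    """Build an information_schema query for column metadata.

    If `table` is None, the query covers every table in the schema.
    """
    q = (
        f"SELECT table_name, column_name, data_type "
        f"FROM {catalog}.information_schema.columns "
        f"WHERE table_schema = '{schema}'"
    )
    if table:
        q += f" AND table_name = '{table}'"
    return q + " ORDER BY table_name, ordinal_position"

print(columns_query("samples", "tpch"))
```

For a single table, `DESCRIBE EXTENDED samples.tpch.<table>` gives the same column details plus table-level metadata such as location and provider.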
- 3630 Views
- 5 replies
- 1 kudos
Resolved! Can't run .py file using workflows anymore
Dear all, Greetings! I have been trying to run a workflow job which runs successfully when the task is created using a Notebook file from a folder present in the Workspace, but when the same task's type is changed to python script and a .py file is select...
Hi, I have found the solution. It was due to the following option being enabled under the Feature Enablement tab under Databricks_Account_Console --> Settings. Thank you all for your help! Regards, Uday
- 1352 Views
- 4 replies
- 0 kudos
Parameterizing DLT Pipelines
Hi Everyone, I have a DLT pipeline which I need to execute for different source systems. I need advice on how to parameterize this. I have gone through many articles on the web, but it seems there is no accurate information available. Can anyone please hel...
Thank you @AmanSehgal, I have done that and was able to execute the pipeline successfully. But I need to change the parameter value at run time, so that the same pipeline can be used for multiple sources. Can we pass parameters from a Job to a DLT Pipeli...
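For context on the pattern being discussed: DLT reads parameters from the pipeline's Configuration (key/value pairs in the pipeline settings), which the notebook retrieves with `spark.conf.get`; changing the value per run typically means updating the pipeline settings (e.g. via the Pipelines REST API) before starting it. A pure-Python sketch of the lookup with an explicit default; the key `source_system` and values are made-up names, not from this thread:

```python
# In a real DLT notebook this lookup would be:
#   source = spark.conf.get("source_system", "erp")
# The helper below mimics that behaviour on a plain dict so the
# fallback semantics are easy to see outside a pipeline.

def get_source(conf: dict, key: str = "source_system", default: str = "erp") -> str:
    """Return the configured source system, or the default if unset."""
    return conf.get(key, default)

print(get_source({"source_system": "crm"}))  # -> crm
print(get_source({}))                        # -> erp
```

The same pipeline code can then branch on the returned value to pick the right input path or table for each source system.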
- 1628 Views
- 2 replies
- 0 kudos
Resolved! Databricks: failure logs
Hello Team, I am new to Databricks. Generally, where are all the logs stored in Databricks? If a job fails, I can see some error messages below the command. Otherwise, in real time, how do I check the log files/error messages in the Databricks UI? T...