- 1032 Views
- 2 replies
- 1 kudos
Databricks SQL API <- Delta tables
We have an integration flow where we want to expose Databricks data for querying through OData (web app). For this piece, Databricks SQL API <- Delta tables. 2 questions here: 1. Can you share a link/documentation on how we can integrate Databricks <- Delta ...
Hi @Ruby8376 - can you please review the similar posts where the resolution is provided: https://community.databricks.com/t5/warehousing-analytics/databricks-sql-restful-api-to-query-delta-table/td-p/8617 https://www.databricks.com/blog/2023/03/07/da...
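For reference, a minimal sketch of querying a Delta table over REST via the Databricks SQL Statement Execution API, in the spirit of the linked posts; the host, token, warehouse ID, and table name below are placeholders rather than details from this thread.

```python
# Hedged sketch: run a SQL statement against a Delta table through the
# Statement Execution API and print the inline result.
import requests

HOST = "https://<workspace-host>"        # placeholder: your workspace URL
TOKEN = "<personal-access-token>"        # placeholder: PAT or OAuth token
WAREHOUSE_ID = "<sql-warehouse-id>"      # placeholder: an existing SQL warehouse

resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "statement": "SELECT * FROM main.default.my_delta_table LIMIT 10",
        "warehouse_id": WAREHOUSE_ID,
        "wait_timeout": "30s",           # wait synchronously up to 30 seconds
    },
)
resp.raise_for_status()
payload = resp.json()
print(payload["status"]["state"])                        # e.g. SUCCEEDED
print(payload.get("result", {}).get("data_array", []))   # rows when returned inline
```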
- 601 Views
- 0 replies
- 0 kudos
Databricks Dashboard state not cleared when logging in as another user.
Hi all, I am using Databricks and created a notebook that I would like to run in a Dashboard. It works correctly. I shared the Dashboard with another user, UserA, with "Can Run" permission. When I log in as UserA and access the Dashboard, then does a...
- 31803 Views
- 7 replies
- 3 kudos
Resolved! Private PyPI repos on DBR 13+
We use a private PyPI repo (AWS CodeArtifact) to publish custom Python libraries. We make the private repo available to DBR 12.2 clusters using an init script, as prescribed here in the Databricks KB. When we tried to upgrade to 13.2 this stopped wor...
I'm coming back to provide an updated solution that doesn't rely on the implementation detail of the user name (e.g., libraries) - which is not considered a contract and could potentially change and break in the future. The key is to use the --global ...
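The full resolution above is truncated; as a rough illustration of the idea (an init script that registers the private index globally with pip's --global config scope, instead of patching a per-user environment), here is a sketch. The CodeArtifact domain, account ID, region, repo name, and workspace path are placeholders, and the poster's exact commands may differ.

```python
# Hedged sketch: write a cluster-scoped init script (as a workspace file) that points
# pip's system-wide configuration at a private AWS CodeArtifact index.
import os

init_script = """#!/bin/bash
# Fetch a short-lived CodeArtifact token and set it as pip's global index URL,
# so every Python environment on the cluster resolves packages from the private repo.
TOKEN=$(aws codeartifact get-authorization-token \\
  --domain my-domain --domain-owner 123456789012 \\
  --query authorizationToken --output text)
pip config --global set global.index-url \\
  "https://aws:${TOKEN}@my-domain-123456789012.d.codeartifact.us-east-1.amazonaws.com/pypi/my-repo/simple/"
"""

os.makedirs("/Workspace/Shared/init-scripts", exist_ok=True)
with open("/Workspace/Shared/init-scripts/private-pypi.sh", "w") as f:
    f.write(init_script)
# Then reference this file under the cluster's Advanced options > Init scripts (Workspace source).
```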
- 758 Views
- 1 replies
- 1 kudos
Resolved! Internal Error: report
I'm getting this error while running any cell in a notebook. At the top middle it appears like this: "Uncaught TypeError: Cannot redefine property: googletag. Reload the page and try again. If the error persists, contact support. Reference error code: 7...
Hi @_raman_, on which DBR are you facing this issue? Most likely the issue is related to this: https://github.com/shadcn-ui/ui/issues/2837 If you are having this issue, it might be because of a browser extension. A quick test to confirm this theory is to...
- 1170 Views
- 1 replies
- 0 kudos
Can't set up dbt with streaming tables
Hey community, I'm struggling to integrate Delta Live Tables and dbt with one another. Basically I'm trying to complete this tutorial: https://www.databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks Some further information: Crea...
I forgot to add this further description, sorry. I added the linked GitHub repo to my Databricks workspace, successfully ran the helper notebook, and created a job which runs a dbt task based on the dbt project contained in the GitHub repo. This task complete...
- 1728 Views
- 1 replies
- 0 kudos
Tracking File Arrivals in Nested Folders Using Databricks File Arrival Trigger
Hi Team, I'm currently exploring the file arrival trigger with Databricks, but my data is organized into nested folders representing various sources. For instance: source1 |-- file1 |-- file.csv |-- file2 |-- file.csv My goal is to dete...
@Retired_mod did an LLM bot write the above response for you? You link to a Stack Overflow post which uses Azure Data Factory, and your text contains concepts which do not apply to Databricks ("Use a lookup activity or a Get Metadata Activity to fetch t...
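Not part of the thread, but a minimal Databricks-native sketch of tracking which nested source folder a file arrived from, assuming Auto Loader and the _metadata.file_path column; the storage path, checkpoint locations, and target table are placeholders.

```python
# Hedged sketch (run in a Databricks notebook, where `spark` is predefined):
# read the parent folder with Auto Loader and keep the file path so each row
# records the subfolder it arrived from.
from pyspark.sql import functions as F

df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/Volumes/main/default/chk/arrivals_schema")  # placeholder
    .load("abfss://landing@mystorage.dfs.core.windows.net/source1/")                   # placeholder
    .withColumn("source_file", F.col("_metadata.file_path"))
    .withColumn("source_folder", F.regexp_extract(F.col("source_file"), r"source1/([^/]+)/", 1))
)

(df.writeStream
   .option("checkpointLocation", "/Volumes/main/default/chk/arrivals")                 # placeholder
   .trigger(availableNow=True)   # pairs with a job-level file arrival trigger on the parent path
   .toTable("main.default.file_arrivals"))
```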
- 1135 Views
- 0 replies
- 0 kudos
Create Databricks model serving endpoint in Azure DevOps yaml
Hello, I need to create and destroy a model endpoint as part of CI/CD. I tried mlflow deployments create-endpoint, giving databricks as --target, however it errors saying that --endpoint is not a known argument when clearly --endpoint is required...
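No answer was posted; as one possible workaround (not a confirmed fix), the serving endpoint can be created and destroyed directly against the Databricks REST API from a pipeline step. The endpoint name, registered model, and workload settings below are placeholders.

```python
# Hedged sketch: manage a model serving endpoint from CI/CD via the
# /api/2.0/serving-endpoints REST API instead of the mlflow deployments CLI.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-xxxx.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def create_endpoint(name: str) -> None:
    body = {
        "name": name,
        "config": {
            "served_entities": [{
                "entity_name": "main.default.my_model",   # placeholder registered model
                "entity_version": "1",
                "workload_size": "Small",
                "scale_to_zero_enabled": True,
            }]
        },
    }
    requests.post(f"{HOST}/api/2.0/serving-endpoints",
                  headers=HEADERS, json=body).raise_for_status()

def delete_endpoint(name: str) -> None:
    requests.delete(f"{HOST}/api/2.0/serving-endpoints/{name}",
                    headers=HEADERS).raise_for_status()

if __name__ == "__main__":
    create_endpoint("ci-test-endpoint")
    # ... run integration tests here ...
    delete_endpoint("ci-test-endpoint")
```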
- 2985 Views
- 3 replies
- 0 kudos
Notebook Dashboard to HTML to PDF issues
I have created a dashboard using the notebook dashboard interface, rather than the SQL warehouse dashboards. This means that the tables and visualizations on the dashboard, as well as the dashboard itself, are directly tied to a notebook and the outp...
@dataVaughan - you can use the Lakeview dashboard, which can provide a URL that is shareable outside of the Databricks workspace. https://www.databricks.com/blog/announcing-public-preview-lakeview-dashboards In your current scenario, you can clone ...
- 1639 Views
- 1 replies
- 0 kudos
Resolved! Fetching metadata for tables in a database stored in Unity Catalog
Hi everyone, I am trying to fetch the metadata of every column from a table, and every table from a database under a catalog. For that I am trying to use the samples catalog provided by Databricks and get details for the tpch database that provi...
@sai_sathya - you can use the DESCRIBE EXTENDED command to get the metadata of a given table. Also, you can query information_schema.columns within your UC catalog to check the column details of a given table.
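A minimal sketch of the reply's two approaches, runnable in a Databricks notebook; samples.tpch.orders ships with Databricks, while the catalog and schema in the second query are placeholders for your own Unity Catalog objects.

```python
# Hedged sketch: table-level metadata via DESCRIBE EXTENDED, column-level
# metadata via the catalog's information_schema.
spark.sql("DESCRIBE EXTENDED samples.tpch.orders").show(truncate=False)

catalog, schema = "my_catalog", "my_schema"   # placeholders
cols = spark.sql(f"""
    SELECT table_name, column_name, data_type, ordinal_position
    FROM {catalog}.information_schema.columns
    WHERE table_schema = '{schema}'
    ORDER BY table_name, ordinal_position
""")
cols.show()
```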
- 3910 Views
- 5 replies
- 1 kudos
Resolved! Can't run .py file using workflows anymore
Dear all, greetings! I have been trying to run a workflow job which runs successfully when a task is created using a Notebook file from a folder present in the Workspace, but when the same task's type is changed to Python script and a .py file is select...
Hi, I have found the solution. It was due to the following option being enabled under the Feature Enablement tab under Databricks_Account_Console --> Settings. Thank you for all your help and for trying! Regards, Uday
- 1446 Views
- 4 replies
- 0 kudos
Parameterizing DLT Pipelines
Hi everyone, I have a DLT pipeline which I need to execute for different source systems. I need advice on how to parameterize this. I have gone through many articles on the web, but it seems there is no accurate information available. Can anyone please hel...
Thank you @AmanSehgal, I have done that and was able to execute the pipeline successfully. But I need to change the parameter value at run time, so that the same pipeline can be used for multiple sources. Can we pass parameters from Job to DLT Pipeli...
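The thread does not include code; a minimal sketch of the pattern being discussed, assuming a pipeline configuration key (here the hypothetical mypipeline.source_system) set in the DLT pipeline settings and read inside the pipeline notebook. The key name, default value, and landing path are placeholders.

```python
# Hedged sketch: read a DLT pipeline configuration value and use it to
# parameterize the source path and target table name.
import dlt
from pyspark.sql import functions as F

source_system = spark.conf.get("mypipeline.source_system", "source1")  # placeholder key/default

@dlt.table(name=f"bronze_{source_system}_events")
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load(f"/Volumes/main/landing/{source_system}/events/")  # placeholder path
        .withColumn("source_system", F.lit(source_system))
    )
```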
- 1674 Views
- 2 replies
- 0 kudos
Resolved! Databricks: failure logs
Hello Team, I am new to Databricks. Where are all the logs generally stored in Databricks? I see that if any job fails, below the command I can see some error messages. Otherwise, in real time, how can I check the log files/error messages in the Databricks UI? T...
- 2024 Views
- 2 replies
- 1 kudos
Resolved! Cannot create Delta location with mount path
Hi all, I'm trying to create a table but cannot use a predefined mount path like '/mnt/silver/'; if I use the full path of the Azure blob container it creates successfully, like this: `CREATE TABLE IF NOT EXISTS nhan_databricks.f1_processed.circuits (...
Oh, thanks for your answer; actually I'm using Unity Catalog.
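The thread stops before a final fix; a sketch of the usual Unity Catalog pattern, where the table LOCATION is a cloud URI covered by an external location rather than a legacy /mnt mount. The storage account and column list are placeholders; the catalog, schema, and table names come from the post.

```python
# Hedged sketch: create the table against an abfss:// URI that an external
# location grants access to, instead of a workspace-local /mnt path.
spark.sql("""
    CREATE TABLE IF NOT EXISTS nhan_databricks.f1_processed.circuits (
        circuit_id INT,
        name STRING
    )
    USING DELTA
    LOCATION 'abfss://silver@<storage-account>.dfs.core.windows.net/f1/circuits'
""")
```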
- 1101 Views
- 1 replies
- 0 kudos
Can't Create a Workspace using Google Cloud
Trying to create my first workspace. I hit 'create workspace' and I see 3 buckets being created in my GCP project, but nothing shows up in the actual 'workspaces' list in my Databricks console; the only thing there is the 'create workspace' button. Also, there is no erro...
- 923 Views
- 1 replies
- 0 kudos
Ingesting Non-Incremental Data into Delta
Hello, I have non-incremental data landing in a storage account. This data contains old data from before as well as new data. I would like to avoid doing a complete table deletion and table creation just to upload the data from storage and have an upd...
Well, if you know the conditions that separate new data from old data, then while reading the data into your dataframe, use a filter or where clause to select the new data and ingest it into your Delta table. This is how you can do it in general. But if you ha...
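One way to apply the reply's suggestion, sketched with a watermark filter plus a Delta MERGE so any re-read old rows do not create duplicates; the table name, landing path, key column, and timestamp column are placeholders.

```python
# Hedged sketch: re-read the full non-incremental dump, keep only rows newer
# than what the Delta table already holds, and MERGE them in.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

target = DeltaTable.forName(spark, "main.default.orders")              # placeholder table

# High-water mark from the existing table (assumes an "updated_at" column).
latest = target.toDF().agg(F.max("updated_at")).first()[0]

incoming = spark.read.format("parquet").load(
    "abfss://landing@mystorage.dfs.core.windows.net/orders/"           # placeholder path
)
new_rows = incoming if latest is None else incoming.filter(F.col("updated_at") > F.lit(latest))

(target.alias("t")
 .merge(new_rows.alias("s"), "t.order_id = s.order_id")                # placeholder key
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```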