Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 3127 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
sjmb
by New Contributor
  • 6557 Views
  • 3 replies
  • 0 kudos

Which object to use in which layer

I completed the Data Engineering Lakehouse course and I am familiar with the different objects and concepts of Databricks and the lakehouse, but I can't tie them together in my mind. Where do you typically use managed and non-managed tables? Bronze layer? Or no...

Warehousing & Analytics
Databricks
datalake
Lakehouse
Latest Reply
gabsylvain
Databricks Employee
  • 0 kudos

Hi @sjmb, in the Medallion architecture, the usage of managed and non-managed tables, Auto Loader (cloud files), and the APPLY CHANGES INTO function can vary depending on the layer (Bronze, Silver, Gold) and the specific use case. As a general rule, us...
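(A rough illustration of that general rule, not part of the original reply: a Bronze layer might use an external/unmanaged table over raw files the pipeline does not own, while Silver and Gold typically use managed Delta tables. All names and paths below are hypothetical.)

```python
# Hypothetical sketch only: external (unmanaged) Delta table at a storage location
# the team manages for Bronze, managed Delta table for the curated Silver layer.

# Bronze: external table registered at an explicit cloud storage location
spark.sql("""
    CREATE TABLE IF NOT EXISTS bronze.raw_orders
    USING DELTA
    LOCATION 'abfss://landing@examplestorage.dfs.core.windows.net/orders'
""")

# Silver: managed table, so Databricks owns the storage location and lifecycle
spark.sql("""
    CREATE TABLE IF NOT EXISTS silver.orders_clean AS
    SELECT * FROM bronze.raw_orders WHERE order_id IS NOT NULL
""")
```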

2 More Replies
Eric_Kieft
by New Contributor III
  • 3058 Views
  • 2 replies
  • 0 kudos

Unity Catalog "this table is deprecated" Functionality

We found a post on LinkedIn that revealed that if "this table is deprecated" is added to a table comment, the table will appear with a strikethrough in notebooks and SQL editor windows. Is this functionality GA? Is there any documentation on the use of ...
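(For anyone who wants to try the behaviour described above, a minimal way to set such a comment is shown below; the table name is hypothetical and the strikethrough rendering is as described in the post, not verified by this snippet.)

```python
# Hypothetical example: add the phrase to a table comment so that notebooks and the
# SQL editor render the table name with a strikethrough, as described in the post.
spark.sql("COMMENT ON TABLE main.sales.legacy_orders IS 'this table is deprecated'")
```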

Latest Reply
Eric_Kieft
New Contributor III
  • 0 kudos

Thanks, @arpit! Is there any documentation on this feature?

1 More Replies
Shailu1
by New Contributor II
  • 3509 Views
  • 3 replies
  • 2 kudos

Resolved! Snowflake vs Databricks SQL Endpoint for data warehousing: which is more persistent?

Snowflake vs Databricks SQL Endpoint for data warehousing: which is more persistent?

Latest Reply
Pritesh2
New Contributor II
  • 2 kudos

Databricks and Snowflake are both powerful platforms designed to address different aspects of data processing and analytics. Databricks shines in big data processing, machine learning, and AI workloads, while Snowflake excels in data warehousing, sto...

2 More Replies
BobDobalina
by New Contributor II
  • 3352 Views
  • 3 replies
  • 2 kudos

Resolved! Add timestamp to table name using SQL Editor

Hi, I'm sure I'm missing something, as this should be trivial, but I'm struggling to find how to add a suffix with a date to a table name. Does anyone have a way to do this? Thanks

Latest Reply
shan_chandra
Databricks Employee
  • 2 kudos

Hi @BobDobalina - Dynamic table names are not allowed in DBSQL. However, you can try something similar: %python from datetime import datetime date_suffix = datetime.now().strftime("%Y%m%d") table_name = f"students{date_suffix}" spark.sql(f"CR...
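(A completed sketch of the pattern outlined in that reply, run from a Python notebook cell; the table schema is illustrative only.)

```python
from datetime import datetime

# Build a date suffix such as 20240115 and append it to the base table name.
date_suffix = datetime.now().strftime("%Y%m%d")
table_name = f"students{date_suffix}"

# Create the table using the dynamically built name; the schema is a placeholder.
spark.sql(f"CREATE TABLE IF NOT EXISTS {table_name} (id INT, name STRING)")
```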

2 More Replies
subbaram
by New Contributor II
  • 1587 Views
  • 1 reply
  • 0 kudos

Create alert and send notification to owner of table

We have a use case where we need to send a notification to the owners of each table/volume in a schema if the creation date of the table/volume is more than 30 days ago, by triggering a notebook script or through the REST API. Will there be a chance that we get the...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

Hi @subbaram - you can create a simple Python script by querying the system table system.information_schema.tables, building a dynamic list of tables whose creation date is more than 30 days old, and alerting the table_owner via email. Hope this helps! Thanks, Shan
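(A rough sketch of that approach, assuming system tables are enabled in the workspace; the schema filter is hypothetical and the notification step is a placeholder, since the reply does not specify an email mechanism.)

```python
# Find tables older than 30 days and collect their owners from the system catalog.
old_tables = spark.sql("""
    SELECT table_catalog, table_schema, table_name, table_owner, created
    FROM system.information_schema.tables
    WHERE table_schema = 'my_schema'                      -- hypothetical schema
      AND created < current_timestamp() - INTERVAL 30 DAYS
""").collect()

for row in old_tables:
    # Placeholder: wire this up to your own email or webhook integration.
    print(f"Notify {row.table_owner}: "
          f"{row.table_catalog}.{row.table_schema}.{row.table_name} was created on {row.created}")
```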

EminBoz
by New Contributor II
  • 5109 Views
  • 4 replies
  • 1 kudos

"Revoke" permissions for SQL-Warehouse with API

Hello dear community, I am trying to revoke permissions with the API for SQL Warehouses. Granting permissions isn't a problem and works like a charm, but revoking won't work. I tried "NO_PERMISSIONS", "NO PERMISSIONS", "DENY", "REVOKE", but I al...

Latest Reply
Haris12
New Contributor II
  • 1 kudos

Cuphead APK is your go-to destination for the latest versions of the beloved game, Cuphead, on Android. We provide a curated selection of up-to-date APK files, ensuring that you can enjoy the thrilling adventures of Cuphead on your Android device has...

3 More Replies
SaugatMukherjee
by New Contributor III
  • 6370 Views
  • 0 replies
  • 2 kudos

SQL Alert Email with QUERY_RESULTS_TABLE results in table with no border

Hi, I am sending Databricks SQL alerts to an email address and am trying to get the query results table in the body of the email. I have used a custom template with {{QUERY_RESULT_TABLE}} and this works fine for a Teams alert. In Teams, I can see the table prope...

RobinK
by Contributor
  • 3297 Views
  • 2 replies
  • 1 kudos

Resolved! Use SQL Command LIST Volume for Alerts

Hi, we have implemented a Databricks Workflow that saves an Excel sheet to a Databricks Volume. Now we want to notify users with an Alert when new data arrives in the volume. In the docs I found the SQL command LIST, which returns the columns path, nam...

Latest Reply
gabsylvain
Databricks Employee
  • 1 kudos

Hi @RobinK, I've tested your code and I was able to reproduce the error. Unfortunately, I haven't found a pure SQL alternative for selecting the results of the LIST command as part of a subquery or CTE and creating an alert based on that. Fortunately...
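(Since the rest of that reply is truncated here, the following is only a guess at one possible workaround, not necessarily what was proposed: list the Volume from Python, persist the listing to a table, and point the alert's query at that table. The Volume path and table name are hypothetical.)

```python
# List the files in a Unity Catalog Volume (path is hypothetical).
files = dbutils.fs.ls("/Volumes/my_catalog/my_schema/my_volume/")

# Persist the listing so a SQL alert can query it like any other table.
listing_df = spark.createDataFrame(
    [(f.path, f.name, f.size, f.modificationTime) for f in files],
    "path STRING, name STRING, size LONG, modification_time LONG",
)
listing_df.write.mode("overwrite").saveAsTable("my_catalog.my_schema.volume_listing")
```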

1 More Replies
heymiky
by New Contributor
  • 2352 Views
  • 0 replies
  • 0 kudos

Enabling HTML content in Dashboard Visuals

Hi, I'm seeking some help creating visuals using HTML in SQL queries, similar to those in the Retail Revenue & Supply Chain sample dashboards. When I create my queries based on these, my results display the HTML code instead of the HTML-formatted result...

adisalj
by New Contributor II
  • 10559 Views
  • 3 replies
  • 1 kudos

TABLE_OR_VIEW_NOT_FOUND of deep clones

Hello community, we're cloning (deep clones) data objects from the production catalog to our non-production catalog weekly. The non-production catalog is used to run our DBT transformations to ensure we're not breaking any production models. Lately, we h...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@adisalj I have a small question: how are you handling the deep-cloned data in the target? Are you creating a managed table with the data that is being cloned into the target? Can you please post a sample query that you are using between your catalogs to do the deep clone? I am f...
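(For context, a typical cross-catalog deep clone statement looks roughly like the following; the catalog, schema, and table names are hypothetical and not taken from the thread.)

```python
# Hypothetical weekly cross-catalog deep clone, for illustration only.
spark.sql("""
    CREATE OR REPLACE TABLE nonprod_catalog.sales.orders
    DEEP CLONE prod_catalog.sales.orders
""")
```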

2 More Replies
Jennifer
by New Contributor III
  • 10174 Views
  • 2 replies
  • 0 kudos

Resolved! Why does readStream filter go through all records?

Hello, I am running Spark Structured Streaming, reading from one table, table_1, doing some aggregation, and then writing the results to another table. table_1 is partitioned by ["datehour", "customerID"]. My code is like this: spark.readStream.format("delta").tabl...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

To define the initial position, please check this: https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/delta-lake#specify-initial-position
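(For reference, the linked page covers the startingVersion and startingTimestamp options on the Delta source; a minimal sketch is below, with a hypothetical timestamp.)

```python
# Start the stream from a given point in the Delta table's history rather than
# scanning all existing records. The timestamp/version values are illustrative.
stream_df = (
    spark.readStream.format("delta")
    .option("startingTimestamp", "2024-01-01")  # or .option("startingVersion", "10")
    .table("table_1")
)
```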

1 More Replies
nijhuist
by New Contributor II
  • 3268 Views
  • 1 reply
  • 0 kudos

Import warnings when running DBT as a package on Databricks 13.3LTS

Executing dbt as a Python package triggers about 200 import warnings when run on Databricks Runtime 13.3 but not on 12.2. The warnings are all the same:  <frozen importlib._bootstrap>:914: ImportWarning: ImportHookFinder.find_spec() not found; fallin...

SivaPK
by New Contributor II
  • 6261 Views
  • 1 reply
  • 0 kudos

How to find the distinct count of the below listed result from the table?

Hi, how do I get the distinct count from the values listed below? Column: keywords, table: appCatalog. Sample keywords values: ["data","cis","mining","financial","pso","value"], ["bzo","employee news","news"], ["core.store","fbi"], ["d...
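(Because the question is truncated here, the following is only a guess at the intent: parse each keywords value as a JSON array, explode it, and count the distinct keywords. The table and column names come from the post; the parsing approach is assumed.)

```python
from pyspark.sql import functions as F

# Assumes each keywords value is a JSON-style array string such as
# ["data","cis","mining","financial","pso","value"].
df = spark.table("appCatalog")

distinct_count = (
    df.select(F.explode(F.from_json("keywords", "array<string>")).alias("keyword"))
      .agg(F.countDistinct("keyword").alias("distinct_keyword_count"))
)

distinct_count.show()
```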

Amy
by New Contributor II
  • 6740 Views
  • 3 replies
  • 1 kudos

Gantt chart in dashboard

I'd like to create Gantt charts using the dashboard function. It seems like this could be possible by adding some additional parameters in the bar plot functionality, but I don't see how to do it currently (if there is a way, would love an example!)....

Latest Reply
alexiswl
Contributor
  • 1 kudos

Hi @Vidula, I don't think this has been resolved. I think Gantt charts would look fantastic in a Lakeview Dashboard.

2 More Replies
Avin_Kohale
by New Contributor
  • 51046 Views
  • 4 replies
  • 4 kudos

Import python files as modules in workspace

I'm deploying a new workspace for testing the deployed notebooks. But when trying to import the Python files as modules in the newly deployed workspace, I'm getting an error saying "function not found". Two points to note here: 1. If I append absolute p...

Latest Reply
TimReddick
Contributor
  • 4 kudos

Hi @Retired_mod, I see your suggestion to append the necessary path to sys.path. I'm curious if this is the recommendation for projects deployed via Databricks Asset Bundles. I want to maintain a project structure that looks something like this: p...
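(Since the project layout in that reply is truncated, the following is just one common pattern and not necessarily what either reply had in mind: append a source folder, located relative to the notebook, to sys.path before importing. The folder and module names are hypothetical.)

```python
import os
import sys

# Hypothetical layout: the notebook lives next to a src/ folder in the bundle root.
# Appending the folder relative to the notebook avoids hard-coding an absolute
# workspace path that changes between deployed workspaces.
sys.path.append(os.path.abspath("../src"))

import my_module  # hypothetical module under src/
```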

3 More Replies