Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Data + AI Summit 2024 - Data Warehousing, Analytics, and BI

Forum Posts

MadelynM
by Contributor II
  • 637 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
Shailu1
by New Contributor II
  • 2107 Views
  • 3 replies
  • 2 kudos

Resolved! Snowflake vs Databricks SQL Endpoint for data warehousing: which is more persistent?

Snowflake vs Databricks SQL Endpoint for data warehousing: which is more persistent?

Latest Reply
Pritesh2
New Contributor II
  • 2 kudos

Databricks and Snowflake are both powerful platforms designed to address different aspects of data processing and analytics. Databricks shines in big data processing, machine learning, and AI workloads, while Snowflake excels in data warehousing, sto...

2 More Replies
BobDobalina
by New Contributor II
  • 1876 Views
  • 3 replies
  • 2 kudos

Resolved! Add timestamp to table name using SQL Editor

Hi, I'm sure I'm missing something as this should be trivial, but I'm struggling to find how to add a suffix with a date to a table name. Does anyone have a way to do this? Thanks

Latest Reply
shan_chandra
Esteemed Contributor
  • 2 kudos

Hi @BobDobalina  - Dynamic naming of table name is not allowed in DBSQL. However, you can try something similar %python from datetime import datetime date_suffix = datetime.now().strftime("%Y%m%d") table_name = f"students{date_suffix}" spark.sql(f"CR...
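A minimal sketch completing the approach described above, assuming a Databricks notebook where `spark` is the provided session; the table name `students` and its columns are placeholders:

```python
from datetime import datetime

# Build a date-suffixed table name, e.g. students20240703
date_suffix = datetime.now().strftime("%Y%m%d")
table_name = f"students{date_suffix}"  # "students" is a placeholder base name

# DBSQL does not allow parameterizing identifiers, so the name is interpolated
# into the statement in Python before it is submitted.
spark.sql(f"CREATE TABLE IF NOT EXISTS {table_name} (id INT, name STRING)")
```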

2 More Replies
subbaram
by New Contributor II
  • 963 Views
  • 1 reply
  • 0 kudos

Create alert and send notification to owner of table

We have a use case where we need to send a notification to the owner of each table/volume in a schema if the table/volume was created more than 30 days ago, by triggering a notebook script or through the REST API. Will there be a chance that we get the...

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

Hi @subbaram - you can create a simple Python script by querying the system table system.information_schema.tables, build a dynamic list of tables whose creation date is more than 30 days ago, and alert the table_owner via email. Hope this helps! Thanks, Shan
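A hedged sketch of that approach; the column names `created` and `table_owner` are assumed from system.information_schema.tables, and the actual email/notification step is left as a placeholder:

```python
# Find tables created more than 30 days ago.
old_tables = spark.sql("""
    SELECT table_catalog, table_schema, table_name, table_owner, created
    FROM system.information_schema.tables
    WHERE created < date_sub(current_date(), 30)
""")

# Hand the per-owner list to whatever notification mechanism you use
# (SMTP, a webhook, a job alert); the print below is just a stand-in.
for row in old_tables.collect():
    print(f"Notify {row.table_owner}: "
          f"{row.table_catalog}.{row.table_schema}.{row.table_name} created {row.created}")
```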

Ismail1
by New Contributor III
  • 2399 Views
  • 2 replies
  • 0 kudos

Migrating from Postgres/MySQL databases to Databricks

Hi all, working on this project, my team plans to migrate some data from some databases to Databricks. We plan to run this migration by submitting queries to a warehouse through Python on a local machine. Now I was wondering what would be the best app...

Latest Reply
Rom
New Contributor III
  • 0 kudos

Hi, your solution is good, but if I were in charge of this migration I would: create the structure of all tables with their constraints in the Databricks warehouse; export all data from the MySQL tables as CSV or TXT files; write a notebook with PySpark code to...
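A short sketch of the load step outlined above, assuming the exported CSV files have been uploaded to a Unity Catalog volume; the paths and table names are placeholders:

```python
# Read one exported MySQL table from CSV and write it as a managed table.
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/Volumes/main/staging/mysql_export/customers/"))  # hypothetical volume path

(df.write
   .mode("overwrite")
   .saveAsTable("main.staging.customers"))  # hypothetical target table
```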

1 More Replies
EminBoz
by New Contributor II
  • 2949 Views
  • 4 replies
  • 1 kudos

"Revoke" permissions for SQL-Warehouse with API

Hello dear community, I am trying to revoke permissions for a SQL Warehouse with the API. Granting permissions isn't a problem and works like a charm, but revoking won't work. I tried "NO_PERMISSIONS", "NO PERMISSIONS", "DENY", "REVOKE", but I alway...

Latest Reply
Haris12
New Contributor II
  • 1 kudos

Cuphead APK is your go-to destination for the latest versions of the beloved game, Cuphead, on Android. We provide a curated selection of up-to-date APK files, ensuring that you can enjoy the thrilling adventures of Cuphead on your Android device has...

3 More Replies
SaugatMukherjee
by New Contributor III
  • 2472 Views
  • 0 replies
  • 1 kudos

SQL Alert Email with QUERY_RESULTS_TABLE results in table with no border

Hi, I am sending Databricks SQL alerts to an email. I am trying to get the query results table in the body of the email. I have used a custom template with {{QUERY_RESULT_TABLE}} and this works fine for a Teams alert. In Teams, I can see the table prope...

RobinK
by Contributor
  • 1960 Views
  • 2 replies
  • 1 kudos

Resolved! Use SQL Command LIST Volume for Alerts

Hi, we have implemented a Databricks Workflow that saves an Excel sheet to a Databricks Volume. Now we want to notify users with an Alert when new data arrives in the volume. In the docs I found the SQL command LIST, which returns the columns path, nam...

Latest Reply
gabsylvain
Contributor
  • 1 kudos

Hi @RobinK, I've tested your code and I was able to reproduce the error. Unfortunately, I haven't found a pure SQL alternative for selecting the results of the LIST command as part of a subquery or CTE and creating an alert based on that. Fortunately...
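One possible Python workaround (a sketch, not the poster's confirmed solution): list the volume with `dbutils.fs.ls` and persist the listing into a table that a DBSQL alert query can read. The volume path and target table below are placeholders, and the `modificationTime` field assumes a recent runtime:

```python
# List the files currently in the volume.
files = dbutils.fs.ls("/Volumes/main/default/excel_drop/")  # hypothetical volume

# Persist the listing so an alert query can run against it on a schedule.
files_df = spark.createDataFrame(
    [(f.path, f.name, f.size, f.modificationTime) for f in files],
    ["path", "name", "size", "modification_time"],
)
files_df.write.mode("overwrite").saveAsTable("main.default.excel_drop_listing")

# The alert condition can then be something like:
# SELECT count(*) FROM main.default.excel_drop_listing WHERE modification_time > <last check>
```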

1 More Replies
heymiky
by New Contributor
  • 1122 Views
  • 0 replies
  • 0 kudos

Enabling HTML content in Dashboard Visuals

Hi, I'm seeking some help creating visuals using HTML in SQL queries, similar to those in the Retail Revenue & Supply Chain sample dashboards. When I create my queries based on these, my results display the HTML code instead of the HTML-formatted result...

adisalj
by New Contributor II
  • 9510 Views
  • 3 replies
  • 1 kudos

TABLE_OR_VIEW_NOT_FOUND of deep clones

Hello community, we're cloning (deep clones) data objects of the production catalog to our non-production catalog weekly. The non-production catalog is used to run our DBT transformations to ensure we're not breaking any production models. Lately, we h...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@adisalj I have a small question about how you are handling the deep-cloned data in the target: are you creating a managed table with the data that is being cloned into the target? Can you please post a sample query that you are using between your catalogs to do the deep clone? I am f...
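For reference, a minimal deep clone statement of the kind being asked for; the catalog, schema and table names are placeholders:

```python
# Deep clone copies the source table's data and metadata into the target table,
# creating (or replacing) it as a managed table in the non-production catalog.
spark.sql("""
    CREATE OR REPLACE TABLE nonprod.sales.orders
    DEEP CLONE prod.sales.orders
""")
```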

2 More Replies
Jennifer
by New Contributor III
  • 8164 Views
  • 2 replies
  • 0 kudos

Resolved! Why does readStream filter go through all records?

Hello, I am running Spark Structured Streaming, reading from one table, table_1, doing some aggregation and then writing results to another table. table_1 is partitioned by ["datehour", "customerID"]. My code is like this: spark.readStream.format("delta").tabl...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

To define the initial position please check this:https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/delta-lake#specify-initial-position
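A hedged sketch of what specifying an initial position looks like for a Delta streaming source; the table name, timestamp, and filter value are placeholders:

```python
# startingTimestamp (or startingVersion) controls where the stream begins reading
# the Delta table's history, so the first batch does not scan all old records.
stream_df = (spark.readStream
             .format("delta")
             .option("startingTimestamp", "2024-01-01")
             .table("table_1")
             .filter("datehour >= '2024-01-01'"))
```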

1 More Replies
nijhuist
by New Contributor II
  • 2209 Views
  • 1 reply
  • 0 kudos

Import warnings when running DBT as a package on Databricks 13.3LTS

Executing dbt as a Python package triggers about 200 import warnings when run on Databricks Runtime 13.3 but not on 12.2. The warnings are all the same:  <frozen importlib._bootstrap>:914: ImportWarning: ImportHookFinder.find_spec() not found; fallin...

SivaPK
by New Contributor II
  • 3411 Views
  • 1 reply
  • 0 kudos

How to find the distinct count of the below listed result from the table?

Hi, how do I get the distinct count from the below-listed result from the table? keywords = column name, table = appCatalog. Sample values in the keywords column: ["data","cis","mining","financial","pso","value"], ["bzo","employee news","news"], ["core.store","fbi"], ["d...
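A sketch of one way to get the distinct keyword count, assuming the keywords column holds JSON-array strings as shown; adjust the parsing if the column is already an array type:

```python
from pyspark.sql import functions as F

df = spark.table("appCatalog")

# Parse each JSON-array string, explode the elements, and count distinct values.
distinct_keywords = (
    df.select(F.explode(F.from_json("keywords", "array<string>")).alias("keyword"))
      .agg(F.countDistinct("keyword").alias("distinct_keyword_count"))
)
distinct_keywords.show()
```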

Amy
by New Contributor II
  • 4223 Views
  • 3 replies
  • 1 kudos

Gantt chart in dashboard

I'd like to create Gantt charts using the dashboard function. It seems like this could be possible by adding some additional parameters in the bar plot functionality, but I don't see how to do it currently (if there is a way, would love an example!)....

Latest Reply
alexiswl
Contributor
  • 1 kudos

Hi @Vidula, I don't think this has been resolved. I think Gantt charts would look fantastic in a Lakeview Dashboard.

2 More Replies
Avin_Kohale
by New Contributor
  • 26292 Views
  • 4 replies
  • 4 kudos

Import python files as modules in workspace

I'm deploying a new workspace for testing the deployed notebooks. But when trying to import the Python files as modules in the newly deployed workspace, I'm getting an error saying "function not found". Two points to note here: 1. If I append absolute p...

Latest Reply
TimReddick
Contributor
  • 4 kudos

Hi @Retired_mod, I see your suggestion to append the necessary path to the sys.path. I'm curious if this is the recommendation for projects deployed via Databricks Asset Bundles. I want to maintain a project structure that looks something like this:p...
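A hedged sketch of the sys.path approach under discussion, assuming a hypothetical bundle layout of <bundle_root>/src/<package> with notebooks one level below the root; the package and module names are made up for illustration:

```python
import os
import sys

# Resolve the bundle root relative to the notebook's working directory and put
# the src directory on the import path. Adjust the relative path to your layout.
bundle_root = os.path.abspath(os.path.join(os.getcwd(), ".."))
sys.path.append(os.path.join(bundle_root, "src"))

from my_package import my_module  # hypothetical package/module names
```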

3 More Replies
tranbau
by New Contributor
  • 666 Views
  • 0 replies
  • 0 kudos

Dynamic Spark Structured Streaming: Handling Stream-Stream Joins with Changing

I want to create a simple application using Spark Structured Streaming to alert users (via email, SMS, etc.) when stock price data meets certain requirements. I have a data stream: data_stream. However, I'm struggling to address the main issue: how users...

Warehousing & Analytics
kafka
spark
spark-structured-streaming
stream-stream join

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group