Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 2989 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
Avin_Kohale
by New Contributor
  • 50095 Views
  • 4 replies
  • 4 kudos

Import python files as modules in workspace

I'm deploying a new workspace for testing the deployed notebooks. But when trying to import the python files as modules in the newly deployed workspace, I'm getting an error saying "function not found". Two points to note here: 1. If I append absolute p...

Latest Reply
TimReddick
Contributor
  • 4 kudos

Hi @Retired_mod, I see your suggestion to append the necessary path to the sys.path. I'm curious if this is the recommendation for projects deployed via Databricks Asset Bundles. I want to maintain a project structure that looks something like this:p...

3 More Replies
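For readers weighing the sys.path suggestion discussed in this thread, here is a minimal sketch of the pattern; the src.utils module, my_helper_function name and the relative path are placeholders, and the right path depends on how your bundle or Repo lays files out in the target workspace.

```python
import os
import sys

# Hypothetical layout: a "src" package with helper modules sits at the repo root.
# Appending a path relative to the notebook's working directory avoids hard-coding
# an absolute /Workspace/... path that changes between deployed workspaces.
repo_root = os.path.abspath("..")             # adjust to wherever your package lives
if repo_root not in sys.path:
    sys.path.append(repo_root)

from src.utils import my_helper_function     # placeholder module and function
```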
tranbau
by New Contributor
  • 1217 Views
  • 0 replies
  • 0 kudos

Dynamic Spark Structured Streaming: Handling Stream-Stream Joins with Changing

I want to create a simple application using Spark Structured Streaming to alert users (via email, SMS, etc.) when stock price data meets certain requirements. I have a data stream: data_stream. However, I'm struggling to address the main issue: how users...

Warehousing & Analytics
kafka
spark
spark-structured-streaming
stream-stream join
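As a starting point for the stream-stream join question above, here is a hedged PySpark sketch that treats the alert rules as a second stream so they can change over time; the Kafka broker, topic names and schemas are assumptions, and the Kafka connector must be available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("price-alerts").getOrCreate()

price_schema = StructType([
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])
rule_schema = StructType([
    StructField("symbol", StringType()),
    StructField("threshold", DoubleType()),
    StructField("rule_time", TimestampType()),
])

def read_topic(topic, schema):
    """Read a Kafka topic and parse its JSON payload into columns."""
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
        .option("subscribe", topic)
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("v"))
        .select("v.*")
    )

# Watermarks bound the state the stream-stream join has to keep.
prices = read_topic("stock_prices", price_schema).withWatermark("event_time", "10 minutes").alias("p")
rules = read_topic("alert_rules", rule_schema).withWatermark("rule_time", "1 hour").alias("r")

# Each price is matched against alert rules for the same symbol that arrived
# within the last hour, then filtered on the user's threshold.
alerts = (
    prices.join(
        rules,
        F.expr(
            "p.symbol = r.symbol AND "
            "p.event_time BETWEEN r.rule_time AND r.rule_time + INTERVAL 1 HOUR"
        ),
    )
    .where(F.col("p.price") >= F.col("r.threshold"))
)

query = (
    alerts.writeStream
    .format("console")      # replace with a sink that sends the email/SMS alert
    .outputMode("append")
    .start()
)
```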
Zer
by New Contributor II
  • 6767 Views
  • 4 replies
  • 4 kudos

SQL Editor, Tab Stops Indenting

Greetings, I use Databricks through Azure. Frequently when I'm working in the SQL Editor, the tab button fails to indent; instead, it forces my cursor to a seemingly random part of the page. It's been doing this since I started working in the pl...

Latest Reply
HannesM
New Contributor II
  • 4 kudos

Same issue here. Sometimes it works by selecting a single but complete line and then hitting tab. If it works, then indentation works on multiple lines again as well. However, the single-line select doesn't always work either. Pretty inconsistent beh...

3 More Replies
Mat
by New Contributor III
  • 15563 Views
  • 4 replies
  • 3 kudos

Connect to Databricks SQL Endpoint using Programming language

Hi, I would like to know whether it is feasible, and what options are available, to connect to a Databricks SQL endpoint using a programming language like Java/Scala/C#. I can see the JDBC URL, but would like to know whether it can be treated as any other JDBC conn...

Latest Reply
StephanieAlba
Databricks Employee
  • 3 kudos

I found a similar question on Stack Overflow: https://stackoverflow.com/questions/77477103/ow-to-properly-connect-to-azure-databricks-warehouse-from-c-sharp-net-using-jdb

3 More Replies
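The JDBC URL shown in the warehouse's connection details can be used like any other JDBC data source from Java/Scala/C# with the Databricks JDBC driver. For comparison, here is a minimal sketch using the Databricks SQL Connector for Python; the hostname, HTTP path and token are placeholders.

```python
from databricks import sql   # pip install databricks-sql-connector

connection = sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder host
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder HTTP path
    access_token="dapiXXXXXXXXXXXXXXXX",                           # placeholder token
)
cursor = connection.cursor()
cursor.execute("SELECT current_date() AS today")
for row in cursor.fetchall():
    print(row)
cursor.close()
connection.close()
```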
bradleyjamrozik
by New Contributor III
  • 1264 Views
  • 0 replies
  • 0 kudos

Server ODBC Connection

Is there a preferred method for hosting an ODBC connection to a warehouse on a server for use by a report server (SSRS/PBIRS)? I know the ODBC driver doesn't support pass-through authentication, so is there a way to configure it with an unattended ac...

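One common non-interactive setup is a DSN-less ODBC connection string that authenticates with a token owned by an unattended or service account. The sketch below is hedged; the driver name, host, HTTP path, token and the exact keyword names are placeholders and may vary with the driver version installed on the report server.

```python
import pyodbc

# DSN-less connection string using a Databricks access token so no interactive
# sign-in is needed. Keyword names follow the Simba Spark / Databricks ODBC driver
# conventions but should be checked against your installed driver's documentation.
conn_str = (
    "Driver=Simba Spark ODBC Driver;"                      # placeholder driver name
    "Host=adb-1234567890123456.7.azuredatabricks.net;"     # placeholder workspace host
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/abcdef1234567890;"       # placeholder warehouse path
    "SSL=1;"
    "ThriftTransport=2;"
    "AuthMech=3;"                                          # user/password mechanism
    "UID=token;"                                           # literal username "token"
    "PWD=dapiXXXXXXXXXXXXXXXX;"                            # placeholder access token
)

with pyodbc.connect(conn_str, autocommit=True) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT 1")
    print(cursor.fetchone())
```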
cyong
by New Contributor II
  • 1439 Views
  • 1 reply
  • 0 kudos

Dynamic measure calculation based on filter in Excel

Hi, currently we are using Power BI as the semantic layer because it allows us to build custom measures for aggregates and business-logic calculations, and it provides a native connection to Excel. I am thinking of moving this logic to Databricks using S...

Latest Reply
cyong
New Contributor II
  • 0 kudos

Thanks @Retired_mod, I think Power Query can only perform pre-data transformations, not on-the-fly calculations in response to user filters.

scrimpton
by New Contributor II
  • 2446 Views
  • 1 reply
  • 0 kudos

Resolved! Delta Sharing with Power BI

Using the Delta Sharing connector with Power BI, does it only work in Import mode, with currently no support for DirectQuery?

Warehousing & Analytics
DELTA SHARING
Power BI
Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@scrimpton Currently it only supports Import: https://learn.microsoft.com/en-us/power-query/connectors/delta-sharing

Datbth
by New Contributor
  • 888 Views
  • 0 replies
  • 0 kudos

Cancel SQL statement using ODBC driver

Hi, I'm implementing a Databricks connector using the ODBC driver and am currently working on the functionality to cancel an ongoing SQL statement. However, I can't seem to find any ODBC function or SQL function to do so. The only other alternative I see i...

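ODBC does define a cancel call (SQLCancel), which pyodbc exposes as Cursor.cancel(); whether the Databricks ODBC driver honors it for an in-flight statement should be verified. A hedged sketch of the pattern, with a placeholder DSN and query:

```python
import threading
import time
import pyodbc

conn = pyodbc.connect("DSN=Databricks", autocommit=True)   # placeholder DSN
cursor = conn.cursor()

def cancel_after(cur, seconds):
    """Cancel the statement running on `cur` after a delay (maps to ODBC SQLCancel)."""
    time.sleep(seconds)
    cur.cancel()

# Fire the cancel from a second thread while the main thread blocks in execute().
threading.Thread(target=cancel_after, args=(cursor, 5), daemon=True).start()

try:
    cursor.execute("SELECT count(*) FROM some_large_table")   # placeholder long query
    print(cursor.fetchone())
except pyodbc.Error as exc:
    print(f"Statement cancelled or failed: {exc}")
```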
ckwan48
by New Contributor III
  • 8072 Views
  • 5 replies
  • 10 kudos

Trying to connect to Databricks from DBeaver and getting error message

I am trying to connect to Databricks from DBeaver and am getting this error message: [Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: javax.net.ssl.SSLHandshakeException: PKIX path building fa...

Latest Reply
Hardy
New Contributor III
  • 10 kudos

I have the same issue after upgrading the cluster to DBR 12.2. It was working fine with DBR 10.4.

4 More Replies
Yahya24
by New Contributor III
  • 3612 Views
  • 2 replies
  • 1 kudos

Resolved! API Query

Hello, I created a SQL warehouse (cluster size = 2X-Small) and wanted to use it to execute a query using the SQL query API:
- url: https://databricks-host/api/2.0/preview/sql/statements
- params = {'warehouse_id': 'warehouse_id', 'statement': 'SELECT ...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@Yahya24 Can you please remove "preview" from the path? These endpoints are no longer in preview: "/api/2.0/sql/statements/". You should then see a JSON response. Can you also check the drop-down menu and change it to JSON? Sometimes it may be set to text, but the usual respo...

1 More Replies
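For reference, a hedged sketch of calling the non-preview Statement Execution API path quoted in the reply; the host, token and warehouse_id are placeholders.

```python
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder workspace URL
token = "dapiXXXXXXXXXXXXXXXX"                                  # placeholder access token

response = requests.post(
    f"{host}/api/2.0/sql/statements/",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "abcdef1234567890",   # placeholder SQL warehouse id
        "statement": "SELECT 1 AS one",
        "wait_timeout": "30s",                # wait up to 30 seconds for the result
    },
)
response.raise_for_status()
print(response.json())   # statement_id, status and, when finished, the result
```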
gmiguel
by Contributor
  • 5080 Views
  • 2 replies
  • 2 kudos

Resolved! Does "Merge Into" skip files when reading target table to find files to be touched?

I've been doing some testing with Partitions vs Z-Ordering to optimize the merge process. As the documentation says, tables smaller than 1TB should not be partitioned and can benefit from the Z-Ordering process to optimize the reading process. Analyzin...

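Related to the file-skipping question above, one commonly suggested pattern is to add an explicit predicate on the partition or Z-ORDER column to the MERGE condition so the target scan can be pruned. A hedged sketch, with placeholder table and column names:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # the ambient `spark` in a Databricks notebook

# An explicit predicate on the partition / Z-ORDER column gives the optimizer a
# filter it can use to skip target files instead of scanning the whole table.
spark.sql("""
    MERGE INTO sales AS t
    USING updates AS s
    ON  t.event_date = s.event_date            -- partition / Z-ORDER column
    AND t.event_date >= '2024-01-01'           -- narrows which target files are read
    AND t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```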
Mswedorske
by New Contributor II
  • 2165 Views
  • 1 reply
  • 2 kudos

Resolved! Historical Reporting

How do you handle reporting monthly trends within a data lakehouse? Can this be done with time travel to get the table state at the end of each month, or is it better practice to build a data warehouse with SCD types? We are new to Databricks and lak...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 2 kudos

@Mswedorske IMO it would be better to use SCD. When you run VACUUM on a table, it removes the data files that are necessary for Time Travel, so relying on Time Travel is not the best choice.

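For anyone new to SCD in a lakehouse, a hedged sketch of a minimal SCD Type 2 refresh with Delta MERGE follows; the dim_customer/stg_customer tables, the tracked address column and the column order are all placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # the ambient `spark` in a Databricks notebook

# Step 1: close out the current row for customers whose tracked attribute changed.
spark.sql("""
    MERGE INTO dim_customer AS t
    USING stg_customer AS s
    ON t.customer_id = s.customer_id AND t.is_current = true
    WHEN MATCHED AND t.address <> s.address THEN
      UPDATE SET is_current = false, end_date = current_date()
""")

# Step 2: insert a new current row for new customers and for those just closed out.
# Column order must match the placeholder dim_customer schema:
# (customer_id, address, start_date, end_date, is_current).
spark.sql("""
    INSERT INTO dim_customer
    SELECT s.customer_id,
           s.address,
           current_date()     AS start_date,
           CAST(NULL AS DATE) AS end_date,
           true               AS is_current
    FROM stg_customer AS s
    LEFT JOIN dim_customer AS t
      ON t.customer_id = s.customer_id AND t.is_current = true
    WHERE t.customer_id IS NULL OR t.address <> s.address
""")
```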
BamBam
by New Contributor III
  • 2835 Views
  • 1 reply
  • 0 kudos

Where are driver logs for SQL Pro Warehouse?

In an All-Purpose Cluster, it is pretty easy to get at the driver logs. Where do I find the driver logs for a SQL Pro Warehouse? The reason I ask is that sometimes in the SQL Editor we get generic error messages like "Task failed while writing row...

Warehousing & Analytics
SQLProWarehouse
Kaz
by New Contributor II
  • 4351 Views
  • 4 replies
  • 1 kudos

Automatically importing packages in notebooks

Within our team, there are certain (custom) python packages we always use and import in the same way. When starting a new notebook or analysis, we have to import these packages every time. Is it possible to automatically make these imports available ...

Latest Reply
Tharun-Kumar
Databricks Employee
  • 1 kudos

@Kaz You can install these libraries using the Libraries section of the cluster configuration under Compute. All of the libraries listed there will be installed whenever the cluster is spun up.

3 More Replies
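Installing the packages as cluster libraries makes them available, but the import statements themselves still have to run in each notebook. A common complementary pattern is a shared notebook of standard imports pulled in with %run; the notebook path and package list below are placeholders.

```python
# --- contents of a shared notebook, e.g. /Shared/common_imports (placeholder path) ---
import numpy as np
import pandas as pd
from pyspark.sql import functions as F

# --- first cell of each analysis notebook ---
# %run must sit alone in its own cell; it runs the shared notebook inline so that
# np, pd and F (and anything else it defines) become available here without
# re-typing the imports.
#
# %run /Shared/common_imports
```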