Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Forum Posts

aswinvishnu
by New Contributor II
  • 582 Views
  • 3 replies
  • 1 kudos

Exporting table to GCS bucket using job

Hi all, Use case: I want to send the result of a query to a GCS bucket location in JSON format. Approach: From my Java-based application I create a job, and that job runs a notebook. The notebook will have something like this: query = "SELECT * FR...
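A minimal sketch of the notebook side, assuming the cluster (or its service account) already has write access to the bucket; the bucket and prefix names here are hypothetical:

```python
def gcs_output_path(bucket: str, prefix: str) -> str:
    # Build the gs:// URI the DataFrame writer will target.
    return f"gs://{bucket}/{prefix.strip('/')}"

def export_query_to_gcs(spark, query: str, bucket: str, prefix: str) -> None:
    # Run inside a Databricks notebook: execute the query and write the
    # result as line-delimited JSON part files under the GCS prefix.
    df = spark.sql(query)
    df.write.mode("overwrite").json(gcs_output_path(bucket, prefix))
```

The Java application then only needs to pass the query, bucket, and prefix as job parameters; the JSON writer emits one object per line, split across part files.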

Latest Reply
LorelaiSpence
New Contributor II
  • 1 kudos

Consider using GCS signed URLs or access tokens for secure access.

2 More Replies
Maverick1
by Valued Contributor II
  • 4692 Views
  • 6 replies
  • 6 kudos

How to infer the online feature store table via an MLflow registered model deployed to a SageMaker endpoint?

Can an MLflow registered model automatically infer the online feature store table, if that model is trained and logged via a Databricks Feature Store table and the table is pushed to an online feature store (like AWS RDS)?

Latest Reply
Janifer45
New Contributor II
  • 6 kudos

Thanks for this

5 More Replies
BrianLind
by New Contributor II
  • 245 Views
  • 2 replies
  • 0 kudos

Need access to browse onprem SQL data

Our BI team has started using Databricks and would like to browse our local (on-prem) SQL database servers from within Databricks. I'm not sure if that's even possible. So far, I've set up Databricks Secure Cluster Connectivity (SCC), created a privat...

Latest Reply
Renu_
Contributor
  • 0 kudos

Hi, based on what you’ve shared, it seems you’ve already completed many of the necessary steps. Just a few things to double-check as you move forward: SQL Warehouses used for BI tools need to run in Pro mode, not serverless, since only Pro or Classic ...

1 More Replies
muano_makhokha
by New Contributor II
  • 284 Views
  • 1 replies
  • 1 kudos

Resolved! Row filtering and column masking not working even when the requirements are met

I have been trying to use the row filtering and column masking feature to redact columns and filter rows based on the group a user is in. I have all the necessary permissions and I've used clusters with version 15.4 and higher. When I run the fo...
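For reference, the DDL involved can be sketched like this, issued from a notebook against a Unity Catalog table on DBR 15.4 LTS or newer; the table, column, and function names below are hypothetical:

```python
# Hypothetical objects: main.sales.orders, main.sales.us_filter (a SQL UDF
# returning BOOLEAN), main.sales.ssn_mask (a SQL UDF returning the masked value).
ROW_FILTER_SQL = (
    "ALTER TABLE main.sales.orders "
    "SET ROW FILTER main.sales.us_filter ON (region)"
)
COLUMN_MASK_SQL = (
    "ALTER TABLE main.sales.orders "
    "ALTER COLUMN ssn SET MASK main.sales.ssn_mask"
)

def apply_policies(spark) -> None:
    # Both statements require a Unity Catalog managed table and a
    # supported DBR version; older runtimes reject them.
    spark.sql(ROW_FILTER_SQL)
    spark.sql(COLUMN_MASK_SQL)
```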

Latest Reply
BigRoux
Databricks Employee
  • 1 kudos

Here are some things to consider/try: the UnityCatalogServiceException error you are encountering, ABORTED.UC_DBR_TRUST_VERSION_TOO_OLD, generally indicates that the Databricks Runtime (DBR) version you are using no longer supports the operation, s...

meret
by New Contributor II
  • 164 Views
  • 1 replies
  • 0 kudos

Column Default Propagation

Hi, today I found a somewhat strange behavior when it comes to default values in columns. Apparently, column defaults are propagated to a new table when you select the column without any operation on it. This is a bit unexpected for me. Here's a short...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

The behavior you described regarding the propagation of default column values is expected and is tied to the specific usage of the delta.feature.allowColumnDefaults table property in Delta Lake. Here’s an explanation: Default Propagation Without Tra...

Dinesh6351
by New Contributor II
  • 217 Views
  • 2 replies
  • 3 kudos
Latest Reply
amos
New Contributor III
  • 3 kudos

This error occurs when your Azure account exceeds the regional quota of available cores, preventing the cluster from being created in Databricks. It means the cluster tried to use more resources than your region allows. 1. Review the configuration of...

1 More Replies
gsouza
by New Contributor II
  • 762 Views
  • 2 replies
  • 3 kudos

Databricks asset bundle occasionally duplicating jobs

Since last year, we have adopted Databricks Asset Bundles for deploying our workflows to the production and staging environments. The tool has proven to be quite effective, and we currently use Azure DevOps Pipelines to automate bundle deployment, tr...

Latest Reply
isabelgontijo
New Contributor II
  • 3 kudos

Hi @NandiniN, thank you for your reply, but we would like to know if there is any way for us to monitor the progress of the solution to this problem. It is a recurring error and it is very disruptive when it occurs. Best regards.

1 More Replies
jonhieb
by New Contributor III
  • 1095 Views
  • 6 replies
  • 3 kudos

Resolved! [Databricks Asset Bundles] Triggering Delta Live Tables

I would like to know how to schedule a DLT pipeline using DABs. I'm trying to trigger a Delta Live Tables pipeline using Databricks Asset Bundles. Below is my YAML code:

resources:
  pipelines:
    data_quality_pipelines:
      name: data_quality_pipeline...

Latest Reply
Walter_C
Databricks Employee
  • 3 kudos

As of now, Databricks Asset Bundles do not support direct scheduling of DLT pipelines using cron expressions within the bundle configuration. Instead, you can achieve scheduling by creating a Databricks job that triggers the DLT pipeline and then sch...
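A sketch of that workaround in bundle YAML, assuming the pipeline resource from the question; the job and task names are hypothetical, and the cron expression here fires daily at 06:00:

```yaml
resources:
  jobs:
    trigger_data_quality:
      name: trigger-data-quality
      schedule:
        quartz_cron_expression: "0 0 6 * * ?"
        timezone_id: UTC
      tasks:
        - task_key: run_pipeline
          pipeline_task:
            pipeline_id: ${resources.pipelines.data_quality_pipelines.id}
```

The job carries the schedule; the pipeline itself stays unscheduled and is started by the pipeline_task.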

5 More Replies
LasseL
by New Contributor III
  • 1499 Views
  • 4 replies
  • 1 kudos

How to use change data feed when schema is changing between delta table versions?

How do I use the change data feed when the delta table schema changes between table versions? I tried to read the change data feed in parts (in the code snippet I read version 1372, because the 1371 and 1373 schema versions are different), but I'm getting the error: Unsupporte...

Latest Reply
LRALVA
Honored Contributor
  • 1 kudos

@LasseL When you read from the change data feed in batch mode, Delta Lake always uses a single schema: by default, it uses the latest table version’s schema, even if you’re only reading an older version. On Delta Runtime ≥ 12.2 LTS with column mapping e...
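One way to act on that advice is to split the read into sub-ranges that never cross a schema-change version, then issue one batch CDF read per sub-range. A small pure-Python helper (a sketch; the caller must already know at which versions the schema changed) might look like:

```python
def split_at_schema_changes(start, end, change_versions):
    """Split the inclusive version range [start, end] into sub-ranges,
    opening a new sub-range at every version where the schema changed,
    so each sub-range can be read with readChangeFeed under one schema."""
    boundaries = sorted(v for v in set(change_versions) if start < v <= end)
    ranges, lo = [], start
    for b in boundaries:
        ranges.append((lo, b - 1))
        lo = b
    ranges.append((lo, end))
    return ranges
```

Each resulting (lo, hi) pair then maps to one read with .option("startingVersion", lo).option("endingVersion", hi), which sidesteps the cross-schema error described above.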

3 More Replies
Yunky007
by New Contributor
  • 609 Views
  • 3 replies
  • 0 kudos

ETL pipeline

I have an ETL pipeline in Workflows which I am using to create a materialized view. I want to schedule the pipeline for 10 hours only, starting from 10 am. How can I schedule that? I can only see an hourly schedule or cron syntax. I want the compute ...

Latest Reply
KaelaniBraster
New Contributor II
  • 0 kudos

Use cron syntax with a stop condition after 10 hours runtime.
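One way to realize that, assuming the job is started by a quartz cron at 10:00 (e.g. "0 0 10 * * ?"): add a guard so each run does nothing once the window has passed, letting the compute auto-terminate. A minimal sketch with hypothetical window constants:

```python
from datetime import datetime

START_HOUR = 10     # window opens at 10:00
WINDOW_HOURS = 10   # ...and closes at 20:00

def within_window(now: datetime) -> bool:
    # True while the current time falls inside the 10-hour window.
    return START_HOUR <= now.hour < START_HOUR + WINDOW_HOURS
```

The scheduled task checks within_window(datetime.now()) before refreshing the materialized view and exits immediately outside the window.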

2 More Replies
nito
by New Contributor
  • 115 Views
  • 0 replies
  • 0 kudos

New remote (dbfs) caching python library

I had some problems getting much speedup at all from the Spark or DB disk cache, which I think is essential when developing PySpark code iteratively in notebooks. So I developed a handy caching library for this which has recently been open-sourced; see h...

Prashant2
by New Contributor II
  • 408 Views
  • 4 replies
  • 0 kudos

import pymssql fails on DLT Serverless

I have a Delta Live Tables pipeline which works fine on a normal DLT job cluster. But as soon as we switch it to use serverless compute, it fails. The failure happens at "import pymssql" after doing pip install pymssql as the first statement of the source code...

Latest Reply
eniwoke
New Contributor III
  • 0 kudos

Hi @Prashant2, I am curious to know how you installed the library in your notebook. Did you use `%pip install pymssql`? If so, could you try using a shell command instead, like `!pip install pymssql`? I’ve had success using `!pip install` in serverless compute e...

3 More Replies
Takuya-Omi
by Valued Contributor III
  • 749 Views
  • 3 replies
  • 0 kudos

Limitations When Using Instance Profiles to Connect to Kinesis

I encountered an issue where I couldn’t successfully connect to Kinesis Data Streams using instance profile authentication while working with Delta Live Tables (DLT) in a Unity Catalog (UC)-enabled environment. According to the documentation, instance...

Latest Reply
am1go
New Contributor II
  • 0 kudos

I'm in the same boat. I've tried every workaround possible and nothing works for me. Databricks is pushing Unity Catalog hard, so I find it unsettling that there is no solution for this issue other than reverting back to using the Hive metastore.

2 More Replies
israelst
by New Contributor III
  • 3726 Views
  • 8 replies
  • 5 kudos

DLT can't authenticate with kinesis using instance profile

When running my notebook using personal compute with an instance profile, I am indeed able to readStream from Kinesis. But adding it as a DLT pipeline with UC, while specifying the same instance profile in the DLT pipeline settings, causes a "MissingAuthenticatio...

Data Engineering
Delta Live Tables
Unity Catalog
Latest Reply
am1go
New Contributor II
  • 5 kudos

Has anyone figured it out? Tried all solutions posted in this thread, nothing works for me...

7 More Replies
Shaurya_greyhou
by New Contributor
  • 385 Views
  • 3 replies
  • 0 kudos

Inquiry on How to Return Visualizations, Images, or URLs from Databricks Genie API

Question 1) I am currently working on integrating Databricks Genie with Microsoft Teams and am looking for guidance on how to return visualizations, images, or URLs in a format that can be rendered in Teams. Specifically, I am trying to figure out how...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

For Q1: You can share a Databricks dashboard's URL to allow rendering or interaction in Microsoft Teams. Create a dashboard in Databricks, ensuring it contains the required visualizations or interaction widgets. Publish the dashboard and ensure that...

2 More Replies
