Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Jerry01
by New Contributor III
  • 11321 Views
  • 3 replies
  • 2 kudos

Is the ABAC feature enabled?

Can anyone please share an example of how it works in terms of access controls?

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Naveena G, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

2 More Replies
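
For readers who land here looking for a concrete example: Unity Catalog row filters are one attribute-driven control in this space and can be scripted from a notebook. A minimal sketch, assuming a hypothetical main.default.sales table with a region column and an admins account group:

    # Sketch: attribute-driven row-level access with Unity Catalog row filters.
    # Catalog, schema, table, column, and group names are all hypothetical.
    spark.sql("""
        CREATE OR REPLACE FUNCTION main.default.us_only(region STRING)
        RETURN IF(is_account_group_member('admins'), TRUE, region = 'US')
    """)
    # Non-admins now only see rows where region = 'US'.
    spark.sql("ALTER TABLE main.default.sales "
              "SET ROW FILTER main.default.us_only ON (region)")

Note this illustrates row filters rather than the full ABAC feature the question asks about; check your workspace's release notes before relying on ABAC itself.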
santhoshKumarV
by New Contributor II
  • 1851 Views
  • 2 replies
  • 2 kudos

Code coverage on Databricks notebook

I have a scenario where my application code (a Scala package) and notebook code [Scala] under the /resources folder are being maintained. I am trying to find the easiest way to perform code coverage on my notebooks; does Databricks provide any option for it...

Latest Reply
santhoshKumarV
New Contributor II
  • 2 kudos

An important thing I missed adding in the post: we maintain notebook code as .scala files under resources and keep it in GitHub. Files (.scala) from resources get deployed as notebooks using a GitHub Action. With my approach of moving under a package, I will ...

1 More Replies
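
The workaround this thread converges on, moving logic out of notebooks into a plain package so a standard coverage tool can measure it, looks roughly like the sketch below. It uses Python and pytest-cov for brevity (module and test names are hypothetical); the Scala equivalent would typically use sbt with the scoverage plugin:

    # mypkg/transforms.py -- hypothetical module extracted from a notebook
    def add_vat(amount: float, rate: float = 0.2) -> float:
        # Pure function: easy to unit-test and measure for coverage.
        return amount * (1 + rate)

    # tests/test_transforms.py -- run with: pytest --cov=mypkg tests/
    from mypkg.transforms import add_vat

    def test_add_vat():
        assert add_vat(100.0) == 120.0

The notebook then shrinks to thin glue code that imports the package, which is what makes the coverage numbers meaningful.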
yvishal519
by Contributor
  • 4234 Views
  • 8 replies
  • 2 kudos

Handling Audit Columns and SCD Type 1 in Databricks DLT Pipeline with Unity Catalog: Circular Dependency

I am working on a Delta Live Tables (DLT) pipeline with Unity Catalog, where we are reading data from Azure Data Lake Storage (ADLS) and creating a table in the silver layer with Slowly Changing Dimensions (SCD) Type 1 enabled. In addition, we are ad...

Latest Reply
yvishal519
Contributor
  • 2 kudos

@NandiniN  @RBlum I haven’t found an ideal solution for handling audit columns effectively in Databricks Delta Live Tables (DLT) when implementing SCD Type 1. It seems there’s no straightforward way to incorporate these columns into the apply_changes...

7 More Replies
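
For reference, the apply_changes API under discussion looks like the sketch below for SCD Type 1. The source, target, key, and audit columns are hypothetical; the workaround discussed in the thread is to add audit columns in a view upstream of apply_changes, since DLT manages the target table's schema:

    import dlt
    from pyspark.sql import functions as F

    @dlt.view
    def bronze_with_audit():
        # Hypothetical bronze source; audit column added before apply_changes.
        return (spark.readStream.table("bronze_customers")
                .withColumn("etl_updated_at", F.current_timestamp()))

    dlt.create_streaming_table("silver_customers")

    dlt.apply_changes(
        target="silver_customers",
        source="bronze_with_audit",
        keys=["customer_id"],
        sequence_by=F.col("ingest_ts"),  # hypothetical ordering column
        stored_as_scd_type=1,            # SCD Type 1: keep only the latest row per key
    )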
Deloitte_DS
by New Contributor II
  • 8725 Views
  • 5 replies
  • 1 kudos

Resolved! Unable to install poppler-utils

Hi, I'm trying to install the system-level package "poppler-utils" on the cluster. I added the following line to the init.sh script: sudo apt-get -f -y install poppler-utils. I got the following error: PDFInfoNotInstalledError: Unable to get page count. Is ...

Latest Reply
Raghavan93513
Databricks Employee
  • 1 kudos

Hi Team, if you use a single-user cluster with the init script below, it will work:
sudo rm -r /var/lib/apt/lists/*
sudo apt clean && sudo apt update --fix-missing -y
sudo apt-get install poppler-utils tesseract-ocr -y
But if you are using a shared...

4 More Replies
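
A quick way to confirm the init script worked, assuming the original PDFInfoNotInstalledError came from the pdf2image library (the PDF path below is hypothetical):

    # Raises PDFInfoNotInstalledError if poppler-utils is still missing on the node.
    from pdf2image import convert_from_path

    pages = convert_from_path("/dbfs/tmp/sample.pdf", dpi=72)
    print(f"Converted {len(pages)} page(s)")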
vvk
by New Contributor II
  • 6082 Views
  • 2 replies
  • 0 kudos

Unable to upload a wheel file in Azure DevOps pipeline

Hi, I am trying to upload a wheel file to a Databricks workspace using an Azure DevOps release pipeline, to use it on an interactive cluster. I tried the "databricks workspace import" command, but it looks like it does not support .whl files. Hence, I tried to u...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

Hi @vvk - The HTTP 403 error typically indicates a permissions issue. Ensure that the service principal (SP) has the necessary permissions to perform the fs cp operation on the specified path. Verify that the path specified in the fs cp command is correct and that the v...

1 More Replies
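
For reference, a hedged sketch of the upload step from a pipeline: workspace import targets notebooks and source files, while fs cp copies arbitrary binaries such as wheels. The paths are hypothetical, and authentication is assumed to come from DATABRICKS_HOST/DATABRICKS_TOKEN in the pipeline environment:

    import subprocess

    # Copy the built wheel to a DBFS path the interactive cluster can install from.
    subprocess.run(
        ["databricks", "fs", "cp",
         "dist/my_package-0.1.0-py3-none-any.whl",
         "dbfs:/FileStore/wheels/my_package-0.1.0-py3-none-any.whl"],
        check=True,
    )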
stvayers
by New Contributor
  • 6118 Views
  • 1 reply
  • 0 kudos

How to mount AWS EFS via NFS on a Databricks Cluster

I'm trying to read ~500 million small JSON files into a Spark Auto Loader pipeline, and I seem to be slowed down massively by S3 request limits, so I want to explore using AWS EFS instead. I found this blog post: https://www.databricks.com/blog/20...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

Hi @stvayers, please refer to this doc: https://docs.databricks.com/api/workspace/clusters/create. It has instructions on how to mount EFS.

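
The relevant part of that API is the cluster_mount_infos block. A minimal sketch of the create-cluster call, with the workspace URL, token, EFS DNS name, and instance types as placeholders (field availability can vary by cloud and runtime):

    import requests

    payload = {
        "cluster_name": "efs-cluster",
        "spark_version": "15.4.x-scala2.12",   # hypothetical runtime
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
        "cluster_mount_infos": [{
            "network_filesystem_info": {
                "server_address": "fs-0123456789abcdef0.efs.us-east-1.amazonaws.com",
                "mount_options": "nfsvers=4.1",
            },
            "remote_mount_dir_path": "/",
            "local_mount_dir_path": "/efs",    # files appear under /efs on the cluster
        }],
    }
    resp = requests.post(
        "https://<workspace-url>/api/2.1/clusters/create",
        headers={"Authorization": "Bearer <token>"},
        json=payload,
    )
    resp.raise_for_status()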
Bepposbeste1993
by New Contributor III
  • 1896 Views
  • 4 replies
  • 0 kudos

Resolved! select 1 query not finishing

Hello, I have an issue where even a query like "select 1" does not finish; the SQL warehouse runs indefinitely. I have no idea where to look for the problem because I can't see any error in the Spark UI. What is interesting is that all-purpose clusters (...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Bepposbeste1993, Do you have the case ID raised for this issue? 

3 More Replies
cmilligan
by Contributor II
  • 4203 Views
  • 4 replies
  • 0 kudos

Unhelpful error when trying to insert overwrite into a table

I have a query that I'm trying to insert overwrite into a table. In an effort to speed up the query, I added a range join hint. After adding it, I started getting the error below. I can get around this, though, by creating a temporary view of the ...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Could you share your code and the full error stack trace please? Check the driver logs for the full stack trace.

3 More Replies
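
For context, the workaround described (putting the hint in a temporary view and inserting from the view) looks roughly like this; the table names, columns, and bin size are hypothetical:

    # Put the RANGE_JOIN hint in a temp view first...
    spark.sql("""
        CREATE OR REPLACE TEMP VIEW joined AS
        SELECT /*+ RANGE_JOIN(r, 3600) */ e.*, r.label
        FROM events e
        JOIN ranges r
          ON e.ts BETWEEN r.start_ts AND r.end_ts
    """)
    # ...then run the insert overwrite from the view, which avoids the error.
    spark.sql("INSERT OVERWRITE TABLE target_table SELECT * FROM joined")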
pranitha
by New Contributor II
  • 815 Views
  • 3 replies
  • 0 kudos
Latest Reply
MadhuB
Valued Contributor
  • 0 kudos

Hi @pranitha, use this query to get the cluster details along with cost info as well:
WITH hourly_metrics AS (
  SELECT
    date_trunc('hour', usage_start_time) AS hour,
    usage_metadata.cluster_id,
    sku_name,
    MAX(usage_quantity) AS max_usage,
    ...

2 More Replies
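
A shorter sketch in the same spirit, reading the system.billing.usage table directly (access to system tables and these column names are assumed from the current docs):

    # DBUs consumed per cluster over the last 7 days.
    spark.sql("""
        SELECT
          usage_metadata.cluster_id,
          sku_name,
          SUM(usage_quantity) AS total_dbus
        FROM system.billing.usage
        WHERE usage_date >= current_date() - INTERVAL 7 DAYS
          AND usage_metadata.cluster_id IS NOT NULL
        GROUP BY usage_metadata.cluster_id, sku_name
        ORDER BY total_dbus DESC
    """).show(truncate=False)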
abelian-grape
by New Contributor III
  • 992 Views
  • 1 reply
  • 0 kudos

Near-real-time processing with CDC from Snowflake to Databricks

Hi, I would like to configure near-real-time streaming on Databricks to process data as soon as new data finishes processing on Snowflake, e.g. with DLT pipelines and Auto Loader. Which option would be better for this setup? Option A) Export the Snowpark...

Latest Reply
saurabh18cs
Honored Contributor II
  • 0 kudos

It is a trade-off between latency on one side and complexity and cost on the other; you have to choose for yourself. For me, option A sounds reasonable.

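
For option A, the Databricks side would be a standard Auto Loader stream over the export location. A minimal sketch with hypothetical paths, assuming Parquet exports landing in ADLS:

    # Incrementally pick up files exported from Snowflake as they land.
    (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "parquet")
        .option("cloudFiles.schemaLocation", "/Volumes/main/default/chk/schema")
        .load("abfss://exports@myaccount.dfs.core.windows.net/snowflake_cdc/")
        .writeStream
        .option("checkpointLocation", "/Volumes/main/default/chk/stream")
        .trigger(availableNow=True)   # or a processingTime trigger for lower latency
        .toTable("main.default.snowflake_mirror"))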
Sans
by New Contributor III
  • 5353 Views
  • 9 replies
  • 3 kudos

Unable to create new compute in Databricks Community Edition

Hi Team, I am unable to create compute in Databricks Community Edition due to the error below. Please advise. Bootstrap Timeout: Node daemon ping timeout in 780000 ms for instance i-0ab6798b2c762fb25 @ 10.172.246.217. Please check network connectivity between the ...

Latest Reply
drag7ter
Contributor
  • 3 kudos

Same here: I get this error regularly in an eu-west-1 workspace. So many issues. Did Databricks try to check this, as it could be a bug? No response so far.

8 More Replies
jyothib
by New Contributor II
  • 2299 Views
  • 2 replies
  • 3 kudos

Resolved! System tables latency

What is the latency of system tables? #unitycatalog

Latest Reply
raphaelblg
Databricks Employee
  • 3 kudos

@jyothib at the current moment, system tables are still in the Public Preview stage (more details at: https://docs.databricks.com/en/admin/system-tables/index.html). We don't offer data freshness SLOs for system tables at this point and there are no pla...

1 More Replies
Kanna
by New Contributor II
  • 1659 Views
  • 1 reply
  • 4 kudos

Resolved! Autoloader clarification

Hi team, good day! I would like to know how we can perform an incremental load using Auto Loader. I am uploading one file to DBFS and writing it into a table. When I upload a similar file to the same directory, it does not perform an incremental load; i...

Latest Reply
boitumelodikoko
Valued Contributor
  • 4 kudos

Hi @Kanna, good day! Based on the issue you're encountering, I believe the problem stems from missing deduplication or upsert logic in your current implementation. Here's an approach that combines the power of Databricks Auto Loader and Delta Lake to h...

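
The deduplication/upsert pattern that reply refers to typically pairs Auto Loader with a MERGE inside foreachBatch, so re-uploaded rows update the table instead of duplicating. A minimal sketch; the table name, key column, and paths are hypothetical:

    from delta.tables import DeltaTable

    def upsert_batch(batch_df, batch_id):
        # Keep one row per key within the batch, then MERGE into the target.
        deduped = batch_df.dropDuplicates(["id"])
        target = DeltaTable.forName(spark, "main.default.customers")
        (target.alias("t")
            .merge(deduped.alias("s"), "t.id = s.id")
            .whenMatchedUpdateAll()
            .whenNotMatchedInsertAll()
            .execute())

    (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("cloudFiles.schemaLocation", "/tmp/autoloader/schema")
        .load("dbfs:/uploads/customers/")
        .writeStream
        .foreachBatch(upsert_batch)
        .option("checkpointLocation", "/tmp/autoloader/chk")
        .start())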
harlemmuniz
by New Contributor II
  • 3932 Views
  • 8 replies
  • 1 kudos

Issue with Job Versioning with "Run Job" tasks and Deployments between environments

Hello, I am writing to bring to your attention an issue that we have encountered while working with Databricks, and to seek your assistance in resolving it. When running a workflow job with the "Run Job" task and clicking "View YAML/JSON," we have ob...

Latest Reply
saurabh18cs
Honored Contributor II
  • 1 kudos

Hi, sorry if I don't understand your use case: are you trying to start/stop a Databricks job via Terraform? Is that why you want to hardcode the job ID?

7 More Replies
Kumar4567
by New Contributor II
  • 7252 Views
  • 4 replies
  • 0 kudos

Disable downloading files for a specific group of users?

I see we can disable/enable the download button for the entire workspace using the "download button for notebook results" setting. Is there a way to disable/enable this just for a specific group of users?

Latest Reply
_anonymous
New Contributor II
  • 0 kudos

To future adventurers: the feature described by the responder to the OP does not exist.

3 More Replies
