Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

kenmyers-8451
by Contributor
  • 821 Views
  • 4 replies
  • 0 kudos

Bug with using parameters in a SQL task

I am trying to make a SQL task that runs on a serverless SQL warehouse and takes a variable that is used in the SQL file it runs, however I am getting errors because Databricks keeps formatting it first with ...

Latest Reply
asuvorkin
New Contributor
  • 0 kudos

I have been trying to use templates as well and got the following string:
LOCATION 's3://s3-company-data-' dev '-' 1122334455 '-eu-central-1/path_to_churn/main/'
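A minimal sketch of the parameter-binding alternative, assuming a task parameter named env and an illustrative table name; binding values with named markers avoids Databricks formatting them into the SQL text:

```python
# Sketch only: "env" and the table name are assumptions, not from the thread.
env = dbutils.widgets.get("env")  # task parameter surfaced as a widget

# Named parameter markers are bound, not string-formatted into the query.
df = spark.sql(
    "SELECT * FROM churn_main WHERE environment = :env",
    args={"env": env},
)
df.show()
```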

3 More Replies
data-grassroots
by New Contributor III
  • 187 Views
  • 4 replies
  • 1 kudos

ExcelWriter and local files

I have a couple of things going on here. First, to explain what I'm doing: I'm passing an array of objects into a function, each containing a dataframe. I want to write those dataframes to an excel workbook - one dataframe per worksheet. That part ...
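For context, a minimal sketch of the one-sheet-per-dataframe pattern with pandas; the path, engine, and frame names are assumptions:

```python
import pandas as pd

# Assumed input shape: sheet name -> pandas DataFrame.
frames = {
    "summary": pd.DataFrame({"a": [1, 2]}),
    "detail": pd.DataFrame({"b": [3, 4]}),
}

# One worksheet per dataframe; openpyxl is a common engine for .xlsx files.
with pd.ExcelWriter("/tmp/report.xlsx", engine="openpyxl") as writer:
    for name, df in frames.items():
        df.to_excel(writer, sheet_name=name, index=False)
```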

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Hello @data-grassroots! Were you able to resolve this? If any of the suggestions shared above helped, or if you found another solution, it would be great if you could mark it as the accepted solution or share your approach with the community.

3 More Replies
fjrodriguez
by New Contributor III
  • 86 Views
  • 5 replies
  • 4 kudos

Resolved! Self-dependency TumblingWindowTrigger in ADF

Hey! I would like to migrate one ADF batch ingestion which has a TumblingWindowTrigger on top of the pipeline that pretty much checks every 15 min if a file has landed; normally the files land on a daily basis, so it will be processed accordingly once in a d...

Latest Reply
fjrodriguez
New Contributor III
  • 4 kudos

Hi @szymon_dybczak, sounds reasonable, will propose this approach. Thanks

4 More Replies
Hritik_Moon
by New Contributor
  • 178 Views
  • 12 replies
  • 17 kudos

Accessing the Spark UI in Free Edition

Hello, is it possible to access the Spark UI in Free Edition? I want to check tasks and stages. Ultimately I am working on how to check for data skew.
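A quick notebook-side check for skew while the UI question is open, assuming df is the DataFrame under inspection:

```python
from pyspark.sql import functions as F

# Row counts per Spark partition; a few oversized partitions indicate skew.
(df.groupBy(F.spark_partition_id().alias("partition_id"))
   .count()
   .orderBy(F.desc("count"))
   .show(20))
```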

Latest Reply
Hritik_Moon
New Contributor
  • 17 kudos

@szymon_dybczak @BS_THE_ANALYST Is there a specific guide or flow to become a better Databricks data engineer? I am learning as topics come up. I'm finding it really difficult to maintain a flow and I lose track.

11 More Replies
perenehal
by New Contributor
  • 45 Views
  • 2 replies
  • 1 kudos

Unable to Verify Account – "User is not a member of this workspace" in Community Edition

I am encountering an issue during the verification process while trying to access my Databricks account. I received the verification code via email; however, when I attempt to verify it, I receive the following error message: "User is not a member of t...

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Hello @perenehal! The error indicates that you haven’t created an account in the Databricks Community Edition. Currently, creating new accounts for Community Edition is not possible. However, if you already had an existing account and are still seein...

1 More Replies
JanAkhi919
by New Contributor
  • 3163 Views
  • 2 replies
  • 1 kudos

How agentic AI is different from AI agents

How is agentic AI different from AI agents?

Latest Reply
CrossMLPvtLtd
New Contributor
  • 1 kudos

Hey there! Let me break this down simply. What's an AI agent? An AI agent is like a smart worker who follows instructions for one specific job. Think of a customer service chatbot that answers "Where's my order?" perfectly. But ask something complex li...

1 More Replies
bunny1174
by New Contributor
  • 34 Views
  • 1 replies
  • 1 kudos

Spark Streaming loading only 1k to 5k rows into Delta table

Hi Team, I have 4-5 million files in S3, around 1.5 GB of data in total, with 9 million records. When I try to use Auto Loader to read the data using readStream and write to a Delta table, the processing takes too much time; it is loading from 1k t...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @bunny1174 , you have 4-5 million files in S3 and their total size is 1.5 GB - this clearly indicates a small-files problem. You need to compact those files to a bigger size. There's no way your pipeline will be performant if you have that many files and the...
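For illustration, a sketch with assumed paths and table names; Auto Loader's cloudFiles.maxFilesPerTrigger defaults to 1000, which matches the ~1k-rows-per-batch symptom when each file is tiny, so raising it and draining with availableNow can help while the files are compacted upstream:

```python
# Paths, schema location, and target table are assumptions.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "parquet")
    .option("cloudFiles.maxFilesPerTrigger", "50000")  # default is 1000 files
    .option("cloudFiles.schemaLocation", "s3://bucket/_schemas/events/")
    .load("s3://bucket/raw/events/")
)

(stream.writeStream
    .option("checkpointLocation", "s3://bucket/_checkpoints/events/")
    .trigger(availableNow=True)  # drain the backlog in rate-limited batches
    .toTable("main.bronze.events"))
```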

ext07_rvoort
by New Contributor
  • 33 Views
  • 1 replies
  • 0 kudos

Databricks Asset Bundles: issue with python_file path in spark_python_task

Hi, I am trying to run a Python file which is stored in the src folder. However, I am getting the following error: Error: cannot update job: Invalid python file reference: src/get_git_credentials.py. Please visit the Databricks user guide for supported py...

Data Engineering
DAB
Databricks Asset Bundles
Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @ext07_rvoort , could you try specifying it in a relative way: python_file: ../src/get_git_credentials.py
In my setup I have the following directory structure:
root folder:
  - databricks.yml
  - src/
  - resources/job.yml
And then to refer in job.yml to ...

liu
by New Contributor III
  • 96 Views
  • 7 replies
  • 4 kudos

Configure AWS authentication for serverless Spark

I only have an AWS Access Key ID and Secret Access Key, and I want to use this information to access S3. However, the official documentation states that I need to set the AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID environment variables, but I cannot ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 4 kudos

Hi @liu , the proper way is to go to your cluster and set them up in the advanced section; that way they will be scoped at the cluster level. It's recommended to store the values themselves in a secret scope and expose them as environment variables: Use a secret in a Spa...
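A notebook-scoped alternative on classic compute, assuming a secret scope named aws with the two keys already created; scope, key, bucket, and path names are illustrative, and serverless restricts these confs:

```python
# Pull the keys from a secret scope so they never appear in plain text.
access_key = dbutils.secrets.get(scope="aws", key="access_key_id")
secret_key = dbutils.secrets.get(scope="aws", key="secret_access_key")

# Session-scoped S3A credentials for this cluster.
spark.conf.set("fs.s3a.access.key", access_key)
spark.conf.set("fs.s3a.secret.key", secret_key)

df = spark.read.parquet("s3a://my-bucket/some/path/")
```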

6 More Replies
touchyvivace
by New Contributor
  • 32 Views
  • 1 replies
  • 1 kudos

Is there another way to authenticate to Azure Databricks using MSI in Java?

Hi, I am trying to connect to Azure Databricks using MSI in Java, but the document https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-mi said: The Databricks SDK for Java has not yet implemented Azure managed identities authentication...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @touchyvivace , unfortunately not, the documentation is up to date. In the Java SDK, MSI has not been implemented yet. And here's an open issue on GitHub: [FEATURE] Add support for Azure Managed Identity authentication (system and user-assigned) · Iss...

IONA
by New Contributor III
  • 51 Views
  • 1 replies
  • 2 kudos

Dev/Pie/Prd and the same workspace

Hi all! I'm appealing to all you folk who are cleverer than I for some advice on Databricks DevOps. I was asked by my team leader to expand our singular environment to a DevOps-style dev/pie/prd system, potentially using DABs to promote code to higher e...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @IONA , I guess you can still use DABs to simulate different environments on a single workspace. In targets, define 3 different environments but with the same workspace for all of them. Then your intuition is good - it's ...

ckanzabedian
by New Contributor
  • 45 Views
  • 1 replies
  • 1 kudos

ServiceNow LakeFlow Connector - Using TABLE API only for tables and NOT views

The current Databricks ServiceNow Lakeflow connector relies on the ServiceNow REST Table API to capture data, and for some reason it is unable to list a user-defined view as a data source to be configured, even though ServiceNow user-defined views are a...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 1 kudos

Hi @ckanzabedian, have you checked out the documentation for the ServiceNow connector yet? https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-limits The link above is about the limits. I can't see a mention about ...

excavator-matt
by New Contributor III
  • 160 Views
  • 4 replies
  • 2 kudos

How do you use Databricks Lakeflow Declarative Pipelines on AWS DMS data?

Hi! I am trying to replicate an AWS RDS PostgreSQL database in Databricks. I have successfully managed to enable CDC using AWS DMS, which writes an initial load file and continuous CDC files in parquet. I have been trying to follow the official guide Repl...

Data Engineering
AUTO CDC
AWS DMS
declarative pipelines
LakeFlow
Latest Reply
mmayorga
Databricks Employee
  • 2 kudos

Hi @excavator-matt , yes, you are correct. CloudFiles/Auto Loader handles idempotency at the file level. From the guide's perspective, the view is created from the source files in the specified location. This view captures all files and their corresp...
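To make the shape concrete, a hypothetical declarative-pipeline sketch; the paths, key, and sequence column are assumptions, and DMS CDC output marks operations in an Op column:

```python
import dlt
from pyspark.sql import functions as F

# Source view over the DMS parquet drop (initial load + CDC files).
@dlt.view
def customers_cdc_raw():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "parquet")
        .load("s3://bucket/dms/public/customers/")  # assumed path
    )

dlt.create_streaming_table("customers")

# Fold CDC rows into the target; DMS marks deletes with Op = 'D'.
dlt.apply_changes(
    target="customers",
    source="customers_cdc_raw",
    keys=["id"],                        # assumed primary key
    sequence_by=F.col("transact_seq"),  # assumed DMS ordering column
    apply_as_deletes=F.expr("Op = 'D'"),
)
```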

3 More Replies
smoortema
by New Contributor III
  • 74 Views
  • 1 replies
  • 1 kudos

How to make a FOR loop, dynamic SQL, and variables work together

I am working on a testing notebook where the table that is tested can be given as a widget. I wanted to write it in SQL. The notebook does the following steps in a loop that should run 10 times: 1. Store the starting version of a delta table in a var...

Latest Reply
mmayorga
Databricks Employee
  • 1 kudos

Hi @smoortema , thank you for reaching out! You are very close to getting the "start_version"; you just need to include "INTO start_version" after the "EXECUTE IMMEDIATE". Here is the updated code:
BEGIN
  DECLARE sum INT DEFAULT 0;
  DECLARE start_ve...
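For comparison, an equivalent capture of the starting version from Python, assuming the widget is named table_name and the target is a Delta table:

```python
from delta.tables import DeltaTable

table_name = dbutils.widgets.get("table_name")  # assumed widget name

for i in range(10):
    # history(1) returns the most recent entry, i.e. the current version.
    start_version = DeltaTable.forName(spark, table_name).history(1).first()["version"]
    # ... run the test step, then compare with the new latest version ...
```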

