Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

blobbles78
by New Contributor II
  • 728 Views
  • 6 replies
  • 2 kudos

Resolved! SQL run on cluster creates table different to SQL Warehouse endpoint

I have a Personal cluster on version 15.4 LTS (includes Apache Spark 3.5.0, Scala 2.12) and a SQL Warehouse in a Databricks environment. When I use the following code to create a table in a catalog, it gives me different column types when run on the cl...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

It seems that, per the docs, this setting is currently true by default only in SQL warehouses; in clusters it is set to false: https://docs.databricks.com/en/sql/language-manual/sql-ref-ansi-compliance.html#ansi-compliance-in-databricks-runtime
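
A minimal sketch of the workaround that follows from this, assuming the type differences really do come from ANSI mode as the reply suggests (catalog, schema, and table names are hypothetical):

# SQL warehouses enable ANSI mode by default; DBR 15.4 clusters do not.
# Align the cluster with warehouse behavior before creating the table.
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("CREATE OR REPLACE TABLE my_catalog.my_schema.my_table AS SELECT CAST(1 AS DOUBLE) AS amount")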

5 More Replies
SureshRajG
by New Contributor
  • 263 Views
  • 1 reply
  • 0 kudos
Latest Reply
Stefan-Koch
Valued Contributor II
  • 0 kudos

Hi
You can achieve that with pandas. See the following example code:
%pip install openpyxl
import pandas as pd
file_location_xls = "path/to/excel/1.xlsx"
# read the sheet with name "Financials1" into a pandas dataframe
pdf = pd.read_excel(file_location...
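
A fuller sketch of the same pattern, assuming a sheet named "Financials1" and hypothetical paths and table names; pandas reads the sheet, then Spark persists it as a table:

# %pip install openpyxl   (run in its own notebook cell first; openpyxl is the .xlsx engine)
import pandas as pd

file_location_xls = "path/to/excel/1.xlsx"  # hypothetical path

# read the sheet named "Financials1" into a pandas DataFrame
pdf = pd.read_excel(file_location_xls, sheet_name="Financials1")

# hand off to Spark and save as a managed table (table name is hypothetical)
df = spark.createDataFrame(pdf)
df.write.mode("overwrite").saveAsTable("main.default.financials1")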

NhanNguyen
by Contributor III
  • 855 Views
  • 1 reply
  • 0 kudos

How to handle timeout exception in Error Handle Task

Dear team, I have a workflow like this: task_a, task_b, and handle_error. How do I handle any timeout exception from task_a and task_b, or any future task, and log it in the handle_error task at the end? Best regards, Jensen Nguyen

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

@NhanNguyen thanks for your question! Have you maybe considered the following? Define a global error task: add a handle_error task in the workflow that runs conditionally on task failure. Set failure conditions: in the UI or through JSON, configure the handle_erro...
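
A minimal sketch of what that JSON might look like, written here as a Python dict for the Jobs 2.1 API; task keys and notebook paths are hypothetical. timeout_seconds turns a hung task into a failure, and run_if lets handle_error fire when an upstream task fails:

job_settings = {
    "name": "workflow_with_error_handler",
    "tasks": [
        {
            "task_key": "task_a",
            "notebook_task": {"notebook_path": "/Workflows/task_a"},
            "timeout_seconds": 3600,  # fail after 1h instead of hanging forever
        },
        {
            "task_key": "task_b",
            "depends_on": [{"task_key": "task_a"}],
            "notebook_task": {"notebook_path": "/Workflows/task_b"},
            "timeout_seconds": 3600,
        },
        {
            "task_key": "handle_error",
            "depends_on": [{"task_key": "task_a"}, {"task_key": "task_b"}],
            "run_if": "AT_LEAST_ONE_FAILED",  # runs only when an upstream task fails
            "notebook_task": {"notebook_path": "/Workflows/handle_error"},
        },
    ],
}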

NaeemS
by New Contributor III
  • 901 Views
  • 1 reply
  • 0 kudos

Static Parameters in Feature Functions

Hi,I'm implementing a machine learning pipeline using feature stores and I'm running into a limitation with feature functions. I'd like to perform multiple calculations on my columns with some minor adjustments, but I need to pass a static parameter ...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @NaeemS thanks for your question! Yes, you can pass a static parameter to a feature function to control its behavior in Databricks Feature Store. This allows you to perform multiple calculations on your columns with minor adjustments without defin...
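
A minimal sketch of one way to do this, with hypothetical function, column, and table names: the static parameter is just another UDF argument, supplied as a literal column that input_bindings maps onto:

from pyspark.sql import functions as F
from databricks.feature_engineering import FeatureFunction

# a Unity Catalog SQL UDF with an extra scalar parameter
spark.sql("""
CREATE OR REPLACE FUNCTION main.default.scale_amount(amount DOUBLE, factor DOUBLE)
RETURNS DOUBLE
RETURN amount * factor
""")

# bind the static parameter by adding it to the training set as a literal column
df = spark.table("main.default.training").withColumn("factor", F.lit(1.5))

feature = FeatureFunction(
    udf_name="main.default.scale_amount",
    input_bindings={"amount": "amount", "factor": "factor"},
    output_name="scaled_amount",
)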

Kaviprakash
by New Contributor
  • 538 Views
  • 1 reply
  • 0 kudos

ORA-01830: date format picture ends before converting entire input string

Hi, recently we have been migrating our Hive metastore workloads to Unity Catalog. As part of this, we are running into the following error with the 15.4 DBR (UC) version, whereas it works with 10.4 DBR (Hive). The requirement is to read the data from a tab...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

@Kaviprakash thanks for your question! Is this perhaps specific to a cluster type? Shared vs. single user? If shared mode, can you please try restarting your cluster with the following Spark configuration: spark.connect.perserveOptionCasing true...

dh
by New Contributor
  • 1924 Views
  • 1 reply
  • 1 kudos

Data Lineage without Spark, but with Polars (and Delta Lake) instead

Some context: I am completely new to Databricks; I have heard good stuff, but also some things that worry me. One thing that worries me is the performance (and eventual cost) of running Spark with smaller (sub-1TB) datasets. However, one requirement fr...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

Hi @dh thanks for your question! I believe it's possible to run Polars with Delta Lake on Databricks, but automatic data lineage tracking is not native outside of Spark jobs. You would likely need to implement custom lineage tracking or integrate ext...
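
A minimal sketch of the Polars-with-Delta part (install the polars and deltalake packages first; the table location is hypothetical). The lineage tracking around it would be custom, as noted above:

import polars as pl

path = "/Volumes/main/default/landing/events"  # hypothetical Delta table location

df = pl.read_delta(path)                  # read a Delta table into Polars
clean = df.filter(pl.col("amount") > 0)   # single-node transform, no Spark involved
clean.write_delta(path + "_clean", mode="overwrite")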

cmilligan
by Contributor II
  • 4494 Views
  • 4 replies
  • 4 kudos

Dropdown for parameters in a job

I want to be able to denote the type of run from a predetermined list of values that a user can choose from when kicking off a run using different parameters. Our team does standardized job runs on a weekly cadence but can have timeframes that change...

Latest Reply
Leon_K
New Contributor II
  • 4 kudos

I'm looking into this too. Wondering if there is a way to make a dropdown for a job parameter.
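
Notebook widgets, at least, do give a predefined value list today; a minimal sketch with hypothetical run types:

# renders a dropdown in the notebook; when run as a job, a parameter with the
# same name overrides the selected default
dbutils.widgets.dropdown("run_type", "weekly", ["weekly", "monthly", "adhoc"])
run_type = dbutils.widgets.get("run_type")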

3 More Replies
Mithos
by New Contributor
  • 208 Views
  • 1 reply
  • 0 kudos

ZCube Tags not present in Databricks Delta Tables

The design doc for Liquid Clustering for Delta refers to Z-Cubes to enable incremental clustering in batches. This is the link: https://docs.google.com/document/d/1FWR3odjOw4v4-hjFy_hVaNdxHVs4WuK1asfB6M6XEMw/edit?pli=1&tab=t.0. It is also mentioned th...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @Mithos thanks for the question! This is the OSS version of LC applicable to OSS Delta. Databricks has a different implementation, so you won't be able to find it in a liquid table written by DBR. 

templier2
by New Contributor II
  • 1388 Views
  • 3 replies
  • 0 kudos

Log jobs stdout to an Azure Logs Analytics workspace

Hello, I have enabled sending cluster logs through mspnp/spark-monitoring, but I don't see stdout/stderr/log4j logs there. Is it supported?

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @templier2  If it works, it's not duct tape and chewing gum; it's a paperclip away from advanced engineering! You're right, I forgot this option is only there for AWS/S3. So yes, I think mount points are the current and only way.
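
For the mount-point route, a minimal sketch of the relevant cluster settings fragment (a Python dict mirroring the Clusters API); the mount path is hypothetical and must already point at Azure storage:

cluster_settings = {
    "cluster_log_conf": {
        # driver and worker logs (stdout/stderr/log4j) are delivered here periodically
        "dbfs": {"destination": "dbfs:/mnt/loganalytics/cluster-logs"}
    }
}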

2 More Replies
theanhdo
by New Contributor III
  • 1183 Views
  • 3 replies
  • 1 kudos

Run continuous job for a period of time

Hi there, I have a job where the trigger type is configured as Continuous. I want to run the Continuous job only for a period of time per day, e.g. 8AM - 5PM. I understand that we can achieve it by manually starting and cancelling the job in the UI, o...

Latest Reply
theanhdo
New Contributor III
  • 1 kudos

Hi @MuthuLakshmi, thank you for your answer. However, it doesn't address my question, so let me rephrase. In short, my question is how to configure a Continuous job to run for a period of time, e.g. from 8AM to 5PM every day, and ...
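
One possible approach, sketched under the assumption that pausing the job is acceptable: schedule two small scripts (e.g. at 8AM and 5PM) that flip the continuous job's pause_status through the Jobs 2.1 update endpoint. Host, token, and job ID below are hypothetical:

import requests

HOST = "https://<workspace-url>"   # hypothetical
TOKEN = "<pat-token>"              # hypothetical
JOB_ID = 123                       # hypothetical

def set_pause_status(status: str) -> None:
    # status: "UNPAUSED" at 8AM, "PAUSED" at 5PM
    resp = requests.post(
        f"{HOST}/api/2.1/jobs/update",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"job_id": JOB_ID, "new_settings": {"continuous": {"pause_status": status}}},
    )
    resp.raise_for_status()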

2 More Replies
jkb7
by New Contributor III
  • 818 Views
  • 6 replies
  • 2 kudos

Resolved! Keep history of task runs in Databricks Workflows while moving it from one job to another

We are using Databricks Asset Bundles (DAB) to orchestrate multiple workflow jobs, each containing multiple tasks. The execution schedule is managed at the job level, i.e., all tasks within a job start together. We often face the issue of rescheduling...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

You can submit it through https://docs.databricks.com/en/resources/ideas.html#ideas

5 More Replies
vickytscv
by New Contributor II
  • 451 Views
  • 3 replies
  • 0 kudos

Adobe query support from databricks

Hi Team, we are working with the Adobe tool for campaign metrics, which needs to pull data from AEP using the explode option. When we pass a query it takes a long time and performance is also very poor. Is there any better way to pull data from AEP? Please le...

Latest Reply
jodbx
Databricks Employee
  • 0 kudos

https://github.com/Adobe-Marketing-Cloud/aep-cloud-ml-ecosystem 

2 More Replies
Steve_Harrison
by New Contributor III
  • 1015 Views
  • 2 replies
  • 0 kudos

Invalid Path when getting Notebook Path

The undocumented feature to get a notebook path is great, but it does not actually return a valid path that can be used in Python, e.g.:
from pathlib import Path
print(Path(dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPat...

Latest Reply
Steve_Harrison
New Contributor III
  • 0 kudos

I actually think the major issue is that the above is undocumented and not supported. A supported and documented way of doing this would be much appreciated.
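
Until then, a common workaround is sketched below, assuming workspace files are enabled on the cluster: prefix the logical path with /Workspace to get something Python can actually open:

from pathlib import Path

ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
nb_path = ctx.notebookPath().get()              # e.g. /Users/me@example.com/my_notebook
fs_path = Path("/Workspace") / nb_path.lstrip("/")
print(fs_path.parent)                           # a real directory on the driver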

1 More Replies
Phani1
by Valued Contributor II
  • 7680 Views
  • 10 replies
  • 10 kudos

Delta Live Table name dynamically

Hi Team, can we pass the Delta Live Table name dynamically (from a configuration file, instead of hardcoding the table name)? We would like to build a metadata-driven pipeline.
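
This is commonly done with DLT metaprogramming; a minimal sketch assuming a hypothetical dict of table names and source paths (in practice loaded from a config file or the pipeline configuration):

import dlt

tables = {
    "bronze_orders": "/Volumes/main/raw/orders",  # hypothetical sources
    "bronze_items": "/Volumes/main/raw/items",
}

def define_table(name: str, path: str) -> None:
    @dlt.table(name=name)  # table name comes from config, not hardcoded
    def _source():
        return spark.read.format("json").load(path)

for name, path in tables.items():
    define_table(name, path)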

Latest Reply
bmhardy
New Contributor III
  • 10 kudos

Is this post referring to Direct Publishing Mode? As we are multi-tenanted, we have to have a separate schema per client, which currently means a single pipeline per client. This is not cost-effective at all, so we are very much reliant on DPM. I believ...

9 More Replies
maikl
by New Contributor III
  • 462 Views
  • 4 replies
  • 0 kudos

Resolved! DABs job name must start with a letter or underscore

Hi, in the UI I used the pipeline name 00101_source_bronze. I wanted to do the same in Databricks Asset Bundles, but when the configuration is refreshed against the Databricks workspace I see an error. I found that this issue may be connected to Terraform v...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

As mentioned above, this is a limitation directly with Terraform; because of this, our engineering team is limited in the actions that can be taken. You can find more information about this limitation in the Terraform documentation: https://developer.hashic...

3 More Replies
