Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Kaviprakash
by New Contributor
  • 257 Views
  • 1 reply
  • 0 kudos

ORA-01830: date format picture ends before converting entire input string

Hi, recently we have been migrating our Hive metastore workloads to Unity Catalog. As part of this, we are running into the following error on DBR 15.4 (UC), whereas it works on DBR 10.4 (Hive). The requirement is to read the data from a tab...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

@Kaviprakash thanks for your question! Is this perhaps specific to a cluster type? Shared vs. single user? If shared mode, can you please try restarting your cluster with the following Spark configuration: spark.connect.perserveOptionCasing true...

dh
by New Contributor
  • 627 Views
  • 1 reply
  • 1 kudos

Data Lineage without Spark, but with Polars (and Delta Lake) instead

Some context: I am completely new to Databricks; I have heard good things, but also some things that worry me. One thing that worries me is the performance (and eventual cost) of running Spark on smaller (sub-1TB) datasets. However, one requirement fr...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

Hi @dh thanks for your question! I believe it's possible to run Polars with Delta Lake on Databricks, but automatic data lineage tracking is not native outside of Spark jobs. You would likely need to implement custom lineage tracking or integrate ext...

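The custom lineage tracking the reply suggests can be sketched in plain Python: each Polars read/write step records which inputs produced which outputs. This is a minimal, hypothetical sketch (the `LineageLog` name and paths are invented); in practice you would persist the events to a Delta table or emit them via a standard such as OpenLineage.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEvent:
    """One processing step: which inputs produced which outputs."""
    step: str
    inputs: list
    outputs: list
    at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

class LineageLog:
    """In-memory recorder; swap the 'events' list for a Delta table or an
    OpenLineage client to make the history durable and queryable."""
    def __init__(self):
        self.events = []

    def record(self, step, inputs, outputs):
        event = LineageEvent(step, list(inputs), list(outputs))
        self.events.append(event)
        return event

# Usage: call record() around each Polars read/write step.
log = LineageLog()
log.record("orders_bronze", inputs=["/lake/raw/orders"], outputs=["/lake/bronze/orders"])
```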
cmilligan
by Contributor II
  • 3922 Views
  • 4 replies
  • 4 kudos

Dropdown for parameters in a job

I want to be able to denote the type of run from a predetermined list of values that a user can choose from when kicking off a run using different parameters. Our team does standardized job runs on a weekly cadence but can have timeframes that change...

Latest Reply
Leon_K
New Contributor II
  • 4 kudos

I'm looking into this too. I wonder if there's a way to make a dropdown for a job parameter.

3 More Replies
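Until the Jobs UI offers a picker, one common workaround is to treat the job parameter as free text and validate it inside the task against a fixed list, failing fast on anything unexpected. A hedged sketch; the parameter name `run_type` and the allowed values are invented for illustration:

```python
# Hypothetical allow-list; replace with your team's real run types.
ALLOWED_RUN_TYPES = ("weekly", "adhoc", "backfill")

def validate_run_type(value: str) -> str:
    """Reject anything outside the predefined list so a typo in the
    free-text job parameter fails the run immediately."""
    if value not in ALLOWED_RUN_TYPES:
        raise ValueError(
            f"run_type must be one of {ALLOWED_RUN_TYPES}, got {value!r}"
        )
    return value

# In a notebook task the value would come from the job parameter, e.g.
#   run_type = validate_run_type(dbutils.widgets.get("run_type"))
run_type = validate_run_type("weekly")
```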
Mithos
by New Contributor
  • 124 Views
  • 1 reply
  • 0 kudos

ZCube Tags not present in Databricks Delta Tables

The design doc for Liquid Clustering for Delta refers to Z-Cubes to enable incremental clustering in batches. This is the link: https://docs.google.com/document/d/1FWR3odjOw4v4-hjFy_hVaNdxHVs4WuK1asfB6M6XEMw/edit?pli=1&tab=t.0. It is also mentioned th...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @Mithos, thanks for the question! This refers to the OSS version of Liquid Clustering, applicable to OSS Delta. Databricks has a different implementation, so you won't find it in a liquid table written by DBR.

templier2
by New Contributor II
  • 184 Views
  • 3 replies
  • 0 kudos

Log jobs stdout to an Azure Logs Analytics workspace

Hello, I have enabled sending cluster logs through mspnp/spark-monitoring, but I don't see stdout/stderr/log4j logs there. Is this supported?

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @templier2, if it works, it's not duct tape and chewing gum; it's a paperclip away from advanced engineering! You're right, I forgot that option is only available for AWS/S3. So yes, I think mount points are currently the only way.

2 More Replies
theanhdo
by New Contributor III
  • 485 Views
  • 3 replies
  • 0 kudos

Run continuous job for a period of time

Hi there, I have a job where the trigger type is configured as Continuous. I want to run the continuous job only for a period of time each day, e.g. 8 AM - 5 PM. I understand that we can achieve this by manually starting and cancelling the job in the UI, o...

Latest Reply
theanhdo
New Contributor III
  • 0 kudos

Hi @MuthuLakshmi, thank you for your answer. However, it doesn't address my question, so let me rephrase. In short: how do I configure a continuous job to run for a period of time, e.g. from 8 AM to 5 PM every day, and ...

2 More Replies
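One hedged way to approximate this is to keep the job's trigger as Continuous and flip its pause status on a schedule: a small scheduled caller unpauses it at 8 AM and pauses it at 5 PM through the Jobs API. The sketch below assumes the 2.1 jobs/update endpoint accepts a `continuous.pause_status` setting; verify the endpoint and payload against the current Jobs API reference before relying on it.

```python
import json
import urllib.request
from datetime import time

def within_window(now: time, start: time = time(8, 0), end: time = time(17, 0)) -> bool:
    """True when 'now' falls inside the desired daily run window (8 AM - 5 PM)."""
    return start <= now < end

def set_continuous_pause(host: str, token: str, job_id: int, paused: bool) -> int:
    """Flip a continuous job between PAUSED and UNPAUSED via the Jobs API.
    Endpoint path and payload shape are assumptions to double-check."""
    body = json.dumps({
        "job_id": job_id,
        "new_settings": {
            "continuous": {"pause_status": "PAUSED" if paused else "UNPAUSED"}
        },
    }).encode()
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/update",
        data=body,
        method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # network call; not exercised here
        return resp.status
```

A single scheduled job running hourly could then call `set_continuous_pause(..., paused=not within_window(datetime.now().time()))`.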
jkb7
by New Contributor III
  • 351 Views
  • 6 replies
  • 1 kudos

Resolved! Keep history of task runs in Databricks Workflows while moving it from one job to another

We are using Databricks Asset Bundles (DAB) to orchestrate multiple workflow jobs, each containing multiple tasks. The execution schedule is managed at the job level, i.e., all tasks within a job start together. We often face the issue of rescheduling...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

You can submit it through https://docs.databricks.com/en/resources/ideas.html#ideas

5 More Replies
vickytscv
by New Contributor II
  • 217 Views
  • 3 replies
  • 0 kudos

Adobe query support from databricks

Hi Team, we are working with the Adobe tool for campaign metrics, which needs to pull data from AEP using the explode option. When we pass a query it takes a long time and performance is also very poor. Is there a better way to pull data from AEP? Please le...

Latest Reply
jodbx
Databricks Employee
  • 0 kudos

https://github.com/Adobe-Marketing-Cloud/aep-cloud-ml-ecosystem 

2 More Replies
T_I
by New Contributor II
  • 327 Views
  • 4 replies
  • 0 kudos

Connect Databricks to Airflow

Hi, I have Databricks on top of AWS and a Databricks connection in Airflow (MWAA). I am able to connect and execute a Databricks job via Airflow using a personal access token. I believe the best practice is to connect using a service principal. I und...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @T_I, instead of the PAT token, specify the settings below to use the service principal. For workspace-level operations, set the following environment variables: DATABRICKS_HOST, set to the Databricks workspace URL, for exam...

3 More Replies
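The environment variables named in the reply can be wired up defensively so the Airflow DAG fails at parse time when one is missing rather than mid-run. A minimal sketch; the variable names follow the usual Databricks OAuth machine-to-machine (service principal) setup (`DATABRICKS_HOST` plus the client-id/secret pair), so double-check them against the current Databricks authentication docs.

```python
import os

# Variables expected for service-principal (OAuth M2M) auth; verify the
# exact names against the current Databricks docs.
REQUIRED_VARS = ("DATABRICKS_HOST", "DATABRICKS_CLIENT_ID", "DATABRICKS_CLIENT_SECRET")

def service_principal_env(env=None) -> dict:
    """Collect the service-principal settings, raising early if any are
    missing so misconfiguration surfaces before a job is triggered."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {missing}")
    return {name: env[name] for name in REQUIRED_VARS}
```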
Steve_Harrison
by New Contributor III
  • 417 Views
  • 2 replies
  • 0 kudos

Invalid Path when getting Notebook Path

The undocumented feature to get a notebook path is great, but it does not actually return a valid path that can be used in Python, e.g.: from pathlib import Path; print(Path(dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPat...

Latest Reply
Steve_Harrison
New Contributor III
  • 0 kudos

I actually think the major issue is that the above is undocumented and not supported. A supported and documented way of doing this would be much appreciated.

1 More Replies
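A common hedged workaround while this stays undocumented: the context API returns a workspace-relative path (e.g. `/Users/me/nb`), and on runtimes where the workspace tree is mounted under `/Workspace`, prefixing it usually yields a path `pathlib` can work with. A sketch under that assumption:

```python
from pathlib import Path

def to_filesystem_path(workspace_path: str) -> Path:
    """Map a workspace-relative notebook path onto the /Workspace mount.
    Assumes the workspace filesystem mount is available on the cluster."""
    if not workspace_path.startswith("/Workspace"):
        workspace_path = "/Workspace" + workspace_path
    return Path(workspace_path)

# e.g. notebook_path = dbutils.notebook.entry_point.getDbutils() \
#          .notebook().getContext().notebookPath().get()
print(to_filesystem_path("/Users/someone@example.com/my_notebook"))
```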
Phani1
by Valued Contributor II
  • 6869 Views
  • 10 replies
  • 10 kudos

Delta Live Table name dynamically

Hi Team, can we pass the Delta Live Table name dynamically [from a configuration file, instead of hardcoding the table name]? We would like to build a metadata-driven pipeline.

Latest Reply
bmhardy
New Contributor II
  • 10 kudos

Is this post referring to Direct Publishing Mode? As we are multi-tenanted, we have to have a separate schema per client, which currently means a single pipeline per client. This is not cost-effective at all, so we are very much reliant on DPM. I believ...

9 More Replies
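The metadata-driven idea in the question can be sketched as follows: table names live in a config dict and a factory registers one definition per entry. Since the `dlt` module exists only inside a pipeline runtime, the decorator call is shown in comments; everything else is plain Python and the config values are invented for illustration.

```python
# Hypothetical config; in practice this would be loaded from a file or
# the pipeline configuration instead of being inlined.
CONFIG = {
    "tables": [
        {"name": "bronze_orders", "source": "/raw/orders"},
        {"name": "bronze_items", "source": "/raw/items"},
    ]
}

def build_tables(config):
    """Create one table definition per config entry, binding the source
    path via a closure so each function reads its own location."""
    definitions = {}
    for entry in config["tables"]:
        def make_reader(source):
            def read():
                # In a real DLT pipeline:
                #   return spark.readStream.format("cloudFiles").load(source)
                return f"read {source}"
            return read
        reader = make_reader(entry["source"])
        # In a real DLT pipeline, register under the dynamic name:
        #   dlt.table(name=entry["name"])(reader)
        definitions[entry["name"]] = reader
    return definitions

tables = build_tables(CONFIG)
```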
maikl
by New Contributor III
  • 331 Views
  • 4 replies
  • 0 kudos

Resolved! DABs job name must start with a letter or underscore

Hi, in the UI I used the pipeline name 00101_source_bronze. I wanted to do the same in Databricks Asset Bundles, but when the configuration is refreshed against the Databricks workspace I see this error: I found that this issue can be connected to Terraform v...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

As mentioned above, this is a limitation directly with Terraform; because of this, our engineering team is limited in the actions that can be taken. You can find more information about this limitation in the Terraform documentation: https://developer.hashic...

3 More Replies
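Given the Terraform identifier rule the reply points to, a practical workaround is to keep the numeric string as the job's display name and derive a valid resource key for the bundle by prefixing an underscore. A small sketch; the exact character set Terraform accepts is an assumption to verify against its docs.

```python
import re

# Assumed Terraform-style rule: keys must start with a letter or
# underscore; digits are fine afterwards.
VALID_KEY = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def safe_resource_key(display_name: str) -> str:
    """Derive a bundle resource key from a display name such as
    '00101_source_bronze'; the human-readable name can still go in the
    job's 'name' field unchanged."""
    key = re.sub(r"[^A-Za-z0-9_]", "_", display_name)
    if not VALID_KEY.match(key):
        key = "_" + key
    return key

print(safe_resource_key("00101_source_bronze"))  # _00101_source_bronze
```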
Anonymous
by Not applicable
  • 334 Views
  • 1 reply
  • 1 kudos

Resolved! workflow set maximum queued items

Hi all, I have a question regarding Workflows and the queuing of job runs. I'm running into a case where jobs run longer than expected, resulting in job runs being queued, which is expected and desired. However, in this particular case we only nee...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Unfortunately, there is no way to control the number of jobs that will be moved to the queued state when queuing is enabled.

alcatraz96
by New Contributor II
  • 280 Views
  • 3 replies
  • 0 kudos

Guidance Needed for Developing CI/CD Process in Databricks Using Azure DevOps

Hi everyone, I am working on setting up a complete end-to-end CI/CD process for my Databricks environment using Azure DevOps. So far, I have developed a build pipeline to create a Databricks artifact (DAB). Now I need to create a release pipeline to ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @alcatraz96, one question: why don't you use Databricks Asset Bundles? Then the whole process would be much simpler. Here is a good end-to-end example: CI/CD Integration with Databricks Workflows - Databricks Community - 81821

2 More Replies
Nes_Hdr
by New Contributor III
  • 827 Views
  • 10 replies
  • 0 kudos

Limitations for Unity Catalog on single user access mode clusters

Hello! According to the Databricks documentation on Azure: "On Databricks Runtime 15.3 and below, fine-grained access control on single user compute is not supported. Specifically: You cannot access a table that has a row filter or column mask. You cannot ...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@Nes_Hdr Single user compute uses fine-grained access control to access tables with RLS/CLM enabled. There are no specific details about OPTIMIZE being supported in single user mode. Under this doc, the limitations of FGAC mention that "No support for...

9 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group