Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

aseufert
by New Contributor III
  • 7379 Views
  • 7 replies
  • 4 kudos

Dynamic Value References Not Working

I can't get the dynamic value references to work in my jobs. I can use the deprecated references (e.g. job_id) but not the new ones (e.g. job.id). As a test, I set a text widget called MyJobID following the example that will receive the dynamic ...

Latest Reply
themattmorris
New Contributor III
  • 4 kudos

For what it's worth, it looks like job-level parameters were added with this update as well. I was wondering why I was unable to use those, but those are also working for me now.

6 More Replies
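For context on the thread above: the new-style references such as {{job.id}} are resolved in the job's task configuration, not in notebook code. A minimal sketch of task settings, with field names as in the Jobs API and the parameter name MyJobID matching the widget in the post (treat this as an illustrative fragment, not the poster's actual config):

```json
{
  "notebook_task": {
    "notebook_path": "/Workspace/my_notebook",
    "base_parameters": {
      "MyJobID": "{{job.id}}"
    }
  }
}
```

The deprecated equivalent would have been {{job_id}}; the widget in the notebook then receives the resolved value at run time.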
thibault
by Contributor III
  • 3783 Views
  • 1 reply
  • 0 kudos

Resolved! Import notebook content into a python file

Hi, I have a workflow based on Python scripts. How can I import the content of a notebook where a class and functions are defined? I know how to import Python files into notebooks, but the other way around doesn't seem as straightforward.

Latest Reply
thibault
Contributor III
  • 0 kudos

Found a solution executing a notebook, using the Databricks API to download the notebook content as bytes:
1. Set the environment variables DATABRICKS_HOST and DATABRICKS_TOKEN.
2. w = WorkspaceClient()
with w.workspace.download(notebook_path) as n: note...

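Reconstructing the truncated reply as a complete sketch (untested against a live workspace; the helper names and the marker-stripping step are my own additions, and the databricks-sdk package plus DATABRICKS_HOST/DATABRICKS_TOKEN are assumed):

```python
# Sketch: download a notebook's source over the REST API and exec it inside a
# plain Python script, following the reply above.

def load_notebook_source(notebook_path: str) -> str:
    """Return a workspace notebook's source code as a string."""
    from databricks.sdk import WorkspaceClient  # deferred: needs databricks-sdk
    w = WorkspaceClient()  # reads host/token from the environment
    with w.workspace.download(notebook_path) as n:
        return n.read().decode("utf-8")

def strip_notebook_markers(source: str) -> str:
    """Remove Databricks cell markers so the source is plain importable Python."""
    kept = [
        line
        for line in source.splitlines()
        if not line.startswith(
            ("# Databricks notebook source", "# COMMAND ----------", "# MAGIC")
        )
    ]
    return "\n".join(kept)

# Usage inside the job's Python script (path is a placeholder):
#   src = strip_notebook_markers(load_notebook_source("/Shared/my_lib"))
#   exec(src, globals())  # the notebook's classes/functions are now defined
```

The exec step is one way to surface the notebook's definitions in the script's namespace; writing the cleaned source to a temporary .py file and importing it would work as well.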
pgruetter
by Contributor
  • 3698 Views
  • 1 reply
  • 1 kudos

Help me understand streaming logic with Delta Tables

Hello all, I have a delta table in the bronze layer; let's call it BRZ. It contains 25B rows and many duplicates. It has a version 0 and a version 1, nothing else yet. I then create a silver table SLV by running one deduplication batch job. This creates ve...

Latest Reply
pgruetter
Contributor
  • 1 kudos

Thanks for the confirmation. Not sure I see everything as your text gets truncated, but it basically confirms that it should work. Anyway: it looks like the incremental load is working. The problem here is that we receive late-arriving facts that tou...

alexiswl
by Contributor
  • 11376 Views
  • 3 replies
  • 0 kudos

Resolved! Create a UDF Table Function with DLT in UC

Hello, I am trying to generate a DLT but need to use a UDF table function in the process. This is what I have so far; everything works (without the CREATE OR REFRESH LIVE TABLE wrapper): CREATE OR REPLACE FUNCTION silver.portal.get_workflows_from_...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@alexiswl - could you please use CREATE OR REPLACE FUNCTION instead of CREATE OR REFRESH LIVE TABLE?

2 More Replies
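Put together, the pattern the reply suggests looks roughly like this (function, schema, and column names are placeholders modeled on the post, not the poster's actual code):

```sql
-- Define the table function once with CREATE OR REPLACE FUNCTION (not wrapped
-- in CREATE OR REFRESH LIVE TABLE), then reference it from the DLT table.
CREATE OR REPLACE FUNCTION silver.portal.get_workflows_from_runs()
RETURNS TABLE (workflow_id STRING)
RETURN SELECT workflow_id FROM silver.portal.runs;

CREATE OR REFRESH LIVE TABLE workflows AS
SELECT * FROM silver.portal.get_workflows_from_runs();
```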
Nathant93
by New Contributor III
  • 4865 Views
  • 1 reply
  • 0 kudos

Resolved! Date formatting

Does anyone know how to change the format of a date like this: Dec 17 2016 8:22PM into yyyy-MM-dd hh:mm:ss? Thanks

Latest Reply
Krishnamatta
Contributor
  • 0 kudos

Convert to timestamp first and then format to string:
select date_format(to_timestamp('Dec 17 2016 8:22PM', 'MMM dd yyyy h:ma'), 'yyyy-MM-dd HH:mm:ss')
Here is the documentation for this: https://docs.databricks.com/en/sql/language-manual/sql-ref-datet...

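For comparison, the same conversion in plain Python with the standard library (the strptime pattern letters differ from the Spark SQL ones in the reply above):

```python
from datetime import datetime

raw = "Dec 17 2016 8:22PM"
# %b abbreviated month name, %I 12-hour clock, %p AM/PM marker
parsed = datetime.strptime(raw, "%b %d %Y %I:%M%p")
formatted = parsed.strftime("%Y-%m-%d %H:%M:%S")
print(formatted)  # 2016-12-17 20:22:00
```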
Chris_sh
by New Contributor II
  • 1376 Views
  • 0 replies
  • 0 kudos

Enhancement Request: DLT: Infer Schema Logic/Merge Logic

Currently, when DLT runs, it observes NULL values in a column and infers that the column should be a string by default. The next time the table runs, numeric values are added and it infers that it is now a numeric column. DLT tries to merge these two ...

Randy
by New Contributor III
  • 1851 Views
  • 1 reply
  • 0 kudos

Resolved! Unable to Write Table to Synapse: 'x' has a data type that cannot participate in a columnstore index.

We have a process that creates a table in Synapse, then attempts to write the data generated in Databricks to it. We are able to create the table without a problem, but when we go to copy the data we keep getting an error that the column has a data type that ...

Latest Reply
Randy
New Contributor III
  • 0 kudos

Resolved

learnerbricks
by New Contributor II
  • 7485 Views
  • 4 replies
  • 0 kudos

Unable to save file in DBFS

I took the Azure datasets that are available for practice. I got the 10 days of data from that dataset and now I want to save this data into DBFS in CSV format. I am facing an error: "No such file or directory: '/dbfs...

Latest Reply
pardosa
New Contributor II
  • 0 kudos

Hi, after some experimenting, be aware that a folder created with dbutils.fs.mkdirs("/dbfs/tmp/myfolder") actually gets created at /dbfs/dbfs/tmp/myfolder. If you want to access the path with to_csv("/dbfs/tmp/myfolder/mytest.csv"), you should create it with this script: dbutils.fs...

3 More Replies
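The confusion in this thread is the /dbfs FUSE mount: dbutils.fs paths are already DBFS-rooted, so passing "/dbfs/..." to dbutils.fs.mkdirs creates a literal dbfs folder one level down. A small helper sketch that maps between the two forms (function names are my own, not part of any Databricks API):

```python
def fuse_to_dbfs(path: str) -> str:
    """Map a local FUSE path like /dbfs/tmp/x to the dbfs:/tmp/x form
    that dbutils.fs expects."""
    if path.startswith("/dbfs/"):
        return "dbfs:/" + path[len("/dbfs/"):]
    return path

def dbfs_to_fuse(path: str) -> str:
    """Map dbfs:/tmp/x to the /dbfs/tmp/x local path that pandas' to_csv
    can write to on a cluster."""
    if path.startswith("dbfs:/"):
        return "/dbfs/" + path[len("dbfs:/"):]
    return path

# Usage on a cluster (dbutils only exists there):
#   dbutils.fs.mkdirs(fuse_to_dbfs("/dbfs/tmp/myfolder"))   # dbfs:/tmp/myfolder
#   df.to_csv(dbfs_to_fuse("dbfs:/tmp/myfolder/mytest.csv"))
```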
MarcintheCloud
by New Contributor II
  • 1337 Views
  • 0 replies
  • 1 kudos

Is it possible to clone/read an existing external Iceberg table in Databricks?

Hello, I've been experimenting with trying to read and/or clone an existing Iceberg table into Databricks/Delta. I have an Azure Blob Storage container (configured to use abfss for access) that contains an existing Iceberg table structure (data in par...

ilarsen
by Contributor
  • 2785 Views
  • 1 reply
  • 0 kudos

Auto Loader and source file structure optimisation

Hi. I have a question, and I've not been able to find an answer. I'm sure there is one... I just haven't found it through searching and browsing the docs. How much does it matter (if it is indeed that simple) if source files read by Auto Loader are ...

Rubini_MJ
by New Contributor
  • 10252 Views
  • 1 reply
  • 0 kudos

Resolved! Other memory of the driver is high even in a newly spun cluster

Hi Team Experts, I am experiencing high memory consumption in the "other" part of the memory utilization chart in the metrics tab. Right now I am not running any jobs, but still, out of 8 GB of driver memory, 6 GB is almost full with "other" and only 1.5 GB is t...

Latest Reply
User16539034020
Databricks Employee
  • 0 kudos

Hello, thanks for contacting Databricks Support. It seems you are concerned with high memory consumption in the "other" category on the driver node of a Spark cluster. As there are no logs or detailed information provided, I can only address several potentia...

saiprasadambati
by New Contributor III
  • 5472 Views
  • 7 replies
  • 1 kudos

Resolved! examples on python sdk for install libraries

Hi everyone, I'm planning to use the Databricks Python CLI's install_libraries. Can someone please post examples of the function install_libraries? https://github.com/databricks/databricks-cli/blob/main/databricks_cli/libraries/api.py

Latest Reply
Loop-Insist
New Contributor II
  • 1 kudos

Here you go, using the Python SDK:
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute
w = WorkspaceClient(host="yourhost", token="yourtoken")
# Create an array of Library objects to be installed
libraries_to_install = [compute...

6 More Replies
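Reconstructing the truncated snippet above as a complete sketch (untested against a live workspace; cluster_id/host/token are placeholders, and the normalize helper is my own addition rather than part of the SDK):

```python
# Sketch: install PyPI packages on a running cluster via the Databricks Python
# SDK, following the reply above. Requires the databricks-sdk package.

def normalize_packages(packages):
    """Dedupe and clean package names before building Library objects."""
    return sorted({p.strip() for p in packages if p and p.strip()})

def install_pypi_packages(cluster_id, packages, host=None, token=None):
    """Install the given PyPI packages on the cluster and return the specs."""
    from databricks.sdk import WorkspaceClient  # deferred: needs databricks-sdk
    from databricks.sdk.service import compute
    w = WorkspaceClient(host=host, token=token)  # falls back to env/config
    libraries_to_install = [
        compute.Library(pypi=compute.PythonPyPiLibrary(package=p))
        for p in normalize_packages(packages)
    ]
    w.libraries.install(cluster_id=cluster_id, libraries=libraries_to_install)
    return libraries_to_install

# Usage (identifiers are placeholders):
#   install_pypi_packages("0123-456789-abcdefgh", ["requests", "numpy"])
```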
JVesely
by New Contributor III
  • 1898 Views
  • 1 replies
  • 0 kudos

Resolved! DLT CDC SCD-1 pipeline not showing stats when reading from parquet file

Hi, I followed the tutorial here: https://docs.databricks.com/en/delta-live-tables/cdc.html#how-is-cdc-implemented-with-delta-live-tables
The only change I made is that data is not appended to a table but is read from a parquet file. In practice this me...

Latest Reply
JVesely
New Contributor III
  • 0 kudos

My bad - waiting a bit and doing a proper screen refresh does show the numbers. 

