Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Dean_Lovelace
by New Contributor III
  • 21292 Views
  • 13 replies
  • 2 kudos

How can I deploy workflow jobs to another databricks workspace?

I have created a number of workflows in the Databricks UI. I now need to deploy them to a different workspace. How can I do that? Code can be deployed via Git, but the job definitions are stored in the workspace only.

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

@itacdonev great option provided. @Dean_Lovelace, you can also select the View JSON option on the workflow and switch to the Create tab; with that JSON you can use the API https://docs.databricks.com/api/workspace/jobs/create and create the job in th...
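
A minimal sketch of what that API call could look like from outside the UI, assuming a personal access token for the target workspace in a DATABRICKS_TOKEN environment variable, the workspace URL in DATABRICKS_HOST, and the exported job JSON saved as job_definition.json (all three names are placeholders):

import json
import os

import requests

# Target workspace credentials (placeholder environment variable names).
host = os.environ["DATABRICKS_HOST"].rstrip("/")
token = os.environ["DATABRICKS_TOKEN"]

# JSON copied from the workflow's View JSON > Create view in the source workspace.
with open("job_definition.json") as f:
    job_settings = json.load(f)

# Create the job in the target workspace via the Jobs API
# (https://docs.databricks.com/api/workspace/jobs/create).
resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_settings,
)
resp.raise_for_status()
print("Created job_id:", resp.json()["job_id"])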

12 More Replies
HariharaSam
by Contributor
  • 26962 Views
  • 8 replies
  • 4 kudos

Resolved! How to get the number of rows inserted after performing an INSERT operation into a table

Consider we have two tables A & B.

qry = """INSERT INTO Table A
Select * from Table B where Id is null"""
spark.sql(qry)

I need to get the number of records inserted after running this in Databricks.

Latest Reply
GRCL
New Contributor III
  • 4 kudos

Almost the same advice as Hubert: I use the history of the Delta table:

df_history.select(F.col('operationMetrics')).collect()[0].operationMetrics['numOutputRows']

You can also find other 'operationMetrics' values, like 'numTargetRowsDeleted'.
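
Expanded into a short sketch (table_a is a placeholder for the table the INSERT targeted; spark is the SparkSession a Databricks notebook provides):

from delta.tables import DeltaTable
from pyspark.sql import functions as F

# Read the Delta table's history; the newest version comes first.
history_df = DeltaTable.forName(spark, "table_a").history()

# operationMetrics is a map of metric name -> value for that operation;
# a plain INSERT INTO shows up as a WRITE operation.
latest = history_df.select(F.col("operationMetrics")).collect()[0]
print(latest.operationMetrics.get("numOutputRows"))         # rows written by the last operation
print(latest.operationMetrics.get("numTargetRowsDeleted"))  # only present after a MERGE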

7 More Replies
chhavibansal
by New Contributor III
  • 1014 Views
  • 1 reply
  • 0 kudos

What is the upper bound for dataSkippingNumIndexedCols, which keeps stats in the Delta log file?

Is there an upper bound on the number I can assign to delta.dataSkippingNumIndexedCols for computing statistics? Is there some trade-off benchmark available for increasing this number beyond 32?

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Chhavi Bansal: The delta.dataSkippingNumIndexedCols configuration property controls the maximum number of columns that Delta Lake will build statistics on during data skipping. By default, this value is set to 32. There is no hard upper bound on th...
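
For reference, a minimal sketch of raising the limit on a hypothetical table named my_table (spark being the notebook's SparkSession):

# Raise the number of columns Delta collects statistics on for my_table.
spark.sql(
    "ALTER TABLE my_table "
    "SET TBLPROPERTIES ('delta.dataSkippingNumIndexedCols' = '50')"
)

# Statistics are collected for the first N columns in the table's column order,
# so moving frequently filtered columns to the front can help as much as raising N.
# A larger N also means more statistics written per transaction-log entry and more
# write overhead; that overhead is the trade-off behind the default of 32.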

youssefmrini
by Databricks Employee
  • 2136 Views
  • 1 reply
  • 4 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 4 kudos

You can now use cluster policies to restrict the number of clusters a user can create. For more information, see https://docs.databricks.com/administration-guide/clusters/policies.html#cluster-limit
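
A minimal sketch of creating such a policy through the Cluster Policies REST API (the host/token environment variables, policy name, and definition below are placeholders; max_clusters_per_user is the field that enforces the per-user limit, and the same setting is available when creating a policy in the UI):

import json
import os

import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
token = os.environ["DATABRICKS_TOKEN"]

# Placeholder policy: the definition only leaves spark_version open;
# max_clusters_per_user caps how many clusters each user can run under this policy.
payload = {
    "name": "limit-two-clusters-per-user",
    "definition": json.dumps(
        {"spark_version": {"type": "unlimited", "defaultValue": "auto:latest-lts"}}
    ),
    "max_clusters_per_user": 2,
}

resp = requests.post(
    f"{host}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print("policy_id:", resp.json()["policy_id"])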
