Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

youssefmrini
by Databricks Employee
  • 3031 Views
  • 1 reply
  • 4 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 4 kudos

You can now use cluster policies to restrict the number of clusters a user can create. For more information, see https://docs.databricks.com/administration-guide/clusters/policies.html#cluster-limit
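As a hedged illustration of what this looks like: the sketch below creates a policy through the Databricks cluster policies REST API, using its max_clusters_per_user field per the linked docs. The workspace URL, token, and policy values are placeholders, not from the post.

```python
# Minimal sketch: create a cluster policy that caps active clusters per user.
# Host, token, and policy values are placeholders -- adjust for your workspace.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

policy = {
    "name": "limited-clusters-per-user",
    # Caps how many active clusters a user can create with this policy.
    "max_clusters_per_user": 2,
    # The policy definition itself is a JSON string; one simple rule shown.
    "definition": '{"autotermination_minutes": {"type": "fixed", "value": 60}}',
}

resp = requests.post(
    f"{HOST}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=policy,
)
resp.raise_for_status()
print(resp.json())  # returns the new policy_id
```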

youssefmrini
by Databricks Employee
  • 2868 Views
  • 1 reply
  • 2 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 2 kudos

Clone can now be used to create and incrementally update Delta tables that mirror Apache Parquet and Apache Iceberg tables. You can update your source Parquet table and incrementally apply the changes to the cloned Delta table with the clone command.
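A minimal sketch of the SQL this announcement describes, run from a Databricks notebook (the `spark` session is ambient there; the table name and path are placeholders):

```python
# Create a Delta table that mirrors an existing Parquet table.
spark.sql("""
    CREATE OR REPLACE TABLE my_delta_mirror
    SHALLOW CLONE parquet.`/mnt/source/parquet_table`
""")

# After the source Parquet data changes, re-running the same CLONE
# statement applies the changes to the Delta table incrementally.
spark.sql("""
    CREATE OR REPLACE TABLE my_delta_mirror
    SHALLOW CLONE parquet.`/mnt/source/parquet_table`
""")
```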

youssefmrini
by Databricks Employee
  • 1673 Views
  • 1 reply
  • 2 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 2 kudos

You can now use OAuth to authenticate to Power BI and Tableau. For more information, see Configure OAuth (Public Preview) for Power BI and Configure OAuth (Public Preview) for Tableau: https://docs.databricks.com/integrations/configure-oauth-powerbi.html

156190
by New Contributor III
  • 4973 Views
  • 4 replies
  • 3 kudos

Resolved! Is the 'run_as' user available from the Jobs API 2.1?

I know that the run_as user generally defaults to the creator_user, but I would like to find the defined run_as user for each of our jobs. Unfortunately, I'm unable to locate that field in the api.
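A hedged sketch of one way to check this: list jobs with the Jobs 2.1 API and read the run-as identity from each job's details. The run_as_user_name field name is an assumption to verify against your workspace's API responses; host and token are placeholders.

```python
# Sketch: print the run-as identity for every job via the Jobs 2.1 API.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
headers = {"Authorization": f"Bearer {TOKEN}"}

resp = requests.get(f"{HOST}/api/2.1/jobs/list", headers=headers)
resp.raise_for_status()

for job in resp.json().get("jobs", []):
    detail = requests.get(
        f"{HOST}/api/2.1/jobs/get",
        headers=headers,
        params={"job_id": job["job_id"]},
    )
    detail.raise_for_status()
    # run_as_user_name is assumed to appear in the jobs/get response;
    # it falls back to the creator when no run_as is set explicitly.
    print(job["job_id"], detail.json().get("run_as_user_name"))
```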

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Keller, Michael, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

3 More Replies
SagarK1
by New Contributor
  • 6292 Views
  • 4 replies
  • 2 kudos

Managing permissions using MLflow APIs

Hello All, I am trying to manage the permissions on experiments using the MLflow API. Is there an MLflow API that manages the Can Read, Can Edit, and Can Manage permissions? Example: I create the model using MLflow APIs and through my c...
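The thread does not include a confirmed answer; as a hedged sketch of one possible route, the Databricks workspace Permissions API (rather than MLflow itself) can set CAN_READ / CAN_EDIT / CAN_MANAGE on an experiment. The endpoint, payload shape, and all identifiers below are assumptions to check against the Permissions API docs.

```python
# Sketch: grant a user CAN_EDIT on an MLflow experiment via the
# Databricks Permissions API. All values are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
EXPERIMENT_ID = "<experiment-id>"  # numeric ID from the experiment URL

resp = requests.patch(
    f"{HOST}/api/2.0/permissions/experiments/{EXPERIMENT_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "access_control_list": [
            {"user_name": "user@example.com", "permission_level": "CAN_EDIT"}
        ]
    },
)
resp.raise_for_status()
```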

Latest Reply
jsan
New Contributor II
  • 2 kudos

Hey folks, did we get any workaround for this, or is what @Sean Owen said true?

3 More Replies
zeta_load
by New Contributor II
  • 25238 Views
  • 1 reply
  • 1 kudos

Resolved! Is it possible to restart a cluster from a notebook without using the UI?

I have some code that occasionally executes incorrectly, meaning that every n-th time a calculation in a table is wrong. If that happens, I want to be able to restart the cluster from the notebook. I'm therefore looking for a piece of code that can accomplish this.

Latest Reply
daniel_sahal
Databricks MVP
  • 1 kudos

@Lukas Goldschmied​ It is. You'll need to use the Databricks API. Here you can find an example: https://learn.microsoft.com/en-us/azure/databricks/_extras/notebooks/source/clusters-long-running-optional-restart.html
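The linked notebook aside, a minimal sketch of the idea: call the Clusters REST API from inside the notebook, targeting the cluster it runs on. The dbutils context calls for discovering the host, token, and cluster ID are assumptions drawn from common notebook patterns; verify them in your workspace.

```python
# Sketch: restart the current cluster from a Databricks notebook.
import requests

# Discover connection details from the notebook context (assumed pattern).
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host = ctx.apiUrl().get()
token = ctx.apiToken().get()
cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")

resp = requests.post(
    f"{host}/api/2.0/clusters/restart",
    headers={"Authorization": f"Bearer {token}"},
    json={"cluster_id": cluster_id},
)
resp.raise_for_status()  # the notebook itself is interrupted by the restart
```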

332588
by New Contributor II
  • 2102 Views
  • 3 replies
  • 3 kudos

We have been using the Databricks-managed MLflow to log experiment runs for quite some time and never experienced issues. However, we now seem to have encountered a bug in the associated Databricks UI.

We observe the following behavior when we keep adding new runs to an experiment:
  • In the beginning, the runs are still displayed correctly in the UI.
  • After a certain number of total runs, the following bug occurs in the UI:
    • In the UI, there are ...

Latest Reply
Debayan
Databricks Employee
  • 3 kudos

Hi @Timo Burmeister​, apologies for the delay! I went through the video. Does it happen all the time? I see that after sorting the list with a different filter, it appears.

2 More Replies
prasadvaze
by Valued Contributor II
  • 8649 Views
  • 3 replies
  • 0 kudos

Error loading MANAGED table in Unity Catalog Delta Lake on Azure. Has anyone seen this issue? "ErrorClass=INVALID_PARAMETER_VALUE] Input path <file system name>.dfs.core.windows.net overlaps with other external tables"

00007160: 2023-01-30T14:22:06 [TARGET_LOAD ]E: Failed (retcode -1) to execute statement: 'COPY INTO `e2underwriting_dbo`.`product` FROM(SELECT cast(_c0 as INT) as `ProductID`, _c1 as `ShortName`, cast(_c2 as INT) as `Status`, cast(_c3 as TIMESTA...

Latest Reply
prasadvaze
Valued Contributor II
  • 0 kudos

We have solved this issue; it was related to Qlik Replicate copying data into the Delta table.

2 More Replies
youssefmrini
by Databricks Employee
  • 1688 Views
  • 1 reply
  • 1 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 1 kudos

You can ensure there is always an active run of your Databricks job with the new continuous trigger type. For more information, see https://docs.databricks.com/workflows/jobs/jobs.html#continuous-jobs
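A hedged sketch of enabling this on an existing job through the Jobs 2.1 update API; the continuous settings block follows the linked docs, while the job ID and credentials are placeholders.

```python
# Sketch: switch an existing job to the continuous trigger type.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

resp = requests.post(
    f"{HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": 123,  # placeholder
        # With a continuous trigger, a new run starts whenever the
        # previous one finishes, so one run is always active.
        "new_settings": {"continuous": {"pause_status": "UNPAUSED"}},
    },
)
resp.raise_for_status()
```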

tw1
by New Contributor III
  • 13390 Views
  • 9 replies
  • 3 kudos

Resolved! Can't write / overwrite Delta table with error: oxxxx.saveAsTable (Driver Error: OutOfMemory)

Current cluster config:
  • Workers: Standard_DS3_v2 (14 GB, 4 cores), 2-6 workers
  • Driver: Standard_DS3_v2 (14 GB, 4 cores)
  • Runtime: 10.4x-scala2.12
We want to overwrite a temporary Delta table with new records. The records will be loaded from another Delta table and tran...

Latest Reply
tw1
New Contributor III
  • 3 kudos

Hi, thank you for your help! We tested the configuration settings and it runs without any errors. Could you give us some more information on where we can find documentation about such settings? We searched for hours to fix our problem. So we contacted th...

8 More Replies
Lulka
by New Contributor II
  • 5837 Views
  • 2 replies
  • 2 kudos

Resolved! How to limit the input rate when reading a Delta table as a stream?

Hello everyone! I am trying to read a Delta table as a streaming source using Spark, but my microbatches are unbalanced: some are very small and others are very large. How can I limit this? I used different configurations with maxBytesPerTrigger and m...
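For reference, a minimal sketch of the two documented rate-limiting options on a Delta streaming source (the table path is a placeholder). Note that maxBytesPerTrigger is a soft cap, so microbatch sizes can still vary around it:

```python
# Sketch: cap the amount of new data each microbatch picks up.
df = (
    spark.readStream
    .format("delta")
    .option("maxFilesPerTrigger", 10)      # at most 10 new files per microbatch
    .option("maxBytesPerTrigger", "512m")  # soft cap on bytes per microbatch
    .load("/mnt/delta/source_table")       # placeholder path
)
```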

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Besides the parameters you mention, I don't know of any other that controls the batch size. Did you check whether the Delta table is horribly skewed?

1 More Replies
RyanHager
by Contributor
  • 4866 Views
  • 5 replies
  • 2 kudos

Are there any plans to add functions on the partition-by fields of a Delta table definition, such as day()? A similar capability exists in Iceberg.

Benefit: this would simplify the WHERE clauses of the tables' consumers. They could query on the main date field when they need all the data for a day, instead of an extra day field we had to create.

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

@Ryan Hager​, yes, it is possible using auto-generated columns, available since Delta Lake 1.2. For example, you can automatically generate a date column (for partitioning the table by date) from the timestamp column; any writes into the table need only specify t...
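A minimal sketch of that suggestion, run from a notebook; the table and column names are placeholders:

```python
# Sketch: a generated date column derived from the event timestamp,
# used as the Delta table's partition column.
spark.sql("""
    CREATE TABLE events (
        id BIGINT,
        event_ts TIMESTAMP,
        event_date DATE GENERATED ALWAYS AS (CAST(event_ts AS DATE))
    )
    USING DELTA
    PARTITIONED BY (event_date)
""")
```

Writes only need to supply event_ts; Delta fills in event_date, and queries filtering on the timestamp can benefit from partition pruning without consumers supplying an extra day field.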

4 More Replies
hare
by New Contributor III
  • 4328 Views
  • 4 replies
  • 3 kudos

Implementation of late-arriving dimensions in Databricks

Hi Team, can you please suggest how to implement a late-arriving dimension or early-arriving fact, with examples or a sample script for reference? I have to implement this using PySpark. Thanks.
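The thread does not contain a worked solution; as a hedged illustration of one common pattern (the "inferred member" approach), unmatched fact keys get a placeholder dimension row that a later dimension load updates in place. All table and column names below are hypothetical.

```python
# Sketch: handle early-arriving facts by inserting placeholder dimension rows.
from pyspark.sql import functions as F

facts = spark.table("staging_facts")  # hypothetical staging table
dims = spark.table("dim_customer")    # hypothetical dimension table

# 1. Fact keys that have no dimension row yet.
missing = (
    facts.join(dims, "customer_id", "left_anti")
         .select("customer_id").distinct()
         .withColumn("customer_name", F.lit("UNKNOWN"))
         .withColumn("is_inferred", F.lit(True))
)

# 2. Insert placeholder ("inferred") rows so the fact load can proceed.
missing.write.format("delta").mode("append").saveAsTable("dim_customer")

# 3. When the real dimension record arrives later, overwrite the inferred
#    row (assumes an `updates` table with the same schema as the dimension).
spark.sql("""
    MERGE INTO dim_customer d
    USING updates u
    ON d.customer_id = u.customer_id
    WHEN MATCHED AND d.is_inferred = true THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```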

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Hare Krishnan​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

3 More Replies
none_ranjeet
by New Contributor III
  • 4123 Views
  • 3 replies
  • 2 kudos

Resolved! Passed the Fundamentals of the Databricks Lakehouse Platform accreditation, but no badge received. Tried "https://v2.accounts.accredible.com/retrieve-credentials?", which shows no badge.

Passed the Fundamentals of the Databricks Lakehouse Platform accreditation, but no badge received. Tried "https://v2.accounts.accredible.com/retrieve-credentials?", which shows no badge.

Latest Reply
Chaitanya_Raju
Honored Contributor
  • 2 kudos

Hi @Ranjeet Ahlawat​, congratulations on the certification! For any certification you take with Databricks, you will receive the certificate and the badge within 24-48 hours, and sometimes sooner. All the best for your future certifications!

2 More Replies
asami34
by New Contributor II
  • 5761 Views
  • 7 replies
  • 0 kudos

Cannot reset password, no support

I cannot log in to my Databricks Community account. I have already tried to get support, and no real support has been given. I attempt to reset my password and the link gets sent, but once I enter the new password it gets stuck permanently loading. I...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Ahmet Korkmaz​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

6 More Replies
