Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

771407
by New Contributor II
  • 2222 Views
  • 3 replies
  • 3 kudos

Resolved! R code that works perfectly in RStudio does not run here

Hi, I have a "simple" R script that I need to import into Databricks and am running into errors. For example:

TipoB <- Techtb %>% dplyr::filter(grepl('being evaluated', Comentarios))
#TipoB$yearsSpec <- NA
TipoB$yearsSpec <- str_replace(TipoB$Comentarios,...

Latest Reply
771407
New Contributor II
  • 3 kudos

RStudio version 2022.12.0; R latest version available as of 08/FEB/2023. I don't know where to find the DBR version and configuration. Can you direct me?

2 More Replies
MikeJohnsonZa
by New Contributor
  • 2607 Views
  • 3 replies
  • 0 kudos

Resolved! Importing irregularly formatted json files

Hi, I'm importing a large collection of JSON files. The problem is that they are not what I would expect a well-formatted JSON file to be (although probably still valid): each file consists of only a single record that looks something like this (this i...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Hi @Michael Johnson​, I would like to share the following notebook, which contains examples of how to process complex data types like JSON. Please check the following link and let us know if you still need help: https://docs.databricks.com/optimization...

2 More Replies
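For files like these, one workaround is to normalize them outside Spark first. A minimal sketch using only the Python standard library, assuming each file holds exactly one top-level JSON object (directory layout and field names are hypothetical):

```python
import json
from pathlib import Path

def load_single_record_files(directory):
    """Read a directory of JSON files where each file holds a single
    top-level record (not JSON-lines), returning a list of dicts."""
    records = []
    for path in sorted(Path(directory).glob("*.json")):
        with open(path, encoding="utf-8") as f:
            records.append(json.load(f))  # one object per file
    return records
```

Spark's own reader can often also handle one multi-line object per file directly via `spark.read.json(path, multiLine=True)`.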
youssefmrini
by Databricks Employee
  • 2071 Views
  • 1 replies
  • 4 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 4 kudos

You can now use cluster policies to restrict the number of clusters a user can create. For more information, see https://docs.databricks.com/administration-guide/clusters/policies.html#cluster-limit

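As a rough sketch of how this could look via the REST API rather than the UI — the endpoint and the `max_clusters_per_user` field are taken from the Cluster Policies API, but treat them as assumptions to verify against the linked docs:

```python
import json

def cluster_limit_policy_payload(name, max_clusters_per_user, definition=None):
    """Build the request body for POST /api/2.0/policies/clusters/create.
    max_clusters_per_user caps how many clusters a user may create from
    this policy; `definition` is the usual policy-rules document."""
    return {
        "name": name,
        "definition": json.dumps(definition or {}),
        "max_clusters_per_user": max_clusters_per_user,
    }
```

The payload would then be POSTed with a workspace PAT token like any other Databricks REST call.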
youssefmrini
by Databricks Employee
  • 2178 Views
  • 1 replies
  • 2 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 2 kudos

Clone can now be used to create and incrementally update Delta tables that mirror Apache Parquet and Apache Iceberg tables. You can update your source Parquet table and incrementally apply the changes to the cloned Delta table with the clone comman...

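A minimal sketch of the statement this describes, using the documented `CLONE` syntax (the table and path names below are made up):

```python
def clone_statement(target_table, source_parquet_path, shallow=True):
    """Build a Delta CLONE statement that creates (or replaces) a Delta
    table mirroring a Parquet directory; re-running it after the source
    changes applies those changes incrementally."""
    mode = "SHALLOW" if shallow else "DEEP"
    return (
        f"CREATE OR REPLACE TABLE {target_table} "
        f"{mode} CLONE parquet.`{source_parquet_path}`"
    )
```

In a notebook you would run it with `spark.sql(clone_statement("main.bronze.events", "/mnt/raw/events"))`.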
youssefmrini
by Databricks Employee
  • 1187 Views
  • 1 replies
  • 2 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 2 kudos

You can now use OAuth to authenticate to Power BI and Tableau. For more information, see Configure OAuth (Public Preview) for Power BI and Configure OAuth (Public Preview) for Tableau. https://docs.databricks.com/integrations/configure-oauth-powerbi.h...

156190
by New Contributor III
  • 4023 Views
  • 4 replies
  • 3 kudos

Resolved! Is 'run_as' user available from jobs api 2.1?

I know that the run_as user generally defaults to the creator_user, but I would like to find the defined run_as user for each of our jobs. Unfortunately, I'm unable to locate that field in the api.

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Keller, Michael​, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else, please let us know if you need more help. We'd love to hear from you. Th...

3 More Replies
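Newer Jobs API 2.1 responses expose a `run_as_user_name` field on the job object; assuming a workspace where that field is present (it may not have been at the time of this thread), a small helper that mirrors the default-to-creator behavior described in the question might look like:

```python
def effective_run_as(job_json):
    """Return the run-as user for a job payload from GET /api/2.1/jobs/get.
    Falls back to the creator when no explicit run_as user is present,
    mirroring the default behavior described in the thread."""
    run_as = job_json.get("run_as_user_name")
    if run_as:
        return run_as
    return job_json.get("creator_user_name")
```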
SagarK1
by New Contributor
  • 4365 Views
  • 4 replies
  • 2 kudos

Managing the permissions using MLFlow APIs

Hello all, I am trying to manage permissions on experiments using the MLflow API. Is there any MLflow API that helps manage the Can Read, Can Edit, and Can Manage permissions? Example: I create the model using MLflow APIs and through my c...

Latest Reply
jsan
New Contributor II
  • 2 kudos

Hey folks, did we get any workaround for this, or is what @Sean Owen​ said true?

3 More Replies
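Experiment ACLs are managed through the workspace Permissions API rather than MLflow itself. A sketch of the request body for `PATCH /api/2.0/permissions/experiments/{experiment_id}` — endpoint and permission levels per the Permissions API, user emails hypothetical:

```python
def experiment_permissions_payload(grants):
    """Build the body for PATCH /api/2.0/permissions/experiments/{id}.
    `grants` maps user emails to CAN_READ / CAN_EDIT / CAN_MANAGE."""
    return {
        "access_control_list": [
            {"user_name": user, "permission_level": level}
            for user, level in grants.items()
        ]
    }
```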
zeta_load
by New Contributor II
  • 9717 Views
  • 1 replies
  • 1 kudos

Resolved! Is it possible to restart a cluster from a Notebook without using the UI

I have some code that occasionally executes incorrectly, meaning that every n-th time a calculation in a table is wrong. If that happens, I want to be able to restart the cluster from the notebook. I'm therefore looking for a piece of code that can accomp...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 1 kudos

@Lukas Goldschmied​ It is. You'll need to use the Databricks API. Here you can find an example: https://learn.microsoft.com/en-us/azure/databricks/_extras/notebooks/source/clusters-long-running-optional-restart.html

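Building on the linked notebook, a minimal sketch of the REST call involved. The Spark conf key for discovering the current cluster's ID is an assumption to verify; the request itself can be sent with any HTTP client:

```python
def restart_request(host, cluster_id):
    """Build the URL and body for POST /api/2.0/clusters/restart.
    In a notebook, the current cluster's ID can typically be read from
    spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
    (assumption to verify on your runtime)."""
    return f"{host}/api/2.0/clusters/restart", {"cluster_id": cluster_id}

# Usage sketch (needs a PAT token; `requests` intentionally not imported here):
#   url, body = restart_request("https://<workspace-host>", cluster_id)
#   requests.post(url, headers={"Authorization": f"Bearer {token}"}, json=body)
```

Note that restarting the cluster a notebook runs on will, of course, interrupt that notebook's own execution.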
332588
by New Contributor II
  • 1495 Views
  • 3 replies
  • 3 kudos

We have been using Databricks-managed MLflow to log experiment runs for quite some time and never experienced issues. However, now we seem to have encountered a bug in the associated Databricks UI.

We observe the following behavior when we keep adding new runs to an experiment:
- In the beginning, the runs are still displayed correctly in the UI.
- After a certain number of total runs, the following bug occurs in the UI:
  - In the UI, there are ...

Latest Reply
Debayan
Databricks Employee
  • 3 kudos

Hi @Timo Burmeister​, apologies for the delay! I went through the video; does it happen all the time? I see that after sorting with a different filter, the list appears.

2 More Replies
prasadvaze
by Valued Contributor II
  • 7238 Views
  • 3 replies
  • 0 kudos

Error loading MANAGED table in Unity Catalog Delta Lake on Azure. Anyone seen this issue? "[ErrorClass=INVALID_PARAMETER_VALUE] Input path <file system name>.dfs.core.windows.net overlaps with other external tables"

00007160: 2023-01-30T14:22:06 [TARGET_LOAD ]E: Failed (retcode -1) to execute statement: 'COPY INTO `e2underwriting_dbo`.`product` FROM(SELECT cast(_c0 as INT) as `ProductID`, _c1 as `ShortName`, cast(_c2 as INT) as `Status`, cast(_c3 as TIMESTA...

Latest Reply
prasadvaze
Valued Contributor II
  • 0 kudos

We have solved this issue; it was related to Qlik Replicate copying data into the Delta table.

2 More Replies
youssefmrini
by Databricks Employee
  • 1183 Views
  • 1 replies
  • 1 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 1 kudos

You can ensure there is always an active run of your Databricks job with the new continuous trigger type. https://docs.databricks.com/workflows/jobs/jobs.html#continuous-jobs

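In Jobs API 2.1 terms, a continuous job is declared with a `continuous` block in the job settings. A minimal fragment (job name hypothetical, tasks omitted for brevity):

```json
{
  "name": "always-on-job",
  "continuous": {
    "pause_status": "UNPAUSED"
  }
}
```

Setting `pause_status` to `PAUSED` stops the job from being relaunched without deleting it.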
tw1
by New Contributor III
  • 9989 Views
  • 9 replies
  • 3 kudos

Resolved! Can't write / overwrite delta table with error: oxxxx.saveAsTable. (Driver Error: OutOfMemory)

Current cluster config:
- Standard_DS3_v2 (14GB, 4 cores), 2-6 workers
- Standard_DS3_v2 (14GB, 4 cores) for the driver
- Runtime: 10.4x-scala2.12

We want to overwrite a temporary delta table with new records. The records will be loaded by another delta table and tran...

Latest Reply
tw1
New Contributor III
  • 3 kudos

Hi, thank you for your help! We tested the configuration settings and it runs without any errors. Could you give us some more information on where we can find documentation about such settings? We searched for hours to fix our problem. So we contacted th...

8 More Replies
Lulka
by New Contributor II
  • 4746 Views
  • 2 replies
  • 2 kudos

Resolved! How limit input rate reading delta table as stream?

Hello everyone! I am trying to read a Delta table as a streaming source using Spark, but my micro-batches are unbalanced: one is very small and the others are very large. How can I limit this? I used different configurations with maxBytesPerTrigger and m...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Besides the parameters you mention, I don't know of any other that controls the batch size. Did you check whether the Delta table is horribly skewed?

1 More Replies
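For reference, the two options the poster mentions are set on the stream reader. A small helper that packages them; note that maxBytesPerTrigger is a soft cap, so a single oversized file (i.e. a skewed table, as the reply suggests) can still produce a large batch:

```python
def rate_limit_options(max_files=100, max_bytes="1g"):
    """Options for a rate-limited Delta streaming source; apply them via
    spark.readStream.format("delta").options(**rate_limit_options()).load(path).
    maxFilesPerTrigger caps files per micro-batch; maxBytesPerTrigger is a
    soft byte cap that one large file can still exceed."""
    return {
        "maxFilesPerTrigger": str(max_files),
        "maxBytesPerTrigger": max_bytes,
    }
```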
Erik
by Valued Contributor III
  • 19627 Views
  • 18 replies
  • 14 kudos

How to enable/verify cloud fetch from PowerBI

I tried to benchmark the Power BI Databricks connector vs. the Power BI Delta Lake reader on a dataset of 2.15 million rows. I found that the Delta Lake reader took 20 seconds, while importing through the SQL compute endpoint took ~75 seconds. When I loo...

Latest Reply
pulkitm
New Contributor III
  • 14 kudos

Is there any way to switch off Cloud Fetch and fall back to ArrowResultSet by default, irrespective of result size, using the latest version of the Simba Spark ODBC driver?

17 More Replies
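On the disabling question: the Simba Spark ODBC driver is generally configured through connection properties, and the property below is reported to toggle Cloud Fetch. Treat the property name and version threshold as assumptions to verify against your driver version's documentation:

```ini
; DSN / connection-string fragment (assumption: supported by the
; Databricks ODBC driver 2.6.17+, where Cloud Fetch is on by default).
; 0 disables Cloud Fetch so results return inline via Arrow.
EnableQueryResultDownload=0
```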
RyanHager
by Contributor
  • 3232 Views
  • 5 replies
  • 2 kudos

Are there any plans to add functions on the partition-by fields of a Delta table definition, such as day()? A similar capability exists in Iceberg.

Benefit: this will help simplify the WHERE clauses of the consumers of the tables. I can just query on the main date field if I need all the data for a day, instead of an extra day field we had to make.

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

@Ryan Hager​, yes, it is possible using generated columns, available since Delta Lake 1.2. For example, you can automatically generate a date column (for partitioning the table by date) from the timestamp column; any writes into the table need only specify t...

4 More Replies
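A minimal sketch of the DDL the reply refers to, using Delta Lake generated columns (table and column names are made up):

```python
# Sketch of the generated-column DDL described in the reply. Filtering on
# event_ts lets Delta derive the matching event_date partition filter
# automatically, so consumers don't need an extra day column in queries.
ddl = """
CREATE TABLE IF NOT EXISTS events (
  event_ts TIMESTAMP,
  event_date DATE GENERATED ALWAYS AS (CAST(event_ts AS DATE))
)
USING DELTA
PARTITIONED BY (event_date)
"""
# In a notebook: spark.sql(ddl)
```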
