- 1118 Views
- 3 replies
- 0 kudos
A Delta table generates new files for every insert or update and also keeps the old version files for versioning and time-travel history. I have 1 TB of data as a Delta table, and every 30 minutes 90 percent of the data gets updated, so the file size will b...
Latest Reply
Hi @vinay kumar​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks...
2 More Replies
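For a table updated this heavily, the usual levers are periodic OPTIMIZE (to compact small files) plus VACUUM (to drop unreferenced old-version files past the retention window). A minimal sketch, with the table name and the 7-day retention window as placeholders; in a notebook you would pass `spark.sql` as the runner:

```python
# Hypothetical maintenance routine for a heavily updated Delta table.
# Table name and retention window are placeholders, not from the thread.
MAINTENANCE_SQL = [
    "OPTIMIZE my_schema.my_table",                 # compact small files
    "VACUUM my_schema.my_table RETAIN 168 HOURS",  # drop old files past 7 days
]

def run_maintenance(run_sql, statements=MAINTENANCE_SQL):
    """Execute each maintenance statement via run_sql (e.g. spark.sql)."""
    return [run_sql(stmt) for stmt in statements]
```

Note that VACUUM trims the files that time travel depends on, so the retention window is a trade-off between storage cost and how far back you can travel.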
- 702 Views
- 2 replies
- 1 kudos
I have around 200 SQL queries I'd like to run in Databricks Python notebooks. I'd like to avoid creating an ETL process for each of the 200 SQL processes. Any suggestions on how to run the queries in a way that loops through them, so I have minimum am...
Latest Reply
Hi @Chris French​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...
1 More Replies
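One way to avoid 200 separate processes is to keep each query in its own .sql file and loop over the directory. A sketch with the SQL runner injected so the loop is testable; in a Databricks notebook you would pass `spark.sql` (the directory layout is an assumption):

```python
from pathlib import Path

def run_query_files(query_dir, run_sql):
    """Read every .sql file in query_dir and execute it via run_sql
    (in a Databricks notebook this would be spark.sql)."""
    results = {}
    for path in sorted(Path(query_dir).glob("*.sql")):
        results[path.stem] = run_sql(path.read_text())
    return results
```

Keeping the queries as files means adding a 201st query is just adding a file, with no notebook changes.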
- 856 Views
- 3 replies
- 0 kudos
There is a business requirement for some of our accounts to have their passwords rotated. This currently requires an admin to go in and manually reset the password for the account via the UI. I wanted to know if there's a more automated way to handle thi...
Latest Reply
Hi @Jordan Gray​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks...
2 More Replies
- 444 Views
- 1 reply
- 2 kudos
I passed the test today for the Lakehouse Fundamentals Accreditation, but have not received the badge yet. Please let me know how and when I can receive the badge for this passed test. Thank you.
Latest Reply
Hi @Chang Su Lee​ Thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly.
by Mado • Valued Contributor II
- 2383 Views
- 3 replies
- 2 kudos
Hi, I want to access the Databricks audit logs to check user activity, for example, the number of times that a table was viewed by a user. I have a few questions in this regard: 1) Where are the log files stored? Are they stored on DBFS? 2) Can I read l...
Latest Reply
Hi @Mohammad Saber​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
2 More Replies
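Audit logs are delivered as JSON files to a storage location you configure when enabling audit-log delivery; they are not on DBFS by default. Once the records are parsed, counting views per user is a plain aggregation. A sketch over already-parsed records; the `actionName`/`userIdentity.email` field names follow the audit-log schema, while the `"getTable"` action name is an assumption for illustration:

```python
def count_table_views(events, action="getTable"):
    """Count occurrences of a given audit action per user.
    `events` are parsed audit-log records (dicts); in Spark you would
    express the same thing as a filter + groupBy over the JSON logs."""
    counts = {}
    for event in events:
        if event.get("actionName") == action:
            user = event.get("userIdentity", {}).get("email", "unknown")
            counts[user] = counts.get(user, 0) + 1
    return counts
```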
- 765 Views
- 2 replies
- 0 kudos
Maybe I'm completely wrong, but from my understanding Delta Lake only computes a table at certain points, for instance when you display your data. Before that point, operations are only written to the log file and are not executed (meaning no chang...
Latest Reply
Hi @Lukas Goldschmied​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you....
1 More Replies
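The laziness here is Spark's, not the Delta log's: transformations only build a plan, and nothing runs (and no Delta commit is written) until an action such as `display()`, `count()`, or a write forces execution. A pure-Python analogy using a generator, which is lazy in the same sense:

```python
calls = []

def tracked_double(x):
    """Record that real work happened, then transform the value."""
    calls.append(x)
    return x * 2

# "Transformation": building the plan does no work, like a lazy Spark map.
plan = (tracked_double(x) for x in [1, 2, 3])
assert calls == []           # nothing has executed yet

# "Action": materializing the result triggers the computation,
# like display(), count(), or a table write in Spark.
result = list(plan)
assert result == [2, 4, 6]
assert calls == [1, 2, 3]    # work happened only at the action
```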
- 6814 Views
- 5 replies
- 0 kudos
This is the error that occurs while processing concurrent merges on Delta Lake tables in Azure Databricks: ConcurrentAppendException: Files were added to the root of the table by a concurrent update. Please try the operation again. What are the o...
Latest Reply
Hi @Abhishek Dutta​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
4 More Replies
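The documented mitigation for ConcurrentAppendException is to make the merge condition explicit about which partition each writer touches, so concurrent merges on disjoint partitions cannot conflict. A sketch that builds such a statement; the table, source, and column names are placeholders:

```python
def partition_scoped_merge(table, source, key, part_col, part_value):
    """Build a MERGE whose ON clause pins one partition value, so
    concurrent merges on other partitions touch disjoint files."""
    return (
        f"MERGE INTO {table} t "
        f"USING {source} s "
        f"ON t.{key} = s.{key} AND t.{part_col} = '{part_value}' "
        f"WHEN MATCHED THEN UPDATE SET * "
        f"WHEN NOT MATCHED THEN INSERT *"
    )
```

This only helps if the table is actually partitioned on `part_col` and each concurrent job writes to a different value; otherwise a retry loop around the merge is the fallback.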
- 1324 Views
- 3 replies
- 1 kudos
Hi, I'm trying to use the Databricks platform to do PyTorch distributed training, but I didn't find any info about this. What I expected is using multiple clusters to run a common job using PyTorch distributed data parallel (DDP) with the code belo...
Latest Reply
With Databricks MLR, HorovodRunner is provided which supports distributed training and inference with PyTorch. Here's an example notebook for your reference: PyTorchDistributedDeepLearningTraining - Databricks.
2 More Replies
- 816 Views
- 3 replies
- 0 kudos
Time travel and version control: can we create custom version control for each day's data load when multiple updates happen in a day? For example, let's assume we are doing multiple operations on a table every minute and want to keep time travel...
Latest Reply
Hi @vinay kumar​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks...
2 More Replies
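Delta creates a version per commit automatically, so an "end of day" view doesn't need custom versioning: timestamp-based time travel already gives it, and DESCRIBE HISTORY lists the commits to pick from. A sketch that builds those queries; the table name is a placeholder:

```python
def end_of_day_query(table, day):
    """Query the table as it looked at the end of the given day,
    using Delta timestamp-based time travel."""
    return f"SELECT * FROM {table} TIMESTAMP AS OF '{day} 23:59:59'"

def history_query(table):
    """List commits (version, timestamp, operation) for the table."""
    return f"DESCRIBE HISTORY {table}"
```

The caveat is retention: VACUUM with the default window removes files older than 7 days, so daily snapshots beyond that need a longer retention setting or a separate archival copy.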
- 1930 Views
- 4 replies
- 2 kudos
Hi, since it is not very well explained, I want to know whether the table history is a snapshot of the whole table at that point in time, containing all the data, or whether it tracks only some metadata of the table changes. To be more precise: if I have a table in...
Latest Reply
Hi @data engineer​ Hope everything is going great.Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so...
3 More Replies
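For context on the question: a Delta version is not a full copy of the table. The transaction log records which data files each commit added or removed, and a version is reconstructed by replaying the log up to that commit, so unchanged files are shared across versions. A toy model of that replay:

```python
def files_at_version(log, version):
    """Replay add/remove entries up to `version` (0-based commits)
    to get the set of data files making up that table version."""
    live = set()
    for commit in log[: version + 1]:
        live |= set(commit.get("add", []))
        live -= set(commit.get("remove", []))
    return live

# Both versions share f1.parquet; only the changed file is new.
log = [
    {"add": ["f1.parquet", "f2.parquet"]},              # version 0
    {"add": ["f3.parquet"], "remove": ["f2.parquet"]},  # version 1
]
```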
- 2048 Views
- 6 replies
- 7 kudos
How can I delete a file in DBFS with an illegal character? Someone put the file named "planejamento_[4098.]___SHORT_SAIA_JEANS__.xlsx" inside the folder /FileStore and I can't delete it because of this error: java.net.URISyntaxException: Illegal character...
Latest Reply
Try this: %sh ls -li /dbfs. If the file is located in a subdirectory, you can change the path mentioned above. The %sh magic command gives you access to Linux shell commands.
5 More Replies
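The reason this works is that the /dbfs FUSE mount lets plain shell or Python file operations bypass the URI parser entirely; `os.remove` treats the filename as an opaque string, so brackets and similar characters are harmless. A sketch (the /dbfs path in the comment is an assumption about where the file sits):

```python
import os

def force_remove(path):
    """Remove a file whose name breaks URI-based APIs; os.remove
    takes the name as a plain string, so [ ] etc. cause no error."""
    if os.path.exists(path):
        os.remove(path)
        return True
    return False

# e.g. force_remove("/dbfs/FileStore/planejamento_[4098.]___SHORT_SAIA_JEANS__.xlsx")
```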
- 1524 Views
- 2 replies
- 0 kudos
Hi all, we have a Databricks instance on Azure with a compute cluster on version 7.3 LTS. Currently the cluster has 4 max workers (min workers: 1) of type Standard_D13_v2 and 1 driver of the same type. There are several jobs that are running on this cl...
Latest Reply
Hi @EDDatabricks EDDatabricks​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear f...
1 More Replies
- 1983 Views
- 5 replies
- 5 kudos
Hi all, I just wanted to know if there is any option to reduce the time while loading a PySpark DataFrame into an Azure Synapse table using Databricks. I have a PySpark DataFrame that has around 40k records and I am trying to load data into the Azure ...
Latest Reply
Hi @Tinendra Kumar​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
4 More Replies
- 3659 Views
- 3 replies
- 3 kudos
Can I run multiple jobs (for example, 100+) in parallel that refer to the same notebook? I supply each job with a different parameter. If we can do this, what would be the impact (for example: reliability, performance, troubleshooting, etc.)? Example: N...
Latest Reply
Hi @Murthy Ramalingam​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you....
2 More Replies
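A common alternative to 100+ separate jobs is one driver notebook that fans out `dbutils.notebook.run` calls with a thread pool, one child run per parameter set. A sketch with the runner injected so the fan-out logic is testable outside Databricks; in a notebook you would pass a wrapper around `dbutils.notebook.run`:

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(run_notebook, param_sets, max_workers=8):
    """Run one notebook once per parameter set, max_workers at a time.
    run_notebook takes a params dict and returns that run's result;
    results come back in the same order as param_sets."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_notebook, param_sets))
```

Capping `max_workers` matters: each child run consumes driver threads and cluster slots, so launching all 100+ at once mainly creates queueing and harder troubleshooting.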