Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Can we use Databricks, or write code in Databricks, without learning PySpark in depth, which is used for ETL and data engineering purposes? Can someone shed some light on this? Currently learning PySpark (basics of Python for handling data) a...
Hi @asma jabeen​ , We haven't heard from you since the last response from @Renaud Mathieu​ and @Pat Sienkiewicz​​, and I was checking back to see if their suggestions helped you. Or else, if you have a solution, please share it with the community, a...
Hi all, Could you please suggest some resources to prepare for the "Databricks Data Engineer Professional" exam? I have also taken the course in Databricks Academy, but it seems not enough for this exam. Thank you so much!!! Best Regards, Nhan Nguyen
I have a notebook that uses a Selenium WebDriver for Chrome, and it works the first time I run the notebook. If I run the notebook again, it does not work and gives the error message: WebDriverException: Message: unknown error: unable to discover op...
Hi, @Dagart Allison​. I've created a new version of the Selenium-with-Databricks manual. Please look here: https://community.databricks.com/s/feed/0D58Y00009SWgVuSAL
Hi @Ranjeeth Rikkala​ , We haven't heard from you since the last response from @Pat Sienkiewicz​ , and I was checking back to see if his suggestions helped you. Or else, if you have a solution, please share it with the community, as it can be helpf...
Navigate and discover content more efficiently with Search in Databricks. Hi all - Justin Kim here, I'm the Databricks product manager responsible for content organization and navigation in our product, which includes Search. Great to see you on the Com...
@Justin Kim​ Thank you for the quick reply. Usually "Last Modified" means recent changes (that can be the last 24 hrs or whatever cap limit we add), whereas "Anytime" should show all Notebooks or Tables from the start. That is where I got confused.
The End of Support (EOS) date sneaked up on us, and we are now wondering if we can delay our upgrade past the EOS date. Could you please help us analyze the risks of operating a DBR version past its EOS date?
Hi @Syed Zaffar​ , Does @Prabakar Ammeappin​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
Two related questions: 1: There have been several mentions in this forum of "Databricks Tunnel", which should allow us to connect from our local IDE to a remote Databricks cluster and develop stuff locally. The rumors said early 2022; is there some...
Hi there @Erik Parmann​ , Does @Youssef Mrini​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks
Our company uses GitLab Enterprise Edition, and we link our repos to Databricks through it. Randomly we will get errors when trying to push the repo, and we have to spend hours debugging, trying to figure out what is causing the push error on Datab...
Hey there @Mark Patrick​ , Hope you are well. Just wanted to see if you were able to find an answer to your question, and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!
I was running a shell script in Databricks using the %sh magic command. I have a requirement where I need to pass parameters/arguments to the script. Is there any way we can get this done with Scala as the base language?
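Since %sh cells cannot read notebook variables directly, one common workaround is to invoke the script from a code cell and pass the arguments explicitly. A minimal sketch in Python (a Scala cell could do the same with `sys.process`); the script body and argument values here are placeholders:

```python
# Sketch: pass arguments to a shell script from a notebook code cell
# instead of a %sh cell. The script reads them as $1, $2, ...
import os
import subprocess
import tempfile

def run_script(script_body: str, *args: str) -> str:
    """Write the script to a temp file, run it with bash and the given
    positional arguments, and return its stdout."""
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(script_body)
        path = f.name
    try:
        result = subprocess.run(
            ["bash", path, *args],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()
    finally:
        os.remove(path)

# Placeholder script and arguments, purely for illustration:
out = run_script('echo "env=$1 date=$2"', "prod", "2023-01-01")
print(out)  # env=prod date=2023-01-01
```

The same idea works for an existing script file on DBFS: skip the temp file and pass its path to `subprocess.run` along with the arguments.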
I'm facing a problem while connecting Databricks with AWS CloudWatch. I want to send certain logs to CloudWatch, but there seems to be a connectivity issue between the two parties.
Hi @Tushar Dua​ , please follow the blog below, which has details on how to monitor Databricks using CloudWatch: How to Monitor Databricks with AWS CloudWatch
I am trying to execute a local PySpark script on a Databricks cluster via the dbx utility, to test how passing arguments to Python works in Databricks when developing locally. However, the test arguments I am passing are not being read for some reason. Co...
You can pass parameters using dbx launch --parameters. If you want to define them in the deployment template, please follow exactly the Databricks API 2.1 schema: https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsCreate (for examp...
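As an illustration only, a spark_python_task with parameters in a dbx deployment file might look roughly like the sketch below. The job name, file path, and parameter values are placeholders, and the exact top-level layout varies between dbx versions, so check the linked Jobs 2.1 schema and your dbx version's docs for the authoritative field names:

```json
{
  "default": {
    "jobs": [
      {
        "name": "my-dbx-job",
        "spark_python_task": {
          "python_file": "file://my_project/entrypoint.py",
          "parameters": ["--env", "dev", "--run-date", "2023-01-01"]
        }
      }
    ]
  }
}
```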
Hello, We are working to migrate to Databricks Runtime 10.4 LTS from 9.1 LTS, but we're running into weird behavioral issues. Our existing code works up until runtime 10.3, and in 10.4 it stopped working. Problem: We have a nested JSON file that we are fl...
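For readers unfamiliar with the operation being described: "flattening" a nested JSON record means turning nested fields into top-level columns. The notebook in question does this with PySpark on a nested schema; the transformation itself can be sketched in plain Python (the sample record and dotted-key naming are illustrative assumptions, not the poster's actual data):

```python
# Sketch: flatten a nested JSON record into a single level of
# dotted keys, the same shape a flattened DataFrame row would have.
import json

def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested dicts into one dict with dotted keys."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))  # recurse into nested objects
        else:
            flat[name] = value
    return flat

record = json.loads('{"id": 1, "user": {"name": "a", "address": {"city": "x"}}}')
print(flatten(record))
# {'id': 1, 'user.name': 'a', 'user.address.city': 'x'}
```

In PySpark the equivalent is typically done by selecting nested columns (e.g. `col("user.address.city")`) or exploding arrays, which is where runtime behavior changes between DBR versions can surface.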