I understand Databricks can send diagnostic/audit logs to Log Analytics in Azure. There is a standard 'DatabricksNotebook' table that provides an audit log for notebook actions. In this table there is an action called 'runCommand', but this does not show...
Hi @arun laksh Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
I do have some legacy pandas code which I want to migrate to Spark to leverage parallelization in Databricks. I see Databricks has launched a wrapper package on top of pandas which uses pandas nomenclature but uses the Spark engine in the backend. I comf...
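A minimal sketch of what that migration looks like, assuming the wrapper package in question is the pandas API on Spark (`pyspark.pandas`, available on Databricks runtimes). The plain-pandas call runs anywhere; the commented lines show the Spark-backed equivalent, which keeps the same nomenclature.

```python
import pandas as pd

# Classic pandas: runs on a single node.
pdf = pd.DataFrame({"x": [1, 2, 3]})
result = pdf["x"].sum()

# Pandas API on Spark keeps the same nomenclature but distributes the work
# (requires a Spark runtime, e.g. a Databricks cluster):
# import pyspark.pandas as ps
# psdf = ps.DataFrame({"x": [1, 2, 3]})
# result = psdf["x"].sum()
```

Because the APIs mirror each other, often only the import line changes, though not every pandas method is implemented on the Spark side.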
Hi @mahesh vardhan gandhi Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from ...
Hi @Paras Gadhiya Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...
Whenever a legitimate alert is triggered, I get a false alert with 0.00 triggered at 12:00 AM the next day. I tried altering the query but it's still the same. I'm not posting examples as the data is not shareable, but I can give an example: if the alert is se...
Hi @justin moorthy Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
Hey everyone, I am creating Databricks Jobs using ADO pipelines. I create the JSON content using Python, and in the release pipeline I call the Databricks CLI create command with the JSON. What I would like to do is that in my CI pipeline, I need t...
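A sketch of the generate-JSON-then-call-CLI flow described above. The job name, task fields, and file name are hypothetical, and the field shape follows the Databricks Jobs JSON layout only loosely, so check it against the Jobs API version your workspace uses.

```python
import json

# Hypothetical job definition assembled in the CI pipeline.
job_settings = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Repos/etl/main"},
        }
    ],
}

# Write the JSON artifact for the release pipeline to pick up.
with open("job.json", "w") as f:
    json.dump(job_settings, f, indent=2)

# The release pipeline can then hand the file to the CLI, e.g. with the
# legacy databricks-cli:
#   databricks jobs create --json-file job.json
```

Keeping the JSON as a pipeline artifact also makes it easy to diff what the CI build produced before the release stage creates the job.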
Hi @Mustafa Akilli Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedba...
Hi, does anyone know how to access Data Explorer in the Community Edition? I would like to have an overview of what files are saved in the FileStore. This is what happens when I select "Data" in the left-hand side menu...
Hi @Konrad Kawka Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback...
Hi, on Feb 27th I attempted the Databricks Certified Data Engineer Associate exam for the first time; unfortunately I ended up with a failing grade. The passing grade was 70%, and I received 64.88%. I am planning to reattempt the exam. Could you kindly give me a...
Hi @Sanmati Mahesh Undodi Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from ...
Hi @Ankit Gangwal Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...
I cleared the certification exam on 26th January 2023, but still haven't received the certificate. I took the exam with a different mail ID, and I'm not receiving any emails from Databricks to that mail ID. Kindly help me resolve the issue.
Hi @Naeemah Khatib Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
I want to read a JSON from an IO variable using PySpark. My code using pandas:

io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)
io.seek(0)
# With pandas
df = pd.read_json(io)

What I tried using PySpark, but it doesn't work:

io = BytesIO()
ftp.retrbinary('...
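One common workaround, sketched under the assumption that the buffer holds valid UTF-8 JSON: decode the `BytesIO` contents to a string and hand that string to `spark.read.json` via an RDD. The Spark lines are commented out because they need a running cluster; the payload below is a stand-in for the FTP download.

```python
import json
from io import BytesIO

# Stand-in for the FTP download; the payload here is hypothetical.
io_buf = BytesIO(b'{"name": "alice", "age": 30}')
io_buf.seek(0)
payload = io_buf.getvalue().decode("utf-8")  # bytes -> str

record = json.loads(payload)  # sanity check that the payload is valid JSON

# On a cluster, spark.read.json accepts an RDD of JSON strings:
# df = spark.read.json(spark.sparkContext.parallelize([payload]))
```

The key point is that `spark.read.json` reads paths or RDDs of strings, not file-like objects, so the in-memory buffer has to be decoded first.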
All I could find in terms of API bindings for Python is https://pypi.org/project/databricks-cli/, and this does not include the Account API and it is also not official. I will just use the OpenAPI spec, but just want to be sure I'm not doing unnecessa...
@Iwan Aucamp : Yes, there are Python API bindings available for the Databricks Account API. For the Databricks Account API with Python, please refer to the Databricks documentation: https://docs.databricks.com/dev-tools/api/latest/accounts.html#python-ap...
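Since the Account API is plain REST, one option while no official binding covers it is to call it with any HTTP client. A sketch with the stdlib, assuming the AWS accounts endpoint and basic auth; the account ID and credentials are placeholders, and the request is built but deliberately not sent.

```python
import base64
from urllib.request import Request

account_id = "1234-abcd"  # placeholder account ID
url = (
    "https://accounts.cloud.databricks.com"
    f"/api/2.0/accounts/{account_id}/workspaces"
)

# Placeholder credentials for basic auth.
token = base64.b64encode(b"admin@example.com:password").decode()
req = Request(url, headers={"Authorization": f"Basic {token}"})

# from urllib.request import urlopen
# response = urlopen(req)  # would perform the actual call
```

Generating a client from the OpenAPI spec, as you suggest, is also reasonable; the hand-rolled approach just avoids a codegen step for a handful of endpoints.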
I have created a dropdown (say B) in my notebook whose input depends on another dropdown (say A). So if I select some value in dropdown A, its corresponding values appear in dropdown B, and I select one of them. Now if I change the value in dropdown A, then...
If the previously selected value of B is not meant to be in the list of values for the newly selected dropdown A value, then you could set a default value (e.g. 'No selection') that the B dropdown should have when first created. In a method to define how ...
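One way to express that logic, with the Databricks widget calls commented out since `dbutils` only exists inside a notebook; the A-to-B mapping and the 'No selection' default here are illustrative.

```python
# Hypothetical mapping from dropdown A values to the choices for dropdown B.
CHOICES_FOR_A = {
    "fruit": ["apple", "banana"],
    "veg": ["carrot", "pea"],
}

def b_choices(a_value, previous_b=None, default="No selection"):
    """Return (options, selected) for dropdown B after A changes.

    The previous B selection is kept only if it is still a valid option;
    otherwise the default is selected."""
    options = [default] + CHOICES_FOR_A[a_value]
    selected = previous_b if previous_b in options else default
    return options, selected

# In a notebook, recreate the B widget whenever A changes:
# dbutils.widgets.remove("B")
# options, selected = b_choices(dbutils.widgets.get("A"))
# dbutils.widgets.dropdown("B", selected, options)
```

Removing and recreating the B widget is the usual pattern, since a dropdown widget's choice list cannot be mutated in place.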
Hello everyone, I'm creating a regex expression to fetch only the numeric value from a string, but some values are negative. I am not able to create the rule to capture the negative value. Can you help me?

from pyspark.sql.functions import regexp_extract
fro...
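A pattern with an optional leading minus sign usually covers this. The sketch below validates it with the stdlib `re` module; the same pattern string can be passed to `regexp_extract`, and the column names in the commented PySpark line are hypothetical.

```python
import re

# Optional '-' before the digits; optional decimal part.
pattern = r"(-?\d+(?:\.\d+)?)"

assert re.search(pattern, "value: -12.5 units").group(1) == "-12.5"
assert re.search(pattern, "value: 7 units").group(1) == "7"

# PySpark equivalent (on a cluster):
# from pyspark.sql.functions import regexp_extract
# df = df.withColumn("num", regexp_extract("raw", pattern, 1))
```

The `-?` makes the sign part of the captured group, so negative values come through with the minus attached.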
I saved the content of an older Databricks Workspace by clicking on the dropdown next to Workspace -> Export -> DBC Archive and saved it on my local machine. In a new Databricks Workspace, I now want to import that DBC archive to restore the previous...
@Sebastian K : It looks like the error you are facing while importing the DBC archive could be due to a version incompatibility between the Databricks instance where you created the DBC archive and the one where you are trying to import it. Can you...