- 6957 Views
- 10 replies
- 9 kudos
Hi, I took the Databricks Certified Associate Developer for Apache Spark 3.0 exam today but missed it by one percent. I got 68.33% and the passing score is 70%. I am planning to reattempt the exam; could you kindly give me another opportunity and provide a reattempt voucher...
Latest Reply
Hi, I took the Databricks Certified Associate Developer for Apache Spark 3.0 Python exam yesterday but missed it by three percent. I got 66.66% and the passing score is 70%. I am planning to reattempt the exam; could you kindly give me another opportunity and provide a reat...
9 More Replies
- 1138 Views
- 1 replies
- 0 kudos
I have registered an account via the AWS Marketplace and have deployed workspaces with Terraform. When I log in to the admin console, it redirects me to https://accounts.cloud.databricks.com/onboarding where I need to create a workspace manually, but I don't want ...
Latest Reply
Hi Team, would you mind telling us how you provisioned it? Are you using the same account ID that you used during creation? If so, could you please try logging in through an incognito window and see if that works?
- 2187 Views
- 3 replies
- 4 kudos
I have a notebook that sets up parameters for the run based on some job parameters set by the user as well as the current date of the run. I want to supersede some of this logic and just use the manual values if kicked off manually. Is there a way to...
Latest Reply
You can create a widget with dbutils.widgets.text("widgetName", "") and read its value back with dbutils.widgets.get("widgetName"). So by using this you can manually create widgets (variables) and run the process with the desired valu...
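A minimal sketch of that approach, assuming the notebook runs on Databricks (where dbutils is available); the widget names, default values, and date fallback below are illustrative placeholders, not part of the original reply:

# Hypothetical widgets; adjust names and defaults to your job parameters
dbutils.widgets.text("run_date", "")            # manual override for the run date
dbutils.widgets.text("force_manual", "false")   # set to "true" when kicking off by hand

run_date = dbutils.widgets.get("run_date")
force_manual = dbutils.widgets.get("force_manual").lower() == "true"

if force_manual and run_date:
    # Manual run: use the value typed into the widget
    effective_date = run_date
else:
    # Scheduled run: fall back to the derived value, e.g. the current date
    from datetime import date
    effective_date = date.today().isoformat()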
2 More Replies
- 3442 Views
- 1 replies
- 2 kudos
I have two variables, StartTimeStmp and EndTimeStmp; I am going to assign the start timestamp based on the last successful job runtime, and EndTimeStmp would be the current system time. SET StartTimeStmp = '2022-03-24 15:40:00.000'; SET EndTimeStmp = '...
Latest Reply
@Srinivas Gannavaram , in Python:

spark.sql(f"""
    SELECT
        CI.CORPORATE_ITEM_INTEGRATION_ID,
        CI.CORPORATE_ITEM_CD
    FROM CORPORATE_ITEM CI  -- table name assumed; the FROM clause is not shown in the original snippet
    WHERE
        CI.DW_CREATE_TS < '{my_timestamp_variable}'
""")
by tonyp • New Contributor II
- 15582 Views
- 1 replies
- 1 kudos
How do I pass Python variables to a shell script in a Databricks notebook? Can Python parameters be passed from the first cmd cell to the next %sh cmd cell?
Latest Reply
I found the answer here: https://stackoverflow.com/questions/54662605/how-to-pass-a-python-variables-to-shell-script-in-azure-databricks-notebook
basically:
%python
import os
l = ['A', 'B', 'C', 'D']
os.environ['LIST'] = ' '.join(l)
print(os.getenv('LIST'))
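The exported variable can then be read in a following %sh cell as an ordinary environment variable; a minimal sketch of that second cell (the loop body is illustrative, not from the original answer):

%sh
# LIST was exported from the Python cell above via os.environ
echo $LIST
for item in $LIST; do
  echo "processing $item"
done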