Is it possible to set environment variables at the notebook level instead of the cluster level? Will they be available in the workers in addition to the driver? Can they override the env variables set at the cluster level?
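For what it's worth, a quick way to test this in a Python notebook (a minimal sketch; MY_FLAG is a hypothetical variable name): os.environ changes made in a notebook apply only to the driver process, so the workers keep the cluster-level environment unless you ship the value to them yourself.

import os

os.environ["MY_FLAG"] = "notebook-value"  # visible on the driver only
print(os.environ.get("MY_FLAG"))          # -> "notebook-value"

# On the workers this returns the cluster-level value (or None):
def read_flag(_):
    import os
    return os.environ.get("MY_FLAG")

print(sc.parallelize([1], numSlices=1).map(read_flag).collect())

# A common workaround is to capture the value in a closure instead:
flag = os.environ.get("MY_FLAG")
print(sc.parallelize([1], numSlices=1).map(lambda _: flag).collect())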
Hi @Martim Lobao, we haven't heard from you since the last response from @Prabakar, and I was checking back to see if his suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be helpful to others. Als...
I'm new to Python and Databricks, so I'm still testing features and am not sure how much of this can be run without Databricks, which I guess requires an AWS or Google Cloud account. Can I do all three stages without AWS Databricks, or how fa...
Hi @Andrew Schell, please don't forget to click on the "Select As Best" button whenever the information provided helps resolve your question. @Hubert Dudek, thank you for your response.
Hi, what is the maximum number of jobs we can execute in an hour for a given workspace? This page mentions 5,000: https://docs.microsoft.com/en-us/azure/databricks/data-engineering/jobs/jobs. The number of jobs a workspace can create in an hour is limited ...
Hi @E H, we haven't heard from you since the last response from @Sivaprasad C S, and I was checking back to see if his suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be helpful to others.
I have created an Azure AD group of the "Microsoft 365" type with its own email address, which is added to the notifications of a Databricks job (on failure). But no mail is sent to the Azure group mailbox when the job fails. I am able to send a d...
Hi @Md Tahseen Anam, just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best? If not, please tell us so we can help you. Thanks!
I've seen many posts here in the Community as potential solutions to this error, but none seem to be a solution for us. We are trying to launch the 14-day free trial of Databricks from the AWS Marketplace and are getting the error below. Moreover, ...
Here are some answers:
copyObject error - we were using a Databricks-provided CloudFormation template, but this error goes away when we use the AWS-provided template.
createWorkspace error - we had subscribed > unsubscribed > resubscribed to Databricks via t...
Hi, currently I'm using Structured Streaming to insert/update/delete rows in a table. A row is deleted if the value in the 'Operation' column is 'deleted'. Everything seems to work fine until there's a new column. Since I don't need the 'Operation' column in the t...
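One way to handle this (a minimal foreachBatch sketch, assuming a Delta target named my_db.target keyed on id; both names are placeholders) is to drop the 'Operation' column from the merge's update/insert sets so it never needs to exist in the target:

from delta.tables import DeltaTable

def upsert_batch(batch_df, batch_id):
    target = DeltaTable.forName(spark, "my_db.target")
    # Map every source column except 'Operation' into the target.
    cols = {c: "s." + c for c in batch_df.columns if c != "Operation"}
    (target.alias("t")
        .merge(batch_df.alias("s"), "t.id = s.id")
        .whenMatchedDelete(condition="s.Operation = 'deleted'")
        .whenMatchedUpdate(set=cols)
        .whenNotMatchedInsert(values=cols)
        .execute())

stream_df.writeStream.foreachBatch(upsert_batch).start()  # stream_df is the incoming stream

Note that genuinely new source columns still require schema evolution on the target (e.g., via spark.databricks.delta.schema.autoMerge.enabled); the sketch only keeps 'Operation' out of the picture.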
I have the following basic script that works fine using PyCharm on my machine.

from pyspark.sql import SparkSession

print("START")

spark = SparkSession \
    .builder \
    .appName("myapp") \
    .master('local[*, 4]') \
    .getOrCreate()

print(spark)

dat...
There does not seem to be a way to log into and view the recent "paid" training sessions from the 2022 Data/AI Summit. I was able to log in and view the videos yesterday, but the currently posted website has no option for logging in or accessing them. Is the...
Hey there @Christopher Warner, just wanted to check in to see if you were able to resolve your issue or if you need more help. We'd love to hear from you. Thanks!
I have a Delta table spark101.airlines (sourced from `/databricks-datasets/airlines/`) partitioned by `Year`. My `spark.sql.shuffle.partitions` is set to the default 200. I run a simple query:

select Origin, count(*)
from spark101.airlines
group by Origi...
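For experimenting with this, a small sketch (the table name comes from the post; 'flights' is just an alias) that shows how the setting maps to the post-aggregation partition count:

spark.conf.set("spark.sql.shuffle.partitions", "200")

df = spark.sql("""
    select Origin, count(*) as flights
    from spark101.airlines
    group by Origin
""")

# With AQE disabled this prints 200; with AQE enabled (the default on
# recent runtimes) the shuffle partitions may be coalesced to far fewer.
print(df.rdd.getNumPartitions())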
Hello all, I've been experiencing the error described below when I try to query a table from Snowflake that is about ~5.5B rows and ~30 columns, and it fails almost systematically; specifically, either the Spark job doesn't even start or I get the ...
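In case it helps while debugging, a minimal sketch of the Snowflake connector read (all connection values, and col_a/col_b/big_table, are placeholders); pushing a narrowing query instead of the full dbtable lets Snowflake prune rows and columns before billions of rows reach Spark:

options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

df = (spark.read.format("snowflake")
      .options(**options)
      .option("query", "select col_a, col_b from big_table where load_date >= '2022-01-01'")
      .load())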
Hey there @hamzatazib96, does @Kaniz Fatma's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
Hello all, we are working on a client requirement to implement suitable data encryption in Azure Databricks. We should be able to encrypt and decrypt the data based on access. We explored the Fernet library, but the client denied it, saying it degr...
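For reference, the Fernet pattern the post mentions usually looks like the sketch below (a minimal sketch; the secret scope/key names and the 'ssn' column are placeholders), wrapping encrypt/decrypt in UDFs so access can be gated on who may call the decrypt side:

from cryptography.fernet import Fernet
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

key = dbutils.secrets.get("my-scope", "fernet-key")  # placeholder scope/key

def encrypt_value(plaintext):
    return Fernet(key.encode()).encrypt(plaintext.encode()).decode()

def decrypt_value(ciphertext):
    return Fernet(key.encode()).decrypt(ciphertext.encode()).decode()

encrypt_udf = udf(encrypt_value, StringType())
decrypt_udf = udf(decrypt_value, StringType())

encrypted_df = df.withColumn("ssn", encrypt_udf("ssn"))  # df is the source DataFrame

Since the client's concern was performance, it's worth noting that plain Python UDFs like these do serialize values row by row; that overhead is likely what they were objecting to.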
Hi @purushotham Chanda, hope all is well! Just wanted to check in to see if you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you...
I recently tried to create my account with Databricks Community Edition. I signed up for it and received the verification email. After that, I had to reset my password, but while doing so I always get the following error. Can someone help me ...
I have a few suggestions for UI improvements on the Databricks console -- or, if anyone has figured out a way (using Greasemonkey or similar scripts) to make some changes to the Databricks UI, I would like to know.

#1 - Workspace Navigation
Can we have...
Great ideas. I know that regarding #1, a new file manager is in development. #3 I also proposed when we discussed possible improvements. @Lindsay Olson @Jose Gonzalez @Prabakar Ammeappin, maybe we can push these as user feedback, as they are great ideas with...
We have a couple of sources we'd already set up to stream to prod using a third-party system. Is there a way to sync these directly to our dev workspace to build pipelines? E.g., directly connecting to a cluster in prod and pulling with a job cluster, dumping to S3, and u...
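If the S3 hand-off route is acceptable, a hedged sketch of the dev-side pickup with Auto Loader (bucket paths and the table name are placeholders):

df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", "s3://dev-bucket/_schemas/source_a")
      .load("s3://shared-bucket/prod-export/source_a/"))

(df.writeStream
   .option("checkpointLocation", "s3://dev-bucket/_checkpoints/source_a")
   .toTable("dev_db.source_a_bronze"))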
Hi @Erik Louie, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if his suggestions helped you. Otherwise, if you have any solution, please share it with the community as it can be helpful to oth...
So, I have a super simple left join from one table to another; its purpose is to retrieve a customer's date of birth by joining the customer ID FK in the transaction table to the customer ID PK in the customer table. A customer will have several transac...
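For reference, the join described would look roughly like this (a sketch with assumed table and column names); if the customer table can hold more than one row per customer ID, deduplicating it first prevents the transaction rows from being multiplied:

result = spark.sql("""
    select t.*, c.date_of_birth
    from transactions t
    left join (
        select distinct customer_id, date_of_birth
        from customers
    ) c
      on t.customer_id = c.customer_id
""")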
Hi @Faye Hughes, thank you so much for getting back to us. It's really great of you to send in the solution and mark the answer as best. We really appreciate your time. Wishing you a great Databricks journey ahead!