Resolved! Change Databricks Community User Name
How do I change my Databricks Community user name?
- 2501 Views
- 7 replies
- 4 kudos
Hi Sujitha, thanks for the quick help! I cleared the caches, and it works now.
KB Feedback Discussion: In addition to the Databricks Community, we have a Support team that maintains a Knowledge Base (KB). The KB contains answers to common questions about Databricks, as well as information on optimisation and troubleshooting. These...
Dear @Werner Stinckens and @Tyler Retzlaff, we would like to express our gratitude for your participation and dedication in the Databricks Community last week. Your interactions with customers have been valuable, and we truly appreciate the time...
Can a member create a new group in the Databricks Community? If yes, can anyone let me know the steps, or where I can create a new group?
Any updates on this topic? I'd like to create a group for my company as well.
Presenting the top 3 members who contributed to the Community last week (11th June to 17th June): @Tyler Heflin, @Werner Stinckens, and @Bharathan K. We would like to express our gratitude for your participation and dedication in the Databricks Commun...
Wow!!! Exciting metrics, @Werner Stinckens, @Tyler Heflin, and @Bharathan K! Congratulations!!!
Hello Databricks Community, I am seeking assistance in understanding the possibility and procedure of implementing a workflow restriction mechanism in Databricks. Our aim is to promote better workflow management and ensure the quality of the notebooks ...
Hello Nistrate, if I understand the question correctly, the ask is to create an approval framework for changes/commits to workflows and jobs. I don't believe this is currently supported natively; however, it can be achieved through the use of source control...
Dear Community, get ready to mark your calendars for the upcoming Databricks Community Social event! Happening on June 16th, 2023, this event promises to be the ultimate monthly gathering for everyone in the Databricks Community. Join us for an hour ...
In the spirit of the Holiday season, share a picture of the reward(s) you received from the Databricks Community Rewards Store below!
Your t-shirt is super cool and awesome!
Weekly Raffle to Win a Ticket to Data + AI Summit 2023. NO PURCHASE NECESSARY TO ENTER OR WIN. A PURCHASE OF ANY KIND WILL NOT INCREASE YOUR CHANCES OF WINNING. VOID WHERE PROHIBITED. We are giving away one ticket to Data + AI Summit 2023 every week ...
@Fjoraldo Mamutaj, @Aviral Bhardwaj, and @Shubham Soni: we have emailed you about some clarifications on your week 1 participation. Please reply to my email.
I enrolled through the Partner Academy for Advanced Data Engineering with Databricks (North America), an instructor-led course. I just received a message saying my registration for Advanced Data Engineering with Databricks (North America) has be...
Adding @Vidula Khanna and @Kaniz Fatma for visibility.
I am currently doing the Scalable Machine Learning course, and I observed that the menu options available in the videos are a bit different from what I have in my edition. Is it because I am using the Community Edition, or is there some setting to make ...
Hi @Pradeep Gadkari, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answe...
I was following the tutorial about data transformation with Azure Databricks, and it says that before loading data into Azure Synapse Analytics, the data transformed by Azure Databricks would first be saved to temporary storage in Azure Blob Storage before loa...
@Ajay Pandey, saving the transformed data to temporary storage in Azure Blob Storage before loading it into Azure Synapse Analytics provides a number of benefits, ensuring that the data is accurate, optimized, and performs well in the target environmen...
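For illustration, the staged write the tutorial describes can be sketched with the Databricks Synapse connector. This is a hedged sketch, not the tutorial's exact code: the JDBC URL, storage account, container, and table name are placeholder assumptions, and it only runs on a Databricks cluster with access to the storage account.

```python
# Sketch only: assumes a Databricks notebook where `spark` and a transformed
# DataFrame `df` already exist; all names below are illustrative placeholders.
(
    df.write
    .format("com.databricks.spark.sqldw")  # Azure Synapse connector
    .option("url", "jdbc:sqlserver://<server>.database.windows.net;database=<db>")
    # The connector stages data here (the "temp storage" the tutorial mentions)
    # before Synapse bulk-loads it from that staging area.
    .option("tempDir", "abfss://<container>@<account>.dfs.core.windows.net/tmp")
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.sales_transformed")
    .mode("append")
    .save()
)
```

The `tempDir` option is what makes the intermediate Blob/ADLS staging step explicit: Synapse's bulk loader reads the staged files in parallel instead of pulling rows over a single JDBC connection.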
Hi Databricks Community, I want to set environment variables for all clusters in my workspace. The goal is to have environment-specific (dev, prod) variable values. Instead of setting the environment variables for each cluster, a global script ...
We have set the env variable in a global init script as below:
sudo echo DATAENV=DEV >> /etc/environment
and we try to access the variable in a notebook that runs in "Shared" cluster mode:
import os
print(os.getenv("DATAENV"))
But the env variable is not a...
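A likely cause (an assumption based on common reports, not confirmed in this thread) is that writing to /etc/environment from an init script does not reliably propagate to the processes that run notebook code, especially on shared clusters; setting the variable in the cluster configuration (Advanced options > Spark > Environment variables) is the documented path. Either way, reading the variable defensively avoids a surprise None at runtime. The helper name and the "dev" default below are illustrative, not from the thread:

```python
import os

def current_data_env(default="dev"):
    """Return DATAENV if the cluster exposes it, else a safe default.

    "DATAENV" is the variable name used in the thread above; the default
    value "dev" is an assumed fallback, not something Databricks provides.
    """
    return os.getenv("DATAENV", default)

print(current_data_env())  # prints "dev" when DATAENV is unset
```

A defensive read like this also makes the dev/prod switch testable locally, since the notebook code no longer assumes the cluster was configured correctly.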
Hi Databricks Community, I have searched quite a while through the internet but did not find an answer. If I have configured the Azure Data Lake connection in Unity Catalog, is it possible to grant users access to a specific file or a fold...
As @werners said, the service principal needs to have access at the file level. In Unity Catalog, you can use the "READ FILES"/"WRITE FILES" permissions to give someone the ability to read files from the storage level (but through Databricks).
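As a hedged illustration of the permissions mentioned above (the external location and group names are assumptions, not from the thread), the grants can be issued from a notebook. Note that these grants apply to the whole path covered by the external location, so folder-level control would mean defining the external location on that folder's path:

```python
# Sketch only: assumes a Databricks notebook with `spark` available and a
# Unity Catalog external location already defined; all names are illustrative.
spark.sql(
    "GRANT READ FILES ON EXTERNAL LOCATION `my_datalake_location` TO `data_readers`"
)
# WRITE FILES works the same way for write access through Databricks:
spark.sql(
    "GRANT WRITE FILES ON EXTERNAL LOCATION `my_datalake_location` TO `data_writers`"
)
```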
I signed up for a trial version of Databricks and wanted to configure it with AWS, but after login it has been showing a blank screen for 20 hours. Can someone help me with this? Note: I strictly have to use AWS with Databricks for the configuration.
Try reaching out to your account manager.