Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

CraiMacl_23588
by New Contributor
  • 813 Views
  • 0 replies
  • 0 kudos

Init scripts in legacy workspace (pre-E2)

Hello, I've got a legacy workspace (not E2) and I am trying to move my cluster-scoped init script to the workspace area (from DBFS). It doesn't seem to be possible to store a shell script in the workspace area (Accepted formats: .dbc, .scala, .py, .sq...
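For context, a cluster-scoped init script is referenced from the cluster configuration rather than stored as a notebook. A minimal, hypothetical sketch of the relevant cluster-spec fragment (all names and paths are placeholders; workspace-file destinations are generally only available on E2 workspaces and newer runtimes) might look like this:

```python
# Hypothetical Clusters API 2.0 spec fragment showing where a cluster-scoped
# init script is referenced. All names and paths are placeholders.
cluster_spec = {
    "cluster_name": "example-cluster",
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    "init_scripts": [
        # Legacy (pre-E2) workspaces typically reference DBFS-hosted scripts:
        {"dbfs": {"destination": "dbfs:/databricks/init/my-init.sh"}},
        # Workspace-file destinations are the newer option, but may not be
        # supported on a legacy workspace:
        # {"workspace": {"destination": "/Users/someone@example.com/my-init.sh"}},
    ],
}
```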

Phani1
by Valued Contributor II
  • 2970 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks SQL warehouse best practices

How can we best design a Databricks SQL warehouse for multiple environments and multiple data marts? Are there any best practices or guidelines?

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Phani1  We haven't heard from you since the last response from @Vinay_M_R, and I was checking back to see if her suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others. Also...

1 More Replies
Rijuta
by New Contributor II
  • 2365 Views
  • 2 replies
  • 3 kudos

Amazing Women in Data AI session at Data AI Summit 2023

Great first-time experience attending the keynotes, sessions, trainings, and certifications. It was a great opportunity to connect with like-minded individuals and the Women in Data AI panel, and to learn about the community.

Latest Reply
BarbaraCastee
New Contributor II
  • 3 kudos

I agree with you 100%. I really like it.

1 More Replies
User16845049068
by Databricks Employee
  • 962 Views
  • 0 replies
  • 0 kudos

AWS S3 Bucket Access from Unity Catalog?

Asking for a new logo customer… Let's say my Unity Catalog is in account A of AWS. The buckets that I need to access are in account B of AWS. Unity Catalog is unable to create an external location based on this bucket even though all the necessary...
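As a point of reference only (not from the thread): a cross-account external location still goes through a storage credential whose IAM role is trusted by account B's bucket policy. A minimal, hypothetical sketch of the SQL run from a notebook, with placeholder names, could look like this:

```python
# Hypothetical sketch with placeholder names. It assumes a storage credential
# (`cross_account_cred`) already exists and wraps an IAM role that account B's
# bucket policy and the role's trust policy both allow.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS account_b_landing
  URL 's3://example-bucket-in-account-b/landing/'
  WITH (STORAGE CREDENTIAL cross_account_cred)
  COMMENT 'Bucket owned by AWS account B'
""")

# Quick check that the location is actually readable:
display(dbutils.fs.ls("s3://example-bucket-in-account-b/landing/"))
```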

cberho
by New Contributor II
  • 1459 Views
  • 0 replies
  • 1 kudos

Card billing issues

Does anybody know if this is an ongoing issue (see screenshot below)? Trying to do business here, but not able to get through. We tried two different cards on different networks.

Screenshot 2023-07-21 at 10.35.50.png
Ninad
by New Contributor
  • 2355 Views
  • 3 replies
  • 1 kudos

Databricks on AWS

I want to host Databricks on AWS. I want to know: if we create Databricks on top of AWS, will it be created in the same account's VPC, or will it be created outside of my AWS account? If it is going to be created in my account, will it create a new VPC for me? T...

Latest Reply
Siebert_Looije
Contributor
  • 1 kudos

Hi, if you want to know more about how to properly set up Databricks on top of AWS, I would really recommend taking the Databricks AWS platform administrator course. It explains everything you need to know. Hope this helps. Kin...

2 More Replies
zbowden2010
by New Contributor II
  • 2795 Views
  • 1 reply
  • 0 kudos

503 Error from Databricks when Cluster Inactive/Starting Up via Alteryx

Hello, I have been connecting to Databricks via Alteryx. It works fine when our cluster is active, but it returns a 503 Service Unavailable error if the cluster is inactive/starting up. I have previously reached out to Alteryx, but they have told me this...

Latest Reply
zbowden2010
New Contributor II
  • 0 kudos

I should have mentioned in the original post that we are using Microsoft Azure and the Simba Spark ODBC driver.
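One possible workaround (not suggested in the thread, and only a sketch) is to start and poll the cluster through the Clusters API before the ODBC connection is attempted, so the 503 window during startup is avoided. Host, token, and cluster ID below are placeholders:

```python
import time
import requests

# Placeholder values; a real setup would read these from a secret store.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapiXXXXXXXXXXXXXXXX"
CLUSTER_ID = "0123-456789-abcdefgh"
headers = {"Authorization": f"Bearer {TOKEN}"}

# Ask the workspace to start the cluster (starting an already-running
# cluster returns an error, which can be ignored for this sketch).
requests.post(f"{HOST}/api/2.0/clusters/start",
              headers=headers, json={"cluster_id": CLUSTER_ID})

# Poll until the cluster reports RUNNING, then hand off to the ODBC client.
while True:
    state = requests.get(f"{HOST}/api/2.0/clusters/get",
                         headers=headers,
                         params={"cluster_id": CLUSTER_ID}).json()["state"]
    if state == "RUNNING":
        break
    time.sleep(30)
```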

rohit-1989
by New Contributor
  • 1191 Views
  • 0 replies
  • 0 kudos

How to access ADLS Gen2 HDFS from a Databricks cluster which has credential passthrough enabled?

When executing through a Databricks cluster with credential passthrough enabled, I wish to obtain supplementary file attributes in ADLS, such as the file's last modified time, which are currently unavailable in the Databricks dbutils.fs.ls function. W...

Get Started Discussions
credential-passthrough
Databricks
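For readers with a similar need, one common approach (a sketch only; behaviour under credential passthrough should be verified, since passthrough clusters restrict some direct JVM access) is to go through the Hadoop FileSystem API, which exposes modification time. The path below is a placeholder:

```python
# Sketch only: list ADLS Gen2 files with their modification times via the
# Hadoop FileSystem API. Container, account, and path are placeholders.
path = "abfss://mycontainer@myaccount.dfs.core.windows.net/some/dir"

jvm_path = spark._jvm.org.apache.hadoop.fs.Path(path)
fs = jvm_path.getFileSystem(spark._jsc.hadoopConfiguration())

for status in fs.listStatus(jvm_path):
    print(status.getPath().toString(),
          status.getLen(),
          status.getModificationTime())  # epoch milliseconds
```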
KVNARK
by Honored Contributor II
  • 3611 Views
  • 1 reply
  • 1 kudos

No points shown on the new Databricks Community page

There are no points displayed on the new Databricks Community page. Is it the same for everyone, or only for me because I have done something wrong?

Latest Reply
KaKa
Contributor
  • 1 kudos

I have the same concern. On my account, I also could not find where my points are displayed.

Sujitha
by Databricks Employee
  • 21613 Views
  • 19 replies
  • 49 kudos

The updated Databricks Community welcomes you!

To ensure we continue to evolve and mature to deliver greater value to you, we are happy to unveil this revamped Databricks Community experience and platform with an improved user interface, content and discussion categories organised based on areas ...

Screenshot 2023-06-26 at 1.42.56 PM.png
Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 49 kudos

Does anyone know how to change the email address in the new community?

18 More Replies
yogu
by Honored Contributor III
  • 4877 Views
  • 1 reply
  • 4 kudos

Resolved! Raffle contest swag received

Hello everyone, today I received the DAIS swag. Thank you Databricks for providing such nice swag @Retired_mod @Sujitha

yogu_0-1689846314794.png
Latest Reply
FabriceDeseyn
Contributor
  • 4 kudos

Jealous! Need to get me some merch too

Obulreddy
by New Contributor
  • 4534 Views
  • 3 replies
  • 1 kudos

Unable to access S3 objects from Databricks using IAM access keys in both AWS and Azure Databricks

Hi Team, we are trying to connect to an Amazon S3 bucket from Databricks running on both AWS and Azure, using IAM access keys directly through Scala code in a notebook, and we are facing com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden; with stat...
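For reference only (a generic sketch shown in Python rather than Scala, and not a diagnosis of this particular 403): access-key based reads typically set the fs.s3a credentials on the cluster's Hadoop configuration before the read, ideally pulled from a secret scope. Scope, key, and bucket names below are placeholders:

```python
# Generic sketch with placeholder scope/key/bucket names: wire IAM access
# keys into the Hadoop S3A configuration, then read.
access_key = dbutils.secrets.get(scope="aws", key="s3-access-key")
secret_key = dbutils.secrets.get(scope="aws", key="s3-secret-key")

hconf = spark._jsc.hadoopConfiguration()
hconf.set("fs.s3a.access.key", access_key)
hconf.set("fs.s3a.secret.key", secret_key)

# A 403 Forbidden at this point usually means the keys are valid but the IAM
# policy does not grant s3:GetObject / s3:ListBucket on this bucket.
df = spark.read.json("s3a://example-bucket/path/")
display(df)
```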

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Obulreddy  We haven't heard from you since the last response from @KaKa, and I was checking back to see if her suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others. Also,...

2 More Replies
Mado
by Valued Contributor II
  • 25779 Views
  • 5 replies
  • 5 kudos

What are the best practices for spark DataFrame caching?

Hi, when caching a DataFrame, I always use "df.cache().count()". However, in this reference, it is suggested to save the cached DataFrame into a new variable: when you cache a DataFrame, create a new variable for it, cachedDF = df.cache(). This will allow...

Get Started Discussions
best practice
cache
Dataframe
Latest Reply
Lakshay
Databricks Employee
  • 5 kudos

In addition to the other comments, I will just add: make sure you cache only when necessary, i.e. if you need to keep a DataFrame around to be referenced later in the code, then you should consider caching it. But if your code ha...
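A minimal sketch of the pattern discussed above: keeping the cached DataFrame in its own variable makes it easy to materialize it once and release it explicitly when it is no longer needed.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(1_000_000).withColumnRenamed("id", "value")  # placeholder data

cached_df = df.cache()   # mark the DataFrame for caching
cached_df.count()        # an action, which actually materializes the cache

# ... reuse cached_df in several downstream transformations/actions ...

cached_df.unpersist()    # free the cached blocks once they are no longer needed
```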

4 More Replies
KVNARK
by Honored Contributor II
  • 2678 Views
  • 0 replies
  • 2 kudos

Databricks Community rewards portal is down

When can we expect the Databricks Community rewards portal to be up and running? The page shows a warning message that the website is under construction. I have attached a screenshot of the message for your reference. Kindly resolve the issue and reload the ...

KVNARK_0-1689834615331.png
