Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

pranav5
by New Contributor II
  • 1168 Views
  • 3 replies
  • 0 kudos

Unable to add a microsoft security group as Workspace Admin

I'm a workspace admin for a Databricks workspace. I can add a Microsoft security group in the workspace. When I click on the group to view it, I can see its members, which match Azure AD correctly, but it throws an error on the ...

Latest Reply
lingareddy_Alva
Honored Contributor III
  • 0 kudos

@pranav5 This issue usually occurs because of how Databricks handles group provisioning via SCIM, especially with external groups from Azure AD. SCIM 404 Error: This generally means Databricks cannot find a matching SCIM identity for the Azure AD grou...

2 More Replies
Sarathk
by New Contributor
  • 2982 Views
  • 1 reply
  • 0 kudos

Databricks is not mounting with the storage account, giving a java.lang exception, error 480

Hi everyone, I am currently facing an issue in our Test environment where Databricks is not able to mount the storage account. We are using the same mount in our other environments (Dev, Preprod, and Prod) and it works fine there witho...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Checking.

adrianojsoares1
by New Contributor
  • 1049 Views
  • 2 replies
  • 0 kudos

JDBC Driver cannot connect when using TokenCachePassPhrase property

Hello all, I'm looking for suggestions on enabling the token cache when using browser-based SSO login. I'm following the instructions found here: Databricks-JDBC-Driver-Install-and-Configuration-Guide. For my users, I would like to enable the token ca...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

The error encountered (Cannot invoke "java.nio.file.attribute.AclFileAttributeView.setAcl(...)" because "<local6>" is null) might indicate permission or file-system issues where the token cache store is being accessed. When EnableTokenCache=0, the to...
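As a reference point, here is a minimal sketch of how such a connection URL might be assembled with the token cache disabled. This is an illustration, not official driver syntax: the host and HTTP path are placeholders, and the property names (AuthMech, Auth_Flow, EnableTokenCache, TokenCachePassPhrase) should be verified against the install guide for your driver version.

```python
# Sketch of a Databricks JDBC URL for browser-based SSO (AuthMech=11,
# Auth_Flow=2). Host and HTTPPath are placeholders; verify property names
# against your JDBC driver's configuration guide.

def build_jdbc_url(host, http_path, enable_token_cache=True, passphrase=None):
    props = {
        "transportMode": "http",
        "ssl": "1",
        "httpPath": http_path,
        "AuthMech": "11",        # OAuth 2.0
        "Auth_Flow": "2",        # browser-based SSO
        "EnableTokenCache": "1" if enable_token_cache else "0",
    }
    if passphrase is not None:
        props["TokenCachePassPhrase"] = passphrase
    return f"jdbc:databricks://{host}:443;" + ";".join(
        f"{k}={v}" for k, v in props.items()
    )

# Disabling the cache entirely sidesteps the ACL error, at the cost of
# re-prompting the browser login each time:
url = build_jdbc_url("example.cloud.databricks.com",
                     "/sql/1.0/warehouses/abc",  # placeholder HTTP path
                     enable_token_cache=False)
```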

1 More Replies
crvkumar
by New Contributor II
  • 4829 Views
  • 3 replies
  • 1 kudos

Databricks Notebook says "Connecting.." for some users

For some users, after clicking on a notebook the screen says "connecting..." and the notebook does not open. The users are on the Chrome browser, and the same happens with Edge as well. What could be the reason?

Latest Reply
AakashYadav
New Contributor II
  • 1 kudos

I am facing the same issue. It keeps saying it is opening the notebook, and even when it does open and connect to the cluster, it then times out.

2 More Replies
mithileshtiwar1
by New Contributor II
  • 2962 Views
  • 8 replies
  • 1 kudos

Notebook Detached Error: exception when creating execution context: java.net.SocketTimeoutException:

Hello Community, I have been facing this issue since yesterday. After attaching the cluster to a notebook and running a cell, I get the following error in the Community Edition of Databricks: Notebook detached: exception when creating execution cont...

Latest Reply
Shweta0623
New Contributor II
  • 1 kudos

Hi all, I am also facing this issue. If anyone knows how to resolve it, please post the solution here.

7 More Replies
mrstevegross
by Contributor III
  • 2362 Views
  • 7 replies
  • 0 kudos

preloaded_docker_images: how do they work?

At my org, when we start a Databricks cluster, it often takes a while to become available (due to (1) instance provisioning, (2) library loading, and (3) init script execution). I'm exploring whether an instance pool could be a viable strategy for im...

Latest Reply
naha3456
New Contributor II
  • 0 kudos

Hello, when we specify a Docker image with credentials in the instance pool configuration, should we also specify credentials in the cluster configuration, given that the image is already pulled onto the pool instance?
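For context, a sketch of what a pool definition with a preloaded image might look like. Field names follow the Instance Pools API, but the pool name, node type, registry URL, and credentials are all placeholders; and, as I understand it, clusters attaching to the pool still declare their own docker_image (with credentials if the registry is private), since the pool setting only pre-pulls the image onto idle instances.

```python
# Hypothetical payload for POST /api/2.0/instance-pools/create showing
# preloaded_docker_images. All names, URLs, and credentials below are
# placeholders; check field names against the Instance Pools API docs.
pool_payload = {
    "instance_pool_name": "prewarmed-pool",            # hypothetical name
    "node_type_id": "i3.xlarge",                       # placeholder node type
    "min_idle_instances": 2,
    "preloaded_docker_images": [
        {
            "url": "myregistry.example.com/base:1.0",  # placeholder image
            "basic_auth": {                            # only for private registries
                "username": "user",
                "password": "token",
            },
        }
    ],
}
```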

6 More Replies
jimbender
by New Contributor II
  • 3496 Views
  • 2 replies
  • 0 kudos

Is there an automated way to strip notebook outputs prior to pushing to github?

We have a team that works in Azure Databricks on notebooks. We are not allowed to push any data to GitHub per corporate policy. Instead of everyone having to always remember to clear their notebook outputs prior to commit and push, is there a way this ...

Latest Reply
brycejune
New Contributor III
  • 0 kudos

Hi, pushing data to GitHub isn't allowed, but clearing notebook outputs before internal version control is still important. You can automate this process by using a pre-commit hook or a script within your internal CI/CD pipeline (if one exists). Tools like...
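As one illustration of such a hook, here is a minimal sketch that strips outputs from .ipynb files before commit. It assumes standard Jupyter notebook JSON; dedicated tools such as nbstripout handle edge cases far more robustly.

```python
# Minimal output-stripping step, e.g. called from a pre-commit hook on
# every staged .ipynb file. Assumes standard Jupyter notebook JSON.
import json

def strip_outputs(notebook: dict) -> dict:
    """Remove outputs and execution counts from a parsed .ipynb document."""
    for cell in notebook.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return notebook

def strip_file(path: str) -> None:
    """Strip a notebook file in place."""
    with open(path, encoding="utf-8") as f:
        nb = json.load(f)
    with open(path, "w", encoding="utf-8") as f:
        json.dump(strip_outputs(nb), f, indent=1)
```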

1 More Replies
j_h_robinson
by New Contributor II
  • 1648 Views
  • 3 replies
  • 1 kudos

Replacing Excel with Databricks

I have a client that currently uses a lot of Excel with VBA and advanced calculations. Their source data is often stored in SQL Server. I am trying to make the case to move to Databricks. What's a good way to make that case? What are some advantages t...

Latest Reply
BigAlThePal
New Contributor III
  • 1 kudos

To add to this, my team and I have been using Databricks in an enterprise environment to replace Excel-based calculations relying on SQL-stored data with calculations served as model serving endpoints (APIs). The initial 'translation' work can be tedi...

2 More Replies
BigAlThePal
by New Contributor III
  • 832 Views
  • 2 replies
  • 1 kudos

Resolved! Search page to search code inside .py files

Hello, hope you are doing well. On the search page, it seems it's not searching the code inside .py files, only the filenames. Is there an option somewhere I'm missing that would let me search inside .py files? Best, Alan

Latest Reply
BigAlThePal
New Contributor III
  • 1 kudos

Hello, so it seems Databricks does not allow it - an easy workaround for us is to search directly on our Azure DevOps Repos.
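If a local clone (e.g. of the Azure DevOps repo) is available, the same workaround can be scripted; a minimal sketch, with the root path supplied by the caller:

```python
# Workaround sketch: search code inside .py files in a local clone of the
# repo, since the workspace search matches filenames only.
from pathlib import Path

def search_py(root: str, needle: str):
    """Yield (file, line_no, line) for every match in .py files under root."""
    for path in Path(root).rglob("*.py"):
        for no, line in enumerate(
            path.read_text(encoding="utf-8").splitlines(), start=1
        ):
            if needle in line:
                yield str(path), no, line
```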

1 More Replies
Kishore23
by New Contributor
  • 1042 Views
  • 1 reply
  • 0 kudos

Community Edition cluster detach: java.util.TimeoutException

Hi folks, I was exploring the Databricks Community Edition and came across a cluster issue, mostly because of the JDBC driver's java.util.TimeoutException. Basically the cluster connects and executes for 15 seconds or so, which is a socket limit, and disables any...

Latest Reply
dale65a
New Contributor II
  • 0 kudos

@Kishore23 wrote: Hi folks i was exploring the databricks community edition and came across a cluster issue mostly because of jdbc driver java.util.timeoutexception . basically the cluster connects and executes for 15 sec or so which is a so...

scharly3
by New Contributor II
  • 50301 Views
  • 11 replies
  • 1 kudos

Error: Folder xxxx@xxx.com is protected

Hello, on Azure Databricks I'm trying to remove a folder under the Repos folder using the following command: databricks workspace delete "/Repos/xxx@xx.com". I got the following error message: databricks workspace delete "/Repos/xxxx@xx.com" Error: Folder ...

Latest Reply
winstonharder
New Contributor II
  • 1 kudos

Hello Databricks Forums, when you see the Azure Databricks error message "Folder xxxx@xxx.com is protected," it means that you are attempting to remove a system-protected folder, which is usually connected to a user's workspace, particularly under the...

10 More Replies
kweks970
by New Contributor
  • 626 Views
  • 1 reply
  • 0 kudos

dev and prod

"SELECT * FROM' data call on my table in PROD is giving all the rows of data, but a call on my table in DEV is giving me just one row of data. what could be the problem??

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Tell us more about your environment.  Are you using Unity Catalog? What is the table format? What cloud platform are you on?  More information is needed.

htd350
by New Contributor II
  • 2382 Views
  • 3 replies
  • 2 kudos

Resolved! Cluster by auto pyspark

I can find documentation for enabling automatic liquid clustering with SQL code: CLUSTER BY AUTO. But how do I do this with PySpark? I know I can do it with spark.sql("ALTER TABLE CLUSTER BY AUTO"), but ideally I want to pass it as an .option(). Thanks in...

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Not at the moment. You have to use the SQL DDL commands, either at table creation or via an ALTER TABLE command. Hope this helps, Louis.
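A minimal sketch of the SQL-DDL route from PySpark, since no writer .option() exists for this today (the table name below is a placeholder):

```python
# Enable automatic liquid clustering from PySpark via SQL DDL, since no
# DataFrame writer .option() exists for it today.

def cluster_by_auto_sql(table: str) -> str:
    """Return the ALTER TABLE statement enabling automatic clustering."""
    return f"ALTER TABLE {table} CLUSTER BY AUTO"

# In a notebook or job you would then run (table name is a placeholder):
# spark.sql(cluster_by_auto_sql("main.sales.orders"))
```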

2 More Replies
nachii_rajput
by New Contributor
  • 2502 Views
  • 0 replies
  • 0 kudos

Issue with Disabled "Repair DAG", "Repair All DAGs" Buttons in Airflow UI, functionality is working.

We are encountering an issue in the Airflow UI where the 'Repair DAG' and 'Repair All DAGs' options are disabled when a specific task fails. While the repair functionality itself is working properly (i.e., the DAGs can still be repaired through execu...

charliemerrell
by New Contributor
  • 639 Views
  • 2 replies
  • 0 kudos

Will auto loader read files if it doesn't need to?

I want to run Auto Loader on some very large JSON files. I don't actually care about the data inside the files, just the file paths of the blobs. If I do something like

```
spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.fo...
```

Latest Reply
lingareddy_Alva
Honored Contributor III
  • 0 kudos

Hi @charliemerrell Yes, Databricks will still open and parse the JSON files, even if you're only selecting _metadata. It must infer the schema and perform basic parsing unless you explicitly avoid it. So even if you do .select("_metadata"), it doesn't skip...
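A sketch of the reader under discussion, assuming a Databricks runtime (where spark, Auto Loader, and the _metadata column are available); the paths below are placeholders. Supplying an explicit minimal schema at least avoids the schema-inference pass, though the files themselves are still opened and parsed:

```python
# Auto Loader sketch: stream only file paths from large JSON blobs.
# Assumes a Databricks runtime; paths are placeholders.

AUTOLOADER_OPTIONS = {
    "cloudFiles.format": "json",
    "cloudFiles.schemaLocation": "/tmp/_schemas/paths",  # placeholder path
}

def file_path_stream(spark, source_path, options=AUTOLOADER_OPTIONS):
    """Return a streaming DataFrame exposing only the source file paths."""
    reader = spark.readStream.format("cloudFiles")
    for key, value in options.items():
        reader = reader.option(key, value)
    # Explicit one-column schema skips schema inference -- but not parsing;
    # each JSON file is still opened and read.
    return (reader.schema("value STRING")
                  .load(source_path)
                  .select("_metadata.file_path"))
```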

1 More Replies
