Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

mithileshtiwar1
by Visitor
  • 220 Views
  • 4 replies
  • 1 kudos

Notebook Detached Error: exception when creating execution context: java.net.SocketTimeoutException:

Hello Community, I have been facing this issue since yesterday. After attaching the cluster to a notebook and running a cell, I get the following error in the Community Edition of Databricks: Notebook detached: exception when creating execution cont...

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Hello All, There is a similar thread where another user encountered the same issue and shared a solution that worked for them. I suggest reviewing that thread to see if the solution is helpful in your case as well.

  • 1 kudos
3 More Replies
Kishore23
by New Contributor
  • 100 Views
  • 1 replies
  • 0 kudos

Community Edition cluster detach: java.util.TimeoutException

Hi folks, I was exploring the Databricks Community Edition and came across a cluster issue, mostly because of the JDBC driver's java.util.TimeoutException. Basically, the cluster connects and executes for 15 sec or so, which is a socket limit, and disables any...

Latest Reply
dale65a
Visitor
  • 0 kudos

@Kishore23 wrote: Hi folks, I was exploring the Databricks Community Edition and came across a cluster issue, mostly because of the JDBC driver's java.util.TimeoutException. Basically, the cluster connects and executes for 15 sec or so, which is a so...

  • 0 kudos
kweks970
by New Contributor
  • 155 Views
  • 1 replies
  • 0 kudos

dev and prod

A "SELECT * FROM" call on my table in PROD is giving all the rows of data, but the same call on my table in DEV is giving me just one row of data. What could be the problem?

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Tell us more about your environment. Are you using Unity Catalog? What is the table format? What cloud platform are you on? More information is needed.

  • 0 kudos
BigAlThePal
by New Contributor II
  • 111 Views
  • 1 replies
  • 0 kudos

Search page to search code inside .py files

Hello, hope you are doing well. When on the search page, it seems it's not searching for code inside .py files but rather only the filenames. Is there an option somewhere I'm missing to be able to search inside .py files? Best, Alan

Latest Reply
SP_6721
New Contributor III
  • 0 kudos

Hi @BigAlThePal, as per my understanding, there isn't a built-in option in the Databricks workspace to search inside .py files directly. You could try a few workarounds though, like using the Databricks REST API to list .py files and programmatically s...

  • 0 kudos
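A minimal sketch of the workaround SP_6721 describes. It assumes you first fetch sources with the real Workspace API endpoints (`GET /api/2.0/workspace/list` and `GET /api/2.0/workspace/export`, which returns base64-encoded content); the search helper itself is plain Python, and the paths and contents below are made up for illustration:

```python
import re
from typing import Dict, List, Tuple

def search_sources(sources: Dict[str, str], pattern: str) -> List[Tuple[str, int, str]]:
    """Return (path, line_number, line) for every line matching `pattern`.

    `sources` maps a workspace path to the decoded text of that file,
    e.g. base64-decoded content from /api/2.0/workspace/export?format=SOURCE.
    """
    regex = re.compile(pattern)
    hits = []
    for path, text in sources.items():
        for lineno, line in enumerate(text.splitlines(), start=1):
            if regex.search(line):
                hits.append((path, lineno, line.strip()))
    return hits

# Fake exported content standing in for real API responses:
exported = {
    "/Repos/etl/load.py": "import pyspark\ndf = spark.read.json(path)\n",
    "/Repos/etl/utils.py": "def clean(df):\n    return df.dropna()\n",
}
print(search_sources(exported, r"spark\.read"))
```

Listing recursively and decoding each file is left out here; the point is that once sources are local strings, any regex search works.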
j_h_robinson
by New Contributor II
  • 422 Views
  • 2 replies
  • 1 kudos

Replacing Excel with Databricks

I have a client that currently uses a lot of Excel with VBA and advanced calculations. Their source data is often stored in SQL Server. I am trying to make the case to move to Databricks. What's a good way to make that case? What are some advantages t...

Latest Reply
j_h_robinson
New Contributor II
  • 1 kudos

This is very helpful, thank you.  

  • 1 kudos
1 More Replies
charliemerrell
by New Contributor
  • 133 Views
  • 2 replies
  • 0 kudos

Will auto loader read files if it doesn't need to?

I want to run Auto Loader on some very large JSON files. I don't actually care about the data inside the files, just the file paths of the blobs. If I do something like ``` spark.readStream .format("cloudFiles") .option("cloudFiles.fo...

Latest Reply
LRALVA
Contributor II
  • 0 kudos

Hi @charliemerrell, yes, Databricks will still open and parse the JSON files, even if you're only selecting _metadata. It must infer the schema and perform basic parsing unless you explicitly avoid it. So even if you do .select("_metadata"), it doesn't skip...

  • 0 kudos
1 More Replies
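One way to avoid paying for JSON parsing when only file paths are needed (a sketch, not something stated in the thread): ingest with `cloudFiles.format` set to `binaryFile`, which skips schema inference entirely, and select only metadata columns so Spark can prune the `content` column. This requires a Databricks runtime (Auto Loader is Databricks-only, and `spark` is the session the runtime provides); the paths and table name are hypothetical:

```python
# binaryFile does no JSON parsing or schema inference; selecting only
# metadata columns lets Spark prune `content`, so large file bodies
# are never read into memory.
paths = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "binaryFile")
    .load("/mnt/raw/events/")            # hypothetical source path
    .select("path", "modificationTime")  # metadata only, content pruned
)

(paths.writeStream
    .option("checkpointLocation", "/mnt/chk/paths")  # hypothetical
    .trigger(availableNow=True)
    .toTable("ingested_file_paths"))     # hypothetical target table
```

Whether column pruning actually skips reading the file bodies depends on the binary file source's implementation, so it is worth verifying against your runtime version before relying on it for cost.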
Dulce42
by New Contributor
  • 137 Views
  • 0 kudos

Trusted assets vs query examples

Hi community! In recent days I explored trusted assets in my Genie space, and they are working very well! But I feel a little confused: in my Genie space I have many query examples. When I create a new function with the same query example to verify th...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @Dulce42! It depends on your use case. If your function covers the scenario well, you don’t need a separate query example. Having both for the same purpose can create redundancy and make things more complex. Choose the option that best fits you...

  • 0 kudos
