Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

samarth_solanki
by New Contributor II
  • 4579 Views
  • 1 replies
  • 0 kudos

Resolved! Unittesting databricks.sdk.runtime

How do I mock code that uses dbutils imported via "from databricks.sdk.runtime import dbutils"? It shows that databricks-sdk has no attribute runtime.

Latest Reply
Ayushi_Suthar
Databricks Employee

Hi @samarth_solanki, I hope you are doing well! Based on the information you have shared, it seems like you're trying to import dbutils from databricks.sdk.runtime, but you're encountering an error that says "databricks-sdk has no attribute runtime...

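A minimal sketch of one way to make such code unit-testable outside Databricks: register a fake databricks.sdk.runtime module before importing the code under test, then drive the mocked dbutils from the test. The module my_module, the helper get_environment, and the widget name "env" are hypothetical names used only for illustration.

    import sys
    from types import ModuleType
    from unittest.mock import MagicMock

    # Register a fake databricks.sdk.runtime *before* importing the code under
    # test, so "from databricks.sdk.runtime import dbutils" succeeds locally.
    fake_runtime = ModuleType("databricks.sdk.runtime")
    fake_runtime.dbutils = MagicMock()
    sys.modules["databricks.sdk.runtime"] = fake_runtime

    import my_module  # hypothetical module that does the import above


    def test_reads_env_widget():
        fake_runtime.dbutils.widgets.get.return_value = "dev"
        assert my_module.get_environment() == "dev"  # hypothetical helper
        fake_runtime.dbutils.widgets.get.assert_called_with("env")
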
halcho
by New Contributor III
  • 14146 Views
  • 6 replies
  • 9 kudos

Notebook scrolling as I select

When I select text in a notebook cell, the whole notebook scrolls up as I select. This happens when I use the mouse wheel and with shift+arrow key. It varies by cell - it happens in some cells, but not in other cells within the same notebook. When I refresh...

Latest Reply
Israel_H
New Contributor III

I'm relieved to know that I'm not the only one experiencing this issue. Please address this as soon as possible. It's significantly impacting my productivity.

5 More Replies
QBexperts
by New Contributor II
  • 1876 Views
  • 1 replies
  • 3 kudos

Resolved! How do i fix QB Error code 6000 301?

When I try to access my company files, I keep getting the error 6000 301 in QB. Please assist me in fixing this mistake.

Latest Reply
mikejeson254
New Contributor III

Are you facing QB error 6000 301 while trying to log in to your company file in QB Desktop and don’t know what should be done next? If yes, then you should not panic at all because through this post I am going to tell you everything you need to know ...

ledbutter
by New Contributor II
  • 2820 Views
  • 2 replies
  • 0 kudos

Resolved! Create Storage Credential 500 Response

I'm trying to create storage credentials for an Azure Databricks Connector at the workspace level with a service principal that has the CREATE_STORAGE_CREDENTIAL privilege but is NOT an account admin. For this test, the SP has the owner role on the connector. I...

Latest Reply
Ayushi_Suthar
Databricks Employee

Hi @ledbutter, hope you are doing well today! I have gone through the details, and this issue might be related to https://github.com/databricks/cli/issues/1080. Please refer to this for more details: https://github.com/databricks/cli/issues/1108 Plea...

1 More Replies
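For context, a hedged sketch of the call being attempted, issued directly against the Unity Catalog REST endpoint with the service principal's token; the workspace URL, token, and access connector resource ID below are placeholders, and the 500 the thread reports would surface in the printed response.

    import requests

    WORKSPACE_URL = "https://adb-<workspace-id>.<n>.azuredatabricks.net"  # placeholder
    SP_TOKEN = "<service-principal-token>"  # placeholder

    resp = requests.post(
        f"{WORKSPACE_URL}/api/2.1/unity-catalog/storage-credentials",
        headers={"Authorization": f"Bearer {SP_TOKEN}"},
        json={
            "name": "my_storage_credential",
            "azure_managed_identity": {
                "access_connector_id": (
                    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
                    "Microsoft.Databricks/accessConnectors/<connector-name>"
                ),
            },
        },
    )
    print(resp.status_code, resp.text)
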
TheoLoiseau
by New Contributor II
  • 1451 Views
  • 2 replies
  • 0 kudos

Cells' outputs getting appended at each run - Databricks Notebook

Hello Community, I have the following issue: when I run cells from a notebook, the print outputs from the previous cells are appended to the current print output (meaning running cell 1 gives output 1, running cell 2 gives output 1 ...

Latest Reply
TheoLoiseau
New Contributor II

This seems to be linked to installing pycaret.

1 More Replies
Phani1
by Valued Contributor II
  • 2214 Views
  • 0 replies
  • 1 kudos

cost finding and optimization

Hi Team, could you please suggest the best way to track the cost of Databricks objects/components? Could you please share any best practices for optimizing costs and conducting detailed cost analysis? Regards, Phanindra

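One common starting point, sketched here under the assumption that Unity Catalog system tables are enabled and that the documented system.billing schema applies, is to join usage against list prices (spark is the SparkSession available in a Databricks notebook):

    # Estimated daily list cost per SKU from the billing system tables
    daily_cost = spark.sql("""
        SELECT
          u.usage_date,
          u.sku_name,
          SUM(u.usage_quantity * p.pricing.default) AS estimated_list_cost
        FROM system.billing.usage AS u
        JOIN system.billing.list_prices AS p
          ON  u.sku_name = p.sku_name
          AND u.cloud = p.cloud
          AND u.usage_start_time >= p.price_start_time
          AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
        GROUP BY u.usage_date, u.sku_name
        ORDER BY u.usage_date DESC
    """)
    daily_cost.show(truncate=False)

Tagging clusters and jobs (surfaced as custom_tags in the same usage table) then makes it possible to break this down by team or project.
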
Phani1
by Valued Contributor II
  • 2023 Views
  • 1 replies
  • 0 kudos

Capture changes at the object level in Databricks

Could you please suggest how to capture changes at the object level in Databricks, such as notebook changes, table DDL changes, view DDL changes, function DDL changes, workflow changes, etc.? We would like to build a dashboard for these changes.

Latest Reply
Ayushi_Suthar
Databricks Employee

Hi @Phani1, good day! Could you kindly clarify your question about capturing changes? What type of change do you want to capture? If you want to see what modifications the user made to the notebooks, tables, and workflows, you can check the audi...

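If the audit log route fits, a hedged sketch of querying the audit system table for recent change events (assumes system tables are enabled; the service names in the filter are examples rather than an exhaustive list, and spark is the notebook's SparkSession):

    recent_changes = spark.sql("""
        SELECT event_time,
               user_identity.email AS user,
               service_name,
               action_name,
               request_params
        FROM system.access.audit
        WHERE event_date >= current_date() - INTERVAL 7 DAYS
          AND service_name IN ('notebook', 'unityCatalog', 'jobs')
        ORDER BY event_time DESC
    """)
    recent_changes.show(truncate=False)
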
Ruchita
by New Contributor
  • 1679 Views
  • 1 replies
  • 0 kudos

Getting internal server error while creating a new query definition

Hi, I am trying to create a query definition using the API '/api/2.0/preview/sql/queries' in Postman, but I am getting an internal server error. Below is a snapshot of the request. Let me know if I am doing anything wrong here.

Latest Reply
UplightDrew
New Contributor II

I'm interested to know how this error was resolved. I'm getting an "Internal Server Error" returned when trying to create queries with version 1.36.1 of the Databricks Terraform Provider. The error provides no other information.

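For comparison, a hedged sketch of the same request made from Python instead of Postman; the field names follow the legacy preview Queries API as I understand it, and the host, token, and data_source_id are placeholders:

    import requests

    HOST = "https://<workspace-url>"     # placeholder
    TOKEN = "<personal-access-token>"    # placeholder

    resp = requests.post(
        f"{HOST}/api/2.0/preview/sql/queries",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "name": "example_query",
            "data_source_id": "<data-source-uuid>",  # id tied to a SQL warehouse
            "query": "SELECT 1",
        },
    )
    print(resp.status_code, resp.text)
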
cesarc
by New Contributor II
  • 7474 Views
  • 2 replies
  • 0 kudos

Parallel jobs with individual contexts

I was wondering if someone could help us with the implementation here. Our current program will spin up 5 jobs through the Databricks API using the same Databricks cluster, but each one needs its own Spark context (specifically, each one will connect to ...

Latest Reply
feiyun0112
Honored Contributor

You can set up buckets with different credentials, endpoints, and so on: https://docs.databricks.com/en/connect/storage/amazon-s3.html#per-bucket-configuration

1 More Replies
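A hedged sketch of that per-bucket configuration, so two buckets are read with different credentials from the same cluster; the bucket names, keys, and paths are placeholders, and on Databricks the same properties can also be set as cluster Spark config with the spark.hadoop. prefix:

    # Per-bucket S3A settings on the running session's Hadoop configuration
    # (spark is the SparkSession available in a Databricks notebook)
    hconf = spark.sparkContext._jsc.hadoopConfiguration()

    # Credentials that apply only to bucket-a
    hconf.set("fs.s3a.bucket.bucket-a.access.key", "<ACCESS_KEY_A>")
    hconf.set("fs.s3a.bucket.bucket-a.secret.key", "<SECRET_KEY_A>")

    # A different set of credentials and endpoint for bucket-b
    hconf.set("fs.s3a.bucket.bucket-b.access.key", "<ACCESS_KEY_B>")
    hconf.set("fs.s3a.bucket.bucket-b.secret.key", "<SECRET_KEY_B>")
    hconf.set("fs.s3a.bucket.bucket-b.endpoint", "s3.eu-west-1.amazonaws.com")

    df_a = spark.read.parquet("s3a://bucket-a/some/path/")
    df_b = spark.read.parquet("s3a://bucket-b/other/path/")
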
wellington
by New Contributor III
  • 1988 Views
  • 1 replies
  • 0 kudos

Log notebook activities

Hi friends; I'm working on a project where we are 4 programmers. We are working in a single environment, using only the "Workspaces" folder. Each has its own user, which is managed by Azure AD. We had a peak in consumption on the 5th Feb. So I can see ...

Latest Reply
wellington
New Contributor III

Hi @Retired_mod, thanks for your quick answer. Is there no other way to monitor notebook runs? I ask this because adding tags to the cluster and workspace does not solve my problem, considering that everyone uses the same cluster and the same workspa...

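A hedged sketch of checking who ran notebook commands around the spike via the audit system table; this assumes system tables are enabled, and note that notebook command events generally require verbose audit logs to be turned on for the workspace (spark is the notebook's SparkSession):

    notebook_activity = spark.sql("""
        SELECT event_time,
               user_identity.email AS user,
               action_name,
               request_params
        FROM system.access.audit
        WHERE event_date = DATE'2024-02-05'
          AND service_name = 'notebook'
        ORDER BY event_time
    """)
    notebook_activity.show(truncate=False)
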
JasonAckman
by New Contributor
  • 7944 Views
  • 0 replies
  • 0 kudos

Data Engineer – Databricks - Remote

Data Engineer – Databricks - Remote
Apply Here: Job Application for Data Engineer – Databricks at Jenzabar (greenhouse.io)
Jenzabar Website: Higher Education Software Solutions - Jenzabar
For over four decades, the higher education experts at Jenzabar h...

Sujitha
by Databricks Employee
  • 11116 Views
  • 0 replies
  • 1 kudos

Calling all innovators and visionaries! The 2024 Data Team Awards are open for nominations

Each year, we celebrate the amazing customers that rely on Databricks to innovate and transform their organizations — and the world — with the power of data and AI. The nomination form is now open to submit nominations. Nominations will close on Marc...

RobsonNLPT
by Contributor III
  • 3772 Views
  • 4 replies
  • 0 kudos

Databricks XML - Bypassing rootTag and rowTag

I see the current conversion of dataframe to XML needs to be improved. My dataframe schema is a perfectly nested schema based on structs, but when I create an XML file I have the following issues: 1) I can't add elements to the root; 2) rootTag and rowTag are required. In ...

Latest Reply
sandip_a
Databricks Employee

Here is one of the ways to use the struct field name as rowTag:

    import org.apache.spark.sql.types._
    val schema = new StructType().add("Record", new StructType().add("age", IntegerType).add("name", StringType))
    val data = Seq(Row(Row(18, "John ...

3 More Replies
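A hedged PySpark variant of the same idea: give the nested struct the name you want as the row element, flatten it, and pass that name as rowTag when writing. This assumes an XML-capable runtime (the spark-xml package or a runtime with native XML support); the schema and output path are illustrative, and spark is the notebook's SparkSession.

    from pyspark.sql import Row
    from pyspark.sql.types import IntegerType, StringType, StructField, StructType

    schema = StructType([
        StructField("Record", StructType([
            StructField("age", IntegerType()),
            StructField("name", StringType()),
        ])),
    ])
    df = spark.createDataFrame([Row(Record=Row(age=18, name="John"))], schema)

    (df.select("Record.*")             # one flat row per <Record> element
       .write.format("xml")
       .option("rootTag", "Records")   # single enclosing root element
       .option("rowTag", "Record")     # row element named after the struct field
       .mode("overwrite")
       .save("/tmp/records_xml"))
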
Israel_H
by New Contributor III
  • 3183 Views
  • 3 replies
  • 5 kudos

The risks of code execution by default on widget change

From my experience, the default action of widgets triggering code execution upon value change poses risks that outweigh the convenience in certain scenarios. While this feature may seem advantageous in some cases, it can lead to unintended con...

Latest Reply
Kayla
Valued Contributor II

I definitely have to agree with the original point: if you have a notebook that you import and you touch any widget value, you're running code, most likely accidentally. I'd love to see a workspace or user type option where you can change the default...

2 More Replies
