Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Shawn_Eary
by Contributor
  • 8163 Views
  • 2 replies
  • 0 kudos

Create Team Git Repo in Azure Databricks

When I attach a Git repo to Databricks, it always puts the repo under my username/domain name. How can I create a "team" repo at the top level, so teammates don't have to drill into my username?

[Attached image: Shawn_Eary_0-1702321413271.png]
Latest Reply
DonatienTessier
Contributor
  • 0 kudos

Hi, the point of using a repo is to have a dedicated area for each developer. If you want to have only a folder with the latest version of the code, you should use a CI/CD pipeline that will package the code and then deliver it into a folder inside the Workspa...

  • 0 kudos
1 More Replies
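For reference, the reply above is truncated; one hedged way to get a team-level Git folder instead of the per-user default is to create it at an explicit shared path with the Repos API. A minimal sketch using the Databricks Python SDK (the URL, provider, and path below are placeholders, not taken from the thread):

```python
# Sketch: create a Git folder under a shared path instead of /Repos/<user>/...
# Assumes `pip install databricks-sdk`; the caller needs manage permission on
# the parent folder. All identifiers below are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.repos.create(
    url="https://github.com/my-org/my-project.git",
    provider="gitHub",
    path="/Repos/team-shared/my-project",  # shared, team-level location
)
```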
BLPedersen
by New Contributor II
  • 5520 Views
  • 3 replies
  • 1 kudos

Error when setting up OAuth profile for Databricks CLI

Hello. I'm currently trying to migrate a project from dbx to Databricks Asset Bundles. I have successfully created the required profile using U2M authentication with the command ```databricks auth login --host <host-name>```. I'm able to see the new prof...

Latest Reply
kunalmishra9
New Contributor III
  • 1 kudos

I ran into a similar error just now, and in my case, PyCharm was running some IPython startup scripts each time it opened a console. There was, for some reason, a file at `~/.ipython/profile_default/startup/00-databricks-init-a5acf3baa440a896fa364d18...

  • 1 kudos
2 More Replies
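As a quick sanity check related to the thread above, a profile created with `databricks auth login` can be exercised from Python via the SDK. A minimal sketch, assuming `pip install databricks-sdk` and a profile name of DEFAULT (the profile name is an assumption):

```python
# Sketch: verify a Databricks CLI OAuth (U2M) profile is usable from code.
# The profile name "DEFAULT" is illustrative; use the name created by
# `databricks auth login --host <host-name>`.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(profile="DEFAULT")   # reads ~/.databrickscfg
print(w.current_user.me().user_name)     # fails fast if the token/profile is broken
```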
Ahmad_A
by New Contributor II
  • 1045 Views
  • 1 replies
  • 0 kudos

Using Community Edition instance for Customer Academy’s DE Learning Plan

Hi! So I've been wondering since I started with the Data Engineering Learning Plan on the Customer Academy: should I go with my Community Edition Databricks, or should I go with creating a premium edition on either a cloud provider or the website? Than...

Get Started Discussions
Customer Academy
Latest Reply
Ahmad_A
New Contributor II
  • 0 kudos

Hey, @Kaniz. May I recommend deleting the post if it is not in the right place or changing the forum so I get the proper response?

  • 0 kudos
brian999
by Contributor
  • 2060 Views
  • 1 replies
  • 1 kudos

How to configure github credentials for a service principal NOT using Azure

I want to have a service principal run a job that uses a notebook in our GitHub. We are on AWS, not Azure. How do I configure Git credentials for the service principal? Does this use deploy keys?

Latest Reply
brian999
Contributor
  • 1 kudos

Awesome details, thank you for your help.

  • 1 kudos
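The accepted details aren't shown above, but a common approach is to register a GitHub personal access token as a Git credential while authenticated as the service principal. A hedged sketch using the Databricks Python SDK (all hosts, IDs, and tokens below are placeholders):

```python
# Sketch: register a GitHub PAT as a Git credential for a service principal.
# Assumes the client authenticates AS the service principal (e.g. via its
# OAuth client ID/secret); every value below is a placeholder.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://<workspace-url>",
    client_id="<service-principal-application-id>",
    client_secret="<service-principal-oauth-secret>",
)
w.git_credentials.create(
    git_provider="gitHub",
    git_username="<github-machine-user>",
    personal_access_token="<github-pat>",
)
```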
mohaimen_syed
by New Contributor III
  • 1492 Views
  • 1 replies
  • 0 kudos

VS Code integration with Python Notebook and Remote Cluster

Hi, I'm trying to work in VS Code remotely on my machine instead of using the Databricks environment in my browser. I have gone through the documentation to set up the Databricks extension and also set up Databricks Connect, but I don't feel like they work ...

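For the thread above, a minimal local smoke test of Databricks Connect can help confirm the extension and Connect are actually reaching the remote cluster. A sketch, assuming databricks-connect 13.x or later and that connection details come from environment variables or a configured profile:

```python
# Sketch: smoke-test Databricks Connect from a local IDE such as VS Code.
# Assumes `pip install databricks-connect` (13.x or later) and that host,
# token, and cluster ID are configured via env vars or ~/.databrickscfg.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
print(spark.range(5).collect())   # executes on the remote cluster, not locally
```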
rahuja
by New Contributor III
  • 1881 Views
  • 1 replies
  • 0 kudos

Py4JError: An error occurred while calling o992.resourceProfileManager

Hello, I am trying to run the SparkXGBoostRegressor and I am getting the following error: Py4JError: An error occurred while calling o992.resourceProfileManager. Trace: py4j.security.Py4JSecurityException: Method public org.apache.spark.resource...

mh_db
by New Contributor III
  • 5812 Views
  • 5 replies
  • 1 kudos

Job parameters to get date and time

I'm trying to set up a workflow in Databricks and I need my job parameter to get the date and time. I see in the documentation there are some options for dynamic values. I'm trying to use this one: {{job.start_time.[argument]}}. For the "argument" there, ...

Latest Reply
brockb
Databricks Employee
  • 1 kudos

Then please change the code to: ```iso_datetime = dbutils.widgets.get("LoadID")```

  • 1 kudos
4 More Replies
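Tying the reply above together: the job parameter (here named LoadID, per the reply) would be set to a dynamic value reference such as {{job.start_time.iso_datetime}}; the exact argument is an assumption. A sketch of consuming it inside the notebook task:

```python
# Sketch: read a job parameter populated by a dynamic value reference
# (e.g. {{job.start_time.iso_datetime}}) and parse it into a datetime.
# `dbutils` is only defined inside Databricks notebooks; the exact string
# format of the value is an assumption, so parsing is kept defensive.
from datetime import datetime

iso_datetime = dbutils.widgets.get("LoadID")              # e.g. "2024-05-01T12:34:56Z"
start_ts = datetime.fromisoformat(iso_datetime.replace("Z", "+00:00"))
print(start_ts.date(), start_ts.time())
```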
Surajv
by New Contributor III
  • 775 Views
  • 0 replies
  • 0 kudos

Getting databricks-connect com.fasterxml.jackson.databind.exc.MismatchedInputException parse warning

Hi community, I am getting the below warning when I try using PySpark code for some of my use cases with databricks-connect. Is this a critical warning, and any idea what it means? Logs: WARN DatabricksConnectConf: Could not parse /root/.databricks-c...

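There are no replies yet; one hedged way to investigate is to check whether the legacy config file named in the warning exists and is valid JSON. The path is taken from the truncated log line and may differ in your setup:

```python
# Sketch: inspect the legacy databricks-connect config file mentioned in the
# warning. Whether the file should exist at all depends on how authentication
# is configured (env vars vs. config file), so treat this as a diagnostic only.
import json
from pathlib import Path

cfg = Path("/root/.databricks-connect")   # path from the warning; adjust as needed
if not cfg.exists():
    print("No legacy config file found; settings may come from env vars or a profile.")
else:
    try:
        print(json.loads(cfg.read_text()))
    except json.JSONDecodeError as exc:
        print(f"Config file exists but is not valid JSON: {exc}")
```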
rafal_walisko
by New Contributor II
  • 1248 Views
  • 0 replies
  • 0 kudos

Optimal Strategies for downloading large query results with Databricks API

Hi everyone, I'm currently facing an issue with handling a large amount of data using the Databricks API. Specifically, I have a query that returns a significant volume of data, sometimes resulting in over 200 chunks. My initial approach was to retriev...

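The post above is truncated, but for results split into many chunks the SQL Statement Execution API can hand back presigned external links per chunk. A rough sketch of that pattern (host, token, and warehouse ID are placeholders; polling and error handling are omitted):

```python
# Sketch: download a large result chunk-by-chunk using EXTERNAL_LINKS
# disposition. Placeholder credentials; assumes the statement finishes
# within wait_timeout (otherwise poll GET /api/2.0/sql/statements/{id}).
import requests

HOST = "https://<workspace-url>"
HEADERS = {"Authorization": "Bearer <token>"}

resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers=HEADERS,
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "SELECT * FROM my_large_table",
        "disposition": "EXTERNAL_LINKS",
        "format": "JSON_ARRAY",
        "wait_timeout": "30s",
    },
).json()

statement_id = resp["statement_id"]
total_chunks = resp["manifest"]["total_chunk_count"]

for chunk_index in range(total_chunks):
    chunk = requests.get(
        f"{HOST}/api/2.0/sql/statements/{statement_id}/result/chunks/{chunk_index}",
        headers=HEADERS,
    ).json()
    for link in chunk.get("external_links", []):
        data = requests.get(link["external_link"]).content  # presigned URL; no auth header
        # ...persist or parse `data` here
```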
Surajv
by New Contributor III
  • 5827 Views
  • 1 replies
  • 0 kudos

Getting Python version errors when using PySpark RDD with Databricks Connect

Hi community, when I use PySpark RDD-related functions in my environment with Databricks Connect, I get the below error (Databricks cluster version: 12.2): `RuntimeError: Python in worker has different version 3.9 than that in driver 3.10, PySpark cannot...

Latest Reply
Surajv
New Contributor III
  • 0 kudos

Got it. As a side note, I tried the above methods, but the error persisted, hence upon reading the docs again, there was this statement: You must install Python 3 on your development machine, and the minor version of your client Python installation must be t...

  • 0 kudos
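Building on the quoted doc statement, a small local check along these lines can catch the mismatch before Databricks Connect does. The cluster Python version shown is what DBR 12.2 LTS is understood to ship and should be verified against the runtime release notes:

```python
# Sketch: confirm the local (client) Python minor version matches the
# cluster's Python, as required by Databricks Connect. The cluster version
# tuple below is illustrative; verify it for your runtime.
import sys

cluster_python = (3, 9)             # assumption: DBR 12.2 LTS ships Python 3.9
client_python = sys.version_info[:2]

if client_python != cluster_python:
    raise RuntimeError(
        f"Client Python {client_python} does not match cluster Python {cluster_python}; "
        "recreate your virtualenv with the matching minor version."
    )
```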
ymt
by New Contributor II
  • 1893 Views
  • 0 replies
  • 1 kudos

Connection from Databricks to Snowflake using Okta

Hi team, this is how I connect to Snowflake from a Jupyter notebook: import snowflake.connector snowflake_connection = snowflake.connector.connect( authenticator='externalbrowser', user='U1', account='company1.us-east-1', database='db1',...

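The snippet in the post is cut off; a hedged completion of that pattern with snowflake-connector-python looks roughly like the following (the warehouse name and query are placeholders, and browser-based SSO generally only works where a local browser can open, so not on a remote Databricks driver):

```python
# Sketch: connect to Snowflake with browser-based SSO (Okta) and run a query.
# Identifiers mirror the truncated post; the warehouse is an assumed addition.
import snowflake.connector

conn = snowflake.connector.connect(
    authenticator="externalbrowser",   # opens an Okta login in a local browser
    user="U1",
    account="company1.us-east-1",
    database="db1",
    warehouse="wh1",                   # assumed; not visible in the truncated post
)
cur = conn.cursor()
cur.execute("SELECT CURRENT_USER(), CURRENT_ROLE()")
print(cur.fetchone())
cur.close()
conn.close()
```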
sharpbetty
by New Contributor II
  • 3166 Views
  • 2 replies
  • 1 kudos

Workflows: Running dependent task despite earlier task fail

I have a scheduled task running in a workflow. Task 1 computes some parameters, then these are picked up by a dependent reporting task: Task 2. I want Task 2 to report "Failure" if Task 1 fails. Yet creating a dependency in workflows means that Task 2 wil...

Get Started Discussions
tasks
Workflows
Latest Reply
NerdSan
New Contributor II
  • 1 kudos

Hi @sharpbetty, any suggestions for how I can keep the parameter sharing and dependency from Task 1 to Task 2, yet also allow Task 2 to fire even on failure of Task 1? Setup: Task 2 dependent on Task 1. Challenge: to fire Task 2 even on Task 1 failure. Soluti...

  • 1 kudos
1 More Replies
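The reply above is truncated, but the usual lever for this is the task-level "Run if dependencies" setting. A hedged sketch of defining Task 2 with run_if set to ALL_DONE via the Databricks Python SDK (the job name, notebook paths, and cluster ID are placeholders):

```python
# Sketch: make task_2 run even when task_1 fails by setting run_if=ALL_DONE.
# Job, notebook, and cluster identifiers are placeholders, not from the thread.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()
w.jobs.create(
    name="report-even-on-failure",
    tasks=[
        jobs.Task(
            task_key="task_1",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/compute_params"),
            existing_cluster_id="<cluster-id>",
        ),
        jobs.Task(
            task_key="task_2",
            depends_on=[jobs.TaskDependency(task_key="task_1")],
            run_if=jobs.RunIf.ALL_DONE,   # fire regardless of task_1's outcome
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/report"),
            existing_cluster_id="<cluster-id>",
        ),
    ],
)
```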
Prashanthkumar
by New Contributor III
  • 3139 Views
  • 1 replies
  • 1 kudos

Databricks Users Access Control via Azure AAD?

Hi all, looking for suggestions to see if it is possible to control users via Azure AD (outside of Azure Databricks). I want to create new users in Azure and then give RBAC to individual users, rather than control their permissions f...

Latest Reply
Prashanthkumar
New Contributor III
  • 1 kudos

Thank you Kaniz, let me try some of the options as my Databricks is integrated with AAD. Let me try Option 1 as that's my primary requirement.

  • 1 kudos
jvk
by New Contributor III
  • 4213 Views
  • 3 replies
  • 0 kudos

Can't create cluster: "Aws Authorization Failure" .. not authorized to perform: sts:AssumeRole

Full error here: Aws Authorization Failure: Failure happened when talking to AWS, AWS API error code: AccessDenied. AWS error message: User: arn:aws:iam::414351767826:user/ConsolidatedManagerIAMUser-ConsolidatedManagerUser-VX02FYW0SSCY is not authorized...

Latest Reply
jvk
New Contributor III
  • 0 kudos

Never mind, I see it now thanks!

  • 0 kudos
2 More Replies
Surajv
by New Contributor III
  • 4343 Views
  • 2 replies
  • 0 kudos

Getting client.session.cache.size warning in pyspark code using databricks connect

Hi Community, I have set up a Jupyter notebook on a server and installed Databricks Connect in its kernel to leverage my Databricks cluster compute in the notebook and write PySpark code. Whenever I run my code it gives me the below warning: ```WARN Spark...

Latest Reply
Surajv
New Contributor III
  • 0 kudos

Thank you @Riyakh 

  • 0 kudos
1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group