Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

mshettar
by New Contributor II
  • 2437 Views
  • 2 replies
  • 0 kudos

Databricks CLI's workspace export_dir command adds unnecessary edits despite not making any change in the workspace

The databricks workspace export_dir / export command with the overwrite option enabled adds non-existent changes in the target directory: 1. it introduces newline deletions, and 2. additions/deletions of MAGIC comments, despite not making any meaningful changes in th...

Latest Reply
RyanHager
Contributor
  • 0 kudos

I am encountering this issue as well and it did not happen previously.  Additionally, you see this pattern if you are using repos internally and make a change to a notebook in another section.

  • 0 kudos
1 More Replies
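For reference, a minimal invocation of the command under discussion with the legacy databricks-cli (source and target paths are placeholders):

```shell
# Export a workspace folder to a local directory, overwriting existing files.
# Reporters above observe spurious newline/MAGIC-comment diffs even when
# nothing changed in the workspace.
databricks workspace export_dir /Shared/my-project ./my-project-backup --overwrite
```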
AdamRink
by New Contributor III
  • 1696 Views
  • 1 reply
  • 0 kudos

Using Repos CLI to create a repo but getting Parent Directory /Repos/develop/ does not exist.

I would like to create the directory develop under Repos as part of the script, then link it to GitHub and update it. How can I do this?

Latest Reply
User16539034020
Databricks Employee
  • 0 kudos

Hi, Adam: The Repos CLI does not have specific functionality to create directories in Databricks Repos. Please check the following doc for more information: https://docs.databricks.com/dev-tools/cli/repos-cli.html. You could run databricks workspace mkd...

  • 0 kudos
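A sketch of the workaround the reply points at, assuming the legacy databricks-cli is configured against the workspace; the repo URL and path below are hypothetical:

```shell
# Create the parent folder first, since `repos create` does not make it for you.
databricks workspace mkdirs /Repos/develop

# Then create the repo inside it and link it to GitHub.
databricks repos create \
  --url https://github.com/my-org/my-repo.git \
  --provider gitHub \
  --path /Repos/develop/my-repo
```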
ducng
by New Contributor II
  • 6104 Views
  • 1 reply
  • 0 kudos

VScode extension - certificate signature failure

Hi everyone, I'm trying to use the new Databricks extension (v0.3.10) for VS Code (v1.77.3). I face this problem when connecting to our workspace: the problem persists when I tried to log in through the az CLI with our SSO, or through local config using a PAT...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Minh Duc Nguyen: It seems like the error you are facing is due to a failure in verifying the SSL certificate of your Databricks workspace. To resolve this, you need to add the custom CA certificate to your VS Code settings. Here's how you can do it...

  • 0 kudos
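The reply is truncated before the actual steps; one common approach for Node-based tools such as VS Code extensions (an assumption here, not necessarily what the reply went on to say) is to point Node at the corporate CA bundle via an environment variable. The certificate path is a placeholder:

```shell
# NODE_EXTRA_CA_CERTS makes Node.js, which VS Code extensions run on,
# trust an additional CA certificate for TLS verification.
export NODE_EXTRA_CA_CERTS=/path/to/corporate-ca.pem
```

Launching VS Code from a shell where this variable is set lets the extension inherit it.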
k9
by New Contributor II
  • 3932 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks CLI v0.17.6 issue

I have multiple groups created in my Databricks account and have the Databricks CLI installed on my Mac. Some of the CLI functions return errors that I cannot find a solution for. databricks groups list returns: Error: b'{"error_code":"INTERNAL_ERROR","...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@kenan hasanov: which version of Python do you have installed on your machine? You need Python 3.6+ (or 2.7.9+ on Python 2); please go with the latest one, as you are only seeing issues with a few functions. Please raise an issue in case you are still f...

  • 1 kudos
2 More Replies
Ravikumashi
by Contributor
  • 1249 Views
  • 2 replies
  • 0 kudos

Access Databricks secrets in init script

We are trying to install the Databricks CLI in init scripts, and in order to do this we need to authenticate with a Databricks token. But this is not secure, as anyone with access to the cluster can get hold of the token. We tried to inject the secrets into se...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

I think you don't need to install the CLI. The whole API is available via the notebook. Below is an example: import requests; ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext(); host_name = ctx.tags().get("browserHostName").get(); host_toke...

  • 0 kudos
1 More Replies
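The reply's code is cut off mid-snippet; below is a sketch of the pattern it describes. Note that obtaining the token from the notebook context (`ctx.apiToken().get()`) relies on an internal, undocumented API that may change, and the endpoint shown is only illustrative:

```python
def workspace_api_get(host_name: str, host_token: str, endpoint: str):
    """Build an authenticated Databricks REST API request.

    On a cluster, host_name and host_token would come from the notebook
    context, roughly as the reply sketches:
        ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
        host_name = ctx.tags().get("browserHostName").get()
        host_token = ctx.apiToken().get()   # internal API, may change
    """
    url = f"https://{host_name}/api/2.0/{endpoint.lstrip('/')}"
    headers = {"Authorization": f"Bearer {host_token}"}
    return url, headers

# On the cluster you would then do, for example:
#   import requests
#   url, headers = workspace_api_get(host_name, host_token, "secrets/scopes/list")
#   print(requests.get(url, headers=headers).json())
```

This avoids baking a personal access token into an init script, since the token is taken from the running notebook's context.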
talha
by New Contributor III
  • 3578 Views
  • 5 replies
  • 0 kudos

spark-submit Error "Unrecognized option: --executor-memory 3G" although --executor-memory is available in Options.

Executed a spark-submit job through databricks cli with the following job configurations.{ "job_id": 123, "creator_user_name": "******", "run_as_user_name": "******", "run_as_owner": true, "settings": { "name": "44aa-8447-c123aad310", ...

Latest Reply
talha
New Contributor III
  • 0 kudos

Not really sure if Spark is running in local mode, but I used the alternate property spark.executor.memory, passed it as --conf, and now it works.

  • 0 kudos
4 More Replies
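As the reply notes, passing the memory setting as a Spark conf property instead of the dedicated flag sidesteps the parsing problem; a hypothetical invocation (script name is a placeholder):

```shell
# Instead of `--executor-memory 3G`, which the job rejected,
# pass the equivalent conf property.
spark-submit --conf spark.executor.memory=3g my_job.py
```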
elgeo
by Valued Contributor II
  • 3974 Views
  • 1 reply
  • 3 kudos

Resolved! Generate new token error

Hello. I need to install the Databricks CLI. While trying to generate a new access token (User Settings -> Generate new token), I get the following error: Could not create token with comment "cli" and lifetime (seconds) of 86400. I tried with different com...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

Please check in the Admin console that tokens are enabled and that you can manage them.

  • 3 kudos
huggies_23
by New Contributor
  • 984 Views
  • 0 replies
  • 0 kudos

Is it possible to specify a specific branch commit when deploying repo to a workspace via the Databricks CLI?

I would like to know if it is possible to include a specific commit identifier when updating a repo in a workspace via the Databricks CLI. Why? Currently we use the Repos CLI to push updates to code throughout dev, test and prod (testing along the wa...

Orianh
by Valued Contributor II
  • 5421 Views
  • 6 replies
  • 2 kudos

Resolved! Databricks Jobs CLI

Hey guys, I'm trying to create a job via the Databricks CLI. This job is going to use a wheel file that I already uploaded to DBFS, and from this package I exported the entry point needed for the job. In the UI I can see that the job has been created, Bu...

Latest Reply
Vivian_Wilfred
Databricks Employee
  • 2 kudos

Hi @orian hindi, adding the wheel package in the "libraries" section of the json file will always try to install the whl at the cluster level, which requires manage access, irrespective of a job cluster or an existing interactive cluster. You cannot achieve ...

  • 2 kudos
5 More Replies
naveenmamidala
by New Contributor II
  • 21994 Views
  • 1 reply
  • 1 kudos

Error: ConnectionError: HTTPSConnectionPool(host='https', port=443): Max retries exceeded with url: /api/2.0/workspace/list?path=%2F (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))

Error: ConnectionError: HTTPSConnectionPool(host='https', port=443): Max retries exceeded with url: /api/2.0/workspace/list?path=%2F (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000001CAF52B4640>: Failed to establis...

Latest Reply
Sajith
New Contributor II
  • 1 kudos

Set the HTTPS proxy server in the CLI and it started working without any error: set HTTPS_PROXY=http://username:password@{proxy host}:{port}

  • 1 kudos
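The accepted fix above, shown for both Windows (`set`) and macOS/Linux (`export`); the credentials, host, and port below are placeholders:

```shell
# Windows (cmd.exe):
#   set HTTPS_PROXY=http://username:password@proxy.example.com:8080
# macOS/Linux:
export HTTPS_PROXY=http://username:password@proxy.example.com:8080
```

With the variable set, the CLI's HTTPS requests are routed through the proxy instead of failing name resolution.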
Juniper_AIML
by New Contributor
  • 4119 Views
  • 3 replies
  • 0 kudos

How to access the virtual environment directory where the databricks notebooks are running?

How to get access to a separate virtual environment space and its storage location on Databricks, so that we can move our created libraries into it without waiting for their installation each time the cluster is brought up. What we basically want is a ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hey there @Aman Gaurav, thank you for posting your question. Just wanted to check in: were you able to resolve your issue, or do you need more help? We'd love to hear from you. Thanks!

  • 0 kudos
2 More Replies
ak09
by New Contributor
  • 813 Views
  • 0 replies
  • 0 kudos

Triggering Notebook in Azure Repos via Azure DevOps

I have been using Databricks workspace for all my data science projects in my firm. In my current project, I have built a CI pipeline using databricks-cli & Azure DevOps. Using databricks-cli I can trigger the Notebook which is present in my workspa...

User16790091296
by Contributor II
  • 2662 Views
  • 2 replies
  • 5 kudos

Resolved! How do I use databricks-cli without manual configuration

I want to use the Databricks CLI: databricks clusters list. But this requires a manual, interactive step by the user: databricks configure --token. Is there a way to use the Databricks CLI without manual intervention, so that you can run it as p...

Latest Reply
alexott
Databricks Employee
  • 5 kudos

You can set two environment variables, DATABRICKS_HOST and DATABRICKS_TOKEN, and databricks-cli will use them. See the example of that in the DevOps pipeline; see the full list of environment variables at the end of the Authentication section of the docume...

  • 5 kudos
1 More Replies
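A sketch of the non-interactive setup the reply describes; the host URL and token below are placeholders:

```shell
# With these set, databricks-cli authenticates without `databricks configure --token`.
export DATABRICKS_HOST=https://adb-1234567890123456.7.azuredatabricks.net
export DATABRICKS_TOKEN=dapi0123456789abcdef   # placeholder token
# databricks clusters list   # now runs without prompting
```

In CI systems the token would typically come from a secret store rather than a literal value.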
Sarvagna_Mahaka
by New Contributor III
  • 16362 Views
  • 6 replies
  • 8 kudos

Resolved! Exporting csv files from Databricks

I'm trying to export a csv file from my Databricks workspace to my laptop. I have followed the below steps: 1. Installed the Databricks CLI 2. Generated a token in Azure Databricks 3. databricks configure --token 5. Token: xxxxxxxxxxxxxxxxxxxxxxxxxx 6. databrick...

Latest Reply
User16871418122
Contributor III
  • 8 kudos

Hi @Sarvagna Mahakali, there is an easier hack: a) You can save results locally on disk and create a hyperlink for downloading the CSV. You can copy the file to this location: dbfs:/FileStore/table1_good_2020_12_18_07_07_19.csv b) Then download with...

  • 8 kudos
5 More Replies
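The FileStore hack in the accepted reply can be sketched with the legacy CLI as follows; the file name and workspace URL are placeholders:

```shell
# Copy the result into FileStore, which the workspace serves over HTTPS.
databricks fs cp dbfs:/path/to/result.csv dbfs:/FileStore/result.csv

# Then download it in a browser from:
#   https://adb-1234567890123456.7.azuredatabricks.net/files/result.csv
```

Anything under dbfs:/FileStore/ maps to the /files/ path of the workspace URL, which is what makes the hyperlink trick work.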