Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Databricks_Work
by New Contributor II
  • 2474 Views
  • 1 replies
  • 0 kudos

How to access data in one Databricks workspace from another

I want to access data in another Databricks workspace from my Databricks workspace. How can I do that?

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Hello, many thanks for your question. To provide you with a more precise response, we need some additional information: 1. When you refer to "databricks in my databricks", are you referring to accessing data that is in one workspace from another wor...

  • 0 kudos
hbs59
by New Contributor III
  • 6311 Views
  • 3 replies
  • 2 kudos

Resolved! Move multiple notebooks at the same time (programmatically)

If I want to move multiple (hundreds of) notebooks at the same time from one folder to another, what is the best way to do that, other than going to each individual notebook and clicking "Move"? Is there a way to programmatically move notebooks? Like ...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

You should be redirected to the KB page, but this is the information contained: Problem: How to migrate Shared folders and notebooks. Cause: Shared notebooks are not migrated into a new workspace by default. Solution: Please find the script to migrate t...

  • 2 kudos
2 More Replies
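The approach from the KB reply above can be sketched in Python. This is a hedged sketch, not an official recipe: the Workspace REST API has no single "move" call, so one common pattern is export + import + delete. `HOST`, `TOKEN`, and all paths are placeholders; verify the endpoints against your workspace's REST API reference before running anything.

```python
# Hypothetical sketch: "move" workspace notebooks via the Workspace REST
# API (export + re-import + delete). HOST/TOKEN/paths are placeholders.
import json
import urllib.parse
import urllib.request

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

def dest_path(src: str, src_root: str, dst_root: str) -> str:
    """Map a notebook path from the old folder tree into the new one."""
    if not src.startswith(src_root):
        raise ValueError(f"{src} is not under {src_root}")
    return dst_root + src[len(src_root):]

def api(method: str, endpoint: str, payload: dict) -> dict:
    """Minimal Workspace API helper (GET parameters go in the query string)."""
    url = f"{HOST}/api/2.0/workspace/{endpoint}"
    data = None
    if method == "GET":
        url += "?" + urllib.parse.urlencode(payload)
    else:
        data = json.dumps(payload).encode()
    req = urllib.request.Request(
        url, data=data, method=method,
        headers={"Authorization": f"Bearer {TOKEN}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def move_notebook(src: str, dst: str) -> None:
    """Export a notebook's source, import it at the new path, delete the old."""
    exported = api("GET", "export", {"path": src, "format": "SOURCE"})
    api("POST", "import", {"path": dst, "format": "SOURCE",
                           "language": "PYTHON",  # assumption: Python notebooks
                           "content": exported["content"]})
    api("POST", "delete", {"path": src})
```

Usage would be a loop over the listed source paths, e.g. `move_notebook(nb, dest_path(nb, "/Shared/old", "/Shared/new"))` for each notebook `nb` under the old folder.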
Phani1
by Valued Contributor II
  • 2873 Views
  • 1 replies
  • 2 kudos

Databricks API using the personal access token

We can access the Azure Databricks API using a personal access token that we create manually. The objective is that the client doesn't want to store the personal access token, which may not be secure. Do we have an option to generate the token during ...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 2 kudos

Hi @Phani1, yes, you can now use the Databricks "Create a user token" API to create an access token via an automated API call. Please refer to this doc: Create a user token | Token API | REST API reference | Azure Databricks

  • 2 kudos
Eldar_Dragomir
by New Contributor II
  • 5006 Views
  • 3 replies
  • 0 kudos

Databricks Volumes: not able to read a file from Scala

I used to use dbfs with mounted directories and now I want to switch to Volumes for storing my jars and application.conf for pipelines. I see the file my application.conf in Data Explorer > Catalog > Volumes, I also see the file with dbutils.fs.ls("/...

Get Started Discussions
Databricks
Unity Catalog
Latest Reply
argus7057
New Contributor II
  • 0 kudos

Volume mounts are accessible from Scala code only on a shared cluster. In single-user mode this feature is not supported yet. We use init scripts to move contents from Volumes to the cluster's local drive when we need to access files from native Scala ...

  • 0 kudos
2 More Replies
ChristianRRL
by Valued Contributor
  • 4811 Views
  • 2 replies
  • 1 kudos

Resolved! DLT Notebook and Pipeline Separation vs Consolidation

Super basic question. For DLT pipelines I see there's an option to add multiple "Paths". Is it generally best practice to completely separate `bronze` from `silver` notebooks? Or is it more recommended to bundle both raw `bronze` and clean `silver` d...

Latest Reply
ChristianRRL
Valued Contributor
  • 1 kudos

This is great! I completely missed the list view before.

  • 1 kudos
1 More Replies
Phani1
by Valued Contributor II
  • 8003 Views
  • 0 replies
  • 0 kudos

Cloudera SQL

Hi Team, could you please help me with how to efficiently/quickly convert Cloudera SQL and Hive SQL scripts to PySpark scripts? Regards, Phanindra

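A minimal sketch for the question above, assuming a running SparkSession: Spark SQL is largely HiveQL-compatible, so a common first migration pass runs each script through `spark.sql()` unchanged, then rewrites hot paths in the DataFrame API. The table and column names below are invented examples, not from the original post.

```python
# Sketch: migrate a Hive SQL script by running it through spark.sql(),
# with an equivalent DataFrame-API rewrite. Names are placeholders.
HIVE_QUERY = """
SELECT customer_id, SUM(amount) AS total
FROM sales
GROUP BY customer_id
"""

def run_migrated_query(spark, query: str = HIVE_QUERY):
    """First pass: run the Hive SQL mostly unchanged. Returns a DataFrame."""
    return spark.sql(query)

def run_dataframe_version(spark):
    """Second pass: the same query rewritten in the DataFrame API."""
    from pyspark.sql import functions as F  # requires pyspark at call time
    return (spark.table("sales")
                 .groupBy("customer_id")
                 .agg(F.sum("amount").alias("total")))
```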
marcusfox
by New Contributor
  • 1537 Views
  • 1 replies
  • 0 kudos

Databricks setup with Azure storage

Hi, we have an issue with our initial setup and design. We are using a single Azure Premium block blob storage account with hierarchical namespace and LRS enabled. We have three containers within it, one for each environment (Dev, Test, Prod), but the ...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you please check https://community.databricks.com/t5/data-governance/metastore-one-per-account-region-limitation/td-p/41097  and let us know if this discussion helps? 

  • 0 kudos
chrisf_sts
by New Contributor II
  • 3387 Views
  • 0 replies
  • 0 kudos

How to handle complex json schema

I have a mounted external directory that is an S3 bucket with multiple subdirectories containing call log files in JSON format. The files are irregular and complex; when I try to use spark.read.json or spark.sql (SELECT *) I get the UNABLE_TO_INFER_...

Get Started Discussions
json
pyspark
schema
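For the schema-inference error above, a common workaround is to define the schema explicitly instead of letting Spark infer it. This is a hedged sketch assuming a running SparkSession; the field names are invented placeholders, since the real call-log layout isn't shown.

```python
# Hypothetical sketch: an explicit schema sidesteps UNABLE_TO_INFER_SCHEMA
# when JSON files are too irregular for inference. Fields are placeholders.
def call_log_schema():
    from pyspark.sql.types import (StructType, StructField, StringType,
                                   LongType, ArrayType, MapType)
    return StructType([
        StructField("call_id", StringType()),
        StructField("duration_ms", LongType()),
        StructField("events", ArrayType(MapType(StringType(), StringType()))),
    ])

def read_call_logs(spark, path: str):
    """Read irregular JSON with a fixed schema; bad rows are kept, not dropped."""
    return (spark.read
            .option("multiLine", "true")   # handles pretty-printed JSON
            .option("mode", "PERMISSIVE")  # tolerate malformed records
            .schema(call_log_schema())
            .json(path))
```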
liefeld
by New Contributor
  • 2248 Views
  • 0 replies
  • 0 kudos

Foreign catalogs aren't populated.

I've created connections to various RDS Aurora databases but always get the same problem: when creating a foreign catalog, only the information_schema database is shown in Catalog Explorer. The AI chat agent has suggested a few ways to specify the databa...

  • 2248 Views
  • 0 replies
  • 0 kudos
Databricks_Work
by New Contributor II
  • 2200 Views
  • 1 replies
  • 0 kudos
Latest Reply
Lakshay
Databricks Employee
  • 0 kudos

Hi @Databricks_Work, Vacuum and Analyze are two separate commands used for optimizing queries, but they perform two different operations. Vacuum is used to clear stale data files in your Delta table. Vacuum should be run after an opti...

  • 0 kudos
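The reply above can be sketched as a simple maintenance routine, assuming a running SparkSession: OPTIMIZE first (so compacted-away files become stale), then VACUUM to remove them, and ANALYZE to refresh optimizer statistics. The table name and retention window are placeholders.

```python
# Sketch: Delta maintenance order per the reply above. The table name
# and 168-hour (7-day, the default) retention are placeholders.
MAINTENANCE_SQL = [
    "OPTIMIZE my_catalog.my_schema.events",
    "VACUUM my_catalog.my_schema.events RETAIN 168 HOURS",
    "ANALYZE TABLE my_catalog.my_schema.events COMPUTE STATISTICS",
]

def run_maintenance(spark):
    """Run the statements in order against an active SparkSession."""
    for stmt in MAINTENANCE_SQL:
        spark.sql(stmt)
```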
Databricks_Work
by New Contributor II
  • 1939 Views
  • 1 replies
  • 1 kudos
Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, You can check the function https://docs.databricks.com/en/sql/language-manual/functions/date_format.html, let us know if this helps. 

  • 1 kudos
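A tiny sketch building on the `date_format` reply above: the function formats a timestamp using Spark's datetime pattern letters (yyyy, MM, dd, HH, mm). The column name here is a placeholder.

```python
# Sketch: build a Spark SQL date_format() expression. Column name is a
# placeholder; the pattern uses Spark datetime pattern letters.
def format_ts_expr(column: str, fmt: str = "yyyy-MM-dd HH:mm") -> str:
    """Return a Spark SQL expression string using date_format."""
    return f"date_format({column}, '{fmt}')"
```

On a cluster this would be used as, e.g., `spark.sql(f"SELECT {format_ts_expr('event_time')} FROM events")`.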
mobe
by New Contributor
  • 4166 Views
  • 1 replies
  • 0 kudos

How to query sql warehouse tables with spark?

Hey there... I managed to query my data following this guide: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/python-sql-connector, using databricks sql:
#!/usr/bin/env python3
from databricks import sql
with sql.connect(server_hostname = "adb-...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

Hi @mobe, please refer to the GitHub link for more examples: https://github.com/databricks/databricks-sql-python/blob/main/examples. Thanks, Shan

  • 0 kudos
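A hedged sketch of the pattern from the guide linked above, using the `databricks-sql-connector` package (`pip install databricks-sql-connector`): connect to a SQL warehouse with the hostname, HTTP path, and token from the warehouse's "Connection details" tab (all placeholders here), run a statement, and fetch the rows.

```python
# Sketch: query a Databricks SQL warehouse from Python. Connection
# values are placeholders; requires the databricks-sql-connector package.
def query_warehouse(server_hostname: str, http_path: str, access_token: str,
                    statement: str = "SELECT 1 AS ok"):
    """Run one statement against a SQL warehouse and return all rows."""
    from databricks import sql  # lazy import; needs the package installed
    with sql.connect(server_hostname=server_hostname,
                     http_path=http_path,
                     access_token=access_token) as conn:
        with conn.cursor() as cur:
            cur.execute(statement)
            return cur.fetchall()
```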
bjjkkk
by New Contributor II
  • 2814 Views
  • 2 replies
  • 1 kudos

Getting 'No GCP Marketplace token provided' error while signing up from GCP marketplace.

Hey guys, I was trying to sign up for the 14-day free trial from GCP Marketplace. When I click 'SIGN UP WITH DATABRICKS', I get the error below. HTTP ERROR 401: Problem accessing /sign-up. Reason: No GCP Marketplace token provided. Please start over fr...

Latest Reply
bjjkkk
New Contributor II
  • 1 kudos

Thanks Walter, I have the IAM permissions in place and also have a valid billing account. However, I keep getting the same error about the missing Marketplace token. I am clicking the 'SIGN UP WITH DATABRICKS' button from the GCP UI, so I am not sure...

  • 1 kudos
1 More Replies
Rizaldy
by New Contributor II
  • 1260 Views
  • 2 replies
  • 0 kudos

HELP: opening a notebook displays blank, creating a new one gives an error, and other issues

Hi. Situation: I just started using Databricks. I created a workspace and a cluster and uploaded a notebook, but my workspace doesn't seem to function correctly at the moment. I will attach what it looks like when I try to open a notebook. Opening ...

Latest Reply
Rizaldy
New Contributor II
  • 0 kudos

UPDATE: I have downloaded Chrome and this does not happen there either.

  • 0 kudos
1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group