Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

margutie
by New Contributor
  • 763 Views
  • 2 replies
  • 0 kudos

Error from KNIME through proxy

I want to connect to Databricks from KNIME on a company computer that uses a proxy. The error I'm encountering is as follows: ERROR Create Databricks Environment 3:1 Execute failed: Could not open the client transport with JDBC URI: jdbc:hive2://adb-...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
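When a corporate proxy sits between KNIME and Databricks, the JDBC driver usually needs the proxy spelled out in the connection URL rather than relying on system settings. A hedged sketch of such a URL, assuming the Simba-based Databricks JDBC driver and its UseProxy/ProxyHost/ProxyPort properties (verify the exact property names against your driver version's documentation; every host, path, and credential value below is a placeholder):

```
jdbc:databricks://adb-<workspace-id>.<n>.azuredatabricks.net:443/default;
    transportMode=http;ssl=1;httpPath=<cluster-or-warehouse-http-path>;
    AuthMech=3;UID=token;PWD=<personal-access-token>;
    UseProxy=1;ProxyHost=<proxy-host>;ProxyPort=<proxy-port>
```

If KNIME's bundled driver still uses the legacy jdbc:hive2:// scheme (as the error above suggests), its proxy options may differ; registering the current Databricks JDBC driver in KNIME's preferences is often the first step.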
Phani1
by Valued Contributor II
  • 1188 Views
  • 2 replies
  • 1 kudos

Billing usage per user

Hi Team, Unity Catalog is not enabled in our workspace. We would like to know the billing usage information per user; could you please help us with how to get these details (using a notebook-level script)? Regards, Phanindra

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
udi_azulay
by New Contributor II
  • 717 Views
  • 2 replies
  • 0 kudos

Running SQL commands on a Single User cluster vs. a Shared cluster

Hi, when I am running the below simple code over my Unity Catalog on a Shared cluster, it works very well. But on a Single User cluster I am getting: Failed to acquire a SAS token for list on /__unitystorage/schemas/1bb5b053-ac96-471b-8077-8288c56c8a20/tab...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
Databricks_Work
by New Contributor II
  • 2199 Views
  • 1 replies
  • 0 kudos

How to access data in one Databricks workspace from another

I want to access data from another Databricks workspace in my Databricks workspace. How can I do that?

Latest Reply
Walter_C
Honored Contributor
  • 0 kudos

Hello, many thanks for your question. To be able to provide you with a more precise response, we require some additional information: 1. When you say "databricks in my databricks", are you referring to accessing data that is in one workspace from another wor...

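If the goal is reading one workspace's tables from another, Delta Sharing is one common route. A hedged Python sketch using the open-source delta-sharing client; the profile file path and the share/schema/table names are made-up placeholders, not values from this thread:

```python
def sharing_url(profile: str, share: str, schema: str, table: str) -> str:
    """Build the '<profile>#<share>.<schema>.<table>' URL the client expects."""
    return f"{profile}#{share}.{schema}.{table}"

def load_shared_table(profile: str, share: str, schema: str, table: str):
    """Fetch a shared table as a pandas DataFrame (requires network access
    and a valid Delta Sharing profile file)."""
    import delta_sharing  # pip install delta-sharing
    return delta_sharing.load_as_pandas(sharing_url(profile, share, schema, table))
```

Usage would look like `load_shared_table("config.share", "my_share", "default", "events")`, where config.share is the credential file downloaded from the sharing workspace. If both workspaces sit on the same Unity Catalog metastore, catalog-level grants may be simpler than Delta Sharing.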
hbs59
by New Contributor III
  • 5165 Views
  • 4 replies
  • 2 kudos

Resolved! Move multiple notebooks at the same time (programmatically)

If I want to move multiple (hundreds of) notebooks at the same time from one folder to another, what is the best way to do that, other than going to each individual notebook and clicking "Move"? Is there a way to programmatically move notebooks? Like ...

Latest Reply
Walter_C
Honored Contributor
  • 2 kudos

You should be redirected to the KB page, but this is the information contained: Problem: how to migrate Shared folders and notebooks. Cause: Shared notebooks are not migrated into a new workspace by default. Solution: please find the script to migrate t...

3 More Replies
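For bulk moves, the Workspace API can script what the UI's "Move" button does one notebook at a time. A hedged sketch, assuming the Workspace API 2.0 export/import/delete endpoints (verify the supported formats against the current API docs); the host, token, and folder names are placeholders:

```python
def target_path(src_path: str, src_root: str, dst_root: str) -> str:
    """Map a notebook path from the old folder to the new one,
    e.g. /Old/a/b -> /New/a/b."""
    assert src_path.startswith(src_root)
    return dst_root + src_path[len(src_root):]

def move_notebook(host: str, token: str, src: str, dst: str) -> None:
    """Export a notebook, re-import it at dst, then delete the original."""
    import requests  # pip install requests
    h = {"Authorization": f"Bearer {token}"}
    exp = requests.get(f"{host}/api/2.0/workspace/export", headers=h,
                       params={"path": src, "format": "DBC"})
    exp.raise_for_status()
    imp = requests.post(f"{host}/api/2.0/workspace/import", headers=h,
                        json={"path": dst, "format": "DBC",
                              "content": exp.json()["content"]})
    imp.raise_for_status()
    requests.post(f"{host}/api/2.0/workspace/delete", headers=h,
                  json={"path": src}).raise_for_status()
```

Looping `target_path` over the results of `GET /api/2.0/workspace/list` covers a whole folder; the Databricks CLI's `workspace export_dir`/`import_dir` commands are an alternative for folder-level migration.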
Phani1
by Valued Contributor II
  • 2344 Views
  • 2 replies
  • 2 kudos

Databricks API using the personal access token

We can access the Azure Databricks API using a personal access token which is created by us manually. The objective is that the client doesn't want to store the personal access token, which may not be secure. Do we have an option to generate the token during ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 2 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
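One common pattern for avoiding stored PATs on Azure is to have a service principal request a short-lived Microsoft Entra ID (Azure AD) token at call time via the client-credentials flow, and pass that as the Bearer token to Databricks APIs. A hedged sketch: the 2ff814a6-... value is, to my knowledge, the well-known Azure Databricks resource ID, but verify it against current Azure documentation; tenant/client values are placeholders.

```python
# Well-known Azure AD application ID for Azure Databricks (assumption: verify
# against current Azure docs before relying on it).
DATABRICKS_RESOURCE = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the token endpoint URL and form payload for the
    client-credentials grant."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{DATABRICKS_RESOURCE}/.default",
    }
    return url, payload

def fetch_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Request a short-lived access token (requires network access)."""
    import requests  # pip install requests
    url, payload = token_request(tenant_id, client_id, client_secret)
    resp = requests.post(url, data=payload)
    resp.raise_for_status()
    return resp.json()["access_token"]
```

The client secret still has to live somewhere (e.g. Azure Key Vault), but the token itself is short-lived and never stored.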
Eldar_Dragomir
by New Contributor II
  • 4214 Views
  • 3 replies
  • 0 kudos

Databricks Volume. Not able to read a file from Scala.

I used to use DBFS with mounted directories and now I want to switch to Volumes for storing my jars and application.conf for pipelines. I see my application.conf file in Data Explorer > Catalog > Volumes, and I also see the file with dbutils.fs.ls("/...

Get Started Discussions
Databricks
Unity Catalog
Latest Reply
argus7057
New Contributor II
  • 0 kudos

Volume mounts are accessible using Scala code only on a shared cluster. In single user mode this feature is not supported yet. We use init scripts to move contents from Volumes to the cluster's local drive when we need to access files from native Scala ...

2 More Replies
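A minimal Python sketch of that staging workaround: copy files from a Volume path onto the cluster's local disk so JVM/native code can read them with plain file I/O. The paths are illustrative; on a real cluster `src_dir` would be something like /Volumes/&lt;catalog&gt;/&lt;schema&gt;/&lt;volume&gt;/, and the same logic could live in an init script.

```python
import shutil
from pathlib import Path

def stage_locally(src_dir: str, dst_dir: str) -> list:
    """Recursively copy src_dir into dst_dir; return the copied file paths."""
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in Path(src_dir).rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src_dir)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # preserves timestamps/permissions
            copied.append(str(target))
    return copied
```

For example, `stage_locally("/Volumes/main/default/conf", "/local_disk0/conf")` would let Scala code read application.conf from the local path.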
ChristianRRL
by Contributor III
  • 3462 Views
  • 2 replies
  • 1 kudos

Resolved! DLT Notebook and Pipeline Separation vs Consolidation

Super basic question. For DLT pipelines I see there's an option to add multiple "Paths". Is it generally best practice to completely separate `bronze` from `silver` notebooks? Or is it more recommended to bundle both raw `bronze` and clean `silver` d...

[attached image: ChristianRRL_1-1705597040187.png]
Latest Reply
ChristianRRL
Contributor III
  • 1 kudos

This is great! I completely missed the list view before.

1 More Replies
Databricks_Work
by New Contributor II
  • 1938 Views
  • 2 replies
  • 0 kudos
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
Databricks_Work
by New Contributor II
  • 1791 Views
  • 2 replies
  • 1 kudos
Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
marcusfox
by New Contributor
  • 1152 Views
  • 2 replies
  • 0 kudos

Databricks setup with Azure storage

Hi, we have an issue with our initial setup and design. We are using a single Azure Premium block blob storage account with hierarchical namespace and LRS enabled. We have three containers within it, one for each environment (Dev, Test, Prod). But the ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
Phani1
by Valued Contributor II
  • 7704 Views
  • 1 replies
  • 0 kudos

Cloudera SQL

Hi Team, could you please help me with how to efficiently/quickly convert Cloudera SQL and Hive SQL scripts to PySpark? Regards, Phanindra

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Phani1, one way to convert Cloudera SQL and Hive SQL scripts to PySpark is to use the sqlContext.sql() method, which allows you to execute SQL queries in PySpark and return the results as a DataFrame.

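Building on the reply above: often the fastest "conversion" is not rewriting at all, since most HiveQL runs unchanged on Spark SQL; the existing script can simply be fed statement by statement to spark.sql(). A sketch with a deliberately naive statement splitter (it does not handle semicolons inside string literals or comments):

```python
def split_statements(script: str) -> list:
    """Naively split a SQL script on ';' into non-empty statements."""
    return [s.strip() for s in script.split(";") if s.strip()]

def run_script(spark, script: str):
    """Execute each statement in order; the last result is returned
    as a DataFrame."""
    result = None
    for stmt in split_statements(script):
        result = spark.sql(stmt)
    return result
```

In a Databricks notebook, where `spark` is predefined, `run_script(spark, open("etl.hql").read())` would execute the script; statements relying on Hive-only features may still need manual rewriting.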
chrisf_sts
by New Contributor II
  • 2890 Views
  • 1 replies
  • 0 kudos

How to handle complex json schema

I have a mounted external directory that is an S3 bucket with multiple subdirectories containing call log files in JSON format. The files are irregular and complex; when I try to use spark.read.json or spark.sql (SELECT *) I get the UNABLE_TO_INFER_...

Get Started Discussions
json
pyspark
schema
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @chrisf_sts, One possible approach is to use the spark.read.option("multiline", "true") method to read multi-line JSON files into a Spark DataFrame. This option allows Spark to handle JSON objects that span multiple lines. You can also use the inf...

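Expanding on the reply above: besides the multiline option, supplying an explicit schema avoids UNABLE_TO_INFER_SCHEMA entirely, and flattening nested keys can make irregular records easier to compare. The flatten helper below is plain Python for illustration (it is not a Spark API), and read_call_logs assumes a notebook where a Spark session is available:

```python
def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten nested dict keys into dotted names:
    {'a': {'b': 1}} -> {'a.b': 1}."""
    out = {}
    for k, v in record.items():
        key = f"{prefix}{k}"
        if isinstance(v, dict):
            out.update(flatten(v, key + "."))
        else:
            out[key] = v
    return out

def read_call_logs(spark, path: str):
    """Read multi-line JSON; add .schema(<StructType>) before .json(...)
    to skip inference on irregular files."""
    return spark.read.option("multiline", "true").json(path)
```

A quick check: `flatten({"caller": {"id": 7}, "dur": 30})` yields `{"caller.id": 7, "dur": 30}`, i.e. dotted column names suitable for tabular analysis.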
dhrubg
by New Contributor
  • 4562 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks for practice at no cost: which cloud service or combination do I need to use?

Hi all, context: I want to use Databricks for practice to create projects and keep polishing my knowledge. My free credits are already used up. Now can you please give me tips on how to run Databricks, and in which cloud provider (storage account com...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
liefeld
by New Contributor
  • 1963 Views
  • 1 replies
  • 0 kudos

Foreign catalogs aren't populated.

I've created connections to various RDS Aurora databases but always get the same problem: when creating a foreign catalog, only the information_schema database is shown in Catalog Explorer. The AI chat agent has suggested a few ways to specify the databa...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @liefeld, Could you please paste the error stack here?    


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group