Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Eldar_Dragomir
by New Contributor II
  • 3630 Views
  • 3 replies
  • 0 kudos

Databricks Volume. Not able to read a file from Scala.

I used to use dbfs with mounted directories and now I want to switch to Volumes for storing my jars and application.conf for pipelines. I see the file my application.conf in Data Explorer > Catalog > Volumes, I also see the file with dbutils.fs.ls("/...

Get Started Discussions
Databricks
Unity Catalog
Latest Reply
argus7057
New Contributor II
  • 0 kudos

Volume mounts are accessible from Scala code only on shared clusters; in single-user mode this feature is not supported yet. We use init scripts to move contents from Volumes to the cluster's local drive when we need to access files from native Scala ...

  • 0 kudos
2 More Replies
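The staging workaround described in the reply can be sketched as a small Python helper (the Volume and local paths below are hypothetical, and `dbutils` only exists on a Databricks cluster, so it is passed in explicitly here):

```python
def stage_from_volume(dbutils, volume_path: str, local_path: str) -> str:
    """Copy a file from a Unity Catalog Volume to the driver's local disk.

    Useful when JVM/Scala code cannot resolve /Volumes paths directly
    (e.g. on single-user clusters): the staged copy is a plain local file.
    """
    dbutils.fs.cp(volume_path, "file:" + local_path)
    return local_path

# In a Databricks notebook (paths are hypothetical):
# conf_path = stage_from_volume(
#     dbutils,
#     "/Volumes/main/default/pipeline_configs/application.conf",
#     "/tmp/application.conf",
# )
```

Because the function takes `dbutils` as a parameter, the same pattern also works from a job or an init-script-driven setup step.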
ChristianRRL
by Contributor II
  • 3105 Views
  • 2 replies
  • 1 kudos

Resolved! DLT Notebook and Pipeline Separation vs Consolidation

Super basic question. For DLT pipelines I see there's an option to add multiple "Paths". Is it generally best practice to completely separate `bronze` from `silver` notebooks? Or is it more recommended to bundle both raw `bronze` and clean `silver` d...

ChristianRRL_1-1705597040187.png
Latest Reply
ChristianRRL
Contributor II
  • 1 kudos

This is great! I completely missed the list view before.

  • 1 kudos
1 More Replies
Databricks_Work
by New Contributor II
  • 1759 Views
  • 2 replies
  • 0 kudos
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 0 kudos
1 More Replies
Databricks_Work
by New Contributor II
  • 1654 Views
  • 2 replies
  • 1 kudos
Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 1 kudos
1 More Replies
marcusfox
by New Contributor
  • 931 Views
  • 2 replies
  • 0 kudos

Databricks setup with Azure storage

Hi, We have an issue with our initial setup and design. We are using a single Azure Premium block blob storage account with hierarchical namespace and LRS enabled. We have three containers within it, one for each environment (Dev, Test, Prod), but the ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 0 kudos
1 More Replies
Phani1
by Valued Contributor
  • 7515 Views
  • 1 reply
  • 0 kudos

Cloudera SQL

Hi Team, Could you please advise how we can efficiently/quickly convert Cloudera SQL and Hive SQL scripts to PySpark scripts? Regards, Phanindra

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Phani1, One way to convert Cloudera SQL and Hive SQL scripts to PySpark is to use the sqlContext.sql() method, which allows you to execute SQL queries in PySpark and return the results as a DataFrame.

  • 0 kudos
chrisf_sts
by New Contributor II
  • 2242 Views
  • 1 reply
  • 0 kudos

How to handle complex json schema

I have a mounted external directory that is an S3 bucket with multiple subdirectories containing call log files in JSON format. The files are irregular and complex; when I try to use spark.read.json or spark.sql (SELECT *) I get the UNABLE_TO_INFER_...

Get Started Discussions
json
pyspark
schema
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @chrisf_sts, One possible approach is to use the spark.read.option("multiline", "true") method to read multi-line JSON files into a Spark DataFrame. This option allows Spark to handle JSON objects that span multiple lines. You can also use the inf...

  • 0 kudos
dhrubg
by New Contributor
  • 2847 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks for practice at no cost: which cloud service or combination do I need to use?

Hi all, Context: I want to use Databricks for practice, to create projects and keep polishing my knowledge. My free credits are already used up. Can you please give me tips on how to run Databricks, and in which cloud provider (storage account com...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 0 kudos
1 More Replies
liefeld
by New Contributor
  • 1774 Views
  • 1 reply
  • 0 kudos

Foreign catalogs aren't populated.

I've created connections to various RDS Aurora databases but always get the same problem: when creating a foreign catalog, only the information_schema database is shown in Catalog Explorer. The AI chat agent has suggested a few ways to specify the databa...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @liefeld, Could you please paste the error stack here?    

  • 0 kudos
mobe
by New Contributor
  • 2418 Views
  • 1 reply
  • 0 kudos

How to query sql warehouse tables with spark?

Hey there... I managed to query my data following this guide, https://learn.microsoft.com/en-us/azure/databricks/dev-tools/python-sql-connector, using databricks sql: #!/usr/bin/env python3 from databricks import sql with sql.connect(server_hostname = "adb-...

  • 2418 Views
  • 1 replies
  • 0 kudos
Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

Hi @mobe, please refer to the GitHub link for more examples: https://github.com/databricks/databricks-sql-python/blob/main/examples. Thanks, Shan

  • 0 kudos
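The linked examples follow a DB-API-style connect/cursor pattern. A minimal sketch, with the query helper factored out so it works against any cursor-bearing connection (the hostname, HTTP path, token, and table name below are placeholders):

```python
def run_query(connection, statement):
    """Execute a statement on a DB-API-style connection (such as the one
    returned by databricks.sql.connect) and return all rows."""
    with connection.cursor() as cursor:
        cursor.execute(statement)
        return cursor.fetchall()

# Against a real SQL warehouse (all values below are placeholders):
# from databricks import sql  # pip install databricks-sql-connector
# with sql.connect(
#     server_hostname="adb-<workspace-id>.azuredatabricks.net",
#     http_path="/sql/1.0/warehouses/<warehouse-id>",
#     access_token="<personal-access-token>",
# ) as conn:
#     rows = run_query(conn, "SELECT * FROM samples.nyctaxi.trips LIMIT 5")
```

Keeping the query logic separate from connection setup makes it easy to reuse across warehouses or to exercise with a stub connection in tests.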
sheilaL
by New Contributor II
  • 2298 Views
  • 2 replies
  • 0 kudos

File size upload limit through CLI

Does anyone know the size limit for uploading files through the CLI? I'm not finding it in the documentation.

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @sheilaL, For Databricks, the size limit for uploading files through the Command Line Interface (CLI) is 2GB. If you use local file I/O APIs to read or write files larger than 2GB, you might see corrupted files. Instead, for files larger than 2GB,...

  • 0 kudos
1 More Replies
bjjkkk
by New Contributor II
  • 2422 Views
  • 2 replies
  • 1 kudos

Getting 'No GCP Marketplace token provided' error while signing up from GCP marketplace.

Hey guys, I was trying to sign up for the 14-day free trial from GCP Marketplace. When I click 'SIGN UP WITH DATABRICKS', I get the error below: HTTP ERROR 401. Problem accessing /sign-up. Reason: No GCP Marketplace token provided. Please start over fr...

bjjkkk_0-1704807716840.png
Latest Reply
bjjkkk
New Contributor II
  • 1 kudos

Thanks Walter, I have the IAM permissions in place and also have a valid billing account. However, I keep getting the same error about the missing Marketplace token. I am clicking the 'SIGN UP WITH DATABRICKS' button from the GCP UI, so I am not sure...

  • 1 kudos
1 More Replies
Rizaldy
by New Contributor II
  • 561 Views
  • 2 replies
  • 0 kudos

HELP: opening a notebook displays blank, creating a new one gives an error, and other issues

Hi, Situation: I just started using Databricks. I created a workspace, a cluster, and uploaded a notebook, but my workspace doesn't seem to function correctly at the moment. I will attach what it looks like when I try to open a notebook. Opening ...

Screenshot 2024-01-10 at 4.09.55 PM.png Screenshot 2024-01-10 at 4.24.16 PM.png Screenshot 2024-01-10 at 4.25.32 PM.png Screenshot 2024-01-10 at 4.39.13 PM.png
Latest Reply
Rizaldy
New Contributor II
  • 0 kudos

UPDATE: I have downloaded Chrome and this does not happen there.

  • 0 kudos
1 More Replies
SalDossored
by New Contributor II
  • 2020 Views
  • 2 replies
  • 1 kudos

PPT material or document from Databricks Learning

Hello Databricks Community, I am a beginner with Databricks. I am wondering whether we can download PowerPoint slides or learning documents from the Databricks Learning Platform; I like to read after taking the online course. Could you let me know? Curren...

Get Started Discussions
Learning Databricks
study material
Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Thank you for posting your concern on Community! To expedite your request, please list your concerns on our ticketing portal. Our support staff would be able to act faster on the resolution (our standard resolution time is 24-48 hours).

  • 1 kudos
1 More Replies
