Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

sbs
by New Contributor II
  • 2314 Views
  • 3 replies
  • 0 kudos

Issue when reading CSV in pandas

 Hello Team, I've encountered an issue while attempting to read a CSV data file into a pandas DataFrame after uploading it to DBFS in the Community Edition of Databricks. Below is the error I encountered along with the code snippet I used: import pandas...

Latest Reply
YuliyanBogdanov
New Contributor III
  • 0 kudos

Assuming dbutils.fs.ls works without the "dbfs:/" prefix, try using it directly, i.e. df1 = pd.read_csv("/FileStore/shared_uploads/shiv/Dea.csv"). Alternatively, adjust the path as needed if using a local file path: df1 = pd.read_csv("dbfs:/FileStore...
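
A minimal sketch of the two approaches, assuming the file sits at the path from the original post. On Databricks, pandas goes through the local file API, so a DBFS path is usually read via the /dbfs mount; if that mount is unavailable (as it can be on the Community Edition), reading with Spark and converting to pandas is a common fallback.

```python
import pandas as pd

# Option 1: pandas uses the local file API, so on most workspaces a DBFS
# path is read through the /dbfs mount (file path taken from the original post).
df1 = pd.read_csv("/dbfs/FileStore/shared_uploads/shiv/Dea.csv")

# Option 2: if the /dbfs mount is not available (e.g. on the Community
# Edition), read the file with Spark and convert the result to pandas.
# df1 = (spark.read.csv("dbfs:/FileStore/shared_uploads/shiv/Dea.csv", header=True)
#        .toPandas())

print(df1.head())
```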

2 More Replies
Kayla
by Valued Contributor II
  • 2699 Views
  • 0 replies
  • 0 kudos

Export notebook dashboard

I'm looking to find a way to export notebook dashboards as HTML files. We will be scheduling the notebook via Workflows, so I'm not sure if we'd be looking at exporting something from the workflow via the API or if there's a better way to do this. I'm also c...
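
One possible route, sketched below with hedging: since the notebook runs as a workflow, the Jobs API's runs/export endpoint can return a run's dashboard views as HTML. The workspace URL, token, and run ID below are placeholders, and this assumes the run is a notebook task whose dashboards you want to capture.

```python
import requests

# Placeholders -- substitute your workspace URL, a personal access token, and a run ID.
HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"
RUN_ID = 123456

# Export the views of a finished run; views_to_export=DASHBOARDS limits
# the output to dashboard views rather than the full notebook.
resp = requests.get(
    f"{HOST}/api/2.0/jobs/runs/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": RUN_ID, "views_to_export": "DASHBOARDS"},
)
resp.raise_for_status()

for view in resp.json().get("views", []):
    # Each returned view carries its rendered HTML in the "content" field.
    with open(f"{view['name']}.html", "w") as f:
        f.write(view["content"])
```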

Prasad_Koneru
by New Contributor III
  • 1721 Views
  • 1 reply
  • 0 kudos

DevOps Pipeline failing after implementing Private End Points

Hi Team, I have created a DevOps pipeline for Databricks deployment to different environments, and it succeeded. Recently, however, I implemented private endpoints (PEs) on Databricks, and the DevOps pipeline now fails with the error below. Error: JSONDecodeError: Exp...
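
A JSONDecodeError at this stage usually means the deployment step received an empty or HTML response instead of JSON, which after enabling private endpoints often comes down to the DevOps agent no longer reaching the workspace URL. A small hedged check that can be run from the agent (workspace URL and token are placeholders) to see what the API is actually returning:

```python
import requests

HOST = "https://<workspace-url>"   # placeholder
TOKEN = "<personal-access-token>"  # placeholder

# Hit a lightweight authenticated endpoint and print the raw response;
# if the body is empty or HTML, downstream JSON parsing will fail in the
# same way as the reported JSONDecodeError.
resp = requests.get(
    f"{HOST}/api/2.0/clusters/spark-versions",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
print(resp.status_code)
print(resp.headers.get("Content-Type"))
print(resp.text[:500])
```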

Latest Reply
Prasad_Koneru
New Contributor III
  • 0 kudos

Bump @Retired_mod

Gopi9
by New Contributor II
  • 2720 Views
  • 2 replies
  • 0 kudos

Need Guidance on Key Rotation Process for Storage Customer-Managed Keys in Databricks Workspace

Problem Statement: We are currently utilizing customer-managed keys for Databricks compute encryption at the workspace level. As part of our key rotation strategy, we find ourselves needing to bring down the entire compute/clusters to update storage ...

Latest Reply
feiyun0112
Honored Contributor
  • 0 kudos

Maybe you can use Azure Key Vault to store customer-managed keys: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#--create-an-azure-key-vault-backed-secret-scope
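
For reference, the linked page covers Azure Key Vault-backed secret scopes; a rough sketch of creating one through the Secrets REST API is below. The scope name, Key Vault resource ID, and DNS name are placeholders, and this call typically requires a Microsoft Entra ID (Azure AD) token rather than a workspace PAT.

```python
import requests

HOST = "https://<workspace-url>"       # placeholder
AAD_TOKEN = "<azure-ad-access-token>"  # placeholder; a PAT is typically not sufficient here

payload = {
    "scope": "my-keyvault-scope",  # placeholder scope name
    "scope_backend_type": "AZURE_KEYVAULT",
    "backend_azure_keyvault": {
        "resource_id": "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault>",
        "dns_name": "https://<vault>.vault.azure.net/",
    },
}

# Create the Key Vault-backed secret scope.
resp = requests.post(
    f"{HOST}/api/2.0/secrets/scopes/create",
    headers={"Authorization": f"Bearer {AAD_TOKEN}"},
    json=payload,
)
resp.raise_for_status()
```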

1 More Replies
DatabricksGuide
by New Contributor III
  • 1666 Views
  • 0 replies
  • 0 kudos

Join Our Databricks Free Trial Experience feedback AMA on Friday March 29, 2024!

We're looking for feedback on the Databricks free trial experience, and we need your help! Whether you've used it for data engineering, data science, or analytics, Sujit Nair, our Product Manager on the free trial experience, and our journey archite...

alpacas
by New Contributor II
  • 1412 Views
  • 0 replies
  • 0 kudos

Help with getting photon usage

I want to determine whether Photon was used for a job or not. The API lets me get this for maybe 40% of jobs through the runtime_engine field, but for the majority of jobs it is unspecified. How do I tell whether Photon was used in those cases? The docs mention ...
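
One hedged heuristic: when runtime_engine is unset on a run's cluster, the spark_version string often still reveals Photon, since Photon runtimes include "photon" in the version name (e.g. "13.3.x-photon-scala2.12"). A sketch, with the workspace URL, token, and cluster ID as placeholders:

```python
import requests

HOST = "https://<workspace-url>"   # placeholder
TOKEN = "<personal-access-token>"  # placeholder
CLUSTER_ID = "<cluster-id>"        # placeholder, e.g. taken from a job run's cluster_instance

# Fetch the cluster definition for the run's cluster.
resp = requests.get(
    f"{HOST}/api/2.0/clusters/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"cluster_id": CLUSTER_ID},
)
resp.raise_for_status()
cluster = resp.json()

# Prefer the explicit field when it is present...
engine = cluster.get("runtime_engine")
# ...and fall back to the runtime version string, which contains "photon"
# for Photon-enabled runtimes.
used_photon = engine == "PHOTON" or "photon" in cluster.get("spark_version", "").lower()
print(used_photon)
```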

tajinder123
by New Contributor II
  • 6901 Views
  • 5 replies
  • 1 kudos

Resolved! Delta External table

Hi, I am new to Databricks and need some input. I am trying to create a Delta external table in Databricks using an existing path which contains CSV files. What I observed is that the code below creates an EXTERNAL table, but the provider is CSV. ------------------------...

Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

@tajinder123 - can you please modify the syntax as below to create it as a Delta table: CREATE TABLE employee123 USING DELTA LOCATION '/path/to/existing/delta/files';
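
The same statement from a Python cell, sketched with hedging and assuming the built-in `spark` session of a Databricks notebook: CREATE TABLE ... USING DELTA LOCATION expects the path to already hold Delta files, so if the location only contains raw CSVs, one option is to load them and write them out in Delta format first (all paths below are placeholders).

```python
# If the path already contains Delta files, registering the external table
# is a single statement (table name and path as in the reply above).
spark.sql("CREATE TABLE employee123 USING DELTA LOCATION '/path/to/existing/delta/files'")

# If the path only holds CSVs, one option is to convert them to Delta at an
# external location first, then register that location (placeholder paths).
# (spark.read.csv("/path/to/csv/files", header=True, inferSchema=True)
#  .write.format("delta").save("/path/to/new/delta/location"))
# spark.sql("CREATE TABLE employee123 USING DELTA LOCATION '/path/to/new/delta/location'")
```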

4 More Replies
Frustrated_DE
by New Contributor III
  • 2222 Views
  • 1 reply
  • 0 kudos

DLT SQL demo pipeline issue

Hi, this is my first foray into DLT, and I'm following the code excerpts from the sample DLT notebook. I'm creating a notebook with the SQL below: CREATE STREAMING LIVE TABLE sales_orders_raw COMMENT "The raw sales orders, ingested from /databricks-datasets." TBLPROPERTIES ...

Latest Reply
Frustrated_DE
New Contributor III
  • 0 kudos

The fix is to change the notebook's default language rather than using a magic command. I normally have the default set to Python and had wrongly assumed DLT would handle the switch, but you can't use a magic command for this; the notebook's default language has to be changed for it to work.

vieiradsousa
by New Contributor II
  • 2079 Views
  • 1 reply
  • 0 kudos

Validating DLT Pipeline

Whenever I try validating a pipeline that already runs in production without any issue, it throws the following error: BAD_REQUEST: Failed to load notebook '/Repos/(...).sql'. Only SQL and Python notebooks are supported currently.

