Community Discussions

Forum Posts

Jerry01
by New Contributor III
  • 310 Views
  • 1 reply
  • 0 kudos

How to override an in-built function in Databricks

I am trying to override the in-built is_member() function so that it always returns true. How do I do this in Databricks using SQL or Python?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Jerry01, To override the is_member() built-in function in Databricks so that it always returns true, you can follow these steps: Python Approach: Unfortunately, Databricks does not directly allow you to override built-in functions like is_mem...
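A minimal sketch of the workaround that reply points toward: since Databricks does not let you replace the built-in is_member() itself, you can register a Python UDF under a different name and call that from SQL instead. The names always_member and is_member_override below are hypothetical, not part of Databricks:

```python
# Hypothetical shim: the built-in is_member() cannot be overridden, but a
# user-defined function registered under a different name can stand in for it.
def always_member(group_name: str) -> bool:
    """Stand-in for is_member() that unconditionally returns True."""
    return True

# Inside a Databricks notebook you would then register it for SQL use, e.g.:
# spark.udf.register("is_member_override", always_member, "boolean")
# ...and rewrite queries to call is_member_override(...) instead of is_member(...).
```

Note that this only changes queries you rewrite to call the new name; anything still calling is_member() keeps the built-in behavior.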

samarth_solanki
by New Contributor II
  • 673 Views
  • 2 replies
  • 0 kudos

Creating a python package that uses dbutils.secrets

Hello Databricks, I wanted to create a Python package containing a script with a class; this class is where we give a scope and key and we get the secret. This package will be used inside a Databricks notebook. I want to use dbutils.secrets for ...

Latest Reply
mfall-shift
New Contributor II
  • 0 kudos

@samarth_solanki wrote: Hello Databricks, I wanted to create a Python package containing a script with a class; this class is where we give a scope and key and we get the secret. This package will be used inside a Databricks notebook I want to ...
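One way to sketch such a package class, assuming you inject the secret getter rather than importing dbutils directly (dbutils exists only inside a Databricks notebook, so injection keeps the package importable and testable elsewhere; SecretReader is a hypothetical name):

```python
# Sketch of a package-level secret reader. `dbutils` is notebook-only, so the
# class accepts any getter callable; in a notebook, pass dbutils.secrets.get.
class SecretReader:
    def __init__(self, getter):
        # getter: callable(scope, key) -> str, e.g. dbutils.secrets.get
        self._getter = getter

    def get(self, scope: str, key: str) -> str:
        """Return the secret stored under the given scope and key."""
        return self._getter(scope, key)

# Usage inside a Databricks notebook (assumption: dbutils is in scope):
# reader = SecretReader(dbutils.secrets.get)
# token = reader.get("my-scope", "my-key")
```

The injected callable also makes unit tests trivial: pass a stub function instead of dbutils.secrets.get.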

1 More Replies
sbs
by New Contributor II
  • 694 Views
  • 3 replies
  • 0 kudos

issue when reading csv in pandas

Hello Team, I've encountered an issue while attempting to read a CSV data file into a pandas DataFrame after uploading it to DBFS in the Community Edition of Databricks. Below is the error I encountered along with the code snippet I used: import pandas...

Latest Reply
YuliyanBogdanov
New Contributor III
  • 0 kudos

Assuming dbutils.fs.ls works without the "dbfs:/" prefix, try using it directly, i.e. df1 = pd.read_csv("/FileStore/shared_uploads/shiv/Dea.csv") . Alternatively, adjust the path as needed if using a local file path df1 = pd.read_csv("dbfs:/FileStore...
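A small path helper along those lines: pandas uses the local file API, so a dbfs:/ URI has to be mapped to the /dbfs FUSE mount before pandas can open it. This is a sketch under the assumption that the cluster exposes the FUSE mount (it is not available on all Community Edition clusters; the function name is hypothetical):

```python
def to_local_dbfs_path(path: str) -> str:
    """Map a dbfs:/ URI to the /dbfs FUSE mount path that pandas can open.

    Paths that do not start with dbfs:/ are returned unchanged.
    """
    if path.startswith("dbfs:/"):
        return "/dbfs/" + path[len("dbfs:/"):].lstrip("/")
    return path

# Usage sketch (hypothetical file path):
# import pandas as pd
# df1 = pd.read_csv(to_local_dbfs_path("dbfs:/FileStore/shared_uploads/shiv/Dea.csv"))
```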

2 More Replies
thibault
by Contributor
  • 1920 Views
  • 6 replies
  • 0 kudos

Asset Bundles git branch per target

Hi, I am migrating a deployment setup with environment-specific parameters from dbx to Databricks Asset Bundles (DAB). This worked well with dbx, and I am now trying to define those parameters by defining targets (3 targets: dev, uat, p...

Latest Reply
thibault
Contributor
  • 0 kudos

Something must have changed in the meantime on the Databricks side. I have only updated the Databricks CLI to 016, and now, using a git / branch entry under each target, deploying this setup (where feature-dab is the branch I want the job to pull sources from), I see t...
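For readers landing here, the per-target setup being described can be sketched in databricks.yml roughly as follows. This is an assumption-laden sketch, not the poster's actual file: the bundle name, hosts, and branch names are placeholders, and the exact schema should be checked against the Asset Bundles documentation for your CLI version:

```yaml
# Hypothetical databricks.yml: pinning a git branch per target.
bundle:
  name: my_bundle

targets:
  dev:
    mode: development
    git:
      branch: feature-dab   # branch the dev jobs pull sources from
  prod:
    mode: production
    git:
      branch: main
```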

5 More Replies
Prasad_Koneru
by New Contributor II
  • 629 Views
  • 1 reply
  • 0 kudos

DevOps Pipeline failing after implementing Private End Points

Hi Team, I created a DevOps pipeline for Databricks deployment to different environments, and it succeeded. But recently I implemented private endpoints (PEPs) on Databricks, and the DevOps pipeline now fails with the error below. Error: JSONDecodeError: Exp...

Latest Reply
Prasad_Koneru
New Contributor II
  • 0 kudos

Bump @Kaniz

Gopi9
by New Contributor II
  • 981 Views
  • 2 replies
  • 0 kudos

Need Guidance on Key Rotation Process for Storage Customer-Managed Keys in Databricks Workspace

Problem Statement: We are currently utilizing customer-managed keys for Databricks compute encryption at the workspace level. As part of our key rotation strategy, we find ourselves needing to bring down the entire compute/clusters to update storage ...

Latest Reply
feiyun0112
Contributor III
  • 0 kudos

Maybe you can use Azure Key Vault to store customer-managed keys: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#--create-an-azure-key-vault-backed-secret-scope

1 More Replies
tajinder123
by New Contributor II
  • 1890 Views
  • 5 replies
  • 1 kudos

Resolved! Delta External table

Hi, I am new to Databricks and need some input. I am trying to create a Delta external table in Databricks using an existing path that contains CSV files. What I observed is that the code below creates an EXTERNAL table, but the provider is CSV.------------------------...

Latest Reply
shan_chandra
Esteemed Contributor
  • 1 kudos

@tajinder123 - can you please modify the syntax as below to create as a delta table CREATE TABLE employee123 USING DELTA LOCATION '/path/to/existing/delta/files';  

4 More Replies
stevieg95
by New Contributor III
  • 918 Views
  • 2 replies
  • 5 kudos

Getting java.util.concurrent.TimeoutException: Timed out after 15 seconds on community edition

I'm using Databricks Community Edition for learning purposes, and whenever I run a notebook I get: Exception when creating execution context: java.util.concurrent.TimeoutException: Timed out after 15 seconds. I have deleted cluster ...

stevieg95_0-1711552230784.png
Latest Reply
VenuG
New Contributor III
  • 5 kudos

Hello, Databricks Team, my students are reporting that none of them are able to use DBCE, and all are running into this same error when they spin up an instance with defaults (DBR 12.2 LTS). Some have reported seeing this error since last night (3/26 ET). Coul...

1 More Replies
vieiradsousa
by New Contributor II
  • 708 Views
  • 2 replies
  • 0 kudos

Validating Dlt Pipeline

Whenever I try validating a pipeline that already runs productively without any issue, it throws me the following error:BAD_REQUEST: Failed to load notebook '/Repos/(...).sql'. Only SQL and Python notebooks are supported currently.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @vieiradsousa, verify that the path to your notebook is correctly specified. The error message mentions '/Repos/(…).sql', which seems to be a placeholder. Make sure the actual path points to the correct location of your notebook file.

1 More Replies
HeijerM84
by New Contributor III
  • 2453 Views
  • 7 replies
  • 4 kudos

Resolved! Databricks runtime 14.3 gives error scala.math.BigInt cannot be cast to java.lang.Integer

We have a cluster running on 13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12). We want to test with a different type of cluster (14.3 LTS, which includes Apache Spark 3.5.0, Scala 2.12), and all of a sudden we get errors complaining about casting a Big...

Latest Reply
jeroenvs
New Contributor III
  • 4 kudos

I have logged the issue with Microsoft last week and they confirmed it is a Databricks bug. A fix is supposedly being rolled out at the moment across Databricks regions. As anticipated, we have engaged the Databricks core team to further investigate ...

6 More Replies
Wojciech_BUK
by Contributor III
  • 585 Views
  • 1 reply
  • 0 kudos

All my replies have been deleted and new ones are being moderated (deleted) - why?

I noticed that when I reply to a post, trying to help solve a community problem, my posts are either moderated (deleted) or just not saved. Old posts/replies have been deleted. Is there any reason for that? I have kind of lost my will to participat...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Wojciech_BUK , I'm sorry to hear about the issues you've been experiencing with your posts and replies in the community. It's understandable that this situation has been disheartening for you. I'll investigate this matter further to understand wh...

AravindNani
by New Contributor
  • 785 Views
  • 2 replies
  • 0 kudos

Unable to read data from API due to Private IP Restriction

I have data in my API Endpoint but am unable to read it using Databricks. My data is limited to my private IP address and can only be accessed over a VPN connection. I can't read data into Databricks as a result of this. I can obtain the data in VS C...

Latest Reply
Wojciech_BUK
Contributor III
  • 0 kudos

Hi AravindNani, this is more of an infrastructure question. You have to make sure that: 1) your Databricks workspace is provisioned in VNet injection mode; 2) your VNet is either peered to a "HUB" network where you have a S2S VPN connection to the API, or you have t...

1 More Replies
stevieg95
by New Contributor III
  • 1203 Views
  • 7 replies
  • 2 kudos

Unable to login to databricks community edition

Hi, I was using Databricks and it was working fine. I'm using Community Edition. Suddenly, it logged me out. I also clicked on forgot password and changed the password, but when I try to log in again, it keeps redirecting me to the login page without any erro...

stevieg95_0-1711422006387.png
Latest Reply
MohammedIrfan
New Contributor II
  • 2 kudos

Hi, I am also facing the same issue.

6 More Replies
Shravanshibu
by New Contributor III
  • 342 Views
  • 1 reply
  • 0 kudos

Public preview API not working - artifact-allowlists

I am trying to hit /api/2.1/unity-catalog/artifact-allowlists/ as part of an INIT migration script. It is in public preview; do we need to enable anything else to use an API that is in public preview? I am getting a 404 error. But using the same token for ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Shravanshibu ,  Verify that the URL you are using to access the API is correct. Ensure there are no typos or missing segments in the URL.Double-check the endpoint path: /api/2.1/unity-catalog/artifact-allowlists/.Confirm that the API is available...
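To make that URL check concrete, a tiny helper that assembles the full endpoint can make a stray slash or wrong workspace host easy to spot. This is a sketch; the function name and workspace host are hypothetical, and the artifact type segment (e.g. INIT_SCRIPT) should match the Unity Catalog artifact-allowlists API documentation:

```python
def allowlist_url(workspace_host: str, artifact_type: str) -> str:
    """Build the artifact-allowlists endpoint URL for a given artifact type.

    workspace_host: e.g. "https://adb-123.azuredatabricks.net"
    artifact_type:  e.g. "INIT_SCRIPT"
    """
    return (workspace_host.rstrip("/")
            + "/api/2.1/unity-catalog/artifact-allowlists/"
            + artifact_type)

# Usage sketch: pass the result to your HTTP client with a bearer token, e.g.
# requests.get(allowlist_url(host, "INIT_SCRIPT"),
#              headers={"Authorization": f"Bearer {token}"})
```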

RozaZaharieva
by New Contributor
  • 484 Views
  • 1 reply
  • 0 kudos

set up Azure Databricks workspace and Unity catalog - how to automate not using Terraform

Hi everyone, I am looking for a way to automate the initial setup of an Azure Databricks workspace and Unity Catalog, but can't find anything on this topic other than Terraform. Can you share whether this is possible with PowerShell, for example? Thank you in adv...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @RozaZaharieva, Automating the initial setup of Azure Databricks workspace and Unity Catalog is crucial for efficient management. Let’s break it down: Unity Catalog Setup: Unity Catalog is used to manage data in your Azure Databricks workspac...
