Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

JrVerbiest
by New Contributor II
  • 746 Views
  • 2 replies
  • 3 kudos

Resolved! Databricks Free Edition - compute does not start

Hello, I am using the Databricks Free Edition, but for the past couple of days I have been facing a problem where I am unable to start the serverless compute. It takes forever to initiate. When I check the log, I see the message that the compute <......

Latest Reply
Advika
Community Manager
  • 3 kudos

Hello @JrVerbiest! Thanks for sharing this, @BS_THE_ANALYST. This is a known issue with the Free Edition, and the team is already aware and working on a fix. It should be resolved soon. Thanks for your patience in the meantime!

1 More Replies
spearitchmeta
by Contributor
  • 1264 Views
  • 6 replies
  • 5 kudos

Resolved! Databricks CLI is encoding the secrets in Base64 automatically

Hello everyone, I am using my Mac terminal to save a Databricks primary key using a specific scope and secret. Everything runs smoothly, except when I get to the step where I generate a secret. The problem is that my primary key is, for example, "test12...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 5 kudos

Hi @spearitchmeta, yes, you're right. In order to read the value of a secret using the Databricks CLI, you must decode the Base64-encoded value. You can use jq to extract the value and base64 --decode to decode it: databricks secrets get-secret <scope-n...

5 More Replies
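The decoding step in the accepted answer above can be sketched in Python (a minimal example; the key name and plaintext below are made up, and the JSON merely mimics what `databricks secrets get-secret` returns):

```python
import base64
import json

# Simulated output of `databricks secrets get-secret <scope-name> <key-name>`:
# the CLI returns the secret's "value" field Base64-encoded.
raw = json.dumps({"key": "primary-key", "value": base64.b64encode(b"test123").decode()})

# Extract the value (jq does this step on the command line) and decode it.
secret = json.loads(raw)
decoded = base64.b64decode(secret["value"]).decode("utf-8")
print(decoded)  # → test123
```

The equivalent command-line round trip would pipe the CLI output through `jq -r .value` and then `base64 --decode`.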
jitenjha11
by New Contributor II
  • 956 Views
  • 3 replies
  • 0 kudos

How to run n processes in parallel in Databricks

Requirement: I have a volume into which random txt files with random numbers arrive from MQ. In my workspace I have a Python script. Also, I have created a job that triggers automatically when a new file arrives in the volume. My requirement is, I need...

Latest Reply
BR_DatabricksAI
Contributor III
  • 0 kudos

Hello @jitenjha11: You can do it in the same manner as highlighted by @MujtabaNoori, but you have to call the process twice. Sharing the sample reference code below: iterating through the files in each directory. for directory in direct...

2 More Replies
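A minimal sketch of the fan-out pattern discussed in this thread, using Python's concurrent.futures (the `process_file` body and the file list are placeholders; a real job would list the files in the volume path the trigger watches):

```python
from concurrent.futures import ThreadPoolExecutor

def process_file(path: str) -> str:
    # Placeholder for the real per-file work (e.g. parsing the numbers in an MQ txt file).
    return f"processed {path}"

# Hypothetical file list; in practice, list the volume directory instead.
files = ["/Volumes/demo/raw/a.txt", "/Volumes/demo/raw/b.txt", "/Volumes/demo/raw/c.txt"]

# Process up to 4 files concurrently; pool.map preserves input order in the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_file, files))

print(results)
```

Threads suit I/O-bound per-file work; for CPU-bound parsing, a ProcessPoolExecutor or distributing the file list over Spark tasks would be the usual alternatives.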
APJESK
by New Contributor III
  • 745 Views
  • 3 replies
  • 2 kudos

Provision users and groups from an Identity Provider (IdP)

In our organization, SCIM is not supported for user and group provisioning. I’d like to know what other options are available to provision users and groups from an Identity Provider (IdP) into Databricks. Are there alternative methods (e.g., JIT provi...

Latest Reply
APJESK
New Contributor III
  • 2 kudos

Thank you for sharing this information. I would like to inform you that our environment is Databricks on AWS, and our IdP is Ping Federate. Could you please advise if there are equivalent solutions or recommended best practices for this setup?

2 More Replies
APJESK
by New Contributor III
  • 1707 Views
  • 5 replies
  • 4 kudos

Resolved! Setting up observability for serverless Databricks

I’m looking for best practices and guidance on setting up observability for serverless Databricks. Specifically, I’d like to know: how to capture and monitor system-level metrics (CPU, memory, network, disk) in a serverless setup, and how to configure and ...

Latest Reply
APJESK
New Contributor III
  • 4 kudos

Ok, got it. I will start exploring it and get back to you.

4 More Replies
vieiralaura
by New Contributor
  • 4012 Views
  • 1 reply
  • 1 kudos

Help with cookiecutter Prompts in Databricks Notebooks

Hi everyone! I’m working on using cookiecutter to help us set up consistent project templates on Databricks. So far, it’s mostly working, but I’m struggling with the prompts – they’re not displaying well in the Databricks environment. I’ve tried some ...

Latest Reply
guigasque
New Contributor II
  • 1 kudos

Hi! Great initiative using cookiecutter to standardize project templates on Databricks — I’ve been exploring similar workflows. I ran into the same issue with prompts not displaying well in the notebook environment. One workaround I found effective wa...

SylvainW
by New Contributor III
  • 1061 Views
  • 5 replies
  • 3 kudos

How to embed a databricks dashboard in an iframe when running on localhost?

Hello! I'm working on embedding one of our Databricks dashboards into an iframe, but even after adding localhost to the Approved domains list, I'm getting the message that embedding for this domain isn't possible (see below). We need to be able to devel...

[Attachment: SylvainW_0-1755638534643.png]
Latest Reply
SylvainW
New Contributor III
  • 3 kudos

So for tunnelling, something like what is described here: https://dev.to/tahsin000/free-services-to-expose-localhost-to-https-a-comparison-5c19

4 More Replies
boitumelodikoko
by Valued Contributor
  • 647 Views
  • 1 reply
  • 1 kudos

Clarification on Data Privacy with ai_query Models

Hi everyone, we've had a client ask about the use of the Claude 3.7 Sonnet model (and others) in the Databricks SQL editor via the ai_query function. Specifically, they want to confirm whether any data passed to these models is ringfenced — i.e., not ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @boitumelodikoko, the documentation you've provided is official confirmation from Databricks (otherwise they wouldn't put it in public documentation in the first place). Every customer that uses AI functions within Databricks should expect that any ...

RossMacrae777
by New Contributor II
  • 2175 Views
  • 4 replies
  • 1 kudos

Connecting to Databricks from Workato. JDBCDriver 500593 Communication Link failure

I'm trying to connect to Databricks from Workato, to pull data in as part of a Workato Recipe.  I'm getting the following error when I test the connection:"Database bridge error: Failed since could not connect to the database - Failed to initialize p...

Latest Reply
nicolai_nqbt
New Contributor II
  • 1 kudos

Hi @WiliamRosa, I am in fact trying to connect via the recommended method of using the built-in Databricks connector. When I provide all the relevant details for the configuration, I get the error mentioned in the original post. I don't understand why i ...

3 More Replies
toko_chi
by New Contributor II
  • 1067 Views
  • 2 replies
  • 2 kudos

Resolved! Network Connectivity Configurations : "Default rules" tab not visible

Hi, I'm trying to enable static outbound IPs for Serverless Compute in order to whitelist them on an external SaaS. I created an NCC in the same AWS region as my workspace (customer-managed VPC). According to the documentation (Serverless Firewall), I expe...

Latest Reply
Advika
Community Manager
  • 2 kudos

Hello @toko_chi! Yes, as noted in the documentation, this feature is currently in Public Preview and requires enablement by your Databricks account team. Since you’re not seeing the Default rules tab and your API response shows an empty egress_config,...

1 More Replies
noorbasha534
by Valued Contributor II
  • 395 Views
  • 1 reply
  • 0 kudos

Unity Catalog Backup

Dear all, has anyone explored backing up Unity Catalog? Because of gaps in the security model of Databricks, we had to make certain user groups the schema owners, and the schema owners can drop the schema. I always wonder how to recover the schem...

Latest Reply
WiliamRosa
Contributor III
  • 0 kudos

Hi @noorbasha534, currently there’s no native backup/restore. The pragmatic approach is: treat Unity Catalog metadata as code (Terraform / SQL DDL checked into Git), regularly export UC object definitions with the REST API, and lock down schema ownership s...

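The export step suggested in the reply above could look roughly like this in Python (a sketch, not an official procedure: the list-schemas endpoint is the Unity Catalog REST API's, while the host/token environment variables and the catalog name `main` are assumptions):

```python
import json
import os
import urllib.request

def list_schemas_request(host: str, catalog: str, token: str) -> urllib.request.Request:
    """Build a Unity Catalog list-schemas request (REST API 2.1)."""
    url = f"{host}/api/2.1/unity-catalog/schemas?catalog_name={catalog}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

def export_schemas(host: str, catalog: str, token: str) -> str:
    """Fetch schema definitions and return pretty-printed JSON for checking into Git."""
    with urllib.request.urlopen(list_schemas_request(host, catalog, token)) as resp:
        return json.dumps(json.load(resp).get("schemas", []), indent=2)

if __name__ == "__main__" and "DATABRICKS_HOST" in os.environ:
    # DATABRICKS_HOST / DATABRICKS_TOKEN are assumed to be set in the environment.
    print(export_schemas(os.environ["DATABRICKS_HOST"], "main", os.environ["DATABRICKS_TOKEN"]))
```

Running such an export on a schedule and committing the JSON gives a point-in-time record of schema definitions, though it does not restore data; dropped managed tables still need their own recovery story.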
dbdev
by Contributor
  • 1312 Views
  • 4 replies
  • 2 kudos

Resolved! VNet-injected workspace trouble connecting to the Storage Account of a catalog (UC)

Hi! I have some issues setting up my workspace and storage account within a virtual network and letting them connect to each other. Background setup (all done in Terraform): Databricks workspace with VNet injection and Unity Catalog enabled. (3...

Latest Reply
dbdev
Contributor
  • 2 kudos

It was correctly linked. It turned out we were missing one extra private endpoint of type 'dfs'. So for our storage account we needed to create 2 private endpoints, both configured on the same subnet, with one of subresource type 'blob' and one subresource...

3 More Replies
davide-maestron
by New Contributor
  • 511 Views
  • 3 replies
  • 3 kudos

Databricks Workspace APIs

Hello, I'm trying to collect metadata about the files stored in a Workspace Repo Folder, but it looks like the "size" field is always missing. The field is listed in a few SDKs (for example, in the Python documentation: https://databricks-sdk-py.read...

Latest Reply
WiliamRosa
Contributor III
  • 3 kudos

@szymon_dybczak, Help me understand if this makes sense: what if I use the Files API instead of the Workspace API to get the size?

2 More Replies
alex_svyatets
by New Contributor II
  • 794 Views
  • 2 replies
  • 1 kudos

Verbose logging

Hello! I want to audit access to some Delta tables in Unity Catalog from jobs and notebooks. We enabled audit verbose logging, but not all queries that were run from notebooks are written to the system.access.audit, column_lineage, and table_lineage tables.

Latest Reply
alex_svyatets
New Contributor II
  • 1 kudos

They are queries to a registered table name.

1 More Replies