Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

bvraravind
by New Contributor II
  • 300 Views
  • 1 reply
  • 0 kudos

Prevent users from running shell commands

Hi, is there any way to prevent users from running shell commands in Databricks notebooks, for example "%%bash"? I read that the REVOKE EXECUTE ON SHELL command can be used, but I am unable to make it work. Thanks in advance for any help.

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @bvraravind, you can use the Spark setting "spark_conf.spark.databricks.repl.allowedLanguages": { "type": "fixed", "value": "python,sql" } in a cluster policy to prevent access to shell commands. https://docs.databricks.com/en/archive/compute/c...
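For illustration, here is a minimal sketch of applying that setting through a cluster policy with the Databricks SDK for Python; the SDK call and the policy name are assumptions for this sketch, not part of the reply above.

import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes workspace credentials are already configured

# Pin the notebook REPL languages to Python and SQL so %sh / %%bash cells are
# blocked on clusters governed by this policy.
policy_definition = {
    "spark_conf.spark.databricks.repl.allowedLanguages": {
        "type": "fixed",
        "value": "python,sql",
    }
}

w.cluster_policies.create(
    name="no-shell-commands",  # illustrative policy name
    definition=json.dumps(policy_definition),
)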

Sudheer2
by New Contributor III
  • 847 Views
  • 2 replies
  • 0 kudos

Terraform: Add Key Vault Administrator Role Assignment and Save Outputs to JSON Dynamically in Azure

Hi everyone, I am using Terraform to provision an OpenAI service and its modules along with a Key Vault in Azure. While the OpenAI service setup works as expected, I am facing two challenges. Role Assignment for Key Vault: I need to assign the Key Vault ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

For question two, you can use the local_file resource in Terraform:
output "openai_api_type" { value = module.openai.api_type }
output "openai_api_base" { value = module.openai.api_base }
output "openai_api_version" { value = module.openai.api_version }
ou...

1 More Replies
thrinadhReddy
by New Contributor II
  • 3542 Views
  • 4 replies
  • 1 kudos

Cluster policy type

Hi guys, I am creating a cluster policy through JSON: "runtime_engine": {"type": "fixed", "value": "PHOTON"}. When I run the above code, the PHOTON option is getting enabled but grayed out. What would I specify in the type field so that the Photon option sho...

Latest Reply
Khalil_Robinson
New Contributor II
  • 1 kudos

Hey, if you change the type to "allowlist" and provide "PHOTON" and "STANDARD" as options, that should fix your issue. Here is an example:
"runtime_engine": {
  "type": "allowlist",
  "value": ["PHOTON", "STANDARD"],
  "defaultValue": "PHOTON"
}

3 More Replies
sshukla
by New Contributor III
  • 1564 Views
  • 5 replies
  • 0 kudos

External API not returning any response

import requests
url = "https://example.com/api"
headers = {"Authorization": "Bearer YOUR_TOKEN", "Content-Type": "application/json"}
Payload = json.dumps({json_data})
response = requests.post(url, headers=headers, data=Payload)
print(response.status_code)
p...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Can you compare the two payloads? It likely has to do with something on the endpoint side, since one payload works fine while the other does not against the same endpoint.
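As a debugging aid, a minimal sketch of the request pattern from the original post, sending the payload via the json parameter and printing the response body so the two payloads can be compared; the URL, token, and payload values are placeholders.

import requests

url = "https://example.com/api"                    # placeholder endpoint
headers = {"Authorization": "Bearer YOUR_TOKEN"}    # placeholder token
payload = {"key": "value"}                          # placeholder payload

# json= serializes the dict and sets Content-Type: application/json automatically.
response = requests.post(url, headers=headers, json=payload, timeout=30)
print(response.status_code)
print(response.text)  # inspect the body to see how the endpoint reacts to each payload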

4 More Replies
Wojciech_BUK
by Valued Contributor III
  • 1801 Views
  • 1 reply
  • 1 kudos

All my replies have been deleted and new ones are being moderated (deleted) - why?

I noticed that when I reply to a post and try to help solve a community problem, my posts are either moderated (deleted) or just not saved. Old posts/replies have been deleted. Is there any reason for that? I kind of lost my will to participat...

Latest Reply
hari-prasad
Valued Contributor II
  • 1 kudos

Currently, I'm also facing the same issue: my comments are automatically deleted and I'm also not receiving an email for them, and it is happening on a specific post.

sanjay
by Valued Contributor II
  • 2741 Views
  • 2 replies
  • 2 kudos

Error accessing file from DBFS inside MLflow serving endpoint

Hi, I have an MLflow model served using a serverless GPU endpoint which takes an audio file name as input; the file is then passed as a parameter to a Hugging Face model inside the predict method. But I am getting the following error: HFValidationError(\nhuggingface_hub.utils....

Latest Reply
txti
New Contributor III
  • 2 kudos

I have the same issue. I have a large file that I cannot access from an MLflow service. Things I have tried (none of these work):
Read-only from DBFS: `dbfs:/myfolder/myfile.chroma` does not work; `/dbfs/myfolder/myfile.chroma` does not work.
Read-only from U...

1 More Replies
AlexG
by New Contributor III
  • 3803 Views
  • 5 replies
  • 1 kudos

Query results in CSV file include 'null' string for blank cells

After running a SQL script, when downloading the results to a CSV file, the file includes a 'null' string for blank cells (see screenshot). Is there a setting I can change to simply get empty cells instead?

AlexG_1-1702927614092.png
Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

I understand; however, this is more about the CSV file format. Save your data in Delta format instead of CSV or text-based formats. Delta tables handle empty strings and NULL values more effectively, ensuring that empty strings are preserved during data ins...
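A minimal sketch of that suggestion, assuming the query results are already available as a Spark DataFrame named df; the target table name is illustrative.

# Persist the results as a Delta table instead of exporting CSV,
# so NULL values and empty strings remain distinct.
df.write.format("delta").mode("overwrite").saveAsTable("main.default.query_results")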

4 More Replies
Sudheer2
by New Contributor III
  • 702 Views
  • 0 replies
  • 0 kudos

How to Fetch Azure OpenAI api_version and engine Dynamically After Resource Creation via Python?

Hello, I am using Python to automate the creation of Azure OpenAI resources via the Azure Management API. I am successfully able to create the resource, but I need to dynamically fetch the following details after the resource is created: API Version (a...

ChristianRRL
by Valued Contributor II
  • 4548 Views
  • 6 replies
  • 1 kudos

Materialized Views Without DLT?

I'm curious, is DLT *required* to use Materialized Views in Databricks? Is it not possible to create and refresh a Materialized view via a standard Databricks Workflow?

Latest Reply
hari-prasad
Valued Contributor II
  • 1 kudos

Hi @ChristianRRL, when creating a materialized view in Databricks, the data is stored in DBFS, cloud storage, or a Unity Catalog volume. You can still create a materialized view by overwriting the same table each time, instead of using Append, Update, ...
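A minimal sketch of the overwrite pattern described above, runnable from a standard Databricks Workflow task; the source query and table names are illustrative.

# Recompute the "materialized" result on every run and overwrite the target table.
summary_df = spark.sql(
    "SELECT region, SUM(amount) AS total_amount "
    "FROM main.sales.orders GROUP BY region"  # illustrative source query
)
summary_df.write.format("delta").mode("overwrite").saveAsTable(
    "main.sales.sales_by_region"  # behaves like a manually refreshed materialized view
)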

5 More Replies
ChristianRRL
by Valued Contributor II
  • 3541 Views
  • 4 replies
  • 1 kudos

Purpose of DLT Table table_properties > quality:medallion

Hi there, silly question here, but can anyone help me understand what practical purpose labelling the table_properties with "quality":"<specific_medallion>" serves? For example: @dlt.table( comment="Bronze live streaming table for Test data", name="...
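For context, a minimal sketch of the decorator the post refers to, with the quality key set in table_properties; the table name, source, and property value are illustrative assumptions.

import dlt

@dlt.table(
    comment="Bronze live streaming table for Test data",
    name="test_bronze",                        # illustrative table name
    table_properties={"quality": "bronze"},    # the property the question asks about, stored as table metadata
)
def test_bronze():
    return spark.readStream.format("rate").load()  # placeholder streaming source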

Latest Reply
nochimo
New Contributor II
  • 1 kudos

I have the same doubt, @ChristianRRL; did you figure out anything related to it? My question is whether it's possible to apply any kind of access control based on this property.

3 More Replies
Geophph
by New Contributor III
  • 2205 Views
  • 6 replies
  • 3 kudos

Resolved! Plotly Express not rendering in Firefox but fine in Safari

Using a basic example of Plotly Express, I see no output in Firefox but it is fine in Safari. Any ideas why this may occur?
import plotly.express as px
import pandas as pd
# Create a sample dataframe
df = pd.DataFrame({
    'x': range(10),
    'y': [2, 3, 5, 7...

Latest Reply
Geophph
New Contributor III
  • 3 kudos

UPDATE: I reached out further to Databricks support and they have since deployed a fix. Works fine for me now!

5 More Replies
Avvar2022
by Contributor
  • 5946 Views
  • 8 replies
  • 3 kudos

Unity Catalog enabled workspace - Is there any way to disable workflow/job creation for certain users?

Currently, in a Unity Catalog enabled workspace, users with "Workspace access" can create workflows/jobs; there is no access control available to restrict users from creating jobs/workflows. Use case: In production there is no need for users, data enginee...

Latest Reply
Avvar2022
Contributor
  • 3 kudos

@Lakshay Databricks offers a robust platform with a variety of features, including data ingestion, engineering, science, dashboards, and applications. However, I believe that some features, such as workflow/job creation, alerts, dashboards, and Genie...

7 More Replies
SamGreene
by Contributor II
  • 2404 Views
  • 3 replies
  • 0 kudos

String to date conversion errors

Hi, I am getting data from CDC on SQL Server using Informatica, which is writing Parquet files to ADLS. I read the Parquet files using DLT and end up with the date data as a string such as '20240603164746563'. I couldn't get this to convert using m...

Latest Reply
SamGreene
Contributor II
  • 0 kudos

Checking my current code, this is what I am using, which works for me because we don't observe daylight saving time: from_utc_timestamp(date_time_utc, 'UTC-7') as date_time_local
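For the original conversion problem, a minimal PySpark sketch of parsing the compact string before applying the offset; the DataFrame and column names are assumptions based on the sample value in the post.

from pyspark.sql import functions as F

# '20240603164746563' -> timestamp: parse the first 14 digits (yyyyMMddHHmmss),
# then shift from UTC to the fixed local offset used in the reply above.
df = df.withColumn(
    "date_time_utc",
    F.to_timestamp(F.substring("date_time_raw", 1, 14), "yyyyMMddHHmmss"),
).withColumn(
    "date_time_local",
    F.from_utc_timestamp("date_time_utc", "UTC-7"),
)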

2 More Replies
GeKo
by New Contributor III
  • 18710 Views
  • 5 replies
  • 0 kudos

Insufficient privileges: User does not have permission SELECT on any file

Hello, after switching to "shared cluster" usage, a Python job is failing with the error message: Py4JJavaError: An error occurred while calling o877.load. : org.apache.spark.SparkSecurityException: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User...
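For reference, one documented way to address this specific error on a shared access mode cluster is to grant the legacy ANY FILE privilege; this is a hedged sketch rather than advice from the thread, the principal is a placeholder, and ANY FILE grants let users bypass path-level restrictions, so check your governance requirements first.

# Run from a notebook or the SQL editor as a sufficiently privileged user; principal is a placeholder.
spark.sql("GRANT SELECT ON ANY FILE TO `some.user@example.com`")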

Get Started Discussions
permissions
privileges
python
Latest Reply
Uj337
New Contributor III
  • 0 kudos

Hi @GeKo, the checkpoint directory, is that set at the cluster level, or how do we set it? Can you please help me with this?

4 More Replies
