Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Sudheer2
by New Contributor III
  • 862 Views
  • 1 reply
  • 1 kudos

Unable to Register Models After Uploading Artifacts to DBFS in Databricks

Hi everyone, I'm currently working on a project where I'm migrating models and artifacts from a source Databricks workspace to a target one. I've written a script to upload the model artifacts from my local system to DBFS in the target workspace (usi...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @Sudheer2, Does it give you any error while trying to register the model?

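For reference, a minimal sketch of registering a model whose artifacts already sit in DBFS, assuming the uploaded folder is a valid MLflow model directory (the path and model name below are placeholders, not from the original post):

import mlflow

# Point at the workspace model registry of the target workspace.
mlflow.set_registry_uri("databricks")

# Register a new model version directly from the uploaded DBFS artifacts.
# The DBFS path and model name are hypothetical examples.
model_version = mlflow.register_model(
    model_uri="dbfs:/FileStore/migrated_models/my_model",
    name="my_migrated_model",
)
print(model_version.version)
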
Greg_c
by New Contributor II
  • 737 Views
  • 2 replies
  • 0 kudos

Scheduling multiple jobs (workflows) in DABs

Hello, I'm wondering how I can schedule multiple jobs (workflows). I'd like to do something like this but on a workflow level:  tasks: - task_key: task_1 sql_task: warehouse_id: ${var.warehouse_id} paramet...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Greg_c, You can try with this structure: - In the main databricks.yml: # databricks.yml bundle: name: master-bundle include: - resources/*.yml # Other bundle configurations... In the resources directory, create a YAML for each job: # resources/job1.ymlre... (see the sketch after this post)

1 More Replies
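A minimal sketch of one such per-job resource file with a schedule, assuming the bundle layout described above (job name, cron expression, and query id are placeholders):

# resources/job1.yml -- one job per file, included from databricks.yml
resources:
  jobs:
    job_1:
      name: job-1
      schedule:
        quartz_cron_expression: "0 0 6 * * ?"   # run daily at 06:00
        timezone_id: "UTC"
      tasks:
        - task_key: task_1
          sql_task:
            warehouse_id: ${var.warehouse_id}
            query:
              query_id: ${var.query_id}          # placeholder variable
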
malhm
by New Contributor II
  • 1628 Views
  • 4 replies
  • 1 kudos

ALIAS Not accepted 42601

I am unable to run the following query, generated from my backend, on the Databricks side. Query: SELECT "A".`cut` AS "Cut", "A".`color` AS "Color", "A".`carat` AS "Carat", "A".`clarity` AS "Clarity" FROM databricksconnect.default.diamonds "A"  Error logs...

Latest Reply
filipniziol
Esteemed Contributor
  • 1 kudos

Hi @malhm, double quotes are not supported in column aliases. In Databricks SQL / Spark SQL you use backticks instead of the double quotes used in PostgreSQL (see the corrected query sketched after this post). Check the docs: https://spark.apache.org/docs/3.5.1/sql-ref-identifier.html

3 More Replies
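For example, the query from the post rewritten with backtick identifiers (a sketch; table and column names are taken from the original question):

-- Backticks for quoted identifiers; the table alias needs no quoting.
SELECT
  A.`cut`     AS `Cut`,
  A.`color`   AS `Color`,
  A.`carat`   AS `Carat`,
  A.`clarity` AS `Clarity`
FROM databricksconnect.default.diamonds AS A;
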
Iguinrj11
by New Contributor II
  • 3264 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks x Query Folding Power BI

I ran a native Power BI query against Databricks in import mode and query folding was not enabled. No query folding?

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @Iguinrj11, the trick is to use Databricks.Query instead of Databricks.Catalogs. Check this article and let us know if that helps: https://www.linkedin.com/pulse/query-folding-azure-databricks-tushar-desai/

2 More Replies
tomvogel01
by New Contributor II
  • 1224 Views
  • 2 replies
  • 0 kudos

Dynamic Bloom Filters for Inner Joins

I have a question regarding combining the use of Bloom filters with Liquid Clustering to further reduce the data read during a join/merge on top of dynamic file pruning. Testing both combined worked extremely well together for point queries. However ...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

We do not recommend Bloom filter indexes on Delta tables, as they have to be maintained manually. If you prefer Photon, please try predictive I/O with Liquid Clustering (a rough sketch follows this post).

1 More Replies
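A rough sketch of the suggested direction, with placeholder table and column names: liquid clustering is declared on the join key and then maintained with OPTIMIZE.

-- Cluster the Delta table on the join key (placeholder names).
ALTER TABLE sales CLUSTER BY (customer_id);

-- Incrementally recluster existing data.
OPTIMIZE sales;
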
ivvande
by New Contributor II
  • 1415 Views
  • 4 replies
  • 0 kudos

Automate run as workflow parameter to default to current user

I am trying to run a workflow within Databricks. I have two workflows: workflow 1, which always runs as the service principal since all data gets accessed and wrangled within it, and workflow 2, which always defaults to the last account that ran it. I...

Latest Reply
saurabh18cs
Honored Contributor II
  • 0 kudos

Hi, how are you expecting to achieve this? Do you want users who manually trigger this workflow to first update its run_as, or do you want to make this happen programmatically?

3 More Replies
subhadeep
by New Contributor II
  • 921 Views
  • 2 replies
  • 0 kudos

Create csv and upload on azure

Can someone write a SQL query that queries a table (for example, select * from stages.benefit), creates a CSV, and uploads it to Azure?

Latest Reply
hari-prasad
Valued Contributor II
  • 0 kudos

Hi @subhadeep, you can achieve this in SQL, similarly to how you write a DataFrame to a table or blob path. We will create an external table pointing to the blob path or a mounted blob path (see the sketch after this post). Note that this table does not support ACID transactions and ...

1 More Replies
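A minimal sketch of the external-table approach described above, assuming the abfss path is already accessible to the workspace (storage account, container, and table name are placeholders):

-- Write the query result as CSV files at an external location.
CREATE TABLE stages.benefit_export
USING CSV
OPTIONS (header 'true')
LOCATION 'abfss://exports@mystorageaccount.dfs.core.windows.net/benefit/'
AS SELECT * FROM stages.benefit;
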
idanyow
by New Contributor III
  • 1910 Views
  • 10 replies
  • 2 kudos

01_demo_setup error

Hello, I was following "Demo: Creating and Working with a Delta Table" with a Community Edition user. The first command in the notebook is: %run ./setup/01_demo_setup But I got the following error: Notebook not found: Users/<my-email-was-here..>/s...

Latest Reply
Isi
Honored Contributor III
  • 2 kudos

Hey! Sad news, guys... if you go to Course Logistics Review you can read: "We are pleased to offer a version of this course that also contains hands-on practice via a Databricks Academy Labs subscription. With a Databricks Academy Labs subscription, you...

9 More Replies
Rakeshch007
by New Contributor
  • 993 Views
  • 1 reply
  • 0 kudos

Databricks app giving 'upstream request timeout '

Hello all, we are developing a Flask-based app used to download logs from a Databricks DBFS location. For this use case we are using the built-in Databricks Apps feature to deploy our app. When we pass a smaller file it is getting do...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hey! It looks like the issue you're facing might be related to the proxy timeout when downloading large files from DBFS. Since modifying the proxy settings might not be an option, there are a couple of alternative approaches you could consider to miti... (one option is sketched after this post)

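One mitigation along those lines, sketched with the Databricks Python SDK: stream the DBFS file back in chunks so the response starts before the proxy timeout is hit. The route, path handling, and chunk size are illustrative, and it assumes the app can authenticate through WorkspaceClient and that dbfs.download is available in the installed SDK version.

from databricks.sdk import WorkspaceClient
from flask import Flask, Response

app = Flask(__name__)
w = WorkspaceClient()  # assumes app credentials are available in the environment

@app.route("/download/<path:dbfs_path>")
def download(dbfs_path):
    def generate():
        # Read the DBFS file in 1 MiB chunks and yield them as they arrive.
        with w.dbfs.download("/" + dbfs_path) as fh:
            while True:
                chunk = fh.read(1024 * 1024)
                if not chunk:
                    break
                yield chunk
    return Response(generate(), mimetype="application/octet-stream")
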
haritashva31
by New Contributor
  • 5807 Views
  • 3 replies
  • 0 kudos

50%-off Databricks certification voucher

Hello Databricks Community Team, I am reaching out to inquire about the Databricks certification voucher promotion for completing the Databricks Learning Festival (Virtual) courses. I completed one of the Databricks Learning Festival courses in July 2024...

Latest Reply
MARC9312
New Contributor II
  • 0 kudos

I have already finished the course, how do I get the discount?

2 More Replies
naveen0142
by New Contributor
  • 1089 Views
  • 1 reply
  • 0 kudos

How to Create Azure Key Vault and Assign Key Vault Administrator Role Using Terraform

Hi all,I’m currently working with Terraform to set up Azure resources, including OpenAI services, and I’d like to extend my configuration to create an Azure Key Vault. Specifically, I want to:Create an Azure Key Vault to store secrets/keys.Assign the...

Latest Reply
parthSundarka
Databricks Employee
  • 0 kudos

Hi @naveen0142, 1. Create the Key Vault: resource "azurerm_key_vault" "example" { name = var.key_vault_name location = azurerm_resource_group.example.location resource_group_name = azurerm_resource_group.example.... (a fuller sketch follows this post)

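Condensing the steps above into one sketch (resource names and the principal are placeholders; it assumes RBAC authorization on the vault and azurerm v3 attribute names):

data "azurerm_client_config" "current" {}

# Key Vault with RBAC authorization enabled.
resource "azurerm_key_vault" "example" {
  name                      = var.key_vault_name
  location                  = azurerm_resource_group.example.location
  resource_group_name       = azurerm_resource_group.example.name
  tenant_id                 = data.azurerm_client_config.current.tenant_id
  sku_name                  = "standard"
  enable_rbac_authorization = true
}

# Grant the deploying identity (placeholder principal) the Key Vault Administrator role.
resource "azurerm_role_assignment" "kv_admin" {
  scope                = azurerm_key_vault.example.id
  role_definition_name = "Key Vault Administrator"
  principal_id         = data.azurerm_client_config.current.object_id
}
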
Iguinrj11
by New Contributor II
  • 514 Views
  • 0 replies
  • 0 kudos

Incremental Refresh and/or Composite Models (Databricks x Power BI)

I would like to make my model more performant in Power BI, but I have run into some difficulties when connecting it to a Databricks source. I wanted to know whether it is possible to do incremental refresh and/or work with composite models (Direct Quer...

Sudheer2
by New Contributor III
  • 1252 Views
  • 1 reply
  • 0 kudos

User Unable to Access Key Vault Secrets Despite Role Assignment in Terraform

Hi All, I'm encountering an issue where a user is unable to access secrets in an Azure Key Vault, even though the user has been assigned the necessary roles using Terraform. Specifically, the user gets the following error when trying to access the sec...

Latest Reply
mm41
New Contributor II
  • 0 kudos

Are they accessing the Key Vault directly and not through Databricks? If so, based on your Terraform code, they should be able to directly read Secrets in the Azure Key Vault.  You've configured the Key Vault with RBAC Authorization and assigned Key ...

bvraravind
by New Contributor II
  • 520 Views
  • 1 reply
  • 0 kudos

Prevent users from running shell commands

Hi, is there any way to prevent users from running shell commands in Databricks notebooks, for example "%%bash"? I read that a REVOKE EXECUTE ON SHELL command can be used, but I am unable to make it work. Thanks in advance for any help.

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @bvraravind, you can use the Spark setting "spark_conf.spark.databricks.repl.allowedLanguages": { "type": "fixed", "value": "python,sql" } in a cluster policy to prevent access to shell commands (see the sketch after this post). https://docs.databricks.com/en/archive/compute/c...

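Put together, the relevant cluster-policy definition would look roughly like this (a sketch; the policy then has to be attached to the clusters users are allowed to create, and %sh / %%bash stop working because only the Python and SQL REPLs remain enabled):

{
  "spark_conf.spark.databricks.repl.allowedLanguages": {
    "type": "fixed",
    "value": "python,sql"
  }
}
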
Sudheer2
by New Contributor III
  • 1398 Views
  • 2 replies
  • 0 kudos

Terraform: Add Key Vault Administrator Role Assignment and Save Outputs to JSON Dynamically in Azure

Hi everyone, I am using Terraform to provision an OpenAI service and its modules along with a Key Vault in Azure. While the OpenAI service setup works as expected, I am facing two challenges. Role assignment for the Key Vault: I need to assign the Key Vault ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

For question two, you can use the local_file resource in Terraform (a sketch follows this post): output "openai_api_type" { value = module.openai.api_type } output "openai_api_base" { value = module.openai.api_base } output "openai_api_version" { value = module.openai.api_version } ou...

1 More Replies
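A minimal sketch of that local_file approach, writing the module outputs to a JSON file (the filename and output keys are illustrative):

# Requires the hashicorp/local provider.
resource "local_file" "openai_outputs" {
  filename = "${path.module}/openai_outputs.json"
  content = jsonencode({
    api_type    = module.openai.api_type
    api_base    = module.openai.api_base
    api_version = module.openai.api_version
  })
}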
