- 862 Views
- 1 replies
- 1 kudos
Unable to Register Models After Uploading Artifacts to DBFS in Databricks
Hi everyone, I'm currently working on a project where I'm migrating models and artifacts from a source Databricks workspace to a target one. I've written a script to upload the model artifacts from my local system to DBFS in the target workspace (usi...
Hi @Sudheer2, Does it give you any error while trying to register the model?
- 737 Views
- 2 replies
- 0 kudos
Scheduling multiple jobs (workflows) in DABs
Hello, I'm wondering how I can schedule multiple jobs (workflows) in DABs. I'd like to do something like this, but at the workflow level: tasks: - task_key: task_1 sql_task: warehouse_id: ${var.warehouse_id} paramet...
Hi @Greg_c, you can try this structure. In the main databricks.yml: # databricks.yml bundle: name: master-bundle include: - resources/*.yml # Other bundle configurations... Then, in the resources directory, create a YAML file for each job: # resources/job1.yml re...
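The layout described in the reply can be sketched as follows; the bundle name, job name, notebook path, and cron expression below are illustrative assumptions, not the poster's actual configuration:

```yaml
# databricks.yml -- the top-level bundle pulls in one file per job
bundle:
  name: master-bundle

include:
  - resources/*.yml
```

```yaml
# resources/job1.yml -- each job carries its own schedule
resources:
  jobs:
    job1:
      name: job1
      schedule:
        quartz_cron_expression: "0 0 6 * * ?"  # hypothetical: daily at 06:00
        timezone_id: "UTC"
      tasks:
        - task_key: task_1
          notebook_task:
            notebook_path: ./notebooks/task_1.py  # hypothetical path
```

With this layout, `databricks bundle deploy` picks up every job YAML matched by the `include` glob, so each workflow is scheduled independently.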
- 1628 Views
- 4 replies
- 1 kudos
ALIAS Not accepted 42601
I am unable to run the following query, generated by my backend, on the Databricks side. Query: SELECT "A".`cut` AS "Cut", "A".`color` AS "Color", "A".`carat` AS "Carat", "A".`clarity` AS "Clarity" FROM databricksconnect.default.diamonds "A" Error logs...
Hi @malhm, double quotes are not supported in column aliases. Databricks SQL/Spark SQL uses backticks as identifier quotes, not double quotes as in PostgreSQL. Check the docs: https://spark.apache.org/docs/3.5.1/sql-ref-identifier.html
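For reference, here is the failing query from the question rewritten with backtick-quoted identifiers (same catalog, schema, and table as in the question):

```sql
SELECT
  `A`.`cut`     AS `Cut`,
  `A`.`color`   AS `Color`,
  `A`.`carat`   AS `Carat`,
  `A`.`clarity` AS `Clarity`
FROM databricksconnect.default.diamonds AS `A`;
```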
- 3264 Views
- 3 replies
- 0 kudos
Resolved! Databricks x Query Folding Power BI
I ran a native Power BI query against Databricks in import mode and query folding was not enabled. Why is there no query folding?
Hi @Iguinrj11, the trick is to use Databricks.Query instead of Databricks.Catalogs. Check this article and let us know if it helps: https://www.linkedin.com/pulse/query-folding-azure-databricks-tushar-desai/
- 1224 Views
- 2 replies
- 0 kudos
Dynamic Bloom Filters for Inner Joins
I have a question about combining Bloom filters with Liquid Clustering to further reduce the data read during a join/merge, on top of dynamic file pruning. Tested in combination, the two worked extremely well together for point queries. However ...
We do not recommend Bloom filter indexes on Delta tables, as they have to be maintained manually. If you use Photon, please try predictive I/O with Liquid Clustering instead.
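As a sketch of the suggested alternative, Liquid Clustering is declared at table creation (or added via ALTER TABLE) rather than maintained by hand; the table name and clustering key below are illustrative assumptions:

```sql
-- Hypothetical table clustered on the join key
CREATE TABLE main.sales.orders (
  order_id    BIGINT,
  customer_id BIGINT,
  order_ts    TIMESTAMP
)
CLUSTER BY (customer_id);

-- An existing table can be switched over, then re-clustered
ALTER TABLE main.sales.orders CLUSTER BY (customer_id);
OPTIMIZE main.sales.orders;
```

Point lookups and joins filtering on `customer_id` can then skip files without any manually maintained index.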
- 1415 Views
- 4 replies
- 0 kudos
Automate run as workflow parameter to default to current user
I am trying to run a workflow within Databricks. I have two workflows: workflow 1, which always runs as the service principal, since all data gets accessed and wrangled within this workflow, and workflow 2, which always defaults to the last-run account. I...
Hi, how are you expecting to achieve this? Do you want users who manually trigger this workflow to first update the run_as setting, or do you want this to happen programmatically?
- 921 Views
- 2 replies
- 0 kudos
Create csv and upload on azure
Can someone write a SQL query that queries a table (e.g. SELECT * FROM stages.benefit), creates a CSV, and uploads it to Azure?
Hi @subhadeep, you can achieve this in SQL similarly to how you write a DataFrame to a table or blob path: create an external table pointing to the blob path or a mounted blob path. Note that this table does not support ACID transactions and ...
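A minimal sketch of that external-table approach, assuming a hypothetical ABFSS path that the workspace can already write to:

```sql
-- CTAS into an external CSV table; the LOCATION receives the CSV files
CREATE TABLE stages.benefit_csv_export
USING CSV
OPTIONS (header 'true')
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/exports/benefit/'
AS SELECT * FROM stages.benefit;
```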
- 1910 Views
- 10 replies
- 2 kudos
01_demo_setup error
Hello, I was following "Demo: Creating and Working with a Delta Table" using a Community Edition account. The first command in the notebook is: %run ./setup/01_demo_setup But I got the following error: Notebook not found: Users/<my-email-was-here..>/s...
Hey! Sad news... if you go to the Course Logistics Review you can read: "We are pleased to offer a version of this course that also contains hands-on practice via a Databricks Academy Labs subscription. With a Databricks Academy Labs subscription, you...
- 993 Views
- 1 replies
- 0 kudos
Databricks app giving 'upstream request timeout'
Hello all, we are developing a Flask-based app used to download logs from a Databricks DBFS location. For this use case we are deploying it with the built-in Databricks Apps feature. When we pass a smaller file it gets do...
Hey! It looks like the issue you're facing might be related to the proxy timeout when downloading large files from DBFS. Since modifying the proxy settings might not be an option, there are a couple of alternative approaches you could consider to miti...
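One such mitigation, streaming the file in chunks so no single response body has to be buffered before the proxy deadline, can be sketched like this. The Flask wiring in the comments, the route name, and the log path are assumptions, not the poster's actual app:

```python
# Sketch: stream a large DBFS file chunk by chunk instead of loading it
# into memory, so each proxy read completes quickly.
import io

CHUNK_SIZE = 1024 * 1024  # 1 MiB per chunk


def read_chunks(fileobj, chunk_size=CHUNK_SIZE):
    """Yield the file's contents one chunk at a time until EOF."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk


# In a Flask app you might return a streaming response (hypothetical route):
# from flask import Flask, Response
# app = Flask(__name__)
#
# @app.route("/download")
# def download():
#     f = open("/dbfs/logs/app.log", "rb")  # hypothetical DBFS fuse path
#     return Response(read_chunks(f), mimetype="application/octet-stream")
```

Because the generator yields as it reads, the response starts flowing immediately rather than after the whole file is assembled.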
- 5807 Views
- 3 replies
- 0 kudos
50%-off Databricks certification voucher
Hello Databricks Community Team, I am reaching out to inquire about the Databricks certification voucher promotion for completing the Databricks Learning Festival (Virtual) courses. I completed one of the Databricks Learning Festival courses in July 2024...
I have already finished the course, how do I get the discount?
- 1089 Views
- 1 replies
- 0 kudos
How to Create Azure Key Vault and Assign Key Vault Administrator Role Using Terraform
Hi all, I'm currently working with Terraform to set up Azure resources, including OpenAI services, and I'd like to extend my configuration to create an Azure Key Vault. Specifically, I want to: create an Azure Key Vault to store secrets/keys, and assign the...
Hi @naveen0142, 1. Create the Key Vault: resource "azurerm_key_vault" "example" { name = var.key_vault_name location = azurerm_resource_group.example.location resource_group_name = azurerm_resource_group.example....
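A fuller sketch of the reply's Terraform, with RBAC authorization enabled and the Key Vault Administrator role assigned to the current client. Resource names, variables, and the choice of principal are illustrative assumptions:

```hcl
data "azurerm_client_config" "current" {}

resource "azurerm_key_vault" "example" {
  name                      = var.key_vault_name
  location                  = azurerm_resource_group.example.location
  resource_group_name       = azurerm_resource_group.example.name
  tenant_id                 = data.azurerm_client_config.current.tenant_id
  sku_name                  = "standard"
  enable_rbac_authorization = true
}

# Grant the deploying identity full data-plane access via RBAC
resource "azurerm_role_assignment" "kv_admin" {
  scope                = azurerm_key_vault.example.id
  role_definition_name = "Key Vault Administrator"
  principal_id         = data.azurerm_client_config.current.object_id
}
```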
- 514 Views
- 0 replies
- 0 kudos
Incremental Refresh and/or Composite Models (Databricks x Power BI)
I'd like to make my Power BI model more performant, but I have run into some difficulties when connecting it to a Databricks source. I'd like to know whether it is possible to do incremental refresh and/or work with composite models (Direct Quer...
- 1252 Views
- 1 replies
- 0 kudos
User Unable to Access Key Vault Secrets Despite Role Assignment in Terraform
Hi all, I'm encountering an issue where a user is unable to access secrets in an Azure Key Vault, even though the user has been assigned the necessary roles using Terraform. Specifically, the user gets the following error when trying to access the sec...
Are they accessing the Key Vault directly and not through Databricks? If so, based on your Terraform code, they should be able to directly read Secrets in the Azure Key Vault. You've configured the Key Vault with RBAC Authorization and assigned Key ...
- 520 Views
- 1 replies
- 0 kudos
Prevent users from running shell commands
Hi, is there any way to prevent users from running shell commands in Databricks notebooks, for example "%%bash"? I read that the REVOKE EXECUTE ON SHELL command can be used, but I am unable to make it work. Thanks in advance for any help.
Hi @bvraravind, you can use the Spark setting "spark_conf.spark.databricks.repl.allowedLanguages": { "type": "fixed", "value": "python,sql" } in a cluster policy to prevent access to shell commands. https://docs.databricks.com/en/archive/compute/c...
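As a sketch, the policy definition JSON could look like this; by fixing the allowed REPL languages to Python and SQL, notebooks on clusters created from this policy cannot run shell cells:

```json
{
  "spark_conf.spark.databricks.repl.allowedLanguages": {
    "type": "fixed",
    "value": "python,sql"
  }
}
```

Note that this applies per cluster policy, so users would also need to be restricted to creating clusters only through this policy for it to be effective.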
- 1398 Views
- 2 replies
- 0 kudos
Terraform: Add Key Vault Administrator Role Assignment and Save Outputs to JSON Dynamically in Azure
Hi everyone, I am using Terraform to provision an OpenAI service and its modules along with a Key Vault in Azure. While the OpenAI service setup works as expected, I am facing two challenges. Role assignment for the Key Vault: I need to assign the Key Vault ...
For question two, you can use the local_file resource in Terraform: output "openai_api_type" { value = module.openai.api_type } output "openai_api_base" { value = module.openai.api_base } output "openai_api_version" { value = module.openai.api_version } ou...
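Combining those module outputs with a local_file resource, a minimal sketch (the module name and output file path are assumptions based on the question):

```hcl
# Write selected module outputs to a JSON file at apply time
resource "local_file" "openai_outputs" {
  filename = "${path.module}/openai_outputs.json"
  content = jsonencode({
    api_type    = module.openai.api_type
    api_base    = module.openai.api_base
    api_version = module.openai.api_version
  })
}
```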