- 81 Views
- 1 replies
- 0 kudos
Passing a list of images to ai_query
Hi, I would like to know how I can pass a list of images to ai_query in the same call. It would be helpful for extracting metadata from a multipage document, where I want to compare different images to see whether they belong to the same document. T...
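One way to approach this is to compose a single ai_query call whose prompt concatenates the base64-encoded pages, so the model sees all images in one request. A minimal sketch follows; the endpoint name and the page_*_b64 column names are placeholders, and whether a given serving endpoint accepts multiple images depends on the model behind it.

```python
# Sketch: build one ai_query() SQL expression over several image columns.
# Assumptions: 'doc-endpoint' and the page_*_b64 column names are hypothetical.
def multi_image_ai_query(endpoint: str, prompt: str, image_cols: list[str]) -> str:
    """Return a SQL expression sending `prompt` plus every image column
    to `endpoint` in a single ai_query() call."""
    parts = ", ".join(image_cols)
    return f"ai_query('{endpoint}', concat('{prompt} ', {parts}))"

expr = multi_image_ai_query(
    "doc-endpoint",
    "Do these pages belong to the same document?",
    ["page_1_b64", "page_2_b64"],
)
```

In a notebook the resulting expression could be handed to `selectExpr(...)` or embedded in a SQL query; check the ai_query documentation for the request shapes your endpoint supports.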
- 2796 Views
- 11 replies
- 2 kudos
Issue with Non-Bulk Inserts Using JDBC on Databricks Runtime 14.1
Hello team, I am experiencing an issue with insert operations on Databricks using the JDBC driver. In my SAS Viya system, the DatabricksJDBC42.jar driver version 2.6.40 is configured. I’ve noticed that, up to Databricks Runtime version 13.1, insert op...
- 2 kudos
The PR looks closed for now; any news on this? Also, is there a code snippet available showing how to use this functionality?
- 47 Views
- 0 replies
- 0 kudos
Pass a parameter value to a SQL query in a Databricks SQL alert
Hi, I have created a SQL query with a parameter and am using that parameterized query in a Databricks legacy SQL alert. That alert is scheduled in a Databricks workflow/job using the SQL task type. Now, how should I pass the parameter value t...
- 139 Views
- 3 replies
- 3 kudos
Resolved! Help getting swag for the event.
Hello Team, I am hosting my first event on 13th September. The date is final, but I am still working on the venue, which is why I have not created the event yet. I'll be able to create it within a day or two. Can you please help me understand how I ...
- 3 kudos
Thank you for tagging me, @BS_THE_ANALYST! Hi @Devanshu, thanks for reaching out. I’ve sent you a DM with the swag request details; please share your response so we can move forward and support you. Thanks, Rishabh
- 4344 Views
- 1 replies
- 0 kudos
The inference table doesn't get updated
I set up a model serving endpoint and created a monitoring dashboard to monitor its performance. The problem is my inference table doesn't get updated by the model serving endpoint. To test the endpoint I use the following code: import random import time ...
- 0 kudos
Inference tables may silently stop updating if the table is renamed, its schema is changed, or it is deleted, or if the endpoint creator loses data access or the required permissions are missing or have recently changed.
- 75 Views
- 1 replies
- 0 kudos
How to install Python libraries in DLT using Databricks Asset Bundles
Hi Team, is there any way we can install Python packages in DLT using Databricks Asset Bundles? resources: pipelines: xysz: name: xyx configuration: input_file: test.json env: ${var} permissions: - group_name: ${var} level: CAN_MANAGE - group_name: ${var} level...
- 0 kudos
Hi @nkrom456, add a %pip install command at the top of your DLT pipeline notebook to install the required packages. This ensures the packages are available on all nodes during pipeline execution.
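The suggestion above can be sketched as the first cell of a hypothetical DLT pipeline notebook; the package name is a placeholder, and a small import check confirms the install took effect on the node.

```python
# The first cell of the DLT notebook would be the magic command (notebook-only):
#   %pip install some-package==1.2.3
# A helper to verify the package is importable afterwards:
import importlib.util

def package_available(name: str) -> bool:
    """True if `name` resolves to an importable module on this node."""
    return importlib.util.find_spec(name) is not None

# `json` ships with Python, so it stands in for a successfully installed package.
assert package_available("json")
assert not package_available("definitely_not_installed_xyz")
```

The check is optional in practice; it just makes a failed or missing install fail fast at the top of the pipeline rather than mid-run.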
- 98 Views
- 1 replies
- 1 kudos
Performance issue with the Databricks Add-on for Splunk 1.4.2
We are currently using the add-on for Databricks in our on-prem Splunk Enterprise environment. The connection goes directly to the cloud without a proxy. Unfortunately, the add-on is very slow and we lose about 20 seconds with every query. We see th...
- 1 kudos
It’s likely not Databricks but the Splunk add-on causing the delay. The databricksquery command in version 1.4.2 has known performance issues; most of the lag occurs inside Splunk while parsing results. Try upgrading to the latest add-on (1.5.x+), monit...
- 180 Views
- 2 replies
- 0 kudos
DLT Pipeline Design
I am new to DLT and trying to understand the process. My bronze table will receive incremental data from SAP in real time. In my bronze table, we are not maintaining any history and any data older than 2 weeks will be deleted. This data from bronze w...
- 0 kudos
The scenario you mentioned can be efficiently managed using a Delta Live Table (DLT) combined with a CDC (Change Data Capture) flow. From what I understand, the bronze table is always in append mode. Check out the link below for more details. Hope th...
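As a rough illustration of the CDC idea in plain Python (no DLT; in a real pipeline this is roughly what `dlt.apply_changes` with `keys` and `sequence_by` handles): each key's latest change wins and deletes drop the row, so an append-only bronze feed still yields a correct downstream table. Event shape and values below are invented for the sketch.

```python
# Minimal CDC reduction: bronze is an append-only list of change events;
# the result keeps only the latest state per key (deletes remove the key).
def apply_cdc(events):
    """events: iterable of (key, seq, op, value); op is 'upsert' or 'delete'."""
    state = {}
    for key, seq, op, value in sorted(events, key=lambda e: e[1]):
        if op == "delete":
            state.pop(key, None)
        else:
            state[key] = value
    return state

bronze = [
    (1, 10, "upsert", "a"),
    (2, 11, "upsert", "b"),
    (1, 12, "upsert", "a2"),   # later change to key 1 wins
    (2, 13, "delete", None),   # key 2 removed downstream
]
assert apply_cdc(bronze) == {1: "a2"}
```

Because the reduction only depends on the latest event per key, trimming bronze rows older than two weeks does not affect correctness as long as every key's most recent state has already been applied downstream.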
- 123 Views
- 2 replies
- 0 kudos
ai_query prompt token and completion token counts
Hi, I would like to know how I can get the completion token and prompt token counts when using ai_query. Thanks
- 0 kudos
Hello @Andreyai, good day! For ai_query, Databricks provides documentation: https://docs.databricks.com/aws/en/sql/language-manual/functions/ai_query I am sure you will get better insights from the documentation. But I have something for...
- 254 Views
- 5 replies
- 0 kudos
Error when executing spark.readStream Script
Hi all, when I try to execute the script (as per the screenshot below) in a notebook cell, I get an error message. I am using Databricks Free Edition and am not sure whether the error relates to the compute cluster I am using. Any guidance would be ...
- 0 kudos
Hi @giuseppe_esq, can you please share the DBR version, cluster configuration, and the log4j file containing the error stack trace for further review?
- 254 Views
- 6 replies
- 2 kudos
Execute Immediate not working to fetch table name based on year
I am trying to pass the year as an argument so it can be used in the table name. For example, there are tables like Claims_total_2021, Claims_total_2022, and so on up to 2025. Now I want to pass the year as a parameter, say 2024, and have it fetch the table Claims...
- 2 kudos
Hi, I believe the solution shared by Martison would fix this issue. In Databricks SQL, when using EXECUTE IMMEDIATE, the SQL string must be a single variable or single string literal, not an inline expression using string concatenation ('...' || clai...
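The point about EXECUTE IMMEDIATE can be sketched like this (a hypothetical helper; the Claims_total_ prefix and year range come from the question): build the full statement into one string first, then hand that single variable to EXECUTE IMMEDIATE rather than concatenating inline inside the statement.

```python
# Build the statement up front so EXECUTE IMMEDIATE receives one variable,
# not an inline '...' || year concatenation.
def build_yearly_query(year: int) -> str:
    """Return the SELECT for the Claims_total_<year> table (year validated)."""
    if not 2021 <= year <= 2025:
        raise ValueError(f"no Claims_total table for {year}")
    return f"SELECT * FROM Claims_total_{year}"

stmt = build_yearly_query(2024)
# The rough Databricks SQL equivalent: concatenate at assignment time, then
# execute the variable:
#   DECLARE stmt STRING DEFAULT 'SELECT * FROM Claims_total_' || year_param;
#   EXECUTE IMMEDIATE stmt;
```

Validating the year before building the string also guards against injecting arbitrary identifiers into the dynamic statement.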
- 350 Views
- 6 replies
- 20 kudos
Resolved! Databricks Community Innovators - Program
Hi, I'd like to know what's happened with the Databricks Community innovators program? https://community.databricks.com/t5/databricks-community-innovators/bg-p/databricks-community-news-members Is this still alive? I've applied and emailed: databrick...
- 20 kudos
Hi Mandy! Thanks for the introduction and update. I'm really looking forward to being a part of everything moving forward. Just echoing what @TheOC mentioned above, I'm happy to help/assist in whatever way I can.Absolutely buzzing based off the info ...
- 67 Views
- 0 replies
- 0 kudos
Blackduck scanning on Databricks Workflow
Does anyone know whether it is possible to scan the JSON-based files from Workflows in Black Duck? Notebooks, at least, are compatible, since Black Duck detects the Python-based files, but I am wondering whether workflows can be scanned as well.
- 600 Views
- 11 replies
- 43 kudos
Databricks Monthly Spotlight - Discontinued?
Hi, I'm curious what's happened with the Databricks monthly spotlight? https://community.databricks.com/t5/databricks-community-innovators/bg-p/databricks-community-news-members I can see there hasn't been anyone in the spotlight since April 2025. Has th...
- 43 kudos
Hi all, I just wanted to jump into this thread to say how motivating this kind of conversation is. It's great that we've got passion from users and Databricks employees alike to ensure integrity in the Community - it's really reassuring as someone start...
- 167 Views
- 3 replies
- 3 kudos
Python
Guys, I am not able to access PySpark in the Free Edition; can somebody help me?
- 3 kudos
@szymon_dybczak it may be that @SinchBhat has the notebook's default language set to SQL? They could also, by mistake, be using the SQL editor. @SinchBhat, could you please provide some screenshots of what your UI looks like? It'll help to resolve the ...