- 514 Views
- 2 replies
- 4 kudos
Resolved! What is `read_files`?
Bit of a silly question, but wondering if someone can help me better understand what `read_files` is? (read_files table-valued function | Databricks on AWS) There are at least 3 ways to pull raw JSON data into a Spark dataframe: df = spark.read..., df = spark...
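For context, a minimal sketch of the two read styles being compared; the landing path is a hypothetical placeholder, not from the thread, and `spark` is the SparkSession that Databricks notebooks provide by default:

```python
raw_path = "/Volumes/main/default/landing/events/"  # hypothetical landing path

# Option 1: the classic DataFrameReader API.
df_reader = spark.read.format("json").load(raw_path)

# Option 2: the read_files table-valued function, callable from SQL.
# It infers the format from the file extension unless `format` is given.
df_tvf = spark.sql(f"SELECT * FROM read_files('{raw_path}', format => 'json')")
```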
Also, @ChristianRRL, with a slight adjustment to the syntax, it does indeed behave like Autoloader: https://docs.databricks.com/aws/en/ingestion/cloud-object-storage/auto-loader/patterns?language=SQL I'd also advise looking at the different options th...
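As a rough illustration of that "slight adjustment": in SQL the linked patterns page wraps the call as STREAM read_files(...), and the Python equivalent is the cloudFiles (Auto Loader) source. Paths and table names below are made up:

```python
raw_path = "/Volumes/main/default/landing/events/"        # hypothetical
checkpoint = "/Volumes/main/default/checkpoints/events/"  # hypothetical

# Incremental, Auto Loader-style ingestion of the same files.
stream_df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint)
    .load(raw_path)
)

(
    stream_df.writeStream
    .option("checkpointLocation", checkpoint)
    .trigger(availableNow=True)             # pick up new files, then stop
    .toTable("main.default.bronze_events")  # hypothetical target table
)
```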
- 5263 Views
- 8 replies
- 0 kudos
Need help migrating company customer and partner academy accounts to work properly
Hi, originally I accidentally made a Customer Academy account with my company, which is a Databricks partner. Then I made an account using my personal email and listed my company email as the partner email for the Partner Academy account. That account ...
Need help to merge my Customer portal ID with my Partner mail ID; my case number is 00754330.
- 358 Views
- 4 replies
- 2 kudos
Trying to reduce latency on DLT pipelines with Autoloader and derived tables
What I'm trying to achieve: ingest files into bronze tables with Autoloader, then produce Kafka messages for each file ingested using a DLT sink. The issue: the latency between a file being ingested and its message being produced gets exponentially higher the more tables ar...
Hi, I think it is a delay in Autoloader, as it doesn't know about the ingested files yet. It has nothing to do with the state, as it is just Autoloader keeping a list of processed files. Autoloader scans the directory every minute, usually a...
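If directory-listing latency is the bottleneck, one commonly suggested knob is Auto Loader's file notification mode together with an explicit trigger interval; a minimal sketch with hypothetical paths, assuming the workspace has permission to set up cloud notifications:

```python
raw_path = "/Volumes/main/default/landing/orders/"        # hypothetical
checkpoint = "/Volumes/main/default/checkpoints/orders/"  # hypothetical

stream = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    # File notification mode avoids repeated full directory listings,
    # which is where latency tends to grow as more tables are added.
    .option("cloudFiles.useNotifications", "true")
    .option("cloudFiles.schemaLocation", checkpoint)
    .load(raw_path)
)

(
    stream.writeStream
    .option("checkpointLocation", checkpoint)
    .trigger(processingTime="10 seconds")   # poll more often than the default
    .toTable("main.default.bronze_orders")  # hypothetical target table
)
```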
- 256 Views
- 2 replies
- 2 kudos
How to import a sample notebook into an Azure Databricks workspace
In the second onboarding video, the Quickstart Notebook is shown. I found that notebook here: https://www.databricks.com/notebooks/gcp-qs-notebook.html I wanted to import it into my workspace in my Azure Databricks account to play with it. However, selecti...
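If the UI's import-by-URL path rejects the page, one hedged alternative is to download the HTML export and push it through the Workspace Import REST API; this only works if the page is a genuine notebook HTML export, and the host, token, and target path below are placeholders:

```python
import base64
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "<personal-access-token>"                             # placeholder
target_path = "/Users/me@example.com/quickstart-notebook"     # placeholder

# Download the published HTML export of the notebook.
html = requests.get(
    "https://www.databricks.com/notebooks/gcp-qs-notebook.html", timeout=30
).content

# Import it into the workspace as an HTML-format notebook.
resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": target_path,
        "format": "HTML",
        "content": base64.b64encode(html).decode("ascii"),
        "overwrite": True,
    },
    timeout=30,
)
resp.raise_for_status()
```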
- 5956 Views
- 3 replies
- 3 kudos
Resolved! Configure Databricks in VSCode through WSL
Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I am receiving the following error at Step 5 of "Step 4: Set up Databricks authentication"...
What worked for me was NOT opening the browser via the pop-up (which generated the 3-legged-OAuth flow error), but clicking the link provided by the CLI (or copying and pasting the link into the browser).
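Once the OAuth login completes via the copied link, a quick way to confirm the credentials work from inside WSL is a small SDK check; the profile name is an assumption:

```python
from databricks.sdk import WorkspaceClient

# Uses the profile written to ~/.databrickscfg by `databricks auth login`.
w = WorkspaceClient(profile="DEFAULT")  # adjust the profile name if needed

# If authentication is wired up correctly, this prints your workspace user.
print(w.current_user.me().user_name)
```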
- 133 Views
- 1 replies
- 1 kudos
Request to Extend Partner Tech Summit Lab Access
Hi Team, I would appreciate it if my Partner Tech Summit lab access could be extended, as two of the assigned labs were inaccessible. Could you please advise whom I should contact for this? Thank you. Regards, Lakshmipriya
Hi @Lakshmipriya_N, create a support ticket and wait for a reply: Contact Us
- 5421 Views
- 11 replies
- 2 kudos
Resolved! Databricks Certification Exam Got Suspended. Require Immediate Support
Hello @Cert-Team @Certificate Team, Request Id# 00432042. I encountered a pathetic experience while attempting my Databricks Certified Data Engineer Professional certification exam. This is a completely unethical process to harass the examinee and lose...
Hi @Cert-Team, I had a similar issue. My exam got suspended too; I had already completed my exam when it got suspended. So you can either evaluate and provide the results or help me reschedule the exam. I have raised a request - #00750846; it's been mor...
- 229 Views
- 1 replies
- 1 kudos
Resolved! How to install a whl from a Volume for databricks_cluster_policy via Terraform
I would expect
resource "databricks_cluster_policy" "cluster_policy" {
  name = var.policy_name
  libraries {
    Volumes {
      whl = "/Volumes/bronze/config/python.wheel-1.0.3-9-py3-none-any.whl"
    }
  }
}
to work, but Terraform doesn't recognize "volum...
This worked:
resource "databricks_cluster_policy" "cluster_policy" {
  name = var.policy_name
  libraries {
    whl = "/Volumes/bronze/config/python.wheel-1.0.3-9-py3-none-any.whl"
  }
}
- 544 Views
- 1 replies
- 0 kudos
Cannot get tracing to work on a GenAI app deployed on Databricks
Hi, I have a Gradio app that is deployed on Databricks. The app comes from this example provided by Databricks. The app works fine, but when I want to add tracing I cannot get it to work. I keep getting the error mlflow.exceptions.MlflowException...
Hi @MisterT, in our docs it is mentioned that we use MLflow 3 (a major upgrade) with GenAI monitoring enabled. Each agent endpoint is assigned an MLflow experiment, and agent traces from the endpoint are logged to that experiment in real time. Internally an MLF...
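As a rough sketch of wiring traces from app code to a specific experiment (the experiment path and the traced function are made up; the exact setup for serving-hosted apps may differ per the docs referenced above):

```python
import mlflow

# Point MLflow at the Databricks tracking server and pick an experiment
# the app's identity can write to (path is a placeholder).
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/genai-app-traces")

@mlflow.trace  # records inputs, outputs, and latency as a trace span
def answer(question: str) -> str:
    # ... call the model / retriever here ...
    return f"echo: {question}"

answer("What does read_files do?")
```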
- 489 Views
- 3 replies
- 1 kudos
AI Query Prompt Tokens and Completion Tokens
Hi, I would like to know how I can get the completion token and prompt token counts when using ai_query. Thanks
Hello @Andreyai, good day! For ai_query, Databricks has documentation: https://docs.databricks.com/aws/en/sql/language-manual/functions/ai_query I am 100% sure you will get better insights from the documentation. But I have something for...
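I'm not aware of ai_query itself returning token counts directly; one hedged workaround is to call the serving endpoint through its OpenAI-compatible interface, where the response carries a usage block. The host, token, and endpoint name below are placeholders:

```python
from openai import OpenAI

client = OpenAI(
    api_key="<databricks-personal-access-token>",           # placeholder
    base_url="https://<workspace-host>/serving-endpoints",  # placeholder
)

resp = client.chat.completions.create(
    model="databricks-meta-llama-3-3-70b-instruct",  # assumed endpoint name
    messages=[{"role": "user", "content": "Summarize Delta Lake in one line."}],
)

# Token accounting comes back with the response.
print(resp.usage.prompt_tokens, resp.usage.completion_tokens, resp.usage.total_tokens)
```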
- 195 Views
- 1 replies
- 1 kudos
2025 Data + AI World Tour Atlanta
Attending "How to Build Intelligent Agents" at Databricks Data+AI World Tour 2025. #Databricks #Data+AI #DatabricksWorldTour
Great to hear, @agilecoach360! Please share your learnings and experience from the event with the Community; it would be really valuable for everyone. Looking forward to your insights.
- 2565 Views
- 12 replies
- 5 kudos
Passing parameters to a dashboard's data section via asset bundles
New functionality allows deploying dashboards with asset bundles. Here is an example:
# This is the contents of the resulting baby_gender_by_county.dashboard.yml file.
resources:
  dashboards:
    baby_gender_by_county:
      display_name: "Baby gen...
I did, however, just find out that parameterization is possible. I don't know yet how to incorporate it into an asset bundle deploy, but at least I have a first step. You can use SELECT * FROM IDENTIFIER(:catalog || '.' || :schema || '.' || :table) Or hardc...
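For anyone who wants to try that IDENTIFIER pattern outside a dashboard first, a minimal sketch using Spark's named parameter markers; the catalog/schema/table values are placeholders and a recent DBR/Spark with parameterized spark.sql is assumed:

```python
query = """
SELECT *
FROM IDENTIFIER(:catalog || '.' || :schema || '.' || :tbl)
LIMIT 10
"""

# Named parameter markers are bound at execution time, so the same query
# text can be reused across catalogs, schemas, and tables.
df = spark.sql(
    query,
    args={"catalog": "main", "schema": "default", "tbl": "baby_names"},
)
df.show()
```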
- 485 Views
- 2 replies
- 1 kudos
I made an AI assistant for Databricks docs, let me know what you think!
Hello members of the Databricks community! I built this Ask AI chatbot/widget where I gave a custom LLM access to some of Databricks' docs to help answer technical questions for people using Databricks. I tried it on a couple of questions that resembl...
Hi @matte_kapa_bul, how are you doing? First of all, congratulations on the initiative. I’ve tried to do something similar myself, and it’s very useful for locating documentation, but it doesn’t end up being very effective in solving some issues repo...
- 691 Views
- 3 replies
- 0 kudos
Got a "No such file or directory" error while serving the endpoint
Hello everyone! I'm using Databricks for my MLOps learning and following the tutorial, and I got an error while serving the endpoint. I need help with this. Problem overview: I have created a basic LightGBM model and logged it in the Unity Catalog. T...
To give you a quick recap, I’ve consolidated the code into a single file for clarity. Using the Iris dataset as an example, I first create a basic model with the scikit-learn flavor. Then, I create a PyFunc wrapper from the registered model and, fina...
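A condensed sketch of that shape (a plain sklearn-flavor model first, then a PyFunc wrapper that bundles it) for anyone hitting the same path error; the names, URIs, and Iris example are illustrative, not the poster's exact code:

```python
import mlflow
import mlflow.pyfunc
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True, as_frame=True)

# 1) Log a plain sklearn-flavor model.
with mlflow.start_run() as run:
    base = LogisticRegression(max_iter=1000).fit(X, y)
    mlflow.sklearn.log_model(base, artifact_path="base_model")
base_uri = f"runs:/{run.info.run_id}/base_model"

# 2) Wrap it in a PyFunc model, bundling the base model as an artifact so the
#    serving container resolves it via context.artifacts instead of a local path.
class Wrapper(mlflow.pyfunc.PythonModel):
    def load_context(self, context):
        self.model = mlflow.sklearn.load_model(context.artifacts["base_model"])

    def predict(self, context, model_input):
        return self.model.predict(model_input)

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="wrapped_model",
        python_model=Wrapper(),
        artifacts={"base_model": base_uri},  # copied into the model bundle
        input_example=X.head(5),
    )
```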
- 1374 Views
- 1 replies
- 1 kudos
Resolved! Addressing Memory Constraints in Scaling XGBoost and LGBM: A Comprehensive Approach for High-Volume
Scaling XGBoost and LightGBM models to handle exceptionally large datasets—those comprising billions to tens of billions of rows—presents a formidable computational challenge, particularly when constrained by the limitations of in-memory processing o...
Hi @fiverrpromotion, As you mention, scaling XGBoost and LightGBM for massive datasets has its challenges, especially when trying to preserve critical training capabilities such as early stopping and handling of sparse features / high-cardinality cat...
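One hedged direction consistent with that reply is the Spark-native XGBoost estimator, which distributes training across the cluster while keeping early stopping; the column names, data, and settings below are illustrative (xgboost >= 1.7 assumed):

```python
from pyspark.ml.feature import VectorAssembler
from pyspark.sql import functions as F
from xgboost.spark import SparkXGBClassifier

# Synthetic stand-in for a very large table; in practice this would be a
# billions-of-rows Delta table with real feature columns.
df = spark.range(0, 1_000_000).select(
    F.rand(1).alias("f1"),
    F.rand(2).alias("f2"),
    F.rand(3).alias("f3"),
    (F.rand(4) > 0.5).cast("int").alias("label"),
)

assembled = VectorAssembler(
    inputCols=["f1", "f2", "f3"], outputCol="features"
).transform(df)

# Mark a slice of rows as the validation set so early stopping still works
# in the distributed setting.
assembled = assembled.withColumn("is_val", F.rand(seed=42) < 0.1)

clf = SparkXGBClassifier(
    features_col="features",
    label_col="label",
    num_workers=8,                      # one XGBoost worker per Spark task
    validation_indicator_col="is_val",  # rows used only for early stopping
    early_stopping_rounds=20,
    eval_metric="auc",
)
model = clf.fit(assembled)
```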