- 711 Views
- 4 replies
- 1 kudos
Resolved! Import .py files module does not work on VNET injected workspace
We have a problem importing any Python files as modules on a VNET-injected workspace. For the same folder structure (see below), the imports work on serverless clusters or in a Databricks-managed workspace (i.e. creating a new Azure Databricks workspace without ...
- 1 kudos
Redeploying the workspace from the Azure portal worked with the "documentation" VNET injection setup with NSG and NAT gateway. Only added a new NSG rule on top of the deployed rules: Outbound / TCP / source VirtualNetwork / source port Any / destination AzureDatabricks (service tag) / ports 443, 3306, 8443-8451. No idea where ...
- 640 Views
- 3 replies
- 1 kudos
ModuleNotFoundError: No module named 'MY-MODEL'
I'm currently trying to create a model serving endpoint around a model I've recently created, and I'm trying to wrap my head around an error. The model is defined as below: class MY-MODEL(mlflow.pyfunc.PythonModel): def load_context(self, context): ...
- 1 kudos
@DBXDeveloper111 could you please rename the class to something like MYMODEL, without the hyphen, and then try importing it again? A hyphen is not valid in a Python identifier. Please confirm whether you are still facing the issue after this change.
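To illustrate the point above, here is a minimal sketch that runs without MLflow installed; the real class would subclass mlflow.pyfunc.PythonModel, and the name MYMODEL is just an example:

```python
# MLflow refers to the model class by name; a hyphen makes that name an
# illegal Python identifier, so `class MY-MODEL(...)` cannot even be parsed.
print("MY-MODEL".isidentifier())  # False: hyphen is not allowed
print("MYMODEL".isidentifier())   # True: valid identifier

# Renamed without the hyphen, the definition parses and imports normally.
# (Plain class used here so the sketch runs without MLflow installed.)
class MYMODEL:
    def load_context(self, context):
        self.context = context

print(MYMODEL.__name__)  # MYMODEL
```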
- 2953 Views
- 5 replies
- 0 kudos
How to get Databricks usage invoices?
Hey guys, I'm wondering if there are people who wanted to see invoices? I've been using Databricks and I registered my credit card, and I've been paying for it. Now I just want to see the invoices, but I can't find them. Is there anybody who has experienced simil...
- 0 kudos
Hello @freshmint! To clarify, are you looking for invoices related to courses you've purchased, or are you referring to other Databricks services?
- 533 Views
- 1 reply
- 1 kudos
Resolved! Changing profile from customer to partner
Hello, I was previously registered with a customer profile and have updated my profile to use my work email, which is a partner email, but I am still unable to access the Partner Academy. I tried different things (incognito window, clearing cookies, etc...
- 1 kudos
Hello @AnneEst! If you’re unable to sign in to the Partner Academy using your partner email address, please raise a ticket with the Databricks Support team. They’ll be able to review your profile and help you get access to the Partner Academy.
- 4779 Views
- 13 replies
- 6 kudos
Parameters in dashboards data section passing via asset bundles
New functionality allows deploying dashboards with asset bundles. Here is an example: # This is the contents of the resulting baby_gender_by_county.dashboard.yml file. resources: dashboards: baby_gender_by_county: display_name: "Baby gen...
- 6 kudos
Just three weeks ago, Databricks added the ability to parameterize the catalog and schema; examples here: https://medium.com/@protmaks/dynamic-catalog-schema-in-databricks-dashboards-b7eea62270c6
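For reference, a dashboard resource with a parameterized catalog and schema might look roughly like the sketch below. The key names (display_name, file_path, warehouse_id) and the ${var...} interpolation follow the asset-bundle schema as I understand it; all values are placeholders, so verify against your Databricks CLI version:

```yaml
# Hypothetical databricks.yml fragment; all values are placeholders.
variables:
  catalog:
    default: main
  schema:
    default: reporting
  warehouse_id:
    default: "1234567890abcdef"

resources:
  dashboards:
    baby_gender_by_county:
      display_name: "Baby gender by county"
      file_path: ./src/baby_gender_by_county.lvdash.json
      warehouse_id: ${var.warehouse_id}
```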
- 637 Views
- 2 replies
- 0 kudos
Getting Genie to Generate SPC (Control) Charts Reliably
Hi everyone! I'm working on getting Genie to accurately generate Statistical Process Control (SPC) charts when prompted, and I'm looking for suggestions on how best to approach this. So far, I've tried using pre-defined SQL queries to select the data, bu...
- 0 kudos
Or, here is hopefully a more elegant way to phrase my question: to visualise a control diagram in Genie for an end user, should I a) instruct Genie how to create an SPC chart with SQL on the fly, or b) create a background job (pre-defined SQL query in ...
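Whichever option you choose, something has to compute the control limits. A classic Shewhart chart uses the series mean plus or minus three standard deviations; a minimal sketch (using the plain sample standard deviation rather than the moving-range estimate a strict I-MR chart would use):

```python
from statistics import mean, stdev

def control_limits(values, k=3.0):
    """Center line and upper/lower control limits: mean +/- k * sigma."""
    m = mean(values)
    s = stdev(values)
    return {"center": m, "ucl": m + k * s, "lcl": m - k * s}

# Example: limits for a small series of measurements.
print(control_limits([10.1, 9.8, 10.3, 10.0, 9.9, 10.2]))
```

Whether Genie builds this on the fly or a scheduled job materializes it into a table, the chart itself only needs the value column plus the three limit columns.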
- 433 Views
- 0 replies
- 2 kudos
Calling for Speakers – Manchester Databricks User Group | 19 March 2026
Hi everyone, the Manchester Databricks User Group is looking for speakers for our in-person meetup on Thursday, 19 March 2026. https://www.meetup.com/manchester-databricks-user-group/ We're keen to hear from Databricks users, practitioners, partners, and...
- 1226 Views
- 2 replies
- 2 kudos
Resolved! Error in creating external iceberg table
I am new to Databricks and was trying to create an Iceberg table. I have configured the credentials and external location using the UI under Catalog > External Locations and Credentials. I am able to create a table by using the Browse feature. But wh...
- 2 kudos
Thanks, I am able to create a Lakehouse Federation and query the Snowflake catalog. So, if I create an Iceberg table in Databricks, one cannot access the path directly from elsewhere, e.g. if I want to access it from Snowflake, right?
- 1610 Views
- 2 replies
- 3 kudos
Resolved! Ingesting data from APIs
Hi, I need to ingest some data available at an API endpoint. I was thinking of this option: 1. make the API call from a notebook and save the data to ADLS; 2. use Auto Loader to load the data from the ADLS location. But then, I have some doubts - like I can directly write ...
- 3 kudos
@Anonym40 - it's generally a good idea to decouple the direct API calls from the rest of your data pipeline. By staging the data to ADLS, you are protecting your downstream processes from upstream issues and getting more restartability/maintainability in your e2e flow....
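The staging step the reply describes can be sketched in plain Python: write newline-delimited JSON into a date-partitioned folder that Auto Loader can later watch. The base_dir here is a local stand-in for an ADLS path (e.g. an abfss:// URI), and all names are illustrative:

```python
import datetime
import json
import pathlib
import uuid

def stage_records(records, base_dir):
    """Write records as one newline-delimited JSON file under a date partition."""
    partition = datetime.date.today().isoformat()
    out_dir = pathlib.Path(base_dir) / f"ingest_date={partition}"
    out_dir.mkdir(parents=True, exist_ok=True)
    # Unique file name so repeated API pulls never overwrite each other.
    out_file = out_dir / f"part-{uuid.uuid4().hex}.json"
    with out_file.open("w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
    return out_file

# Example: stage a fake API payload into a local "landing" folder.
print(stage_records([{"id": 1}, {"id": 2}], "landing"))
```

An Auto Loader stream (cloudFiles source) pointed at the landing folder would then pick up each new part file incrementally, which is what gives you the restartability the reply mentions.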
- 894 Views
- 6 replies
- 1 kudos
Resolved! Account reset and loss of access to paid Databricks Academy Labs subscription
Hello, I am facing an issue with my Databricks Academy account. During a normal sign-in using my usual email address, I was asked to re-enter my first and last name, as if my account was being created again. After that, my account appeared to be reset,...
- 693 Views
- 2 replies
- 3 kudos
Resolved! AWS & Databricks Registration Issue
I have created both AWS and Databricks accounts, but I cannot move to the further steps in the AWS Marketplace (configure and launch section).
- 3 kudos
Hello @Prathy! Also, please check out this video: https://www.youtube.com/watch?v=uzjHI0DNbbs. Refer to the deck linked in the video's description (https://drive.google.com/file/d/1ovZd...) and check slide no. 16, titled "Linking AWS to your Databricks ...
- 2100 Views
- 10 replies
- 2 kudos
Create function issue
Hello - I am following some online code to create a function as follows: CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(col1_value STRING, col2_value INT) RETURNS BOOLEAN COMMENT 'Inserts dat...
- 2 kudos
In UC, functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc.). So I tried creating a PROCEDURE and calling it instead, and I was able to insert data into the table successfully. Unity Catalog tools are really jus...
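The workaround the reply describes might look roughly like the sketch below. This is a hedged illustration, not the poster's actual code: the procedure and table names are made up, and SQL procedures are a recent Databricks SQL feature, so check the CREATE PROCEDURE syntax against your runtime's documentation before relying on it.

```sql
-- Hypothetical example: a procedure may contain DML, unlike a UC SQL UDF.
CREATE OR REPLACE PROCEDURE my_catalog.my_schema.insert_data_procedure(
  col1_value STRING,
  col2_value INT
)
LANGUAGE SQL
AS BEGIN
  INSERT INTO my_catalog.my_schema.my_table (col1, col2)
  VALUES (col1_value, col2_value);
END;

CALL my_catalog.my_schema.insert_data_procedure('example', 42);
```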
- 5869 Views
- 4 replies
- 0 kudos
Informatica ETLs
I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...
- 0 kudos
We ended up using the tool from datayoga.io, which converts these in a multi-stage approach. It converts to an intermediate representation; then, from there, it gets optimized (a lot of the Informatica actions can be optimized out or compacted) and fin...
- 3226 Views
- 3 replies
- 3 kudos
Resolved! Databricks partner journey for small firms
Hello, we are a team of 5 (DE/architects) exploring the idea of starting a small consulting company focused on Databricks as an SI partner, and we wanted to learn from others who have gone through the partnership journey. I would love to understand how t...
- 3 kudos
If I’m being completely honest, I haven’t seen any. As you can imagine, partner organizations tend to keep things pretty close to the vest for a variety of reasons. That said, once a new partner is officially enrolled, they are granted access to an e...
- 2205 Views
- 2 replies
- 3 kudos
How realistic is truly end-to-end LLMOps on Databricks?
Databricks is positioning the platform as a full stack for LLM development — from data ingestion → feature/embedding pipelines → fine-tuning (Mosaic AI) → evaluation → deployment (Model Serving) → monitoring (Lakehouse Monitoring).I’m curious about r...
- 3 kudos
Thank you, @Gecofer, for taking the time to share such a clear, experience-backed breakdown of where Databricks shines and where real-world LLMOps architectures still need supporting components. Your explanation was incredibly practical and resonates ...