Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

chalabit
by New Contributor III
  • 711 Views
  • 4 replies
  • 1 kudos

Resolved! Import .py files module does not work on VNET injected workspace

We have a problem importing any Python files as modules on a VNET-injected workspace. For the same folder structure (see below), the imports work on serverless clusters or in a Databricks-managed workspace (i.e. creating a new Azure Databricks workspace without ...

Latest Reply
chalabit
New Contributor III
  • 1 kudos

Redeploying the workspace from the Azure portal worked with the "documentation" VNET injection setup with NSG and NAT gateway. Only added a new NSG rule on top of the deployed rules: Outbound / TCP / source VirtualNetwork / destination AzureDatabricks (service tag) / ports 443, 3306, 8443-8451. No idea where ...

3 More Replies
DBXDeveloper111
by New Contributor III
  • 640 Views
  • 3 replies
  • 1 kudos

ModuleNotFoundError: No module named 'MY-MODEL'

I'm currently trying to create a model serving end point around a model I've recently created. I'm trying to wrap my head around an error. The model is defined as below class MY-MODEL(mlflow.pyfunc.PythonModel): def load_context(self, context): ...

Latest Reply
JAHNAVI
Databricks Employee
  • 1 kudos

@DBXDeveloper111 Could you please define the class as MYMODEL, without the hyphen, and then try importing it? A hyphen is not a valid character in a Python identifier. Please confirm whether you are still facing the issue after this change.

2 More Replies
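As a quick sanity check outside MLflow, plain Python confirms why the hyphenated class name fails (a minimal sketch; `MYMODEL` stands in for any valid replacement name):

```python
# Hyphens are not valid in Python identifiers, which is why
# `class MY-MODEL(...)` is a SyntaxError before MLflow ever sees the model.
print("MY-MODEL".isidentifier())  # False: hyphen is not allowed
print("MYMODEL".isidentifier())   # True: letters, digits, underscores only
```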
freshmint
by New Contributor II
  • 2953 Views
  • 5 replies
  • 0 kudos

How to get Databricks usage invoices?

Hey guys, I'm wondering if there are other people who wanted to see invoices. I've been using Databricks, I registered my credit card, and I've been paying for it. Now I just want to see the invoices, but I can't find them. Is there anybody who has experienced simil...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @freshmint! To clarify, are you looking for invoices related to courses you've purchased, or are you referring to other Databricks services?

4 More Replies
AnneEst
by Databricks Partner
  • 533 Views
  • 1 reply
  • 1 kudos

Resolved! Changing profile from customer to partner

Hello, I was previously registered with a customer profile and have updated my profile to use my work email, which is a partner email, but I am still unable to access the Partner Academy. I tried different things (incognito window, clearing cookies, etc...

Latest Reply
Advika
Community Manager
  • 1 kudos

Hello @AnneEst! If you’re unable to sign in to the Partner Academy using your partner email address, please raise a ticket with the Databricks Support team. They’ll be able to review your profile and help you get access to the Partner Academy.

drag7ter
by Contributor
  • 4779 Views
  • 13 replies
  • 6 kudos

Parameters in dashboards data section passing via asset bundles

New functionality allows deploying dashboards with asset bundles. Here is an example: # This is the contents of the resulting baby_gender_by_county.dashboard.yml file. resources: dashboards: baby_gender_by_county: display_name: "Baby gen...

Latest Reply
protmaks
Databricks MVP
  • 6 kudos

Just three weeks ago, Databricks added the ability to parameterize the catalog and schema; examples here: https://medium.com/@protmaks/dynamic-catalog-schema-in-databricks-dashboards-b7eea62270c6

12 More Replies
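For reference, a dashboard resource in a bundle file looks roughly like this (a sketch based on the public asset-bundle docs; the resource key, display name, file path, and variable are placeholders, not values from this thread):

```yaml
# databricks.yml (excerpt) - hypothetical values throughout
resources:
  dashboards:
    baby_gender_by_county:
      display_name: "Baby gender by county"
      file_path: ./src/baby_gender_by_county.lvdash.json
      warehouse_id: ${var.warehouse_id}
```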
curious_rabbit
by New Contributor II
  • 637 Views
  • 2 replies
  • 0 kudos

Getting Genie to Generate SPC (Control) Charts Reliably

Hi everyone!I’m working on getting Genie to accurately generate Statistical Process Control (SPC) charts when prompted.  I'm looking for suggestions on how to best approach this. So far, I’ve tried using pre-defined SQL queries to select the data, bu...

Latest Reply
curious_rabbit
New Contributor II
  • 0 kudos

Or here is hopefully a more elegant way to phrase my question: to visualise a control diagram in Genie for an end user, should I a) instruct Genie how to create an SPC chart with SQL on the fly, or b) create a background job (pre-defined SQL query in ...

1 More Replies
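For option b), the control limits can be pre-computed by the background job so Genie only has to plot ready-made columns. A minimal sketch using the simple mean ± k·sigma approximation of an individuals chart (the function name and the 3-sigma default are illustrative, not from the thread):

```python
from statistics import mean, stdev

def control_limits(values, sigma=3):
    """Return (center, lcl, ucl) for a simple mean +/- k*sigma SPC chart."""
    center = mean(values)
    spread = stdev(values)  # sample standard deviation of the series
    return center, center - sigma * spread, center + sigma * spread

# Example series; the background job would persist these three columns
# alongside the measurements for Genie to chart.
center, lcl, ucl = control_limits([10, 12, 11, 13, 12])
```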
Primus-Connect
by Databricks Partner
  • 433 Views
  • 0 replies
  • 2 kudos

Calling for Speakers – Manchester Databricks User Group | 19 March 2026

Hi everyone! The Manchester Databricks User Group is looking for speakers for our in-person meetup on Thursday, 19 March 2026. https://www.meetup.com/manchester-databricks-user-group/ We’re keen to hear from Databricks users, practitioners, partners, and...

icurious
by New Contributor II
  • 1226 Views
  • 2 replies
  • 2 kudos

Resolved! Error in creating external iceberg table

I am new to Databricks and was trying to create an Iceberg table. I have configured the credentials and external location using the UI under Catalog > External Locations and Credentials. I am able to create a table by using the Browse feature. But wh...

Latest Reply
icurious
New Contributor II
  • 2 kudos

Thanks, I am able to create a Lakehouse Federation and query the Snowflake catalog. So, if I create an Iceberg table in Databricks, one cannot access the path directly from elsewhere, e.g. if I want to access it from Snowflake, right?

1 More Replies
Anonym40
by New Contributor III
  • 1610 Views
  • 2 replies
  • 3 kudos

Resolved! Ingesting data from APIs

Hi, I need to ingest some data available at an API endpoint. I was thinking of this option: 1. make the API call from a notebook and save the data to ADLS; 2. use Auto Loader to load the data from the ADLS location. But then, I have some doubts - like whether I can directly write ...

Latest Reply
Raman_Unifeye
Honored Contributor III
  • 3 kudos

@Anonym40 - it's generally a good idea to decouple the direct API calls from the rest of your data pipeline. By staging the data in ADLS, you insulate your downstream processes from upstream changes and gain more restartability/maintainability in your e2e flow....

1 More Replies
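The staging pattern recommended above can be sketched as follows: land each raw API payload as its own timestamped JSON file in a landing zone, then let Auto Loader stream the files from there. Everything here is illustrative (a local temp directory stands in for the ADLS landing path, and the function name is made up):

```python
import json
import pathlib
import tempfile
from datetime import datetime, timezone

def stage_payload(payload: dict, landing_dir: str) -> pathlib.Path:
    """Write one API response to the landing zone as a timestamped JSON file."""
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S%f")
    path = pathlib.Path(landing_dir) / f"api_batch_{ts}.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(payload))
    return path

landing = tempfile.mkdtemp()  # stands in for an abfss:// landing path
staged = stage_payload({"id": 1, "value": "a"}, landing)

# Auto Loader would then stream the landing directory, e.g.:
# spark.readStream.format("cloudFiles")
#      .option("cloudFiles.format", "json")
#      .load("abfss://container@account.dfs.core.windows.net/landing/")
```

Because each batch is an immutable file, a failed pipeline run can simply be restarted and re-read the landing zone without re-calling the API.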
ymmmm
by New Contributor III
  • 894 Views
  • 6 replies
  • 1 kudos

Resolved! Account reset and loss of access to paid Databricks Academy Labs subscription

Hello, I am facing an issue with my Databricks Academy account. During a normal sign-in using my usual email address, I was asked to re-enter my first and last name, as if my account was being created again. After that, my account appeared to be reset,...

Latest Reply
ymmmm
New Contributor III
  • 1 kudos

Thank you for your support and your help

5 More Replies
Prathy
by New Contributor II
  • 693 Views
  • 2 replies
  • 3 kudos

Resolved! AWS & Databricks Registration Issue

I have created both an AWS and a Databricks account, but I cannot move to the next steps in the AWS Marketplace (Configure and Launch section)

Latest Reply
Advika
Community Manager
  • 3 kudos

Hello @Prathy! Also, please check out this video: https://www.youtube.com/watch?v=uzjHI0DNbbs. Refer to the deck linked in the video’s description (https://drive.google.com/file/d/1ovZd...) and check slide no. 16, titled “Linking AWS to your Databricks ...

1 More Replies
greengil
by Contributor
  • 2100 Views
  • 10 replies
  • 2 kudos

Create function issue

Hello - I am following some online code to create a function as follows:
-----------------------------------------
CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(col1_value STRING, col2_value INT)
RETURNS BOOLEAN
COMMENT 'Inserts dat...

Latest Reply
iyashk-DB
Databricks Employee
  • 2 kudos

In UC, functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc.). So I tried to create a PROCEDURE and call it, and I was able to insert data into the table successfully. Unity Catalog tools are really jus...

9 More Replies
DataYoga
by New Contributor
  • 5869 Views
  • 4 replies
  • 0 kudos

Informatica ETLs

I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...

Latest Reply
zalmane
New Contributor II
  • 0 kudos

We ended up using the tool from datayoga.io that converts these in a multi-stage approach. It converted to an intermediate representation. Then, from there it gets optimized (a lot of the Informatica actions can be optimized out or compacted) and fin...

3 More Replies
Peter_Theil
by New Contributor II
  • 3226 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks partner journey for small firms

Hello, We are a team of 5 (DE/Architects) exploring the idea of starting a small consulting company focused on Databricks as an SI partner, and we wanted to learn from others who have gone through the partnership journey. I would love to understand how t...

Latest Reply
Louis_Frolio
Databricks Employee
  • 3 kudos

If I’m being completely honest, I haven’t seen any. As you can imagine, partner organizations tend to keep things pretty close to the vest for a variety of reasons. That said, once a new partner is officially enrolled, they are granted access to an e...

2 More Replies
Poorva21
by Contributor II
  • 2205 Views
  • 2 replies
  • 3 kudos

How realistic is truly end-to-end LLMOps on Databricks?

Databricks is positioning the platform as a full stack for LLM development — from data ingestion → feature/embedding pipelines → fine-tuning (Mosaic AI) → evaluation → deployment (Model Serving) → monitoring (Lakehouse Monitoring).I’m curious about r...

Latest Reply
Poorva21
Contributor II
  • 3 kudos

Thank You @Gecofer for taking the time to share such a clear, experience-backed breakdown of where Databricks shines and where real-world LLM Ops architectures still need supporting components. Your explanation was incredibly practical and resonates ...

1 More Replies