Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
Explore discussions on Databricks training programs and offerings within the Community. Get insights...
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...
Engage in discussions about the Databricks Free Edition within the Databricks Community. Share insig...
Databricks is positioning the platform as a full stack for LLM development — from data ingestion → feature/embedding pipelines → fine-tuning (Mosaic AI) → evaluation → deployment (Model Serving) → monitoring (Lakehouse Monitoring). I’m curious about r...
Thank you, @Gecofer, for taking the time to share such a clear, experience-backed breakdown of where Databricks shines and where real-world LLM Ops architectures still need supporting components. Your explanation was incredibly practical and resonates ...
I'm a software engineer and a bit new to Databricks. My goal is to create a model serving endpoint that interfaces with several ML models. Traditionally this would look like: API --> Service --> Data. Now using Databricks, my understanding is that it w...
Just register the model in Unity Catalog and then deploy a model serving endpoint that serves it.
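In code, that two-step flow might look like the sketch below, assuming an MLflow model already logged to a run and the databricks-sdk; every name (catalog, schema, model, endpoint, run ID) is a placeholder:

```python
import mlflow
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import EndpointCoreConfigInput, ServedEntityInput

# Step 1: register a logged MLflow model into Unity Catalog
# (all object names below are placeholders).
mlflow.set_registry_uri("databricks-uc")
registered = mlflow.register_model(
    model_uri="runs:/<run_id>/model",
    name="my_catalog.my_schema.my_model",
)

# Step 2: create a serving endpoint that fronts the registered version.
w = WorkspaceClient()
w.serving_endpoints.create(
    name="my-model-endpoint",
    config=EndpointCoreConfigInput(
        served_entities=[
            ServedEntityInput(
                entity_name="my_catalog.my_schema.my_model",
                entity_version=registered.version,
                workload_size="Small",
                scale_to_zero_enabled=True,
            )
        ]
    ),
)
```

For several models, you can either attach multiple served entities to one endpoint or create one endpoint per model, depending on how you want to route traffic.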
Hello - I am following some online code to create a function as follows:

CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(
  col1_value STRING,
  col2_value INT
)
RETURNS BOOLEAN
COMMENT 'Inserts dat...
In UC, functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc.). So I tried to create a PROCEDURE and call it, and I was able to insert data into the table successfully. Unity Catalog tools are really jus...
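For reference, a minimal sketch of the procedure approach, assuming the SQL stored procedures syntax that recently entered preview; every object name here is hypothetical, so verify the exact grammar against the docs:

```python
# Sketch only: assumes the Databricks SQL stored procedures preview.
# All catalog/schema/table names are placeholders.
spark.sql("""
CREATE OR REPLACE PROCEDURE my_catalog.my_schema.insert_data_procedure(
  IN col1_value STRING,
  IN col2_value INT
)
LANGUAGE SQL
AS BEGIN
  INSERT INTO my_catalog.my_schema.my_table VALUES (col1_value, col2_value);
END
""")

# Procedures are invoked with CALL rather than SELECT.
spark.sql("CALL my_catalog.my_schema.insert_data_procedure('abc', 42)")
```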
Real-time mode is a breakthrough that lets Spark utilize all available CPUs to process records with single-millisecond latency, while decoupling checkpointing from per-record processing.
Hi Databricks, our learners want to know when you are disclosing the results of the October Learning badge challenge. Br
Hello @saurabh18cs! Have you checked with your company's Databricks Partner Admin or Databricks Support? They’re the right contacts to provide the official timeline for the Partner Learning Badge Challenge results.
Databricks goes native on Excel. You can now ingest and query .xls/.xlsx files directly in Databricks (SQL + PySpark, batch and streaming), with auto schema/type inference, sheet and cell-range targeting, and evaluated formulas, with no extra libraries required.
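A sketch of what reading one sheet might look like from PySpark; note that the `excel` format name and the `sheetName`/`dataAddress` option names are assumptions based on common Excel-reader conventions, so check the release notes for the exact spellings:

```python
# Sketch only: format/option names ("excel", "sheetName", "dataAddress")
# are assumptions; the Volume path is a placeholder.
df = (
    spark.read.format("excel")
    .option("sheetName", "Q4")          # target a specific sheet
    .option("dataAddress", "A1:F100")   # target a cell range
    .option("inferSchema", "true")      # let column types be inferred
    .load("/Volumes/my_catalog/my_schema/my_volume/report.xlsx")
)
df.display()
```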
Our final webinar of December is here and we are closing the year with a powerhouse session! As many organisations still get stuck in the PoC phase, we’re bringing clarity, structure, and real delivery practices to help teams move from promi...
Appreciate you sharing this with the community, @bianca_unifeye!
I have a Databricks App I need to integrate with volumes using local Python os functions. I've set up a simple test:

def __init__(self, config: ObjectStoreConfig):
    self.config = config
    # Ensure our required paths are created
    ...
I also cannot read from a Volume from a Databricks App and would be interested in a solution.
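One workaround that avoids local os calls entirely is to go through the Files API in the databricks-sdk, provided the app's service principal has been granted READ VOLUME/WRITE VOLUME on the volume. A minimal sketch with placeholder paths:

```python
# Sketch: access a UC Volume from a Databricks App via the Files API
# instead of local os/open() calls. The path is a placeholder, and the
# app's service principal needs READ VOLUME / WRITE VOLUME privileges.
import io
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up the app's injected credentials

volume_path = "/Volumes/my_catalog/my_schema/my_volume/test.txt"

# Upload a file to the volume.
w.files.upload(volume_path, io.BytesIO(b"hello from the app"), overwrite=True)

# Read it back as bytes.
resp = w.files.download(volume_path)
print(resp.contents.read())
```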
I’m exploring how to handle real-time data for an application and I keep seeing Databricks recommended as a strong option — especially with its support for streaming pipelines, Delta Live Tables, and integrations with various event sources. That said...
Databricks is very effective for real-time app data because it supports streaming data processing using Apache Spark and Delta Lake. It helps handle large data volumes, provides low-latency analytics, and makes it easier to build scalable event-drive...
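As a rough illustration of that event-driven pattern, here is a minimal Structured Streaming sketch reading from Kafka into a Delta table; the broker, topic, and paths are placeholders, not values from this thread:

```python
# Minimal Structured Streaming sketch: Kafka -> Delta. Broker, topic,
# table, and checkpoint path are all placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "app-events")
    .load()
)

# Kafka delivers binary key/value columns; cast them for downstream use.
parsed = events.selectExpr(
    "CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp"
)

(
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/Volumes/my_catalog/my_schema/chk/app_events")
    .trigger(processingTime="5 seconds")  # micro-batch cadence
    .toTable("my_catalog.my_schema.app_events")
)
```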
I was wondering if there's a way to alter a table and add/update comments for multiple columns at once using SQL or API calls. For instance -

ALTER TABLE <table_name>
  CHANGE COLUMN <col1> COMMENT '<comment1>',
  CHANGE COLUMN <col2> COMMENT 'comment2';
...
The correct SQL syntax for this is:

ALTER TABLE your_table_name ALTER COLUMN
  col1 COMMENT 'comment1',
  col2 COMMENT 'comment2',
  col3 COMMENT 'comment3';
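Since the question also asked about API calls: if your runtime doesn't support the multi-column form, a fallback is to loop over the columns and issue one single-column statement each, e.g. from PySpark (table and column names are placeholders):

```python
# Fallback sketch for runtimes without multi-column ALTER support:
# one ALTER COLUMN statement per column. Names are placeholders.
comments = {
    "col1": "comment1",
    "col2": "comment2",
    "col3": "comment3",
}
for col, comment in comments.items():
    escaped = comment.replace("'", "''")  # naive SQL-literal escaping
    spark.sql(
        f"ALTER TABLE my_catalog.my_schema.my_table "
        f"ALTER COLUMN {col} COMMENT '{escaped}'"
    )
```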
Tags, whether manually assigned or automatically assigned by the “data classification” service, can be protected using policies. Column masking can automatically mask columns with a given tag for all users except those with elevated access.
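For the mask mechanics underneath such a policy, a sketch with hypothetical names: a masking function gates on group membership, then gets attached to a column (tag-driven policies automate applying this kind of mask):

```python
# Sketch of a Unity Catalog column mask: members of a privileged group
# see the raw value, everyone else sees a redacted one. All object and
# group names are placeholders.
spark.sql("""
CREATE OR REPLACE FUNCTION my_catalog.my_schema.mask_ssn(ssn STRING)
RETURNS STRING
RETURN CASE
  WHEN is_account_group_member('pii_readers') THEN ssn
  ELSE '***-**-****'
END
""")

spark.sql("""
ALTER TABLE my_catalog.my_schema.customers
ALTER COLUMN ssn SET MASK my_catalog.my_schema.mask_ssn
""")
```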
Hi there, just testing the new Databricks Free Edition. Was trying to play around with LLMs, but I'm not able to create serving endpoints with foundational model entities, interact with pay-per-token foundational model APIs, or use them in Databricks a...
I got the same error using the free version.
Hi everyone,I’m exploring ways to leverage Databricks for building data-driven web and mobile applications and wanted to get some insights from this community. Databricks is great for processing large datasets, running analytics, and building machine...
Check out Databricks Apps - you declare Databricks resources for the app and then use the databricks-sdk to interact with them.
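For illustration, a sketch of an app querying a SQL warehouse resource; the DATABRICKS_WAREHOUSE_ID environment variable name is whatever you bind in your app.yaml, so treat it as an assumption here:

```python
# Sketch: inside a Databricks App, credentials are injected automatically
# and declared resources surface as environment variables. The variable
# name DATABRICKS_WAREHOUSE_ID is an assumption (set via app.yaml).
import os
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
warehouse_id = os.environ["DATABRICKS_WAREHOUSE_ID"]

result = w.statement_execution.execute_statement(
    warehouse_id=warehouse_id,
    statement="SELECT current_user() AS me",
    wait_timeout="30s",
)
print(result.result.data_array)
```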
Data classification automatically tags Unity Catalog tables and is now available in system tables as well.
I am writing to request assistance with rescheduling my exam for the Databricks Certified Generative AI Engineer Associate certification. My exam was originally scheduled for today at 7:30 PM. However, when I attempted to access the exam, the Webasse...
Hello @Eswariy, sorry to hear you missed your exam window. Please file a ticket with our support team so they can review the case and determine next steps. Please note that we cannot provide support via the community. Thanks & Regards, @cert-ops