DQX is one of the most crucial Databricks Labs projects this year, and we can expect more and more of its great checks to be supported natively in Databricks. More about DQX at https://databrickslabs.github.io/dqx/
I'm using a job cluster and have created compute policies for library management, and now I'm trying to use pools in Databricks. I'm getting an error like this: Cluster validation error: Validation failed for azure_attributes.spot_bid_max_price from pool, the ...
This error occurs because instance pools require a concrete spot bid max price value, even if the cluster policy marks it as unlimited. Set an explicit value (e.g., 100) directly in the instance pool configuration, or switch the pool to on-demand nodes.
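As a minimal sketch of the fix, assuming the Azure instance pools create API: the pool name and node type below are placeholders, and the key point is giving `spot_bid_max_price` an explicit value so policy validation can pass.

```python
# Hypothetical payload for POST /api/2.0/instance-pools/create.
# Setting a concrete spot_bid_max_price (instead of leaving it
# unset/unlimited) is what resolves the validation error.
pool_spec = {
    "instance_pool_name": "jobs-spot-pool",   # placeholder name
    "node_type_id": "Standard_DS3_v2",        # placeholder node type
    "azure_attributes": {
        "availability": "SPOT_AZURE",
        # Percentage of the on-demand price; 100 caps the bid at on-demand.
        "spot_bid_max_price": 100,
    },
}

print(pool_spec["azure_attributes"]["spot_bid_max_price"])  # → 100
```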
I'm running a DBR/Spark job using a container. I've set docker_image.url to `docker.io/databricksruntime/standard:13.3-LTS`, as well as the Spark env var `DATABRICKS_RUNTIME_VERSION=13.3`. At runtime, however, I'm encountering this error: ImportError...
To fix:
1. Go to Compute → your cluster / job compute.
2. Change the Databricks Runtime to Databricks Runtime 13.3 LTS.
3. Re-run your job with the same container.
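As a sketch, the job's cluster spec should pin a runtime that matches the container tag. The field names follow the Clusters API shape, but the exact `spark_version` string and node type here are assumptions; check the versions your workspace lists.

```python
# Hypothetical new_cluster block for a job: the Databricks Runtime
# version (spark_version) should match the databricksruntime image tag,
# otherwise runtime imports inside the container can fail.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",  # assumed DBR 13.3 LTS version string
    "node_type_id": "Standard_DS3_v2",    # placeholder node type
    "num_workers": 2,
    "docker_image": {
        "url": "docker.io/databricksruntime/standard:13.3-LTS",
    },
}
```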
I am writing to request assistance with rescheduling my exam for the Databricks Certified Generative AI Engineer Associate certification. My exam was originally scheduled for today at 7:30 PM. However, when I attempted to access the exam, the Webasse...
Hey everyone, I’m currently on my journey to prepare for the Databricks Certified Data Engineer Professional exam and have been exploring multiple resources, including Databricks documentation, hands-on labs, and practice exercises. Midway through my...
Hey everyone, we are trying to get an overview of all users in our Databricks groups. We have tried the REST API as well as SQL queries (with normal developer accounts as well as workspace administrator accounts). The pr...
Use the Databricks SQL system users table:

SELECT * FROM system.users

Note that it only shows fully provisioned users; users pending invitation may not appear.
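Another route is the SCIM Users API (GET /api/2.0/preview/scim/v2/Users), which also returns invited users. Here is a paging sketch with the HTTP call stubbed out so the pagination logic is the focus; `fetch_page` is a placeholder for your actual authenticated request.

```python
def list_all_users(fetch_page, page_size=100):
    """Collect every user from a SCIM-style Users endpoint.

    fetch_page(start_index, count) is a placeholder for an authenticated
    GET /api/2.0/preview/scim/v2/Users?startIndex=...&count=... call
    that returns the parsed JSON body.
    """
    users, start = [], 1  # SCIM startIndex is 1-based
    while True:
        page = fetch_page(start, page_size)
        resources = page.get("Resources", [])
        users.extend(resources)
        # Stop when a page is empty or we've walked past totalResults.
        if not resources or start + len(resources) > page.get("totalResults", 0):
            return users
        start += len(resources)

# Stubbed directory of 3 users served in pages of 2:
def fake_fetch(start, count):
    directory = [{"userName": f"user{i}@example.com"} for i in range(3)]
    return {"totalResults": 3, "Resources": directory[start - 1:start - 1 + count]}

print(len(list_all_users(fake_fetch, page_size=2)))  # → 3
```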
Hi Databricks community, I want to create a Lakebase table that is synced with a Delta table, so that whenever the Delta table is updated the changes are available in the Lakebase table. Now I want to create a Databricks Streamlit application and ma...
Yes, it’s possible to have a Lakebase table synced with a Delta table in Unity Catalog. You have a few options:
- Direct read: Register the Delta table in Unity Catalog and query it directly from your Streamlit app.
- Delta Live Tables (DLT): Create a DLT...
When something goes wrong and your jobs follow a MERGE-per-day pattern, backfill runs let you reload many days in one shot.
Full link to the blog for reference: https://www.databricks.com/blog/announcing-backfill-runs-lakeflow-jobs-higher-quality-downstream-data
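The idea behind a backfill run can be sketched as a loop over the missed dates; `run_daily_merge` below is a hypothetical stand-in for whatever per-day MERGE your job already does.

```python
from datetime import date, timedelta

def backfill_dates(start, end):
    """Yield every date from start to end, inclusive."""
    d = start
    while d <= end:
        yield d
        d += timedelta(days=1)

def backfill(start, end, run_daily_merge):
    # run_daily_merge(day) is a placeholder for your existing per-day
    # MERGE job; a backfill simply replays it for each missed day.
    for day in backfill_dates(start, end):
        run_daily_merge(day)

days = list(backfill_dates(date(2025, 1, 1), date(2025, 1, 5)))
print(len(days))  # → 5
```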
Hi everyone,I’m exploring ways to leverage Databricks for building data-driven web and mobile applications and wanted to get some insights from this community. Databricks is great for processing large datasets, running analytics, and building machine...
To connect Databricks with web or mobile apps, most developers recommend exposing your data or models through a lightweight API layer. Use Databricks SQL Endpoints or MLflow model serving to generate secure REST endpoints your app can call directly. ...
With the first day of December comes the first window of our Databricks Advent Calendar. It’s a perfect time to look back at this year’s biggest achievements and surprises — and to dream about the new “presents” the platform may bring us next year. ...
Fantastic kickoff to the Databricks Advent Calendar 2025! I appreciate you steering the series, @Hubert-Dudek!
Hi Databricks community, I have a doubt regarding Databricks Streamlit applications. I have a Databricks Streamlit application that takes input values from the user through the Streamlit UI. Now I want to store these input values in a delta table in U...
Hi @gokkul,
Your app's service principal needs the proper permissions to write to the UC table. You also need to use the Databricks Python SDK to interact with UC objects (i.e. read/save a table). You can get some inspiration from the following Databricks cookbo...
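A minimal sketch of the write path, with SQL execution stubbed out so the statement construction is the focus. The table name and `execute` callable are placeholders (in a real app, `execute` would be a databricks-sql-connector cursor or an SDK call), and a parameterized INSERT avoids splicing user input into the SQL string.

```python
def insert_user_input(execute, table, values):
    """Build and run a parameterized INSERT for user-supplied values.

    execute(sql, params) is a placeholder for however your app runs SQL;
    table is a fully qualified Unity Catalog name such as
    my_catalog.app_schema.user_inputs (hypothetical).
    """
    cols = ", ".join(values)                # column names, in dict order
    marks = ", ".join(["?"] * len(values))  # parameter markers
    sql = f"INSERT INTO {table} ({cols}) VALUES ({marks})"
    execute(sql, list(values.values()))
    return sql

captured = []
sql = insert_user_input(
    lambda q, p: captured.append((q, p)),   # stub executor for the sketch
    "my_catalog.app_schema.user_inputs",    # placeholder UC table
    {"name": "Ada", "score": 42},
)
print(sql)  # → INSERT INTO my_catalog.app_schema.user_inputs (name, score) VALUES (?, ?)
```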
Hi everyone,I’m working on building and optimizing data pipelines in Databricks, especially for large-scale workloads, and I want to learn from others who have hands-on experience with performance tuning, architecture decisions, and best practices.I’...
Optimizing Databricks pipelines for large-scale workloads mostly comes down to smart architecture plus efficient Spark practices. Key tips from real-world users:
- Use Delta Lake for ACID transactions, incremental updates, and schema enforcement.
- Partition...
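A couple of these tips translate into routine Delta maintenance statements. The table name and Z-ORDER columns below are placeholders; the statements are only built here, to be run through `spark.sql` or a SQL warehouse.

```python
def maintenance_statements(table, zorder_cols, retain_hours=168):
    # table and zorder_cols are placeholders for your own Delta table.
    # OPTIMIZE + ZORDER compacts small files and co-locates hot columns;
    # VACUUM removes files older than the retention window (in hours).
    return [
        f"OPTIMIZE {table} ZORDER BY ({', '.join(zorder_cols)})",
        f"VACUUM {table} RETAIN {retain_hours} HOURS",
    ]

stmts = maintenance_statements("main.sales.orders", ["order_date", "region"])
print(stmts[0])  # → OPTIMIZE main.sales.orders ZORDER BY (order_date, region)
```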
Calling All Data Enthusiasts in Kerala! Hey everyone,I'm excited about the idea of launching a Databricks Community Group here in Kerala! This group would be a hub for learning, sharing knowledge, and networking among data enthusiasts, analysts, a...
Great initiative! It's good to see the tech community growing here. I’m representing Fegno Technologies, a web and mobile app development company in Kochi. We are always keen to stay updated on the latest data engineering trends and cloud platforms.
Hi, today I went to access the legacy hive_metastore catalog in Databricks Free Edition and it wasn't there. Is it no longer available in this version, or has it changed? Thanks, Fabio
The Free Edition has had only Unity Catalog from the beginning; there is no legacy hive_metastore.
With the new ALTER SET, it is really easy to migrate (copy/move) tables. It is also quite handy when you need to do an initial load and have an old system under Lakehouse Federation (foreign tables).