- 1531 Views
- 1 replies
- 1 kudos
Spot Databricks VMs - eviction rates
Before using Spot machines in #databricks, it's a good idea to check their eviction rates in your region. Azure Resource Graph Explorer and this simple query will help: SpotResources | where type =~ 'microsoft.compute/skuspotevictionrate/location' ...
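If you want to run the same check from code rather than the portal, here is a minimal Python sketch using azure-identity and azure-mgmt-resourcegraph; the subscription ID, region filter and projected columns are my own illustrative choices, not the full query from the post.

```python
# Minimal sketch: query Azure Resource Graph for spot eviction rates from Python.
# Assumes azure-identity and azure-mgmt-resourcegraph are installed; the KQL below
# is an illustrative completion, not the author's exact query.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest, QueryRequestOptions

SUBSCRIPTION_ID = "<your-subscription-id>"  # placeholder

query = """
SpotResources
| where type =~ 'microsoft.compute/skuspotevictionrate/location'
| where location =~ 'westeurope'
| project skuName = tostring(sku.name),
          evictionRate = tostring(properties.evictionRate)
| order by skuName asc
"""  # swap 'westeurope' for the region your Databricks workspace runs in

client = ResourceGraphClient(DefaultAzureCredential())
request = QueryRequest(
    subscriptions=[SUBSCRIPTION_ID],
    query=query,
    options=QueryRequestOptions(result_format="objectArray"),  # rows as list of dicts
)
response = client.resources(request)
for row in response.data:
    print(row["skuName"], row["evictionRate"])
```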
- 1230 Views
- 0 replies
- 0 kudos
Feedback request for Gradient, a tool to help optimize and monitor jobs automatically
Hi Everyone, We built Gradient, a tool to automatically optimize and monitor Databricks jobs to hit your business objectives of cost or runtime. Gradient works by applying a reinforcement ML model to automatically learn and custom-tune your jobs' cluste...
- 921 Views
- 0 replies
- 0 kudos
Understand why your jobs' performances are changing over time
Hi Folks - We released a new metrics view for Databricks jobs in Gradient, which tracks and plots the metrics below over time to help engineers understand what's going on with their jobs: Job cost (DBU + Cloud fees), Job Runtime, Number of co...
- 3248 Views
- 1 replies
- 1 kudos
Jonathan Frankel at Sigma talk
Hi @Sujitha Just to follow up on your suggestion to share my takeaways from Jonathan Frankel's talk at Sigma in NYC. The key ideas I came away with are: building in-house custom models is more than just possible, there are advantages to it; there's danger...
- 1 kudos
@Danny_Lee This is super insightful! Really appreciate you taking the time to share your key takeaways with us.
- 1774 Views
- 0 replies
- 1 kudos
Databricks AI Security Framework
Today Databricks announced the release of the Databricks AI Security Framework (LinkedIn Post). You can download the paper (PDF) from the blog post. Anyone else downloaded this and have thoughts? My first thought is that it's a great start and has an excellent G...
- 1635 Views
- 0 replies
- 0 kudos
GCP - Initial External Location to GCP Bucket is wrong
When creating a new workspace in GCP, the default GCP external location is wrong. It's easily fixed via Catalog (on the left) > External Data (on the bottom) > External Locations > choose the connection and edit the URL by deleting the second BucketId af...
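For anyone who would rather script the fix than click through the UI, a statement along these lines should also work from a notebook (a sketch only: the location name and bucket are placeholders, and you need the appropriate Unity Catalog privileges on the external location):

```python
# Sketch: repair the default external location URL from a notebook instead of the UI.
# The location name and bucket are placeholders - use the values shown under
# Catalog > External Data > External Locations in your workspace.
spark.sql("""
  ALTER EXTERNAL LOCATION `metastore_default_location`
  SET URL 'gs://<correct-bucket-id>'
  FORCE  -- FORCE may be needed if the location is already referenced
""")
```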
- 1877 Views
- 0 replies
- 0 kudos
Predictive optimization log
After you enable predictive optimization, it's a good idea to look at the system table to see what is going on with your tables. #databricks
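If anyone wants a starting point, the operation history is exposed as a system table; assuming it is the one named below (check the system catalog in Catalog Explorer if yours differs), a quick look could be:

```python
# Sketch: peek at what predictive optimization has been doing to your tables.
# The table name below is an assumption - browse the 'system' catalog
# (system.storage) in Catalog Explorer if yours differs. Runs in a Databricks
# notebook where 'spark' and 'display' are predefined.
history = spark.sql("""
  SELECT *
  FROM system.storage.predictive_optimization_operations_history
  LIMIT 50
""")
display(history)
```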
- 14033 Views
- 2 replies
- 8 kudos
Materials to pass Databricks Data Engineering Associate Exam
Hi guys, I passed it some time ago, but just recently summarized all the materials that helped me do it. Pay special attention to the GitHub repository, which contains many great exercises prepared by the Databricks team: https://youtu.be...
- 8 kudos
Thanks for sharing. It is indeed very useful.
- 1885 Views
- 0 replies
- 0 kudos
Feature article: Leveraging Generative AI with Apache Spark: Transforming Data Engineering
I created this article on LinkedIn to give both this community and the Apache Spark user community access to it. It is particularly useful for data engineers who want a basic understanding of what Generative AI with Spark can do. Leverag...
- 4027 Views
- 1 replies
- 3 kudos
DBR 15.0 beta
Databricks Runtime 15.0 is out there! Some breaking changes. More info here: https://docs.databricks.com/en/release-notes/runtime/15.0.html
- 3 kudos
Thanks for sharing this information @Hubert-Dudek!!!
- 3475 Views
- 1 replies
- 1 kudos
Notebook IDE
This is an excellent step for #databricks notebooks. An integrated debugger and a CLI in the notebook terminal are a big step towards a fully functional cloud IDE.
- 6409 Views
- 2 replies
- 0 kudos
Build a machine learning model to detect fraudulent transactions using PySpark's MLlib library
Introduction: Financial fraud is a significant concern for businesses and consumers alike. I have written about this concern a few times in LinkedIn articles. Machine learning offers powerful tools to combat this issue by automatically identifying sus...
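As a rough, hedged illustration of the kind of pipeline the article describes (the synthetic data, column names and model choice below are placeholders of mine, not the article's code):

```python
# Sketch of a fraud-detection pipeline with PySpark MLlib: assemble numeric features,
# scale them, and fit a classifier on labelled transactions. Runs in a Databricks
# notebook where 'spark' is predefined.
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler, StandardScaler
from pyspark.ml.classification import RandomForestClassifier
from pyspark.ml.evaluation import BinaryClassificationEvaluator

# Tiny synthetic dataset standing in for real transactions; synthetic data avoids
# touching sensitive records. Column names are illustrative placeholders.
transactions = spark.createDataFrame(
    [
        (12.5, 0.1, 9, 0), (890.0, 0.9, 2, 1), (45.0, 0.2, 14, 0),
        (1200.0, 0.8, 3, 1), (33.0, 0.1, 11, 0), (760.0, 0.7, 1, 1),
    ],
    ["amount", "merchant_risk_score", "hour_of_day", "is_fraud"],
)

assembler = VectorAssembler(
    inputCols=["amount", "merchant_risk_score", "hour_of_day"], outputCol="raw_features"
)
scaler = StandardScaler(inputCol="raw_features", outputCol="features")
clf = RandomForestClassifier(labelCol="is_fraud", featuresCol="features")

model = Pipeline(stages=[assembler, scaler, clf]).fit(transactions)
scored = model.transform(transactions)  # in practice, score a held-out split
auc = BinaryClassificationEvaluator(labelCol="is_fraud").evaluate(scored)
print(f"AUC on training data: {auc:.3f}")
```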
- 0 kudos
Looking to build a machine learning model for detecting fraudulent transactions using PySpark's MLlib? Generating synthetic transaction data provides a dataset for model training without using sensitive real-world data and enables the creation of diverse...
- 2086 Views
- 1 replies
- 2 kudos
Is it possible to have class-level separation in Databricks or implement a design pattern in Datab...
If you have thought about making your code inside Databricks notebooks more reusable and organized, and about implementing a design pattern or class-level separation in Databricks, the answer is yes, and I am going to tell you the deta...
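The post is truncated here, so this is not necessarily the author's approach, but one common way to get class-level separation is to keep reusable classes in a plain .py module next to your notebooks and import them; a minimal sketch:

```python
# utils/ingestion.py - a plain Python module checked into the repo alongside notebooks.
# Keeping classes out of notebooks makes them importable, testable with pytest, and reusable.
from pyspark.sql import DataFrame, SparkSession


class BronzeIngestor:
    """Small example of a reusable, single-responsibility class."""

    def __init__(self, spark: SparkSession, source_path: str):
        self.spark = spark
        self.source_path = source_path

    def read(self) -> DataFrame:
        return self.spark.read.format("json").load(self.source_path)


# In a notebook (with Repos / workspace files on the import path):
#   from utils.ingestion import BronzeIngestor
#   df = BronzeIngestor(spark, "/Volumes/raw/events").read()   # path is a placeholder
```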
- 2 kudos
Thanks! I have spent quite some time figuring out what the best way is. Your approach is certainly a valid one. Myself, I prefer to package reused classes in a jar (we mainly code in Scala). Works fine too.
- 6197 Views
- 1 replies
- 1 kudos
Building Event-Driven Real-Time Data Processor with Spark Structured Streaming and API Integration
I recently saw an article from Databricks titled "Scalable Spark Structured Streaming for REST API Destinations" - a great article focusing on continuous Spark Structured Streaming (SSS), about a year old. I then decided, given customer demands, to wo...
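For context, the usual shape of this pattern is foreachBatch with an HTTP client; a minimal, hedged sketch (the endpoint, source path and trigger below are placeholders, not the article's code):

```python
# Sketch: push each micro-batch of a Structured Streaming query to a REST endpoint.
# Endpoint URL, source path and column handling are placeholders.
import requests


def post_batch(batch_df, batch_id: int) -> None:
    # collect() is fine for small batches; for large ones, post per partition instead.
    payload = [row.asDict() for row in batch_df.collect()]
    resp = requests.post("https://example.com/api/events", json=payload, timeout=30)
    resp.raise_for_status()


stream = (
    spark.readStream.format("delta")
    .load("/Volumes/bronze/events")              # placeholder source
    .writeStream
    .foreachBatch(post_batch)
    .option("checkpointLocation", "/Volumes/chk/events_to_api")
    .trigger(availableNow=True)                  # or processingTime="1 minute" for continuous
    .start()
)
```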
- 2203 Views
- 0 replies
- 0 kudos
stored procedures
The plan for stored procedures in Databricks Spark has been announced in a few places. What could stored procedures look like in Spark SQL?
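Since the syntax has only been announced, any concrete answer is speculative; in the meantime, a common stand-in is to wrap parameterized SQL in an ordinary Python function, roughly like this (table and column names are placeholders):

```python
# Sketch: until native stored procedures land, a parameterized Python function
# around Spark SQL is a common stand-in. Table/column names are placeholders.
def archive_old_orders(spark, retention_days: int = 90) -> None:
    """Copy orders older than the retention window to an archive table, then delete them."""
    cutoff = f"date_sub(current_date(), {retention_days})"
    spark.sql(f"INSERT INTO sales.orders_archive SELECT * FROM sales.orders WHERE order_date < {cutoff}")
    spark.sql(f"DELETE FROM sales.orders WHERE order_date < {cutoff}")


# Example call from a notebook or job:
# archive_old_orders(spark, retention_days=180)
```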
- Access Data (1)
- ADF Linked Service (1)
- ADF Pipeline (1)
- Advanced Data Engineering (3)
- AI Agents (1)
- AI Readiness (1)
- Apache spark (1)
- ApacheSpark (1)
- Associate Certification (1)
- Automation (1)
- AWSDatabricksCluster (1)
- Azure (1)
- Azure databricks (3)
- Azure devops integration (1)
- AzureDatabricks (2)
- Big data (1)
- Blog (1)
- Caching (2)
- CICDForDatabricksWorkflows (1)
- Cluster (1)
- Cluster Policies (1)
- Cluster Pools (1)
- Community Event (1)
- Cost Optimization Effort (1)
- custom compute policy (1)
- CustomLibrary (1)
- Data (1)
- Data Analysis with Databricks (1)
- Data Engineering (4)
- Data Governance (1)
- Data Mesh (1)
- Data Processing (1)
- Databricks Assistant (1)
- Databricks Community (1)
- Databricks Delta Table (1)
- Databricks Demo Center (1)
- Databricks Job (1)
- Databricks Migration (2)
- Databricks Mlflow (1)
- Databricks Notebooks (1)
- Databricks Support (1)
- Databricks Unity Catalog (2)
- Databricks Workflows (1)
- DatabricksML (1)
- DBR Versions (1)
- Declartive Pipelines (1)
- DeepLearning (1)
- Delta Live Table (1)
- Delta Live Tables (1)
- Delta Time Travel (1)
- Devops (1)
- DimensionTables (1)
- DLT (2)
- DLT Pipelines (3)
- DLT-Meta (1)
- Dns (1)
- Dynamic (1)
- Free Databricks (3)
- GenAI agent (1)
- GenAI and LLMs (2)
- GenAIGeneration AI (1)
- Generative AI (1)
- Genie (1)
- Governance (1)
- Hive metastore (1)
- Hubert Dudek (1)
- Lakeflow Pipelines (1)
- Lakehouse (1)
- Lakehouse Migration (1)
- Lazy Evaluation (1)
- Learning (1)
- Library Installation (1)
- Llama (1)
- Medallion Architecture (1)
- Migrations (1)
- MSExcel (2)
- Multiagent (1)
- Networking (2)
- Partner (1)
- Performance (1)
- Performance Tuning (1)
- Private Link (1)
- Pyspark (1)
- Pyspark Code (1)
- Pyspark Databricks (1)
- Pytest (1)
- Python (1)
- Reading-excel (1)
- Scala Code (1)
- Scripting (1)
- SDK (1)
- Serverless (2)
- Spark Caching (1)
- SparkSQL (1)
- SQL (1)
- SQL Serverless (1)
- Support Ticket (1)
- Sync (1)
- Tutorial (1)
- Unit Test (1)
- Unity Catalog (4)
- Unity Catlog (1)
- Warehousing (1)
- Workflow Jobs (1)
- Workflows (3)