Events
Stay updated on Databricks events, including webinars, conferences, and workshops. Discover opportunities to connect with industry experts, learn about new technologies, and network with peers.
How to Register & Prepare

If you're interested in advancing your skills with Databricks through a Specialist Session, here's a clear guide on how to register and which free courses you can take to prepare effectively.

How to Begin Your Learning Path

Step 1: Go to the Databricks Training portal and create a free Academy account if you don't have one.
Step 2: If you are new to Databricks, complete the fundamentals courses (detailed below) and the Get Started Series / Get Started Days.
Step 3: Consider instructor-led intermediate to advanced courses if you wish to deepen your expertise or obtain certification. Contact your Databricks account team for packages; these courses are also available self-paced.
Step 4: Register for your Specialist Session.

How to Register for a Databricks Specialist Session

Q&A Session Registration Pages: EMEA Timezone and AMER Timezone

Specialist Sessions run once a month with Databricks experts, and you can ask questions during the recorded session. On-demand versions of the sessions are also available in the Databricks Academy (Customer Link - Partner Link), with subtitles in English, French and Spanish.

Recommended Free Databricks Fundamentals Courses

Databricks now offers all self-paced training for free. Here are some of the best free introductory courses:

Databricks Fundamentals: Understand the architecture and benefits of the Databricks Lakehouse Platform in about 60 minutes.
Generative AI Fundamentals: A video series if you're interested in the AI/ML aspects of Databricks. 120 mins.
AI Agent Fundamentals: An introductory course on building and applying AI agents for enterprises using Databricks, with demos on the Mosaic AI platform and Agent Bricks. 90 mins.

Upcoming Specialist Sessions with recommended preparation courses:

Specialist Session: Oct 14th 2025 - Databricks Apps (EMEA | AMER)
Related Fundamental Sessions: Get Started with Databricks for Data Engineering; Get Started with Databricks for Machine Learning
Related Intermediate + Advanced Sessions: NEW: Building Enterprise Applications with Databricks Apps (first session Oct 17th)

Specialist Session: Nov 25th 2025 - Managing Databricks at scale using Terraform (EMEA | AMER)
Related Fundamental Sessions: Get Started with Databricks Platform Administration; Get Started with Databricks for Data Engineering
Related Intermediate + Advanced Sessions: Data Management and Governance with Unity Catalog; Automated Deployment with Databricks Asset Bundles. Full courses: Advanced Data Engineering; Apache Spark Developer. Use the Virtual Learning Festival for a 50% certification voucher.

Extra Tips for Success

Hands-on Practice: Use the Databricks Free Edition with sample datasets and labs.
Consult Documentation: Supplement your learning with the extensive Databricks technical documentation and notebook galleries.
Community Support: Leverage the Databricks Community forums for Q&A and study tips from peers and experts.

Quick Links: [EMEA Registration Page] [AMER Registration] [Get Started Series / Get Started Days] [Learning Catalog] [Virtual Learning Festival] - complete a full persona-based course and receive a 50% off certification voucher (worth $100)

Start your journey now to maximise your Specialist Session experience with strong foundational knowledge!
🔊🌟 Calling all data engineers 🌟🔊

Join us for Databricks DevConnect Washington DC on Tuesday, November 4 from 5:00pm - 9:00pm ET!

➡️ RSVP HERE: luma.com/ena8irzu

Databricks DevConnect is a technical meetup designed for data engineers to get stuck into the details, collaborate, learn, and find your people - hosted by the Databricks Developer Relations team. Together, we'll chat about new features, dive deep into meaty topics, share real-world experiences, and discover new ways to use Databricks to be the best developer you can be.

Why You Can't Miss This Event:

🏆 Become a 10x data engineer by using the best of Databricks for your Data + AI projects.
🧠 Learn from experts you won't hear anywhere else: Product Managers, DevRel, and MVPs.
💬 Connect with your community and build relationships with peers, Developer Advocates, MVPs and Databricks Product Managers.
👕 Get those goodies: swag will be raffled off throughout the event, and attendees will get access to hands-on training and guided labs through Databricks Academy Labs after the event!

AGENDA

5:00 PM: Registration & Mingling
6:00 PM: Welcome Remarks - Denny Lee, Product Management Director, Databricks
6:15 PM ➡️ Session #1: Apache Iceberg™, Delta Lake & Unity Catalog: How to make them interoperable in your Lakehouse - Lisa Cao, Staff Developer Relations, Databricks
6:45 PM ➡️ Session #2: Lakebase: Adding OLTP to the acronym stack of ELT, OLAP, BI & AI - Denny Lee, Product Management Director, Databricks
7:15 PM ➡️ Session #3: Don't spend more than you have to with Cost Controls on Databricks - Nick Karpov, Staff Developer Advocate
7:55 PM: Closing Remarks - Denny Lee, Product Management Director, Databricks
8:00 PM: Networking Reception
9:00 PM: Good night

SESSION DESCRIPTIONS

UC: The Everything Catalog
It's been four years since Unity Catalog was introduced, originally to manage access to organised data and files. Today it's the backbone of many exciting advancements, including federated connectivity to third-party services, table format agnosticism, automatically delivered system tables, lineage, background table optimisations, and more. Join us to hear the latest developments and what the future holds.

Lakebase: Adding OLTP to the acronym stack of ELT, OLAP, BI & AI
In June, Databricks announced its acquisition of Neon, a managed Postgres company. This isn't just any old Postgres instance: it has separated storage and compute (sound familiar?), branching, and now an integration with a unified data + AI platform. Together we'll go through what transactional data can do for you, and how it works with Databricks.

Don't spend more than you have to with Cost Controls on Databricks
No one likes being slapped with a massive cloud bill, especially one caused by silly mistakes. Databricks has invested heavily in cost controls so you know how much you're spending, prevent accidental spend, keep to your budget, and charge back to that vibe-coding data scientist who insisted on GPUs.

SPEAKERS
Denny Lee, Product Management Director, Databricks
Lisa Cao, Staff Developer Relations, Databricks
Nick Karpov, Staff Developer Advocate
National Union Building, F Street Northwest, Washington, DC, USA
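The cost-controls session above covers platform features, but the underlying budget/chargeback idea can be sketched in a few lines. This is a hedged illustration only: the record fields (workspace, dbus, rate_per_dbu), the budgets dict, and all figures are hypothetical assumptions for the example, not a real Databricks billing or cost-controls API.

```python
# Hedged sketch of a per-workspace budget check over hypothetical usage
# records. Field names and rates are illustrative assumptions, not a real
# Databricks cost-controls API.

def check_budgets(usage_records, budgets):
    """Sum spend per workspace and flag any workspace over its budget."""
    spend = {}
    for rec in usage_records:
        cost = rec["dbus"] * rec["rate_per_dbu"]
        spend[rec["workspace"]] = spend.get(rec["workspace"], 0.0) + cost
    # Workspaces with no configured budget are treated as unlimited.
    overruns = {ws: amt for ws, amt in spend.items()
                if amt > budgets.get(ws, float("inf"))}
    return spend, overruns

usage = [
    {"workspace": "prod", "dbus": 120.0, "rate_per_dbu": 0.5},
    {"workspace": "dev",  "dbus": 300.0, "rate_per_dbu": 0.5},
]
spend, overruns = check_budgets(usage, {"prod": 100.0, "dev": 100.0})
print(overruns)  # → {'dev': 150.0}
```

In practice the platform's built-in budget alerts and tagging do this tracking for you; the sketch just shows the shape of the calculation.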
Join us next Thursday with Product Manager Pranav Aurora for another high-impact virtual session: how to bring the data intelligence from your Lakehouse to all your apps and users.

What we'll cover:
Using Lakebase to serve applications with ultra-low latency requirements. (Lakebase Postgres is a fully managed, cloud-native PostgreSQL database that brings online transaction processing (OLTP) capabilities to the Lakehouse.)
One-click sync of Lakehouse tables into Lakebase. It just works, without the operational overhead.
Syncing Postgres tables back into the Lakehouse for downstream analytics.

Who should join:
Product marketers, data engineers, analytics leads, or anyone ready to move from "we have good data" to "we're using it live, at scale, in real time".

Why you'll leave energized:
You'll walk away with a practical mental model of a modern data-architecture loop: Lakehouse → Lakebase → Lakehouse.
You'll see how the combination cuts latency, simplifies infrastructure, and keeps governance strong.
Fully managed sync: no external tools or extra permissioning. It just works out of the box.
You'll leave inspired to build the next-gen operational data platform (yes, you're up to this).

Date: Thursday, November 13, 2025
Time: 9:00 am PT
Recording: https://community.databricks.com/t5/databrickstv/bricktalks-serve-intelligence-from-your-lakehouse-to-your-apps/ba-p/138980

About BrickTalks
BrickTalks are monthly, expert-led sessions hosted by the Databricks Community team. Each event spotlights real-world innovation from Databricks specialists, helping you stay at the forefront of data and AI. Don't miss this chance to learn directly from the PM leading the charge, ask your questions live, and see what's coming next.
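The Lakehouse → Lakebase → Lakehouse loop described above can be sketched as a toy round trip. This is a hedged illustration only: sqlite3 stands in for Lakebase Postgres, plain Python lists stand in for Lakehouse tables, and the explicit copies stand in for the managed one-click sync, which in the real product needs no such code.

```python
import sqlite3

# Toy round trip through the Lakehouse -> Lakebase -> Lakehouse loop.
# sqlite3 is a stand-in for Lakebase Postgres; the lists are stand-ins
# for Lakehouse tables. All table and column names are made up.

lakehouse_customers = [(1, "Ada", "gold"), (2, "Grace", "silver")]

db = sqlite3.connect(":memory:")  # the operational (OLTP) store
db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, tier TEXT)")
db.execute("CREATE TABLE events (customer_id INTEGER, action TEXT)")

# 1. "Sync" a Lakehouse table into the operational store.
db.executemany("INSERT INTO customers VALUES (?, ?, ?)", lakehouse_customers)

# 2. An app serves low-latency point lookups against it.
tier = db.execute("SELECT tier FROM customers WHERE id = ?", (1,)).fetchone()[0]

# 3. The app records transactions as it runs ...
db.execute("INSERT INTO events VALUES (?, ?)", (1, "upgrade"))

# 4. ... which "sync" back to the Lakehouse for downstream analytics.
lakehouse_events = db.execute("SELECT customer_id, action FROM events").fetchall()
print(tier, lakehouse_events)  # → gold [(1, 'upgrade')]
```

The point of the loop is that steps 1 and 4 are managed for you, so application code only ever sees a normal Postgres connection.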
🎉 Pune, get ready for another epic data day! 🚀

The Databricks Pune User Group Community Meetup is back - bigger, bolder, and packed with insights, innovation, and networking!

📍 Hosted by: EPAM Systems, Pune
📅 Date & Time: November 15, 2025 | 09:30 AM – 02:30 PM IST
📌 Venue: EPAM Systems, 11th Floor, Malpani Agile, Pancard Club Road, Baner, Pune, Maharashtra 411069
🗺 Map: https://maps.app.goo.gl/6jnx1kf7uHhzA6cv8
📝 Registration: https://forms.gle/AqXoHZtWGSgPGWek7 - RSVP soon!

✨ Agenda Highlights
09:30–10:00 | Registration
10:00–10:15 | Welcome Note & Pune Community Introduction - Databricks + EPAM Systems
10:15–11:00 | EPAM: Building Conversational Multi-Agent Systems using Databricks - Satish Gunisetty, Narasimha Modugula (EPAM Systems)
11:05–11:50 | Accelerating migrations with Lakebridge - Aman Jain, Rutuja Pathak (Databricks)
11:50–12:15 | High Tea & Networking
12:40–13:15 | Databricks Apps to the Rescue: Simplifying UI Needs for AI & Data Workflows - Manish Pansari, NiteshChand Sharma (EPAM Systems)
13:20–13:55 | Deployments made easy with DABs - Ajinkya Netke, Vidya Jasud (Databricks)
14:00–14:15 | Quiz
14:15–14:30 | Closing Remarks followed by Lunch

💡 Why Attend?
Learn from real-world Databricks projects and innovations
Network with data and AI experts across Pune
Participate in fun quizzes and win cool goodies
Enjoy lunch and meaningful conversations with peers

⚠️ Note: Please bring a valid Government ID (Aadhaar / PAN / DL) for entry.

Let's make this another unforgettable milestone for the Pune Databricks Community! 💙 See you there!

#Databricks #Pune #CommunityMeetup #EPAM #AI #DataEngineering #Lakebridge #DABs #PuneEvents
🚀 Grow Your Skills and Earn Rewards! 🚀

📅 Mark your calendar: January 09 – January 30, 2026

🎉 Join us for a three-week event dedicated to learning, upskilling, and advancing your career in data engineering, analytics, machine learning, and generative AI. Whether you're new to the field or aiming to deepen your knowledge, this is a great opportunity to invest in your professional development alongside a global community.

🎟️ Finish all of the modules listed within at least one of the self-paced learning pathways below within Customer Academy during the event window to receive:
✨ 50% discount on any Databricks Certification
✨ 20% discount on a yearly Databricks Academy Labs subscription

💬 All incentives will be distributed to eligible participants on 06 February 2026, after the event has concluded. These incentives will be sent to the email associated with your Customer Academy account.

⚠️ Note: Please ensure every component of the course is marked as completed (including any course introduction sections) during the eligibility window to confirm your learning pathway is properly recorded and to qualify for the incentives.

Start exploring our learning pathways below!
LEARNING PATHWAY 1: ASSOCIATE DATA ENGINEERING
Qualification criteria: complete all 4 modules from the Data Engineer Learning Plan
Data Ingestion with Lakeflow Connect
Deploy Workloads with Lakeflow Jobs
Build Data Pipelines with Lakeflow Declarative Pipelines
DevOps Essentials for Data Engineering

LEARNING PATHWAY 2: PROFESSIONAL DATA ENGINEERING
Qualification criteria: complete all 4 modules from the Data Engineer Learning Plan
Databricks Streaming and Lakeflow Declarative Pipelines
Databricks Data Privacy
Databricks Performance Optimisation
Automated Deployment with Databricks Asset Bundles

LEARNING PATHWAY 3: DATA ANALYST
Qualification criteria: complete both modules from the Data Analyst Learning Plan
AI/BI for Data Analysts
SQL Analytics on Databricks

LEARNING PATHWAY 4: ASSOCIATE ML PRACTITIONER
Qualification criteria: complete all 4 modules from the Machine Learning Practitioner Learning Plan
Data Preparation for Machine Learning
Machine Learning Model Development
Machine Learning Model Deployment
Machine Learning Operations

LEARNING PATHWAY 5: PROFESSIONAL ML PRACTITIONER
Qualification criteria: complete both modules from the Machine Learning Practitioner Learning Plan
Advanced Machine Learning Operations
Machine Learning at Scale

LEARNING PATHWAY 6: GENERATIVE AI ENGINEERING
Qualification criteria: complete all 4 modules from the Generative AI Engineering Learning Plan
Generative AI Solution Development
Generative AI Application Development
Generative AI Application Evaluation and Governance
Generative AI Application Deployment and Monitoring

LEARNING PATHWAY 7: APACHE SPARK DEVELOPER
Qualification criteria: complete all 4 modules from the Apache Spark™ Developer Learning Plan
Introduction to Apache Spark™
Developing Applications with Apache Spark™
Stream Processing and Analytics with Apache Spark™
Monitoring and Optimizing Apache Spark™ Workloads on Databricks

LEARNING PATHWAY 8: DATA WAREHOUSING PRACTITIONER
Qualification criteria: complete all 3 modules from the Data Warehousing Practitioner Learning Plan
Data Warehousing with Databricks SQL
Programming and Procedural Logic in Databricks
Data Modelling Strategies

Enrollment links for each pathway are available in Customer Academy.

Happy learning! 📚🎓🧑‍💻

#Databricks #MachineLearning #DataEngineering #AI #GenerativeAI #DataAnalysis #Upskill #Certification #TechCareers #LearningFestival #DataScience #AICommunity #DatabricksAcademy #DataSkills #AITraining #LearntoEarn