Tuesday, January 28, 2025 | 9:00 AM-11:00 AM PT
Register Now
Healthcare faces an evolving regulatory landscape aimed at improving both system efficiency and patient outcomes. One prominent example from the Centers for Medicare & Medicaid Services (CMS) is the CMS Interoperability and Patient Access Final Rule, which was expanded in 2024 with several new provisions to improve data exchange practices.
Of particular importance is accelerating turnaround times to support timely access to care. Beginning January 1, 2026, expedited requests must be handled within 72 hours. What makes this challenging from a data perspective? Requests arrive through varying channels (phone, fax, email) and in both structured and unstructured formats (PDFs, CSVs, etc.).
With a modern data intelligence platform, you can ingest any data type and any interoperability standard (HL7, FHIR, DICOM) to deliver faster turnaround times.
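To make the ingestion step concrete, here is a minimal sketch of flattening a FHIR resource into tabular rows, the kind of transformation that sits between raw FHIR payloads and an analytics table. It uses only the Python standard library; the patient values are invented, and the field names follow the FHIR R4 Patient schema.

```python
import json

# A minimal FHIR Bundle with one Patient resource. Field names follow the
# FHIR R4 Patient schema; the values are made up for illustration.
raw_bundle = json.dumps({
    "resourceType": "Bundle",
    "entry": [
        {"resource": {
            "resourceType": "Patient",
            "id": "example-1",
            "birthDate": "1980-04-12",
            "name": [{"family": "Doe", "given": ["Jane"]}],
        }}
    ],
})

def flatten_patients(bundle_json: str) -> list[dict]:
    """Extract Patient resources from a FHIR Bundle into flat rows
    suitable for loading into a table."""
    bundle = json.loads(bundle_json)
    rows = []
    for entry in bundle.get("entry", []):
        res = entry.get("resource", {})
        if res.get("resourceType") != "Patient":
            continue
        name = (res.get("name") or [{}])[0]
        rows.append({
            "patient_id": res.get("id"),
            "family_name": name.get("family"),
            "given_name": " ".join(name.get("given", [])),
            "birth_date": res.get("birthDate"),
        })
    return rows

rows = flatten_patients(raw_bundle)
```

In a production pipeline the same flattening logic would run over streamed bundles at scale; the point here is only the shape of the nested-to-tabular step.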
Join Databricks, Redox and XponentL Data for this workshop, where we’ll explore:
The evolving regulatory landscape around interoperability
Ingesting data from the healthcare ecosystem including FHIR
Supporting predictive analytics to shorten turnaround times
Delivering data intelligence via FHIR to downstream applications to improve care
Agenda
9:00 AM - 9:05 AM
Welcome and Introductions
9:05 AM - 9:20 AM
The Data Intelligence Platform
9:20 AM - 10:00 AM
Best Practices for Healthcare Use Cases
10:00 AM - 10:20 AM
Demo: Ingesting FHIR and Democratizing Intelligence with AI/BI
10:20 AM - 10:40 AM
Demo: Delivering Data Intelligence via FHIR to the Healthcare Ecosystem
10:40 AM - 11:00 AM
Live Q&A
Speakers
Tuesday, February 11, 2025 | 9:30 AM-1:30 PM ET
Montreal - McGill
2000 Avenue McGill College, Suite 1400
Montreal, Quebec H3A 3H3
To Secure Your Spot Click Here
The Azure Databricks Data Intelligence Platform allows your entire organization to use data and AI. It’s built on the lakehouse, a unified system to query and manage all data across the enterprise, providing an open, unified foundation for all data and governance, and is powered by a Data Intelligence Engine that understands the uniqueness of your data. From ETL to data warehousing to generative AI, Azure Databricks helps you simplify and accelerate your data and AI goals. Join Databricks and Microsoft to learn best practices for implementing a complete data analytics, data engineering and data science lifecycle on the lakehouse architecture with the Azure Databricks Data Intelligence Platform.
This live, hands-on lab will teach you how to:
Access all your data — structured, semi-structured, unstructured — with a lakehouse architecture
Use Databricks SQL to query and visualize data in your lakehouse
Train models and create predictions with Azure Databricks
Track experiments and tune hyperparameters with MLflow
Deploy and serve models with MLflow and other Azure services
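The experiment-tracking ideas behind the MLflow portion of the lab can be illustrated without MLflow itself. This toy sketch mimics the concept of `mlflow.log_param`/`mlflow.log_metric`: each run records its hyperparameters and metrics so runs can be compared afterward. The `Run` class, run names, and numbers are all invented for illustration and are not the MLflow API.

```python
# Toy illustration of experiment tracking: each run records its
# hyperparameters and metrics so you can compare runs later. This is NOT
# the MLflow API -- just the idea behind logging params and metrics.
class Run:
    def __init__(self, name):
        self.name = name
        self.params = {}
        self.metrics = {}

    def log_param(self, key, value):
        self.params[key] = value

    def log_metric(self, key, value):
        self.metrics[key] = value

def best_run(runs, metric):
    """Pick the run with the highest value for the given metric."""
    return max(runs, key=lambda r: r.metrics.get(metric, float("-inf")))

# Two hypothetical tuning runs with different learning rates.
runs = []
for lr, acc in [(0.1, 0.82), (0.01, 0.91)]:
    run = Run(f"lr={lr}")
    run.log_param("learning_rate", lr)
    run.log_metric("accuracy", acc)
    runs.append(run)

winner = best_run(runs, "accuracy")
```

MLflow adds persistence, a UI, and model artifacts on top of this idea; the lab covers the real API.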
Agenda
9:30 AM
Databricks & Microsoft Keynote
10:15 AM
Set Up Your Workspace
10:30 AM
Data Engineering Hands-On Lab
11:30 AM
Lunch & Networking Break
12:15 PM
Data Science and ML Hands-On Lab
12:55 PM
Databricks SQL Hands-On Lab
1:30 PM
Event Concludes
Thursday, February 13, 2025 | 9:00 AM-11:30 AM PT
To Register For The Event Click Here
Most organizations run complex cloud data architectures that silo applications, users and data. As a result, most analysis is performed on stale data, and there is no single source of truth for analytics.
Join this interactive hands-on workshop to learn how Databricks SQL allows you to operate a multicloud lakehouse architecture that delivers data warehouse performance at data lake economics — with up to 12x better price/performance than traditional cloud data warehouses. Now data analysts and scientists can work with the freshest and most complete data and quickly derive new insights for accurate decision making.
Here’s what we’ll cover:
An overview of the Data Intelligence Platform and how Databricks SQL fits in, enabling you to operate a multicloud lakehouse architecture that delivers data warehouse performance at data lake economics
How to manage and monitor compute resources, data access and users across your lakehouse infrastructure using Databricks SQL
How to use Databricks SQL to query directly on your data lake using your tools of choice or the built-in SQL editor, visualizations and dashboards
How to use AI to increase productivity when querying, completing code, or building dashboards
Led by Databricks instructors, this interactive live workshop will give you opportunities to practice during the hands-on lab and ask questions.
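Since Databricks SQL runs inside a workspace, here is a small stand-in for the style of query the hands-on lab covers, executed against an in-memory SQLite database via the Python standard library. The table name and values are invented; the GROUP BY aggregation is the kind of result you would visualize as a dashboard chart.

```python
import sqlite3

# Stand-in for a lakehouse table: an in-memory SQLite database with a
# small, invented orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("west", 120.0), ("west", 80.0), ("east", 50.0)],
)

# The kind of aggregation you'd chart on a dashboard: total amount per
# region, in a stable order.
totals = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
```

The SQL itself is portable; in the lab the same query pattern runs against Delta tables through the Databricks SQL editor.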
Register today!
Agenda
9:00 AM - 9:05 AM
Welcome and Introduction
9:05 AM - 9:30 AM
Product Overview
9:30 AM - 11:00 AM
Guided Hands-on Lab
11:00 AM - 11:30 AM
Live Q&A
11:30 AM
Conclusion
Overview
Databricks Get Started Days is a half-day virtual event to sharpen your data engineering and analysis skills. This interactive training is the perfect way to dive in and learn about Databricks. Here’s what makes Get Started Days special:
Practical Learning: Demo sessions tailored for beginners.
Quick & Convenient: Two sessions in just half a day.
Global Access: Choose a time zone that fits your schedule.
Exclusive Perks: Attend and complete the survey to unlock a coupon for 30 days of free access to Databricks Academy Labs!
This event is held at an APJ-friendly time.
Registration
Customer Academy Link: https://customer-academy.databricks.com/learn/courses/3125/databricks-get-started-days
Partner Academy Link: https://partner-academy.databricks.com/learn/courses/3125/databricks-get-started-days
Course Description: Get Started with Databricks for Data Engineering
In this course, you will learn basic skills that will allow you to use the Databricks Data Intelligence Platform to perform a simple data engineering workflow and support data warehousing endeavors. You will be given a tour of the workspace and be shown how to work with objects in Databricks such as catalogs, schemas, volumes, tables, compute clusters and notebooks. You will then follow a basic data engineering workflow to perform tasks such as creating and working with tables, ingesting data into Delta Lake, transforming data through the medallion architecture, and using Databricks Workflows to orchestrate data engineering tasks. You’ll also learn how Databricks supports data warehousing needs through the use of Databricks SQL, Delta Live Tables, and Unity Catalog. With the purchase of a Databricks Labs subscription, the course also closes out with a comprehensive lab exercise to practice what you’ve learned in a live Databricks Workspace environment.
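The medallion architecture the course walks through can be sketched in a few lines: bronze holds raw records as ingested, silver holds validated and typed rows, and gold holds aggregates ready for BI. Plain Python dicts stand in for Delta tables here, and the records are invented; this is the pattern, not a Databricks API.

```python
# Bronze layer: raw records as ingested, strings and all, including one
# malformed row (invented data).
bronze = [
    {"id": "1", "amount": "10.5", "status": "ok"},
    {"id": "2", "amount": "bad", "status": "ok"},   # malformed amount
    {"id": "3", "amount": "4.5", "status": "err"},
]

def to_silver(rows):
    """Silver layer: cast types and drop rows that fail validation."""
    silver = []
    for row in rows:
        try:
            silver.append({"id": int(row["id"]),
                           "amount": float(row["amount"]),
                           "status": row["status"]})
        except ValueError:
            continue  # a real pipeline would quarantine these records
    return silver

def to_gold(rows):
    """Gold layer: one business-ready metric (total amount per status)."""
    gold = {}
    for row in rows:
        gold[row["status"]] = gold.get(row["status"], 0.0) + row["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
```

In the course, the same bronze/silver/gold progression is built with Delta Lake tables and orchestrated with Databricks Workflows.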
Course Description: Get Started with SQL Analytics and BI on Databricks
In this course, you will learn basic skills that will allow you to use the Databricks Data Intelligence Platform to perform a simple data analytics workflow and support data warehousing endeavors. You will be given a tour of the workspace and be shown how to work with data objects in Databricks such as catalogs, schemas, tables, compute clusters, notebooks, and dashboards. You will then follow a basic data analytics workflow to perform tasks such as manipulating data using Databricks SQL, leveraging Delta Lake’s version history for time travel, creating dashboards within the platform, and creating Genie Spaces for data exploration using natural language prompts. You will also learn how Databricks supports data warehousing needs through the use of Databricks SQL, Delta Live Tables, and Unity Catalog.
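The time-travel idea mentioned above rests on Delta Lake keeping a log of table versions, which is what makes queries like `SELECT ... VERSION AS OF n` possible. This toy class illustrates the concept only: every write appends a new snapshot instead of overwriting the old one. The class and data are invented and are not the Delta Lake API.

```python
# Toy illustration of versioned tables: each write creates a new snapshot,
# so any historical version can still be read. Delta Lake implements this
# far more efficiently via a transaction log; this is just the concept.
class VersionedTable:
    def __init__(self):
        self._versions = [[]]  # version 0 is the empty table

    def write(self, rows):
        """Append a new snapshot rather than overwriting the last one."""
        self._versions.append(list(rows))

    def read(self, version=None):
        """Read the latest version, or any historical one by number."""
        if version is None:
            version = len(self._versions) - 1
        return self._versions[version]

t = VersionedTable()
t.write([{"id": 1}])                 # creates version 1
t.write([{"id": 1}, {"id": 2}])      # creates version 2
```

Reading an older version is exactly what a time-travel query does; Delta Lake additionally supports querying by timestamp.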
Overview
Databricks Get Started Days is a half-day virtual event to sharpen your data engineering and analysis skills. This interactive training is the perfect way to dive in and learn about Databricks. Here’s what makes Get Started Days special:
Practical Learning: Demo sessions tailored for beginners.
Quick & Convenient: Two sessions in just half a day.
Global Access: Choose a time zone that fits your schedule.
Exclusive Perks: Attend and complete the survey to unlock a coupon for 30 days of free access to Databricks Academy Labs!
This event is held at an EMEA-friendly time.
Registration
Customer Academy Link: https://customer-academy.databricks.com/learn/courses/3125/databricks-get-started-days
Partner Academy Link: https://partner-academy.databricks.com/learn/courses/3125/databricks-get-started-days
Course Description: Get Started with Databricks for Data Engineering
In this course, you will learn basic skills that will allow you to use the Databricks Data Intelligence Platform to perform a simple data engineering workflow and support data warehousing endeavors. You will be given a tour of the workspace and be shown how to work with objects in Databricks such as catalogs, schemas, volumes, tables, compute clusters and notebooks. You will then follow a basic data engineering workflow to perform tasks such as creating and working with tables, ingesting data into Delta Lake, transforming data through the medallion architecture, and using Databricks Workflows to orchestrate data engineering tasks. You’ll also learn how Databricks supports data warehousing needs through the use of Databricks SQL, Delta Live Tables, and Unity Catalog. With the purchase of a Databricks Labs subscription, the course also closes out with a comprehensive lab exercise to practice what you’ve learned in a live Databricks Workspace environment.
Course Description: Get Started with SQL Analytics and BI on Databricks
In this course, you will learn basic skills that will allow you to use the Databricks Data Intelligence Platform to perform a simple data analytics workflow and support data warehousing endeavors. You will be given a tour of the workspace and be shown how to work with data objects in Databricks such as catalogs, schemas, tables, compute clusters, notebooks, and dashboards. You will then follow a basic data analytics workflow to perform tasks such as manipulating data using Databricks SQL, leveraging Delta Lake’s version history for time travel, creating dashboards within the platform, and creating Genie Spaces for data exploration using natural language prompts. You will also learn how Databricks supports data warehousing needs through the use of Databricks SQL, Delta Live Tables, and Unity Catalog.
Overview
Databricks Get Started Days is a half-day virtual event to sharpen your data engineering and analysis skills. This interactive training is the perfect way to dive in and learn about Databricks. Here’s what makes Get Started Days special:
Practical Learning: Demo sessions tailored for beginners.
Quick & Convenient: Two sessions in just half a day.
Global Access: Choose a time zone that fits your schedule.
Exclusive Perks: Attend and complete the survey to unlock a coupon for 30 days of free access to Databricks Academy Labs!
This event is held at an AMER-friendly time.
Registration
Customer Academy Link: https://customer-academy.databricks.com/learn/courses/3125/databricks-get-started-days
Partner Academy Link: https://partner-academy.databricks.com/learn/courses/3125/databricks-get-started-days
Course Description: Get Started with Databricks for Data Engineering
In this course, you will learn basic skills that will allow you to use the Databricks Data Intelligence Platform to perform a simple data engineering workflow and support data warehousing endeavors. You will be given a tour of the workspace and be shown how to work with objects in Databricks such as catalogs, schemas, volumes, tables, compute clusters and notebooks. You will then follow a basic data engineering workflow to perform tasks such as creating and working with tables, ingesting data into Delta Lake, transforming data through the medallion architecture, and using Databricks Workflows to orchestrate data engineering tasks. You’ll also learn how Databricks supports data warehousing needs through the use of Databricks SQL, Delta Live Tables, and Unity Catalog. With the purchase of a Databricks Labs subscription, the course also closes out with a comprehensive lab exercise to practice what you’ve learned in a live Databricks Workspace environment.
Course Description: Get Started with SQL Analytics and BI on Databricks
In this course, you will learn basic skills that will allow you to use the Databricks Data Intelligence Platform to perform a simple data analytics workflow and support data warehousing endeavors. You will be given a tour of the workspace and be shown how to work with data objects in Databricks such as catalogs, schemas, tables, compute clusters, notebooks, and dashboards. You will then follow a basic data analytics workflow to perform tasks such as manipulating data using Databricks SQL, leveraging Delta Lake’s version history for time travel, creating dashboards within the platform, and creating Genie Spaces for data exploration using natural language prompts. You will also learn how Databricks supports data warehousing needs through the use of Databricks SQL, Delta Live Tables, and Unity Catalog.