- 225 Views
- 1 replies
- 1 kudos
Python async DB connection to Databricks
How to connect to Azure Databricks using SQLAlchemy asynchronously?
The Databricks SQL Connector is fundamentally synchronous: it doesn't have native async/await support. This creates an inherent mismatch when you want to use it in an async Python application. Why this matters: when you make a database query ...
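A minimal sketch of the usual workaround, assuming the databricks-sql-connector package and placeholder connection details: run the synchronous connector on a worker thread so the event loop never blocks.

```python
# Sketch: run the synchronous Databricks SQL Connector from async code by
# offloading blocking calls to a worker thread with asyncio.to_thread.
import asyncio

from databricks import sql  # pip install databricks-sql-connector


def run_query(query: str):
    # Blocking call; executes on a worker thread, not on the event loop.
    with sql.connect(
        server_hostname="<workspace-host>",      # placeholder
        http_path="<warehouse-http-path>",       # placeholder
        access_token="<personal-access-token>",  # placeholder
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()


async def main():
    rows = await asyncio.to_thread(run_query, "SELECT 1")
    print(rows)


asyncio.run(main())
```

If you go through SQLAlchemy, the databricks-sqlalchemy dialect sits on this same synchronous connector, so the same thread-offloading pattern applies.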
- 253 Views
- 3 replies
- 3 kudos
Streamlining App Development with Databricks Marketplace and API Integrations
Hi everyone, I’m interested in discussing how Databricks Marketplace and API integrations can be leveraged to speed up and simplify application development. With the growing need for data-driven apps, I’d like to hear from the community about practica...
Leveraging Databricks Marketplace and APIs can significantly speed up data-driven app development. The Marketplace gives quick access to validated datasets, ML models, and connectors, reducing time spent on sourcing and infrastructure setup. For inte...
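As one concrete illustration of the API side, a minimal sketch using the databricks-sdk package to trigger a workspace job from application code (the job ID is a placeholder; credentials come from the standard DATABRICKS_HOST/DATABRICKS_TOKEN environment variables or a config profile):

```python
# Sketch: trigger a pre-built Databricks job from an application.
from databricks.sdk import WorkspaceClient

# Authenticates from environment variables or ~/.databrickscfg.
w = WorkspaceClient()

# Start the job and block until it finishes.
run = w.jobs.run_now(job_id=123456789).result()  # placeholder job ID
print(run.state)
```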
- 445 Views
- 1 replies
- 1 kudos
Creating links in embedded dashboards
I am curious what Databricks AI/BI's capabilities are in terms of linking out to other dashboards from within your dashboards. Can I create a text box that links out to another dashboard? I noticed you can also create links in a pivot table. What will happen if I e...
You can add a text widget to the canvas to insert hyperlinks to any URL, including other dashboards. In pivot tables, you can define URL paths for cell values from the Actions tab to make them clickable. For navigation within a dashboard, use “drill th...
- 170 Views
- 1 replies
- 0 kudos
Attempting to Use VSCode Extension: Error: No sync destination found
Hello, I have the Databricks Extension + Databricks Connect Python module installed. I am connected to the target environment with a running cluster and I have all green checkmarks under the "Configuration" section. However, when testing a Python file a...
The error message "No sync destination found" when running your Python file in Databricks with the Extension and Databricks Connect module, even with all green configuration checkmarks, typically indicates a configuration issue with the sync destinat...
- 178 Views
- 2 replies
- 0 kudos
What’s the easiest way to clean and transform data using PySpark in Databricks?
You have some raw data (like messy Excel files, CSVs, or logs) and you want to prepare it for analysis — by removing errors, fixing missing values, changing formats, or combining columns — using PySpark (Python for Apache Spark) inside Databricks.
The easiest way to clean and transform data using PySpark in Databricks is by leveraging the DataFrame API. Start by loading data into a Spark DataFrame with spark.read. Use built-in functions like dropna, fillna, and withColumn to handle missing val...
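A minimal sketch of that flow in a Databricks notebook; the file path, column names, and target table are hypothetical, and spark is the session Databricks provides:

```python
# Sketch: load raw CSV data, clean it, and save it as a Delta table.
from pyspark.sql import functions as F

df = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/raw/orders.csv")  # hypothetical path
)

clean = (
    df.dropna(subset=["order_id"])  # drop rows missing the key
    .fillna({"quantity": 0})        # default missing quantities
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn(
        "customer_name",
        F.concat_ws(" ", F.col("first_name"), F.col("last_name")),
    )
)

clean.write.mode("overwrite").format("delta").saveAsTable("silver.orders")
```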
- 277 Views
- 1 replies
- 0 kudos
How to Leverage Databricks for End-to-End AI Model Development
Hi everyone, I’m exploring how to use Databricks as a platform for end-to-end AI and machine learning model development, and I’d love to get insights from professionals and practitioners who have hands-on experience. Specifically, I’m curious about: Set...
You can leverage Databricks for end-to-end AI model development by using its Lakehouse Platform, which unifies data engineering, analytics, and machine learning in one workspace. Start by ingesting and transforming data using Apache Spark and Delta L...
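For the experiment-tracking step specifically, a minimal sketch with MLflow autologging (scikit-learn is used here only as a stand-in model; on Databricks the run appears in the workspace experiment UI):

```python
# Sketch: train a model with MLflow autologging capturing params/metrics.
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

mlflow.autolog()

X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))
```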
- 2513 Views
- 4 replies
- 2 kudos
Resolved! Try Databricks sign up failed
Hi, I am trying to use Databricks with the community edition. However, when I tried to create an account, the sign-up failed after I completed the puzzle.
- 188 Views
- 1 replies
- 1 kudos
We are going to need a little more information to better help you. What is the scenario? Louis
- 243 Views
- 1 replies
- 1 kudos
Why keep both Azure OpenAI and Databricks?
Hi everyone, I’m curious to hear your thoughts on the benefits of having both Azure OpenAI and Azure Databricks within the same ecosystem. From what I can see, Databricks provides a strong foundation for data engineering, governance, and model lifecycl...
Two use cases I can think of: RAG: Use Databricks for vector indexing (e.g., via Delta Lake or FAISS) and Azure OpenAI for inference. Example: A chatbot that queries Databricks-hosted documents and uses GPT-4 for response generation. Agentic Workflows:...
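A minimal sketch of that RAG pattern, assuming the openai (>=1.0) package; the endpoint, deployment name, and retrieval stub are placeholders, and in practice the retrieval step would query a Databricks Vector Search or FAISS index:

```python
# Sketch: retrieval from Databricks-hosted documents (stubbed) feeding
# Azure OpenAI for generation.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<api-key>",                                        # placeholder
    api_version="2024-02-01",
)


def retrieve_context(question: str) -> str:
    # Placeholder: query a Databricks Vector Search index (or FAISS over
    # Delta tables) and return the top matching chunks.
    return "...top-k document chunks..."


question = "What is our refund policy?"
response = client.chat.completions.create(
    model="<gpt-4-deployment-name>",  # your Azure OpenAI deployment
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {
            "role": "user",
            "content": f"Context:\n{retrieve_context(question)}\n\nQuestion: {question}",
        },
    ],
)
print(response.choices[0].message.content)
```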
- 256 Views
- 2 replies
- 1 kudos
Resolved! Ingesting data from APIs like Shopify (for orders), Meta Ads, Google Ads, etc.
Hi, I am trying to create some tables by calling APIs of Shopify/Meta Ads/Google Ads and so on. Where will I make the API call? Is making API calls in Notebooks considered a standard way to ingest in these cases? I intend to make a daily call to ge...
Hello @int32lama, I can help you with that if you are interested.
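For reference, a minimal sketch of the notebook pattern the question describes; the shop domain, token, and target table are placeholders, and Shopify's real API also requires pagination handling:

```python
# Sketch: call a REST API from a scheduled Databricks notebook and land
# the raw payload in a bronze Delta table.
import json

import requests

resp = requests.get(
    "https://<shop>.myshopify.com/admin/api/2024-01/orders.json",  # placeholder
    headers={"X-Shopify-Access-Token": "<token>"},                 # placeholder
    params={"status": "any"},
    timeout=30,
)
resp.raise_for_status()
orders = resp.json().get("orders", [])

if orders:
    # Keep the raw JSON as strings; parse and flatten downstream.
    df = spark.createDataFrame([(json.dumps(o),) for o in orders], ["raw_json"])
    df.write.mode("append").format("delta").saveAsTable("bronze.shopify_orders")
```

Scheduling a notebook like this as a daily Databricks job is a common ingestion pattern for sources without a native connector.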
- 126 Views
- 0 replies
- 4 kudos
Databricks One
Databricks One is a user interface designed for business users, giving them a single, intuitive entry point to interact with data and AI in Azure Databricks, without needing to navigate technical concepts such as clusters, queries, models, or noteboo...
- 158 Views
- 0 replies
- 0 kudos
SQL warehouse: A materialized view is the simplest and most cost-efficient way to transform your data
Materialized views running on a SQL warehouse are super cost-efficient; additionally, they are a really simple and powerful data engineering tool - just be sure that Enzyme updates them incrementally. Read more: - https://databrickster.medium.com/sql-wa...
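A minimal sketch of such a materialized view; the schema and columns are hypothetical, and materialized views generally require Unity Catalog and a pro or serverless SQL warehouse:

```python
# Sketch: define a materialized view; Enzyme can then refresh it
# incrementally instead of recomputing the full aggregation each time.
spark.sql("""
    CREATE OR REPLACE MATERIALIZED VIEW sales_daily AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM sales.orders
    GROUP BY order_date
""")
```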
- 187 Views
- 0 replies
- 2 kudos
The purpose of your All-Purpose Cluster
A small, hidden, but useful cluster setting. You can set that no jobs are allowed on the all-purpose cluster. Or, vice versa, you can set up an all-purpose cluster that can be used only by jobs. Read more: - https://databrickster.medium.com/purpose-for-your-...
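The setting in question corresponds to the workload_type field on the Clusters API. A minimal sketch via REST, with host, token, and cluster fields as placeholders (the edit call must resend the cluster's required fields):

```python
# Sketch: allow only notebooks (no jobs) on an all-purpose cluster.
import requests

payload = {
    "cluster_id": "<cluster-id>",        # placeholder
    "cluster_name": "analysts-only",
    "spark_version": "<spark-version>",  # placeholder
    "node_type_id": "<node-type>",       # placeholder
    "num_workers": 2,
    # Flip the booleans for the reverse: a cluster usable only by jobs.
    "workload_type": {"clients": {"jobs": False, "notebooks": True}},
}
resp = requests.post(
    "https://<workspace-host>/api/2.1/clusters/edit",
    headers={"Authorization": "Bearer <token>"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
```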
- 238 Views
- 1 replies
- 1 kudos
How to Integrate Machine Learning Model Development with Databricks Workflows?
Hey everyone, I’m currently exploring machine learning model development and I’m interested in understanding how to effectively integrate ML workflows within Databricks. Specifically, I’d like to hear from the community about: How do you structure ML pi...
You can integrate machine learning model development into Databricks Workflows pretty smoothly using the platform’s native tools. The main idea is to treat your ML lifecycle (data prep → training → evaluation → deployment) as a series of tasks within...
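A minimal sketch of that task chain via the Jobs 2.1 API; notebook paths, cluster spec, and host/token are placeholders:

```python
# Sketch: one Databricks job with prep -> train -> evaluate dependencies.
import requests

job_spec = {
    "name": "ml-train-eval",
    "tasks": [
        {
            "task_key": "prep",
            "notebook_task": {"notebook_path": "/ML/01_prep"},
            "job_cluster_key": "ml",
        },
        {
            "task_key": "train",
            "depends_on": [{"task_key": "prep"}],
            "notebook_task": {"notebook_path": "/ML/02_train"},
            "job_cluster_key": "ml",
        },
        {
            "task_key": "evaluate",
            "depends_on": [{"task_key": "train"}],
            "notebook_task": {"notebook_path": "/ML/03_evaluate"},
            "job_cluster_key": "ml",
        },
    ],
    "job_clusters": [
        {
            "job_cluster_key": "ml",
            "new_cluster": {
                "spark_version": "<spark-version>",  # placeholder
                "node_type_id": "<node-type>",       # placeholder
                "num_workers": 2,
            },
        }
    ],
}
resp = requests.post(
    "https://<workspace-host>/api/2.1/jobs/create",
    headers={"Authorization": "Bearer <token>"},
    json=job_spec,
    timeout=30,
)
resp.raise_for_status()
print("job_id:", resp.json()["job_id"])
```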
- 5330 Views
- 1 replies
- 0 kudos
Support for managed identity-based authentication in Python Kafka client
We followed this document https://docs.databricks.com/aws/en/connect/streaming/kafka?language=Python#msk-aad to use the Kafka client to read events from our event hub for a feature. As part of the SFI, the guidance is to move away from client secret and u...
Currently, Databricks does not support using Managed Identities directly for Kafka client authentication (e.g., MSK IAM or Event Hubs Kafka endpoint) in Python Structured Streaming connections. However, there is a supported and secure alternative tha...
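Illustrative only, and worth verifying against the linked doc (the snippet above is truncated before naming the exact alternative): the supported path generally means configuring the Kafka client for OAuth (SASL/OAUTHBEARER) rather than a plain client secret. A sketch of the Structured Streaming read options, with all values as placeholders:

```python
# Sketch: stream from an Event Hubs Kafka endpoint with SASL/OAUTHBEARER.
# The exact jaas/callback option values must be taken from the Databricks
# Kafka streaming doc linked above; everything below is a placeholder.
df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "<topic>")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "OAUTHBEARER")
    .option(
        "kafka.sasl.oauthbearer.token.endpoint.url",
        "https://login.microsoft.com/<tenant-id>/oauth2/v2.0/token",
    )
    .option(
        "kafka.sasl.login.callback.handler.class",
        "kafkashaded.org.apache.kafka.common.security.oauthbearer."
        "secured.OAuthBearerLoginCallbackHandler",
    )
    .option(
        "kafka.sasl.jaas.config",
        "kafkashaded.org.apache.kafka.common.security.oauthbearer."
        "OAuthBearerLoginModule required clientId='<sp-client-id>' "
        "clientSecret='<sp-secret>' scope='<scope>';",
    )
    .load()
)
```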
- .CSV (1)
- Access Data (2)
- Access Databricks (1)
- Access Delta Tables (2)
- Account reset (1)
- ADF Pipeline (1)
- ADLS Gen2 With ABFSS (1)
- Advanced Data Engineering (1)
- AI (3)
- Analytics (1)
- Apache spark (1)
- Apache Spark 3.0 (1)
- Api Calls (1)
- API Documentation (3)
- App (1)
- Architecture (1)
- asset bundle (1)
- Asset Bundles (3)
- Auto-loader (1)
- Autoloader (4)
- AWS security token (1)
- AWSDatabricksCluster (1)
- Azure (6)
- Azure data disk (1)
- Azure databricks (15)
- Azure Databricks SQL (6)
- Azure databricks workspace (1)
- Azure Unity Catalog (6)
- Azure-databricks (1)
- AzureDatabricks (1)
- AzureDevopsRepo (1)
- Big Data Solutions (1)
- Billing (1)
- Billing and Cost Management (2)
- Blackduck (1)
- Bronze Layer (1)
- Certification (3)
- Certification Exam (1)
- Certification Voucher (3)
- CICDForDatabricksWorkflows (1)
- Cloud_files_state (1)
- CloudFiles (1)
- Cluster (3)
- Cluster Init Script (1)
- Comments (1)
- Community Edition (3)
- Community Event (1)
- Community Group (2)
- Community Members (1)
- Compute (3)
- Compute Instances (1)
- conditional tasks (1)
- Connection (1)
- Contest (1)
- Credentials (1)
- Custom Python (1)
- CustomLibrary (1)
- Data (1)
- Data + AI Summit (1)
- Data Engineer Associate (1)
- Data Engineering (3)
- Data Explorer (1)
- Data Ingestion & connectivity (1)
- Data Processing (1)
- Databrick add-on for Splunk (1)
- databricks (2)
- Databricks Academy (1)
- Databricks AI + Data Summit (1)
- Databricks Alerts (1)
- Databricks App (1)
- Databricks Assistant (1)
- Databricks Certification (1)
- Databricks Cluster (2)
- Databricks Clusters (1)
- Databricks Community (10)
- Databricks community edition (3)
- Databricks Community Edition Account (1)
- Databricks Community Rewards Store (3)
- Databricks connect (1)
- Databricks Dashboard (3)
- Databricks delta (2)
- Databricks Delta Table (2)
- Databricks Demo Center (1)
- Databricks Documentation (4)
- Databricks genAI associate (1)
- Databricks JDBC Driver (1)
- Databricks Job (1)
- Databricks Lakehouse Platform (6)
- Databricks Migration (1)
- Databricks Model (1)
- Databricks notebook (2)
- Databricks Notebooks (4)
- Databricks Platform (2)
- Databricks Pyspark (1)
- Databricks Python Notebook (1)
- Databricks Repo (1)
- Databricks Runtime (1)
- Databricks SQL (5)
- Databricks SQL Alerts (1)
- Databricks SQL Warehouse (1)
- Databricks Terraform (1)
- Databricks UI (1)
- Databricks Unity Catalog (4)
- Databricks Workflow (2)
- Databricks Workflows (2)
- Databricks workspace (3)
- Databricks-connect (1)
- databricks_cluster_policy (1)
- DatabricksJobCluster (1)
- DataCleanroom (1)
- DataDays (1)
- Datagrip (1)
- DataMasking (2)
- DataVersioning (1)
- dbdemos (2)
- DBFS (1)
- DBRuntime (1)
- DBSQL (1)
- DDL (1)
- Dear Community (1)
- deduplication (1)
- Delt Lake (1)
- Delta Live Pipeline (3)
- Delta Live Table (5)
- Delta Live Table Pipeline (5)
- Delta Live Table Pipelines (4)
- Delta Live Tables (7)
- Delta Sharing (2)
- deltaSharing (1)
- Deny assignment (1)
- Development (1)
- Devops (1)
- DLT (10)
- DLT Pipeline (7)
- DLT Pipelines (5)
- Dolly (1)
- Download files (1)
- Dynamic Variables (1)
- Engineering With Databricks (1)
- env (1)
- ETL Pipelines (1)
- External Sources (1)
- External Storage (2)
- FAQ for Databricks Learning Festival (2)
- Feature Store (2)
- Filenotfoundexception (1)
- Free trial (1)
- GCP Databricks (1)
- GenAI (1)
- Getting started (2)
- Google Bigquery (1)
- HIPAA (1)
- Hubert Dudek (4)
- import (1)
- Integration (1)
- JDBC Connections (1)
- JDBC Connector (1)
- Job Task (1)
- Learning (1)
- Lineage (1)
- LLM (1)
- Login (1)
- Login Account (1)
- Machine Learning (3)
- MachineLearning (1)
- Materialized Tables (2)
- Medallion Architecture (1)
- meetup (1)
- Metadata (1)
- Migration (1)
- ML Model (2)
- MlFlow (2)
- Model Training (1)
- Module (1)
- Monitoring (1)
- Networking (1)
- Notebook (1)
- Onboarding Trainings (1)
- OpenAI (1)
- Pandas udf (1)
- Permissions (1)
- personalcompute (1)
- Pipeline (2)
- Plotly (1)
- PostgresSQL (1)
- Pricing (1)
- Pyspark (1)
- Python (5)
- Python Code (1)
- Python Wheel (1)
- Quickstart (1)
- Read data (1)
- Repos Support (1)
- Reset (1)
- Rewards Store (2)
- Schedule (1)
- Serverless (3)
- serving endpoint (1)
- Session (1)
- Sign Up Issues (2)
- Software Development (1)
- Spark Connect (1)
- Spark scala (1)
- sparkui (2)
- Splunk (2)
- SQL (8)
- Summit23 (7)
- Support Tickets (1)
- Sydney (2)
- Table Download (1)
- Tags (3)
- terraform (1)
- Training (2)
- Troubleshooting (1)
- Unity Catalog (4)
- Unity Catalog Metastore (2)
- Update (1)
- user groups (1)
- Venicold (3)
- Voucher Not Recieved (1)
- Watermark (1)
- Weekly Documentation Update (1)
- Weekly Release Notes (2)
- Women (1)
- Workflow (2)
- Workspace (3)