- 2387 Views
- 1 reply
- 0 kudos
SQL Warehouse cluster is always running when configuring a Metabase connection
I encountered an issue while using the Metabase JDBC driver to connect to Databricks SQL Warehouse: I noticed that the SQL Warehouse cluster is always running and never stops automatically. Every few seconds, a SELECT 1 query log appears, which I sus...
- 0 kudos
I will try removing "preferredTestQuery" or adding "idleConnectionTestPeriod" to avoid continuously sending the SELECT 1 query.
- 2178 Views
- 0 replies
- 0 kudos
Is the Databricks Community Edition still available?
Hello, I am a professor of IT in the Price College of Business at The University of Oklahoma. In the past, I have used the Databricks Community Edition to demonstrate the principles of building and maintaining a data warehouse. This spring semester ...
- 1739 Views
- 2 replies
- 1 kudos
utils.add_libraries_to_model creates a duplicate model
Hello, when I call this function, mlflow.models.utils.add_libraries_to_model(MODEL_URI), it registers a new model in the Model Registry. Is it possible to do the same but without registering a new model? Thanks,
- 1 kudos
I ended up publishing the library to an AWS CodeArtifact repository. Now, how can I tell MLflow to use the AWS CodeArtifact private repository instead of PyPI?
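One way to approach that follow-up, sketched below on the assumption that MLflow's environment restoration ultimately shells out to pip: export a CodeArtifact index through PIP_INDEX_URL before the model environment is built. The domain, repository, account id, and region are placeholders.

```python
# Sketch: point pip at an AWS CodeArtifact index so that environment
# restoration (which calls pip) installs packages from the private repository.
# Domain, repository, account id, and region below are placeholders.
import os
import boto3

domain = "my-domain"            # hypothetical CodeArtifact domain
repository = "my-pypi-repo"     # hypothetical repository that mirrors PyPI
account_id = "123456789012"     # hypothetical AWS account id
region = "us-east-1"

codeartifact = boto3.client("codeartifact", region_name=region)
token = codeartifact.get_authorization_token(
    domain=domain, domainOwner=account_id
)["authorizationToken"]

# pip honors PIP_INDEX_URL, so any subsequent `pip install` in this
# environment resolves packages from CodeArtifact instead of pypi.org.
os.environ["PIP_INDEX_URL"] = (
    f"https://aws:{token}@{domain}-{account_id}.d.codeartifact."
    f"{region}.amazonaws.com/pypi/{repository}/simple/"
)
```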
- 3001 Views
- 2 replies
- 0 kudos
Auto Loader Use Case Question - Centralized Dropzone to Bronze?
Good day, I am trying to use Auto Loader (potentially extending into DLT in the future) to easily pull data coming from an external system (currently located in a single location) and organize and load it accordingly. I am struggling quite a bit a...
- 0 kudos
Quick follow-up on this @Retired_mod (or to anyone else in the Databricks multi-verse who is able to help clarify this case). I understand that the proposed solution would work for a "one-to-one" case where many files are landing in a specific dbfs pa...
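For reference, a minimal Auto Loader sketch of the drop-zone-to-Bronze pattern being discussed; the landing path, schema/checkpoint locations, and target table are placeholders, and it assumes a Databricks notebook where `spark` is already defined.

```python
# Sketch: Auto Loader streaming files from a centralized drop zone into a
# Bronze table. Paths and table name are placeholders.
from pyspark.sql import functions as F

bronze_stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")                        # format of the landed files
    .option("cloudFiles.schemaLocation", "/tmp/schemas/dropzone")
    .load("/mnt/landing/dropzone/")                              # centralized drop zone
    .withColumn("_ingested_at", F.current_timestamp())           # simple audit column
    .withColumn("_source_file", F.col("_metadata.file_path"))    # where the record came from
)

(
    bronze_stream.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/bronze_dropzone")
    .trigger(availableNow=True)                                  # incremental batch-style run
    .toTable("bronze.dropzone_raw")
)
```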
- 3241 Views
- 3 replies
- 0 kudos
Fail to write large dataframe
Hi all, we have an issue while trying to write a quite large data frame, close to 35 million records. We tried writing it as Parquet and also as a table, and neither works. But writing a small chunk (10k records) works. Basically we have some text on which ...
- 0 kudos
That could work, but you will have to create a UDF. Check this SO topic for more info.
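A rough illustration of the UDF suggestion above, assuming the heavy part is per-row text processing; `df`, the column names, the partition count, and the cleaning logic are placeholders.

```python
# Sketch: pushing the text processing into a pandas UDF and repartitioning
# before the write, so no single task holds too many of the ~35M rows.
import pandas as pd
from pyspark.sql import functions as F
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import StringType

@pandas_udf(StringType())
def clean_text(texts: pd.Series) -> pd.Series:
    # Placeholder transformation; swap in the real text logic here.
    return texts.str.strip().str.lower()

result = (
    df.repartition(200)                                   # spread rows across many tasks
      .withColumn("clean_body", clean_text(F.col("body")))
)

result.write.mode("overwrite").parquet("/mnt/output/large_df_cleaned")
```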
- 2051 Views
- 0 replies
- 2 kudos
Exciting Update! 🚀 Check out our fresh Community upgrades
Hello Community Members, We are thrilled to announce a fresh look for our Community platform! Get ready for an enhanced user experience with our brand-new UI changes. Our team has worked diligently to bring you a more intuitive and visually appealing...
- 4801 Views
- 2 replies
- 0 kudos
ClassCastException when attempting to time travel (databricks-connect)
Hi all, using databricks-connect 11.3.19, I get a "java.lang.ClassCastException" when attempting to time travel. The exact same statement works fine when executed in the Databricks GUI directly. Any ideas on what's going on? Is this a known limitation...
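For context, a minimal sketch of the kind of Delta time-travel statement in question, expressed through the DataFrame reader and through SQL; the table name, path, and version are placeholders.

```python
# Sketch: Delta time travel expressed a few equivalent ways.
# Table name, path, and version number are placeholders.
by_table = spark.read.option("versionAsOf", 5).table("main.sales.orders")

by_path = (
    spark.read.format("delta")
    .option("versionAsOf", 5)        # or .option("timestampAsOf", "2024-01-01")
    .load("/mnt/delta/orders")
)

by_sql = spark.sql("SELECT * FROM main.sales.orders VERSION AS OF 5")
```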
- 3802 Views
- 2 replies
- 0 kudos
Resolved! Spark read with format as "delta" isn't working with Java multithreading
I have a Spark application (using the Java library) which needs to replicate data from one blob storage to another. I have created a readStream() within it which is listening continuously to a Kafka topic for incoming events. The corresponding writeStre...
- 0 kudos
The problem was indeed with the way the ClassLoader was being set in the ForkJoinPool (common pool) thread. Spark in SparkClassUtils uses Thread.currentThread().getContextClassLoader, which might behave differently in another thread. To solve it I cre...
- 6539 Views
- 1 reply
- 0 kudos
Webassessor Secure Browser will not Launch during exam.
Hello - I registered for the Databricks Data Engineering Associate Certification exam. I hit an issue: their secure browser would not launch, it just crashed - the only thing I could see in a flash was "bad request" and poof, it's gone. I spent over 2 h...
- 1427 Views
- 0 replies
- 0 kudos
Execute delete query from notebook on azure synapse
Hello everyone, is there a way we can execute a delete query from an Azure notebook on an Azure Synapse database? I tried using the read API method with the "query" option, but I am getting an error that the JDBC connector is not able to handle the code. Can anyone suggest how we can de...
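One common workaround, sketched below, is to bypass the Spark reader (whose "query" option only handles statements that return rows) and run the DELETE over a plain JDBC connection. The JDBC URL, credentials, and table are placeholders, and the snippet relies on PySpark's internal py4j gateway plus a Synapse-compatible (SQL Server) JDBC driver being available on the cluster.

```python
# Sketch: running a DELETE against Azure Synapse over a raw JDBC connection.
# URL, credentials, and table name are placeholders.
jvm = spark.sparkContext._gateway.jvm   # internal py4j handle to the JVM

conn = jvm.java.sql.DriverManager.getConnection(
    "jdbc:sqlserver://myworkspace.sql.azuresynapse.net:1433;database=mydb",
    "sql_user",        # placeholder user
    "sql_password",    # placeholder password
)
try:
    stmt = conn.createStatement()
    deleted = stmt.executeUpdate(
        "DELETE FROM dbo.staging_orders WHERE load_date < '2024-01-01'"
    )
    print(f"Deleted {deleted} rows")
    stmt.close()
finally:
    conn.close()
```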
- 9857 Views
- 2 replies
- 2 kudos
Python file testing using pytest
Hi All, I have a requirement in my project where we will be writing some Python code inside Databricks. Please note we will not be using PySpark; it will be plain Python with Polars. I am looking into how to create test files for the main file. Below is sim...
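A minimal sketch of what such a test setup could look like with plain Python, Polars, and pytest; the module, function, and column names are hypothetical.

```python
# Sketch of a plain-Python + Polars function and its pytest test. In practice
# the function would live in its own module (e.g. transform.py) and the test
# in test_transform.py, run with `pytest`; names here are hypothetical.
import polars as pl
from polars.testing import assert_frame_equal


def add_total(df: pl.DataFrame) -> pl.DataFrame:
    """Add a 'total' column as quantity * unit_price."""
    return df.with_columns(
        (pl.col("quantity") * pl.col("unit_price")).alias("total")
    )


def test_add_total():
    df = pl.DataFrame({"quantity": [2, 3], "unit_price": [5.0, 1.5]})
    result = add_total(df)
    expected = df.with_columns(pl.Series("total", [10.0, 4.5]))
    assert_frame_equal(result, expected)
```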
- 11746 Views
- 1 reply
- 5 kudos
New how-to guide to data warehousing with the Data Intelligence Platform
Just launched: The Big Book of Data Warehousing and BI, a new hands-on guide focused on real-world use cases from governance, transformation, analytics and AI. As the demand for data becomes insatiable in every company, the data infrastructure has bec...
- 5 kudos
lol beans It used to take me a long time to regain my equilibrium, but recently I learned that a website really leads this layout when you may find delight after a stressful day here. Since then, I've been able to find my equilibrium much more quickl...
- 3420 Views
- 1 reply
- 0 kudos
Asset bundle build and deploy python wheel with versions
Hi all, I was able to deploy a wheel to the /Shared/ folder from a repository in GitLab with asset bundles. The databricks.yml looks something like this: artifacts: default: type: whl build: poetry build path: . targets: workspace: h...
- 0 kudos
Finally I decided to use AWS CodeArtifact and mirror PyPI, which I think is a bit cleaner. But your solution looks good too. Thanks!
- 1871 Views
- 1 reply
- 2 kudos
Create new notebooks with code
Is it possible to create new notebooks from a notebook in Databricks? I have tried this code, but all of them are generic files, not notebooks. notebook_str = """# Databricks notebook source import pyspark.sql.functions as F import numpy as np # CO...
- 2 kudos
Unfortunately %run does not help me since I can't %run a .py file; I still need my code in notebooks. I am transpiling proprietary code to Python using Jinja templates. I would like to have the output as notebooks since those are most convenient to edit...
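One approach that produces real notebooks rather than generic files is the Workspace Import REST API with format=SOURCE; a minimal sketch follows, with the workspace URL, token, and target path as placeholders.

```python
# Sketch: creating a notebook programmatically through the Workspace Import
# REST API (POST /api/2.0/workspace/import). Host, token, and path are placeholders.
import base64
import requests

host = "https://<your-workspace>.cloud.databricks.com"   # placeholder workspace URL
token = "<personal-access-token>"                         # placeholder token

notebook_str = """# Databricks notebook source
import pyspark.sql.functions as F
import numpy as np

# COMMAND ----------
print("hello from a generated notebook")
"""

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "/Shared/generated/my_notebook",          # placeholder target path
        "format": "SOURCE",                               # import as notebook source
        "language": "PYTHON",
        "overwrite": True,
        "content": base64.b64encode(notebook_str.encode()).decode(),
    },
)
resp.raise_for_status()
```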
- 2730 Views
- 1 reply
- 0 kudos
Resolved! DLT Pipeline Graph is not detecting dependencies
Hi, this is my first Databricks project. I am loading data from a UC external volume in ADLS into tables and then splitting one of the tables into two tables based on a column. When I create a pipeline, the tables don't have any dependencies and this is...
- 0 kudos
While re-implementing my pipeline to publish to dev/test/prod instead of bronze/silver/gold, I think I found the answer. The downstream tables need to use the LIVE schema.
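To make that answer concrete, a small DLT sketch (table, path, and column names are placeholders) in which the downstream tables reference the upstream one via dlt.read / the LIVE schema, which is what lets the pipeline graph detect the dependency.

```python
# Sketch: in DLT, the pipeline graph only picks up a dependency when the
# downstream table reads the upstream one via dlt.read()/LIVE rather than a
# fully qualified published name. Names below are placeholders.
import dlt
from pyspark.sql import functions as F


@dlt.table(name="orders_raw")
def orders_raw():
    return spark.read.format("json").load("/Volumes/main/landing/orders/")


@dlt.table(name="orders_domestic")
def orders_domestic():
    # Equivalent to spark.table("LIVE.orders_raw"); this reference creates the graph edge.
    return dlt.read("orders_raw").where(F.col("country") == "US")


@dlt.table(name="orders_international")
def orders_international():
    return dlt.read("orders_raw").where(F.col("country") != "US")
```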
Join Us as a Local Community Builder!
Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!
Sign Up Now

Labels: .CSV (1), Access Data (2), Access Databricks (3), Access Delta Tables (2), Account reset (1), ADF Pipeline (1), ADLS Gen2 With ABFSS (1), Advanced Data Engineering (2), AI (4), Analytics (1), Apache spark (1), Apache Spark 3.0 (1), Api Calls (1), API Documentation (3), App (2), Application (1), Architecture (1), asset bundle (1), Asset Bundles (3), Auto-loader (1), Autoloader (4), Aws databricks (1), AWS security token (1), AWSDatabricksCluster (1), Azure (6), Azure data disk (1), Azure databricks (15), Azure Databricks SQL (6), Azure databricks workspace (1), Azure Unity Catalog (6), Azure-databricks (1), AzureDatabricks (1), AzureDevopsRepo (1), Big Data Solutions (1), Billing (1), Billing and Cost Management (2), Blackduck (1), Bronze Layer (1), Certification (3), Certification Exam (1), Certification Voucher (3), CICDForDatabricksWorkflows (1), Cloud_files_state (1), CloudFiles (1), Cluster (3), Cluster Init Script (1), Comments (1), Community Edition (3), Community Event (1), Community Group (2), Community Members (1), Compute (3), Compute Instances (1), conditional tasks (1), Connection (1), Contest (1), Credentials (1), Custom Python (1), CustomLibrary (1), Data (1), Data + AI Summit (1), Data Engineer Associate (1), Data Engineering (3), Data Explorer (1), Data Ingestion & connectivity (1), Data Processing (1), Databrick add-on for Splunk (1), databricks (2), Databricks Academy (1), Databricks AI + Data Summit (1), Databricks Alerts (1), Databricks App (1), Databricks Assistant (1), Databricks Certification (1), Databricks Cluster (2), Databricks Clusters (1), Databricks Community (10), Databricks community edition (3), Databricks Community Edition Account (1), Databricks Community Rewards Store (3), Databricks connect (1), Databricks Dashboard (3), Databricks delta (2), Databricks Delta Table (2), Databricks Demo Center (1), Databricks Documentation (4), Databricks genAI associate (1), Databricks JDBC Driver (1), Databricks Job (1), Databricks Lakehouse Platform (6), Databricks Migration (1), Databricks Model (1), Databricks notebook (2), Databricks Notebooks (4), Databricks Platform (2), Databricks Pyspark (1), Databricks Python Notebook (1), Databricks Repo (1), Databricks Runtime (1), Databricks SQL (5), Databricks SQL Alerts (1), Databricks SQL Warehouse (1), Databricks Terraform (1), Databricks UI (1), Databricks Unity Catalog (4), Databricks Workflow (2), Databricks Workflows (2), Databricks workspace (3), Databricks-connect (1), databricks_cluster_policy (1), DatabricksJobCluster (1), DataCleanroom (1), DataDays (1), Datagrip (1), DataMasking (2), DataVersioning (1), dbdemos (2), DBFS (1), DBRuntime (1), DBSQL (1), DDL (1), Dear Community (1), deduplication (1), Delt Lake (1), Delta Live Pipeline (3), Delta Live Table (5), Delta Live Table Pipeline (5), Delta Live Table Pipelines (4), Delta Live Tables (7), Delta Sharing (2), deltaSharing (1), Deny assignment (1), Development (1), Devops (1), DLT (10), DLT Pipeline (7), DLT Pipelines (5), Dolly (1), Download files (1), Dynamic Variables (1), Engineering With Databricks (1), env (1), ETL Pipelines (1), External Sources (1), External Storage (2), FAQ for Databricks Learning Festival (2), Feature Store (2), Filenotfoundexception (1), Free Edition (1), Free trial (1), GCP Databricks (1), GenAI (1), Getting started (2), Google Bigquery (1), HIPAA (1), Hubert Dudek (2), import (2), Integration (1), JDBC Connections (1), JDBC Connector (1), Job Task (1), JSON Object (1), Learning (1), Lineage (1), LLM (1), Login (1), Login Account (1), Machine Learning (3), MachineLearning (1), Materialized Tables (2), Medallion Architecture (1), meetup (2), Metadata (1), Migration (1), ML Model (2), MlFlow (2), Model Training (1), Module (1), Monitoring (1), Networking (2), Notebook (1), Onboarding Trainings (1), OpenAI (1), Pandas udf (1), Permissions (1), personalcompute (1), Pipeline (2), Plotly (1), PostgresSQL (1), Pricing (1), Pyspark (1), Python (5), Python Code (1), Python Wheel (1), Quickstart (1), Read data (1), Repos Support (1), Reset (1), Rewards Store (2), Sant (1), Schedule (1), Serverless (3), serving endpoint (1), Session (1), Sign Up Issues (2), Software Development (1), Spark Connect (1), Spark scala (1), sparkui (2), Speakers (1), Splunk (2), SQL (8), Summit23 (7), Support Tickets (1), Sydney (2), Table Download (1), Tags (3), terraform (1), Training (2), Troubleshooting (1), Unity Catalog (4), Unity Catalog Metastore (2), Update (1), user groups (1), Venicold (3), Vnet (1), Voucher Not Recieved (1), Watermark (1), Weekly Documentation Update (1), Weekly Release Notes (2), Women (1), Workflow (2), Workspace (3)