- 1249 Views
- 3 replies
- 1 kudos
Distributed SparkXGBRanker training: failed barrier ResultStage
I'm following a variation of the tutorial [here](https://assets.docs.databricks.com/_extras/notebooks/source/xgboost-pyspark-new.html) to train a `SparkXGBRanker` in distributed mode. However, the line `pipeline_model = pipeline.fit(data)` is throwing...
You have already mentioned that you turned off autoscaling; please try setting num_workers too. Step 1: Disable dynamic resource allocation: use spark.dynamicAllocation.enabled = false. Step 2: Configure num_workers to match the fixed resources. After disabling dy...
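A minimal sketch of how those two steps could fit together, assuming a fixed four-worker cluster; the feature and query columns below are placeholders, not the tutorial's actual schema:

```python
# Sketch only: assumes a fixed-size cluster (no autoscaling) with 4 workers.
# Set on the cluster's Spark config (not at runtime):
#   spark.dynamicAllocation.enabled false
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from xgboost.spark import SparkXGBRanker

assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")

# num_workers should not exceed what the fixed cluster can actually provide,
# because barrier-mode training requires all tasks to launch at once.
ranker = SparkXGBRanker(
    features_col="features",
    label_col="label",
    qid_col="query_id",
    num_workers=4,
)

pipeline = Pipeline(stages=[assembler, ranker])
pipeline_model = pipeline.fit(data)  # `data` is the training DataFrame from the tutorial
```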
- 751 Views
- 1 replies
- 0 kudos
Lakehouse monitoring generates broken queries
Hi everyone, I’m setting up Databricks Lakehouse Monitoring to track my model’s performance using an inference-regression monitor. I’ve completed all the required configuration and successfully launched my first monitoring run. The quality tables are g...
Hi @the_p_l, I want to confirm that I understand your situation correctly. You mentioned that you are not adding any custom code to the deployed Lakehouse Monitoring setup, and you believe the issue is related to the inline comments generated during ...
- 483 Views
- 3 replies
- 2 kudos
Unable to register Scikit-learn or XGBoost model to unity catalog
Hello, I'm following the tutorial provided here https://docs.databricks.com/aws/en/notebooks/source/mlflow/mlflow-classic-ml-e2e-mlflow-3.html for the ML model management process using MLflow, in a Unity Catalog-enabled workspace, however I'm facing an ...
Maybe add the missing: `mlflow.set_tracking_uri("databricks")` and `mlflow.set_registry_uri("databricks")`
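For a Unity Catalog-enabled workspace, the registry URI is typically `databricks-uc` rather than the workspace registry; a minimal sketch of registering a scikit-learn model under a three-level UC name (the catalog/schema/model name below is a placeholder):

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

mlflow.set_tracking_uri("databricks")
mlflow.set_registry_uri("databricks-uc")  # Unity Catalog model registry

X, y = make_classification(n_samples=200, n_features=5, random_state=42)
model = RandomForestClassifier().fit(X, y)

with mlflow.start_run():
    mlflow.sklearn.log_model(
        model,
        "model",
        input_example=X[:5],  # lets MLflow infer the signature UC requires
        registered_model_name="main.default.rf_demo",  # placeholder <catalog>.<schema>.<model>
    )
```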
- 505 Views
- 3 replies
- 1 kudos
Endpoint deployment is very slow
Hi team, I am testing some changes in a UAT/DEV environment and noticed that the model endpoints are very slow to deploy. Since the environment is just for testing and not serving any production traffic, I was wondering if there was a way to expedite this ...
Hi @WiliamRosa, thanks for your response on this. I have been using the settings you described above, with the exception of `scale_to_zero`. PFA a screenshot of the endpoint settings. My deployment is a simple PyTorch deep learning model wrapped in a `s...
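For reference, a sketch of how those settings map onto the serving API when the endpoint is created programmatically via the MLflow deployments client; the endpoint and model names below are placeholders:

```python
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

client.create_endpoint(
    name="uat-torch-endpoint",  # placeholder endpoint name
    config={
        "served_entities": [
            {
                "entity_name": "main.default.torch_model",  # placeholder UC model name
                "entity_version": "1",
                "workload_size": "Small",       # smallest workload for a test environment
                "scale_to_zero_enabled": True,  # idle endpoints scale down when unused
            }
        ]
    },
)
```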
- 824 Views
- 4 replies
- 4 kudos
Resolved! Distributed Optuna and MLflow
Hello all, I just tried running the following notebook (https://docs.databricks.com/aws/en/notebooks/source/machine-learning/optuna-mlflow.html) on the Databricks Free Edition platform, through Microsoft account authentication. It takes 15 minutes ...
Great, thank you. That worked. I still need more compute and networking resources to make it justifiable, but this confirms that it works!
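For anyone reproducing the pattern on smaller compute, a minimal single-node sketch of Optuna trials logged to MLflow as nested runs; the objective and search space below are illustrative, not the notebook's:

```python
import mlflow
import optuna


def objective(trial):
    # Illustrative search space, not the notebook's actual model.
    x = trial.suggest_float("x", -10, 10)
    loss = (x - 2) ** 2
    with mlflow.start_run(nested=True):
        mlflow.log_param("x", x)
        mlflow.log_metric("loss", loss)
    return loss


with mlflow.start_run(run_name="optuna_study"):
    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=20)
    mlflow.log_metric("best_loss", study.best_value)
```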
- 604 Views
- 1 replies
- 1 kudos
[ERROR] Worker (pid:11) was sent code 132 When deploying a Custom Model in serving
Hi, I've been developing a custom model with mlflow.pyfunc.PythonModel. Among other libs, I use ANNOY. While trying to serve the model as an endpoint in "serving", after a few fixes my model worked fine, as did the endpoint call. Although I tried updat...
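For context, a sketch of how an ANNOY-backed pyfunc model can be packaged with pinned dependencies (the index path, dimensionality, and versions below are placeholders); pinning the native wheel matters here because exit code 132 corresponds to SIGILL, which often comes from a binary built for CPU instructions the serving container lacks:

```python
import mlflow
from annoy import AnnoyIndex


class AnnoyModel(mlflow.pyfunc.PythonModel):
    def load_context(self, context):
        # Placeholder: a 64-dimensional angular index logged as an artifact.
        self.index = AnnoyIndex(64, "angular")
        self.index.load(context.artifacts["annoy_index"])

    def predict(self, context, model_input):
        # Expects one embedding vector per row; returns nearest-neighbour ids.
        return [self.index.get_nns_by_vector(row, 10) for row in model_input.values.tolist()]


with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="model",
        python_model=AnnoyModel(),
        artifacts={"annoy_index": "/dbfs/tmp/index.ann"},  # placeholder path
        # Pin the native dependency so the serving image builds it consistently.
        pip_requirements=["annoy==1.17.3", "mlflow"],
    )
```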
Great observation! The difference between `Using worker: sync` and `Using worker: gevent` typically refers to the worker class used by Gunicorn, the web server behind many MLflow model deployments (like in Databricks model serving or other MLflow-compati...
- 1344 Views
- 2 replies
- 3 kudos
Resolved! Serving Endpoint: Container image creation
Hi Team, whenever I try to create an endpoint from a model in Databricks, the process often gets stuck at the 'Container Image Creation' step. I've tried to understand what happens during this step, but couldn't find any detailed or helpful information...
Thank you @Vidhi_Khaitan for sharing the detailed process.
- 2736 Views
- 5 replies
- 3 kudos
Resolved! This API is disabled for users without the databricks-sql-access
Running a deploy on GitHub:
Run databricks bundle deploy
  databricks bundle deploy
  shell: /usr/bin/bash -e {0}
  env:
    DATABRICKS_HOST: {{HOST}}
    DATABRICKS_CLIENT_ID: {{ID}}
    DATABRICKS_CLIENT_SECRET: ***
    DATABRICKS_BUNDLE_ENV: prod
Error: This API is disabled for...
Got it working. Yes, it was a little confusing at first: the workspace shown at the top right is the account information, whereas the profile icon is where you can access the workspace settings. For anyone who got as confused as I did. Thank...
- 907 Views
- 1 replies
- 1 kudos
Resolved! Model Inferencing
Any links or pointers on hosting a model in real time (similar to a SageMaker endpoint in AWS)? How can we host a model in DBX in real time? Any documentation, please?
@Sachin_Amin you can find an example in our docs here: https://docs.databricks.com/aws/en/machine-learning/model-serving/model-serving-intro We also have free training courses on realtime model deployment for both classical ML (https://www.databricks...
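Once an endpoint from that intro guide is running, a minimal sketch of querying it over REST; the workspace URL, endpoint name, token, and input columns below are placeholders, and the token should normally come from a secret scope:

```python
import requests

DATABRICKS_HOST = "https://<workspace-url>"   # placeholder
ENDPOINT_NAME = "my-model-endpoint"           # placeholder
TOKEN = "<personal-access-token>"             # use a secret scope in practice

response = requests.post(
    f"{DATABRICKS_HOST}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"dataframe_records": [{"feature_a": 1.0, "feature_b": 2.0}]},  # placeholder schema
)
response.raise_for_status()
print(response.json())
```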
- 3220 Views
- 2 replies
- 2 kudos
Workflow not picking up correct host value (while working with MLflow model registry URI)
Exception: mlflow.exceptions.MlflowException: An API request to https://canada.cloud.databricks.com/api/2.0/mlflow/model-versions/list-artifacts failed due to a timeout. The error message was: HTTPSConnectionPool(host='canada.cloud.databricks.com', p...
Thanks for the answer. I will try this solution
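One way to keep a workflow from resolving the wrong host is to point MLflow at the intended workspace explicitly; a sketch, with placeholder host, token, and model names, assuming the credentials come from job parameters or a secret scope:

```python
import os
import mlflow

# Placeholders: in a job, pull these from a secret scope or job parameters.
os.environ["DATABRICKS_HOST"] = "https://<correct-workspace>.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = "<token>"

# Force both tracking and registry to resolve against that workspace.
mlflow.set_tracking_uri("databricks")
mlflow.set_registry_uri("databricks")

client = mlflow.MlflowClient()
print(client.get_registered_model("<model-name>"))  # placeholder model name
```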
- 1685 Views
- 2 replies
- 0 kudos
Model Serving Endpoint: Cuda-OOM for Custom Model
Hello all, I am tasked with evaluating a new LLM for some use cases. In particular, I need to build a POC for a chatbot based on that model. To that end, I want to create a custom serving endpoint for an LLM pulled from Hugging Face. The model itself is...
Here are some suggestions: 1. Update conda.yaml. Replace the current config with this optimized version:
channels:
  - conda-forge
dependencies:
  - python=3.10  # 3.12 may cause compatibility issues
  - pip
  - pip:
    - mlflow==2.21.3
    - torch...
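On the CUDA-OOM side, a common mitigation when loading the Hugging Face model inside the serving container is to load the weights in half precision and let `accelerate` place them across the available GPU memory; a sketch with a placeholder model id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "<org>/<model-name>"  # placeholder Hugging Face model id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # halves the memory footprint vs. float32
    device_map="auto",          # requires `accelerate`; spreads layers across devices
    low_cpu_mem_usage=True,
)
```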
- 890 Views
- 1 replies
- 0 kudos
Not able to run end to end ML project on Databricks Trial
I started using the Databricks trial version today. I want to explore the full end-to-end ML lifecycle on Databricks. I observed that only the 'serverless' compute option is available. I was trying to execute the notebook posted on https://docs.datab...
It can take up to 15 minutes for the serving endpoint to be created. Once you initiate the "create endpoint" chunk of code, go and grab a cup of coffee and wait 15 minutes. Then, before you use it, verify it is running (bottom-left menu "Serving") by g...
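That readiness check can also be done programmatically; a sketch using the Databricks SDK, with a placeholder endpoint name (attribute names may differ slightly between SDK versions):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up workspace auth automatically in a notebook

endpoint = w.serving_endpoints.get(name="my-endpoint")  # placeholder name
print(endpoint.state.ready)          # READY once the endpoint is usable
print(endpoint.state.config_update)  # NOT_UPDATING once the config has finished rolling out
```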