
XGBoost Feature Weighting

sjohnston2
New Contributor

We are trying to train a predictive ML model using the XGBoost classifier. One of the requirements from our business team is to implement feature weighting, as they have determined that certain features matter more than others. The dataset has 69 features.

We are trying to fit the model with these parameters:

model.fit(
    X_train,
    y_train,
    classifier__feature_weights=feature_weights,
    classifier__early_stopping_rounds=5,
    classifier__verbose=False,
    classifier__eval_set=[(X_val_processed, y_val_processed)],
)
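
For context, the classifier__ prefix indicates the fit parameters are being routed through a scikit-learn Pipeline step named "classifier". A minimal sketch of what such a setup might look like (the preprocessing step and its name here are assumptions, not from the original post):

from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from xgboost import XGBClassifier

# Hypothetical pipeline: fit parameters prefixed with "classifier__"
# are forwarded to the XGBClassifier step's fit() method.
model = Pipeline([
    ("scaler", StandardScaler()),    # assumed preprocessing step
    ("classifier", XGBClassifier()),
])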
 
For testing, feature_weights is set as follows:
feature_weights = np.zeros(X_train.shape[1])
feature_weights[:10] = 2.0
 
When running this, we are getting the following error:
The Python process exited with exit code 139 (SIGSEGV: Segmentation fault).
 
However, when we set feature_weights as follows, we don't get an error:
feature_weights = np.zeros(X_train.shape[1])
feature_weights[:5] = 1.0
 
Do you have any insight or advice on this error and how we can fix it going forward? Our research suggests a memory issue, but the cluster metrics show only 90 GB of 220 GB in use.
 
1 ACCEPTED SOLUTION


Walter_C
Databricks Employee

Hello @sjohnston2, here is some information I found internally:

Possible Causes

  1. Memory Access Issue: The segmentation fault suggests the program is trying to access memory it is not allowed to, which could be caused by an internal bug in XGBoost when processing certain feature-weight configurations.
  2. XGBoost Version: This could be a bug in the specific version of XGBoost you're using. Feature weights were added in version 1.3.0, so ensure you're on a recent, stable release (see the version check after this list).
  3. Incompatible Feature Weights: The error occurs with certain feature-weight configurations but not others, indicating that the issue might be related to how XGBoost handles specific weight patterns.
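
As a quick check on cause 2, you can print the installed XGBoost version in a notebook cell (feature_weights requires XGBoost 1.3.0 or newer):

import xgboost

# feature_weights support was added in XGBoost 1.3.0
print(xgboost.__version__)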

Try modifying your feature weights to avoid the configuration that causes the error. One plausible factor (worth verifying against the docs for your version): feature_weights defines each feature's probability of being selected during column sampling, so a weight vector that is mostly zeros is a degenerate input that some versions may not handle gracefully. For example:

feature_weights = np.ones(X_train.shape[1])  # Start with all weights set to 1
feature_weights[:10] = 2.0  # Increase weights for the first 10 features


2 REPLIES


sjohnston2
New Contributor

Thanks for the response, Walter!

It seemed like the XGBoost version was what was causing the issue. Upgrading the version and rerunning our previous tests worked perfectly. Thank you so much for the help, and have a wonderful holiday!
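
For anyone who hits the same symptom: in a Databricks notebook, a minimal sketch for upgrading the notebook-scoped XGBoost library (pin whichever version suits your runtime) is:

%pip install --upgrade xgboost

# In a separate cell: restart the Python process so the upgraded
# package is picked up by subsequent imports
dbutils.library.restartPython()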
