12-18-2024 01:27 PM
We are trying to train a predictive ML model using the XGBoost classifier. Part of the requirements we received from our business team is to implement feature weighting, as they have determined that certain features matter more than others. Our dataset has 69 features.
We are trying to fit the model with these parameters:
Accepted Solutions
12-18-2024 01:48 PM
Hello @sjohnston2, here is some information I found internally:
Possible Causes
- Memory Access Issue: The segmentation fault suggests that the program is trying to access memory it is not allowed to, which could be caused by an internal bug in XGBoost when processing certain feature-weight configurations.
- XGBoost Version: This could be a bug in the specific version of XGBoost you're using. Feature weights were added in version 1.3.0, so ensure you're on a recent, stable release.
- Incompatible Feature Weights: The error occurs with certain feature-weight configurations but not others, indicating that the issue might be related to how XGBoost handles specific weight patterns.
Try modifying your feature weights to avoid the configuration that causes the error. For example:
import numpy as np

feature_weights = np.ones(X_train.shape[1])  # start with all weights set to 1
feature_weights[:10] = 2.0  # increase weights for the first 10 features
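Since the dataset has 69 features, it can also help to sanity-check the weight vector before passing it to XGBoost, which rules out shape mismatches and invalid values as the source of the crash. A NumPy-only sketch (the weight values are illustrative):

```python
import numpy as np

N_FEATURES = 69  # number of features in the dataset

# Business-defined priorities: default weight 1.0, heavier weight on key features.
feature_weights = np.ones(N_FEATURES)
feature_weights[:10] = 2.0  # hypothetical: first 10 features matter more

# Sanity checks before handing the vector to XGBoost:
assert feature_weights.shape == (N_FEATURES,), "need exactly one weight per feature"
assert np.all(np.isfinite(feature_weights)), "weights must be finite"
assert np.all(feature_weights > 0), "weights must be positive"
```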
12-19-2024 07:35 AM
Thanks for the response, Walter!
It turned out the XGBoost version was what was causing the issue. Upgrading the version and rerunning our previous tests worked perfectly. Thank you so much for the help, and have a wonderful holiday!

