the Random Forest classifies everything to be 1 #68
Comments
@AlanSpencer2 Hi, I used the classifier with the same parameters to fit the …
Hi, the problem occurs with binary classification, i.e. when the target variable is 0 or 1 (True or False). Can you please try a binary classification problem? (Not regression, and not multi-class classification.) Here is the Iris dataset with 3 different flower types. The target/label variable is 1 if the flower is Setosa, and 0 for the other 2 flower types:
df = pd.read_csv(r'C:\Python\iris_data.csv', encoding='ISO-8859-1', low_memory=False, index_col=0)
P.S. I have tried all kinds of different datasets. They all had the same issue.
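A self-contained way to reproduce the binary Setosa-vs-rest setup described above, without depending on a local CSV path, is to load Iris from scikit-learn. This sketch uses sklearn's RandomForestClassifier as a baseline stand-in (not ThunderGBM) to show what a healthy random forest predicts on this task:

```python
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier  # baseline, not TGBMClassifier
from sklearn.model_selection import train_test_split

# Load Iris; build the binary target: 1 = Setosa, 0 = the other 2 species
iris = load_iris()
X = pd.DataFrame(iris.data, columns=iris.feature_names)
y = (iris.target == 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, max_depth=6, random_state=0)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

# A working classifier should emit both classes on this separable task
print(sorted(set(y_pred)))  # → [0, 1]
```

Swapping in TGBMClassifier on the same `X_train`/`y_train` should make it easy to compare whether only ThunderGBM collapses everything to one class.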
I am new to thundergbm, and am just trying to get a simple Random Forest classifier going. But the classifier classifies every single sample as 1. Not one single case out of 188244 samples is classified as 0. No other classifier behaves like this. I also tried different numbers of trees, depths, etc., but it still classifies everything as 1. Is there something wrong with the following code?
from thundergbm import TGBMClassifier
clf = TGBMClassifier(depth=6, n_trees=1, n_parallel_trees=100, bagging=1)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)
# y_pred classifies everything in the test set (X_test) as 1.
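A quick way to confirm that the predictions really collapse to a single class is to count the predicted labels with NumPy. The sketch below uses a dummy all-ones array standing in for the `y_pred` returned by `clf.predict(X_test)`:

```python
import numpy as np

# Dummy stand-in for the y_pred returned by clf.predict(X_test)
y_pred = np.ones(188244, dtype=int)

# Count how often each predicted class occurs
labels, counts = np.unique(y_pred, return_counts=True)
print(dict(zip(labels.tolist(), counts.tolist())))  # → {1: 188244}
```

If the printed dictionary has a single key, every sample was assigned the same class, which matches the behaviour reported above.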