I have an unbalanced textual dataset; this is how it looks:
label | texts (documents)
------|------------------
5     | 1190
4     | 839
3     | 239
1     | 204
2     | 127
I tried to use the fit(X, y[, sample_weight]) signature, but I did not understand from the documentation what it expects. I tried the following:
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import balance_weights
classifier = RandomForestClassifier(n_estimators=10, criterion='entropy')
classifier.fit(X_train, y_train, sample_weight=balance_weights(y))
prediction = classifier.predict(X_test)
But I get this exception:
/usr/local/lib/python2.7/site-packages/sklearn/utils/__init__.py:93: DeprecationWarning: Function balance_weights is deprecated; balance_weights is an internal function and will be removed in 0.16
warnings.warn(msg, category=DeprecationWarning)
Traceback (most recent call last):
File "/Users/user/RF_classification.py", line 34, in <module>
classifier.fit(X_train, y_train,sample_weight = balance_weights(y))
File "/usr/local/lib/python2.7/site-packages/sklearn/ensemble/forest.py", line 279, in fit
for i in range(n_jobs))
File "/usr/local/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py", line 653, in __call__
self.dispatch(function, args, kwargs)
File "/usr/local/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py", line 400, in dispatch
job = ImmediateApply(func, args, kwargs)
File "/usr/local/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py", line 138, in __init__
self.results = func(*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/sklearn/ensemble/forest.py", line 85, in _parallel_build_trees
curr_sample_weight *= sample_counts
ValueError: operands could not be broadcast together with shapes (2599,) (1741,) (2599,)
How can I balance this estimator for this unbalanced data?
Update to 0.16-dev. Random forests now support class_weight="auto", which rebalances the classes automatically for you. (As a side note, your ValueError comes from passing balance_weights(y) computed on the full label array, which has a different length than y_train; sample weights must be computed on the same labels you fit on.)
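A minimal sketch of the class_weight approach on synthetic data (in current scikit-learn versions the option is spelled "balanced" rather than "auto"; the dataset below is a stand-in for your text features, not your actual data):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy imbalanced dataset standing in for the vectorized documents
X, y = make_classification(n_samples=2599, n_classes=3, n_informative=5,
                           weights=[0.6, 0.3, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# class_weight='balanced' weights each class inversely to its frequency,
# so the forest no longer favors the majority classes
classifier = RandomForestClassifier(n_estimators=10, criterion='entropy',
                                    class_weight='balanced', random_state=0)
classifier.fit(X_train, y_train)
prediction = classifier.predict(X_test)
```

If you prefer per-sample weights instead, sklearn.utils.class_weight.compute_sample_weight('balanced', y_train) is the public replacement for the internal balance_weights; note it must be given y_train, not the full y.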