
Hunga-Bunga

Brute-force all scikit-learn models and all scikit-learn parameters with fit and predict.


Let's brute force all sklearn models with all of sklearn's parameters! Ahhh, Hunga Bunga!!

from hunga_bunga import HungaBungaClassifier, HungaBungaRegressor

And then simply:

clf = HungaBungaClassifier()
clf.fit(x, y)
clf.predict(x)

What?

Yes.

No! Really! What?

Many believe that most of the work of supervised (non-deep) machine learning lies in feature engineering, whereas model selection is just a matter of running through all the models (or just taking XGBoost).

So here is an automation for that.

HOW IT WORKS

Runs through all sklearn models (both classification and regression) with all possible hyperparameters, and ranks them using cross-validation.
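The idea can be sketched with plain scikit-learn: tune each candidate model family over a small grid and rank the winners by cross-validated score. The candidate list and grids below are illustrative, not Hunga-Bunga's actual search space.

```python
# Sketch of "try all models with all parameters, rank by cross-validation".
# The candidates and grids here are illustrative, not Hunga-Bunga's own.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

candidates = [
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    (KNeighborsClassifier(), {"n_neighbors": [3, 5, 7]}),
    (DecisionTreeClassifier(), {"max_depth": [3, 5, None]}),
]

results = []
for model, grid in candidates:
    # Exhaustive grid search with 5-fold cross-validation per model family.
    search = GridSearchCV(model, grid, cv=5)
    search.fit(X, y)
    results.append((type(model).__name__, search.best_score_))

# Rank all families by their best cross-validated accuracy.
results.sort(key=lambda r: r[1], reverse=True)
for name, score in results:
    print(f"{name}: {score:.3f}")
```

Hunga-Bunga does the same thing, but across (nearly) every sklearn estimator and a much larger parameter space.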

MODELS

Runs all the models available in sklearn for supervised learning. The categories are:

  • Generalized Linear Models
  • Kernel Ridge
  • Support Vector Machines
  • Nearest Neighbors
  • Gaussian Processes
  • Naive Bayes
  • Trees
  • Neural Networks
  • Ensemble methods

Note: A few models were dropped (very few), and some crash or raise exceptions from time to time. It takes a REALLY long time to test all of this, so clearing out the exceptions took a while.
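One common way to keep such a sweep alive (a minimal sketch, not Hunga-Bunga's actual code) is to wrap each fit in a try/except so that a single crashing model or parameter combination is skipped instead of aborting the whole run:

```python
# Sketch: skip models that crash instead of aborting the whole sweep.
# `models` is any iterable of objects with .fit(X, y) and .score(X, y).
def sweep(models, X, y):
    results = []
    for model in models:
        try:
            model.fit(X, y)
            results.append((type(model).__name__, model.score(X, y)))
        except Exception:
            # A model that raises (bad hyperparameters, unsupported data
            # shape, ...) is simply left out of the ranking.
            continue
    # Best cross-model score first.
    return sorted(results, key=lambda r: r[1], reverse=True)
```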

Installation

pip install hunga-bunga

Dependencies


- Python (>= 2.7)
- NumPy (>= 1.11.0)
- SciPy (>= 0.17.0)
- joblib (>= 0.11)
- scikit-learn (>=0.20.0)
- tabulate (>=0.8.2)
- tqdm (>=4.28.1)

Option I

As any other sklearn model:

clf = HungaBungaClassifier()
clf.fit(x, y)
clf.predict(x)

And import from here:

from hunga_bunga import HungaBungaClassifier, HungaBungaRegressor

Option II: brain = True

As any other sklearn model:

clf = HungaBungaClassifier(brain=True)
clf.fit(x, y)

The output looks like this:

Model                          Accuracy   Time/clf (s)
SGDClassifier                  0.967      0.001
LogisticRegression             0.940      0.001
Perceptron                     0.900      0.001
PassiveAggressiveClassifier    0.967      0.001
MLPClassifier                  0.827      0.018
KMeans                         0.580      0.010
KNeighborsClassifier           0.960      0.000
NearestCentroid                0.933      0.000
RadiusNeighborsClassifier      0.927      0.000
SVC                            0.960      0.000
NuSVC                          0.980      0.001
LinearSVC                      0.940      0.005
RandomForestClassifier         0.980      0.015
DecisionTreeClassifier         0.960      0.000
ExtraTreesClassifier           0.993      0.002

The winner is: ExtraTreesClassifier with score 0.993.
