:: Experimental :: Abstraction for binary logistic regression results for a given model. Currently, the summary ignores the instance weights.
:: Experimental :: Abstraction for binary logistic regression training results. Currently, the training summary ignores the training weights except for the objective trace.
:: DeveloperApi ::
Model produced by a Classifier. Classes are indexed {0, 1, ..., numClasses - 1}.
Type of input features. E.g., Vector
Concrete Model type
:: DeveloperApi ::
Single-label binary or multiclass classification. Classes are indexed {0, 1, ..., numClasses - 1}.
Type of input features. E.g., Vector
Concrete Estimator type
Concrete Model type
Decision tree model (http://en.wikipedia.org/wiki/Decision_tree_learning) for classification. It supports both binary and multiclass labels, as well as both continuous and categorical features.
Decision tree learning algorithm (http://en.wikipedia.org/wiki/Decision_tree_learning) for classification. It supports both binary and multiclass labels, as well as both continuous and categorical features.
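A minimal usage sketch for the decision tree classifier, assuming a SparkSession and a DataFrame `training` with "label" and "features" columns (both names are illustrative defaults, not from the source):

```scala
import org.apache.spark.ml.classification.DecisionTreeClassifier

// `training` is an assumed DataFrame with "label" and "features" columns.
val dt = new DecisionTreeClassifier()
  .setLabelCol("label")
  .setFeaturesCol("features")
  .setMaxDepth(5)            // cap tree depth to limit overfitting
val model = dt.fit(training)
val predictions = model.transform(training)
```

Categorical features are handled when the feature column carries the appropriate metadata (e.g. produced by VectorIndexer).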
Gradient-Boosted Trees (GBTs) (http://en.wikipedia.org/wiki/Gradient_boosting) model for classification. It supports binary labels, as well as both continuous and categorical features.
Multiclass labels are not currently supported.
Gradient-Boosted Trees (GBTs) (http://en.wikipedia.org/wiki/Gradient_boosting) learning algorithm for classification. It supports binary labels, as well as both continuous and categorical features.
The implementation is based upon: J.H. Friedman. "Stochastic Gradient Boosting." 1999.
Notes on Gradient Boosting vs. TreeBoost: this implementation is Stochastic Gradient Boosting, not TreeBoost. Both learn tree ensembles by minimizing loss functions, but TreeBoost additionally modifies the outputs at tree leaf nodes based on the loss function. When the loss is squared error the two methods give the same result; for other losses they can differ.
Multiclass labels are not currently supported.
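A minimal GBT training sketch, assuming a DataFrame `training` with binary (0/1) labels in a "label" column and a "features" vector column (names assumed, not from the source):

```scala
import org.apache.spark.ml.classification.GBTClassifier

// `training` is an assumed DataFrame; GBTClassifier requires binary labels.
val gbt = new GBTClassifier()
  .setMaxIter(20)     // number of boosting iterations (trees)
  .setStepSize(0.1)   // learning rate shrinking each tree's contribution
val model = gbt.fit(training)
val predictions = model.transform(training)
```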
:: Experimental ::
This binary classifier optimizes the hinge loss using the OWLQN optimizer. Only L2 regularization is currently supported.
:: Experimental :: Linear SVM Model trained by LinearSVC
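A minimal linear SVM sketch, assuming a DataFrame `training` with binary labels (the column names are the Spark ML defaults, assumed here):

```scala
import org.apache.spark.ml.classification.LinearSVC

// `training` is an assumed DataFrame with "label" and "features" columns.
val svc = new LinearSVC()
  .setMaxIter(100)
  .setRegParam(0.1)   // L2 regularization strength (the only penalty supported)
val model = svc.fit(training)
val predictions = model.transform(training)
```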
Logistic regression. Supports both multinomial (softmax) and binomial logistic regression.
This class supports fitting a traditional logistic regression model by LBFGS/OWLQN and a bound (box) constrained logistic regression model by LBFGSB.
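A sketch of both fitting modes, assuming a DataFrame `training` with "label" and "features" columns and, for the constrained variant, 3 features (all of these are illustrative assumptions):

```scala
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.linalg.Matrices

// Traditional fit, solved with LBFGS/OWLQN depending on the penalty.
val lr = new LogisticRegression()
  .setMaxIter(100)
  .setRegParam(0.01)
  .setElasticNetParam(0.0) // 0.0 = pure L2 penalty; 1.0 would be pure L1
val model = lr.fit(training)

// Bound (box) constrained variant, solved with LBFGSB: here every
// coefficient is constrained to be nonnegative (3 features assumed).
val boundedLr = new LogisticRegression()
  .setLowerBoundsOnCoefficients(Matrices.dense(1, 3, Array.fill(3)(0.0)))
val boundedModel = boundedLr.fit(training)
```

For a binomial model the bounds matrix has shape (1, numFeatures); for a multinomial model it is (numClasses, numFeatures).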
Model produced by LogisticRegression.
:: Experimental :: Abstraction for logistic regression results for a given model.
Currently, the summary ignores the instance weights.
:: Experimental :: Abstraction for multiclass logistic regression training results. Currently, the training summary ignores the training weights except for the objective trace.
Classification model based on the Multilayer Perceptron. Each layer has a sigmoid activation function; the output layer uses softmax.
Classifier trainer based on the Multilayer Perceptron. Each layer has a sigmoid activation function; the output layer uses softmax. The number of inputs must equal the size of the feature vectors, and the number of outputs must equal the total number of labels.
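A sketch of the layer-size constraint, assuming a DataFrame `training` whose feature vectors have 4 elements and whose labels take 3 distinct values (both sizes are illustrative):

```scala
import org.apache.spark.ml.classification.MultilayerPerceptronClassifier

// layers: first element must equal the feature vector size (4 here),
// last element must equal the number of labels (3 here);
// 8 is an arbitrary hidden-layer width chosen for illustration.
val mlp = new MultilayerPerceptronClassifier()
  .setLayers(Array(4, 8, 3))
  .setMaxIter(100)
val model = mlp.fit(training)
```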
Naive Bayes classifiers. It supports Multinomial NB, which can handle finitely supported discrete data. For example, by converting documents into TF-IDF vectors, it can be used for document classification. By converting every vector into binary (0/1) data, it can also be used as Bernoulli NB. The input feature values must be nonnegative.
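A sketch of selecting the model variant, assuming a DataFrame `training` with nonnegative feature values (column names are the Spark ML defaults, assumed here):

```scala
import org.apache.spark.ml.classification.NaiveBayes

// Multinomial NB: suitable for count-like data such as TF-IDF vectors.
val nbMultinomial = new NaiveBayes().setModelType("multinomial")

// Bernoulli NB: feature values must already be binary (0/1).
val nbBernoulli = new NaiveBayes().setModelType("bernoulli")

val model = nbMultinomial.fit(training) // features must be nonnegative
```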
Model produced by NaiveBayes.
Reduction of Multiclass Classification to Binary Classification. Performs the reduction using the one-against-all strategy. For a multiclass classification problem with k classes, it trains k models (one per class). Each example is scored against all k models, and the model with the highest score is picked to label the example.
Model produced by OneVsRest. This stores the models resulting from training k binary classifiers: one for each class. Each example is scored against all k models, and the model with the highest score is picked to label the example.
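A sketch of the one-against-all reduction, wrapping an assumed base binary classifier and an assumed multiclass DataFrame `training`:

```scala
import org.apache.spark.ml.classification.{LogisticRegression, OneVsRest}

// Any binary Classifier can serve as the base learner; logistic
// regression is used here purely for illustration.
val ovr = new OneVsRest()
  .setClassifier(new LogisticRegression().setMaxIter(50))
val ovrModel = ovr.fit(training) // trains one binary model per class
val predictions = ovrModel.transform(training)
```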
:: DeveloperApi ::
Model produced by a ProbabilisticClassifier. Classes are indexed {0, 1, ..., numClasses - 1}.
Type of input features. E.g., Vector
Concrete Model type
:: DeveloperApi ::
Single-label binary or multiclass classifier which can output class conditional probabilities.
Type of input features. E.g., Vector
Concrete Estimator type
Concrete Model type
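A sketch of what a probabilistic classifier's output looks like in practice, using a fitted decision tree (one concrete ProbabilisticClassificationModel) on an assumed DataFrame `training`; the three output column names are the Spark ML defaults:

```scala
import org.apache.spark.ml.classification.DecisionTreeClassifier

// Any ProbabilisticClassificationModel appends class-conditional
// probabilities alongside the raw scores and the predicted label.
val model = new DecisionTreeClassifier().fit(training)
model.transform(training)
  .select("rawPrediction", "probability", "prediction")
  .show()
```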
Random Forest model for classification. It supports both binary and multiclass labels, as well as both continuous and categorical features.
Random Forest learning algorithm for classification. It supports both binary and multiclass labels, as well as both continuous and categorical features.
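A minimal random forest sketch, assuming a DataFrame `training` with "label" and "features" columns (names assumed, not from the source):

```scala
import org.apache.spark.ml.classification.RandomForestClassifier

// `training` is an assumed DataFrame with "label" and "features" columns.
val rf = new RandomForestClassifier()
  .setNumTrees(50)                    // ensemble size
  .setFeatureSubsetStrategy("auto")   // features considered per split
val model = rf.fit(training)
val predictions = model.transform(training)
```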