machine_learning.xgboost_classifier
Functions

- data_handling(data)
- main()
- xgboost(features, target)
Module Contents
- machine_learning.xgboost_classifier.data_handling(data: dict) → tuple

  >>> data_handling({'data': '[5.1, 3.5, 1.4, 0.2]', 'target': [0]})
  ('[5.1, 3.5, 1.4, 0.2]', [0])
  >>> data_handling(
  ...     {'data': '[4.9, 3.0, 1.4, 0.2], [4.7, 3.2, 1.3, 0.2]', 'target': [0, 0]}
  ... )
  ('[4.9, 3.0, 1.4, 0.2], [4.7, 3.2, 1.3, 0.2]', [0, 0])
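The doctests above pin down the behavior: the function unpacks a dataset dict into a (features, target) tuple. A minimal sketch under that assumption (the real module may perform additional handling):

```python
def data_handling(data: dict) -> tuple:
    """Split a dataset dict into its feature and target entries.

    Sketch inferred from the documented doctests, not the
    authoritative implementation.
    """
    return (data["data"], data["target"])


print(data_handling({"data": "[5.1, 3.5, 1.4, 0.2]", "target": [0]}))
# -> ('[5.1, 3.5, 1.4, 0.2]', [0])
```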
- machine_learning.xgboost_classifier.main() → None

  >>> main()

  URL for the algorithm: https://xgboost.readthedocs.io/en/stable/
  The Iris dataset is used to demonstrate the algorithm.
- machine_learning.xgboost_classifier.xgboost(features: numpy.ndarray, target: numpy.ndarray) → xgboost.XGBClassifier

  # THIS TEST IS BROKEN!!
  >>> xgboost(np.array([[5.1, 3.6, 1.4, 0.2]]), np.array([0]))
  XGBClassifier(base_score=0.5, booster='gbtree', callbacks=None,
                colsample_bylevel=1, colsample_bynode=1, colsample_bytree=1,
                early_stopping_rounds=None, enable_categorical=False,
                eval_metric=None, gamma=0, gpu_id=-1, grow_policy='depthwise',
                importance_type=None, interaction_constraints='',
                learning_rate=0.300000012, max_bin=256, max_cat_to_onehot=4,
                max_delta_step=0, max_depth=6, max_leaves=0, min_child_weight=1,
                missing=nan, monotone_constraints='()', n_estimators=100,
                n_jobs=0, num_parallel_tree=1, predictor='auto', random_state=0,
                reg_alpha=0, reg_lambda=1, ...)
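Even though the doctest is marked broken, it fixes the intended behavior: construct an XGBClassifier, fit it on the given arrays, and return it. A sketch under that assumption (the default hyperparameters are an assumption; the import is deferred because the third-party `xgboost` package may not be installed):

```python
import numpy as np


def xgboost(features: np.ndarray, target: np.ndarray):
    """Fit an XGBoost classifier on a feature matrix and label vector.

    Sketch of the documented signature; assumes the third-party
    ``xgboost`` package provides ``XGBClassifier``.
    """
    # Deferred import so the module loads even without xgboost installed.
    from xgboost import XGBClassifier

    classifier = XGBClassifier()
    classifier.fit(features, target)
    return classifier
```

Note the doctest compares against the classifier's repr, which varies across xgboost versions; that is likely why it is flagged as broken.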