In [1]:
import pandas as pd
import sklearn
import matplotlib.pyplot as plt
import numpy as np
import sys
sys.path.append('..')
from model_handler import ModelHandler
from feature_selection import FeatureSelectionAndGeneration
handler = ModelHandler()
dataset = handler.dataset
train_set = dataset[handler.train_mask]

The dataset includes several risks to be predicted. Each risk is treated as a separate target, i.e. its own response variable.

The aim is to build a model able to predict each risk as accurately as possible. However, the learning process is different for each of them: the minimal set of variables that best explains the variance in the dataset is unique to every risk. Consequently, the following pipeline is executed once per risk, so that each prediction is as precise as possible.

Dataset splitting

The first step consists of splitting the dataset into training and test sets. The training set is used during the feature-selection step, which is implemented with a boosted regression model. This is a supervised learning approach, so labels are needed for the regression to be carried out. In this dataset risks are annotated for only some of the cities, so the training set is taken to be all entries containing a value for the given risk. The remaining entries form the test set, used for the prediction task, since those are the cities that need a prediction.
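
The masking logic described above can be sketched as follows; the toy frame and the column name `risk_a` are illustrative, not the project's actual data:

```python
import pandas as pd

# Hypothetical example: cities with an annotated value for a given risk
# form the training set; the unlabelled cities form the test set.
dataset = pd.DataFrame({
    'city': ['A', 'B', 'C', 'D'],
    'feat': [0.1, 0.4, 0.3, 0.9],
    'risk_a': [1.0, None, 2.0, None],
})
train_mask = dataset['risk_a'].notna()
train_set = dataset[train_mask]    # labelled cities -> feature selection and fitting
test_set = dataset[~train_mask]    # unlabelled cities -> to be predicted
```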

Feature selection

Decision trees are preferable when the relationship between the predictors and the labels is highly non-linear and complex. This dataset has many different predictors, and we do not know in advance whether the relationship is linear.

Boosting is among the most robust of the ensemble methods. It aggregates many decision trees but, unlike Bagging and Random Forest, grows them sequentially rather than fitting them independently on bootstrap samples.

The procedure consists of fitting small trees to the residuals in order to slowly reduce the prediction error. In general, models that learn slowly tend to perform better. A pitfall of Boosting, however, is that it is very sensitive to its tuning parameters. It is therefore important to cross-validate in order to select, for every target, the parameter combination returning the highest accuracy. For this purpose we use 3-fold cross-validation to speed up the tuning process, which is already slow given the number of parameters that need to be optimized.
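
The residual-fitting idea can be sketched with a plain gradient-boosting loop (a toy illustration with synthetic data, not the project's XGBoost model):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic regression problem: noisy sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate, n_trees = 0.1, 100
pred = np.zeros_like(y)
for _ in range(n_trees):
    residuals = y - pred                       # what the ensemble still misses
    tree = DecisionTreeRegressor(max_depth=2)  # a small, weak tree
    tree.fit(X, residuals)                     # fit the residuals, not y itself
    pred += learning_rate * tree.predict(X)    # slow, additive improvement

mse = np.mean((y - pred) ** 2)
```

The small learning rate is what makes the model "learn slowly": each tree only nudges the prediction toward the residuals, which is exactly why the number of trees, depth, and learning rate interact so strongly and need joint tuning.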

In [2]:
import xgboost as xgb
from sklearn.metrics import accuracy_score, make_scorer
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.pipeline import Pipeline
import shutil
import os
memory_dir = '.pipeline_cache.tmp'

XGBoost's default objective function is reg:squarederror (formerly named reg:linear), which corresponds to regression with squared error as the loss function.
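
As a small worked check of that loss, independent of XGBoost itself: the squared-error objective is L = (y - ŷ)², and its gradient with respect to the prediction is 2(ŷ - y), i.e. proportional to the negative residual that each new tree is fit against.

```python
import numpy as np

# Squared-error loss and its gradient w.r.t. the prediction.
y = np.array([1.0, 2.0, 3.0])
yhat = np.array([1.5, 1.5, 2.0])
loss = (y - yhat) ** 2      # per-sample squared error
grad = 2 * (yhat - y)       # gradient; sign points away from the residual
```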

In [3]:
from bayes_opt import BayesianOptimization
if os.path.isdir(memory_dir):
    shutil.rmtree(memory_dir)

def init_model(**model_params):
    return Pipeline(
        [('generation_and_selection', FeatureSelectionAndGeneration(feats_num=500)),
         ('regressor', xgb.XGBRegressor(**model_params))],
        memory=memory_dir)
    
In [4]:
from sklearn.model_selection import cross_val_score
from data.labeled.preprocessed import RISKS_MAPPING
from classification import RANDOM_SEED
optimal_params = {}
CONSTANTS = {'objective':"reg:squarederror", "random_state": RANDOM_SEED}
for (risk, total_set, [train_set, valid_set]) in handler.get_total_train_val_set_per_risk():
    print(f"\n\n**Risk: {RISKS_MAPPING[risk]}**\n")
    print(f"Annotated Samples Size: {total_set.shape[0]}")
    print(f"To be used for parameters estimation: {train_set.shape[0]}\n")
    def xgb_evaluate(**params):
        params['n_estimators'] = int(params['n_estimators'])
        params['max_depth'] = int(params['max_depth'])
        params.update(CONSTANTS)

        # cross_val_score clones and fits the pipeline on each fold itself,
        # so no separate fit call is needed beforehand
        model = init_model(**params)
        train_tuple = (train_set[handler.feat_names], train_set[risk])
        cv_result = np.mean(cross_val_score(model, *train_tuple, cv=3,
                                            scoring='neg_mean_squared_error'))
        return cv_result
    xgb_bo = BayesianOptimization(xgb_evaluate, {'max_depth': (1, 7),
                                                 'alpha': (0, 20),
                                                 'gamma': (0, 1),
                                                 'colsample_bytree': (0.3, 0.9),
                                                 'n_estimators': (200, 1000),
                                                 'learning_rate': (0.1, 0.5),
                                                 'min_child_weight': (0.5, 1.5),
                                                 'subsample': (0.8, 1)})
    xgb_bo.maximize(init_points=20, n_iter=20)
    params = xgb_bo.max['params']
    params['max_depth'] = int(params['max_depth'])
    params['n_estimators'] = int(params['n_estimators'])
    params.update(CONSTANTS)
    optimal_params[risk] = params

**Risk: Higher water prices**

Annotated Samples Size: 87
To be used for parameters estimation: 60

|   iter    |  target   |   alpha   | colsam... |   gamma   | learni... | max_depth | min_ch... | n_esti... | subsample |
-------------------------------------------------------------------------------------------------------------------------
|  1        | -0.976    |  12.56    |  0.5203   |  0.353    |  0.1552   |  1.692    |  0.7748   |  321.3    |  0.9703   |
|  2        | -1.108    |  3.863    |  0.6      |  0.8569   |  0.4626   |  6.346    |  0.7124   |  420.2    |  0.8729   |
|  3        | -1.164    |  2.943    |  0.3607   |  0.2629   |  0.308    |  6.941    |  1.383    |  478.5    |  0.9804   |
|  4        | -1.027    |  15.96    |  0.7929   |  0.6658   |  0.2481   |  1.401    |  1.476    |  665.0    |  0.9819   |
|  5        | -0.9523   |  8.722    |  0.5596   |  0.8213   |  0.2037   |  3.636    |  1.121    |  834.2    |  0.8978   |
|  6        | -0.9418   |  8.082    |  0.7093   |  0.3346   |  0.2707   |  3.763    |  1.261    |  459.6    |  0.9967   |
|  7        | -0.949    |  10.07    |  0.39     |  0.3423   |  0.2372   |  2.342    |  0.9643   |  723.0    |  0.9999   |
|  8        | -1.024    |  15.42    |  0.4462   |  0.5091   |  0.2354   |  1.566    |  1.327    |  248.0    |  0.9947   |
|  9        | -1.025    |  16.57    |  0.4903   |  0.8668   |  0.4399   |  5.197    |  1.492    |  559.6    |  0.8281   |
|  10       | -0.9775   |  5.481    |  0.6518   |  0.9604   |  0.4691   |  6.71     |  1.423    |  450.9    |  0.9962   |
|  11       | -0.9977   |  13.47    |  0.6428   |  0.9202   |  0.1696   |  3.72     |  1.243    |  242.6    |  0.8566   |
|  12       | -0.94     |  8.717    |  0.6548   |  0.6301   |  0.4702   |  1.039    |  0.6808   |  801.3    |  0.9204   |
|  13       | -0.9506   |  11.65    |  0.6503   |  0.7099   |  0.4727   |  5.964    |  0.9721   |  924.6    |  0.8481   |
|  14       | -0.9498   |  10.56    |  0.866    |  0.8808   |  0.3846   |  6.23     |  1.138    |  263.1    |  0.8503   |
|  15       | -0.9674   |  12.83    |  0.5388   |  0.2309   |  0.4563   |  6.958    |  1.33     |  227.6    |  0.9865   |
|  16       | -1.05     |  18.14    |  0.4417   |  0.8161   |  0.3731   |  2.773    |  1.415    |  878.3    |  0.9304   |
|  17       | -1.011    |  13.13    |  0.4389   |  0.9695   |  0.1509   |  5.986    |  1.349    |  601.3    |  0.9586   |
|  18       | -0.9802   |  14.19    |  0.3183   |  0.4282   |  0.2136   |  6.126    |  1.099    |  334.4    |  0.9288   |
|  19       | -0.996    |  14.14    |  0.4665   |  0.8108   |  0.1445   |  6.35     |  0.6815   |  893.5    |  0.8394   |
|  20       | -0.9769   |  14.82    |  0.8574   |  0.4829   |  0.1627   |  5.697    |  1.267    |  886.0    |  0.9175   |
|  21       | -0.9854   |  15.81    |  0.7733   |  0.3293   |  0.298    |  4.451    |  0.5392   |  802.8    |  0.9633   |
|  22       | -0.9489   |  7.964    |  0.8053   |  0.295    |  0.1313   |  4.86     |  0.9379   |  458.4    |  0.8878   |
|  23       | -1.487    |  0.5674   |  0.4356   |  0.4324   |  0.1951   |  1.655    |  0.9761   |  807.4    |  0.9175   |
|  24       | -0.9506   |  10.24    |  0.3042   |  0.3822   |  0.4604   |  3.892    |  0.706    |  795.9    |  0.8186   |
|  25       | -1.69     |  0.07939  |  0.732    |  0.05226  |  0.4933   |  1.069    |  1.098    |  456.0    |  0.9754   |
|  26       | -0.9926   |  13.49    |  0.7383   |  0.8933   |  0.2786   |  3.976    |  1.405    |  459.3    |  0.9022   |
|  27       | -0.949    |  9.458    |  0.9      |  0.0      |  0.1      |  7.0      |  0.5      |  464.2    |  0.8      |
|  28       | -1.023    |  12.38    |  0.3      |  1.0      |  0.1      |  7.0      |  1.5      |  450.9    |  1.0      |
|  29       | -0.9756   |  7.373    |  0.3808   |  0.3699   |  0.2086   |  4.113    |  1.155    |  716.7    |  0.9308   |
|  30       | -1.032    |  16.15    |  0.7696   |  0.7884   |  0.1548   |  2.975    |  0.9983   |  718.8    |  0.942    |
|  31       | -1.225    |  1.974    |  0.3824   |  0.5084   |  0.1778   |  1.753    |  0.8772   |  724.3    |  0.9071   |
|  32       | -1.023    |  15.31    |  0.6135   |  0.9991   |  0.4249   |  1.077    |  1.475    |  796.0    |  0.9435   |
|  33       | -0.9966   |  4.94     |  0.9      |  1.0      |  0.5      |  1.0      |  0.5      |  794.6    |  1.0      |
|  34       | -0.9066   |  8.465    |  0.7927   |  0.6341   |  0.3534   |  3.209    |  0.5682   |  257.2    |  0.8074   |
|  35       | -0.9965   |  13.31    |  0.531    |  0.9466   |  0.1996   |  6.735    |  0.8288   |  256.4    |  0.8134   |
|  36       | -1.027    |  5.683    |  0.6738   |  0.734    |  0.407    |  2.31     |  0.6124   |  262.6    |  0.9028   |
|  37       | -1.12     |  4.042    |  0.4752   |  0.08679  |  0.2101   |  5.074    |  0.6891   |  252.8    |  0.9517   |
|  38       | -0.9657   |  12.4     |  0.7808   |  0.5073   |  0.1499   |  1.068    |  0.7195   |  259.8    |  0.8626   |
|  39       | -0.929    |  10.95    |  0.3893   |  0.09512  |  0.2186   |  5.94     |  0.7151   |  931.7    |  0.9123   |
|  40       | -1.052    |  5.233    |  0.4361   |  0.7784   |  0.3709   |  6.081    |  1.264    |  928.3    |  0.9005   |
=========================================================================================================================


**Risk: Inadequate or aging infrastructure**

Annotated Samples Size: 148
To be used for parameters estimation: 103

|   iter    |  target   |   alpha   | colsam... |   gamma   | learni... | max_depth | min_ch... | n_esti... | subsample |
-------------------------------------------------------------------------------------------------------------------------
|  1        | -0.8198   |  11.02    |  0.6243   |  0.4466   |  0.3571   |  1.014    |  0.6532   |  479.6    |  0.8418   |
|  2        | -0.8526   |  7.451    |  0.3414   |  0.4045   |  0.3367   |  4.223    |  1.107    |  722.4    |  0.9236   |
|  3        | -0.9816   |  2.071    |  0.5351   |  0.3249   |  0.1182   |  6.57     |  1.223    |  568.0    |  0.8343   |
|  4        | -0.8291   |  16.2     |  0.6908   |  0.6171   |  0.4106   |  1.451    |  0.9329   |  843.9    |  0.9049   |
|  5        | -0.8269   |  10.17    |  0.3814   |  0.8631   |  0.1145   |  1.288    |  1.462    |  952.9    |  0.858    |
|  6        | -0.8671   |  11.65    |  0.6014   |  0.1947   |  0.4111   |  3.74     |  1.183    |  385.9    |  0.8321   |
|  7        | -1.02     |  2.802    |  0.5483   |  0.2506   |  0.4981   |  2.736    |  0.7263   |  440.3    |  0.8792   |
|  8        | -0.952    |  5.121    |  0.6529   |  0.4038   |  0.4248   |  6.49     |  1.151    |  900.1    |  0.9996   |
|  9        | -0.8982   |  5.245    |  0.8315   |  0.986    |  0.2075   |  3.022    |  1.285    |  972.0    |  0.9263   |
|  10       | -0.8455   |  7.379    |  0.3023   |  0.3777   |  0.1084   |  6.471    |  1.035    |  409.1    |  0.9416   |
|  11       | -0.8932   |  15.1     |  0.765    |  0.3297   |  0.2599   |  5.949    |  0.6088   |  915.3    |  0.9717   |
|  12       | -0.8707   |  9.973    |  0.6884   |  0.1297   |  0.2809   |  4.649    |  1.242    |  231.7    |  0.8384   |
|  13       | -0.8843   |  18.05    |  0.3217   |  0.04365  |  0.2681   |  6.58     |  0.5535   |  501.0    |  0.9592   |
|  14       | -0.8543   |  14.8     |  0.7542   |  0.3821   |  0.2808   |  1.196    |  1.175    |  220.1    |  0.834    |
|  15       | -0.8377   |  7.995    |  0.458    |  0.8552   |  0.2373   |  3.098    |  0.9244   |  215.6    |  0.9366   |
|  16       | -0.8995   |  3.111    |  0.5405   |  0.8783   |  0.3042   |  1.369    |  1.158    |  544.2    |  0.8799   |
|  17       | -0.8743   |  15.62    |  0.8564   |  0.4401   |  0.4063   |  3.624    |  1.189    |  569.0    |  0.8584   |
|  18       | -0.9119   |  6.532    |  0.5334   |  0.08928  |  0.473    |  3.995    |  0.742    |  531.6    |  0.899    |
|  19       | -0.988    |  1.234    |  0.5088   |  0.5803   |  0.1247   |  5.374    |  0.7324   |  953.3    |  0.9662   |
|  20       | -0.905    |  5.893    |  0.8236   |  0.5449   |  0.3114   |  2.714    |  0.6195   |  367.2    |  0.8644   |
|  21       | -0.8544   |  9.661    |  0.731    |  0.175    |  0.2962   |  1.014    |  1.448    |  953.4    |  0.8738   |
|  22       | -0.8226   |  12.47    |  0.3      |  1.0      |  0.1      |  2.165    |  1.5      |  951.0    |  0.8      |
|  23       | -0.9001   |  7.74     |  0.7106   |  0.2857   |  0.1164   |  3.749    |  1.056    |  481.2    |  0.9787   |
|  24       | -0.8116   |  15.06    |  0.3242   |  0.7894   |  0.3851   |  1.151    |  0.542    |  477.3    |  0.9264   |
|  25       | -0.833    |  15.79    |  0.3141   |  0.8238   |  0.3584   |  1.87     |  1.249    |  482.3    |  0.8831   |
|  26       | -0.8718   |  15.49    |  0.5392   |  0.1029   |  0.1521   |  5.458    |  1.224    |  477.2    |  0.9181   |
|  27       | -0.8306   |  10.96    |  0.5962   |  0.63     |  0.3327   |  1.372    |  0.6195   |  474.3    |  0.8542   |
|  28       | -0.8346   |  15.36    |  0.3      |  1.0      |  0.1      |  1.112    |  1.5      |  955.7    |  0.9001   |
|  29       | -0.8431   |  13.5     |  0.3      |  1.0      |  0.1      |  7.0      |  1.5      |  954.7    |  1.0      |
|  30       | -0.866    |  9.939    |  0.7786   |  0.4095   |  0.3993   |  2.273    |  0.8656   |  841.4    |  0.9177   |
|  31       | -0.87     |  19.48    |  0.634    |  0.8462   |  0.2381   |  5.229    |  1.233    |  848.4    |  0.9436   |
|  32       | -0.8872   |  19.8     |  0.4582   |  0.08015  |  0.241    |  2.062    |  0.8295   |  838.8    |  0.8533   |
|  33       | -0.8515   |  17.09    |  0.3      |  1.0      |  0.5      |  1.0      |  0.5      |  471.0    |  0.8      |
|  34       | -0.8386   |  13.46    |  0.3371   |  0.9579   |  0.1427   |  5.769    |  1.191    |  212.5    |  0.8628   |
|  35       | -0.8444   |  9.527    |  0.9      |  0.0      |  0.5      |  1.0      |  1.353    |  208.6    |  1.0      |
|  36       | -1.006    |  1.93     |  0.6956   |  0.2839   |  0.3305   |  4.787    |  1.355    |  210.1    |  0.9394   |
|  37       | -0.8373   |  15.47    |  0.7667   |  0.8557   |  0.4766   |  1.808    |  0.7509   |  207.3    |  0.9411   |
|  38       | -0.8595   |  12.34    |  0.3      |  1.0      |  0.5      |  1.0      |  1.494    |  849.3    |  0.9975   |
|  39       | -0.8655   |  18.69    |  0.7914   |  0.1385   |  0.1885   |  1.891    |  0.7475   |  212.6    |  0.9122   |
|  40       | -0.8851   |  9.613    |  0.5318   |  0.01688  |  0.3891   |  6.633    |  1.372    |  221.0    |  0.9398   |
=========================================================================================================================


**Risk: Increased water stress or scarcity**

Annotated Samples Size: 261
To be used for parameters estimation: 182

|   iter    |  target   |   alpha   | colsam... |   gamma   | learni... | max_depth | min_ch... | n_esti... | subsample |
-------------------------------------------------------------------------------------------------------------------------
|  1        | -0.3281   |  9.485    |  0.4036   |  0.3844   |  0.1532   |  2.67     |  0.5564   |  374.0    |  0.893    |
|  2        | -0.3491   |  13.37    |  0.5434   |  0.6597   |  0.4748   |  6.06     |  0.742    |  720.1    |  0.8199   |
|  3        | -0.3278   |  8.398    |  0.4316   |  0.4138   |  0.293    |  1.407    |  0.7838   |  639.6    |  0.9499   |
|  4        | -0.3429   |  15.82    |  0.8999   |  0.3894   |  0.4243   |  5.939    |  1.312    |  638.5    |  0.996    |
|  5        | -0.3171   |  7.677    |  0.6157   |  0.9715   |  0.1334   |  4.407    |  1.172    |  937.3    |  0.8044   |
|  6        | -0.331    |  14.74    |  0.4953   |  0.867    |  0.3143   |  6.215    |  1.473    |  612.8    |  0.8799   |
|  7        | -0.325    |  4.382    |  0.8106   |  0.8515   |  0.2929   |  1.315    |  1.23     |  961.2    |  0.9936   |
|  8        | -0.3677   |  10.82    |  0.4524   |  0.1496   |  0.3989   |  5.78     |  0.5854   |  296.0    |  0.8045   |
|  9        | -0.327    |  13.43    |  0.468    |  0.7353   |  0.1468   |  5.015    |  1.042    |  704.0    |  0.9341   |
|  10       | -0.3335   |  15.28    |  0.7539   |  0.7073   |  0.376    |  1.52     |  0.6837   |  983.4    |  0.8989   |
|  11       | -0.3304   |  6.638    |  0.6152   |  0.3634   |  0.1221   |  6.092    |  0.5775   |  267.4    |  0.865    |
|  12       | -0.3431   |  10.43    |  0.8293   |  0.0223   |  0.2122   |  5.028    |  0.8705   |  633.9    |  0.9555   |
|  13       | -0.3573   |  18.83    |  0.6731   |  0.2194   |  0.4503   |  3.15     |  1.187    |  441.6    |  0.983    |
|  14       | -0.3424   |  9.504    |  0.8332   |  0.04801  |  0.2117   |  4.216    |  0.764    |  731.0    |  0.8804   |
|  15       | -0.3433   |  16.88    |  0.3859   |  0.488    |  0.2663   |  2.724    |  0.609    |  930.9    |  0.8858   |
|  16       | -0.3674   |  0.4804   |  0.5781   |  0.3637   |  0.3831   |  6.517    |  1.375    |  729.9    |  0.9705   |
|  17       | -0.343    |  15.0     |  0.5225   |  0.85     |  0.2922   |  2.503    |  0.8741   |  834.1    |  0.9047   |
|  18       | -0.35     |  6.624    |  0.5092   |  0.106    |  0.332    |  2.323    |  1.272    |  632.3    |  0.9503   |
|  19       | -0.3431   |  13.46    |  0.6295   |  0.9335   |  0.4873   |  2.923    |  1.497    |  336.5    |  0.8542   |
|  20       | -0.3237   |  9.352    |  0.698    |  0.8385   |  0.2527   |  1.688    |  1.182    |  722.4    |  0.866    |
|  21       | -0.3359   |  8.87     |  0.4282   |  0.3963   |  0.1727   |  2.854    |  0.8249   |  936.2    |  0.8053   |
|  22       | -0.3293   |  7.37     |  0.3546   |  0.8295   |  0.1478   |  5.461    |  1.438    |  936.4    |  0.9905   |
|  23       | -0.3932   |  6.228    |  0.4149   |  0.4233   |  0.4511   |  4.553    |  0.6905   |  939.0    |  0.9231   |
|  24       | -0.3591   |  7.967    |  0.3719   |  0.3208   |  0.4784   |  4.366    |  1.317    |  937.3    |  0.8964   |
|  25       | -0.3468   |  16.59    |  0.7116   |  0.09403  |  0.405    |  1.379    |  1.023    |  782.0    |  0.9331   |
|  26       | -0.388    |  3.547    |  0.7318   |  0.1233   |  0.3939   |  4.061    |  0.5229   |  627.0    |  0.9397   |
|  27       | -0.324    |  2.056    |  0.7145   |  0.762    |  0.1602   |  1.475    |  0.7908   |  700.8    |  0.8131   |
|  28       | -0.3247   |  11.16    |  0.7298   |  0.8327   |  0.341    |  5.14     |  0.5732   |  462.2    |  0.9232   |
|  29       | -0.3204   |  8.149    |  0.5092   |  0.7052   |  0.1628   |  4.629    |  1.21     |  249.3    |  0.8078   |
|  30       | -0.3778   |  2.408    |  0.3926   |  0.09964  |  0.2565   |  3.206    |  0.5588   |  674.4    |  0.9762   |
|  31       | -0.3385   |  19.45    |  0.8131   |  0.3986   |  0.286    |  4.547    |  1.141    |  882.9    |  0.8843   |
|  32       | -0.3337   |  17.04    |  0.7316   |  0.1138   |  0.151    |  1.871    |  0.5148   |  208.3    |  0.8351   |
|  33       | -0.3449   |  1.46     |  0.8069   |  0.5204   |  0.4121   |  6.617    |  0.9983   |  854.3    |  0.87     |
|  34       | -0.3301   |  8.779    |  0.8432   |  0.5241   |  0.3962   |  2.688    |  1.086    |  576.6    |  0.887    |
|  35       | -0.3421   |  1.208    |  0.5459   |  0.7509   |  0.4733   |  1.684    |  0.9418   |  624.5    |  0.9782   |
|  36       | -0.331    |  14.19    |  0.5901   |  0.9458   |  0.2034   |  5.774    |  0.8884   |  373.9    |  0.8607   |
|  37       | -0.3469   |  11.32    |  0.5172   |  0.7826   |  0.4652   |  3.107    |  0.7457   |  564.2    |  0.865    |
|  38       | -0.3208   |  7.689    |  0.7648   |  0.4413   |  0.1786   |  1.792    |  1.1      |  724.1    |  0.9476   |
|  39       | -0.3405   |  7.085    |  0.6123   |  0.1234   |  0.322    |  3.815    |  0.7719   |  322.9    |  0.8477   |
|  40       | -0.3499   |  15.2     |  0.8245   |  0.1248   |  0.416    |  1.53     |  0.6271   |  654.4    |  0.9851   |
=========================================================================================================================


**Risk: Declining water quality**

Annotated Samples Size: 183
To be used for parameters estimation: 128

|   iter    |  target   |   alpha   | colsam... |   gamma   | learni... | max_depth | min_ch... | n_esti... | subsample |
-------------------------------------------------------------------------------------------------------------------------
|  1        | -1.076    |  11.35    |  0.4297   |  0.3311   |  0.4448   |  3.21     |  1.257    |  366.4    |  0.8716   |
|  2        | -1.461    |  0.02068  |  0.6427   |  0.2271   |  0.2137   |  2.165    |  1.089    |  850.7    |  0.8977   |
|  3        | -1.068    |  16.53    |  0.5608   |  0.7401   |  0.4673   |  2.98     |  1.404    |  560.1    |  0.8234   |
|  4        | -1.131    |  3.425    |  0.3126   |  0.3419   |  0.4458   |  1.35     |  1.097    |  794.6    |  0.9491   |
|  5        | -1.037    |  13.07    |  0.5482   |  0.468    |  0.1326   |  3.683    |  0.5133   |  980.9    |  0.8412   |
|  6        | -1.045    |  16.8     |  0.421    |  0.93     |  0.3748   |  4.006    |  0.5425   |  633.6    |  0.8125   |
|  7        | -1.079    |  13.17    |  0.496    |  0.3223   |  0.4292   |  4.138    |  1.161    |  650.8    |  0.8472   |
|  8        | -1.059    |  3.899    |  0.7135   |  0.3937   |  0.2616   |  1.474    |  1.236    |  490.7    |  0.9987   |
|  9        | -1.061    |  7.062    |  0.7106   |  0.214    |  0.1198   |  6.39     |  1.323    |  384.5    |  0.8828   |
|  10       | -1.041    |  12.52    |  0.3514   |  0.2497   |  0.2943   |  1.493    |  0.6369   |  709.8    |  0.8329   |
|  11       | -1.044    |  8.785    |  0.7837   |  0.9025   |  0.1351   |  6.476    |  1.213    |  556.2    |  0.822    |
|  12       | -1.049    |  6.24     |  0.769    |  0.6981   |  0.1632   |  4.266    |  0.6909   |  387.7    |  0.829    |
|  13       | -1.058    |  17.0     |  0.4642   |  0.3398   |  0.1527   |  5.287    |  0.6933   |  463.8    |  0.9234   |
|  14       | -1.041    |  9.479    |  0.7892   |  0.5711   |  0.4804   |  3.815    |  1.36     |  925.2    |  0.9392   |
|  15       | -1.277    |  0.9942   |  0.8288   |  0.7709   |  0.3474   |  3.717    |  1.163    |  562.5    |  0.9482   |
|  16       | -1.034    |  19.36    |  0.827    |  0.7127   |  0.1971   |  4.538    |  1.191    |  327.2    |  0.919    |
|  17       | -1.102    |  5.447    |  0.6136   |  0.9093   |  0.2856   |  5.015    |  0.8487   |  592.4    |  0.9048   |
|  18       | -1.223    |  1.599    |  0.7584   |  0.4932   |  0.2725   |  2.799    |  0.9453   |  790.9    |  0.9411   |
|  19       | -1.079    |  15.0     |  0.856    |  0.3536   |  0.3476   |  3.325    |  1.201    |  948.5    |  0.893    |
|  20       | -1.022    |  14.24    |  0.3869   |  0.3285   |  0.1884   |  3.804    |  0.5181   |  799.9    |  0.8038   |
|  21       | -1.039    |  14.61    |  0.6563   |  0.9573   |  0.1      |  7.0      |  1.099    |  549.1    |  0.8      |
|  22       | -1.077    |  7.745    |  0.7716   |  0.9341   |  0.3554   |  2.506    |  1.408    |  808.3    |  0.8618   |
|  23       | -1.034    |  20.0     |  0.3      |  0.0      |  0.1      |  7.0      |  0.5      |  808.1    |  0.8      |
|  24       | -1.068    |  19.1     |  0.8478   |  0.6456   |  0.2297   |  4.257    |  1.076    |  340.3    |  0.8727   |
|  25       | -1.07     |  6.417    |  0.3426   |  0.7473   |  0.3129   |  3.588    |  1.499    |  328.1    |  0.9343   |
|  26       | -1.029    |  19.37    |  0.6888   |  0.5123   |  0.4757   |  2.768    |  1.17     |  969.0    |  0.9822   |
|  27       | -1.013    |  6.293    |  0.4006   |  0.8547   |  0.162    |  1.567    |  1.385    |  969.0    |  0.866    |
|  28       | -1.412    |  0.05126  |  0.6119   |  0.03441  |  0.2407   |  6.934    |  1.489    |  980.2    |  0.9947   |
|  29       | -1.046    |  10.96    |  0.5307   |  0.726    |  0.2867   |  1.984    |  1.061    |  962.7    |  0.9504   |
|  30       | -1.017    |  19.68    |  0.3566   |  0.9176   |  0.2101   |  2.527    |  0.5155   |  978.4    |  0.9177   |
|  31       | -1.016    |  20.0     |  0.9      |  1.0      |  0.1      |  1.0      |  0.5      |  990.3    |  0.8      |
|  32       | -1.039    |  18.01    |  0.4747   |  0.5552   |  0.1888   |  3.25     |  0.7726   |  313.9    |  0.9375   |
|  33       | -1.273    |  0.5517   |  0.8933   |  0.5635   |  0.296    |  1.427    |  1.223    |  936.1    |  0.9622   |
|  34       | -1.089    |  17.04    |  0.701    |  0.5698   |  0.5      |  5.822    |  1.466    |  916.3    |  0.9175   |
|  35       | -1.073    |  18.12    |  0.7574   |  0.3681   |  0.3013   |  6.446    |  1.395    |  999.8    |  0.9341   |
|  36       | -1.126    |  6.135    |  0.8921   |  0.05173  |  0.3441   |  6.258    |  0.7195   |  311.2    |  0.9121   |
|  37       | -1.113    |  4.733    |  0.7851   |  0.7087   |  0.1775   |  6.364    |  0.6953   |  701.0    |  0.9226   |
|  38       | -1.02     |  16.33    |  0.8291   |  0.5933   |  0.3228   |  2.345    |  0.6569   |  721.1    |  0.9582   |
|  39       | -1.107    |  5.362    |  0.3      |  0.0      |  0.1029   |  1.0      |  0.5      |  720.7    |  0.8      |
|  40       | -1.04     |  19.96    |  0.5005   |  0.9126   |  0.3848   |  6.906    |  1.027    |  713.8    |  0.8768   |
=========================================================================================================================


**Risk: Increased water demand**

Annotated Samples Size: 98
To be used for parameters estimation: 68

|   iter    |  target   |   alpha   | colsam... |   gamma   | learni... | max_depth | min_ch... | n_esti... | subsample |
-------------------------------------------------------------------------------------------------------------------------
|  1        | -1.076    |  5.745    |  0.3147   |  0.03118  |  0.2611   |  5.243    |  0.9388   |  288.8    |  0.9923   |
|  2        | -1.138    |  8.688    |  0.779    |  0.8536   |  0.4607   |  5.184    |  0.8194   |  751.0    |  0.9752   |
|  3        | -1.119    |  14.05    |  0.6121   |  0.9493   |  0.1625   |  3.06     |  0.9632   |  972.5    |  0.8644   |
|  4        | -1.16     |  3.781    |  0.5699   |  0.3533   |  0.1423   |  1.639    |  0.7987   |  748.6    |  0.8918   |
|  5        | -1.213    |  1.828    |  0.417    |  0.4836   |  0.3657   |  3.066    |  1.454    |  474.6    |  0.8681   |
|  6        | -1.098    |  13.57    |  0.8709   |  0.556    |  0.1673   |  5.833    |  1.333    |  606.5    |  0.8169   |
|  7        | -1.22     |  1.757    |  0.8033   |  0.5448   |  0.1326   |  3.288    |  1.451    |  584.8    |  0.8814   |
|  8        | -1.074    |  10.71    |  0.5829   |  0.8531   |  0.4773   |  6.732    |  0.5464   |  564.5    |  0.8521   |
|  9        | -1.221    |  19.56    |  0.555    |  0.8976   |  0.2013   |  4.162    |  1.427    |  470.9    |  0.8597   |
|  10       | -1.08     |  13.93    |  0.3648   |  0.1656   |  0.1698   |  4.642    |  0.6899   |  972.4    |  0.941    |
|  11       | -1.2      |  18.11    |  0.6855   |  0.7447   |  0.2409   |  1.116    |  0.9774   |  941.2    |  0.9757   |
|  12       | -1.215    |  1.35     |  0.4691   |  0.6036   |  0.4155   |  5.58     |  1.336    |  778.0    |  0.8905   |
|  13       | -1.099    |  7.468    |  0.7976   |  0.6025   |  0.2195   |  3.102    |  0.9842   |  758.7    |  0.9724   |
|  14       | -1.173    |  3.195    |  0.4438   |  0.7025   |  0.1595   |  2.926    |  1.138    |  635.6    |  0.9681   |
|  15       | -1.156    |  14.72    |  0.6289   |  0.931    |  0.1022   |  3.197    |  1.422    |  343.8    |  0.9366   |
|  16       | -1.11     |  5.521    |  0.3827   |  0.1207   |  0.3554   |  5.701    |  0.8178   |  667.1    |  0.8443   |
|  17       | -1.082    |  13.9     |  0.8211   |  0.5137   |  0.4541   |  3.006    |  0.8208   |  533.9    |  0.915    |
|  18       | -1.097    |  10.48    |  0.6734   |  0.4772   |  0.495    |  2.723    |  1.265    |  644.7    |  0.9873   |
|  19       | -1.184    |  18.98    |  0.5123   |  0.6884   |  0.451    |  2.831    |  1.048    |  642.5    |  0.9725   |
|  20       | -1.325    |  0.3174   |  0.6821   |  0.5994   |  0.1503   |  5.07     |  0.5149   |  435.8    |  0.8518   |
|  21       | -1.043    |  11.41    |  0.3      |  0.0      |  0.3152   |  7.0      |  0.5      |  970.4    |  1.0      |
|  22       | -1.019    |  8.177    |  0.3      |  0.0      |  0.3004   |  7.0      |  0.5      |  975.5    |  1.0      |
|  23       | -1.483    |  0.175    |  0.6365   |  0.464    |  0.4532   |  6.673    |  1.333    |  974.0    |  0.9537   |
|  24       | -1.081    |  11.38    |  0.8622   |  0.637    |  0.2449   |  6.671    |  1.386    |  977.0    |  0.8648   |
|  25       | -1.075    |  9.313    |  0.3      |  0.0      |  0.1      |  4.484    |  0.5      |  973.2    |  1.0      |
|  26       | -1.131    |  15.94    |  0.4294   |  0.5419   |  0.1056   |  5.387    |  1.293    |  965.9    |  0.855    |
|  27       | -1.102    |  10.01    |  0.801    |  0.03775  |  0.2086   |  1.264    |  0.7218   |  562.4    |  0.8176   |
|  28       | -1.069    |  13.62    |  0.4746   |  0.4652   |  0.281    |  7.0      |  1.006    |  559.2    |  0.8702   |
|  29       | -1.012    |  7.431    |  0.3      |  1.0      |  0.5      |  7.0      |  1.5      |  558.9    |  1.0      |
|  30       | -1.154    |  8.525    |  0.9      |  1.0      |  0.5      |  5.916    |  0.5      |  554.3    |  0.8      |
|  31       | -1.051    |  4.603    |  0.3      |  0.0      |  0.1      |  7.0      |  1.5      |  562.3    |  1.0      |
|  32       | -1.402    |  0.4325   |  0.494    |  0.8048   |  0.3261   |  5.003    |  1.183    |  558.0    |  0.8654   |
|  33       | -1.062    |  8.95     |  0.8181   |  0.04032  |  0.1196   |  4.77     |  1.097    |  561.4    |  0.8052   |
|  34       | -1.1      |  5.789    |  0.4169   |  0.6392   |  0.4642   |  6.886    |  0.8808   |  569.4    |  0.9487   |
|  35       | -1.273    |  2.709    |  0.5831   |  0.1776   |  0.1859   |  2.302    |  1.384    |  293.7    |  0.8151   |
|  36       | -1.038    |  10.11    |  0.6726   |  0.08261  |  0.4333   |  5.931    |  1.285    |  286.2    |  0.9211   |
|  37       | -1.099    |  5.347    |  0.7501   |  0.01408  |  0.1971   |  6.885    |  0.9091   |  282.2    |  0.8723   |
|  38       | -1.074    |  8.893    |  0.3224   |  0.8484   |  0.4658   |  1.278    |  0.942    |  283.4    |  0.9964   |
|  39       | -1.163    |  13.16    |  0.3      |  1.0      |  0.1      |  2.266    |  0.5      |  288.6    |  1.0      |
|  40       | -1.067    |  12.28    |  0.3      |  0.0      |  0.5      |  6.834    |  1.5      |  281.2    |  0.8031   |
=========================================================================================================================


**Risk: Regulatory**

Annotated Samples Size: 65
To be used for parameters estimation: 45

|   iter    |  target   |   alpha   | colsam... |   gamma   | learni... | max_depth | min_ch... | n_esti... | subsample |
-------------------------------------------------------------------------------------------------------------------------
|  1        | -0.634    |  0.4007   |  0.5598   |  0.9457   |  0.2311   |  5.176    |  0.8188   |  754.5    |  0.8361   |
|  2        | -0.7769   |  0.3596   |  0.817    |  0.1019   |  0.1208   |  1.696    |  0.7175   |  552.6    |  0.9423   |
|  3        | -0.65     |  7.713    |  0.6203   |  0.6829   |  0.1289   |  1.041    |  0.7867   |  861.9    |  0.949    |
|  4        | -0.65     |  7.868    |  0.3402   |  0.397    |  0.3539   |  3.93     |  1.216    |  409.7    |  0.9293   |
|  5        | -0.65     |  10.33    |  0.6372   |  0.6311   |  0.2617   |  5.342    |  1.434    |  306.5    |  0.9166   |
|  6        | -0.65     |  8.907    |  0.8101   |  0.1318   |  0.1887   |  3.294    |  1.311    |  707.8    |  0.9639   |
|  7        | -0.6096   |  4.518    |  0.8846   |  0.4758   |  0.1651   |  1.63     |  1.069    |  414.2    |  0.8413   |
|  8        | -0.65     |  11.51    |  0.5639   |  0.9098   |  0.4053   |  5.112    |  1.325    |  646.5    |  0.8121   |
|  9        | -0.65     |  10.64    |  0.3396   |  0.04532  |  0.2224   |  3.354    |  1.353    |  408.5    |  0.998    |
|  10       | -0.6622   |  1.782    |  0.5775   |  0.02547  |  0.1564   |  1.213    |  0.7556   |  678.0    |  0.8641   |
|  11       | -0.6141   |  2.953    |  0.5196   |  0.2862   |  0.3428   |  1.947    |  1.462    |  905.9    |  0.8426   |
|  12       | -0.6892   |  1.95     |  0.3439   |  0.3788   |  0.4834   |  5.959    |  1.182    |  671.5    |  0.8683   |
|  13       | -0.65     |  19.92    |  0.849    |  0.7879   |  0.1698   |  3.648    |  0.5338   |  976.6    |  0.9056   |
|  14       | -0.65     |  9.48     |  0.3824   |  0.5275   |  0.1574   |  4.803    |  1.062    |  687.4    |  0.9909   |
|  15       | -0.65     |  17.03    |  0.5383   |  0.1769   |  0.4261   |  1.588    |  0.5148   |  390.3    |  0.8065   |
|  16       | -0.5781   |  2.14     |  0.3976   |  0.6766   |  0.1321   |  6.421    |  1.344    |  617.6    |  0.8268   |
|  17       | -0.65     |  14.56    |  0.3759   |  0.1913   |  0.4286   |  3.331    |  1.072    |  904.6    |  0.9314   |
|  18       | -0.65     |  15.89    |  0.6491   |  0.9024   |  0.1373   |  1.534    |  0.5274   |  560.6    |  0.9585   |
|  19       | -0.6179   |  4.277    |  0.555    |  0.7307   |  0.4937   |  4.455    |  1.365    |  514.8    |  0.949    |
|  20       | -0.65     |  11.81    |  0.7337   |  0.8675   |  0.2799   |  4.125    |  0.5778   |  276.1    |  0.8058   |
|  21       | -0.5844   |  1.561    |  0.3      |  0.6733   |  0.3454   |  5.761    |  1.03     |  611.0    |  0.918    |
|  22       | -0.65     |  11.07    |  0.6943   |  0.4496   |  0.3094   |  4.612    |  0.8393   |  615.9    |  0.8629   |
|  23       | -1.024    |  0.0      |  0.3      |  0.0      |  0.5      |  1.0      |  0.5      |  616.0    |  1.0      |
|  24       | -0.5974   |  2.187    |  0.6036   |  0.7042   |  0.4978   |  6.36     |  1.195    |  616.3    |  0.9587   |
|  25       | -0.5811   |  3.623    |  0.3      |  1.0      |  0.1      |  7.0      |  1.5      |  612.5    |  0.8      |
|  26       | -0.5349   |  4.09     |  0.7532   |  0.1634   |  0.4104   |  6.92     |  1.367    |  608.5    |  0.9585   |
|  27       | -0.6215   |  6.344    |  0.5103   |  0.359    |  0.3083   |  4.51     |  0.9547   |  609.3    |  0.9787   |
|  28       | -0.6287   |  3.111    |  0.7198   |  0.141    |  0.4278   |  6.779    |  1.062    |  605.2    |  0.8506   |
|  29       | -0.6082   |  5.0      |  0.3918   |  0.6282   |  0.4713   |  6.385    |  0.9347   |  620.2    |  0.9916   |
|  30       | -0.5962   |  7.681    |  0.9      |  0.0      |  0.5      |  7.0      |  1.5      |  606.5    |  0.8      |
|  31       | -0.6335   |  0.6167   |  0.9      |  1.0      |  0.1      |  7.0      |  1.5      |  622.2    |  0.8      |
|  32       | -0.6088   |  1.528    |  0.7577   |  0.3278   |  0.411    |  1.375    |  1.467    |  418.9    |  0.8788   |
|  33       | -0.6672   |  0.6888   |  0.4138   |  0.5163   |  0.4346   |  6.652    |  0.5832   |  415.7    |  0.9485   |
|  34       | -0.65     |  7.065    |  0.9      |  0.6726   |  0.1      |  1.0      |  1.5      |  419.1    |  0.8      |
|  35       | -0.6728   |  1.945    |  0.8615   |  0.7369   |  0.2115   |  5.804    |  0.8446   |  910.2    |  0.927    |
|  36       | -0.6469   |  3.401    |  0.8743   |  0.6481   |  0.2516   |  2.601    |  0.6929   |  900.8    |  0.9294   |
|  37       | -0.9489   |  0.0      |  0.6484   |  0.0      |  0.5      |  1.0      |  1.5      |  424.1    |  0.8917   |
|  38       | -0.65     |  12.77    |  0.8338   |  0.6879   |  0.4608   |  6.81     |  1.309    |  608.4    |  0.8454   |
|  39       | -0.6375   |  6.352    |  0.414    |  0.6834   |  0.359    |  6.472    |  0.9875   |  616.6    |  0.9475   |
|  40       | -0.65     |  8.382    |  0.8514   |  0.331    |  0.1921   |  6.688    |  0.7405   |  517.9    |  0.9153   |
=========================================================================================================================
Using normal split for label risk6 due to underrepresented levels. The level counts for that label are:
0.0    50
2.0     6
3.0     2
1.0     1
Name: risk6, dtype: int64
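The fallback above happens because stratified splitting requires at least two samples per level, and level 1.0 of risk6 has only one. With scikit-learn this surfaces as a `ValueError`, which can be caught to fall back to a plain random split. A minimal illustration with a hypothetical helper (the handler's real implementation may differ):

```python
import numpy as np
from sklearn.model_selection import train_test_split

def safe_split(X, y, test_size=0.3, seed=0):
    """Stratified split when every level has >= 2 samples, else a plain split.

    Hypothetical helper mirroring the notebook's fallback message.
    """
    try:
        return train_test_split(X, y, test_size=test_size,
                                stratify=y, random_state=seed)
    except ValueError:
        # Like risk6 above: a level with a single sample makes
        # stratification impossible, so a normal split is used.
        return train_test_split(X, y, test_size=test_size,
                                random_state=seed)

X = np.arange(20).reshape(-1, 1)
y = np.array([0] * 17 + [2, 2, 1])   # level 1 has a single sample
X_tr, X_te, y_tr, y_te = safe_split(X, y)
```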


**Risk: Energy supply issues**

Annotated Samples Size: 59
To be used for parameters estimation: 41

|   iter    |  target   |   alpha   | colsam... |   gamma   | learni... | max_depth | min_ch... | n_esti... | subsample |
-------------------------------------------------------------------------------------------------------------------------
|  1        | -0.587    |  12.73    |  0.8291   |  0.6905   |  0.2992   |  1.114    |  0.8315   |  228.8    |  0.8776   |
|  2        | -0.5623   |  5.529    |  0.6455   |  0.5687   |  0.4132   |  4.259    |  0.6743   |  605.0    |  0.8682   |
|  3        | -0.587    |  13.12    |  0.7924   |  0.7277   |  0.3834   |  5.808    |  0.9065   |  239.3    |  0.9327   |
|  4        | -0.587    |  16.62    |  0.6002   |  0.4284   |  0.1191   |  6.524    |  1.065    |  886.2    |  0.8222   |
|  5        | -0.587    |  13.36    |  0.5952   |  0.7449   |  0.1717   |  6.759    |  1.185    |  479.7    |  0.9776   |
|  6        | -0.5799   |  9.103    |  0.8876   |  0.1953   |  0.2015   |  5.784    |  0.7178   |  622.4    |  0.8564   |
|  7        | -0.587    |  19.71    |  0.6442   |  0.5601   |  0.4241   |  5.821    |  1.231    |  951.9    |  0.8061   |
|  8        | -0.6271   |  3.363    |  0.8408   |  0.6474   |  0.3758   |  6.05     |  1.209    |  710.0    |  0.8649   |
|  9        | -0.587    |  17.32    |  0.6912   |  0.9283   |  0.2531   |  6.839    |  0.7316   |  229.9    |  0.833    |
|  10       | -0.587    |  17.4     |  0.3115   |  0.5729   |  0.347    |  2.837    |  1.191    |  904.8    |  0.9567   |
|  11       | -0.5873   |  10.32    |  0.5781   |  0.6387   |  0.3746   |  2.053    |  0.6984   |  704.1    |  0.9274   |
|  12       | -0.5877   |  8.476    |  0.3876   |  0.7123   |  0.1279   |  5.637    |  1.264    |  678.8    |  0.9907   |
|  13       | -0.658    |  1.564    |  0.5358   |  0.4241   |  0.3498   |  2.466    |  0.5006   |  923.2    |  0.8219   |
|  14       | -0.587    |  18.42    |  0.6724   |  0.2568   |  0.2438   |  2.124    |  1.478    |  526.0    |  0.9889   |
|  15       | -0.587    |  19.88    |  0.4332   |  0.273    |  0.2203   |  5.586    |  0.8418   |  566.5    |  0.9692   |
|  16       | -0.5735   |  2.067    |  0.3779   |  0.822    |  0.3635   |  2.507    |  0.8111   |  288.8    |  0.9353   |
|  17       | -0.5818   |  8.058    |  0.7886   |  0.77     |  0.1621   |  6.767    |  1.3      |  689.9    |  0.8208   |
|  18       | -0.587    |  13.7     |  0.7474   |  0.806    |  0.3101   |  4.35     |  1.156    |  481.8    |  0.9216   |
|  19       | -0.556    |  2.946    |  0.6039   |  0.8929   |  0.4203   |  4.871    |  0.7708   |  574.7    |  0.9023   |
|  20       | -0.5682   |  4.747    |  0.8856   |  0.577    |  0.228    |  1.042    |  1.063    |  504.3    |  0.8343   |
|  21       | -0.6513   |  0.1337   |  0.6406   |  0.9842   |  0.1039   |  5.487    |  0.8636   |  588.3    |  0.949    |
|  22       | -0.5609   |  5.817    |  0.3443   |  0.3187   |  0.185    |  5.38     |  0.7205   |  570.2    |  0.9018   |
|  23       | -0.7742   |  0.0      |  0.9      |  1.0      |  0.5      |  1.0      |  0.8688   |  570.0    |  0.912    |
|  24       | -0.563    |  6.101    |  0.802    |  0.7602   |  0.347    |  4.579    |  1.275    |  572.8    |  0.9393   |
|  25       | -0.5639   |  5.709    |  0.7149   |  0.2042   |  0.2878   |  6.729    |  1.067    |  576.3    |  0.8202   |
|  26       | -0.5929   |  4.905    |  0.3      |  1.0      |  0.5      |  3.073    |  0.5      |  577.6    |  1.0      |
|  27       | -0.579    |  9.917    |  0.3239   |  0.03289  |  0.3406   |  6.707    |  1.194    |  572.9    |  0.9229   |
|  28       | -0.7393   |  1.264    |  0.5194   |  0.3378   |  0.4088   |  6.175    |  1.365    |  577.8    |  0.9217   |
|  29       | -0.57     |  4.615    |  0.8296   |  0.7711   |  0.2197   |  5.75     |  0.8064   |  572.6    |  0.9534   |
|  30       | -0.5653   |  6.846    |  0.3      |  0.0      |  0.5      |  6.53     |  0.5      |  574.0    |  0.8      |
|  31       | -0.5678   |  8.674    |  0.4545   |  0.02216  |  0.1068   |  4.87     |  1.07     |  575.9    |  0.8457   |
|  32       | -0.5783   |  8.788    |  0.4788   |  0.1732   |  0.2638   |  4.566    |  1.347    |  568.5    |  0.9074   |
|  33       | -0.5768   |  9.276    |  0.4384   |  0.1234   |  0.1766   |  2.502    |  0.708    |  573.2    |  0.9508   |
|  34       | -0.6179   |  3.177    |  0.583    |  0.3986   |  0.4085   |  6.703    |  1.495    |  605.5    |  0.805    |
|  35       | -0.5664   |  6.128    |  0.5565   |  0.9066   |  0.3004   |  1.147    |  0.5555   |  604.5    |  0.8346   |
|  36       | -0.5517   |  6.025    |  0.8965   |  0.1584   |  0.469    |  2.295    |  1.12     |  607.4    |  0.9848   |
|  37       | -0.5847   |  8.6      |  0.6871   |  0.8713   |  0.3003   |  2.006    |  1.157    |  606.1    |  0.8259   |
|  38       | -0.6337   |  2.482    |  0.528    |  0.4878   |  0.313    |  2.417    |  1.095    |  609.5    |  0.8245   |
|  39       | -0.5801   |  7.297    |  0.4879   |  0.9732   |  0.3663   |  4.493    |  0.5141   |  607.6    |  0.9684   |
|  40       | -0.5757   |  5.151    |  0.7761   |  0.4539   |  0.3696   |  2.781    |  1.486    |  601.7    |  0.8561   |
=========================================================================================================================
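Throughout these logs the `target` column is negative, which is consistent with the optimiser maximising the negative of a loss, so that higher (closer to zero) is better. The exact metric is not shown in the output; a minimal sketch of such a sign-flipped objective, assuming log-loss:

```python
from sklearn.metrics import log_loss

def target(y_true, y_prob):
    # Score to *maximise*: the negative of the loss, matching the sign
    # convention suggested by the negative "target" values in the logs
    # (the choice of log-loss here is an assumption).
    return -log_loss(y_true, y_prob)

# Better probabilistic predictions push the target toward 0;
# worse predictions make it more negative.
print(round(target([0, 1, 1, 0], [0.1, 0.8, 0.7, 0.2]), 3))
```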
In [5]:
from data.model import MODEL_BEST_PARAMS_PATH
# Persist the best hyperparameters found for each risk
pd.DataFrame(optimal_params).to_csv(MODEL_BEST_PARAMS_PATH)
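Persisting the parameters this way is a plain pandas CSV round trip: one column per risk, one row per hyperparameter, which can later be reloaded with `index_col=0`. A minimal sketch with made-up values and an illustrative file name (the real `optimal_params` holds the full parameter sets from the logs above):

```python
import pandas as pd

# Hypothetical stand-in for the notebook's optimal_params:
# a dict of best hyperparameters per risk.
optimal_params = {
    "Regulatory": {"alpha": 4.09, "max_depth": 6.92},
    "Energy supply issues": {"alpha": 6.025, "max_depth": 2.295},
}

pd.DataFrame(optimal_params).to_csv("best_params.csv")   # persist
restored = pd.read_csv("best_params.csv", index_col=0)   # reload
print(restored.loc["alpha", "Regulatory"])               # → 4.09
```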