| Field | Value |
| --- | --- |
| dc.creator | Ozturk, Muhammed Maruf |
| dc.creator | Ipekci, Deniz |
| dc.creator | Cankaya, İbrahim Arda |
| dc.date | 2020-11-01T00:00:00Z |
| dc.date.accessioned | 2021-12-03T11:29:45Z |
| dc.date.available | 2021-12-03T11:29:45Z |
| dc.identifier | 6582b7f5-ec17-4627-bc61-d909d056f6e0 |
| dc.identifier | 10.1016/j.neucom.2020.07.034 |
| dc.identifier | https://avesis.sdu.edu.tr/publication/details/6582b7f5-ec17-4627-bc61-d909d056f6e0/oai |
| dc.identifier.uri | http://acikerisim.sdu.edu.tr/xmlui/handle/123456789/92267 |
| dc.description | Hyperparameter optimization is a challenging process with the potential to improve machine learning algorithms. Because it imposes a substantial computational burden on machine learning tasks, few works have addressed tuning strategies for a specific algorithm. In this paper, an improved Stochastic Gradient Descent (SGD) based on Fisher Maximization is developed for tuning the hyperparameters of an Echo State Network (ESN), which has a wide range of applications. The results of the method are then compared with those of traditional Gradient Descent and Grid Search. According to the obtained results: (1) the scale of the data sets greatly affects the reliability of hyperparameter optimization results; (2) feature selection is critical for the mean training error when hyperparameter optimization is applied to methods such as ESN; (3) SGD converges to a good local minimum when Fisher Maximization is used to find a good starting point. (C) 2020 Elsevier B.V. All rights reserved. |
| dc.language | eng |
| dc.rights | info:eu-repo/semantics/closedAccess |
| dc.title | Optimizing echo state network through a novel fisher maximization based stochastic gradient descent |
| dc.type | info:eu-repo/semantics/article |
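
The abstract describes seeding SGD with a Fisher-Maximization-chosen starting point and then descending on an ESN's hyperparameters. A minimal sketch of that idea is below; it is not the paper's method. The tiny ESN, the two tuned hyperparameters (spectral radius and leaking rate), the finite-difference gradient, and the use of squared gradient norm as an empirical Fisher proxy for scoring starting points are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_esn(spectral_radius, n_res=50, seed=1):
    # Hypothetical minimal ESN reservoir: random recurrent weights
    # rescaled to the requested spectral radius, plus input weights.
    r = np.random.default_rng(seed)
    W = r.standard_normal((n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = r.standard_normal(n_res)
    return W, W_in

def esn_error(theta, series):
    # One-step-ahead prediction MSE of an ESN with hyperparameters
    # theta = (spectral_radius, leaking_rate) and a ridge readout.
    rho = float(np.clip(theta[0], 0.05, 1.2))
    leak = float(np.clip(theta[1], 0.05, 1.0))
    W, W_in = make_esn(rho)
    n = len(series) - 1
    states = np.zeros((n, W.shape[0]))
    x = np.zeros(W.shape[0])
    for t in range(n):
        pre = np.tanh(W @ x + W_in * series[t])
        x = (1 - leak) * x + leak * pre     # leaky-integrator update
        states[t] = x
    y = series[1:]
    A = states.T @ states + 1e-6 * np.eye(W.shape[0])
    w_out = np.linalg.solve(A, states.T @ y)
    return float(np.mean((states @ w_out - y) ** 2))

def fd_grad(f, theta, eps=1e-2):
    # Central finite-difference gradient (the paper's gradients may
    # be computed differently; this is a generic stand-in).
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return g

def tune(series, n_starts=5, steps=20, lr=0.05):
    f = lambda th: esn_error(th, series)
    cands = rng.uniform([0.2, 0.1], [1.1, 1.0], size=(n_starts, 2))
    # Starting-point selection: score candidates by squared gradient
    # norm (an empirical Fisher proxy, assumed here) and keep the max.
    scores = [np.sum(fd_grad(f, c) ** 2) for c in cands]
    theta = cands[int(np.argmax(scores))].copy()
    for _ in range(steps):  # plain gradient descent from that start
        theta -= lr * fd_grad(f, theta)
        theta = np.clip(theta, [0.05, 0.05], [1.2, 1.0])
    return theta, f(theta)

series = np.sin(np.linspace(0, 8 * np.pi, 200))
theta, err = tune(series)
```

The Fisher-proxy scoring is the interesting design choice: a starting point where the error surface has large gradients gives SGD informative early steps, which is one plausible reading of why a Fisher-Maximization-based start helps SGD reach a good local minimum.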