Show simple item record

dc.contributor.author  Chesang, S.
dc.contributor.author  Muasya, T. K.
dc.contributor.author  Ngeno, K.
dc.date.accessioned  2024-03-06T07:44:18Z
dc.date.available  2024-03-06T07:44:18Z
dc.date.issued  2022
dc.identifier.citation  Chesang, S., Muasya, T. K. and Ngeno, K. (2022). Application of artificial neural network to evaluate extent of non-linearity among explanatory variables within and between genotypes and phenotypes. In: Isutsa, D. K. (Ed.), Proceedings of the 8th International Research Conference held at Chuka University from 7th to 8th October 2021, Chuka, Kenya, pp. 124-134.  en_US
dc.identifier.uri  http://repository.chuka.ac.ke/handle/chuka/16025
dc.description  csumukwo@chuka.ac.ke; chesangsumukwo@gmail.com; aarapngeno@gmail.com  en_US
dc.description.abstract  Artificial neural networks (ANN) are among the models used for marker-based genomic prediction of complex traits in animal breeding. They accommodate noisy, non-linear data sets and make decisions based on prior knowledge. This study evaluated the extent of non-linearity among explanatory variables within and between genotypes and phenotypes using ANN. A feedforward ANN with varying numbers of neurons was adopted, and the Levenberg-Marquardt backpropagation algorithm was used to train the network. The network was constructed and trained in MATLAB (matrix laboratory). Mean absolute error (MAE) and Pearson's correlation coefficient (R) were used to measure the predictive performance of the ANN as an indicator of the extent of non-linearity among explanatory variables within and between genotypes and phenotypes. Results showed that the ANN models differed in predictive performance depending on the number of neurons in the hidden layer: the network with a single hidden layer of 10 neurons yielded a high R of 0.86 and an MAE of 2.98E-3; widening the hidden layer to 16 neurons reduced R to 0.67 and increased MAE to 7.73E-2; and a further increase to 32 neurons yielded an R of 0.27 and an MAE of 7.60E-2. The benchmark linear model had an R of 0.77 and an MAE of 5.72. A model with 10 neurons is therefore sufficient to handle the non-linearity in this kind of data set and was chosen as the best non-linear model, since reducing the number of neurons in the hidden layer led to higher, more accurate and more consistent predictions of growth rate. The best non-linear model outperformed the linear model, although the more complex non-linear architectures with 16 and 32 neurons did not; linear models can therefore also produce reliable genomic predictions. Keywords: Artificial neural network, Backpropagation, Mean absolute error  en_US
dc.description.sponsorship  Chuka University  en_US
dc.language.iso  en  en_US
dc.publisher  Chuka University  en_US
dc.subject  Artificial neural network; Backpropagation; Mean absolute error  en_US
dc.title  APPLICATION OF ARTIFICIAL NEURAL NETWORK TO EVALUATE EXTENT OF NON-LINEARITY AMONG EXPLANATORY VARIABLES WITHIN AND BETWEEN GENOTYPES AND PHENOTYPES  en_US
dc.type  Article  en_US
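The abstract above describes a feedforward network with a single hidden layer, trained with the Levenberg-Marquardt backpropagation algorithm in MATLAB and scored by MAE and Pearson's R. The sketch below illustrates that workflow under stated assumptions: it requires MATLAB's Deep Learning Toolbox, and the genotype matrix X and phenotype vector T are hypothetical placeholders, since the paper's data and scripts are not part of this record.

    % Minimal sketch of the workflow described in the abstract (not the
    % authors' script). Assumes the Deep Learning Toolbox; X and T are
    % hypothetical placeholders for genotypes (inputs x samples) and
    % phenotypes (1 x samples).
    X = rand(50, 200);                   % placeholder genotype matrix
    T = rand(1, 200);                    % placeholder phenotype vector

    net = feedforwardnet(10, 'trainlm'); % one hidden layer of 10 neurons,
                                         % Levenberg-Marquardt training
    net = train(net, X, T);              % train the network

    Y = net(X);                          % network predictions
    maeVal = mean(abs(T - Y));           % mean absolute error (MAE)
    R = corrcoef(T, Y);                  % Pearson's correlation matrix
    fprintf('MAE = %.3g, R = %.3g\n', maeVal, R(1, 2));

Changing the first argument of feedforwardnet to 16 or 32 reproduces the hidden-layer sizes whose predictive performance the abstract compares.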

