Una revisión de redes MLP como clasificadores de múltiples clases

A survey on MLP neural networks as multi-class classifiers

  • Ricardo Majalca-Martínez, Instituto Tecnológico de Chihuahua
  • Pedro Rafael Acosta-Cano de los Ríos, Instituto Tecnológico de Chihuahua
Palabras clave: clasificador de múltiples clases MLP, red neuronal, entrenamiento MLP, aplicación de clasificadores.


Se presenta el estado actual de los clasificadores de múltiples clases implementados con redes Multi Layer Perceptron (MLP). Los clasificadores de múltiples clases basados en redes MLP han sido utilizados con éxito en muchos casos. Se presentan, primero, los aspectos generales y las diferentes formas de implementar clasificadores de múltiples clases, incluyendo las redes MLP. Después se presentan aspectos de arquitectura de las redes MLP clasificadoras, incluyendo consideraciones de diseño y organización tales como: capas de entrada, ocultas y de salida, así como la cantidad de neuronas en cada capa. Luego se presenta una revisión de las metodologías existentes para su entrenamiento y de cómo la organización de la red afecta las condiciones de entrenamiento. A continuación, se presentan casos de uso de las redes MLP como clasificadores, sus características y detalles acerca de los parámetros de diseño de la red, y también se revisan los resultados de su aplicación. En el material revisado, el desempeño parece depender en gran medida de la aplicación específica, aunque no existe trabajo que lo demuestre en forma concluyente.


The current state of multi-class classifiers implemented with Multi Layer Perceptron (MLP) networks is presented. Multi-class classifiers based on MLP neural networks have been successfully used in many cases. First, general aspects and existing approaches for implementing multi-class classifiers are introduced, including MLP neural networks. Afterwards, aspects of MLP network architecture are described, including design and organization considerations such as input, hidden, and output layers, as well as the number of neurons in each layer. Then follows a review of existing training methodologies and of how the network organization affects the training conditions. Next, cases of MLP networks used as classifiers are reviewed, considering their characteristics, details of the network design, and the results obtained in each application. Although the reviewed literature suggests that the performance of this kind of classifier largely depends on the specific application, no conclusive results exist on this point.

Keywords: MLP multi-class classifier, neural network, MLP training, classifiers application.
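The kind of architecture the survey covers, a feed-forward MLP with one hidden layer and a softmax output trained by backpropagation on cross-entropy loss, can be illustrated with a minimal sketch. The layer sizes, learning rate, epoch count, and toy data set below are illustrative assumptions, not values taken from the surveyed works:

```python
# Minimal MLP multi-class classifier: one sigmoid hidden layer,
# softmax output, trained by plain stochastic gradient descent on
# cross-entropy loss. Pure standard-library Python for clarity.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(z):
    m = max(z)                         # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

class MLPClassifier:
    def __init__(self, n_in, n_hidden, n_classes, seed=0):
        rnd = random.Random(seed)
        self.W1 = [[rnd.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.W2 = [[rnd.uniform(-0.5, 0.5) for _ in range(n_hidden)] for _ in range(n_classes)]
        self.b2 = [0.0] * n_classes

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(self.W1, self.b1)]
        z = [sum(w * hi for w, hi in zip(row, h)) + b
             for row, b in zip(self.W2, self.b2)]
        return h, softmax(z)           # hidden activations, class probabilities

    def predict(self, x):
        _, p = self.forward(x)
        return p.index(max(p))         # index of the most probable class

    def train(self, data, epochs=2000, lr=0.5):
        # data: list of (feature_vector, class_index) pairs
        for _ in range(epochs):
            for x, y in data:
                h, p = self.forward(x)
                # softmax + cross-entropy gradient at the output: p - onehot(y)
                dz = [pi - (1.0 if i == y else 0.0) for i, pi in enumerate(p)]
                # backpropagate through W2 and the sigmoid derivative h*(1-h)
                dh = [hi * (1 - hi) * sum(self.W2[k][j] * dz[k] for k in range(len(dz)))
                      for j, hi in enumerate(h)]
                for k in range(len(dz)):
                    for j in range(len(h)):
                        self.W2[k][j] -= lr * dz[k] * h[j]
                    self.b2[k] -= lr * dz[k]
                for j in range(len(dh)):
                    for i in range(len(x)):
                        self.W1[j][i] -= lr * dh[j] * x[i]
                    self.b1[j] -= lr * dh[j]

# Toy 3-class problem: points clustered near three corners of the unit square.
data = [([0.0, 0.0], 0), ([0.1, 0.1], 0),
        ([1.0, 0.0], 1), ([0.9, 0.1], 1),
        ([0.0, 1.0], 2), ([0.1, 0.9], 2)]
net = MLPClassifier(n_in=2, n_hidden=8, n_classes=3)
net.train(data)
```

Note that the softmax output makes this a *direct* multi-class classifier with one output neuron per class, in contrast to the one-vs-one and one-vs-all decompositions of binary classifiers also discussed in the reviewed literature.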


Allwein, E. L., R. E. Schapire & Y. Singer. 2001. Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers. Journal of Machine Learning Research 1:113–141. https://www.jmlr.org/papers/volume1/allwein00a/allwein00a.pdf

Anastassiou, G. A. 2011. Multivariate sigmoidal neural network approximation. Neural Networks 24(4):378–386. http://doi.org/10.1016/j.neunet.2011.01.003

Angulo, C., X. Parra & A. Català. 2003. K-SVCR. A support vector machine for multi-class classification. Neurocomputing 55(1-2):57–77. http://doi.org/10.1016/S0925-2312(03)00435-1

Barron, A. R. 1993. Universal Approximation Bounds for Superposition of a Sigmoid Function. IEEE Transactions on Information Theory 39(3):930–945. https://doi.org/10.1109/18.256500

Blum, C. & K. Socha. 2005. Training feed-forward neural networks with ant colony optimization: An application to pattern classification. Fifth International Conference on Hybrid Intelligent Systems (HIS '05). https://doi.org/10.1109/ICHIS.2005.104

Chattopadhyay, S. & G. Chattopadhyay. 2008. Identification of the best hidden layer size for three-layered neural net in predicting monsoon rainfall in India. Journal of Hydroinformatics 10(2):181-188. http://doi.org/10.2166/hydro.2008.017

Che, Z. G., T. A. Chiang & Z. H. Che. 2011. Feed-forward neural networks training: A comparison between genetic algorithm and back-propagation learning algorithm. International Journal of Innovative Computing, Information and Control 7(10):5839–5850.

Cheong, S., S. Oh & S. Lee. 2004. Support vector machines with binary tree architecture for multi-class classification. Neural Information Processing - Letters and Reviews 2(3):47–51. http://logos.mokwon.ac.kr/pub/NIPLR2004.pdf

Ciresan, D., U. Meier & J. Schmidhuber. 2012. Multi-column Deep Neural Networks for Image Classification. Conference on Computer Vision and Pattern Recognition (pp. 3642–3649). http://doi.org/10.1109/CVPR.2012.6248110

Cunha, R. H., I. Nunes, A. Goedtel & W. F. Godoy. 2015. A comprehensive evaluation of intelligent classifiers for fault identification in three-phase induction motors. Electric Power Systems Research 127:249–258. http://doi.org/10.1016/j.epsr.2015.06.008

Cybenko, G. 1989. Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals and Systems 2(4):303–314. https://doi.org/10.1007/BF02551274

Galar, M., A. Fernández, E. Barrenechea, H. Bustince & F. Herrera. 2011. An overview of ensemble methods for binary classifiers in multi-class problems: Experimental study on one-vs-one and one-vs-all schemes. Pattern Recognition 44(8):1761–1776. http://doi.org/10.1016/j.patcog.2011.01.017

Gardner, M. W. & S. R. Dorling. 1998. Artificial neural networks (the multilayer perceptron): A review of applications in the atmospheric sciences. Atmospheric Environment 32(14-15):2627–2636. https://doi.org/10.1016/S1352-2310(97)00447-0

Gertrudes, J. C., V. G. Maltarollo, R. A. Silva, P. R. Oliveira, K. M. Honório & A. B. F. Da Silva. 2012. Machine learning techniques and drug design. Current Medicinal Chemistry 19(25):4289–97. http://doi.org/10.2174/092986712802884259

Hagan, M. T., H. B. Demuth & M. H. Beale. 1995. Neural Network Design. PWS Publishing Company. https://hagan.okstate.edu/NNDesign.pdf

Hagan, M. T. & M. B. Menhaj. 1994. Training feedforward networks with the Marquardt algorithm. IEEE Transactions on Neural Networks 5(6):989–993. http://doi.org/10.1109/72.329697

Harp, S. A. & T. Samad. 1992. Optimizing neural networks with genetic algorithms. In Neural network computing for the electric power industry: proceedings of the 1992 INNS summer workshop (pp. 41–44). Psychology Press. ISBN 0805814671, 9780805814675.

Huang, G. B., Y. Q. Chen & H. A. Babri. 2000. Classification ability of single hidden layer feedforward neural networks. IEEE Transactions on Neural Networks 11(3): 799–801. http://doi.org/10.1109/72.846750

Huang, G. B. 2003. Learning capability and storage capacity of two-hidden-layer feedforward networks. IEEE Transactions on Neural Networks 14(2):274–281. http://doi.org/10.1109/TNN.2003.809401

Huang, G.B., D. H. Wang & Y. Lan. 2011. Extreme learning machines: a survey. International Journal of Machine Learning and Cybernetics 2(2):107–122. http://doi.org/10.1007/s13042-011-0019-y

Huynh, H. T., Y. Won & J.J. Kim. 2008. An improvement of extreme learning machine for compact single-hidden-layer feedforward neural networks. International Journal of Neural Systems 18(5):433–441. https://doi.org/10.1142/S0129065708001695

Irani, R. & R. Nasimi. 2011. Evolving neural network using real coded genetic algorithm for permeability estimation of the reservoir. Expert Systems with Applications 38(8): 9862–9866. http://doi.org/10.1016/j.eswa.2011.02.046

Jadav, K. & M. Panchal. 2012. Optimizing Weights of Artificial Neural Networks using Genetic Algorithms. International Journal of Advanced Research in Computer Science and Electronics Engineering 1(10):47–51.

Jayalakshmi, T. & A. Santhakumaran. 2011. Statistical normalization and back propagation for classification. International Journal of Computer Theory and Engineering 3(1):1–5. http://www.ijcte.org/papers/288-L052.pdf

Karlaftis, M. G. & E. I. Vlahogianni. 2011. Statistical methods versus neural networks in transportation research: Differences, similarities and some insights. Transportation Research Part C: Emerging Technologies 19(3):387–399. http://doi.org/10.1016/j.trc.2010.10.004

Khan, K. & A. Sahai. 2012. A Comparison of BA, GA, PSO, BP and LM for Training Feed forward Neural Networks in e-Learning Context. International Journal of Intelligent Systems and Applications 4(7):23–29. http://doi.org/10.5815/ijisa.2012.07.03

Kumar, M. P. 2012. Backpropagation Learning Algorithm Based on Levenberg Marquardt. In Computer Science & Information Technology (pp. 393–398). CS & IT-CSCP. http://doi.org/10.5121/csit.2012.2438

Lange, T., K. Mosler & P. Mozharovskyi. 2014. Fast nonparametric classification based on data depth. Statistical Papers 55(1):49–69. https://doi.org/10.1007/s00362-012-0488-4

Lee, Y., S.H. Oh & M. W. Kim. 1993. An analysis of premature saturation in back propagation learning. Neural Networks 6(5):719–728. http://doi.org/10.1016/S0893-6080(05)80116-9

Lee, J.S. & I.S. Oh. 2003. Binary classification trees for multi-class classification problems. In Seventh International Conference on Document Analysis and Recognition, 2003. Proceedings. (pp. 770–774). IEEE Comput. Soc. http://doi.org/10.1109/ICDAR.2003.1227766

Li, J., J.H. Chen, J.Y. Shi & F. Huang. 2012. Brief introduction of backpropagation (BP) neural network algorithm and its improvement. Advances in Computer Science and Information Engineering 169:553–558. https://doi.org/10.1007/978-3-642-30223-7_87

Lorena, A. C., A. C. P. L. F. De Carvalho & J. M. P. Gama. 2008. A review on the combination of binary classifiers in multiclass problems. Artificial Intelligence Review 30(2008):19–37. http://doi.org/10.1007/s10462-009-9114-9

Martínez, J., C. Iglesias, J. M. Matías, J. Taboada & M. Araújo. 2014. Solving the slate tile classification problem using a DAGSVM multiclassification algorithm based on SVM binary classifiers with a one-versus-all approach. Applied Mathematics and Computation 230(1):464–472. http://doi.org/10.1016/j.amc.2013.12.087

Mavrovouniotis, M. & S. Yang. 2015. Training neural networks with ant colony optimization algorithms for pattern classification. Soft Computing 19(6):1511–1522. http://doi.org/10.1007/s00500-014-1334-5

Mayoraz, E. & E. Alpaydin. 1999. Support vector machines for multi-class classification. In Engineering Applications of Bio-Inspired Artificial Neural Networks. IWANN 1999. Lecture Notes in Computer Science. Springer. https://doi.org/10.1007/BFb0100551

Misra, J. & I. Saha. 2010. Artificial neural networks in hardware: A survey of two decades of progress. Neurocomputing 74(1-3):239–255. http://doi.org/10.1016/j.neucom.2010.03.021

Moraes, R., J. F. Valiati & W. P. Gavião. 2013. Document-level sentiment classification: An empirical comparison between SVM and ANN. Expert Systems with Applications 40(2):621–633. http://doi.org/10.1016/j.eswa.2012.07.059

Müller, K. R., S. Mika, G. Rätsch, K. Tsuda, & B. Schölkopf. 2001. An introduction to kernel-based learning algorithms. IEEE Transactions on Neural Networks 12(2):181–201. http://doi.org/10.1109/72.914517

Ou, G. & Y. L. Murphey. 2007. Multi-class pattern classification using neural networks. Pattern Recognition 40(1):4–18. http://doi.org/10.1016/j.patcog.2006.04.041

Panchal, G., A. Ganatra, Y. Kosta & D. Panchal. 2011. Behaviour analysis of multilayer perceptrons with multiple hidden neurons and hidden layers. International Journal of Computer Theory and Engineering 3(2):332–337. http://www.ijcte.org/papers/328-L318.pdf

Sheela, K. G. & S. N. Deepa. 2013. Review on methods to fix number of hidden neurons in neural networks. Mathematical Problems in Engineering. http://doi.org/10.1155/2013/425740

Singh, A. K., S. Tiwari & V. P. Shukla. 2012. Wavelet based Multi Class image classification using Neural Network. International Journal of Computer Applications 37(4):21–25. http://dx.doi.org/10.5120/4597-6555

Stathakis, D. 2009. How many hidden layers and nodes? International Journal of Remote Sensing 30(8):2133–2147. http://doi.org/10.1080/01431160802549278

Tax, D. M. J. & R. P. W. Duin. 2002. Using two-class classifiers for multiclass classification. In 2002 International Conference on Pattern Recognition Vol. 2. (pp. 124–127). ICPR. http://doi.org/10.1109/ICPR.2002.1048253

Thabtah, F., P. Cowling & Y. Peng. 2005. MCAR: multi-class classification based on association rule. In The 3rd ACS/IEEE International Conference on Computer Systems and Applications (pp. 33). http://doi.org/10.1109/AICCSA.2005.1387030

Valtierra-Rodriguez, M., R. De Jesus Romero-Troncoso, R. A. Osornio-Rios & A. Garcia-Perez. 2014. Detection and classification of single and combined power quality disturbances using neural networks. IEEE Transactions on Industrial Electronics 61(5):2473–2482. http://doi.org/10.1109/TIE.2013.2272276

Vellido, A., P.J.G. Lisboa & J. Vaughan. 1999. Neural networks in business: a survey of applications (1992–1998). Expert Systems with Applications 17(1):51–70. http://doi.org/10.1016/S0957-4174(99)00016-0

Windeatt, T. & R. Ghaderi. 2003. Coding and decoding strategies for multi-class learning problems. Information Fusion 4(1):11–21. http://doi.org/10.1016/S1566-2535(02)00101-X

Wu, T.F., C.J. Lin & R. C. Weng. 2004. Probability Estimates for Multi-class Classification by Pairwise Coupling. J. Mach. Learn. Res. 5:975–1005. https://proceedings.neurips.cc/paper/2003/file/03e7ef47cee6fa4ae7567394b99912b7-Paper.pdf

Yeung, D.Y. & C. Chow. 2002. Parzen-window network intrusion detectors. In 2002 International Conference on Pattern Recognition Vol. 4. (pp. 385–388). http://doi.org/10.1109/ICPR.2002.1047476

Zhang, Y., S. Wang, G. Ji & P. Phillips. 2014. Fruit classification using computer vision and feedforward neural network. Journal of Food Engineering 143:167–177. http://doi.org/10.1016/j.jfoodeng.2014.07.001

How to cite
Majalca-Martínez, R., & Acosta-Cano de los Ríos, P. R. (2020). Una revisión de redes MLP como clasificadores de múltiples clases: A survey on MLP neural networks as multi-class classifiers. TECNOCIENCIA Chihuahua, 9(3), 148-159. https://doi.org/10.54167/tch.v9i3.587