Neural networks for uni- and multivariate regression

Keywords

Artificial neural networks
Regression
Genetic algorithm
Simulated annealing
Backpropagation
Optimization

How to Cite

1. Pereira GC, Custodio R. Redes neurais para regressão uni- e multivariada [Neural networks for uni- and multivariate regression]. Rev. Chemkeys [Internet]. 2021 Sep 9 [cited 2024 Mar 29];3(00):e021003. Available from: https://econtents.bc.unicamp.br/inpec/index.php/chemkeys/article/view/15880

Abstract

Artificial neural networks have gained prominence in the approximation of uni- and multivariate functions because of the high approximation capacity of this type of model. This article describes regression models based on neural networks, together with the algorithms commonly used to optimize them. The performance of this type of model is illustrated by approximating a univariate function that relates the liquid-phase mole fraction of one component of a water-acetone mixture to its vapor-phase mole fraction. The model's performance is also compared with that of other models, based on classical regression methods, applied to the same problem. The Python code for building the neural network model discussed here is presented at the end of the text.
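
As a minimal illustration of the kind of model the abstract describes (not the article's own code, which appears in the full text), the sketch below fits a small feedforward network to univariate data. It assumes scikit-learn's MLPRegressor is available, and the (x, y) pairs are synthetic placeholders standing in for the liquid- and vapor-phase mole-fraction data.

# Minimal sketch of univariate neural-network regression (assumes scikit-learn).
# The data below are synthetic placeholders, not the article's water-acetone data.
import numpy as np
from sklearn.neural_network import MLPRegressor

x_liquid = np.linspace(0.0, 1.0, 25).reshape(-1, 1)  # hypothetical liquid-phase mole fractions
y_vapor = np.tanh(3.0 * x_liquid).ravel()            # hypothetical vapor-phase mole fractions

# One hidden layer of sigmoid units trained with a gradient-based optimizer;
# the article also discusses genetic algorithms and simulated annealing.
model = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                     solver="adam", max_iter=5000, random_state=0)
model.fit(x_liquid, y_vapor)

print(model.predict([[0.5]]))  # predicted vapor-phase fraction at x = 0.5

With enough hidden units, a network of this form can approximate any continuous univariate mapping, which is why it can compete with the classical regression methods mentioned in the abstract.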

https://doi.org/10.20396/chemkeys.v3i00.15880

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Copyright (c) 2021 Gabriel César Pereira, Rogério Custodio
