LEADER 04646nam 22006135 450 001 9911007495103321 005 20250523130345.0 010 $a3-031-88091-9 024 7 $a10.1007/978-3-031-88091-9 035 $a(CKB)39124520700041 035 $a(DE-He213)978-3-031-88091-9 035 $a(MiAaPQ)EBC32127769 035 $a(Au-PeEL)EBL32127769 035 $a(EXLCZ)9939124520700041 100 $a20250523d2025 u| 0 101 0 $aeng 135 $aur||||||||||| 181 $ctxt$2rdacontent 182 $cc$2rdamedia 183 $acr$2rdacarrier 200 10$aFractional-Order Activation Functions for Neural Networks $eCase Studies on Forecasting Wind Turbines' Generated Power /$fby Kishore Bingi, Ramadevi Bhukya, Venkata Ramana Kasi 205 $a1st ed. 2025. 210 1$aCham :$cSpringer Nature Switzerland :$cImprint: Springer,$d2025. 215 $a1 online resource (XVII, 238 p. 135 illus., 134 illus. in color.) 225 1 $aStudies in Systems, Decision and Control,$x2198-4190 ;$v588 311 08$a3-031-88090-0 327 $aIntroduction -- Fractional-order Activation Functions -- Fractional-order Neural Networks -- Forecasting of Texas Wind Turbines' Generated Power -- Forecasting of Jeju Island's Wind Turbines' Generated Power -- Forecasting of Renewable Energy Using Fractional-Order Neural Networks -- Fractional Feedforward Neural Network-Based Smart Grid Stability Prediction Model. 330 $aThis book presents the development of single and multi-layer fractional-order neural networks that incorporate fractional-order activation functions derived using fractional-order derivatives. Activation functions are essential in neural networks as they introduce nonlinearity, enabling the models to learn complex patterns in data. However, traditional activation functions have limitations such as non-differentiability, vanishing gradient problems, and inactive neurons at negative inputs, which can degrade the performance of neural networks, especially for tasks involving intricate nonlinear dynamics. To address these issues, fractional-order derivatives from fractional calculus have been proposed. These derivatives can model complex systems with non-local or non-Markovian behavior. 
The aim is to improve wind power prediction accuracy using datasets from the Texas wind turbine and Jeju Island wind farm under various scenarios. The book explores the advantages of fractional-order activation functions in terms of robustness, faster convergence, and greater flexibility in hyper-parameter tuning. It includes a comparative analysis of single and multi-layer fractional-order neural networks versus conventional neural networks, assessing their performance based on metrics such as mean square error and coefficient of determination. The impact of imputing missing data with machine learning models on network performance is also discussed. This book demonstrates the potential of fractional-order activation functions to enhance neural network models, particularly in predicting chaotic time series. The findings suggest that fractional-order activation functions can significantly improve accuracy and performance, emphasizing the importance of advancing activation function design in neural network analysis. Additionally, the book is a valuable teaching and learning resource for undergraduate and postgraduate students conducting research in this field. 410 0$aStudies in Systems, Decision and Control,$x2198-4190 ;$v588 606 $aElectric power production 606 $aEngineering mathematics 606 $aEngineering$xData processing 606 $aProduction engineering 606 $aMechanical Power Engineering 606 $aMathematical and Computational Engineering Applications 606 $aProcess Engineering 615 0$aElectric power production. 615 0$aEngineering mathematics. 615 0$aEngineering$xData processing. 615 0$aProduction engineering. 615 14$aMechanical Power Engineering. 615 24$aMathematical and Computational Engineering Applications. 615 24$aProcess Engineering. 
676 $a621.31 700 $aBingi$b Kishore$4aut$4http://id.loc.gov/vocabulary/relators/aut$01226854 702 $aBhukya$b Ramadevi$4aut$4http://id.loc.gov/vocabulary/relators/aut 702 $aKasi$b Venkata Ramana$4aut$4http://id.loc.gov/vocabulary/relators/aut 801 0$bMiAaPQ 801 1$bMiAaPQ 801 2$bMiAaPQ 906 $aBOOK 912 $a9911007495103321 996 $aFractional-Order Activation Functions for Neural Networks$94388862 997 $aUNINA