Regularization by Intrinsic Plasticity and Its Synergies with Recurrence for Random Projection Methods

Neumann, Klaus and Emmerich, Christian and Steil, Jochen J. (2012) Regularization by Intrinsic Plasticity and Its Synergies with Recurrence for Random Projection Methods. Journal of Intelligent Learning Systems and Applications, 04 (03). pp. 230-246. ISSN 2150-8402

JILSA20120300008_65071686.pdf - Published Version

Abstract

Neural networks based on high-dimensional random feature generation have become popular under the notions of extreme learning machine (ELM) and reservoir computing (RC). We provide an in-depth analysis of such networks with respect to feature selection, model complexity, and regularization. Starting from an ELM, we show how recurrent connections increase the effective complexity, leading to reservoir networks. In contrast, intrinsic plasticity (IP), a biologically inspired, unsupervised learning rule, acts as a task-specific feature regularizer, which tunes the effective model complexity. Combining both mechanisms in the framework of static reservoir computing, we achieve an excellent balance of feature complexity and regularization, which provides an impressive robustness to other model selection parameters like network size, initialization ranges, or the regularization parameter of the output learning. We demonstrate the advantages on several synthetic data sets as well as on benchmark tasks from the UCI repository, providing practical insights into how to use high-dimensional random networks for data processing.
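The abstract's starting point, an ELM, can be illustrated with a minimal sketch: a fixed random projection into a high-dimensional feature space followed by a ridge-regularized linear readout. This is a generic ELM illustration, not the authors' implementation; the toy task, network size, and regularization parameter `alpha` (the "regularization parameter of the output learning" mentioned above) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumed for illustration): y = sin(3x) on [-1, 1].
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])

# Extreme learning machine: the input weights and biases are drawn at
# random and then kept fixed -- only the linear readout is trained.
n_hidden = 100
W_in = rng.uniform(-1, 1, size=(1, n_hidden))  # fixed random input weights
b = rng.uniform(-1, 1, size=n_hidden)          # fixed random biases

# High-dimensional random features from a tanh hidden layer.
H = np.tanh(X @ W_in + b)

# Ridge (Tikhonov) regression for the output weights; alpha is the
# regularization parameter of the output learning.
alpha = 1e-3
W_out = np.linalg.solve(H.T @ H + alpha * np.eye(n_hidden), H.T @ y)

y_hat = H @ W_out
mse = np.mean((y_hat - y) ** 2)
print(f"training MSE: {mse:.6f}")
```

Recurrent (reservoir) variants would additionally feed the hidden state back through a random recurrent weight matrix, and IP would adapt the slopes and biases of the tanh units before the readout is trained; both extensions leave the readout learning above unchanged.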

Item Type: Article
Subjects: Bengali Archive > Engineering
Depositing User: Unnamed user with email support@bengaliarchive.com
Date Deposited: 06 Feb 2023 07:30
Last Modified: 25 May 2024 09:26
URI: http://science.archiveopenbook.com/id/eprint/144
