Online Sequential Extreme Learning Machine with Kernels

The paper proposes an application of the Kernel Recursive Least-Squares algorithm to the training of kernel-based Extreme Learning Machines. It has been published in IEEE Transactions on Neural Networks and Learning Systems. Below you will find detailed instructions on how to reproduce the experiments from the paper. If you use this code or a derivative thereof in your research, please cite the paper.

Please also consider citing the Kernel Methods Toolbox, which is used in the code.

The code from the paper has been implemented in the Lynx MATLAB toolbox. An alternative, standalone version of the KOS-ELM algorithm can be downloaded from here. That standalone code is basic and has not been extensively debugged; it is provided as-is, for comparison purposes.


Step 1 – Install Lynx

Download the latest version from the master branch of the toolbox:

https://github.com/ispamm/Lynx-Toolbox/

After downloading it, install it using the “install” script in the root folder. Refer to Chapter 3 of the user manual for more information on the guided procedure.
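From the MATLAB command window, this step amounts to the following (a minimal sketch; it assumes only the “install” script named above):

```matlab
% From the root folder of the downloaded toolbox, start the
% guided installation described in Chapter 3 of the user manual
install
```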

Step 2 – Download the datasets

Lynx comes with a small set of preinstalled datasets. To download all the datasets required by the article, run the “download_datasets” script. Please check that the following datasets are installed: uci_wdbc, sylva, mackey_glass.
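As a sketch, the step above can be carried out from the MATLAB command window as follows (only the “download_datasets” script and the dataset identifiers are taken from this page):

```matlab
% Fetch the additional datasets required by the article
download_datasets

% After the script completes, the following dataset identifiers
% should be available: uci_wdbc, sylva, mackey_glass
```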

Step 3 – Define the configuration

Lynx works by defining the details of a simulation in a configuration file. To this end, create a file called “config_koselm.m” in the “configs” folder. Below is the code for running a simulation on the WDBC dataset:
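The original listing is not reproduced on this page; the fragment below is a hedged sketch of what such a configuration file might look like. Only “add_dataset”, the dataset identifier “uci_wdbc”, and the class names “StandardRLS” (K-ELM) and “OnlineRLS” (KOS-ELM) are taken from this page; the “add_model” helper, its argument order, and the identifiers are assumptions to be checked against the user manual:

```matlab
% config_koselm.m -- sketch of a Lynx configuration for the WDBC experiment.
% NOTE: add_model and the identifiers below are assumptions, not the
% original listing; see Section 3.1 of the user manual for the real syntax.

add_dataset('W', 'WDBC', 'uci_wdbc');        % binary classification dataset

add_model('KELM', 'K-ELM', @StandardRLS);    % batch kernel ELM
add_model('KOSELM', 'KOS-ELM', @OnlineRLS);  % online kernel ELM

% Kernel and regularization parameters should be set according to
% Table 3 of the paper (KOS-ELM expects the INVERSE of the
% regularization value reported there).
```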

Step 4 – Run the simulation

To run the simulation, execute the script “run_simulation” and select the file you created previously. A few comments:

  • If you change the dataset (line 17 of the configuration file), remember to change the parameters of the models accordingly. Please refer to Section 3.1 of the user manual for details on the syntax of the function “add_dataset”, and to Table 3 of the paper for the optimal parameters. Additionally, note that the regularization parameter actually passed to KOS-ELM is the inverse of the value in Table 3 (for compatibility with older versions).
  • The K-ELM and KOS-ELM algorithms are implemented in the classes “StandardRLS” and “OnlineRLS”, respectively (the names are due to historical reasons, inherited from previous versions of the toolbox).
  • To repeat the grid-search procedure for the parameters, refer to Section 3.4.1 of the user manual (in the folder “manual”).
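Putting the steps together, the whole simulation can be launched from the MATLAB command window as follows (a sketch; only the “run_simulation” script is named on this page, and the file-selection step is interactive):

```matlab
% Launch the simulation; Lynx will ask you to select a configuration
% file -- choose the config_koselm.m created in Step 3
run_simulation
```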