The sRVM is a powerful smoothness prior extension of Tipping's Relevance Vector Machine and can be used for arbitrary (sparse) regression or shrinkage tasks for which you might currently use e.g. an RVM, an SVM or wavelet shrinkage. Although the default settings (see below) give pretty good results for many types of signals one might encounter in practice, best results are obtained through judicious selection of "kernel" and "priorType". The scheme is efficient -- the per-step cost is linear with wavelets (and similarly for other kernels that can be implemented via efficient transforms rather than matrix multiplication), and otherwise cubic in the number of selected components (which is typically quite small).

Please refer to our paper "sRVM: a smoothness prior extension of the Relevance Vector Machine" (Machine Learning, 68(2):107-135, 2007, doi:10.1007/s10994-007-5012-z) for a detailed discussion, and to the Quick Intro section below for a short practical introduction.

Installation

  • unpack (e.g. in /home/joe/matlab/):
> tar xzf srvm.tgz
  • Edit your startup.m file to include /home/joe/matlab/srvm and /home/joe/matlab/awms in the matlab search-path, by adding something like:
addpath('/home/joe/matlab/awms');
addpath('/home/joe/matlab/srvm');

Dependencies

The code has been tested under Matlab 7 -- earlier versions of Matlab might require some modifications.

No particular third-party library is required, but in order to use the symmlet or haar kernels some wavelet toolkit must be installed. The simplest choice is the free WaveLab package, because the code in @wtMat and makeKernel expects interface-compatibility with the WaveLab functions MakeONFilter, FWT_PO and IWT_PO. However, a straightforward wrapper around any other wavelet toolkit should also suffice. The MakeSignal function from WaveLab is also used by rvmopts to create test signals, but an alternative makeSignal function is supplied and will be used if MakeSignal is not on the path.

Quick Intro

For a quick illustration, create a signal (yy in the example) and add noise to it to obtain some fake regression targets tt:

>> yy = sinc(linspace(-10,10,128)'./pi);
>> tt = yy + 1/7 .* std(yy) .* randn(128,1);

Then, to do sRVM regression with the default options, type:

>> opts = rvmopts(struct('tt',tt));
>> stats = rvm(opts);

With the default plot options the targets appear as blue dots and the regression result yy_hat as a magenta line. You might also want to plot yy, tt and the regression result stats.yy_hat individually and check how well the noise has been estimated (stats.sigma_sq / var(yy-tt) should be close to 1).
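
For instance, a quick way to eyeball the fit and the noise estimate by hand (these plotting commands are not part of the package, just ordinary Matlab):

>> plot(tt, 'b.'); hold on; plot(yy, 'g'); plot(stats.yy_hat, 'm'); hold off
>> stats.sigma_sq / var(tt - yy)   % close to 1 means the noise was estimated well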

Note that for the default symmlet wavelet kernels N=length(tt) must be a power of 2. To use, for example, a Gaussian kernel with width r=3.0 instead (which does not have this restriction and also doesn't require WaveLab or a similar wavelet package), do:

>> opts = rvmopts(struct('tt', tt, 'kernel', 'gauss', 'r', 3.0))

(You may notice that this type of kernel is particularly well suited to simple Sinc-style datasets.)

Here is a slightly more interesting example that the classical RVM would have some trouble with (try setting opts.priorType = 'None' to obtain classical RVM behavior for any of the examples; also try zooming into the left part of the plot):

>> opts = rvmopts(struct('sig', 'Doppler', 'N', 4096, 'kernel', 'symmlet','SNR', 6.0));
>> stats = rvm(opts);

Reproducing results from the article

Most of the figures in the article can be obtained quite straightforwardly by plugging the right parameters into the following template:

>> stats = rvm(rvmopts(struct('sig', sig, 'N', N, 'kernel', kernel, ...
    'r',  r, 'SNR', SNR, 'priorType', priorType, 'seeds', seeds)))

seeds(1) is the random seed used to construct the noisy targets from the signal; seeds(2) is the random seed used to control the order of component selection. All other options should be clear from the article (otherwise see Calling rvmopts below); note that r is simply ignored for symmlets. Multiple runs with different values for seeds(2) give an indication of the robustness of the solution for a particular data set.
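
Since the last point comes up often, here is a small sketch of such a robustness check (the particular seeds(2) values 20, 21, 22 are arbitrary; the remaining parameters are those of the Sinc/lspline setting below):

>> for s2 = [20 21 22]   % arbitrary selection-order seeds
       stats = rvm(rvmopts(struct('sig', 'Sinc', 'N', 128, 'kernel', 'lspline', ...
           'r', 3.0, 'SNR', 2.0, 'priorType', 'BIC', 'seeds', [8 s2])));
   end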

Sinc and Bumps data (Figures 1 and 3)

Set priorType = 'None' for Figure 1 and priorType = 'BIC' for Figure 3. The parameters for the subfigures, from left to right and top to bottom, are then (a concrete instantiation is shown after the list):

sig='Sinc';  SNR=2.0; N=128; kernel='symmlet';         seeds = [8 24];
sig='Sinc';  SNR=2.0; N=128; kernel='lspline'; r=3.0;  seeds = [8 24];
sig='Bumps'; SNR=7.0; N=128; kernel='lspline'; r=3.0;  seeds = [10 30];
sig='Bumps'; SNR=7.0; N=128; kernel='symmlet';         seeds = [10 30];
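
For instance, the first (top-left) subfigure of Figure 3 should then correspond to:

>> stats = rvm(rvmopts(struct('sig', 'Sinc', 'N', 128, 'kernel', 'symmlet', ...
       'SNR', 2.0, 'priorType', 'BIC', 'seeds', [8 24])));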

Individual Basis functions (Figure 2)

To examine individual basis functions for different kernels, try something like:

>> symmlet = makeKernel('symmlet', 512, struct('pseudoMatrix', false));
>> lspline = makeKernel('lspline', 512, struct('r', 3.0));
>> js=[1,10,23,230];
>> for i=1:4, subplot(4,1,i); plot(symmlet(:,js(i))); end
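
The linear spline kernel created above can be inspected in the same way:

>> for i=1:4, subplot(4,1,i); plot(lspline(:,js(i))); end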

Doppler data (Figure 5)

Again both priorType = 'None' and priorType = 'BIC' are used; the other options are:

sig='Doppler'; SNR=7.0; N=1024; kernel='symmlet';        seeds = [10 20];    
sig='Doppler'; SNR=7.0; N=1024; kernel='gauss';  r=3.0;  seeds = [10 20];
sig='Doppler'; SNR=7.0; N=1024; kernel='gauss';  r=0.05; seeds = [10 20];

N.B.: In general, with bad parameter values (i.e. kernel, r and priorType ill-suited to the data, and especially with priorType='None', which can lead to a huge number of active components that degrades performance and numerics considerably -- remember that the computational load scales with the number of active components: cubically for normal kernels, linearly for wavelets), convergence can take pretty much forever, especially since the convergence criterion is quite lenient in order to avoid premature stopping. For N=1024 this will be more noticeable than in the N=128 Sinc and Bumps experiments above. So, apart from yielding poor results, the non-symmlet, non-smoothness-prior trials for the Doppler data will take much longer than those with a smoothness prior, and it seems possible that not all seedings will converge. Addressing this shortcoming of the classical RVM is of course the whole point of the sRVM, but to get a quicker impression of how the classical RVM does without relaxing the convergence criterion, either set maxSteps to force early termination or specify 'plot', 2, which will periodically plot the current solution as the search progresses.
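
For example, to get such a quick impression of the classical RVM on the narrow Gaussian kernel setting of Figure 5, one might try something along the following lines (the maxSteps value is just an illustrative cap, not taken from the article):

>> opts = rvmopts(struct('sig', 'Doppler', 'N', 1024, 'kernel', 'gauss', 'r', 0.05, ...
       'SNR', 7.0, 'priorType', 'None', 'seeds', [10 20], 'plot', 2, 'maxSteps', 5000));
>> stats = rvm(opts);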

Shrinkage Plots (Figure 6)

To make a simple shrinkage plot, just add 'computeLS', true to the parameters, run, and then:

>> plot(stats.mmu_LS, stats.mmu_LS, 'k', stats.mmu_LS, stats.mmu_M, 'r.');
>> axis(.5*[-1 1 -1 1]); axis square; grid
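
Put together for, say, the symmlet Doppler setting of Figure 5 (parameters as above, with computeLS added), the whole sequence might look like:

>> opts = rvmopts(struct('sig', 'Doppler', 'N', 1024, 'kernel', 'symmlet', ...
       'SNR', 7.0, 'priorType', 'BIC', 'seeds', [10 20], 'computeLS', true));
>> stats = rvm(opts);
>> plot(stats.mmu_LS, stats.mmu_LS, 'k', stats.mmu_LS, stats.mmu_M, 'r.');
>> axis(.5*[-1 1 -1 1]); axis square; grid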

Detailed Description

The only files that the user would normally want to use directly are rvmopts.m and rvm.m (and, to a lesser extent, plotRvm.m and makeKernel.m).

Calling rvm

rvm expects a structure with options (opts in these examples) and returns a structure with results (stats in these examples).

Option Structure (opts)

Typically one will want to create the options using the rvmopts command (see Calling rvmopts below), but to just vary a certain option in an existing opts structure without modifying the original you might find it useful to use the restruct utility in awms, e.g. to turn off plotting and verbose messages try rvm(restruct(opts, 'plot', false, 'chatty', false)).

Result Structure (stats)

The most important fields in stats are as follows (a small usage example follows the list):

yy_hat
The mean posterior prediction for the true signal yy.
mmu_M
The posterior mean regression weights.
sel
A boolean vector signifying the selected components.
mmu
mmu_M(sel).
sigma_sq
The estimated value for the noise variance.
aalpha
The final values for the weight precision vector aalpha.
Post
The log posterior probability of the result (the quantity \hat{\mathcal{L}} in the article).
steps
The number of steps that it took to reach the result.
realSteps
The number of steps during which an actual addition/deletion/reestimation happened.
converged
False if the model just stopped because maxSteps was exceeded, true otherwise.
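
For example, after the Quick Intro run above (where the true signal yy is known), one might inspect the result along these lines:

>> mean((stats.yy_hat - yy).^2)      % error against the true signal
>> nnz(stats.sel)                    % number of selected components
>> stats.sigma_sq / var(tt - yy)     % noise estimate vs. actual noise variance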

Calling rvmopts

rvmopts takes a structure with parameters for the sRVM and supplies some sensible default values for everything that is not specified -- in the most extreme case, if nothing is specified at all (opts = rvmopts(struct())), some fake data is generated for demonstration, too. Typically one would want to specify at least the regression targets tt and the kernel type, which can be done as follows (a combined example is given after the list):

  • to specify the targets for regression one can either
    1. pass a column vector tt (and optionally the true signal yy if known, for calculating the MSE of the result yy_hat)
    2. pass N (default 512), the number of data points, and sig, a signal name that is either 'Sinc' or recognized by MakeSignal (or makeSignal) (default 'Doppler'). To control the amount of noise in the targets one can specify either:
      SNR
      the signal to noise ratio of the desired targets or
      sigma
      the std of the noise.
      The seeding for the random noise added to the signal is controlled by the first entry in the seeds vector.
  • to specify the kernel to use one can either pass
    1. PPhi -- a kernel matrix or
    2. kernel, a kernel name that is recognized by makeKernel (e.g. 'symmlet' (default), 'lspline', 'gauss' or 'tpspline'). For some kernels one should also supply a width parameter r (default 3.0).
  • the amount of sparsity is controlled by setting priorType which can be one of (in increasing order of sparsity): None (for classical RVM behavior); AIC; BIC (default); or RIC.
  • to control the noise reestimation process one can either:
    1. set reest_sigma to a certain period of realSteps at which noise reestimation is to occur (default 10) and optionally set sigma_0, the initial guess (default (std(tt)/100).^2). An imaginary value will result in a multiple of the true sigma, if it is also specified.
    2. set reest_sigma to 0 to turn off noise reestimation and give the desired noise level via the sigma parameter. A multiple of the true sigma can be specified by setting sigma_0 to an imaginary value, e.g. 2j.
  • the seeding of the random number generator for fake target creation and the order in which component selection proceeds are controlled by the first and second element, respectively, of the 2-element vector seeds (default rand(2,1)).
  • further options include:
    plot
    Plot results if nonzero; if > 1 also plot progressive results during run.
    chatty
    An integer >=0 that controls the verbosity of printouts during execution.
    save
    A boolean that controls whether the experiment is saved in a .mat-file, the name of which is partly controlled by the DUB parameter.
    maxSteps
    The maximum number of steps allowed before the search is terminated (default 1E6).
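
As a combined illustration of several of these options (the particular values are merely illustrative, not recommendations):

>> opts = rvmopts(struct('sig', 'Doppler', 'N', 1024, 'SNR', 7.0, ...
       'kernel', 'gauss', 'r', 3.0, 'priorType', 'BIC', ...
       'seeds', [10 20], 'plot', 2, 'chatty', 1, 'maxSteps', 1e5));
>> stats = rvm(opts);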

Calling plotRvm

If the opts structure passed to rvm has opts.plot == true (the default) then results will be plotted automatically, but you can also create a plot manually by using plotRvm(opts, stats).

Calling makeKernel

The desired kernel can in most cases be created by just passing the appropriate parameters to rvmopts, e.g.

opts = rvmopts(struct('kernel', 'gauss', 'r', 3.0, ...));

However, calling makeKernel directly can be useful to build more customized (e.g. overcomplete) kernels (a sketch is given after the list below). The usage is:

>> K = makeKernel(type, N, kernelOpts)

where [N,N] = size(K) and type can currently be one of:

  • symmlet, haar, gauss, lspline (linear spline) and tpspline (thin plate spline)
  • the optional kernelOpts structure can be used to specify:
    1. the kernel width (r) for kernels for which it makes sense (e.g. the spline kernels)
    2. via the pseudoMatrix option (default true), whether K will be a proper matrix or just an internally more efficient matrix-like object, for kernels where matrix multiplication is mathematically equivalent to a much more efficient transform (currently the wavelet kernels, i.e. symmlet and haar).
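
As a sketch of how a more customized, overcomplete kernel might be assembled and passed in via PPhi (whether rvm copes with a non-square PPhi built this way is not guaranteed here; treat it purely as an illustration of the PPhi option, assuming tt is a column vector of 512 targets):

>> K1 = makeKernel('lspline', 512, struct('r', 3.0));
>> K2 = makeKernel('gauss', 512, struct('r', 3.0));
>> PPhi = [K1 K2];                                % 512 x 1024, overcomplete
>> opts = rvmopts(struct('tt', tt, 'PPhi', PPhi));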

Notes

Bear in mind that this code is mostly meant as an illustration and implementation of the method described in our research paper, not as production-quality software, so it will certainly suffer from various shortcomings.

Queries and bug reports should be directed to Alex Schmolck with sRVM in the subject line (no html mail, please).

Download

The code is available as a gzipped tar archive: Attach: srvm.tgz
