
# Bilinear regression

Started work on a page on Bilinear regression.

## Comments

1.

Interesting. What's the relationship between bilinear regression and the "error in variables" problem? See http://azimuth.ch.mm.st/BayesianRegressionWithErrorsInVariables--JGalkowski.pdf
2.
edited August 2014

Hi Jan, I can't access that link, but I found this on [Wikipedia](http://en.wikipedia.org/wiki/Errors-in-variables_models). Assuming it's talking about the same thing, I think they're two different problems. One of the motivations in the papers I've found for bilinear regression is providing a way to enforce sparsity in the model: if each sample is an $A \times B$ matrix, then simply flattening it to a vector and doing linear regression gives a model with $AB$ coefficients, whereas bilinear regression with $m$ pairs $(u_i, v_i)$ has $m(A+B)$ coefficients. This reduction comes at some loss of flexibility, but the reports I've read seem to indicate that overall it does well at preventing over-fitting. If there is a connection I'd be very interested to know more about it.

I'm going to have a go at using bilinear regression on the El Nino dataset, so the entry is partly just a place to store my calculations of the derivatives -- which are done for the much more complicated logistic regression model in the papers I've found.
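To make the coefficient-count comparison concrete, here is a small Python sketch; the sizes $A = B = 32$ and $m = 3$ are chosen purely for illustration (they happen to match the Octave code later in the thread):

```python
# Illustrative coefficient counts; A, B, m are made-up sizes.
A, B = 32, 32   # each sample is an A x B matrix
m = 3           # number of (u_i, v_i) pairs in the bilinear model

linear_coeffs = A * B          # flatten-and-regress: one weight per matrix entry
bilinear_coeffs = m * (A + B)  # m pairs of vectors in R^A and R^B

print(linear_coeffs, bilinear_coeffs)  # 1024 vs 192
```

So at these sizes the bilinear parameterisation uses under a fifth of the coefficients of the flattened linear model.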
3.
edited August 2014

If you send your email address to me at empirical_bayesian -at- ieee -dot- org, I'd be happy to send you a copy. It's a mystery to me why you can't access it. I could put it on Google.

And I've queued up K. R. Gabriel, "Generalised bilinear regression", *Biometrika* (1998), **85**, 3, 689-700, and M. Linder, R. Sundberg, "Second-order calibration: bilinear least squares regression and a simple alternative", *Chemometrics and Intelligent Laboratory Systems*, **42** (1998), 159-178, to read ...
4.

http://azimuth.ch.mm.st gives:

Directory Index Denied

This page is unable to be displayed because the website owner has disabled directory indexing for this site and has not provided an index.html file.

Please contact the website owner to find the correct URL.

If this is your site you may wish to add an index.html page with more useful information.

hth

5.

I've been a bit lazy in putting in references, but one of the papers I've been looking at is [Sparse Bilinear Logistic Regression](ftp://ftp.math.ucla.edu/pub/camreport/cam14-12.pdf). One thing I've found is that there seem to be several things that go under the name of bilinear regression, but it's the kind of formulation in that paper that I'm looking at.

I've sent you an email to give you my email address.
6.
edited August 2014

http://azimuth.ch.mm.st/BayesianRegressionWithErrorsInVariables–JGalkowski.pdf in Firefox gives

No page found

We couldn't find a page for the link you visited. Please check that you have the correct link and try again.

If you are the owner of this domain, you can setup a page here by creating a page/website in your account.
7.
edited August 2014

Suddenly figured out why trying to get Jan's document didn't work: the forum has merged the two dashes into an en-dash in the output, so pasting the link doesn't work. If you select "Source" and paste the link from there, it works and I can see the paper.
8.
edited August 2014

This seems like a relevant reference in Jan's paper:

N. Cahill, A. C. Parnell, A. C. Kemp, B. P. Horton, “Modeling sea-level change using errors-in-variables integrated Gaussian processes”, 24th December 2013, http://arxiv.org/pdf/1312.6761.pdf
9.

Just to note I've been trying to figure out a way to implement the bilinear model in a not-too-wasteful way in an array language like Matlab or NumPy (in order to avoid writing low-level code myself), but it's not proving obvious to me how to do this.

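One possibility (an untested sketch, with made-up shapes) is to push the whole computation into a single `numpy.einsum` call, which evaluates $\sum_j u_j^T X_i v_j$ for every sample at once, with no Python-level loop:

```python
import numpy as np

n, A, B, m = 100, 32, 32, 3    # made-up sizes for illustration
X = np.random.randn(n, A, B)   # n samples, each an A x B matrix
U = np.random.randn(A, m)      # columns are the u_j vectors
V = np.random.randn(B, m)      # columns are the v_j vectors

# prediction for sample i is sum_j u_j' X_i v_j; einsum does all
# samples and all pairs in one call
preds = np.einsum('iab,aj,bj->i', X, U, V)

# sanity check against the naive per-pair loop on the first sample
naive = sum(U[:, j] @ X[0] @ V[:, j] for j in range(m))
assert np.allclose(preds[0], naive)
```

The same trick should work in Matlab/Octave via a reshape to a $n \times AB$ matrix times a flattened $\sum_j v_j u_j^T$, though I haven't checked the details.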
10.

Just for the record, here's some Matlab/Octave code for evaluating a set of bilinear regression coefficients. I'm not putting it in a more permanent place because it's so ugly and inefficient, and it is unbelievably slow even for small test examples when put into Octave's fminunc optimization function. I think I need to write some lower-level code if it's to perform even remotely reasonably on the El Nino data.

~~~~
function score=getScore(matrices,values,lambda,params)
  % params stacks NO_VECS pairs (u_j, v_j) as 64-element columns:
  % rows 1:32 hold u_j, rows 33:64 hold v_j
  NO_VECS=3;
  vcs=reshape(params,64,NO_VECS);
  acc=0;
  for i = 1:size(matrices,3)
    % model prediction: sum over all NO_VECS pairs, not just the last one
    sc=0;
    for j = 1:NO_VECS
      u=vcs(1:32,j);
      v=vcs(33:64,j);
      sc=sc + u' * (matrices(:,:,i) * v);
    end
    t=sc - values(i);
    acc = acc + t * t;
  end
  %use L_1/2 regularisation/sparsification weights
  acc=acc+lambda*sum(sqrt(abs(params)));
  score=acc;
end
~~~~

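For comparison, here is a sketch of the same objective in NumPy (sum of squared errors plus the $L_{1/2}$ penalty), with shapes assumed to mirror the Octave code; note `order='F'` so the reshape matches Octave's column-major convention. It should be usable with something like `scipy.optimize.minimize` in place of `fminunc`:

```python
import numpy as np

def get_score(matrices, values, lam, params, no_vecs=3):
    """Bilinear regression objective: sum of squared errors plus an
    L_{1/2} sparsity penalty, mirroring the Octave getScore.

    matrices : shape (32, 32, n), one 32x32 sample per slice
    values   : shape (n,), the regression targets
    params   : flat array of length 64 * no_vecs; each column stacks
               u_j (first 32 entries) on top of v_j (last 32 entries)
    """
    # Octave's reshape is column-major, hence order='F'
    vcs = params.reshape(64, no_vecs, order='F')
    U, V = vcs[:32, :], vcs[32:, :]
    # prediction for sample i: sum_j u_j' M_i v_j, for all i at once
    preds = np.einsum('aj,abi,bj->i', U, matrices, V)
    resid = preds - values
    return resid @ resid + lam * np.sqrt(np.abs(params)).sum()
```

With all parameters zero the predictions are zero, so the score is just the sum of squared target values; that makes a convenient smoke test.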