
Given that we have:

  • x is a 2D matrix of size [numSamples x numFeatures]
  • A is a 2D square matrix of size [numFeatures x numFeatures]
  • B is a 1D row vector of size [1 x numFeatures]

I would like to evaluate the following code without a loop (or at least in a much faster way):

out = zeros(1,numSamples);
for i = 1:numSamples
    % res(j): squared dot product of (B - x(i,:))*A with x(j,:) - x(i,:)
    res = sum(repmat(B - x(i,:), numSamples, 1)*A.*(x - repmat(x(i,:), numSamples, 1)), 2).^2;
    out(i) = var(res);   % variance of those squared values
end
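
For reference, a minimal setup to reproduce and time this (the sizes and random data below are just placeholders):

numSamples  = 500;                     % placeholder problem size
numFeatures = 20;
x = rand(numSamples, numFeatures);     % random test data
A = rand(numFeatures, numFeatures);
B = rand(1, numFeatures);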

Any other suggestions for speeding it up further are also more than welcome.

4 Comments

  • Let's take a step back. How fast is this code when you run it on your test data? Are you not happy with the time? Commented Oct 28, 2015 at 19:32
  • Unfortunately no. For a large number of samples it becomes very slow. Commented Oct 28, 2015 at 19:45
  • In addition to the comment above, is there a particular formula you try to implement? Commented Oct 28, 2015 at 19:46
  • Nothing famous. This is a simplified part of the whole formula; the rest is normalization terms, and I felt that including them would only complicate things. If you can help me with this simplified expression, I am certain I can extrapolate to the more general one. Commented Oct 28, 2015 at 19:50

1 Answer


You can bsxfun those piece-by-piece for a vectorized solution -

P1 = bsxfun(@minus,B,x)*A;                    % row i holds (B - x(i,:))*A
P2 = bsxfun(@minus,x,permute(x,[3 2 1]));     % P2(j,:,i) = x(j,:) - x(i,:)
out = var(squeeze(sum(bsxfun(@times,P1,P2),2)).^2.');   % dot products -> square -> variance
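
To sanity-check it against the loopy code, a quick run on small random inputs (the sizes here are arbitrary) -

numSamples = 50; numFeatures = 8;                 % arbitrary sizes for the check
x = rand(numSamples,numFeatures);
A = rand(numFeatures,numFeatures);
B = rand(1,numFeatures);

out_loop = zeros(1,numSamples);                   % original loopy version
for i = 1:numSamples
    res = sum(repmat(B - x(i,:), numSamples, 1)*A.*(x - repmat(x(i,:), numSamples, 1)), 2).^2;
    out_loop(i) = var(res);
end

P1 = bsxfun(@minus,B,x)*A;                        % vectorized version
P2 = bsxfun(@minus,x,permute(x,[3 2 1]));
out = var(squeeze(sum(bsxfun(@times,P1,P2),2)).^2.');

max(abs(out - out_loop))                          % should be down at floating-point noise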

Partially vectorized approach -

P = (bsxfun(@minus,B,x)*A).';    % P(:,i) = ((B - x(i,:))*A).'
out = zeros(1,numSamples);
for i = 1:numSamples
    out(i) = var((bsxfun(@minus,x,x(i,:))*P(:,i)).^2);
end
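
To see which variant wins for your actual sizes, timeit is handy; f_loop, f_vec and f_partial below are hypothetical wrappers around the three versions above, each taking (x,A,B) and returning out -

t_loop    = timeit(@() f_loop(x,A,B));     % original loopy code wrapped in a function
t_vec     = timeit(@() f_vec(x,A,B));      % fully vectorized bsxfun version
t_partial = timeit(@() f_partial(x,A,B));  % partially vectorized version
[t_loop, t_vec, t_partial]                 % runtimes in seconds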

6 Comments

thank you for the reply, let me verify that this works :)
@user2324712 Damn, I see some disbelief there! ;)
@user2324712 Sure, that's the best way to do it!
That sure solves it. On my aging laptop, MATLAB's JIT actually makes the original loop implementation faster. But I am quite sure that for big data on a GPU your solution takes the cake. Thank you!
@user2324712 bsxfun loves cake, aka memory, so feed it well ;) Just be careful: with huge data, the memory transfers, rather than the actual computations, might slow things down. But all of that depends a lot on the GPU and the associated hardware.
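
If you do go the GPU route, the same bsxfun code runs unchanged on gpuArray inputs (this needs the Parallel Computing Toolbox); a rough sketch -

xg = gpuArray(x);  Ag = gpuArray(A);  Bg = gpuArray(B);   % move the data to the GPU
P1 = bsxfun(@minus,Bg,xg)*Ag;
P2 = bsxfun(@minus,xg,permute(xg,[3 2 1]));
outg = var(squeeze(sum(bsxfun(@times,P1,P2),2)).^2.');
out  = gather(outg);                                      % bring the result back to the CPU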