
After doing a bit of digging, I can't find any native method in Julia's linear algebra package that lets me quotient a vector by a subspace. The Wikipedia article seemed to mainly focus on the theory of performing the quotient (which I already know), and I couldn't easily find a source for good quotienting algorithms online.

Notably, writing out a column vector $$V = \bigoplus_i v_i$$

for some subspace $W \subseteq V$, I want to take the quotient $V / W$.

What are some good algorithms that I can set up quickly and do reliable quotients with? I'd prefer to keep this mostly hand-coded, although I have no issue using basic packages like LinearAlgebra.

The preference here is for simplicity and speed of implementation; I'd rather have something easy to code in relatively few lines than something optimized but highly time-consuming to work with.

I'm naive to this subject so I don't know if there are challenging edge cases or whatnot, but the dimension of these vector spaces shouldn't be absurdly high, and I don't imagine there should be any pathological numerical inaccuracies when it comes to finding the quotients of these vectors. Papers, textbook pages, and articles are welcome, but I would need to be able to quickly read and copy the information. Julia isn't required, just general computational methods that are fast to implement.

  • You are familiar with and can implement Gaussian elimination? (Commented Oct 7, 2024 at 3:29)
  • @MartinBrandenburg For sure! (Commented Oct 7, 2024 at 3:45)

1 Answer


The simplest algorithm I can think of: make a basis of $V$ starting with a basis of $W$, then delete the basis of $W$ from the result.

Start with bases $\{w_i\}_{i=1}^{\dim W}$ of $W$ and $\{v_i\}_{i=1}^{\dim V}$ of $V$.

  1. Apply Gram-Schmidt to the set of vectors (in this order) $$\{w_1, \dots, w_{\dim W}, v_1, \dots, v_{\dim V}\} \text{,}$$ discarding any $v_i$ that is reduced to $0$ after subtracting its projection onto the span of the prior vectors. (The LinearAlgebra module's qr() can be used to do this; see https://discourse.julialang.org/t/gram-schmidt-decomposition/75417 and the sketch below.)
  2. Delete the first $\dim W$ vectors from the result. What remains is an (orthonormal) basis for the complement of $W$ in $V$.

First, the $\{w_i\}$ form a basis, so all $\dim W$ of them will survive Gram-Schmidt, although they will generically be altered. Then every $v_i$ will have its projection onto $W$ subtracted, leaving either $\vec{0}$ or a vector in the complement of $W$. Since the $\{v_i\}$ span $V$, projecting out their $W$ parts leaves a set of vectors that span $V/W$. So we end up with the collection of vectors $$ \{ \text{orthonormalized basis of $W$}, \text{orthonormalized basis of $V/W$}\} \text{.}$$ Now delete the first $\dim W$ vectors and we are done.
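For concreteness, here is a minimal Julia sketch of this recipe, assuming the ambient space is $\mathbb{R}^n$, the columns of `Wbasis` are a basis of $W$, and the standard basis $e_1, \dots, e_n$ is taken as the basis of $V$; the name `quotient_basis` is just a placeholder. Householder QR (what `qr()` computes) plays the role of Gram-Schmidt and is numerically more stable.

```julia
using LinearAlgebra

# Sketch only: assumes the columns of Wbasis are linearly independent and that
# the standard basis of R^n serves as the basis of V.
function quotient_basis(Wbasis::AbstractMatrix)
    n, k = size(Wbasis)
    # Step 1: basis of W first, then a basis of V, then orthonormalize via QR.
    A = hcat(float.(Wbasis), Matrix{Float64}(I, n, n))
    F = qr(A)
    Q = F.Q * Matrix{Float64}(I, n, n)   # materialize the full n×n orthogonal factor
    # Step 2: drop the first dim W columns; the rest is an orthonormal basis of
    # the complement of W, which represents V / W.
    return Q[:, k+1:n]
end

# Toy usage: W = span{e1, e2} inside R^3, so V / W is one-dimensional.
Wbasis = [1.0 0.0; 0.0 1.0; 0.0 0.0]
B = quotient_basis(Wbasis)   # 3×1 matrix spanning the e3 direction (up to sign)
B' * [4.0, 5.0, 6.0]         # coordinates of a vector's coset in this basis
```

Since `qr()` always yields a full $n \times n$ orthogonal factor, the explicit concatenation with a basis of $V$ is only there to mirror the recipe literally; `qr(float.(Wbasis))` alone, followed by the same "drop the first $\dim W$ columns" step, gives the same complement.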

  • I attempted your solution but my attempt is not working. I represented your set of vectors as two column-space matrices that are horizontally concatenated to each other. Then I took the QR factorization. The result only has $\dim V$ vectors, and nothing else. Attached is an image with two vectors fed in, the constructed matrix $A$, and the Q factor of the QR factorization of $A$. You can view the image to see the issue, and potentially see what I'm doing wrong: imgur.com/GnObbbj (Commented Oct 7, 2024 at 7:00)
  • Also, in your last line you said you had a set of vectors that span $W/V$. Did you intentionally switch the numerator and denominator? I thought quotients required that the denominator be a subspace of the numerator. (Commented Oct 7, 2024 at 14:11)
  • @Nate : Apparently, ... I switched the roles of $V$ and $W$ throughout this Answer. That's fixed now. (Commented Oct 7, 2024 at 14:34)
  • Thanks, but I'm still having trouble constructing the set and performing QR factorization to get the quotient. For the toy example of the 6-dimensional vector $V=(1,3,5,7,9,11)^\intercal$, I constructed the basis set as a $6\times 6$ matrix $A_V$ where all entries are zero besides $A[i,i] = V[i]$. I constructed $A_W$ from $W=(1,0,5,0,9,0)^\intercal$ similarly, but I removed all columns from the matrix that were zero. This gave $A_W$ as a $6 \times 3$ matrix. I then used hcat to attach $A_W$ to $A_V$ and took the Q factor of the QR decomp, but this didn't work (see imgur.com/GnObbbj). (Commented Oct 7, 2024 at 15:08; a sketch applying the answer's recipe to this example follows after these comments.)
  • If I were to represent the basis of $W$ as a square $A_W$ matrix whose column space spans $W$, should I get rid of any columns that are all zeros before hcatting and performing QR factorization? (Commented Oct 7, 2024 at 15:12)
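A hedged sketch of how the answer's recipe would apply to the toy example in the comments, under the assumption that $W$ is meant to be the 3-dimensional span of the columns of $A_W$ (scaled copies of $e_1, e_3, e_5$ in $\mathbb{R}^6$), reusing the hypothetical `quotient_basis` helper sketched under the answer:

```julia
# Requires `using LinearAlgebra` and the quotient_basis sketch from the answer above.

# Assumed reading of the comment's A_W: a 6×3 matrix whose columns span W.
A_W = zeros(6, 3)
A_W[1, 1] = 1.0
A_W[3, 2] = 5.0
A_W[5, 3] = 9.0

B = quotient_basis(A_W)   # hypothetical helper from the answer's sketch
size(B)                   # (6, 3): here dim V = 6 and dim W = 3, so dim(V/W) = 3

v = [1.0, 3.0, 5.0, 7.0, 9.0, 11.0]
B' * v                    # coordinates of v's coset; the components of v inside W are projected out
```

The full Q factor of a QR decomposition is always $\dim V \times \dim V$, so seeing only $\dim V$ columns is expected; the quotient basis consists of only the last $\dim V - \dim W$ of them.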
