Context: Suppose I have a matrix $P_k\in\mathbb{R}^{n\times n}$ that evolves in time $k$ according to $$ P_{k+1} = H_{\sigma(k)}^T P_k H_{\sigma(k)}, $$ where $H_{\sigma(k)}\in\{H_1,\dots,H_L\}$, the $H_i\in\mathbb{R}^{n\times n}$ are given matrices, and $\sigma(k)\in\{1,\dots,L\}$ is a scheduling signal. Given an initial positive definite matrix $P_0$, my goal is to choose $\sigma(0),\dots,\sigma(N-1)$ so that $$ J = \sum_{k=0}^N \operatorname{trace}(P_{k}) $$ is minimized.
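For concreteness, here is a minimal sketch (Python/NumPy, with names I made up; the schedule is 0-indexed in code) of how $J$ is evaluated for a fixed schedule:

```python
import numpy as np

def schedule_cost(P0, H_list, sigma):
    """J = sum_{k=0}^{N} trace(P_k) for a fixed schedule sigma = (sigma(0), ..., sigma(N-1))."""
    P = P0.copy()
    J = np.trace(P)                      # k = 0 term
    for i in sigma:                      # i plays the role of sigma(k), 0-indexed
        P = H_list[i].T @ P @ H_list[i]  # P_{k+1} = H_{sigma(k)}^T P_k H_{sigma(k)}
        J += np.trace(P)
    return J
```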
My naive attempt: Up to now, I have only been able to solve this by enumerating all possible combinations $[\sigma(0),\dots,\sigma(N-1)]\in\{1,\dots,L\}^N$, computing the cost for each one, and keeping the one with the least cost (see the sketch below). This works, but it is clearly inefficient: the number of schedules grows as $L^N$, so it is only practical for small $N$.
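Roughly, my brute-force search looks like this (a sketch, not optimized; the example matrices at the bottom are arbitrary, just to illustrate the call):

```python
from itertools import product
import numpy as np

def brute_force_schedule(P0, H_list, N):
    """Enumerate all L^N schedules and return the cheapest one together with its cost."""
    best_sigma, best_J = None, np.inf
    for sigma in product(range(len(H_list)), repeat=N):
        # accumulate J = sum_{k=0}^{N} trace(P_k) along this schedule
        P, J = P0.copy(), np.trace(P0)
        for i in sigma:
            P = H_list[i].T @ P @ H_list[i]
            J += np.trace(P)
        if J < best_J:
            best_sigma, best_J = sigma, J
    return best_sigma, best_J

# Arbitrary example with n = 2, L = 2, N = 6
H_list = [np.array([[0.9, 0.1], [0.0, 0.8]]),
          np.array([[0.7, 0.0], [0.3, 1.1]])]
print(brute_force_schedule(np.eye(2), H_list, N=6))
```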
The question(s):
- Is there a better way to solve this problem exactly, say a polynomial-time algorithm, or a dynamic-programming approach that is more efficient even if still exponential?
- Is there a way to find an approximate (suboptimal) solution to this problem in polynomial time?
- Do the answers to the previous questions change if, at a high level, the update is replaced by $P_{k+1} = F(P_k,\sigma(k))$ for some nonlinear function $F(\cdot,\sigma(k))$?
Any ideas, suggestions, or references for me to read are appreciated.