Eigenvalue and eigenvector derivative examples
The repo contains code that implements two formulas (one using the adjoint method and the other using reverse algorithmic differentiation (RAD)) to efficiently compute derivatives of functionals of the eigenvalues and eigenvectors with respect to the design variables.
The adjoint method is more general but somewhat more complex; the RAD formula is more elegant but more specific. Consider an eigenvalue problem of the form:
$\mathbf{A}\boldsymbol{\phi} = \lambda \boldsymbol{\phi}$
The corresponding left eigenproblem, used later, is:
$\mathbf{A}^H \tilde{\boldsymbol{\phi}} = \lambda^* \tilde{\boldsymbol{\phi}}$
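As a minimal sketch (not the repo's implementation), both eigenproblems can be solved in one call with `scipy.linalg.eig`, whose left eigenvectors satisfy exactly the left eigenproblem above:

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# scipy returns left eigenvectors vl with A^H vl[:, k] = conj(w[k]) vl[:, k],
# which matches the left eigenproblem above.
w, vl, vr = eig(A, left=True, right=True)

k = 0  # pick one eigenpair
phi, phi_t, lam = vr[:, k], vl[:, k], w[k]
assert np.allclose(A @ phi, lam * phi)                        # right eigenproblem
assert np.allclose(A.conj().T @ phi_t, np.conj(lam) * phi_t)  # left eigenproblem
```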
The derivatives of the eigenvalue are:
$ \begin{aligned} \frac{\mathrm{d} \lambda_r}{\mathrm{d} \mathbf{A}_r} &= \mathrm{Re}\left(\frac{\tilde{\boldsymbol{\phi}} \boldsymbol{\phi}^H}{\boldsymbol{\phi}^H \tilde{\boldsymbol{\phi}}}\right), \\ \frac{\mathrm{d} \lambda_r}{\mathrm{d} \mathbf{A}_i} &= \mathrm{Im}\left(\frac{\tilde{\boldsymbol{\phi}} \boldsymbol{\phi}^H}{\boldsymbol{\phi}^H \tilde{\boldsymbol{\phi}}}\right), \\ \frac{\mathrm{d} \lambda_i}{\mathrm{d} \mathbf{A}_r} &= -\frac{\mathrm{d} \lambda_r}{\mathrm{d} \mathbf{A}_i}, \\ \frac{\mathrm{d} \lambda_i}{\mathrm{d} \mathbf{A}i} &= \frac{\mathrm{d} \lambda_r}{\mathrm{d} \mathbf{A}_r}. \end{aligned} $
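These derivatives can be spot-checked numerically. The sketch below (my own, not the repo's code; the helper `tracked` is hypothetical) builds the matrix $\tilde{\boldsymbol{\phi}} \boldsymbol{\phi}^H / (\boldsymbol{\phi}^H \tilde{\boldsymbol{\phi}})$ and compares one entry of its real part against a central finite difference of $\lambda_r$ with respect to that entry of $\mathbf{A}_r$:

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Left (vl) and right (vr) eigenvectors; scipy's vl satisfies
# A^H vl[:, k] = conj(w[k]) vl[:, k].
w, vl, vr = eig(A, left=True, right=True)
k = int(np.argmax(w.real))  # track one (generically simple) eigenvalue
phi, phi_t, lam = vr[:, k], vl[:, k], w[k]

# Matrix in the formula above: (phi_t phi^H) / (phi^H phi_t)
M = np.outer(phi_t, phi.conj()) / (phi.conj() @ phi_t)
dlam_r_dA_r = M.real  # = d(lambda_r)/d(A_r)

# Central finite-difference check on entry (0, 1) of A_r
h = 1e-6
E = np.zeros((n, n)); E[0, 1] = 1.0

def tracked(Ap):
    """Eigenvalue of the perturbed matrix closest to lam."""
    wp = eig(Ap, right=False)
    return wp[np.argmin(np.abs(wp - lam))]

fd = (tracked(A + h * E).real - tracked(A - h * E).real) / (2 * h)
assert np.isclose(fd, dlam_r_dA_r[0, 1], atol=1e-5)
```

The eigenvalue-tracking step matters: `eig` does not order eigenvalues consistently across calls, so the perturbed eigenvalue must be matched to the unperturbed one.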
It is assumed that the eigenvalue is an analytic function of the coefficient matrix. This is generally true, but there are corner cases where it does not hold, e.g., repeated eigenvalues. See the link, or the bottom of this description, for more details.
In the literature, most formulas are "forward" methods, meaning the derivative is propagated in the forward direction of the computational graph.
A forward method's computational cost is proportional to the number of design variables, $n_x$, whereas the adjoint and RAD formulas avoid this scaling.
NOTE: There is another classic "adjoint method" for eigenvalue-eigenvector derivative computation in the literature, by Murthy and Haftka, but that method has a computational cost that scales with $\mathit{O}(n_x)$. It was named after the adjugate matrix (also known as the classical adjoint matrix), rather than the adjoint method of sensitivity analysis.
For more details about derivative (sensitivity, gradient) computation, read Chapter 6 of mdobook.
This repo contains the code of several eigenvalue and eigenvector derivative formulas derived in the following publication:
Sicheng He, Yayun Shi, Eirikur Jonsson, and Joaquim R.R.A. Martins. Eigenvalue problem derivatives computation for a complex matrix using the adjoint method. Mechanical Systems and Signal Processing, 185:109717, 2023. doi:10.1016/j.ymssp.2022.109717
The paper is also available for download via the link.