Graph Spectral Image Processing. Gene Cheung
I.3. Graph spectrum
Denote by W ∈ R^{N×N} an adjacency matrix, where the (i, j)th entry is W_{i,j} = w_{i,j}. Next, denote by D ∈ R^{N×N} a diagonal degree matrix, where the (i, i)th entry is D_{i,i} = Σ_j W_{i,j}. A combinatorial graph Laplacian matrix L is defined as L = D − W (Shuman et al. 2013). Because L is real and symmetric, one can show, via the spectral theorem, that it can be eigen-decomposed into:
L = U Λ U^⊤   [I.2]
where Λ is a diagonal matrix containing real eigenvalues λ_k along its diagonal, and U is an eigen-matrix composed of orthogonal eigenvectors u_i as columns. If all edge weights w_{i,j} are restricted to be positive, then the graph Laplacian L can be proven to be positive semi-definite (PSD) (Chung 1997), meaning that λ_k ≥ 0, ∀k and x^⊤ L x ≥ 0, ∀x. The non-negative eigenvalues λ_k can be interpreted as graph frequencies, and the eigenvectors in U can be interpreted as the corresponding graph Fourier modes. Together they define the graph spectrum for graph G.
The set of eigenvectors U of L collectively forms the GFT (Shuman et al. 2013), which can be used to decompose a graph signal x into its frequency components via α = U^⊤ x. In fact, one can interpret the GFT as a generalization of known discrete transforms, such as the Discrete Cosine Transform (DCT) (see Shuman et al. 2013 for details).
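As a concrete illustration of equation [I.2] and the GFT, here is a minimal sketch in Python/NumPy; the 4-node weighted path graph and the signal values are illustrative choices, not taken from the text.

```python
import numpy as np

# Illustrative 4-node weighted path graph (symmetric adjacency matrix W)
W = np.array([[0.0, 0.9, 0.0, 0.0],
              [0.9, 0.0, 0.4, 0.0],
              [0.0, 0.4, 0.0, 0.8],
              [0.0, 0.0, 0.8, 0.0]])

D = np.diag(W.sum(axis=1))        # diagonal degree matrix, D_{i,i} = sum_j W_{i,j}
L = D - W                         # combinatorial graph Laplacian

# Eigen-decomposition L = U Lambda U^T; eigh returns eigenvalues in ascending order,
# so the columns of U are graph Fourier modes ordered from low to high graph frequency
lam, U = np.linalg.eigh(L)

x = np.array([1.0, 1.1, 0.2, 0.1])   # an example graph signal
alpha = U.T @ x                      # GFT coefficients alpha = U^T x
x_rec = U @ alpha                    # inverse GFT recovers x, since U is orthogonal
```

Since this graph is connected with positive weights, lam[0] is (numerically) zero and U[:, 0] is the constant vector up to scale.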
Note that if the multiplicity m_k of an eigenvalue λ_k is larger than 1, then the set of eigenvectors spanning the corresponding eigen-subspace of dimension m_k is non-unique. In this case, it is necessary to specify the graph spectrum as the collection of eigenvectors U themselves.
If we also consider negative edge weights w_{i,j} that reflect inter-pixel dissimilarity/anti-correlation, then the graph Laplacian L can be indefinite. We will discuss a few recent works (Su et al. 2017; Cheung et al. 2018) that employ negative edges in later chapters.
I.4. Graph variation operators
Closely related to the combinatorial graph Laplacian L are other variants of Laplacian operators, each with its own unique spectral properties. A normalized graph Laplacian L_n = D^{-1/2} L D^{-1/2} is a symmetric normalized variant of L. In contrast, a random walk graph Laplacian L_r = D^{-1} L is an asymmetric normalized variant of L. A generalized graph Laplacian L_g = L + diag(d) is a graph Laplacian with self-loops of weight d_{i,i} at nodes i – called the loopy graph Laplacian in Dörfler and Bullo (2013) – resulting in a general symmetric matrix with non-positive off-diagonal entries for a positive graph (Biyikoglu et al. 2005). Eigen-decomposition can also be performed on these operators to obtain a set of graph frequencies and graph Fourier modes. For example, the normalized variants L_n and L_r (which are similarity transforms of each other) share the same eigenvalues between 0 and 2. While L and L_n are both symmetric, L_n does not have the constant vector as an eigenvector. The asymmetric L_r can be symmetrized via left and right diagonal matrix multiplications (Milanfar 2013a). Different variation operators will be used throughout the book for different applications.
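A rough sketch of these variants in Python/NumPy, assuming the same illustrative path graph as in the earlier snippet; the self-loop weights are made-up values.

```python
import numpy as np

W = np.array([[0.0, 0.9, 0.0, 0.0],
              [0.9, 0.0, 0.4, 0.0],
              [0.0, 0.4, 0.0, 0.8],
              [0.0, 0.0, 0.8, 0.0]])
d = W.sum(axis=1)                               # node degrees
L = np.diag(d) - W                              # combinatorial graph Laplacian

L_n = np.diag(d**-0.5) @ L @ np.diag(d**-0.5)   # normalized Laplacian (symmetric)
L_r = np.diag(1.0 / d) @ L                      # random walk Laplacian (asymmetric)

d_loop = np.array([0.5, 0.0, 0.2, 0.0])         # illustrative self-loop weights d_{i,i}
L_g = L + np.diag(d_loop)                       # generalized (loopy) graph Laplacian

# L_n and L_r are similarity transforms of each other, so they share eigenvalues in [0, 2]
ev_n = np.sort(np.linalg.eigvalsh(L_n))
ev_r = np.sort(np.linalg.eigvals(L_r).real)
assert np.allclose(ev_n, ev_r)
```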
I.5. Graph signal smoothness priors
Traditionally, for a graph G with positive edge weights, a signal x is considered smooth if each sample x_i on node i is similar to the samples x_j on neighboring nodes j with large w_{i,j}. In the graph frequency domain, this means that x mostly contains low graph frequency components, i.e. the coefficients α = U^⊤ x are zero (or mostly zero) at high frequencies. The smoothest signal is the constant vector – the first eigenvector u_1 of L, corresponding to the smallest eigenvalue λ_1 = 0.
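To make this frequency-domain reading of smoothness concrete, the short sketch below (same illustrative path graph as before; the two signals are made up) compares the GFT coefficients of a slowly varying signal with those of a rapidly alternating one.

```python
import numpy as np

W = np.array([[0.0, 0.9, 0.0, 0.0],
              [0.9, 0.0, 0.4, 0.0],
              [0.0, 0.4, 0.0, 0.8],
              [0.0, 0.0, 0.8, 0.0]])
L = np.diag(W.sum(axis=1)) - W
lam, U = np.linalg.eigh(L)                   # graph frequencies and Fourier modes

x_smooth = np.array([1.0, 1.0, 0.9, 0.9])    # neighboring samples are similar
x_rough  = np.array([1.0, -1.0, 1.0, -1.0])  # neighboring samples alternate in sign

# GFT coefficients: the smooth signal's energy concentrates at low graph frequencies,
# while the alternating signal's energy tilts toward high graph frequencies
print(np.round(U.T @ x_smooth, 3))
print(np.round(U.T @ x_rough, 3))
```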
Mathematically, we can declare that a signal x is smooth if its graph Laplacian regularizer (GLR) x^⊤ L x is small (Pang and Cheung 2017). The GLR can be expressed as:
x^⊤ L x = Σ_{(i,j)∈E} w_{i,j} (x_i − x_j)^2   [I.3]
Because L is PSD, x^⊤ L x is lower bounded by 0, which is achieved when x = c u_1 for some scalar constant c. One can also define the GLR using the normalized graph Laplacian L_n instead of L, resulting in x^⊤ L_n x. The caveat is that the constant vector u_1 – typically the most common signal in imaging – is no longer the first eigenvector, and thus its normalized GLR is in general nonzero.
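A small sketch of the GLR in Python/NumPy (same illustrative graph and signal as before; glr and glr_edgewise are made-up helper names). It checks that x^⊤ L x matches the edge-wise sum in equation [I.3], and that the constant vector has zero GLR under L but not under L_n.

```python
import numpy as np

W = np.array([[0.0, 0.9, 0.0, 0.0],
              [0.9, 0.0, 0.4, 0.0],
              [0.0, 0.4, 0.0, 0.8],
              [0.0, 0.0, 0.8, 0.0]])
d = W.sum(axis=1)
L = np.diag(d) - W
L_n = np.diag(d**-0.5) @ L @ np.diag(d**-0.5)

def glr(x, L):
    """Graph Laplacian regularizer x^T L x."""
    return float(x @ L @ x)

def glr_edgewise(x, W):
    """Edge-wise form of [I.3]: sum over edges of w_ij (x_i - x_j)^2, each edge once."""
    n = len(x)
    return sum(W[i, j] * (x[i] - x[j]) ** 2
               for i in range(n) for j in range(i + 1, n))

x = np.array([1.0, 1.1, 0.2, 0.1])
assert np.isclose(glr(x, L), glr_edgewise(x, W))   # the two forms of [I.3] agree

ones = np.ones(4)
print(glr(ones, L))     # ~0: the constant vector is the smoothest signal under L
print(glr(ones, L_n))   # > 0 here: the constant vector is not an eigenvector of L_n
```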
In Chen et al. (2015), the adjacency matrix W is interpreted as a shift operator, and thus graph signal smoothness is instead defined as the difference between a signal x and its shifted version Wx. Specifically, the graph total variation (GTV) based on the ℓ_p-norm is:

‖x − (1/|λ_max|) W x‖_p^p   [I.4]
where λ_max is the eigenvalue of W with the largest magnitude (also called the spectral radius), and p is a chosen integer. As a variant of equation [I.4], a quadratic smoothness prior is defined in Romano et al. (2017), using a row-stochastic version W_n = D^{-1} W of the adjacency matrix W:

‖x − W_n x‖_2^2   [I.5]
To avoid confusion, we will call equation [I.5] the graph shift variation (GSV) prior. GSV is easier to use in practice than GTV, since GTV requires the computation of λ_max. Note that GSV, as defined in equation [I.5], can also be used for signals on directed graphs.
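A minimal sketch of the two priors in Python/NumPy (same illustrative graph and signal as before; gtv and gsv are made-up helper names, and p = 1 is just one admissible choice).

```python
import numpy as np

W = np.array([[0.0, 0.9, 0.0, 0.0],
              [0.9, 0.0, 0.4, 0.0],
              [0.0, 0.4, 0.0, 0.8],
              [0.0, 0.0, 0.8, 0.0]])

def gtv(x, W, p=1):
    """Graph total variation of equation [I.4]: ||x - W x / |lambda_max|||_p^p."""
    lam_max = np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius of W
    return float(np.sum(np.abs(x - (W @ x) / lam_max) ** p))

def gsv(x, W):
    """Graph shift variation of equation [I.5]: ||x - D^{-1} W x||_2^2."""
    W_n = W / W.sum(axis=1, keepdims=True)           # row-stochastic normalization
    return float(np.sum((x - W_n @ x) ** 2))

x = np.array([1.0, 1.1, 0.2, 0.1])
print(gtv(x, W, p=1))   # needs the spectral radius lambda_max of W
print(gsv(x, W))        # no eigen-computation required
```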