LightGCN

Summary of the paper

Paper - LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation.

Algorithms:

$$
\mathbf{e}^{(k+1)}_u=\sum_{i\in N_u}\dfrac{1}{\sqrt{|N_u|}\sqrt{|N_i|}}\mathbf{e}^{(k)}_i,
\qquad
\mathbf{e}^{(k+1)}_i=\sum_{u\in N_i}\dfrac{1}{\sqrt{|N_i|}\sqrt{|N_u|}}\mathbf{e}^{(k)}_u
$$
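
As a concrete illustration of this propagation rule (a minimal sketch over a made-up toy graph, not the authors' code), one layer can be computed directly from the neighbor sets $N_u$ and $N_i$:

```python
import torch

# Toy example (hypothetical): 2 users, 3 items, embedding size 4.
user_neighbors = {0: [0, 1], 1: [1, 2]}       # items each user interacted with (N_u)
item_neighbors = {0: [0], 1: [0, 1], 2: [1]}  # users who interacted with each item (N_i)

e_user = torch.randn(2, 4)  # e_u^(k)
e_item = torch.randn(3, 4)  # e_i^(k)

def propagate_users(e_user, e_item):
    """e_u^(k+1) = sum over i in N_u of e_i^(k) / sqrt(|N_u| |N_i|)."""
    out = torch.zeros_like(e_user)
    for u, items in user_neighbors.items():
        for i in items:
            norm = (len(user_neighbors[u]) * len(item_neighbors[i])) ** 0.5
            out[u] += e_item[i] / norm
    return out

def propagate_items(e_item, e_user):
    """e_i^(k+1) = sum over u in N_i of e_u^(k) / sqrt(|N_i| |N_u|)."""
    out = torch.zeros_like(e_item)
    for i, users in item_neighbors.items():
        for u in users:
            norm = (len(item_neighbors[i]) * len(user_neighbors[u])) ** 0.5
            out[i] += e_user[u] / norm
    return out

e_user_next = propagate_users(e_user, e_item)
e_item_next = propagate_items(e_item, e_user)
```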

The final representation is a weighted combination of the embeddings obtained at each layer:

$$
\mathbf{e}_u=\sum^K_{k=0}\alpha_k \mathbf{e}_u^{(k)}; \;\; \mathbf{e}_i=\sum^K_{k=0}\alpha_k\mathbf{e}^{(k)}_i
$$
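
The paper simply sets $\alpha_k = 1/(K+1)$, so the combination reduces to a uniform average over layers. A minimal sketch (with placeholder layer embeddings):

```python
import torch

K = 3                  # number of propagation layers
alpha = 1.0 / (K + 1)  # uniform layer weights, alpha_k = 1/(K+1)

# layer_embs_user[k] holds e_u^(k) for all users; same for items (placeholder shapes).
layer_embs_user = [torch.randn(2, 4) for _ in range(K + 1)]
layer_embs_item = [torch.randn(3, 4) for _ in range(K + 1)]

# Final representations: weighted sum over layers 0..K.
e_user_final = alpha * torch.stack(layer_embs_user).sum(dim=0)
e_item_final = alpha * torch.stack(layer_embs_item).sum(dim=0)
```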

The model prediction is defined as the inner product of the final user and item representations: $\hat{y}_{ui}=\mathbf{e}^T_u\mathbf{e}_i$. This score measures the affinity between the user and the item and is used to rank candidate items.
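
In code, scoring reduces to a matrix multiplication over the final embeddings (variable names here are illustrative placeholders):

```python
import torch

e_user_final = torch.randn(2, 4)  # final user embeddings e_u
e_item_final = torch.randn(3, 4)  # final item embeddings e_i

# Score of a single (user, item) pair: inner product e_u^T e_i.
score_ui = e_user_final[0] @ e_item_final[1]

# Scores of all items for all users at once (used for ranking at inference).
all_scores = e_user_final @ e_item_final.T  # shape (num_users, num_items)
```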

Matrix Form:

$$
\mathbf{A}=\begin{bmatrix} \mathbf{0} & \mathbf{R} \\ \mathbf{R}^T & \mathbf{0} \end{bmatrix},
\qquad
\mathbf{E}^{(k+1)}=\left(\mathbf{D}^{-1/2}\mathbf{A}\mathbf{D}^{-1/2}\right)\mathbf{E}^{(k)}
$$
  • $\mathbf{R}$ is an $M \times N$ user-item interaction matrix; each entry is 1 if user $u$ has interacted with item $i$, and 0 otherwise.

  • $\mathbf{D}$ is an $(M+N)\times(M+N)$ diagonal matrix, in which each entry $D_{ii}$ denotes the number of nonzero entries in the $i$-th row vector of $\mathbf{A}$.

  • $\mathbf{E}$ is an $(M+N)\times T$ embedding matrix, where $T$ is the embedding size.

This matrix form is easy to implement using torch_geometric.utils (reference); a sketch follows below.
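
For example, here is a minimal sketch of the matrix-form propagation, assuming a toy edge list and using `torch_geometric.utils.degree` to build the symmetric normalization $\mathbf{D}^{-1/2}\mathbf{A}\mathbf{D}^{-1/2}$ (the sizes $M$, $N$, $T$ and the edges are placeholders, not from the paper):

```python
import torch
from torch_geometric.utils import degree

M, N, T = 4, 5, 8  # num users, num items, embedding size (placeholders)
K = 3              # number of propagation layers

# Bipartite user-item edges; item ids are offset by M so A is (M+N) x (M+N).
user_idx = torch.tensor([0, 0, 1, 2, 3])
item_idx = torch.tensor([0, 2, 1, 3, 4]) + M

# Symmetric adjacency A = [[0, R], [R^T, 0]]: add both edge directions.
row = torch.cat([user_idx, item_idx])
col = torch.cat([item_idx, user_idx])

# Symmetric normalization D^{-1/2} A D^{-1/2} expressed as per-edge weights.
deg = degree(row, num_nodes=M + N)
norm = deg[row].pow(-0.5) * deg[col].pow(-0.5)

# Sparse propagation matrix.
A_hat = torch.sparse_coo_tensor(torch.stack([row, col]), norm, (M + N, M + N))

# E^(0): learnable user/item embeddings stacked into one matrix.
E = torch.nn.Embedding(M + N, T).weight

# Propagate K layers and average them (alpha_k = 1/(K+1)).
embs = [E]
for _ in range(K):
    embs.append(torch.sparse.mm(A_hat, embs[-1]))
E_final = torch.stack(embs).mean(dim=0)

e_user_final, e_item_final = E_final[:M], E_final[M:]
```

Note that the propagation contains no feature transformation matrix and no nonlinear activation; removing these from standard GCN layers is exactly the simplification LightGCN proposes.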
