Spectral Gap Rewiring Layer

GAP-Layer is a graph neural network layer that optimizes the spectral gap of a graph by minimizing or maximizing the size of its bottleneck. The goal of GAP-Layer is to create more connected or more separated communities, depending on the mining task at hand.

The Spectral Gap Rewiring

The first step in implementing GAP-Layer is to minimize the spectral gap by minimizing the following loss:

$$ L_{Fiedler} = \|\tilde{\mathbf{A}}-\mathbf{A}\|_F + \alpha(\lambda_2)^2 $$

where $\tilde{\mathbf{A}}$ is the rewired adjacency matrix, $\|\cdot\|_F$ is the Frobenius norm, $\lambda_2$ is the second-smallest eigenvalue of the graph Laplacian (the Fiedler value, which measures the spectral gap), and $\alpha$ trades off fidelity to the original adjacency $\mathbf{A}$ against the size of the gap.
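This loss can be evaluated directly with dense linear algebra. The sketch below is a minimal NumPy version, assuming the $\lambda_2$ term enters squared; the function name `fiedler_loss` and the toy bridge graph are illustrative, not the paper's implementation:

```python
import numpy as np

def fiedler_loss(A_tilde, A, alpha=1.0):
    """Sketch of the GAP-Layer loss: Frobenius distance between the
    rewired adjacency A_tilde and the original A, plus alpha * lambda_2^2,
    where lambda_2 is the second-smallest Laplacian eigenvalue."""
    # Graph Laplacian of the rewired adjacency
    L = np.diag(A_tilde.sum(axis=1)) - A_tilde
    eigvals = np.sort(np.linalg.eigvalsh(L))
    lam2 = eigvals[1]  # Fiedler value: the spectral gap
    return np.linalg.norm(A_tilde - A, "fro") + alpha * lam2 ** 2

# Two triangles joined by a single bridge edge (a clear bottleneck)
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
loss = fiedler_loss(A, A)  # A_tilde == A, so only the lambda_2 term remains
```

With `A_tilde == A` the Frobenius term vanishes, so the loss is pure $\alpha(\lambda_2)^2$; a gradient step on `A_tilde` would then trade a small edit to the adjacency against a smaller gap.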

Stochastic Steady-state Embedding

What is SSE?

SSE, which stands for Stochastic Steady-state Embedding, is an algorithm for learning the steady-state solutions of iterative algorithms over graphs. Unlike many other graph neural network models, SSE is stochastic and requires only 1-hop information to capture fixed-point relationships efficiently and effectively.

How does SSE work?

SSE takes as input a graph and a given steady-state algorithm, then learns the parameters of that algorithm with stochastic gradient descent and backpropagation.
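SSE's full training loop also learns parameters, but its defining trick, stochastic 1-hop fixed-point refreshes, can be sketched on a fixed algorithm such as PageRank. Everything below (the `stochastic_steady_state` name, the damping constant, the toy path graph) is an illustrative assumption, not the paper's implementation:

```python
import random
import numpy as np

def stochastic_steady_state(A, damping=0.85, steps=50000, seed=0):
    """Sketch of an SSE-style update scheme: instead of sweeping the
    whole graph each iteration, repeatedly pick a random node and refresh
    its value from its 1-hop neighbors until the values reach a fixed
    point. PageRank scores stand in for the steady-state quantity."""
    rng = random.Random(seed)
    n = A.shape[0]
    deg = A.sum(axis=1)
    h = np.full(n, 1.0 / n)                 # initial per-node estimates
    for _ in range(steps):
        v = rng.randrange(n)                # stochastic node pick
        nbrs = np.nonzero(A[v])[0]
        # 1-hop fixed-point update: h_v <- (1-d)/n + d * sum_u h_u / deg_u
        h[v] = (1 - damping) / n + damping * sum(h[u] / deg[u] for u in nbrs)
    return h

# Steady state on a 3-node path graph: the middle node, having more
# in-links, ends up with the largest score.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
h = stochastic_steady_state(A)
```

Each update touches only one node and its immediate neighbors, which is exactly the property that lets SSE scale to graphs where a full synchronous sweep would be too expensive.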

StoGCN

StoGCN is a stochastic training algorithm for graph convolutional networks (GCNs). It reduces the cost of gathering information from each node's neighbors, and the process provably converges to a local optimum of the GCN objective.

How does StoGCN work?

At its core, the algorithm is based on the idea that by controlling the variance of the estimator, you can sample a much smaller set of neighbors per node. Essentially, the algorithm randomly selects a small subset of each node's neighbors at every layer instead of aggregating over all of them.
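The variance-control idea can be sketched in a few lines: keep a cheap historical copy of every activation, aggregate that over all neighbors, and correct with fresh activations computed only on a small sample. The name `cv_aggregate` and the plain mean aggregator are illustrative assumptions, not StoGCN's exact estimator:

```python
import numpy as np

def cv_aggregate(A, h, h_hist, sample_size, rng):
    """Sketch of variance-controlled neighbor aggregation: average the
    *historical* activations of all neighbors (cheap, no fresh compute),
    then correct with (h - h_hist) evaluated on a small sampled neighbor
    set. The estimator stays unbiased, and its variance shrinks as the
    history h_hist catches up with the current activations h."""
    out = np.zeros_like(h)
    for v in range(A.shape[0]):
        nbrs = np.nonzero(A[v])[0]
        base = h_hist[nbrs].mean(axis=0)                  # full-neighborhood term
        S = rng.choice(nbrs, size=min(sample_size, len(nbrs)), replace=False)
        correction = (h[S] - h_hist[S]).mean(axis=0)      # sampled correction
        out[v] = base + correction
    return out
```

With a sample size of 1 the fresh compute per node is constant, yet whenever the history is exact (`h_hist == h`) the correction vanishes and the output equals the exact neighbor mean.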

Symbolic Deep Learning

Symbolic Deep Learning: An Overview

Symbolic deep learning is a technique for converting a trained neural network into an analytic equation. This approach gives a better understanding of the network's learned representations and has applications in discovering novel physical principles.

The Technique

The technique used in symbolic deep learning involves three steps:

1. Encourage sparse latent representations. Sparse latent representations refer to the idea that the network is trained so that only a small number of latent variables stay active, which keeps the learned internal functions simple.
2. Fit symbolic expressions to the network's learned internal functions.
3. Replace those functions with the fitted expressions to obtain an analytic model.
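The sparsity pressure in step 1 is typically an L1 penalty. The miniature below applies the same idea to a linear "network" over a library of candidate terms: the penalty zeroes out the spurious terms, and the surviving coefficient reads off as an analytic equation. The function name, term library, and data are all illustrative assumptions:

```python
import numpy as np

def sparse_fit(X, y, l1=0.1, lr=0.01, steps=2000):
    """Toy stand-in for the sparsity step: gradient descent on squared
    error plus an L1 penalty (applied via soft-thresholding) drives most
    coefficients over the candidate-term library to exactly zero."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)                      # squared-error gradient
        w -= lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * l1, 0.0)  # L1 soft-threshold
    return w

# Candidate term library [x, x**2, sin(x)]; the data actually follow y = 2*x**2.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
X = np.stack([x, x**2, np.sin(x)], axis=1)
w = sparse_fit(X, 2 * x**2)
# The x and sin(x) coefficients collapse toward 0, leaving roughly y = 2*x**2.
```

The same logic carried out on a graph network's message components, followed by symbolic regression over the few that survive, is what turns the network into an equation.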

TaxoExpan

Overview of TaxoExpan

TaxoExpan is a self-supervised taxonomy expansion framework that automatically generates pairs of query concepts and anchor concepts from an existing taxonomy as training data. With this data it learns to predict whether a query concept is a direct hyponym of an anchor concept. TaxoExpan features two primary components: a position-enhanced graph neural network and a noise-robust training objective. The primary goal of TaxoExpan is to attach new concepts to the right places in an existing taxonomy without requiring manually labeled data.
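The self-supervised pair generation can be sketched directly: every existing (hypernym, hyponym) edge becomes a positive (anchor, query) pair, and random non-parents of the same query become negatives. `make_training_pairs` is a hypothetical helper, not TaxoExpan's actual code:

```python
import random

def make_training_pairs(taxonomy_edges, nodes, n_negatives=2, seed=0):
    """Sketch of TaxoExpan-style self-supervision: each (hypernym,
    hyponym) edge in the existing taxonomy yields a positive
    (anchor, query) pair; corrupted anchors yield negatives."""
    rng = random.Random(seed)
    parents = {}
    for parent, child in taxonomy_edges:
        parents.setdefault(child, set()).add(parent)
    pairs = []
    for parent, child in taxonomy_edges:
        pairs.append((parent, child, 1))            # true hypernym: label 1
        candidates = [n for n in nodes if n != child and n not in parents[child]]
        for neg in rng.sample(candidates, min(n_negatives, len(candidates))):
            pairs.append((neg, child, 0))           # corrupted anchor: label 0
    return pairs

# Tiny taxonomy: animal -> {dog, cat}, dog -> puppy
edges = [("animal", "dog"), ("animal", "cat"), ("dog", "puppy")]
pairs = make_training_pairs(edges, ["animal", "dog", "cat", "puppy"], n_negatives=1)
```

Because the labels come for free from the taxonomy's own structure, no human annotation is needed; the model trained on these pairs can then score anchors for a genuinely new query concept.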

Temporal Graph Network

What is TGN?

Temporal Graph Network, or TGN for short, is a framework for deep learning on dynamic graphs, which are represented as sequences of timed events. TGNs analyze graph data whose information changes over time, which sets them apart from deep learning frameworks that handle only static graphs.

How Does TGN Work?

The memory, or state, of a Temporal Graph Network is a vector $\mathbf{s}_i(t)$ kept for each node $i$ the model has seen so far; it is updated whenever node $i$ takes part in an event.
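A minimal sketch of that memory module, assuming a plain tanh RNN cell in place of the paper's learnable (typically GRU) memory updater and a crude message of [other node's memory, time delta, event feature]; the class and attribute names are illustrative:

```python
import numpy as np

class TGNMemory:
    """Sketch of TGN's node memory: each node i carries a state vector
    s_i(t) that is refreshed whenever the node takes part in a timed
    event. A tanh RNN cell stands in for the usual GRU updater."""
    def __init__(self, n_nodes, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.s = np.zeros((n_nodes, dim))        # s_i(t) for every node seen so far
        self.last_update = np.zeros(n_nodes)     # timestamp of each node's last event
        msg_dim = dim + 2                        # other memory + time delta + feature
        self.W_s = rng.normal(0, 0.1, (dim, dim))
        self.W_m = rng.normal(0, 0.1, (dim, msg_dim))

    def event(self, src, dst, t, feat):
        """Process one interaction event (src <-> dst at time t)."""
        prev = {src: self.s[src].copy(), dst: self.s[dst].copy()}
        for node, other in ((src, dst), (dst, src)):
            dt = t - self.last_update[node]      # time since node's last event
            msg = np.concatenate([prev[other], [dt, feat]])
            self.s[node] = np.tanh(self.W_s @ prev[node] + self.W_m @ msg)
            self.last_update[node] = t

mem = TGNMemory(n_nodes=3, dim=4)
mem.event(0, 1, t=1.0, feat=0.5)  # only nodes 0 and 1 get new memory
```

Both endpoints read the *pre-event* memories before updating, so the order in which they are refreshed does not matter; nodes not involved in the event keep their state untouched.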
