Attentive Walk-Aggregating Graph Neural Network

Have you ever heard of AWARE? It stands for Attentive Walk-Aggregating graph neural network. It may sound complicated, but it is actually a simple, interpretable, and supervised GNN model for graph-level prediction.

What is AWARE and How Does it Work?

AWARE aggregates walk information by means of weighting schemes at distinct levels: vertex, walk, and graph. The weighting schemes are incorporated in a principled manner, meaning they are carefully and systematically built into the model rather than added ad hoc, so the model can emphasize the walks that matter most for the prediction task.
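
A minimal numpy sketch of the walk-aggregation idea follows, assuming simple dot-product attention and explicit walk enumeration on a tiny graph; the function names, the attention parameterization, and the final graph-level gate are illustrative assumptions rather than the exact AWARE formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def enumerate_walks(adj, length):
    """All walks with `length` vertices (explicit enumeration; only feasible on toy graphs)."""
    walks = [[v] for v in range(adj.shape[0])]
    for _ in range(length - 1):
        walks = [w + [u] for w in walks for u in range(adj.shape[0]) if adj[w[-1], u]]
    return walks

def aware_sketch(adj, X, a_vertex, a_walk, a_graph, length=3):
    """Aggregate walk embeddings with vertex-, walk-, and graph-level weighting."""
    walk_embs = []
    for walk in enumerate_walks(adj, length):
        feats = X[walk]                          # features of the vertices on this walk
        w_v = softmax(feats @ a_vertex)          # vertex-level attention within the walk
        walk_embs.append(w_v @ feats)            # weighted sum -> one embedding per walk
    W = np.stack(walk_embs)
    w_w = softmax(W @ a_walk)                    # walk-level attention across all walks
    g = w_w @ W                                  # aggregate into a graph embedding
    gate = 1.0 / (1.0 + np.exp(-(g @ a_graph)))  # graph-level weight (illustrative)
    return gate * g

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))
emb = aware_sketch(adj, X, *(rng.normal(size=8) for _ in range(3)))
print(emb.shape)                                 # (8,) graph-level embedding for a downstream predictor
```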

DeepDrug

DeepDrug is a deep learning framework for drug design and discovery. Using graph convolutional networks, it learns graph representations of drugs and proteins to boost the prediction accuracy of drug-protein interactions.

Understanding DeepDrug

The process of drug discovery and design is fraught with challenges, and one of the biggest hurdles is the accurate prediction of how a candidate drug will interact with its target proteins.
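
To make the general pattern concrete, here is a minimal numpy sketch, not the published DeepDrug architecture: encode the drug graph and the protein graph with graph-convolutional layers, pool each into a vector, and score the pair. All layer sizes and weights here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(adj, X, W):
    """One graph-convolution step: average over neighbours (plus self-loop), then transform."""
    A_hat = adj + np.eye(adj.shape[0])
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)
    return np.maximum(D_inv * (A_hat @ X) @ W, 0.0)        # ReLU

def encode_graph(adj, X, W1, W2):
    """Two GCN layers followed by mean pooling -> one embedding per graph."""
    return gcn_layer(adj, gcn_layer(adj, X, W1), W2).mean(axis=0)

def interaction_score(drug_emb, prot_emb, w_out):
    """Score a drug-protein pair from the concatenated graph embeddings."""
    z = np.concatenate([drug_emb, prot_emb])
    return 1.0 / (1.0 + np.exp(-(z @ w_out)))              # probability-like score

d_in, d_hid = 6, 16
W1, W2 = rng.normal(size=(d_in, d_hid)) * 0.1, rng.normal(size=(d_hid, d_hid)) * 0.1
w_out = rng.normal(size=2 * d_hid)

# Toy molecule and protein graphs; a real system would use separate encoders per modality.
drug_adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
prot_adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], float)
drug_X, prot_X = rng.normal(size=(3, d_in)), rng.normal(size=(3, d_in))

score = interaction_score(encode_graph(drug_adj, drug_X, W1, W2),
                          encode_graph(prot_adj, prot_X, W1, W2), w_out)
print(f"predicted interaction score: {score:.3f}")
```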

Graph Finite-State Automaton

GFSA, or Graph Finite-State Automaton, is a differentiable layer for learning graph structure in machine learning applications: it can be trained end-to-end to add derived relationships, or edges, to arbitrary graph-structured data. GFSA works by adding a new edge type, expressed as a weighted adjacency matrix, to a base graph.

What is GFSA?

If you are familiar with machine learning on graph-structured data, you may have run into tasks where the most useful relationships are not explicit edges in the input. GFSA learns such derived relationships directly, producing a weighted adjacency matrix that downstream models can use alongside the original edges.
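
As a rough numpy sketch of the interface only (a new weighted adjacency matrix derived from a base graph), the code below mixes multi-hop transition matrices with learned weights; this mixture is a stand-in, not GFSA's actual automaton semantics.

```python
import numpy as np

rng = np.random.default_rng(0)

def derived_adjacency(adj, mix_logits, max_hops=3):
    """Build a new weighted edge type as a learned mixture of multi-hop transitions.

    The real GFSA layer runs a learned finite-state automaton over walks; here we
    simply mix powers of the row-normalised transition matrix with softmax weights.
    """
    P = adj / np.clip(adj.sum(axis=1, keepdims=True), 1e-9, None)
    weights = np.exp(mix_logits) / np.exp(mix_logits).sum()
    hop, out = P.copy(), np.zeros_like(P)
    for k in range(max_hops):
        out += weights[k] * hop
        hop = hop @ P
    return out                                             # weighted adjacency for the derived edge type

adj = np.array([[0, 1, 0, 0],
                [0, 0, 1, 0],
                [0, 0, 0, 1],
                [1, 0, 0, 0]], float)
mix_logits = rng.normal(size=3)                            # would be trained end-to-end in practice
new_edges = derived_adjacency(adj, mix_logits)
augmented = np.stack([adj, new_edges])                     # base edge type + derived edge type
print(augmented.shape)                                     # (2, 4, 4)
```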

Graph sampling based inductive learning method

Introduction to GraphSAINT

GraphSAINT is a graph sampling based inductive learning method that helps train large-scale graph neural networks (GNNs) more efficiently: instead of training on the full graph at once, it samples small subgraphs and builds training minibatches from them. GNNs are a class of models that learn from data that exists in the form of graphs. Graphs represent relationships between objects; a social network, for example, can be represented as a graph where each person is a node and the relationships between people (friends, family members, colleagues, etc.) are edges.
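
A toy numpy sketch of the subgraph-sampling idea follows, assuming a uniform node sampler; GraphSAINT additionally offers edge and random-walk samplers and applies normalization coefficients to keep the estimates unbiased, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_subgraph(adj, X, y, num_nodes):
    """Uniform node sampler: pick a node subset and take its induced subgraph."""
    nodes = rng.choice(adj.shape[0], size=num_nodes, replace=False)
    return adj[np.ix_(nodes, nodes)], X[nodes], y[nodes]

def gcn_forward(adj, X, W):
    """Single graph-convolution step producing class logits."""
    A_hat = adj + np.eye(adj.shape[0])
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)
    return D_inv * (A_hat @ X) @ W

# A toy "large" graph: each minibatch trains on a small sampled subgraph only.
n, d, c = 1000, 8, 3
adj = (rng.random((n, n)) < 0.01).astype(float)
X, y = rng.normal(size=(n, d)), rng.integers(0, c, size=n)
W = rng.normal(size=(d, c)) * 0.1

for step in range(3):
    sub_adj, sub_X, sub_y = sample_subgraph(adj, X, y, num_nodes=64)
    logits = gcn_forward(sub_adj, sub_X, W)
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(64), sub_y] + 1e-9).mean()
    print(f"step {step}: subgraph loss {loss:.3f}")        # gradient updates omitted for brevity
```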

Graphical Mutual Information

What is GMI?

GMI, also known as Graphical Mutual Information, is a measure of the dependence between input graphs and their high-level hidden representations. Different from conventional mutual information computations that take place in vector space, GMI extends the calculation to the graph domain. Measuring mutual information from the two aspects of node features and topological structure is essential in the graph domain, and GMI makes that possible.

Benefits

GMI provides a way to measure this dependence directly on graphs, covering both node features and topological structure rather than only vector-space representations, which makes it a natural training signal for unsupervised graph representation learning.
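
The numpy sketch below shows one way such a quantity can be estimated in practice, with a Jensen-Shannon-style discriminator for the feature term and an adjacency-agreement term for the topology; the discriminator form and the way the two terms are split are illustrative assumptions, not the exact GMI definition.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def feature_mi_estimate(H, X, M):
    """JSD-style lower bound on MI between node embeddings H and input features X.

    Positive pairs match a node's embedding with its own features; negatives use
    shuffled features. The bilinear discriminator M would be trained jointly.
    """
    pos = sigmoid(np.einsum('nd,de,ne->n', H, M, X))
    neg = sigmoid(np.einsum('nd,de,ne->n', H, M, X[rng.permutation(len(X))]))
    return np.log(pos + 1e-9).mean() + np.log(1 - neg + 1e-9).mean()

def topology_mi_estimate(H, adj):
    """Topology term: embedding similarity should agree with the adjacency structure."""
    sim = sigmoid(H @ H.T)
    return (adj * np.log(sim + 1e-9) + (1 - adj) * np.log(1 - sim + 1e-9)).mean()

n, d, k = 6, 5, 4
X = rng.normal(size=(n, d))
H = rng.normal(size=(n, k))                                # would come from a GNN encoder
M = rng.normal(size=(k, d))
adj = (rng.random((n, n)) < 0.3).astype(float)
print(feature_mi_estimate(H, X, M), topology_mi_estimate(H, adj))
```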

Learnable adjacency matrix GCN

In recent years, graph neural networks (GNNs) have been gaining popularity in deep learning for their ability to work with non-Euclidean data such as graphs and networks, and they have been used for applications such as node classification, link prediction, and graph classification. A limitation of traditional GNNs, however, is that their structure is not learnable: the graph (adjacency matrix) the network operates on is fixed before training and cannot adapt to the specifics of the data being modeled. A learnable adjacency matrix GCN addresses this by treating the adjacency matrix itself as a trainable parameter, so the graph structure is learned jointly with the model weights.
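
A minimal numpy sketch of this idea follows, assuming a dense, sigmoid-squashed logit matrix as the learned graph; real variants typically add sparsity penalties or combine the learned graph with a fixed prior, and the parameterization here is only illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def learnable_adj_gcn_layer(X, adj_logits, W):
    """GCN propagation where the graph itself is a trainable parameter.

    adj_logits is learned alongside W; squashing with a sigmoid and symmetrising
    keeps the implied adjacency matrix dense but well-behaved.
    """
    A = sigmoid((adj_logits + adj_logits.T) / 2.0)          # learned, symmetric soft adjacency
    A_hat = A + np.eye(A.shape[0])
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)
    return np.maximum(D_inv * (A_hat @ X) @ W, 0.0)         # ReLU

n, d_in, d_out = 5, 4, 8
X = rng.normal(size=(n, d_in))
adj_logits = rng.normal(size=(n, n)) * 0.1                  # updated by gradient descent during training
W = rng.normal(size=(d_in, d_out))
print(learnable_adj_gcn_layer(X, adj_logits, W).shape)      # (5, 8)
```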

Multi-partition Embedding Interaction

MEI is a novel approach that addresses the efficiency-expressiveness trade-off in knowledge graph embedding, which has been a challenging problem in machine learning. The technique uses multi-partition embedding interaction with a block term tensor format to split the embedding vectors into multiple partitions and learn the local interaction patterns from data. In this way, MEI can balance efficiency and expressiveness in a controlled way, rather than being exclusively optimized for one at the expense of the other.
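
A minimal numpy sketch of a block-term-style scoring function follows; the partition sizes and the per-partition core tensors are illustrative assumptions, not the exact MEI parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def mei_score(h, r, t, cores, K):
    """Split head/relation/tail embeddings into K partitions; each partition interacts
    through its own small core tensor, and the partition scores are summed."""
    h_p, r_p, t_p = (np.split(v, K) for v in (h, r, t))
    score = 0.0
    for k in range(K):
        # h_k (a,) x core_k (a,b,c) x r_k (b,) x t_k (c,) -> scalar local interaction
        score += np.einsum('a,abc,b,c->', h_p[k], cores[k], r_p[k], t_p[k])
    return score

K, part = 3, 4                                              # 3 partitions of size 4 -> embedding dim 12
dim = K * part
h, r, t = (rng.normal(size=dim) for _ in range(3))
cores = rng.normal(size=(K, part, part, part))              # one small core tensor per partition
print(mei_score(h, r, t, cores, K))                         # higher score = more plausible (h, r, t) triple
```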
