Learn Before
  • Neural Message Passing


Subsampling of Graph-level Message Passing

To enable mini-batch training, message passing can be performed over a subsampled set of nodes and their sampled neighborhoods rather than over the full graph.
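As a minimal sketch of this idea (in the style of GraphSAGE-like neighbor sampling; all function and variable names here are illustrative, not taken from any particular library), one can draw a mini-batch of target nodes, sample a fixed number of neighbors per node, and aggregate messages only over those samples:

```python
import random

def sample_neighbors(adj, node, k, rng):
    """Sample at most k neighbors of `node` from adjacency dict `adj`."""
    neighbors = adj[node]
    if len(neighbors) <= k:
        return list(neighbors)
    return rng.sample(neighbors, k)

def mini_batch_message_passing(adj, features, batch, k, rng):
    """One round of mean aggregation restricted to sampled neighborhoods."""
    new_features = {}
    for node in batch:
        sampled = sample_neighbors(adj, node, k, rng)
        # Fall back to the node's own feature if it has no neighbors.
        msgs = [features[v] for v in sampled] or [features[node]]
        mean = sum(msgs) / len(msgs)
        # Update: combine the self feature with the aggregated message.
        new_features[node] = 0.5 * (features[node] + mean)
    return new_features

rng = random.Random(0)
adj = {0: [1, 2, 3], 1: [0], 2: [0, 3], 3: [0, 2]}
features = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
batch = [0, 2]  # mini-batch of target nodes, not the whole graph
out = mini_batch_message_passing(adj, features, batch, k=2, rng=rng)
```

Because only the nodes in `batch` and their sampled neighbors are touched, the per-step cost is bounded by the batch size times `k`, independent of the total graph size.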

Updated 2022-07-17

Contributors:

Ge Zhang

Affiliation:

University of Michigan - Ann Arbor

References


  • Graph Representation Learning by William Hamilton

Tags

Data Science

Related
  • Formal Definition of Neural Message Passing
  • Steps of Neural Message Passing
  • Message Passing with Self-loops
  • Generalized Neighborhood Aggregation
  • Graph-level Implementations of Message Passing
  • Subsampling of Graph-level Message Passing
  • Generalized Message Passing
Learn After
  • Why Subsampling of Graph-level Message Passing is Needed?
  • Challenge of Subsampling of Graph-level Message Passing
  • Inductive representation learning on large graphs.
  • Basic Idea of Inductive Representation Learning on Large Graphs
© 1Cademy 2026