Document Type
Event
Start Date
23-4-2023 5:00 PM
Description
In recent years, CNNs (Convolutional Neural Networks) and GNNs (Graph Neural Networks) have gained significant attention in fields such as computer vision, natural language processing, and social network analysis. Training large-scale CNN and GNN models may take several months or, in some cases, even longer to complete. SCALE-Sim is a simulator for CNN accelerators built around a systolic array and SRAM buffers. However, SCALE-Sim supports only CNN inference, not back-propagation. To enable evaluation of CNN training, in this work we extend SCALE-Sim to support CNN back-propagation. Given the importance of GNN applications and their training bottlenecks, we also explore how to integrate GNN training into SCALE-Sim via GraphSAGE to generate node embeddings. The goal of this work is to extend SCALE-Sim to support both CNN and GNN training, enabling us to evaluate the efficiency of machine learning training systems.
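As a rough illustration of the GraphSAGE node-embedding step mentioned above, the sketch below implements one mean-aggregation GraphSAGE layer in plain Python. The function names, the toy graph, and the weight matrices are illustrative assumptions for exposition; they are not part of SCALE-Sim or any specific GraphSAGE implementation.

```python
# Minimal sketch of one GraphSAGE layer with mean aggregation.
# h:   dict mapping node id -> feature vector (list of floats)
# adj: dict mapping node id -> list of neighbor ids
# All names here are hypothetical, chosen for illustration only.

def mean_aggregate(h, neighbors):
    """Average the feature vectors of a node's neighbors."""
    if not neighbors:
        return [0.0] * len(next(iter(h.values())))
    dim = len(h[neighbors[0]])
    agg = [0.0] * dim
    for n in neighbors:
        for i in range(dim):
            agg[i] += h[n][i]
    return [v / len(neighbors) for v in agg]

def sage_layer(h, adj, W_self, W_neigh):
    """One layer: h_v' = ReLU(W_self . h_v + W_neigh . mean(h_N(v)))."""
    out = {}
    for v, hv in h.items():
        agg = mean_aggregate(h, adj[v])
        z = [sum(W_self[r][i] * hv[i] for i in range(len(hv)))
             + sum(W_neigh[r][i] * agg[i] for i in range(len(agg)))
             for r in range(len(W_self))]
        out[v] = [max(0.0, x) for x in z]  # ReLU non-linearity
    return out

# Tiny example: 3-node graph, 2-dim features, identity weights.
h = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
adj = {0: [1, 2], 1: [0], 2: [0]}
I = [[1.0, 0.0], [0.0, 1.0]]
embeddings = sage_layer(h, adj, I, I)  # node 0 -> [1.5, 1.0]
```

Each such layer is dominated by the two dense matrix products, which is why GNN aggregation can be mapped onto the same GEMM-style dataflow that a systolic-array simulator already models.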
GR-362 SCALE-Sim extension to support GNN inputs and CNN back propagation