Colloquia - Jie Ren, Enabling Big Memory Applications with Memory Heterogeneity, Virtual, 4:25 - 5:25 pm
Enabling Big Memory Applications with Memory Heterogeneity
Memory systems have evolved rapidly in recent years, driven by the emergence of large-scale applications and advances in hardware technology. This trend calls for memory systems with extreme heterogeneity, which combine multiple memory technologies with different latency, bandwidth, and capacity to construct main memory. This heterogeneity brings substantial disparities in performance and efficiency, making it intricate to decide which technology to use, and when. This talk will discuss how to use heterogeneous memory systems efficiently for big-memory applications. First, I will show how to train multi-billion-parameter models with limited GPU resources by exploiting heterogeneous memory. Then, I will show how memory heterogeneity enables large-scale plasma simulations on a single machine at unprecedented scales. To close, I will share my view on memory management for next-generation heterogeneous memory systems and how they can handle emerging workloads more efficiently.
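The abstract does not describe a specific algorithm, but the core decision it alludes to, which data to keep in the fast tier and which to demote to the slow tier, can be illustrated with a toy sketch. The function below (all names hypothetical, not from the talk) greedily places the objects with the highest access density into a capacity-limited fast tier; real heterogeneous memory runtimes use far more sophisticated, dynamic policies.

```python
def place_objects(objects, fast_capacity):
    """Toy tiering policy: hottest-per-byte objects go to fast memory.

    objects: list of (name, size_bytes, access_count) tuples.
    fast_capacity: byte budget of the fast memory tier.
    Returns (fast, slow): lists of object names per tier.
    """
    # Rank by access density (accesses per byte), hottest first.
    ranked = sorted(objects, key=lambda o: o[2] / o[1], reverse=True)
    fast, slow, used = [], [], 0
    for name, size, _ in ranked:
        if used + size <= fast_capacity:
            fast.append(name)   # fits in the fast-tier budget
            used += size
        else:
            slow.append(name)   # spills to the slow, larger tier
    return fast, slow

# Example: three objects competing for a 100-byte fast tier.
fast, slow = place_objects(
    [("weights", 80, 400), ("activations", 60, 300), ("optimizer", 40, 40)],
    fast_capacity=100,
)
```

Here `"weights"` (5 accesses/byte) wins the fast tier, while `"activations"` and `"optimizer"` spill to the slow tier once the budget is exhausted.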
Jie Ren is a Ph.D. candidate in Electrical Engineering and Computer Science at the University of California, Merced. Her research focuses on practical techniques for memory management in parallel computing systems, particularly runtime support for heterogeneous memory systems. Jie received the Graduate Dean's Dissertation Fellowship at UC Merced. Her publications appear in top-tier venues across the machine learning, computer architecture, and systems communities, including USENIX ATC, HPCA, ICS, and NeurIPS.