Trends in Numerical Methods in the Era of GPU Accelerated Computing and AI
Seminar: EE: CS | February 11 | 4-5 p.m. | 3110 Etcheverry Hall
Harun Bayraktar, Ph.D., NVIDIA
Over the last decade, GPU-accelerated computing has dramatically changed the HPC world, making exascale computing a reality, and these changes show no sign of slowing down. Thanks to newer GPU architectures and advances in AI-related technologies, new computational methods are emerging that leverage hardware and software technologies from both AI and traditional HPC. In this talk, we will look at one of these recently emerging trends: low- and mixed-precision computing in numerical methods. In particular, we will look at one application of mixed-precision computing that accelerates a very commonly used dense linear algebra routine by almost a factor of four while still delivering the solution in double precision on the same computer hardware.
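The abstract does not name the routine, but a classic instance of this technique is mixed-precision iterative refinement for dense linear solves: the expensive O(n³) factorization runs in low precision, and cheap double-precision residual corrections recover a full-accuracy solution. The sketch below (all function names and tolerances are illustrative assumptions, using CPU NumPy/SciPy as a stand-in for GPU libraries) shows the idea:

```python
import numpy as np
import scipy.linalg as la

def mixed_precision_solve(A, b, tol=1e-12, max_iter=50):
    """Solve Ax = b to double precision via a single-precision factorization.

    Illustrative sketch of mixed-precision iterative refinement: the
    expensive LU factorization is done in float32, while cheap residual
    corrections in float64 restore double-precision accuracy.
    """
    # O(n^3) factorization in low precision -- this is the costly step
    lu, piv = la.lu_factor(A.astype(np.float32))
    # Initial solve in low precision, then promote to double
    x = la.lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
    for _ in range(max_iter):
        # Residual computed in double precision (O(n^2), cheap)
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        # Correction solve reuses the low-precision factorization
        d = la.lu_solve((lu, piv), r.astype(np.float32))
        x += d.astype(np.float64)
    return x
```

On GPUs the payoff is larger than on CPUs, since low-precision arithmetic maps onto specialized units (e.g., Tensor Cores) with much higher throughput than FP64, while the refinement loop keeps the delivered solution at double precision.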