About Me
Welcome to my academic webpage! I am a PhD student at the University of South Carolina, working at the intersection of compiler optimizations and distributed machine learning. My research is driven by a passion for developing techniques that make machine learning systems more efficient and effective. You can find my CV here.
Research Interests
- Compiler optimizations for machine learning workloads
- Distributed machine learning
- Efficient inference strategies for large language models (LLMs)
Experience
Internship at d-Matrix.ai
Duration: 9 months
- Designed a high-level multi-pass framework bridging the MLIR compiler and the simulator backend for efficient LLM inference on Corsair devices.
- Explored parallelism strategies and identified architectural bottlenecks, informing design improvements for Corsair devices.
Current Research
My current research focuses on optimizing compilers for DNN workloads. The goal is to leverage the capabilities of LLMs to analyze workloads and predict effective compiler optimization strategies for them. With their broad knowledge bases and strong pattern-recognition abilities, LLMs offer a promising avenue for refining the compilation process. Central to this research is exploring how LLMs can guide and improve compiler optimizations.
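To give a flavor of the idea, here is a minimal sketch of how an LLM might be queried for a compiler pass ordering. Everything in it is a hypothetical assumption for illustration: the workload features, the pass names, and the stubbed LLM reply stand in for a real model call and a real pass pipeline.

```python
# Hypothetical sketch: asking an LLM to suggest a compiler pass ordering
# for a DNN workload. Feature names, pass names, and the stubbed reply
# are illustrative assumptions, not real tooling or real model output.

def build_prompt(workload: dict) -> str:
    """Serialize workload features into a prompt requesting a pass ordering."""
    feats = ", ".join(f"{k}={v}" for k, v in sorted(workload.items()))
    return (
        "Given a DNN workload with features: " + feats + ", "
        "list the compiler passes to apply, in order, comma-separated."
    )

def parse_pass_ordering(llm_response: str) -> list[str]:
    """Parse a comma-separated pass list out of the LLM's reply."""
    return [p.strip() for p in llm_response.split(",") if p.strip()]

# Illustrative workload features (hypothetical).
workload = {"op_count": 1840, "dominant_op": "matmul", "batch_size": 8}
prompt = build_prompt(workload)

# A real system would send `prompt` to an LLM; here the reply is stubbed.
stub_reply = "operator-fusion, layout-transform, loop-tiling, vectorize"
ordering = parse_pass_ordering(stub_reply)
```

In a full system, the predicted ordering would then be validated by compiling and measuring the workload, with the results fed back to refine future predictions.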