Meiqi Wang is a software engineer based in San Francisco with 10 years of experience, currently on Rakuten's AIDD Group and holding an MS in Computational Linguistics from Brandeis. She specializes in NLP and machine learning and is a prolific open-source contributor to transformer and generative-model ecosystems, with foundational work on projects such as x-transformers, RETRO-pytorch, DALL·E/Imagen replications, and multiple video/audio diffusion repos. Her contributions span high-level architecture and low-level performance engineering: implementing core transformer layers, rotary embeddings, and retrieval cross-attention, and integrating the Lion optimizer with custom CUDA kernels into bitsandbytes. She has applied ML to scientific domains (AlphaFold/ColabFold) and shipped product-facing features such as UI enhancements for flagr, demonstrating a blend of research, engineering, and UX sensibilities. Her GitHub motto, "Working with Attention. It's all we need," reflects a career-long focus on attention mechanisms across language, vision, audio, and video.
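The attention mechanism named in the motto above can be illustrated with a minimal sketch of single-query scaled dot-product attention over plain Python lists (an illustrative example, not code from any of the repositories mentioned; the `attention` helper name is hypothetical):

```python
import math

def attention(q, k, v):
    """Single-query scaled dot-product attention.
    q: query vector of length d; k, v: lists of n key/value vectors of length d.
    Returns a weighted combination of the value vectors."""
    d = len(q)
    # similarity of the query to each key, scaled by sqrt(d)
    scores = [sum(qi * ki for qi, ki in zip(q, key)) / math.sqrt(d) for key in k]
    # numerically stable softmax over the keys
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # convex combination of the values, weighted by attention
    return [sum(w * row[i] for w, row in zip(weights, v)) for i in range(len(v[0]))]

# usage: the query matches the first key more closely, so the output
# leans toward the first value vector
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[1.0, 0.0], [0.0, 1.0]])
```

Real implementations batch this over many queries and heads as matrix multiplies; the per-query arithmetic is the same.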
10 years of coding experience
4 years of employment as a software developer
MD, Medicine at University of Michigan Medical School
Bachelor's degree, Electrical and Computer Engineering (ECE), Summa Cum Laude at Cornell University
Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
Role in this project:
ML Engineer
Contributions: 177 releases, 5 reviews, 397 commits in 1 year 6 months
Contributions summary: Phil's commits focused on implementing and refining the core components of a DALL-E-style text-to-image generation model in PyTorch. They built foundational elements such as a VAE and a CLIP model and integrated them with a transformer-based decoder that generates images from text, contributing key features including image generation, multiple attention-mechanism variants, and techniques like classifier-free guidance.
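Classifier-free guidance, mentioned above, combines a conditional and an unconditional prediction at sampling time by extrapolating from the unconditional one toward the text-conditioned one. A minimal sketch over plain Python lists (the `cfg_combine` helper name is hypothetical; the actual repository applies this inside the model's sampling loop on tensors):

```python
def cfg_combine(cond_logits, uncond_logits, guidance_scale=3.0):
    """Classifier-free guidance: push the unconditional prediction
    toward the conditional one by guidance_scale."""
    return [u + guidance_scale * (c - u)
            for c, u in zip(cond_logits, uncond_logits)]

# usage: a scale of 1.0 recovers the conditional prediction unchanged;
# larger scales amplify the difference between the two predictions
combined = cfg_combine([0.25, 0.75], [0.5, 0.5], guidance_scale=2.0)
```

Scales above 1.0 strengthen adherence to the text prompt at the cost of sample diversity, which is why it is a popular knob in text-to-image models.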
🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than Adam(W), in PyTorch
Role in this project:
ML Engineer
Contributions: 15 releases, 29 commits, 9 PRs in 26 days
Contributions summary: Phil primarily contributed to the development and optimization of the Lion optimizer, a novel approach to gradient descent. Their work included implementing the core Lion algorithm in PyTorch, refining the code, and integrating a high-performance Triton implementation for faster computation. They also improved performance through in-place operations and addressed compatibility issues, reflecting a focus on the practical, efficient implementation of the optimizer within the PyTorch ecosystem.
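The Lion update rule itself is compact: the parameter moves in the sign of an interpolation between momentum and gradient, while momentum tracks the gradient with a second beta. A minimal scalar sketch in plain Python (the `lion_step` helper name is hypothetical; the actual repository implements this on PyTorch tensors with in-place ops and a Triton kernel):

```python
def lion_step(param, grad, momentum, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update on a scalar parameter."""
    # interpolate momentum and gradient; only the sign drives the update
    update = beta1 * momentum + (1 - beta1) * grad
    sign = (update > 0) - (update < 0)
    # apply update with decoupled weight decay, as in AdamW
    param = param - lr * (sign + wd * param)
    # momentum is an exponential moving average of the gradient
    momentum = beta2 * momentum + (1 - beta2) * grad
    return param, momentum

# usage: minimize f(x) = x^2 starting from x = 1.0
x, m = 1.0, 0.0
for _ in range(1000):
    grad = 2 * x
    x, m = lion_step(x, m, grad, lr=1e-3)
```

Because the update magnitude is always exactly `lr` per step regardless of gradient scale, Lion is memory-light (one momentum buffer, no second moment) but more sensitive to the learning-rate choice than Adam(W).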