Tongzhou Wang

Member Of Technical Staff at OpenAI

United States

Summary

🤩 Rockstar · 🎓 Top School
Tongzhou Wang is a Member of Technical Staff at OpenAI with 10 years of experience bridging AI research and production engineering. He completed a PhD at MIT CSAIL after roles at FAIR where he helped build PyTorch and contributed to core repos like pytorch/pytorch, torchvision, and tutorials—work that spans both technical writing (improving docs and examples) and hands-on ML engineering. His open-source contributions include optimizing memory-heavy dataset distillation training loops and improving image-to-image translation pipelines, demonstrating a rare mix of usability-focused documentation and low-level model/code fixes. A dual-degree pedigree from Carnegie Mellon and UC Berkeley (top GPAs) underpins his work, and he maintains a professional site at tongzhouwang.info.
11 years of coding experience
5 years of employment as a software developer
Bachelor of Science (B.S.), Computer Science, Statistics, GPA: 3.90; Technical GPA: 3.98 at University of California, Berkeley
Middle School & High School at Shanghai Foreign Language School
Computer Science; Electrical and Computer Engineering, GPA 4.0 at Carnegie Mellon University

Github Skills (25)

pytorch (10)
develop (10)
documentations (10)
python (10)
image-processing (10)
machine-learning (10)
deep-learning (10)
computer-vision (10)
generative-adversarial-networks (10)
documentation (10)
algorithm (9)
code-optimization (9)
optimizations (9)
algorithms (9)
tensorflow (9)

Programming languages (14)

Java, CSS, C++, C, HTML, Jupyter Notebook, Groovy, Shell

Github contributions (5)

ssnl/dataset-distillation

Dec 2018 - May 2022

Dataset Distillation
Role in this project: ML Engineer
Contributions: 1 review, 26 commits, 5 PRs in 3 years 6 months
Contributions summary: Tongzhou primarily contributes to the `base_options.py` file, adding and modifying arguments for the training process and model architecture, including dropout, distillation parameters, and learning rates. Several commits improve the data loading and transformation process. Further contributions modify the distillation training loop in `train_distilled_image.py` and fix issues with optimizing distilled-image generation and memory usage.
deep-learning, pytorch, dataset, distillation
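To illustrate the kind of command-line options described in the summary above, here is a minimal sketch using `argparse`. The flag names (`--distill_lr`, `--distill_steps`, etc.) are hypothetical stand-ins, not the actual arguments defined in the repository's `base_options.py`:

```python
import argparse

def build_parser():
    # Sketch of training/distillation options; names are illustrative only.
    parser = argparse.ArgumentParser(description="dataset distillation options (sketch)")
    parser.add_argument("--lr", type=float, default=0.01,
                        help="learning rate for the model being trained")
    parser.add_argument("--distill_lr", type=float, default=0.001,
                        help="learning rate applied when optimizing the distilled images")
    parser.add_argument("--distill_steps", type=int, default=10,
                        help="gradient steps taken per distillation iteration")
    parser.add_argument("--dropout", type=float, default=0.5,
                        help="dropout probability in the model architecture")
    return parser

# Parse an explicit argument list so the sketch runs without a real CLI.
args = build_parser().parse_args(["--lr", "0.02", "--distill_steps", "20"])
print(args.lr, args.distill_steps, args.dropout)
```

Collecting such flags in one options module keeps every training and distillation hyperparameter discoverable from a single `--help` invocation.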
pytorch/pytorch

Sep 2017 - Oct 2022

Tensors and Dynamic neural networks in Python with strong GPU acceleration
Role in this project: Technical Writer
Contributions: 13 reviews, 506 commits, 653 PRs in 5 years 1 month
Contributions summary: Tongzhou primarily contributed to the project by updating and improving the documentation. His commits fixed grammar issues, corrected math displays, added missing colons, and improved code examples within the documentation files. He also added documentation for new features such as `weight_norm` and refined the sections on extending and using the PyTorch framework, enhancing the overall clarity and usability of the documentation.
python, gpu-acceleration, deep-learning, gpu, numpy
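The `weight_norm` feature mentioned above reparameterizes a weight vector as w = g · v / ‖v‖, decoupling its magnitude g from its direction v. A minimal NumPy sketch of that reparameterization (not PyTorch's actual implementation, which operates on module parameters):

```python
import numpy as np

def weight_norm_reparam(v, g):
    """Weight normalization: w = g * v / ||v||.

    Separates the magnitude (g) of a weight vector from its direction (v),
    the idea documented for torch.nn.utils.weight_norm.
    """
    return g * v / np.linalg.norm(v)

v = np.array([3.0, 4.0])           # direction parameter (||v|| = 5)
g = 10.0                           # magnitude parameter
w = weight_norm_reparam(v, g)
print(w)                           # points along v, with ||w|| == g
```

Because ‖w‖ always equals g, gradient descent can adjust the scale and direction of the weights independently.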