Chiyuan Zhang

Research Scientist at Google Brain

Boston, Massachusetts

Summary

Badges: Rockstar · Top School
Top expert in High-Performance Machine Learning Computing
Chiyuan Zhang is a research scientist at Google Brain with 17 years of software and research experience spanning deep learning systems, language runtimes, and developer tooling. He earned his PhD at MIT and interned at DeepMind and Google, and he works at the intersection of ML research and low-level engineering: contributing to MXNet and Caffe backends, NDArrays, HDF5 data-layer fixes, and Julia deep-learning tooling. His open-source footprint is unusually broad for an ML researcher, ranging from VM and FFI fixes in Rubinius to elisp work on the Emacs yasnippet template system, showing fluency across C++, Julia, and elisp. Based in Boston and originally from China, he combines academic rigor with production-focused engineering to turn research ideas into robust infrastructure.
18 years of coding experience
5 years of employment as a software developer
Doctor of Philosophy (PhD), Computer Science at MIT
Summer Exchange Program, Computer Science at Kyoto University
BE, Computer Science at Zhejiang University
Languages: Chinese, English, Japanese

GitHub Skills (47)

sgd, c-plus, c-language, caffe, gd, python, data-science, api-design, testing, machine-learning, templater, template-engine, virtual-machine, ruby, machinelearning

Programming languages (14)

C++, Rust, C, TeX, HTML, Jupyter Notebook, Julia, TypeScript

GitHub contributions (5)

pluskid/Mocha.jl

Oct 2014 - Dec 2018

Deep Learning framework for Julia
Role in this project: ML Engineer
Contributions: 885 commits, 85 PRs, 212 pushes in 4 years 2 months
Contributions summary: Chiyuan made significant contributions to a deep learning framework for Julia. His commits focused on implementing and integrating core features: removing dependencies, adding momentum to stochastic gradient descent, implementing a basic weight initializer, making the backend explicit, adding regularizers, and implementing a softmax loss layer. These contributions indicate a focus on the core functionality and optimization of the framework, along with building out its layer capabilities.
automatic-differentiation · deep-learning · machine-learning · neural-network · framework-learning
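Two of the features credited above, momentum for stochastic gradient descent and a softmax loss layer, can be sketched as follows. This is an illustrative Python/NumPy sketch of the general techniques, not Mocha.jl's actual Julia code; all names and defaults here are assumptions.

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, momentum=0.9):
    """One momentum-SGD update: velocity accumulates an exponentially
    decaying sum of past gradients, which damps oscillations."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

def softmax_loss(scores, label):
    """Softmax cross-entropy loss for one example, computed stably by
    shifting scores by their max before exponentiating."""
    shifted = scores - scores.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[label]

# Toy run: minimize f(w) = 0.5 * ||w||^2 (gradient = w) with momentum SGD.
w = np.array([1.0, -2.0])
v = np.zeros_like(w)
for _ in range(100):
    w, v = sgd_momentum_step(w, grad=w, velocity=v)
# after 100 steps, w has contracted toward the minimum at the origin
```

In a framework like Mocha.jl these pieces live behind layer and solver abstractions; the sketch only shows the underlying arithmetic.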
apache/mxnet

Oct 2015 - May 2017

Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
Role in this project: Back-end Developer
Contributions: 699 commits, 109 PRs, 43 pushes in 1 year 7 months
Contributions summary: Chiyuan (GitHub handle pluskid) contributed to the MXNet.jl project, focusing on low-level infrastructure. His work included generating Julia bindings, implementing basic API tests, creating and manipulating NDArrays, and fixing code typos. His contributions primarily involved building and maintaining the core components of the library.
python · scheduler · dataflow · mutation · data-science
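The project description mentions a "mutation-aware dataflow dep scheduler". The core idea, tracking read-after-write and write-after-read dependencies between operations that mutate shared arrays, can be sketched in miniature. This is a toy Python illustration of the concept under stated assumptions; the class and method names are hypothetical and not MXNet's actual engine API.

```python
from collections import defaultdict

class ToyScheduler:
    """Records, for each pushed op, which earlier ops it must wait for,
    based on the arrays it reads and writes (mutates)."""

    def __init__(self):
        self.last_writer = {}             # array name -> op that last wrote it
        self.readers = defaultdict(list)  # array name -> ops reading it since last write
        self.order = []                   # (op, sorted dependency list)

    def push(self, name, reads=(), writes=()):
        deps = set()
        for a in reads:                   # read-after-write: wait for the writer
            if a in self.last_writer:
                deps.add(self.last_writer[a])
        for a in writes:                  # write-after-read and write-after-write
            deps.update(self.readers[a])
            if a in self.last_writer:
                deps.add(self.last_writer[a])
        for a in reads:
            self.readers[a].append(name)
        for a in writes:
            self.last_writer[a] = name
            self.readers[a] = []
        self.order.append((name, sorted(deps)))
        return deps

sched = ToyScheduler()
sched.push("init_a", writes=["a"])
sched.push("init_b", writes=["b"])
add_deps = sched.push("add", reads=["a", "b"], writes=["c"])
inc_deps = sched.push("inc_a", reads=["a"], writes=["a"])  # in-place mutation of "a"
```

Here "add" must wait for both initializers, and the in-place "inc_a" must additionally wait for "add", which still reads the old value of "a"; this is the mutation-awareness the description refers to.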