Graham Neubig

Pittsburgh, Pennsylvania, United States

Summary

Graham Neubig is an Associate Professor of Computer Science at Carnegie Mellon University and Co-Founder & Chief Scientist at All Hands AI, combining rigorous NLP research with startup product and strategy. Over 14 years he has shipped core ML tooling and model code—contributing to DyNet and the widely used nn4nlp codebase—while publishing open course materials and video lectures to broaden access to NLP. He has founded and led startups (Inspired Cognition) and brings real-world engineering chops in efficient numerical operations and language modeling alongside academic leadership. Trained at Kyoto University (PhD) and UIUC (BS CS), his career includes international academic posts and early work as an educator and international-relations coordinator in Japan, a background that fuels his emphasis on accessible, global-facing research.
15 years of coding experience

Github Skills (21)

pytorch (10)
c-language (10)
operation (10)
python (10)
tensorrt (10)
word-embeddings (10)
machine-learning (10)
tensorflow (10)
neural-networks (10)
natural-language-processing (10)
language-modeling (10)
word-embedding (10)
nlp (10)
tensor (10)
c-programming-language (10)

Programming languages (18)

C#, MDX, C++, CSS, TeX, Mustache, HTML, Perl

Github contributions (5)

clab/dynet

May 2015 - Mar 2020

DyNet: The Dynamic Neural Network Toolkit
Role in this project:
Back-end Developer & ML Engineer
Contributions:8 releases, 1 review, 1052 commits in 4 years 10 months
Contributions summary: Graham's commits focused on enhancing core functionality of the DyNet toolkit. Specifically, he worked on making nodes use the default device, fixing warnings, and "tensorifying" component-wise operations. These contributions center on optimizing core operations and making them more efficient, which is vital in a numerical deep-learning library.
dynet, dynamic-neural-network, deep-learning, neural-networks, machine-learning
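The "tensorifying" of component-wise operations mentioned in the summary can be illustrated with a small sketch. This is NumPy rather than DyNet's C++ internals, and the function names are invented for illustration: the idea is replacing a per-element loop with a single vectorized call over the whole tensor, so the library dispatches one optimized kernel instead of millions of scalar operations.

```python
import numpy as np

# Naive component-wise tanh: one Python-level operation per element.
def tanh_elementwise(x):
    out = np.empty_like(x)
    for i in range(x.size):
        out.flat[i] = np.tanh(x.flat[i])
    return out

# "Tensorified" version: a single vectorized call over the whole tensor.
def tanh_tensorified(x):
    return np.tanh(x)

x = np.random.randn(64, 128)
assert np.allclose(tanh_elementwise(x), tanh_tensorified(x))
```

Both functions compute the same values; the tensorified form is dramatically faster on large arrays because the loop runs in compiled code.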
neubig/nn4nlp-code

Aug 2017 - Jan 2020

Code Samples from Neural Networks for NLP
Role in this project:
ML Engineer
Contributions:48 commits, 22 PRs, 41 pushes in 2 years 5 months
Contributions summary: Graham modified several code samples for neural networks in NLP, indicating a focus on machine learning within this repository. He implemented and modified code for various models, including bag-of-words, continuous bag-of-words (CBOW), and deep CBOW architectures, showing involvement in core model development. He also added new code for language modeling, covering both neural-network and log-linear models, along with improvements related to efficiency, negative sampling, and binary prediction.
nlp, natural-language-processing, neural-networks, machine-learning
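As an illustration of the CBOW architecture mentioned above, here is a minimal NumPy sketch (not code from the repository; the parameter names and sizes are made up). A CBOW text classifier sums the embeddings of a sentence's words and scores each tag with a single linear layer:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim, num_tags = 1000, 64, 5

# Model parameters: an embedding table and a linear output layer.
embeddings = rng.normal(size=(vocab_size, embed_dim))
W = rng.normal(size=(num_tags, embed_dim))
b = np.zeros(num_tags)

def cbow_scores(word_ids):
    """Continuous bag-of-words: sum the embeddings of the words in the
    sentence, then score each tag with a linear layer."""
    summed = embeddings[word_ids].sum(axis=0)  # (embed_dim,)
    return W @ summed + b                      # (num_tags,)

scores = cbow_scores([4, 17, 256])
print(scores.shape)  # (5,)
```

A "deep CBOW" variant would insert one or more nonlinear hidden layers between the summed embedding and the output scores.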