
Jeff Dean
Born 1968 · Age 57
American computer scientist and software engineer; long-time Google engineer and leader; co-designer of MapReduce, Bigtable, Spanner, and TensorFlow; co-founder and long-time lead of Google Brain; head of Google AI from 2018; appointed Google's Chief Scientist in 2023 when Google Brain and DeepMind merged into Google DeepMind.
Life & Career Timeline
Early computing exposure (approx.)
Began programming in his youth and high-school years (later working summers on Epi Info); a formative experience that led to a career in computer science.
Summer work on Epi Info (CDC/WHO)
Worked during high school and college summers at the Centers for Disease Control and later the World Health Organization developing versions of Epi Info (epidemiological analysis software).
Worked at WHO Global Programme on AIDS
Joined the World Health Organization's Global Programme on AIDS (1990–1991), developing software for statistical modelling and forecasting of the HIV/AIDS pandemic.
B.S. summa cum laude, University of Minnesota
Received a B.S. in Computer Science and Economics, summa cum laude. His undergraduate honors thesis, advised by Vipin Kumar, covered parallel training of neural networks implemented in C.
End of WHO role prior to grad school
Completed work at WHO Global Programme on AIDS before starting graduate school.
Ph.D. in Computer Science, University of Washington
Received a Ph.D. under Craig Chambers, working on compilers and whole-program optimization for object-oriented languages; dissertation: 'Whole-Program Optimization of Object-Oriented Languages'.
Joined Digital Equipment Corporation Western Research Lab
Started working at DEC/Compaq Western Research Laboratory (1996–1999) on profiling tools, microprocessor architecture, and information retrieval.
SOSP Best Paper Award (co-author)
Received a Best Paper Award at SOSP 1997 for the paper 'Continuous Profiling: Where Have All the Cycles Gone?'
Joined Google (mid-1999)
Left DEC/Compaq and joined Google in mid-1999 as an early engineer; began designing and implementing core infrastructure (crawling, indexing, ads, etc.).
Protocol Buffers design (approx.)
Co-designed Protocol Buffers, an efficient extensible data interchange format used extensively at Google (implementation later open-sourced).
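As a rough illustration of the workflow (a sketch assuming a hypothetical search_result.proto message compiled with protoc; not Google-internal code), a message defined in a .proto file is compiled into language bindings and then serialized to a compact binary wire format:

```python
# Hypothetical Protocol Buffers example. Assumes a file search_result.proto:
#
#   syntax = "proto3";
#   message SearchResult {
#     string url  = 1;
#     int32 score = 2;
#   }
#
# compiled with:  protoc --python_out=. search_result.proto
from search_result_pb2 import SearchResult  # generated module (assumed name)

msg = SearchResult()
msg.url = "https://example.com"
msg.score = 42

wire_bytes = msg.SerializeToString()   # compact binary wire format
decoded = SearchResult()
decoded.ParseFromString(wire_bytes)    # round-trips losslessly
assert decoded.url == msg.url and decoded.score == msg.score
```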
Manual answering of queries during outage (anecdote)
Anecdote/meme: according to the 'Jeff Dean facts' joke collection, when the index servers went down in early 2002, Jeff reportedly answered user queries by hand for two hours, and evals showed a quality improvement of 5 points.
MapReduce paper published (OSDI 2004)
Co-authored 'MapReduce: Simplified Data Processing on Large Clusters' (OSDI'04), describing the MapReduce system that was used widely within Google and inspired external projects such as Hadoop.
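The programming model from the paper boils down to a user-supplied map function that emits key/value pairs and a reduce function that merges all values for a key. A minimal single-process Python sketch of the canonical word-count example (the function and driver names are ours, not Google's MapReduce API):

```python
# Toy word count in the spirit of the MapReduce programming model;
# this runs in one process and is not Google's distributed implementation.
from collections import defaultdict

def map_fn(doc_name, contents):
    # Emit (word, 1) for every word in the document.
    for word in contents.split():
        yield word, 1

def reduce_fn(word, counts):
    # Merge all partial counts for one key.
    return word, sum(counts)

def run_mapreduce(inputs):
    shuffle = defaultdict(list)
    for name, text in inputs.items():
        for key, value in map_fn(name, text):
            shuffle[key].append(value)          # group intermediate values by key
    return dict(reduce_fn(k, v) for k, v in shuffle.items())

print(run_mapreduce({"doc1": "the cat sat", "doc2": "the cat ran"}))
# {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}
```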
PLDI 10-year Retrospective Award (paper contribution recognition)
Received a 10-year retrospective Most Influential Paper Award at PLDI 2005 for work on selective specialization for object-oriented languages.
Bigtable paper published (OSDI 2006)
Co-authored 'Bigtable: A Distributed Storage System for Structured Data' (OSDI'06), a foundational large-scale semi-structured storage system used across Google.
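The paper's data model is a sparse, sorted, multidimensional map from (row key, column, timestamp) to an uninterpreted byte string. A toy in-memory Python sketch of that shape, using the paper's Webtable example (put/get_latest are invented names; this is not a Bigtable client):

```python
# Toy illustration of the Bigtable data model: (row, column, timestamp) -> value.
import time
from collections import defaultdict

table = defaultdict(dict)  # (row_key, column) -> {timestamp: value}

def put(row, column, value, ts=None):
    table[(row, column)][ts or time.time_ns()] = value

def get_latest(row, column):
    cells = table[(row, column)]
    return cells[max(cells)] if cells else None   # most recent timestamp wins

# Rows are keyed by reversed URL, as in the paper's Webtable example.
put("com.cnn.www", "contents:", "<html>...</html>")
put("com.cnn.www", "anchor:cnnsi.com", "CNN")
print(get_latest("com.cnn.www", "anchor:cnnsi.com"))  # CNN
```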
ACM SIGOPS Mark Weiser Award
Awarded the Mark Weiser Award by ACM SIGOPS for contributions to systems research.
Elected to National Academy of Engineering
Elected to the NAE in recognition of work on large-scale distributed computer systems. Also named an ACM Fellow in 2009.
Work on Google Search infrastructure (ongoing milestone)
Co-led the design and implementation of multiple generations of Google's crawling, indexing, and query-serving systems, with continued improvements to ads and internal systems.
Co-developed early Google ML systems (DistBelief)
Worked on DistBelief, a proprietary distributed ML training system that was later refactored into TensorFlow.
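DistBelief popularized the parameter-server pattern (its asynchronous variant is called Downpour SGD in the paper): workers pull current parameters, compute gradients on their shard of data, and push updates back. A toy single-process sketch with a linear model standing in for a neural network; all names and constants are illustrative assumptions, not DistBelief code:

```python
# Single-process sketch of the parameter-server training pattern.
import numpy as np

class ParameterServer:
    def __init__(self, dim):
        self.w = np.zeros(dim)
    def pull(self):
        return self.w.copy()
    def push(self, grad, lr=0.1):
        self.w -= lr * grad              # apply a (possibly stale) gradient

def worker_step(server, x_batch, y_batch):
    w = server.pull()                     # fetch current parameters
    pred = x_batch @ w
    grad = x_batch.T @ (pred - y_batch) / len(y_batch)   # linear-model gradient
    server.push(grad)                     # send the update back

rng = np.random.default_rng(0)
X, true_w = rng.normal(size=(256, 3)), np.array([1.0, -2.0, 0.5])
y = X @ true_w
ps = ParameterServer(dim=3)
for step in range(200):
    idx = rng.integers(0, 256, size=32)   # each step acts like one async worker
    worker_step(ps, X[idx], y[idx])
print(np.round(ps.pull(), 2))             # approximately [ 1. -2.  0.5]
```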
Joined Google X to investigate deep neural networks
Moved to Google X to study deep neural networks; this work produced the unsupervised 'cat neuron' project and helped spawn Google Brain.
Co-founded Google Brain
Co-founded the Google Brain project, a team focused on large-scale deep neural networks and distributed ML systems; the project formed in 2011.
Started Hopper-Dean Foundation
Founded the Hopper-Dean Foundation with his wife, Heidi Hopper; its philanthropic giving in support of STEM diversity began in 2011.
ACM-Infosys Foundation Award (later renamed the ACM Prize in Computing)
Received the 2012 ACM-Infosys Foundation Award in the Computing Sciences jointly with Sanjay Ghemawat; the award was later renamed the ACM Prize in Computing.
Became leader of Google Brain
Assumed leadership of the Google Brain team in 2012 after the group formed.
Spanner paper (OSDI 2012) published
Co-authored 'Spanner: Google's Globally-Distributed Database' (OSDI 2012); Spanner later became available externally as Cloud Spanner.
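A key idea in the paper is TrueTime: clocks expose an uncertainty interval, and a transaction's commit timestamp is not acted on until that timestamp is guaranteed to be in the past ("commit wait"), which bounds clock skew across datacenters. A hedged toy sketch of just that mechanism (EPSILON_S and the function names are assumptions; the real system also involves Paxos, locking, and two-phase commit):

```python
# Toy sketch of the TrueTime commit-wait idea; not Spanner's API.
import time

EPSILON_S = 0.007   # assumed worst-case clock uncertainty (~7 ms)

def tt_now():
    """Return an interval [earliest, latest] guaranteed to contain true time."""
    t = time.time()
    return t - EPSILON_S, t + EPSILON_S

def commit(apply_writes):
    _, latest = tt_now()
    commit_ts = latest                    # choose a timestamp >= any clock so far
    while tt_now()[0] <= commit_ts:       # commit wait: until ts is definitely past
        time.sleep(EPSILON_S / 10)
    apply_writes(commit_ts)               # now safe to expose the commit
    return commit_ts

commit(lambda ts: print(f"committed at {ts:.6f}"))
```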
DistBelief used for large-scale training (approx.)
DistBelief, a distributed system for training deep neural nets (used in early Google Brain projects), trained models with ~2 billion non-embedding parameters in this era.
NeurIPS Test of Time Award (word2vec paper)
Co-authored the word2vec paper (originally published in 2013), which later received the NeurIPS 2023 Test of Time Award.
Cat neuron / large-scale unsupervised paper (Building high-level features...)
Published work on large-scale unsupervised learning from YouTube video frames that produced interpretable units (the 'cat neuron'); paper by Le et al., ICML 2012.
RankBrain / neural embeddings contributions (milestone)
Contributed infrastructure and research that helped create early embedding models and systems such as word2vec and RankBrain, used in Search and other products.
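Word embeddings of the word2vec family map each word to a dense vector so that words appearing in similar contexts end up nearby in vector space. A minimal sketch using the third-party gensim library (assumed installed); this is not the original Google implementation, and the toy corpus is far too small to learn meaningful vectors:

```python
# Training tiny skip-gram embeddings with gensim's Word2Vec.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["dogs", "and", "cats", "are", "animals"],
]

model = Word2Vec(sentences=corpus, vector_size=16, window=2,
                 min_count=1, sg=1, epochs=50)   # sg=1 selects skip-gram

print(model.wv["king"].shape)                 # (16,) dense embedding vector
print(model.wv.most_similar("king", topn=2))  # nearest neighbours in vector space
```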
LevelDB open-source release (approx.)
Co-designed (with Sanjay Ghemawat) and released LevelDB, a high-performance on-disk key-value store, open-sourced and widely used in projects such as Chrome.
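LevelDB stores keys in sorted order, which makes prefix and range scans cheap. A hedged example using the third-party plyvel Python binding (assumed installed; the path and key names are arbitrary):

```python
# Basic LevelDB usage via the plyvel binding: sorted keys enable range scans.
import plyvel

db = plyvel.DB("/tmp/example-leveldb", create_if_missing=True)

db.put(b"user:001", b"alice")
db.put(b"user:002", b"bob")
db.put(b"zzz:misc", b"other")

# Range scan over the sorted key space: everything with the "user:" prefix.
for key, value in db.iterator(start=b"user:", stop=b"user:\xff"):
    print(key, value)        # user:001 alice, then user:002 bob

db.close()
```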
TensorFlow open-sourced (2015)
A primary designer and implementer of the initial TensorFlow system; advocated for open-sourcing TensorFlow and helped release it in 2015, after which it became widely used worldwide.
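The 2015 release exposed a graph-and-session API; the minimal example below instead uses the current TensorFlow 2.x eager API to show the core facility TensorFlow provides, automatic differentiation of tensor computations:

```python
# Minimal TensorFlow 2.x example: compute a value and its gradient.
import tensorflow as tf

w = tf.Variable([[1.0], [2.0]])          # trainable parameters, shape (2, 1)
x = tf.constant([[3.0, 4.0]])            # input, shape (1, 2)

with tf.GradientTape() as tape:
    y = tf.reduce_sum(tf.matmul(x, w))   # y = 3*1 + 4*2 = 11
grad = tape.gradient(y, w)               # dy/dw = x^T = [[3.], [4.]]
print(y.numpy(), grad.numpy())
```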
Elected Fellow of the American Academy of Arts and Sciences
Elected to the American Academy of Arts and Sciences (2016).
SIGOPS Hall of Fame Awards (for Bigtable and MapReduce papers)
The Bigtable and MapReduce papers were inducted into the ACM SIGOPS Hall of Fame (MapReduce in 2015, Bigtable in 2016), as noted on his research profile.
TensorFlow Systems paper / OSDI 2016
Co-authored 'TensorFlow: A System for Large-Scale Machine Learning' (OSDI 2016), the systems paper describing TensorFlow's design for large-scale ML.
Hopper-Dean Foundation large donations
The Hopper-Dean Foundation (started by Dean and his wife) donated $2M each to UC Berkeley, MIT, the University of Washington, Stanford, and Carnegie Mellon in 2016 to support STEM diversity (roughly $10M total).
Featured in The New Yorker profile (2018)
James Somers's profile 'The Friendship That Made Google Huge' (The New Yorker, 2018) covered Jeff Dean, Sanjay Ghemawat, and their collaboration.
Appointed head of Google's AI efforts (head of Google AI)
In April 2018, Jeff Dean was appointed head of Google's AI division following John Giannandrea's departure to Apple, taking broader responsibility for Google Brain and the research teams.
UW Allen School Distinguished Lecture (Oct 2019)
Delivered a major invited lecture on 'Deep Learning to Solve Challenging Problems' (UW Allen School Distinguished Lecture, Oct 2019).
Keynotes and major talks (2019)
Gave keynotes at the Stanford Medicine Big Data | Precision Health conference, Khipu 2019, and other major conferences (AI in healthcare, deep learning topics).
Machine Learning for Medicine & Healthcare contributions
Led or supported multiple projects, and co-authored publications, applying ML to healthcare and medical diagnostics (various papers, 2019–2022).
IEEE John von Neumann Medal
Recipient of the IEEE John von Neumann Medal in recognition of contributions to computing and AI (2021).
TED Talk: 'AI isn't as smart as you think — but it could be'
Gave a TED talk (2021) discussing AI capabilities, limitations, and a roadmap toward responsible, useful AI.
Research on carbon footprint of ML
Co-authored papers and presentations on the carbon footprint of ML training and best practices for reducing CO2e, advocating efficient models, hardware (TPUs), and datacenters (2022).
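The accounting in this line of work is essentially a product of a few measurable factors: accelerator-hours, average power draw, datacenter PUE, and grid carbon intensity. A worked example with made-up placeholder numbers (none of these figures come from the papers):

```python
# Back-of-envelope CO2e estimate; every number is an assumed placeholder.
chip_hours         = 10_000   # accelerator-hours for a hypothetical training run
chip_power_kw      = 0.3      # assumed average power per accelerator (kW)
pue                = 1.1      # assumed datacenter power usage effectiveness
grid_kgco2_per_kwh = 0.2      # assumed grid carbon intensity (kgCO2e per kWh)

energy_kwh  = chip_hours * chip_power_kw * pue
co2e_tonnes = energy_kwh * grid_kgco2_per_kwh / 1000.0

print(f"{energy_kwh:.0f} kWh -> {co2e_tonnes:.2f} tCO2e")
# 3300 kWh -> 0.66 tCO2e; improving any factor shrinks the product.
```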
Pathways systems paper (MLSys 2022) and PaLM (2022)
Co-authored 'Pathways: Asynchronous Distributed Dataflow for ML' (MLSys 2022) and the PaLM language model paper (arXiv, 2022), which scaled training using Pathways and TPUs.
Research paper awards and hall-of-fame recognitions
Co-authored papers that received awards, including an MLSys Outstanding Paper Award (Pathways) and ACM SIGOPS Hall of Fame awards for MapReduce (2015), Bigtable (2016), and Spanner (2022).
Continued leadership of Google Research (~3,500 people)
Leads Google Research / Google DeepMind efforts; research.google notes leadership of a roughly 3,500-person organization dedicated to computer science and AI research.
Bigtable scale milestone (as of 2023)
Google Bigtable serves more than 6 billion requests per second at peak and has more than 10 exabytes of data under management (as reported in 2023).
DeepMind merged with Google Brain; named Google's Chief Scientist
Alphabet reorganized AI groups by merging DeepMind and Google Brain into Google DeepMind; Jeff Dean was appointed Google's Chief Scientist (2023).
NeurIPS workshop invited keynote (Dec 2024)
Invited keynote at NeurIPS 2024 Workshop on ML for Systems discussing exciting directions in machine learning for computer systems.
NeurIPS 2024 and other invited talks
Additional invited talks and workshop leadership at NeurIPS 2024 and many other conferences; continued public-facing leadership on AI.
2025 ACM SIGMOD Systems Award (co-recipient for Spanner)
Co-recipient of the 2025 ACM SIGMOD Systems Award for Spanner, as listed on his research.google profile.
ETH Zürich Distinguished Lecture (April 2025)
Delivered an ETH Zürich distinguished lecture on AI trends and shaping AI's future (April 2025).
University of Minnesota Commencement Address (May 2025)
Delivered the University of Minnesota College of Science and Engineering commencement address (May 2025).
Similar Trajectories
Daniel Shiffman
Born 1973 · Age 52
Computer programmer, educator, author and creator of The Coding Train; Associate Arts Professor at NYU ITP and board member / co-founder of the Processing Foundation. Author of Learning Processing and The Nature of Code; creator of tutorials and libraries for Processing and p5.js (including ml5.js).
Larry Page
Born 1973 · Age 52
Co-founder of Google and Alphabet Inc.; computer scientist and entrepreneur; co-creator of PageRank.
Stewart Butterfield
Born 1973 · Age 52
Canadian entrepreneur; co-founder of Flickr and Slack; former CEO of Slack; known for building user-focused collaboration and photo-sharing products.
John Gruber
Born 1973 · Age 52
American technology blogger, UI designer, co-creator of Markdown, author of Daring Fireball and host of The Talk Show podcast.
Dharmesh Shah
Born 1973 · Age 52
Dharmesh Shah is a technology entrepreneur, co-founder and Chief Technology Officer (CTO) of HubSpot, founder of OnStartups.com, and founder/CEO of Pyramid Digital Solutions (acquired by SunGard). Known for popularizing inbound marketing, authoring the HubSpot Culture Code, angel investing, and speaking widely in the SaaS/startup community.