Jakub Konečný

Research Scientist, Google

I care about making Federated Learning happen. Some of that includes research.

I am currently based in Beijing.

Previously, I completed my PhD at the University of Edinburgh under the supervision of Peter Richtárik, supported by a Google PhD Fellowship in Optimization Algorithms.

Social Profiles

FL NeurIPS workshop Dec 1, 2019

We are organizing an FL-focused workshop at NeurIPS this year. You can certainly find me there on Dec 13.

We are hosting 8 invited speakers from industry and academia, and have accepted 33 contributed papers for poster presentation, 12 of which will also be presented as 10-minute talks.


SysML Conference Mar 31, 2019

I am attending the SysML conference at Stanford. It seems to be a very promising new conference - see the SysML whitepaper. We are presenting Towards Federated Learning at Scale: System Design.


Federated Learning updates Mar 6, 2019

I'm very excited to announce that we have launched TensorFlow Federated, an open source framework for expressing and simulating federated learning algorithms and other computations on decentralized data. The primary goal of this project is to make it easier to do research in the federated learning setting.

Recently, we or our colleagues have also released a number of papers related to Federated Learning:

Towards Federated Learning at Scale: System Design, a description of the production system design.

Expanding the Reach of Federated Learning by Reducing Client Resource Requirements, decreasing computational and communication requirements in Federated Learning.

LEAF: A Benchmark for Federated Settings, with our colleagues from CMU.

Federated Learning for Mobile Keyboard Prediction and Applied Federated Learning: Improving Google Keyboard Query Suggestions, two accounts of using Federated Learning in practice, mainly by colleagues from the Gboard team.


UW visit Jan 30, 2018

I gave a talk at the University of Washington, Seattle, about our team's work on Federated Learning. My slides from the talk are available.


NIPS 2017 Dec 3, 2017

I am attending NIPS 2017; let me know if you want to meet.


Google Jul 6, 2017

As my next step, I am moving to Google Seattle in the role of Research Scientist at the end of August. I will work mainly on Federated Learning, to which I contributed during my two summer internships and, since January 2017, as a Visiting Researcher.


PhD complete Jul 6, 2017

I successfully defended my PhD thesis several days ago. If interested, you can find my thesis here.

I owe my accomplishments to the many amazing researchers I had the opportunity to meet and work with during the last four years. Thank you! I plan to stay actively involved in research, so I hope to meet you at conferences again!

Since the last update, I visited Peter at KAUST, gave a talk at SIAM Conference on Optimization in Vancouver, and several of our released papers have been accepted for publication in journals.


Late NIPS update Dec 17, 2016

Wow, NIPS this year was busier than ever. Thanks everybody for fruitful discussions and setting up potential collaborations!


New paper out Nov 25, 2016

We have released a new paper, Randomized Distributed Mean Estimation: Accuracy vs Communication, joint with Peter Richtárik. We consider the problem of distributed computation of the arithmetic average of vectors, under constraints on communication. We propose a flexible family of randomized algorithms exploring the trade-off between expected communication cost and estimation error.
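The flavor of this trade-off can be illustrated with a toy scheme in that family's spirit: stochastic binary quantization, which produces an unbiased estimate while communicating roughly one bit per coordinate plus two floats. This is a sketch for illustration only, not the exact algorithms analysed in the paper:

```python
import numpy as np

def quantize_unbiased(x, rng):
    """Stochastic binary quantization: round each coordinate to the
    vector's min or max, with probabilities chosen so the result is an
    unbiased estimate of x. Communicating it needs only two floats
    (min, max) plus one bit per coordinate."""
    lo, hi = x.min(), x.max()
    if hi == lo:                         # constant vector: nothing to quantize
        return x.copy()
    p = (x - lo) / (hi - lo)             # probability of rounding up
    bits = rng.random(x.shape) < p
    return np.where(bits, hi, lo)

def distributed_mean(vectors, rng):
    """Server-side estimate: average the machines' quantized vectors.
    Unbiasedness of each message makes the average unbiased too."""
    return np.mean([quantize_unbiased(v, rng) for v in vectors], axis=0)

rng = np.random.default_rng(0)
vectors = [rng.normal(size=10) for _ in range(100)]  # one vector per machine
est = distributed_mean(vectors, rng)
true_mean = np.mean(vectors, axis=0)
```

Averaging over many machines shrinks the per-message quantization noise, which is exactly the accuracy-vs-communication trade-off the paper studies in a far more general form.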


Federated Learning Oct 19, 2016

We have recently released two new papers in collaboration with Google, focusing on Federated Learning - a setting where we need to learn from massively decentralized data, such as when the training data reside on phones of users that generated them.

The first work, Federated Optimization: Distributed Machine Learning for On-Device Intelligence, describes the setting in detail and proposes algorithms with a particular focus on the convex setting. In Federated Learning: Strategies for Improving Communication Efficiency, we address the communication efficiency of systems for Federated Learning. In the best case, we were able to train a deep model while communicating fewer bits over the network than the size of the original data itself.
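One of the ideas in the communication-efficiency paper is "structured updates" via a random mask, where each client trains and uploads only a sparse subset of coordinates. The sketch below is my own single-parameter-vector toy illustration of that idea; the quadratic loss, client count and learning rates are invented for the example and are not from the paper:

```python
import numpy as np

def sparse_client_update(global_w, grad_fn, mask_frac, lr, steps, rng):
    """Toy 'random mask' structured update: the client trains only a
    random subset of coordinates and uploads just those entries
    (values + indices), cutting upload cost to ~mask_frac of full size."""
    mask = rng.random(global_w.shape) < mask_frac
    w = global_w.copy()
    for _ in range(steps):
        g = grad_fn(w)
        w[mask] -= lr * g[mask]          # touch masked coordinates only
    delta = np.zeros_like(w)
    delta[mask] = w[mask] - global_w[mask]
    return delta                          # sparse update to send upstream

# Server loop: average sparse deltas from 10 clients on a toy quadratic.
rng = np.random.default_rng(0)
target = np.ones(20)
grad_fn = lambda w: w - target            # gradient of 0.5*||w - target||^2
w = np.zeros(20)
for _ in range(50):
    deltas = [sparse_client_update(w, grad_fn, 0.5, 0.5, 5, rng)
              for _ in range(10)]
    w += np.mean(deltas, axis=0)
```

Because different clients mask different coordinates, the averaged update still covers the full model, while each upload is roughly half the size.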


Learning, Privacy, Mobile Data Sep 13, 2016

I am attending the Learning, Privacy, and Mobile Data Workshop organized by Google in Seattle. Hosted by the team I interned with, the workshop aims to better understand the interactions between the fields of Machine Learning, Optimization, Privacy and Cryptography, to facilitate the move to mobile computing devices and fundamentally change the way machine learning is deployed.


IMA Birmingham Sep 7, 2016

Together with colleagues from our research group, we organized two minisymposia at the 5th IMA Conference on Numerical Linear Algebra and Optimization in Birmingham.

I spoke about our recent work, AIDE: Fast and Communication Efficient Distributed Optimization.


New paper out [AIDE] Aug 29, 2016

In collaboration with Sashank Reddi, Barnabás Póczos and Alex Smola from Carnegie Mellon University, and Peter Richtárik, we propose and analyse a new algorithm, AIDE: Fast and Communication Efficient Distributed Optimization - in short, an accelerated inexact DANE algorithm.

Earlier this summer, I presented a poster on AIDE at the International Conference on Machine Learning 2016 in New York.


PhD Fellowship Summit Aug 23, 2016

I am speaking at the Google PhD Fellowship Summit in Mountain View about my experience of merging research efforts in academia and at Google. I also met many passionate young researchers who enjoy the same support from Google during their studies - a very motivating experience.


Google Summer Internship May 16, 2016

Until the end of August, I am at Google Seattle for a summer internship, working mainly with Brendan McMahan. I will be working on some of the algorithmic challenges in Federated Optimization/Learning --- decentralized and highly unbalanced datasets.


Future of Humanity Institute Mar 14, 2016

As of today, I am visiting the Future of Humanity Institute in Oxford, trying to better understand where the ideas regarding AI safety stand at the moment and how I might be able to contribute. I highly recommend reading Superintelligence, if you haven't yet.


Optimization Without Borders Feb 7, 2016

I am excited to be attending Optimization Without Borders in beautiful Les Houches under Mont Blanc. The event is dedicated to the 60th birthday of Yuri Nesterov and his lifelong contributions to the field.


Late updates Jan 11, 2016

Thanks to everybody at NIPS for the amazing atmosphere, inspiring discussions and many new ideas. I presented Stop Wasting My Gradients: Practical SVRG, which I co-authored with Reza Babanezhad, Mohamed Osama Ahmed, Alim Virani, Mark Schmidt and Scott Sallinen from the University of British Columbia. I also presented a poster on Federated Optimization: Distributed Optimization Beyond the Datacenter at the Optimization in Machine Learning Workshop.

Before that (Nov 25-27), I spoke at one of the scoping workshops of The Alan Turing Institute, which is currently being set up, on the topic of Distributed Machine Learning and Optimization.

Even before that (Nov 23), I briefly visited the Future of Humanity Institute in Oxford and discussed some of the long-term dangers arising from technological development. For those of you who haven't noticed it yet, I highly recommend Nick Bostrom's book Superintelligence. The book is free of bold, unjustified predictions; rather, it summarizes the open problems and discusses current ideas.


Federated Optimization Nov 14, 2015

New preliminary paper out! Federated Optimization: Distributed Optimization Beyond the Datacenter, based on my summer work at Google with Brendan McMahan. We introduce an important new setting in which data are distributed across a very large number of computers, each having access to only a few data points. This is primarily motivated by the setting where users keep their data on their own devices, but the goal is still to train a high-quality global model.


Practical SVRG Nov 10, 2015

New paper out! Stop Wasting My Gradients: Practical SVRG, which I co-authored with Mark Schmidt and his group, has been accepted to the NIPS conference. I am looking forward to meeting you all there!


INFORMS Annual Meeting Nov 1, 2015

Not many people showed up in Halloween costumes at the INFORMS Annual Meeting in Philadelphia. Never mind; I am feeling overwhelmed by the 80 parallel sessions going on. I am giving a talk on Federated Optimization on Monday. Slides are here; the paper is coming soon and is available upon request in the meantime.


Alma Mater Oct 21, 2015

I gave a talk at a seminar at Comenius University in Bratislava, speaking about S2GD and the work on Federated Optimization I did during the summer at Google, the next step in large-scale distributed optimization. Preliminary paper coming out soon...


ISMP 2015 Jul 17, 2015

I am giving a talk today at the ISMP conference in Pittsburgh. So far a very inspirational event, in a surprisingly pleasant city.


Google Summer Internship May 26, 2015

Until the end of August, I am at Google for a summer internship, working mainly with Brendan McMahan. I hope there's lots of cool stuff to learn!


Optimization & Big Data 2015 May 6, 2015

At home in Edinburgh, we are organizing a three-day workshop - Optimization & Big Data 2015 - focused on large-scale optimization, with keynote speaker Arkadi Nemirovski. I am giving an invited talk on Thursday; my slides are available here.


Full paper - mS2GD Apr 20, 2015

Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting, joint work with Jie Liu, Martin Takáč and Peter Richtárik. This is the full-length version of the short paper which was presented at the NIPS Optimization workshop.


Edinburgh SIAM Student
Chapter Conference Apr 13, 2015

The Edinburgh SIAM Student Chapter, on whose committee I serve, organized its Annual Conference, with speakers including Yves Wiaux (Heriot-Watt University), Marta Faias (New University of Lisbon), Michel Destrade (NUI Galway), Klaus Mosegaard (University of Copenhagen), Samuel Cohen (Oxford) and Julien Dambrine (University of Poitiers).

The aims of the Chapter are to promote interdisciplinary and multi-institutional collaboration, increase student engagement, and raise awareness of the application of mathematical techniques to real-world problems in science and industry.


SUTD Singapore Feb 27, 2015

Until Mar 13, I am at the Singapore University of Technology and Design, visiting Ngai-Man Cheung and Selin Damla Ahipasaoglu.


MLSS 2015 Sydney Feb 16, 2015

As of today, I am attending the Machine Learning Summer School in Sydney. The program and the people both seem very interesting.


BASP Jan 26, 2015

I am attending a very interesting multidisciplinary conference, the International Biomedical and Astronomical Signal Processing (BASP) Frontiers workshop, in beautiful Villars-sur-Ollon. The main areas are Cosmology, Medical Imaging and Signal Processing. I am giving a talk on Wednesday (slides) about Semi-Stochastic Gradient Descent.

UPDATE: I received the Best Contribution Award in the area of Signal Processing. Overall, the conference had an amazing atmosphere, and I managed to talk to many interesting people outside of my field. Highly recommended to anyone!


Late updates Jan 7, 2015

The year finished with a very interesting conference, Foundations of Computational Mathematics in Montevideo. I presented a poster on Mini-batch semi-stochastic gradient descent in the proximal setting (see the poster here).

We also managed to release Semi-Stochastic Coordinate Descent, joint with Zheng Qu and Peter Richtárik. This is the full-length version of this brief paper, presented at the 2014 NIPS workshop on Optimization in Machine Learning.


NIPS 2014 Dec 8, 2014

I am attending NIPS (Neural Information Processing Systems) in Montreal. Obviously, it's really cold here.

I am presenting two posters at the Optimization for Machine Learning Workshop. The posters are on Mini-batch semi-stochastic gradient descent in the proximal setting (see poster here) and Semi-Stochastic Coordinate Descent (see poster here).


Research visit
Lehigh University Nov 17, 2014

For the following three weeks, I am visiting Martin Takáč at Lehigh University. The optimization group here is strong, so I look forward to a hopefully fruitful time. For now, I can say it's surprisingly cold here these days.

And here are some refined slides.


Research visit - ETH Zurich Nov 3, 2014

This week, I am on a research visit at the Data Analytics Lab led by Thomas Hofmann at ETH Zurich. First impression is amazing, Zurich seems like a great place to live and work.

Slides from my talk are here .


An interesting article Oct 23, 2014

Here is an interesting interview with Michael Jordan about the big thing about Big Data and more...


Two short papers out Oct 19, 2014

This week we released two short papers. The first one, S2CD: Semi-stochastic coordinate descent, coauthored with Zheng Qu and Peter Richtárik, extends the S2GD algorithm to the coordinate setting.

The other one, mS2GD: Mini-batch semi-stochastic gradient descent in the proximal setting, joint with Jie Liu, Martin Takáč and Peter Richtárik, discusses an extension of the S2GD algorithm to the mini-batch setting, which admits a twofold speedup: one from better gradient estimates, and another from simple parallelism.
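For readers unfamiliar with the method, here is a minimal Python sketch of a mini-batch semi-stochastic update: an outer loop that takes a full-gradient snapshot, and inner steps that use a mini-batch to form a variance-reduced gradient estimate. The least-squares problem and all hyperparameters below are illustrative choices of mine, not taken from the paper:

```python
import numpy as np

def ms2gd(grad_full, grad_batch, w0, n, batch, inner, epochs, lr, rng):
    """Sketch of mini-batch S2GD (variance-reduced SGD). Each outer
    epoch computes one full gradient mu at the snapshot w; inner steps
    use a mini-batch B to form the low-variance estimate
    g = grad_B(y) - grad_B(w) + mu."""
    w = w0.copy()
    for _ in range(epochs):
        mu = grad_full(w)                 # expensive full pass, once per epoch
        y = w.copy()
        for _ in range(inner):
            B = rng.choice(n, size=batch, replace=False)
            g = grad_batch(y, B) - grad_batch(w, B) + mu
            y -= lr * g
        w = y                             # new snapshot
    return w

# Toy least squares: f(w) = (1/2n) * ||A w - b||^2
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
b = A @ w_star
grad_full = lambda w: A.T @ (A @ w - b) / n
grad_batch = lambda w, B: A[B].T @ (A[B] @ w - b[B]) / len(B)
w = ms2gd(grad_full, grad_batch, np.zeros(d), n, batch=10,
          inner=100, epochs=15, lr=0.05, rng=rng)
```

The mini-batch enters in two places, which is the source of the twofold speedup mentioned above: the estimate `g` has lower variance than a single-sample one, and the per-batch gradients parallelize trivially.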


New paper out Sep 30, 2014

Simple complexity analysis of direct search, joint with Peter Richtárik.



MMEI 2014 (in a castle!) Sep 9, 2014

Staying in the beautiful Smolenice Castle in Slovakia, attending the International Conference on Mathematical Methods in Economy and Industry, a conference whose tradition dates back to 1973.


4th IMA NLAO Sep 3, 2014

This week I am attending the 4th IMA Conference on Numerical Linear Algebra and Optimisation. I co-organized a minisymposium on First order methods and big data optimization (with Zheng Qu and Peter Richtárik), in which I am also giving a talk.


S2GD code available Jul 9, 2014

An efficient implementation of Semi-Stochastic Gradient Descent for logistic regression is available in the MLOSS repository. The code works with MATLAB and is implemented in C++.


SIAM Annual Meeting Jul 8, 2014

On Monday I gave an invited talk at the SIAM Annual Meeting in Chicago on the S2GD algorithm (slides). Within the same minisymposium, a very interesting talk was delivered by S. V. N. Vishwanathan on NOMAD (slides).

As a representative of Edinburgh SIAM Student Chapter, I attended a breakfast with other representatives and SIAM executives and got some ideas on how to improve our chapter.

From my observations, the whole conference feels much more like a networking event for anyone involved with SIAM activities than other conferences do.


Google PhD Fellowship Jun 18, 2014

I was awarded the 2014 Google Europe Doctoral Fellowship in Optimization Algorithms! The news was announced today in the Google Research Blog.

The University of Edinburgh was particularly successful this year, bagging two awards (each university can nominate up to two students). Of the 15 Europe Fellowships, 4 were awarded to universities in the UK, with the 2 others going to Cambridge. The rest went to students in Switzerland (4), Germany (3), Israel (2), Austria (1) and Poland (1).

This is what Google says about these Fellowships:
Nurturing and maintaining strong relations with the academic community is a top priority at Google. Today, we're announcing the 2014 Google PhD Fellowship recipients. These students, recognized for their incredible creativity, knowledge and skills, represent some of the most outstanding graduate researchers in computer science across the globe. We're excited to support them, and we extend our warmest congratulations.

I would like to thank everyone for their support!


London Optimization W. Jun 9, 2014

This week, King's College is hosting the London Optimization Workshop. The first day saw, among others, a very interesting talk by Panos Parpas on A Multilevel Proximal Algorithm for Large Scale Composite Convex Optimization.


SIAM OP14 May 18, 2014

I am off to San Diego to attend SIAM Conference on Optimization. I am giving a talk about S2GD (slides) . The weather is obviously good :)


Funding awarded Apr 25, 2014

Thanks to support from SIAM and School Research & Development Fund, I am going to attend 2014 SIAM Annual Meeting in Chicago, where I will represent Edinburgh SIAM Student Chapter.


Logarithm anniversary Apr 2, 2014

It has been 400 years since the Scot John Napier published his Mirifici Logarithmorum Canonis Descriptio, the work for which he is recognised as the inventor of logarithms. On the occasion of this rare anniversary, I attended a workshop that follows on from the last celebration, held in 1914!


SIAM Student Conference Mar 14, 2014

I helped organise today's Edinburgh SIAM Student Chapter Conference 2014. Except for minor technical difficulties, all went well, and graduate students had the opportunity to look into different areas of research in applied mathematics.


Meet me Jan 25, 2014

In February, you can meet me in Cambridge around Feb 7, attending the LMS meeting on Sparse Regularisation for Inverse Problems, or in London Feb 17-19 at Big Data: Challenges and Applications.


Welcome... Jan 23, 2014

...to my new personal web page. I hope it's red enough.

Recent abstracts

Improving Federated Learning Personalization via Model Agnostic Meta Learning

Federated Learning (FL) refers to learning a high quality global model based on decentralized data storage, without ever copying the raw data. A natural scenario arises with data created on mobile phones by the activity of their users. Given the typical data heterogeneity in such situations, it is natural to ask how the global model can be personalized for every such device individually. In this work, we point out that the setting of Model Agnostic Meta Learning (MAML), where one optimizes for a fast, gradient-based, few-shot adaptation to a heterogeneous distribution of tasks, has a number of similarities with the objective of personalization for FL. We present FL as a natural source of practical applications for MAML algorithms, and make the following observations. 1) The popular FL algorithm, Federated Averaging, can be interpreted as a meta learning algorithm. 2) Careful fine-tuning can yield a global model with higher accuracy, which is at the same time easier to personalize. However, solely optimizing for the global model accuracy yields a weaker personalization result. 3) A model trained using a standard datacenter optimization method is much harder to personalize, compared to one trained using Federated Averaging, supporting the first claim. These results raise new questions for FL, MAML, and broader ML research.
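Observation 1) can be made concrete in a few lines: with a server learning rate of 1, the Federated Averaging server step, averaging the clients' post-adaptation deltas, has the same form as the Reptile meta-update. The quadratic "clients" below are a hypothetical toy of mine, chosen only to make that structural connection visible:

```python
import numpy as np

def local_sgd(w, grad_fn, steps, lr):
    """A client's few-shot adaptation: a handful of SGD steps from w."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * grad_fn(w)
    return w

def federated_averaging_round(w, client_grads, steps, lr, server_lr=1.0):
    """One FedAvg round. The server update w + server_lr * mean(w_k - w)
    has the same form as the Reptile meta-update, which is one sense in
    which FedAvg can be read as a meta-learning algorithm."""
    deltas = [local_sgd(w, g, steps, lr) - w for g in client_grads]
    return w + server_lr * np.mean(deltas, axis=0)

# Heterogeneous quadratic 'tasks': each client's optimum differs.
rng = np.random.default_rng(0)
optima = [rng.normal(size=3) for _ in range(20)]
client_grads = [lambda w, o=o: w - o for o in optima]  # grad of 0.5*||w-o||^2
w = np.zeros(3)
for _ in range(100):
    w = federated_averaging_round(w, client_grads, steps=5, lr=0.1)
```

On this toy problem the global model converges to the average of the clients' optima, a point from which each client's few-step `local_sgd` adaptation is short, mirroring the personalization view in the abstract.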

Towards Federated Learning at Scale: System Design

Federated Learning is a distributed machine learning approach which enables model training on a large corpus of decentralized data. We have built a scalable production system for Federated Learning in the domain of mobile devices, based on TensorFlow. In this paper, we describe the resulting high-level design, sketch some of the challenges and their solutions, and touch upon the open problems and future directions.

Expanding the Reach of Federated Learning by Reducing Client Resource Requirements

Communication on heterogeneous edge networks is a fundamental bottleneck in Federated Learning (FL), restricting both model capacity and user participation. To address this issue, we introduce two novel strategies to reduce communication costs: (1) the use of lossy compression on the global model sent server-to-client; and (2) Federated Dropout, which allows users to efficiently train locally on smaller subsets of the global model and also provides a reduction in both client-to-server communication and local computation. We empirically show that these strategies, combined with existing compression approaches for client-to-server communication, collectively provide up to a 14× reduction in server-to-client communication, a 1.7× reduction in local computation, and a 28× reduction in upload communication, all without degrading the quality of the final model. We thus comprehensively reduce FL's impact on client device resources, allowing higher capacity models to be trained, and a more diverse set of users to be reached.
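To make the Federated Dropout idea concrete, here is a toy sketch for a single dense layer: the server samples a random subset of the layer's units, ships only the corresponding slice of the weight matrix, and later writes the client's trained sub-matrix back. The helper names and the single-layer setting are my own simplification, not the paper's implementation:

```python
import numpy as np

def sample_submodel(W, keep_frac, rng):
    """Federated Dropout, toy version: keep a random subset of this
    layer's output units and send the client only those rows of W,
    so both download size and local compute shrink with keep_frac."""
    k = max(1, int(keep_frac * W.shape[0]))
    idx = rng.choice(W.shape[0], size=k, replace=False)
    return W[idx], idx                    # smaller matrix + index map

def merge_update(W, sub_W_new, idx):
    """Server side: write the trained sub-matrix back into the full
    model; rows outside idx are untouched this round."""
    W = W.copy()
    W[idx] = sub_W_new
    return W

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))               # toy layer: 8 output units
sub_W, idx = sample_submodel(W, 0.25, rng)   # client receives 2 of 8 rows
W = merge_update(W, sub_W - 0.01, idx)       # pretend the client trained it
```

Different clients receive different random sub-models, so over many rounds every unit still gets trained, while each individual exchange is smaller in both directions.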


Achievements & Awards

  • Google Europe Doctoral Fellowship 2014-2017


  • Best Contribution Award 2015
    Biological and Astronomical Signal Processing workshop

    in the area of Signal Processing

  • Principal’s Career Development Scholarship 2013
    University of Edinburgh

Highly competitive PhD scholarship awarded to several students across the University

  • ChaLearn Gesture Challenge, 2nd place 2012

The purpose of the competition was to develop a one-shot-learning gesture recognizer for the Microsoft Kinect. We were awarded a travel grant to present our solution at the International Conference on Pattern Recognition 2012, and were invited to submit a paper to the Journal of Machine Learning Research; the paper has been accepted for publication.

  • International Mathematical Olympiad 2010
    Honourable mention

    Astana, Kazakhstan

  • Olympiad in Informatics, national round 2010
    11th place
  • Middle European Mathematical Olympiad 2008
    Honourable mention

    Olomouc, Czech Republic

Download My Resume

Fields of Academic Interest

  • Machine Learning 95%
  • Differential Privacy 75%
  • Optimization 70%
  • Rand. Numerical Algebra 45%
  • Something new 70%

Programming Skills

  • Python 90%
  • TensorFlow 90%
  • MATLAB 55%
  • LaTeX 70%


  • Painting 2%
  • Singing 0%
  • Playing guitar 17%


Contact Info

  • 551 North 34th Street
    Seattle, WA, US - 98103
  • konkey@google.com
  • prachetit

Keep In Touch

*By using this form, you risk that your message will not be delivered. Why not use normal e-mail instead?

