
Jakub Konečný

Researcher

I am a PhD student at the University of Edinburgh under the supervision of Peter Richtárik.

As of June 2014, I am a recipient of the Google European Doctoral Fellowship in Optimization Algorithms.

I am developing large-scale optimization algorithms for machine learning.

Social Profiles

Google Summer Internship May 16, 2016

Until the end of August, I am at Google Seattle for a summer internship, working mainly with Brendan McMahan. I will be working on some of the algorithmic challenges in Federated Optimization/Learning: decentralized and highly unbalanced datasets.

 

Future of Humanity Institute Mar 14, 2016

As of today, I am visiting the Future of Humanity Institute in Oxford, trying to better understand where the ideas regarding AI safety stand at the moment and how I might be able to contribute. I highly recommend reading Superintelligence, if you haven't yet.

 

Optimization Without Borders Feb 7, 2016

I am excited to be attending the Optimization Without Borders workshop in beautiful Les Houches, below Mont Blanc. The event is dedicated to the 60th birthday of Yuri Nesterov and his lifelong contributions to the field.

 

Late updates Jan 11, 2016

Thanks to everybody at NIPS for the amazing atmosphere, inspiring discussions and many new ideas. I presented Stop Wasting My Gradients: Practical SVRG, which I co-authored with Reza Babanezhad, Mohamed Osama Ahmed, Alim Virani, Mark Schmidt and Scott Sallinen from the University of British Columbia. I also presented a poster on Federated Optimization: Distributed Optimization Beyond the Datacenter at the Optimization for Machine Learning workshop.

Before that (Nov 25-27), I spoke at one of the scoping workshops of The Alan Turing Institute, which is currently being set up, on the topic of Distributed Machine Learning and Optimization.

Even before that (Nov 23), I briefly visited the Future of Humanity Institute in Oxford and discussed some of the long-term dangers arising from technological development. For those of you who haven't noticed it yet, I highly recommend Nick Bostrom's book Superintelligence. The book is free of bold, unjustified predictions, and instead summarizes the open problems and discusses the current ideas.

 

Federated Optimization Nov 14, 2015

New preliminary paper out! Federated Optimization: Distributed Optimization Beyond the Datacenter, which is based on my summer work at Google with Brendan McMahan. We introduce a new and important setting, in which data are distributed across a very large number of computers, each having access to only a few data points. This is primarily motivated by the setting in which users keep their data on their own devices, but the goal is still to train a high-quality global model.

 

Practical SVRG Nov 10, 2015

New paper out! Stop Wasting My Gradients: Practical SVRG, which I co-authored with Mark Schmidt and his group, has been accepted to the NIPS conference. I am looking forward to meeting you all there!

 

INFORMS Annual Meeting Nov 1, 2015

Not many people showed up in Halloween costumes at the INFORMS Annual Meeting in Philadelphia. Never mind; I am feeling overwhelmed by the 80 parallel sessions going on. I am giving a talk on Federated Optimization on Monday. Slides are here; the paper will be available soon (currently available upon request).

 

Alma Mater Oct 21, 2015

I gave a talk at a seminar at Comenius University in Bratislava, speaking about S2GD and the work on Federated Optimization I did during the summer at Google, the next step in large-scale distributed optimization. A preliminary paper is coming out soon...

 

ISMP 2015 Jul 17, 2015

I am giving a talk today at the ISMP conference in Pittsburgh. So far it has been a very inspiring event, in a surprisingly pleasant city.

 

Google Summer Internship May 26, 2015

Until the end of August, I am at Google for a summer internship, working mainly with Brendan McMahan. I hope there's lots of cool stuff to learn!

 

Optimization & Big Data 2015 May 6, 2015

At home in Edinburgh, we are organizing a three-day workshop - Optimization & Big Data 2015 - focused on large-scale optimization, with keynote speaker Arkadi Nemirovski. I am giving an invited talk on Thursday; my slides are available here.

 

Full paper - mS2GD Apr 20, 2015

Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting, joint work with Jie Liu, Martin Takáč and Peter Richtárik. This is the full-length version of this short paper, which was presented at the NIPS Optimization workshop.

 

Edinburgh SIAM Student
Chapter Conference Apr 13, 2015

The Edinburgh SIAM Student Chapter, on whose committee I serve, organized its Annual Conference, with speakers including Yves Wiaux (Heriot-Watt University), Marta Faias (New University of Lisbon), Michel Destrade (NUI Galway), Klaus Mosegaard (University of Copenhagen), Samuel Cohen (Oxford) and Julien Dambrine (University of Poitiers).

The aims of the Chapter are to promote interdisciplinary and multi-institutional collaboration, increase student engagement, and raise awareness of the application of mathematical techniques to real-world problems in science and industry.

 

SUTD Singapore Feb 27, 2015

Until Mar 13, I am at the Singapore University of Technology and Design, visiting Ngai-Man Cheung and Selin Damla Ahipasaoglu.

 

MLSS 2015 Sydney Feb 16, 2015

As of today, I am attending the Machine Learning Summer School in Sydney. The program and the people seem very interesting.

 

BASP Jan 26, 2015

I am attending a very interesting multidisciplinary conference, the International Biomedical and Astronomical Signal Processing (BASP) Frontiers workshop, in beautiful Villars-sur-Ollon. The main areas are Cosmology, Medical Imaging and Signal Processing. I am giving a talk on Wednesday (slides) about Semi-Stochastic Gradient Descent.

UPDATE: I received the Best Contribution Award in the area of Signal Processing. Overall, the conference had an amazing atmosphere, and I managed to talk to many interesting people outside of my field. I highly recommend it to anyone!

 

Late updates Jan 7, 2015

The year finished with a very interesting conference, Foundations of Computational Mathematics, in Montevideo. I presented a poster on Mini-batch semi-stochastic gradient descent in the proximal setting (see poster here).

We also managed to release Semi-Stochastic Coordinate Descent, joint work with Zheng Qu and Peter Richtárik. This is the full-length version of this brief paper, presented at the 2014 NIPS workshop on Optimization in Machine Learning.

 

NIPS 2014 Dec 8, 2014

I am attending NIPS (Neural Information Processing Systems) in Montreal. Obviously, it's really cold here.

I am presenting two posters at the Optimization for Machine Learning Workshop. The posters are on Mini-batch semi-stochastic gradient descent in the proximal setting (see poster here) and Semi-Stochastic Coordinate Descent (see poster here).

 

Research visit
Lehigh University Nov 17, 2014

For the following three weeks, I am visiting Martin Takáč at Lehigh University. The optimization group here is strong, so I am looking forward to a hopefully fruitful time. For now, I can say it's surprisingly cold here these days.

And here are some refined slides.

 

Research visit - ETH Zurich Nov 3, 2014

This week, I am on a research visit at the Data Analytics Lab led by Thomas Hofmann at ETH Zurich. First impression is amazing, Zurich seems like a great place to live and work.

Slides from my talk are here.

 

An interesting article Oct 23, 2014

Here is an interesting interview with Michael Jordan about what is really big about Big Data, and more...

 

Two short papers out Oct 19, 2014

This week we released two short papers. The first one, S2CD: Semi-stochastic coordinate descent, co-authored with Zheng Qu and Peter Richtárik, extends the S2GD algorithm to the coordinate setting.

The other one, mS2GD: Mini-batch semi-stochastic gradient descent in the proximal setting, joint work with Jie Liu, Martin Takáč and Peter Richtárik, discusses an extension of the S2GD algorithm to the mini-batch setting, which admits a twofold speedup: one from better gradient estimates, and the other from simple parallelism.
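For intuition, a generic mini-batch variance-reduced gradient estimate of this kind can be written as follows (an illustrative form, not necessarily the exact estimator analyzed in the paper). With $y$ the point at which the full gradient $\nabla f(y)$ was last computed, $x$ the current iterate, and $A$ a mini-batch of size $b$:

$$ g = \nabla f(y) + \frac{1}{b} \sum_{i \in A} \big( \nabla f_i(x) - \nabla f_i(y) \big). $$

Averaging over the mini-batch reduces the variance of the estimate (better estimates), and the $b$ component gradients can be evaluated in parallel (simple parallelism).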

 

New paper out Sep 30, 2014

Simple complexity analysis of direct search, joint work with Peter Richtárik.

See the abstract in the right-hand column --->

 

MMEI 2014 (in a castle!) Sep 9, 2014

Staying in the beautiful Smolenice Castle in Slovakia, attending the International Conference on Mathematical Methods in Economy and Industry, a conference whose tradition dates back to 1973.

 

4th IMA NLAO Sep 3, 2014

This week I am attending the 4th IMA Conference on Numerical Linear Algebra and Optimisation. I co-organized a minisymposium on First-order methods and big data optimization (with Zheng Qu and Peter Richtárik), in which I am also giving a talk.

 

S2GD code available Jul 9, 2014

An efficient implementation of Semi-Stochastic Gradient Descent for logistic regression is available in the MLOSS repository. The code works with MATLAB and is implemented in C++.
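For the curious, here is a rough illustrative Python sketch of an S2GD-style variance-reduced loop for L2-regularized logistic regression. It is only a simplified sketch of the idea (fixed inner-loop length, toy step size, placeholder names and defaults), not the MLOSS implementation:

import numpy as np

def s2gd_logistic(X, y, stepsize=0.1, lam=1e-4, epochs=10, inner_len=None):
    """Illustrative S2GD-style loop for L2-regularized logistic regression.

    X is an (n, d) data matrix and y holds labels in {-1, +1}.
    A simplified sketch, not the MLOSS implementation.
    """
    n, d = X.shape
    w = np.zeros(d)
    inner_len = inner_len or n

    def grad(wv, idx):
        # Gradient of the regularized logistic loss on the rows in idx.
        z = -y[idx] * (X[idx] @ wv)
        sigma = 1.0 / (1.0 + np.exp(-z))
        return X[idx].T @ (-y[idx] * sigma) / len(idx) + lam * wv

    for _ in range(epochs):
        mu = grad(w, np.arange(n))  # full gradient at the snapshot point
        v = w.copy()
        # Inner loop: cheap stochastic steps corrected by the snapshot.
        # (In S2GD the inner-loop length is random; fixed here for simplicity.)
        for _ in range(inner_len):
            i = np.array([np.random.randint(n)])
            v -= stepsize * (grad(v, i) - grad(w, i) + mu)
        w = v
    return w

The outer loop recomputes the full gradient at a snapshot point, and the inner loop takes cheap stochastic steps corrected by that snapshot, which keeps the variance of the updates small.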

 

SIAM Annual Meeting Jul 8, 2014

On Monday I gave an invited talk at the SIAM Annual Meeting in Chicago on the S2GD algorithm (slides). Within the same minisymposium, a very interesting talk was delivered by S. V. N. Vishwanathan on NOMAD (slides).

As a representative of Edinburgh SIAM Student Chapter, I attended a breakfast with other representatives and SIAM executives and got some ideas on how to improve our chapter.

From my own observations, compared to other conferences, this one feels much more like a networking event for anyone involved with SIAM activities.

 

Google PhD Fellowship Jun 18, 2014

I was awarded the 2014 Google Europe Doctoral Fellowship in Optimization Algorithms! The news was announced today in the Google Research Blog.

The University of Edinburgh was particularly successful this year, bagging two awards (each university can nominate up to two students). Out of the 15 European Fellowships, 4 were awarded to universities in the UK, with the other 2 going to Cambridge. The rest went to students in Switzerland (4), Germany (3), Israel (2), Austria (1) and Poland (1).

This is what Google says about these Fellowships:
Nurturing and maintaining strong relations with the academic community is a top priority at Google. Today, we're announcing the 2014 Google PhD Fellowship recipients. These students, recognized for their incredible creativity, knowledge and skills, represent some of the most outstanding graduate researchers in computer science across the globe. We're excited to support them, and we extend our warmest congratulations.

I would like to thank everyone for their support!

 

London Optimization W. Jun 9, 2014

This week, King's College is hosting the London Optimization Workshop. The first day saw, among others, a very interesting talk by Panos Parpas on A Multilevel Proximal Algorithm for Large Scale Composite Convex Optimization.

 

SIAM OP14 May 18, 2014

I am off to San Diego to attend the SIAM Conference on Optimization. I am giving a talk about S2GD (slides). The weather is obviously good :)

 

Funding awarded Apr 25, 2014

Thanks to support from SIAM and the School Research & Development Fund, I am going to attend the 2014 SIAM Annual Meeting in Chicago, where I will represent the Edinburgh SIAM Student Chapter.

 

Logarithm anniversary Apr 2, 2014

It has been 400 years since the Scot John Napier published his Mirifici Logarithmorum Canonis Descriptio. For this work, he is recognised as the inventor of logarithms. On the occasion of this rare anniversary, I attended a workshop that follows on from the last celebration, held in 1914!

 

SIAM Student Conference Mar 14, 2014

I helped organise today's Edinburgh SIAM Student Chapter Conference 2014. Except for minor technical difficulties, all went well, and graduate students had the opportunity to look into different areas of research in applied mathematics.

 

Meet me Jan 25, 2014

In February, you can meet me in Cambridge around Feb 7, attending the LMS meeting on Sparse Regularisation for Inverse Problems, or in London Feb 17-19 at Big Data: Challenges and Applications.

 

Welcome... Jan 23, 2014

...to my new personal web page. I hope it's red enough.

Recent abstract

Federated Optimization: Distributed Optimization Beyond the Datacenter

We introduce a new and increasingly relevant setting for distributed optimization in machine learning, where the data defining the optimization are distributed (unevenly) over an extremely large number of nodes, but the goal remains to train a high-quality centralized model. We refer to this setting as Federated Optimization. In this setting, communication efficiency is of utmost importance.

A motivating example for federated optimization arises when we keep the training data locally on users' mobile devices rather than logging it to a data center for training. Instead, the mobile devices are used as nodes performing computation on their local data in order to update a global model. We suppose that we have an extremely large number of devices in our network, each of which has access to only a tiny fraction of the total data; in particular, we expect the number of data points available locally to be much smaller than the number of devices. Additionally, since different users generate data with different patterns, we assume that no device has a representative sample of the overall distribution.

We show that existing algorithms are not suitable for this setting, and propose a new algorithm which shows encouraging experimental results. This work also sets a path for future research needed in the context of federated optimization.
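To make the setting concrete, here is a hedged Python sketch of one communication round of a generic local-update-and-average scheme over unevenly sized node shards. It uses a placeholder least-squares loss and illustrates the federated setting only; it is not the algorithm proposed in the paper:

import numpy as np

def federated_round(node_data, w_global, local_steps=5, stepsize=0.1):
    """One communication round of a generic local-update-and-average scheme.

    node_data is a list of (X_k, y_k) shards, one per node, typically tiny
    and not identically distributed. This only illustrates the federated
    setting; it is not the algorithm proposed in the paper.
    """
    updates, sizes = [], []
    for X_k, y_k in node_data:
        w = w_global.copy()
        for _ in range(local_steps):
            # Local gradient step on a placeholder least-squares objective.
            g = X_k.T @ (X_k @ w - y_k) / len(y_k)
            w -= stepsize * g
        updates.append(w)
        sizes.append(len(y_k))
    # The server averages the local models, weighting by local data size.
    sizes = np.array(sizes, dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes)

Because each node holds only a tiny, non-representative shard, the cost that matters is the number of communication rounds rather than local computation, which is why communication efficiency dominates the design.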

 
Stop Wasting My Gradients: Practical SVRG

We present and analyze several strategies for improving the performance of stochastic variance-reduced gradient (SVRG) methods. We first show that the convergence rate of these methods can be preserved under a decreasing sequence of errors in the control variate, and use this to derive variants of SVRG that use growing-batch strategies to reduce the number of gradient calculations required in the early iterations. We further (i) show how to exploit support vectors to reduce the number of gradient computations in the later iterations, (ii) prove that the commonly-used regularized SVRG iteration is justified and improves the convergence rate, (iii) consider alternate mini-batch selection strategies, and (iv) consider the generalization error of the method.
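As a rough illustration of the growing-batch idea from the abstract, the following Python sketch replaces the full snapshot gradient of standard SVRG with an estimate over a batch that grows across epochs. The batch schedule, step size and loop lengths here are arbitrary placeholders; the paper's analysis prescribes how these quantities should actually be chosen:

import numpy as np

def svrg_growing_batch(grad_i, n, w0, stepsize=0.01, epochs=10,
                       inner_len=100, batch0=64, growth=2.0):
    """SVRG with the snapshot gradient estimated on a growing batch.

    grad_i(w, i) should return the gradient of the i-th component function
    at w. A sketch of the growing-batch idea only; all schedules here are
    arbitrary placeholders.
    """
    w = np.asarray(w0, dtype=float).copy()
    batch = batch0
    for _ in range(epochs):
        # Estimate the snapshot gradient on a subset instead of all n points;
        # the subset grows so the estimation error shrinks over the epochs.
        B = np.random.choice(n, size=min(int(batch), n), replace=False)
        mu = np.mean([grad_i(w, i) for i in B], axis=0)
        v = w.copy()
        for _ in range(inner_len):
            i = np.random.randint(n)
            v -= stepsize * (grad_i(v, i) - grad_i(w, i) + mu)
        w = v
        batch *= growth
    return w

Early epochs therefore avoid full passes over the data, while later epochs approach the exact SVRG snapshot gradient.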

Education

Achievements & Awards

  • Google Europe Doctoral Fellowship 2014-2017
    Google

    This is what Google says about the Fellowship: Nurturing and maintaining strong relations with the academic community is a top priority at Google. Today, we're announcing the 2014 Google PhD Fellowship recipients. These students, recognized for their incredible creativity, knowledge and skills, represent some of the most outstanding graduate researchers in computer science across the globe. We're excited to support them, and we extend our warmest congratulations.

  • Best Contribution Award 2015
    Biological and Astronomical Signal Processing workshop

    in the area of Signal Processing

  • Principal’s Career Development Scholarship 2013
    University of Edinburgh

Highly competitive PhD scholarship awarded to several students across the University.

  • ChaLearn Gesture Challenge, 2nd place 2012
    ChaLearn

The purpose of the competition was to develop a one-shot-learning gesture recognizer for the Microsoft Kinect. We were awarded a travel grant to present our solution at the International Conference on Pattern Recognition 2012, and were invited to submit a paper to the Journal of Machine Learning Research. The paper has been accepted for publication.

  • International Mathematical Olympiad 2010
    Honourable mention

    Astana, Kazakhstan

  • Olympiad in Informatics, national round 2010
    11th place
  • Middle European Mathematical Olympiad 2008
    Honourable mention

    Olomouc, Czech Republic

Work experience

  • Manager (volunteering) 2010 - 2013
    Trojsten

Trojsten is a Slovak NGO working with the most talented high school students in the fields of mathematics, physics and informatics. Trojsten organizes three correspondence competitions and week-long camps for the best contestants. Apart from being an organizer of the mathematical part, I was also in charge of finance and administration of the entire organization, which involves approximately 70 volunteers.

  • Co-founder 2011 - 2012
    Vocablr.com

    Vocablr is a language teaching platform designed to make learning new vocabulary fun. I was responsible for most of the non-design part. We were also part of a Slovak startup accelerator program.

  • Coordinator 2010
    Slovak Mathematical Olympiad

    Coordination of sub-national rounds.

Download My Resume


Fields of Academic Interest

  • Convex Programming 100%
  • Machine Learning 95%
  • Computer Vision 60%
  • Rand. Numerical Algebra 45%
  • Something new 75%

Programming Skills

  • MATLAB 95%
  • C++/C# 65%
  • LaTeX 85%

Art

  • Painting 2%
  • Singing 0%
  • Playing guitar 17%

Publications

Major Talks (Past and upcoming)

  • INFORMS Annual Meeting (invited) Nov 1-4, 2015
    Philadelphia
  • Comenius University Oct 21, 2015
    Bratislava, Slovakia
  • 22nd International Symposium on Mathematical Programming (invited) Jul 12-17, 2015
Pittsburgh, USA
  • Optimization and Big Data 2015 (invited) May 6-8, 2015
    Edinburgh, UK
  • Edinburgh Compressed Sensing group seminar Mar 18, 2015
    Edinburgh, UK
  • Singapore University of Technology and Design, seminar talk (invited) Mar 4, 2015
    Singapore, Singapore
  • International Biomedical and Astronomical Signal Processing (BASP) Frontiers workshop (invited) Jan 25-30, 2015
    Villars-sur-Ollon, Switzerland
  • Lehigh University, seminar talk (invited) Dec 3, 2014
    Bethlehem, USA
  • ETH Zurich, seminar talk (invited) Nov 3, 2014
    Zurich, Switzerland
  • Mathematical Methods in Economy and Industry Sep 8-12, 2014
    Smolenice Castle, Slovakia
  • 4th IMA Conference on Numerical Linear Algebra and Optimisation Sep 3-5, 2014
Birmingham, UK
    (minisymposium organiser)
  • SIAM Annual Meeting (invited) Jul 7-11, 2014
Chicago, USA
  • SIAM Conference on Optimization May 19-22, 2014
San Diego, USA
  • International Student Academic Conference on Applied Mathematics and Informatics (invited) May 22, 2013
    Opava, Czech Republic
  • International Conference on Pattern Recognition (invited) Nov 10-15, 2012
    Gesture recognition workshop
    Tsukuba, Japan

Contact Info

  • University of Edinburgh
    James Clerk Maxwell Building, 5406
    Peter Guthrie Tait Road
    Edinburgh
    EH9 3FD
  • J.Konecny@sms.ed.ac.uk
  • prachetit

Keep In Touch


 
*By using this form, you risk that your message will not be delivered. Why not use normal e-mail instead?

Introduction

Hello! Here you'll learn how we designed this theme. Read all the text to learn the details about the Metroid vCard Template. It's a powerful personal page to start your web life, because it's simple, fast, animated, beautiful, colorful, and it brings many possibilities!

Exclusive Grid System

The grid demo illustrates the available column classes (a new row starts with .first):

  • .col .c1
  • .col .c2-1 .first | .col .c2-1
  • .col .c3-1 .first | .col .c3-1 | .col .c3-1
  • .col .c3-1 .first | .col .c3-2

Creating new pages

Is it easy to do?

A: Yes. Simply add a <section class="content"> into the <div id="page"></div> and create an anchor link with the class="menu".

See the example:
<a href="#blog" id="blog" class="menu">Blog Link</a>

<div id="page">
  <section id="blog-page" class="content">
    <div class="inner">
      ...page content
    </div>
  </section>

</div>

Changing the content

Metroid was designed to be simple to use and customize. Here's how easy it is to use it.

How to change the text of my site?

A: Everything that is displayed on the screen, whether text, form boxes, portfolio or progress bars, is written in a single file named "index.html". The screen changes are simulated by script calls, which is why our theme is considered a one-page theme.

How to change profile picture?

A: Go to the folder "img/profile/" and replace the file "photo.jpg". We recommend that you use a square photo with 245x245 pixels.

Note: We recommend that you use the same sizes for the profile and blog pictures, to keep the layout appropriate in all resolutions (desktop, tablet and mobile).

Icons | Credits: Font Awesome (http://fortawesome.github.io/Font-Awesome/)

Changing the style

Metroid comes with predefined CSS files in 10 different colors and tones. In addition, we selected 20 patterns to choose from as the background.
Patterns | Credits: Subtle Patterns (http://subtlepatterns.com/)

How to do?

A: Just choose the color and pattern you want and change the CSS call in the index.html file on line 32. To configure the desired pattern, go to the folder "img/bg/", note the pattern's number, and set it on line 54 of index.html.

See the example:
Line 32: <link rel="stylesheet" type="text/css" href="css/colors/color-name.css" id="color" />
Change 'color-name' to the name of the desired color.

Line 54: <body class="bg01">
Change '01' to any value from '01' to '20'.

Can I set up my colors in the theme?

A: Yes, simply access the file "custom.css", include your colors there, comment out line 32, and uncomment line 33.

Typography

Metroid uses fonts from Google Fonts. There are many amazing font styles in Google Fonts to choose from, so you can pick the font that suits you best and apply it to your new website. We are using the font "Ubuntu" on all of Metroid's pages. Enjoy and customize!

Use simple tags to create your headers, texts and lists.

Headers

<h1>Title</h1>
<h2>Title</h2>
<h3>Title</h3>
<h4>Title</h4>
<h5>Title</h5>
<h6>Title</h6>

Texts

<p>Paragraph</p>
<label>Label</label>
<span>Span</span>
<code>Code</code>
<pre>Pre</pre>
<a>Anchor</a>

Lists

<ul>
<li>item</li>
<li>item</li>
<li>item</li>
<li>item</li>
</ul>

Need Support

Still have questions?

Well, thanks so much for purchasing our item! We really appreciate it and hope you enjoy it! If you need support, you can leave your questions or doubts in the comments area of the item. We usually get back to you within 2-12 hours (except during holiday seasons, which might take longer).

Theme by Ararazu Themes.