New paper out Nov 25, 2016
We have released a new paper,
Randomized Distributed Mean Estimation: Accuracy vs Communication,
joint with
Peter Richtárik. We consider the problem of distributed computation of the
arithmetic average of vectors under constraints on communication. We propose a
flexible family of randomized algorithms that explore the tradeoff between
expected communication cost and estimation error.
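As a toy illustration of this tradeoff (my own sketch, not the scheme from the paper; all function names are hypothetical): each node can stochastically round its vector's coordinates to one of two levels before sending, so a message costs roughly one bit per coordinate, while the server's average stays unbiased at the price of extra variance.

```python
import numpy as np

def quantize(x, rng):
    """Stochastic binary quantization: round each coordinate to
    min(x) or max(x), with probabilities chosen so that the
    quantized vector equals x in expectation."""
    lo, hi = x.min(), x.max()
    if hi == lo:                        # constant vector: nothing to quantize
        return x.copy()
    p_up = (x - lo) / (hi - lo)         # P[round up to hi], preserves the mean
    up = rng.random(x.shape) < p_up
    return np.where(up, hi, lo)

def estimate_mean(vectors, seed=0):
    """Server-side estimate: average of the nodes' quantized messages.
    Unbiased, but trades estimation accuracy for cheaper communication."""
    rng = np.random.default_rng(seed)
    return np.mean([quantize(v, rng) for v in vectors], axis=0)
```

Transmitting the two levels plus one bit per coordinate is far cheaper than sending full floats; the family of protocols in the paper explores this accuracy-versus-communication curve far more carefully.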
Federated Learning Oct 19, 2016
We have recently released two new papers in collaboration
with Google, focusing on Federated Learning, a setting
where we need to learn from massively decentralized data,
such as when the training data reside on the phones of the users who
generated them.
The first work,
Federated Optimization: Distributed Machine Learning for
On-Device Intelligence, describes the setting in detail and
proposes algorithms, with a particular focus on the convex setting.
In
Federated Learning: Strategies for Improving Communication
Efficiency, we address the communication efficiency of systems
for Federated Learning. In the best case, we were able to train
a deep model while communicating fewer bits over the network
than the size of the original data itself.
Learning, Privacy, Mobile Data Sep 13, 2016
I am attending
Learning, Privacy, and Mobile Data Workshop organized by
Google in Seattle. Organized by the team I interned with, the workshop
aims to better understand the interactions between the fields of Machine
Learning, Optimization, Privacy and Cryptography, to facilitate the
move to mobile computing devices and fundamentally change the way
machine learning is deployed.
IMA Birmingham Sep 7, 2016
Together with colleagues from our research group, we organized
two minisymposia at the
5th IMA Conference on Numerical Linear Algebra and Optimization
in Birmingham.
I spoke about our recent work,
AIDE: Fast and Communication Efficient Distributed Optimization.
New paper out [AIDE] Aug 29, 2016
In collaboration with Sashank Reddi,
Barnabás Póczos and
Alex Smola
from Carnegie Mellon University, and
Peter Richtárik, we propose and analyse a new algorithm,
AIDE: Fast and Communication Efficient Distributed Optimization,
in short, an accelerated inexact DANE algorithm.
Earlier this summer, I presented a poster
on AIDE at the International Conference on Machine Learning 2016 in New York.
PhD Fellowship Summit Aug 23, 2016
I am speaking at the
Google PhD Fellowship Summit in Mountain View, about my
experience of merging research efforts in academia and in Google.
I also met many passionate young researchers who enjoy
the same support from Google during their studies, a very motivating
experience.
Google Summer Internship May 16, 2016
Until the end of August, I am at Google Seattle, doing a summer internship,
working mainly with
Brendan McMahan. I will be working on some of the algorithmic
challenges in Federated Optimization/Learning: decentralized and highly unbalanced datasets.
Future of Humanity Institute Mar 14, 2016
As of today, I am visiting the
Future of Humanity Institute in Oxford, trying to better
understand where the ideas regarding AI safety stand at the
moment and how I might be able to contribute. I highly recommend
reading Superintelligence, if you haven't yet.
Optimization Without Borders Feb 7, 2016
I am excited to be attending
Optimization Without Borders in beautiful
Les Houches, beneath Mont Blanc. The event is dedicated to the
60th birthday of Yuri Nesterov and his lifelong contributions
to the field.
Late updates Jan 11, 2016
Thanks to everybody at NIPS for the amazing atmosphere,
inspiring discussions and many new ideas. I presented
Stop Wasting My Gradients: Practical SVRG, which I coauthored
with Reza Babanezhad, Mohamed Osama Ahmed, Alim Virani,
Mark Schmidt and Scott Sallinen from University of British Columbia.
I also presented a poster on Federated Optimization: Distributed Optimization
Beyond the Datacenter at the Optimization in Machine Learning workshop.
Before that (Nov 25-27), I spoke at one of the scoping workshops of
The
Alan Turing Institute, which is being set up, on the topic of
Distributed Machine Learning and Optimization.
Even before that (Nov 23) I briefly visited the
Future of
Humanity Institute in Oxford, and discussed some of the
long-term dangers arising from technological development. For
those of you who haven't noticed it yet, I highly recommend
Nick Bostrom's book Superintelligence. The book is free of bold,
unjustified predictions, and instead summarizes the open problems
and discusses current ideas.
Federated Optimization Nov 14, 2015
New preliminary paper out!
Federated Optimization: Distributed Optimization Beyond the Datacenter ,
which is based on my summer work in Google with
Brendan McMahan. We introduce an
important new setting in which data are distributed across a very large
number of computers, each having access to only a few data points.
This is primarily motivated by the setting where users keep their
data on their devices, but the goal is still to train a high-quality
global model.
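A minimal, hypothetical sketch of this setting (my own toy illustration, not the algorithm from the paper): many nodes each hold just a couple of least-squares data points, and in each communication round every node takes one local gradient step from the shared model, after which the server averages the returned models.

```python
import numpy as np

def local_step(w, X, y, lr=0.1):
    """One local least-squares gradient step on a node's few data points."""
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def federated_round(w, nodes, lr=0.1):
    """One communication round: broadcast w, each node updates locally,
    and the server averages the returned models."""
    return np.mean([local_step(w, X, y, lr) for X, y in nodes], axis=0)

# Toy data: 20 nodes, each holding only 2 points from the same linear model.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
nodes = []
for _ in range(20):
    X = rng.standard_normal((2, 2))
    nodes.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(300):
    w = federated_round(w, nodes)
```

The interesting algorithmic questions start exactly where this toy breaks down: real devices are unbalanced, non-i.i.d., and communication rounds are expensive, so one wants far fewer rounds than plain averaged gradient steps require.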
Practical SVRG Nov 10, 2015
New paper out!
Stop Wasting My Gradients: Practical SVRG,
which I coauthored with Mark Schmidt and his group, has been
accepted to the
NIPS conference. I am looking forward to meeting you all there!
INFORMS Annual Meeting Nov 1, 2015
Not many people showed up in Halloween costumes at
INFORMS Annual Meeting in Philadelphia. Never mind, I am
feeling overwhelmed by the 80 parallel sessions going on. I am giving
a talk on Federated Optimization on Monday. Slides
are here;
the paper will be available soon (currently available upon request).
Alma Mater Oct 21, 2015
I gave a talk at a seminar at Comenius University in Bratislava, speaking about
S2GD and
the work on Federated Optimization I did during the summer at Google, the next
step in large-scale distributed optimization. Preliminary paper coming out soon...
ISMP 2015 Jul 17, 2015
I am giving a talk today at the ISMP
conference in Pittsburgh. So far, a very inspiring event,
in a surprisingly pleasant city.
Google Summer Internship May 26, 2015
Until the end of August, I am at Google, doing a summer internship,
working mainly with
Brendan McMahan. Hope there's lots of cool stuff to learn!
Optimization & Big Data 2015 May 6, 2015
At home in Edinburgh, we are organizing a three-day workshop,
Optimization & Big Data 2015,
focused on large-scale optimization, with keynote speaker
Arkadi Nemirovski. I am giving an invited talk on Thursday;
my slides are available here.
Full paper - mS2GD Apr 20, 2015
Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting,
joint work with Jie Liu,
Martin Takáč and
Peter Richtárik.
This is the full-length version of the following
short paper,
which was presented at the NIPS Optimization workshop.
Edinburgh SIAM Student
Chapter Conference Apr 13, 2015
The
Edinburgh SIAM Student Chapter, on whose committee I serve,
organized its Annual Conference, with speakers including
Yves Wiaux (Heriot-Watt University),
Marta Faias (New University of Lisbon),
Michel Destrade (NUI Galway),
Klaus Mosegaard (University of Copenhagen),
Samuel Cohen (Oxford) and
Julien Dambrine (University of Poitiers).
The aims of the Chapter are to promote interdisciplinary and
multi-institutional collaboration, increase student engagement,
and raise awareness of the application of mathematical
techniques to real-world problems in science and industry.
SUTD Singapore Feb 27, 2015
Until Mar 13, I am at the Singapore University of Technology
and Design, visiting
Ngai-Man Cheung and
Selin Damla Ahipasaoglu.
MLSS 2015 Sydney Feb 16, 2015
As of today, I am attending
Machine Learning Summer School in Sydney. It seems to be
full of interesting people, with an interesting program.
BASP Jan 26, 2015
I am attending a very interesting multidisciplinary conference,
the International Biomedical
and Astronomical Signal Processing (BASP) Frontiers workshop
in beautiful Villars-sur-Ollon.
The main areas are Cosmology, Medical Imaging and Signal Processing.
I am giving a talk on Wednesday
(slides) about Semi-Stochastic Gradient Descent.
UPDATE: I received the Best Contribution Award in the area of Signal Processing.
Overall, the conference had an amazing atmosphere, and I managed
to talk to many interesting people outside of my field. Highly recommended to anyone!
Late updates Jan 7, 2015
The year finished with a very interesting conference,
Foundations of Computational Mathematics in Montevideo.
I presented a poster on Mini-batch semi-stochastic gradient descent
in the proximal setting
(see poster here).
We also managed to release
Semi-Stochastic Coordinate Descent,
joint with Zheng Qu
and Peter Richtárik.
This is the full-length version of
this brief paper, presented at the 2014 NIPS workshop on
Optimization in Machine Learning.
NIPS 2014 Dec 8, 2014
I am attending NIPS
(Neural Information Processing Systems) in Montreal. Obviously,
it's really cold here.
I am presenting two posters at the
Optimization for Machine Learning Workshop. The posters are on
Mini-batch semi-stochastic gradient descent in the proximal setting
(see poster here) and
Semi-Stochastic Coordinate Descent
(see poster here).
Research visit
Lehigh University Nov 17, 2014
The following three weeks, I am visiting
Martin Takáč at Lehigh University.
The optimization group here is strong, so I am looking forward to a hopefully fruitful time.
For now, I can say it's surprisingly cold here these days.
And here are some refined slides.
Research visit  ETH Zurich Nov 3, 2014
This week, I am on a research visit at the
Data Analytics Lab
led by Thomas Hofmann at ETH Zurich. First impression is amazing,
Zurich seems like a great place to live and work.
Slides from my talk are here.
An interesting article Oct 23, 2014
Here is an interesting interview with
Michael Jordan
about what the big thing about Big Data really is, and more...
Two short papers out Oct 19, 2014
This week we released two short papers. First one,
S2CD: Semi-stochastic
coordinate descent, coauthored with Zheng Qu and Peter Richtárik,
extends the S2GD algorithm to the coordinate descent setting.
The other one,
mS2GD: Mini-batch semi-stochastic gradient descent in the proximal setting,
joint with Jie Liu, Martin Takáč and Peter Richtárik, discusses
an extension of the S2GD algorithm to the mini-batch setting, which
admits a twofold speed-up: one from better gradient estimates, the
other from simple parallelism.
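To illustrate the semi-stochastic idea behind S2GD/mS2GD, here is a simplified sketch under my own assumptions (no proximal step, illustrative names and toy data): an outer loop computes a full gradient at a snapshot, and inner iterations combine it with mini-batch gradient differences, which reduces variance while the mini-batch terms can be evaluated in parallel.

```python
import numpy as np

def semi_stochastic_gd(grad_i, n, w0, outer=30, inner=50, batch=5,
                       lr=0.03, seed=0):
    """Sketch of mini-batch semi-stochastic gradient descent.
    grad_i(w, i) returns the gradient of the i-th loss term at w."""
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(outer):
        snap = w.copy()
        # Full gradient at the snapshot, computed once per outer loop.
        full = np.mean([grad_i(snap, i) for i in range(n)], axis=0)
        for _ in range(inner):
            S = rng.choice(n, size=batch, replace=False)
            # Variance-reduced estimate: unbiased, and its noise shrinks
            # as w stays close to the snapshot.
            g = np.mean([grad_i(w, i) - grad_i(snap, i) for i in S],
                        axis=0) + full
            w = w - lr * g
    return w

# Toy problem: noiseless least squares, so the optimum is exactly w_true.
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 3))
w_true = np.array([1.0, -1.0, 0.5])
y = X @ w_true
grad_i = lambda w, i: X[i] * (X[i] @ w - y[i])
w = semi_stochastic_gd(grad_i, 40, np.zeros(3))
```

The mini-batch plays both roles mentioned above: averaging over the batch lowers the variance of the correction term, and the per-sample gradients inside the batch are independent, hence trivially parallelizable.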
New paper out Sep 30, 2014
Simple complexity analysis of direct search, joint with
Peter Richtárik.
See the abstract in the right-hand column.
MMEI 2014 (in a castle!) Sep 9, 2014
Staying in the beautiful Smolenice Castle in Slovakia, attending the
International Conference on Mathematical Methods in Economy and Industry,
a conference dating its tradition back to 1973.
4th IMA NLAO Sep 3, 2014
This week I am attending
4th IMA Conference on Numerical Linear Algebra and Optimisation.
I co-organized a minisymposium on first-order methods and big data
optimization (together with Zheng Qu and Peter Richtárik),
in which I am also giving a talk.
S2GD code available Jul 9, 2014
An efficient implementation of Semi-Stochastic Gradient Descent
for logistic regression is available in the
MLOSS repository. The code works with MATLAB and is
implemented in C++.
SIAM Annual Meeting Jul 8, 2014
On Monday I gave an invited talk at the SIAM Annual Meeting in Chicago
on the S2GD algorithm (slides) .
Within the same minisymposium, a very interesting talk was delivered by S V N Vishwanathan on NOMAD
(slides).
As a representative of Edinburgh SIAM Student Chapter, I attended
a breakfast with other representatives and SIAM executives and got
some ideas on how to improve our chapter.
From my observations, the whole conference feels much more
like a networking event for anyone involved with SIAM activities
than other conferences do.
Google PhD Fellowship Jun 18, 2014
I was awarded the 2014 Google Europe Doctoral Fellowship in Optimization Algorithms!
The news was announced today in the
Google Research Blog.
The University of Edinburgh was particularly successful
this year, bagging two awards (each University can nominate up to two students).
Out of the 15 Europe Fellowships, 4 were awarded to universities in the UK,
the 2 others going to Cambridge. The rest went to students in Switzerland (4),
Germany (3), Israel (2), Austria (1) and Poland (1).
This is what Google says about these Fellowships:
Nurturing and maintaining
strong relations with the academic community is a top priority at Google.
Today, we're announcing the 2014 Google PhD Fellowship recipients. These students,
recognized for their incredible creativity, knowledge and skills, represent some
of the most outstanding graduate researchers in computer science across the globe.
We're excited to support them, and we extend our warmest congratulations.
I would like to thank everyone for their support!
London Optimization W. Jun 9, 2014
This week, King's College is hosting
the London Optimization Workshop. The first day saw, among others, a very interesting talk
by Panos Parpas on
A Multilevel Proximal Algorithm for Large Scale Composite Convex Optimization.
SIAM OP14 May 18, 2014
I am off to San Diego to attend
SIAM Conference on Optimization. I am giving a talk about S2GD
(slides) .
The weather is obviously good :)
Funding awarded Apr 25, 2014
Thanks to support from SIAM and School Research & Development Fund, I am going to attend
2014 SIAM Annual Meeting
in Chicago, where I will represent Edinburgh SIAM Student Chapter.
Logarithm anniversary Apr 2, 2014
It has been 400 years since the Scot John Napier
published his Mirifici Logarithmorum Canonis Descriptio.
For this work, he is recognised as the inventor of logarithms.
On the occasion of this rare anniversary, I attended a
workshop that follows on from the last celebration, held in 1914!
SIAM Student Conference Mar 14, 2014
I helped organise today's Edinburgh SIAM Student Chapter Conference 2014. Apart from minor technical difficulties, all went well, and graduate students had the opportunity to look into different areas of research in applied mathematics.
Meet me Jan 25, 2014
In February, you can meet me in Cambridge around Feb 7, attending the LMS meeting on Sparse Regularisation for Inverse Problems, or in London Feb 17-19 at Big Data: Challenges and Applications.
Welcome... Jan 23, 2014
...to my new personal web page. I hope it's red enough.