Latest News

Dec 22, 2015
My paper “Scalable Gaussian Process Classification via Expectation Propagation”, with Daniel Hernández-Lobato, has been accepted at AISTATS 2016 with an oral presentation (only 35 of 537 submissions were selected for oral presentation).
Dec 2, 2015
I have uploaded to arXiv the paper “A General Framework for Constrained Bayesian Optimization” with Mike, Matt, Ryan and Zoubin. The pdf is available here.
Dec 1, 2015
I will be presenting 4 posters at the NIPS workshops. These are “Black-box alpha-divergence minimization”, “Predictive Entropy Search for Multi-objective Bayesian Optimization”, “Stochastic Expectation Propagation for Large Scale Gaussian Process Classification” and “Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation”. You can find these in the Workshops section on the Publications tab.
Sep 4, 2015
My paper “Stochastic Expectation Propagation”, with Yingzhen Li and Rich Turner, has been accepted for presentation at NIPS 2015 with a spotlight (only 4% of submitted papers received a spotlight).
May 21, 2015
I gave a talk on my work “Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks” at the workshop on Gaussian Process Approximations. The slides can be found here.
Apr 25, 2015
Three of my papers were accepted for presentation at the International Conference on Machine Learning (ICML) 2015. The titles are “Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks”, “Predictive Entropy Search for Bayesian Optimization with Unknown Constraints” and “A Probabilistic Model for Dirty Multi-task Feature Selection”.
Mar 31, 2015
I gave a talk on Bayesian optimization and information-based methods at the Machine Learning meetup in Boston. The slides can be found here.
Nov 17, 2014
My paper “Expectation Propagation in Linear Regression Models with Spike-and-slab Priors”, with Daniel Hernández-Lobato and Alberto Suárez, has been accepted for publication in the journal Machine Learning.
Nov 9, 2014
Four of my extended abstracts have been accepted for presentation at the Workshops of the Neural Information Processing Systems (NIPS) conference. The titles of the abstracts are: “Predictive Entropy Search for Bayesian Optimization with Unknown Constraints”, “Predictive Entropy Search for Efficient Global Optimization of Black-box Functions”, “Learning the Semantics of Discrete Random Variables: Ordinal or Categorical?” and “A Probabilistic Model for Dirty Multi-task Feature Selection”. The links to the pdf files can be found in the Publications tab.
Sep 9, 2014
Two of my papers have been accepted for presentation at the Neural Information Processing Systems (NIPS) conference: “Predictive Entropy Search for Efficient Global Optimization of Black-box Functions” with Matt and Zoubin and “Gaussian Process Volatility Model” with Yue and Zoubin.
May 13, 2014
I was awarded a prestigious fellowship from the Rafael del Pino Foundation to complete a two-year postdoctoral research program in the Harvard Intelligent Probabilistic Systems group, where I will be working from September 2014 on machine learning and Bayesian optimization with Prof. Ryan Adams.
April 12, 2014
Three of my papers with N. Houlsby and Z. Ghahramani have been accepted for presentation at the International Conference on Machine Learning (ICML) 2014.
February 20, 2014
I gave a talk on Determinantal Point Processes with Hong Ge at the Machine Learning Group in Cambridge. The slides for the talk are available here.
February 12, 2014
I gave a talk on Gaussian process conditional copulas at the first MSR-MLG (Microsoft Research and Machine Learning Group) joint meeting in Cambridge. The video is available here.
October 21, 2013
I will be presenting the work “Stochastic Inference for Scalable Probabilistic Modeling of Binary Matrices” at the NIPS Workshop on Randomized Methods for Machine Learning, this December in Lake Tahoe, USA.
October 12, 2013
I will be giving a talk as part of the Oxford-Man Institute Monday Sandwich Seminar Series at the University of Oxford on Monday the 28th. The talk will be about conditional copulas with applications to financial time series.
October 12, 2013
This website, http://jhml.org, was set up.

Short Biography

I am currently a postdoctoral fellow in the Harvard Intelligent Probabilistic Systems group at the School of Engineering and Applied Sciences of Harvard University, working with the group leader, Prof. Ryan Adams. This position is funded through a postdoctoral fellowship from the Rafael del Pino Foundation.

Before that, I was a postdoctoral research associate in the Machine Learning Group at the Department of Engineering of the University of Cambridge (UK) from June 2011 to August 2014, working with Prof. Zoubin Ghahramani. During my first two years in Cambridge I worked on a collaborative project with the Indian multinational company Infosys Technologies. I also spent two weeks giving lectures on Bayesian machine learning at Charles University in Prague (Czech Republic). From December 2010 to May 2011, I was a teaching assistant at the Computer Science Department of Universidad Autónoma de Madrid (Spain), where I completed my Ph.D. and M.Phil. in Computer Science in December 2010 and June 2007, respectively. I also obtained a B.Sc. in Computer Science from the same institution in June 2004, with a special prize for the best academic record on graduation.

My research revolves around model-based machine learning, with a focus on probabilistic learning techniques and a particular interest in Bayesian optimization, matrix factorization methods, copulas, Gaussian processes and sparse linear models. A general feature of my work is an emphasis on fast methods for approximate Bayesian inference that scale to large datasets. The results of my research have been published in top machine learning journals (Journal of Machine Learning Research) and conferences (NIPS and ICML).

Contact Information

Maxwell-Dworkin, Room 209
33 Oxford Street
Cambridge, MA 02138
Email: jmh-at-seas.harvard.edu

Curriculum Vitae

My CV can be downloaded from this link [pdf].