Advanced Topics in Machine Learning

On 8 April 2015 we will meet in Baskin Engineering 169 as usual.

Starting 10 April 2015, the class will move to SocSci 2, Room 159.

This is a research reading course on advanced topics in machine learning. We will discuss both frequentist and Bayesian algorithms. The focus will be on a unifying perspective as well as scalability. Students are expected to do a fair amount of independent reading and information gathering. Selected topics that we will cover include:

  • Kernel density estimation
  • Exponential families
  • Density estimation with exponential families
  • Unconstrained optimization 
    • Batch
    • Stochastic
  • Conditional densities
  • Logistic regression
    • Binary
    • Multiclass
    • Multilabel
    • Ranking loss (RobiRank)
  • Support Vector Machines
    • Binary
    • Multiclass
    • Multilabel
    • Quadratic programming
    • Scaling out (RERM)
  • Mixture models
    • General exponential family models
    • Dirichlet process mixture models
    • Chinese restaurant process
    • LDA
    • Large scale inference in LDA (nomad LDA)
    • Indian buffet process
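As a taste of the first topic, kernel density estimation, here is a minimal sketch (a NumPy-only example of our own; the Gaussian kernel and the bandwidth of 0.3 are arbitrary illustrative choices, and the function name is not from any library):

```python
import numpy as np

def gaussian_kde(samples, x, bandwidth):
    """Evaluate a Gaussian kernel density estimate at the points x."""
    # Pairwise scaled differences between evaluation points and samples
    diffs = (x[:, None] - samples[None, :]) / bandwidth
    # Gaussian kernel centered at each sample, averaged over samples
    kernels = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / bandwidth

rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=1.0, size=1000)
grid = np.linspace(-3, 3, 61)
density = gaussian_kde(samples, grid, bandwidth=0.3)
```

The estimate smooths the empirical distribution of the samples; its quality depends heavily on the bandwidth, a theme that recurs when we connect kernel density estimation to exponential families.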


Papers:

  • Y. W. Teh. A Bayesian Interpretation of Interpolated Kneser-Ney. Technical report. http://people.ee.duke.edu/~lcarin/Teh_TechReport.pdf
  • S. Goldwater, T. L. Griffiths, and M. Johnson. Producing Power-Law Distributions and Damping Word Frequencies with Two-Stage Language Models. JMLR, 2011. http://www.jmlr.org/papers/volume12/goldwater11a/goldwater11a.pdf

Instructors and Assistants