Tuesday, November 20, 2012

Useful ML coding tips

These are a few things that are important in ML coding. I too sometimes forget them.. :P
  • Use version control.
  • Separate code from data.
  • Separate input data, working data and output data.
  • Modify input data with care.
  • Save everything to disk frequently.
  • Separate options from parameters.
  • Do not use global variables.
  • Record the options used to generate each run of the algorithm.
% store the results
serialise(options, 'options.dat', options.working_path);
serialise(parameters, 'parameters.dat', options.working_path); 
OR
serialise(parameters, 'parameters.dat', ...
              [options.working_path '_' options_configuration.name]);
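The serialise calls above are pseudocode; a minimal sketch of what such a helper could look like in MATLAB, using save (the name and signature are simply taken from the snippet above):

function serialise(variable, filename, directory)
% SERIALISE  Save a variable to disk under the given directory.
% Note: the variable is stored in the .mat file under the name 'variable'.
    if ~exist(directory, 'dir')
        mkdir(directory);   % create the working directory if needed
    end
    save(fullfile(directory, filename), 'variable');
end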
  • Make it easy to sweep options.
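For example, a minimal sketch of sweeping a single option (default_options and run_single_experiment here are hypothetical helpers, not part of the snippets above):

% sweep the learning rate over a few values
for lr = [0.001 0.01 0.1]
    options = default_options();        % hypothetical: returns a default options struct
    options.learning_rate = lr;
    run_single_experiment(options);     % hypothetical: runs one experiment with these options
end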
  • Make it easy to execute only portions of the code.
run_experiment('dataset_1_options', '|preprocess_data|initialise_model|train_model|'); 
  • Use checkpointing.
% set the options
options = ...

% load the data
data = ...

if saved_state_exists(options)

    % load from disk
    [parameters, state] = deserialize_latest_params_state(options.working_path);

    % command line output
    disp(['Starting from iteration ' num2str(state.iteration)]);

else

    % initialize
    parameters = init_parameters();
    state = init_state();

end

% learn the parameters
parameters = train_model(options, data, parameters, state);
  • Write demos and tests.
From hunch.net, which in turn traces back to the original source (link).

Monday, October 1, 2012

Compiling mex files on 64bit Linux(UBUNTU) using MATLAB 2012

I could not set up the compiler myself to create mex files, so I searched the internet, going through many blogs and posts. Finally I found a tutorial and two posts which I have almost copied shamelessly (maybe for my own reference.. :) ). The source links are at the end of the post.

This is a simple guide to compiling and running mex files from MATLAB R2012a on Ubuntu 12.04 64-bit.


I tried to compile my sweet hello.c file (you can find hello.c further down in this post) using this command in MATLAB:

>> mex hello.c

I had no idea what the problem was, but I was getting a warning:

Warning: You are using gcc version "4.6.3-1ubuntu5)".  The version currently supported with MEX is "4.4.6".


I searched the internet, and a few blogs said I need not downgrade to 4.4.6 (but I still installed gcc-4.4 alongside gcc-4.6).

Run this command in a terminal:
sudo apt-get install gcc-4.4

The gcc version compatible with MATLAB R2012a is gcc-4.4, while the version pre-installed on Ubuntu 12.04 is gcc-4.6.

Just follow these steps:

1. Open terminal and type
sudo gedit /usr/local/MATLAB/R2012a/bin/mexopts.sh
(or wherever you installed MATLAB)

2. Change 'gcc' to 'gcc-4.4', 'g++' to 'g++-4.4', and 'gfortran' to 'gfortran-4.4' at all instances of CC='gcc', CXX='g++', and FC='gfortran'.
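After the edit, the compiler lines in the glnxa64 section of mexopts.sh should look something like this (the exact surrounding contents vary between installations, so treat this as a guide rather than the full file):

CC='gcc-4.4'
CXX='g++-4.4'
FC='gfortran-4.4'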
Save the file and exit.

3. Open MATLAB and type this at the command line:

mex -setup

MATLAB will show the following:

The options files available for mex are:


  1: /usr/local/MATLAB/R2012a/bin/mexopts.sh :
      Template Options file for building gcc MEX-files


  0: Exit with no changes

Enter the number of the compiler (0-1):

Select 1.

The setup is complete. Now it's time to test it.

4. If you have your own code, that's fine. Otherwise, here is some sample code (hello.c :P):

//SAMPLE1
#include <mex.h>

/* entry point of the mex file: just prints a greeting */
void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
  mexPrintf("Hello World!\n");
}

//SAMPLE2

#include "mex.h"
#include "matrix.h"

/* computes and prints the average of each column of the input matrix */
void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    const mxArray *xdata;
    double *xValues;
    int i, j, nCols, nRows;
    double avg;

    /* first input argument: a real double matrix */
    xdata = prhs[0];
    xValues = mxGetPr(xdata);

    nCols = mxGetN(xdata);   /* number of columns */
    nRows = mxGetM(xdata);   /* number of rows */

    /* MATLAB stores matrices column-major, so walking the data
       linearly visits one full column at a time */
    for (i = 0; i < nCols; i++)
    {
        avg = 0.0;
        for (j = 0; j < nRows; j++)
            avg += *(xValues++);

        avg = avg / nRows;
        mexPrintf("avg of column %d is %f \n", i + 1, avg);
    }
}

A short, fast tutorial on creating mex files:
pdf
epub (looks good.. I mean, soothing to the eyes.. :P)

Save the text file as "hello.c".

Again compile your sweet "hello.c" in MATLAB command line:
>> mex hello.c

Now I got the following error:

/usr/bin/ld: cannot find -lstdc++

Shit got real!!!

---------------------------------------------------------------------------------------------------------------------

Don't Panic!!!

To fix this, you need to find your mexopts.sh file and change the line

 CLIBS="$CLIBS -lstdc++"

 TO (for 64-bit)
 CLIBS="$CLIBS -L/usr/local/MATLAB/R2012a/sys/os/glnxa64 -lstdc++"

 OR (for 32-bit)
 CLIBS="$CLIBS -L/usr/local/MATLAB/R2012a/sys/os/glnx86 -lstdc++"

Check that the path actually exists on your system. Obviously, you'll need to change /usr/local/MATLAB/ to wherever you actually installed MATLAB.

Your next step is to run the following in a bash prompt (again substituting wherever you installed MATLAB for /usr/local/MATLAB/):

ln -s /usr/local/MATLAB/R2012a/sys/os/glnxa64/libstdc++.so.6 /usr/local/MATLAB/R2012a/sys/os/glnxa64/libstdc++.so

(Check that the path exists; on 32-bit systems it may be /usr/local/MATLAB/R2012a/sys/os/glnx86 instead.)

 
5. Now change MATLAB's current folder to the folder where you saved hello.c, and type this in the command window:

>>mex hello.c
It compiles. :)
6. The file "hello.mexglx" or "hello.mexa64", depending on your OS (32-bit / 64-bit), will show up in the same directory.
7. Now run it from MATLAB.

(for SAMPLE1)
>> hello

(for SAMPLE2)
>> x = [1 2; 3 4];
>> hello(x);
 

References: 
link1
link2
link3


I also came to know a few things about MATLAB (in the next post.. :) )

Thursday, August 16, 2012

Unsupervised Feature Learning and Deep Learning Resources

Unsupervised feature learning and deep learning have been fascinating to me recently, and here are some interesting links and tutorials.

First is Andrew Ng's UFLDL tutorial, which has explanations and intuition. It also comes with starter code in which we only need to complete the objective-function code.

http://ufldl.stanford.edu/wiki/index.php/UFLDL_Tutorial  by Andrew Ng
 

Reducing the Dimensionality of Data with Neural Networks(pdf)

Self-taught learning(pdf)

Some interesting reading about applications of deep learning at Google:
link1
link2
link3


Google Research ICML paper(pdf)


Transforming Autoencoder(pdf)

Extracting and Composing Robust Features with Denoising Autoencoders(pdf)
(pdf   8 pages)

Greedy Layer-Wise Training of Deep Networks(pdf)

Saturday, July 21, 2012

ML Quick start guide

I was watching Andrew Ng's ML lectures and was halfway through.
Then I was surprised when I came across a link listing some prerequisites suggested by someone, which I had never thought of as prerequisites for basic ML.
All were good links (the ones on the prerequisite list were the ones I already knew about), which I wanted to complete after Ng's lectures.
I would like to post them here for people who want to devote time to ML, and for my own reference.

I myself have not completed the prerequisites, but I am sure that all of them are good (everyone suggests the same).

Quick start guide (video links)

People generally ignore this, but knowing CS algorithms is also important (Cormen/Skiena/Kleinberg).

The following knowledge is a prerequisite for making any sense out of machine learning:

    Linear Algebra by Gilbert Strang: http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/
    Convex Optimization by Boyd http://see.stanford.edu/see/courseinfo.aspx?coll=2db7ced4-39d1-4fdb-90e8-364129597c87
    Probability and statistics for ML: http://videolectures.net/bootcamp07_keller_bss/
    Some mathematical tools for ML: http://videolectures.net/mlss03_burges_smtml/ (video and audio quality is very bad)
    Probability primer (measure theory and probability theory) : http://www.youtube.com/playlist?list=PL17567A1A3F5DB5E4&feature=plcp

Once the prerequisites are complete, the following are good series of lectures on machine learning.
Basic ML:

    Andrew Ng’s Video Lectures(CS229) : http://see.stanford.edu/see/courseinfo.aspx?coll=348ca38a-3a6d-4052-937d-cb017338d7b1
    Andrew Ng’s online course offering: http://www.ml-class.org
    Tom Mitchell’s video lectures(10-701) : http://www.cs.cmu.edu/~tom/10701_sp11/lectures.shtml
    Mathematicalmonk’s videos: http://www.youtube.com/playlist?list=PLD0F06AA0D2E8FFBA&feature=plpp
    Learning from data by CalTech:     http://work.caltech.edu/telecourse.html


Advanced ML:
    Probabilistic graphical models by Daphne Koller(Stanford)
    http://www.pgm-class.org/

   
    SVMs and kernel methods, Schölkopf: http://videolectures.net/mlss03_scholkopf_lk/
    Basics of Support Vector Machines and related kernel methods. (Video and audio quality is very bad.)
    Kernel methods and Support Vector Machines, Smola: http://videolectures.net/mlss08au_smola_ksvm/
    Introduction to the main ideas of statistical learning theory, Support Vector Machines, kernel feature spaces, and an overview of the applications of kernel methods.
    Easily one of the best talks on SVMs, almost like a run-down tutorial: http://videolectures.net/mlss06tw_lin_svm/
    Introduction to Learning Theory, Olivier Bousquet.  http://videolectures.net/mlss06au_bousquet_ilt/
    This tutorial focuses on the "larger picture" rather than on mathematical proofs; it is not restricted to statistical learning theory, however. 5 lectures.
    Statistical Learning Theory, Olivier Bousquet, http://videolectures.net/mlss07_bousquet_slt/
    This course gives a detailed introduction to Learning Theory with a focus on the Classification problem.
    Statistical Learning Theory, John Shawe-Taylor, University of London. 7 lectures. http://videolectures.net/mlss04_taylor_slt/
    Advanced Statistical Learning Theory, Olivier Bousquet. 3 lectures. http://videolectures.net/mlss04_bousquet_aslt/

Most of the above links have been filtered from http://onionesquereality.wordpress.com/2008/08/31/demystifying-support-vector-machines-for-beginners/
 Important Links:

    Channel for the probability primer and machine learning: http://www.youtube.com/user/mathematicalmonk#grid/user/D0F06AA0D2E8FFBA [VIDEO]
    A comprehensive blog comprising the best resources for ML: http://onionesquereality.wordpress.com/2008/08/31/demystifying-support-vector-machines-for-beginners/ [links]
    Another great resource list for ML: http://www.quora.com/Machine-Learning/What-are-some-good-resources-for-learning-about-machine-learning-Why [links]
    Lectures 21-28 by Gilbert Strang, linear algebra way of optimization.  http://academicearth.org/courses/mathematical-methods-for-engineers-ii

Monday, May 14, 2012

Using multiple functions in a single file (globally)

Declaration: 
function funs = makefuns
  funs.fun1=@fun1;
  funs.fun2=@fun2;
end

function y=fun1(x)
  ...
end

function z=fun2
  ...
end
 
 
 
Calls: 
myfuns = makefuns;
myfuns.fun1(x)    
myfuns.fun2() 
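For example, a minimal concrete version of this pattern (the function names here are just illustrative):

% in makefuns.m
function funs = makefuns
  funs.square = @square;
  funs.cube   = @cube;
end

function y = square(x)
  y = x.^2;
end

function y = cube(x)
  y = x.^3;
end

% elsewhere:
myfuns = makefuns;
myfuns.square(3)   % returns 9
myfuns.cube(2)     % returns 8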

see also for object oriented programming in matlab: 
http://yagtom.googlecode.com/svn/trunk/html/objectOriented.html

Machine Learning Resources for Beginners and Beginners++

I have been learning ML for some time now, and I have spent some of that time finding good resources for ML.

For beginners, I would say: if you look beyond the Beginners section, you might be overwhelmed by the amount of content. Just relax and take it bit by bit. This is not a race.


My most visited page used to be a randomly copied resource list on machine learning. Now that I have some idea of what all the links are about, I will try to reorganize this page over time (and possibly subcategorize it).
Pre-requisites
Machine Learning Lectures:
Beginners
  1.  Machine Learning Course by Andrew Ng at Coursera
    Thinking about what the first course in ML should be, the only one that comes to mind is the Machine Learning course by Andrew Ng at Coursera. Practically no prerequisites (maybe calculus). I highly recommend doing the programming exercises; they keep your interest going. Not too mathematical. There is also a very short tutorial on Octave/MATLAB (just enough for doing the homework).
  2.  Linear Algebra Course by Gilbert Strang
    (I believe it is the best on the internet.)
    We also need to understand spaces, and this is a good place to get intuition about vector spaces. The video lectures accompany a textbook. I would also like to suggest the lecture notes by Vandenberghe (pdf), which are great for implementing various matrix factorizations as recursive solutions.
  3. Learning from Data by Abu Mostafa at Caltech
    This course is more detailed than ML by Andrew Ng at Coursera, with very clear explanations of the content. The video lectures accompany a textbook.
  4.  Linear and Integer Programming by Sriram Sankaranarayanan at Coursera/UoColorado
    To start getting a flavour of optimization, this is an excellent place to begin. It also shows the advantage of abstraction in problem solving.
Beginners++
  1.  Machine Learning by Andrew Ng at Stanford SEE
    This is a longer version of the classroom course taught by Andrew Ng at Stanford. It covers a lot of topics not covered in the courses above, and it is accompanied by an excellent set of lecture notes (highly recommended).
    Refer to the lecture notes, because the lectures are sometimes not clearly explained; it is still good for introductory ML.
  2.  Probabilistic Graphical Models by Daphne Koller at Coursera
    http://openclassroom.stanford.edu/MainFolder/CoursePage.php?course=ProbabilisticGraphicalModels
    This course is like a marriage of probability and graph theory, which makes up a significant chunk of machine learning. It covers efficient inference methods and how graphs help us. The programming assignments are not as easy as those in the previous courses. This material is widely used in NLP and computer vision.
  3. Machine learning  Lectures by Mathematical Monk
    If you are wondering who he is, visit http://www.dam.brown.edu/people/jmiller/teaching.html
    This course covers many methods similar to Andrew Ng's Machine Learning course, but the coverage is wider (and more probabilistic).
  4. Introduction to ML This and This by Alex Smola 10-701 at CMU
    This is a foundation course for PhD students.

  5. Scalable Machine Learning by Alex Smola
    http://alex.smola.org/teaching/berkeley2012/
    This course deals with scalability issues and jargon in machine learning. It is also good for statistics, graphical models, recommender systems, and kernels. It should be complemented with a practical project that considers scalability issues.
  6. Introduction to Recommender Systems
    Not a very difficult course, but the whole course is on recommender systems.
  7. Machine Learning By Pedro Domingos at Coursera/UoW
    This course also covers a lot of topics, well explained. It can be done independently (worth considering at least for the topics not covered in the previous courses).
  8. Mining Massive Data Sets by Jure Leskovec at Coursera/Stanford
    Deals with scalability issues of ML/Datamining.
  9.  Neural Networks for Machine Learning by Geoffrey Hinton
    A great introductory course on neural networks, including all the recent advances in NNs. Toward the end of the course it might get difficult to understand; you might need to complement it with an ML text on inference (including approximate inference).
  10. Deep Learning by Yann LeCun
    A deep learning course dealing with practicalities, such as doing the work on GPUs.
  11.  Neural Networks by Hugo Larochelle
    This is a fast paced course in Neural Networks. Great if you have some background of bits and pieces of NN/Inference/Graphical Models.
  12.  Deep Learning by Nando de Freitas, Oxford
    https://www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/
  13.  Wow course on reinforcement learning by David Silver: https://www.youtube.com/playlist?list=PL5X3mDkKaJrL42i_jhE4N-p6E2Ol62Ofa
  14.  Wow course on approximate dynamic programming (AKA reinforcement learning) by Dimitri P. Bertsekas: https://www.youtube.com/playlist?list=PLiCLbsFQNFAxOmVeqPhI5er1LGf2-L9I4
    http://web.mit.edu/dimitrib/www/DP_Slides_2015.pdf
    http://web.mit.edu/dimitrib/www/Abstract_DP_Online_Complete.pdf (Beauty)
    http://arxiv.org/abs/1405.6757 Proximal RL
  15. Advanced Inference  / Submodular functions course by Jeff Bilmes
    https://www.youtube.com/channel/UCvPnLF7oUh4p-m575fZcUxg/videos
  1. Big Data, Large Scale Machine Learning by Langford and LeCun
    Again a practical course for dealing with scalability, online learning, etc.
  2. Convex Optimization I  by Stephen Boyd at Stanford SEE

     CVX101 Convex Optimization by Stephen Boyd

    A great course on convex optimization. I would not say this is an easy course, but it is totally worth the effort. A freely available textbook accompanies the course.
  3. Convex Optimization II  by Stephen Boyd
    An optimization course covering other useful topics not covered in the previous course.
  4. CS281: Advanced Machine Learning by Ryan Adams at Harvard
    This is a really advanced course on machine learning, covering practically everything a general ML course can have (except truly specialized topics).
  5. Advanced Optimization and Randomized Methods by AlexSmola/SuvritSra at CMU
    https://www.youtube.com/playlist?list=PLjTcdlvIS6cjdA8WVXNIk56X_SjICxt0d
    http://people.kyb.tuebingen.mpg.de/suvrit/teach/ee227a/lectures.html
    Advanced optimization covering many new topics missing from basic optimization courses. Many of these topics are useful in large-scale optimization.
  6. Introduction to robotics
  7. Introduction to linear dynamical systems by Stephen Boyd

  8. Probabilistic Graphical Models - Advanced Methods (Spring 2012) by Murphy at Stanford

  9.  For inference and information theory (MacKay) [Lectures 9-14 recommended]:
    This is a course on information theory and inference. After the first half, it practically turns into an ML course, mostly on inference and neural networks.
  10. Machine Learning by Nando de Freitas at UBC
  11. Game Theory Part I and Part II at coursera
    Helps build decision-theoretic intuitions and concepts. Also one of the branches of AI.
  12. Harvard Data Science
    http://cm.dce.harvard.edu/2014/01/14328/publicationListing.shtml 
  13. Probabilistic Graphical Models 10-708, Spring 2014  Eric Xing (link)
  14. Multiple View Geometry by Prof. D. Cremers
  15.  Variational  Methods for Computer Vision by Prof.  D. Cremers
  16. Visual Navigation for Flying Robots TUM
  17. Machine Learning for Computer Vision TUM
  18. Optimization by Geoff Gordon
  19. Machine Learning by Aarti Singh and Geoff Gordon
  20. A collection of links for streaming algorithms and data structures  
  21. 6.895 Sketching, Streaming and Sub-linear Space Algorithms
  22. Statistical Learning Theory Poggio (todo)
  23. Statistical Machine Learning Larry Wasserman (todo)
  24.  Regularization Methods for Machine Learning 2016
ML Books, Notes, Links:
Shelf Books (pic from Stephen Gould):


ML book list: (link to Michael Jordan's recommendations)



Standard ML Text
Pattern Recognition and Machine Learning Christopher M. Bishop
The Elements of Statistical Learning: Data Mining, Inference, and Prediction Trevor Hastie
Machine Learning: A Probabilistic Perspective Kevin P. Murphy
Probabilistic Graphical Models: Principles and Techniques Daphne Koller
A Probabilistic Theory of Pattern Recognition (Stochastic Modelling and Applied Probability) Luc Devroye, László Györfi, Gábor Lugosi


Probability Theory
Probability and Random Processes Geoffrey R. Grimmett
Probability Theory: The Logic of Science E.T. Jaynes
Probability: Theory and Examples Richard Durrett
A User's Guide to Measure Theoretic Probability David  Pollard


Statistics
All of Statistics: A Concise Course in Statistical Inference Larry Wasserman
All of Nonparametric Statistics Larry Wasserman
Statistical Inference George Casella, Roger L. Berger


Bayesian Theory and Practice
Bayesian Core: A Practical Approach to Computational Bayesian Statistics Jean-Michel Marin
The Bayesian Choice Christian P. Robert
Bayesian Data Analysis Andrew Gelman


Large Sample Theory and Asymptotic Statistics
A Course in Large Sample Theory Thomas S. Ferguson
Elements of Large-Sample Theory E.L. Lehmann
Asymptotic Statistics A.W. van der Vaart




Monte Carlo Statistical Methods Christian P. Robert
Introduction to Nonparametric Estimation Alexandre B. Tsybakov
Large-Scale Inference Bradley Efron


Optimization
Linear Algebra and Its Applications Gilbert Strang
Matrix Computations Gene H. Golub
Introduction to Linear Optimization Dimitris Bertsimas
Numerical Optimization Jorge Nocedal
Introductory Lectures on Convex Optimization: A Basic Course Y. Nesterov
Convex Optimization Stephen Boyd
Nonlinear Programming Dimitri P. Bertsekas


Information Theory
Elements of Information Theory Thomas M. Cover
Information Theory, Inference and Learning Algorithms David J.C. MacKay


Analysis
Introductory Functional Analysis with Applications Erwin Kreyszig

Don't look DOWN!!! I will edit and organize the stuff below this soon.
________________________________________________________________________________


Road map to EP (Minka)
http://www.convexoptimization.com/dattorro/convex_optimization.html
http://www.convexoptimization.com/wikimization/index.php/

EE464: Semidefinite Optimization and Algebraic Techniques
Convex Analysis lecture notes by Nemirovski
http://www2.isye.gatech.edu/~nemirovs/
http://www2.isye.gatech.edu/~nemirovs/OptIII_TR.pdf 
http://www2.isye.gatech.edu/~nemirovs/OPTIII_LectureNotes.pdf
http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-253-convex-analysis-and-optimization-spring-2012/lecture-notes/MIT6_253S12_lec_comp.pdf  (DIMITRI P. BERTSEKAS)

Probabilistic Models for Cognition by Noah Goodman and Joshua Tenenbaum

Compressed sensing

http://www.sms.cam.ac.uk/collection/1117766/#!

http://www.brainshark.com/brainshark/brainshark.net/portal/title.aspx?pid=zCdz10BfTRz0z0#!

More ML books: 
http://www.reddit.com/r/MachineLearning/comments/1jeawf/machine_learning_books/

A lot of lectures related to AI by Coursera
https://www.coursera.org/category/cs-ai


A cool intro to machine learning with python examples
Programming Collective Intelligence: Building Smart Web 2.0 Applications by Toby Segaran

If you want to read a book on ML, then read:
  • The Elements of Statistical Learning (good book, freely downloadable)
http://www.stanford.edu/~hastie/local.ftp/Springer/ESLII_print5.pdf

  • Pattern Recognition and Machine Learning by Christopher Bishop
  • Machine Learning by Tom Mitchell 
  • Machine Learning: A Probabilistic Perspective by Kevin Murphy (my choice)
Some more resources on the ML class resources page by Andrew Ng:
https://share.coursera.org/wiki/index.php/ML:Useful_Resources
For optimization, read:
  • Convex Optimization by Stephen Boyd (good book, freely downloadable)
http://www.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf
  • Numerical Optimization by Nocedal and Wright
  • Nonlinear Programming by Dimitri P. Bertsekas

For scalability of machine learning, read:

  • Scaling Up Machine Learning: Parallel and Distributed Approaches 

by Ron Bekkerman, Mikhail Bilenko, John Langford

For Graphical Models

  • Probabilistic Graphical Models: Principles and Techniques by Daphne Koller 

More books:

http://www.reddit.com/r/MachineLearning/comments/1jeawf/machine_learning_books/

http://pindancing.blogspot.in/2010/01/learning-about-machine-learniing.html

A very strong ML community:  

http://metaoptimize.com/

http://metaoptimize.com/qa

Andrew Moore's slides http://www.autonlab.org/tutorials/

http://videolectures.net/mlss04_bishop_gmvm/     (graphical models and variational methods, Bishop)

http://videolectures.net/mlss06tw_wainwright_gmvmm/   (graphical models, variational methods, message passing)

http://www.cs.jhu.edu/~jason/tutorials/variational.html (high-level explanation)

MATLAB tutorials, basic and advanced:

http://code.google.com/p/yagtom/

http://lukstafi.blogspot.de/2013/10/artificial-intelligence-university.html

__________________________________________________________________________________

Don't look down.. :P ML and stuff

ML and random stuff... :D

Random Bayesian books

Doing Bayesian Data Analysis (Kruschke)
Bayesian Data Analysis (Gelman et al.)
Applied Bayesian Hierarchical Methods (Congdon)
Statistical Rethinking: A Bayesian Course (McElreath)

Bayesian Analysis Made Simple (Woodward)
The BUGS Book (Lunn et al.)
Bayesian Methods (Gill)
Bayesian Ideas and Data Analysis (Christensen et al.)
Bayesian Statistics and Marketing (Rossi et al.)
Introduction to Bayesian Econometrics (Greenberg)
Bayesian Forecasting and Dynamic Models (West and Harrison)
Bayesian Psychometric Modeling (Levy and Mislevy)
Bayesian Models (Hilbe et al.)
Large-Scale Inference: Empirical Bayes Methods (Efron)
Handbook of Markov Chain Monte Carlo (Brooks et al.)
Fundamentals of Nonparametric Bayesian Inference (Ghosal and van der Vaart)

 

Compressed Sensing

https://sites.google.com/site/igorcarron2/cs
http://dsp.rice.edu/cs
http://nuit-blanche.blogspot.fr/ 

UFLDL
http://web.eecs.umich.edu/~honglak/teaching/eecs598/schedule.html
http://www.cs.toronto.edu/~hinton/deeprefs.html
http://deeplearning.stanford.edu/wiki/index.php/Main_Page

Tom Minka's page 

http://alumni.media.mit.edu/~tpminka/

http://www.stats.ox.ac.uk/~teh/teaching/npbayes.html#modernbnp

http://mlg.eng.cam.ac.uk/zoubin/course05/index.html

http://mlg.eng.cam.ac.uk/teaching/4f13/1213/  (Machine Learning 2013 cambridge / Murphy/PRML textbook)

Gatsby Machine Learning Qualifying Exam Topic List 

http://www.cs.princeton.edu/~blei/courses.html 

http://mlg.eng.cam.ac.uk/zoubin/p8-07/index.html (Image search and modelling)

Advanced Topics in Machine Learning ( subspace learning, manifold learning, subspace clustering, manifold clustering)

http://www.vision.jhu.edu/teaching/learning10/

Short Python writeup.

http://alumni.media.mit.edu/~tpminka/PLE/python/python.html

Hadoop in python  

http://www.michael-noll.com/tutorials/writing-an-hadoop-mapreduce-program-in-python/

http://blog.cloudera.com/blog/2013/01/a-guide-to-python-frameworks-for-hadoop/ 

graphics:

http://inst.eecs.berkeley.edu/~cs184/fa12/onlinelectures1.html

 http://www.youtube.com/user/raviramamoorthi/videos?view=1&flow=grid

https://graphics.stanford.edu/wikis/cs348b-11/Lectures#Goals 

Large scale ML and data mining: 

http://www.stanford.edu/class/cs224w/

http://www.stanford.edu/class/cs246/handouts.html

http://www.stanford.edu/group/mmds/

http://www.cs.cornell.edu/Courses/cs6784/2010sp/ 

 NETLOGO tutorial 

http://i-programmer.info/programming/other-languages/5613-getting-started-with-netlogo.html 

Sunday, March 4, 2012

Difference between Self-taught learning and semi-supervised learning settings

There are two common types of unsupervised learning settings:
  • Self-taught learning
  • Semi-supervised learning
   The self-taught learning setting is more versatile and broadly applicable, and it does not assume that your unlabeled data is drawn from the same distribution as your labeled data. In the self-taught learning setting, it is not necessary that most of the unlabeled data belongs to one of the classes; an appreciable amount of the data may not belong to any class.

   The semi-supervised learning setting assumes that the unlabeled data comes from exactly the same distribution as the labeled data, and that most of the unlabeled data belongs to one of the classes.

Both settings are most powerful in problems where we have a lot of unlabeled data and a smaller amount of labeled data.

Saturday, March 3, 2012

closed-form expression for nth term of the Fibonacci sequence

Generally we all know what the Fibonacci series is. I first saw it in my computer science class, where I used the recurrence relation to find the nth term of the Fibonacci series when I was first taught recursive functions. Today my friend told me that he solved the recurrence using the power series method to get the golden ratios and the closed-form relation. So I also wanted to write about it and derive the same thing with the z-transform, which I learned in my DSP class, one of the few classes whose books I actually read. So here we are; let's get started.

So let's see the Fibonacci series first...

1,1,2,3,5,8,13,.....

If y(n) is the nth term of the Fibonacci sequence, then it clearly satisfies the recurrence relation below:
y(n) = y(n-1) + y(n-2)            --------------(1)

We also have two initial conditions:
1) y(0) = y(-1) + y(-2) = 1
2) y(1) = y(0)  + y(-1) = 1

From these conditions we see that y(-1) = 0 and y(-2) = 1. Work this out and make sure you get it.

 To take the z-transform of eq. (1), we need to know about the one-sided or unilateral z-transform. It is not very different from the two-sided z-transform; the two-sided z-transform needs the signal to be specified over the range (-∞, ∞), whereas, since the input is applied at a finite time, say n0, we only need the signal to be specified over the range [n0, ∞).

The unilateral or one-sided z-transform may be defined as:

Y(z) = Σ_{n=0 to ∞} y(n) z^-n

The time-shifting property is also a bit different, just slightly modified, and can be proved intuitively (which I am not going to do here).

TIME DELAY PROPERTY (for a delay of k > 0 samples):

UZ( y(n-k) ) = z^-k [ Y(z) + Σ_{m=1 to k} y(-m) z^m ]
Taking the z-transform of eq. (1) by applying the time delay property, we get:

Y(z) = [z^-1 Y(z) + y(-1)]  + [z^-2 Y(z) + y(-2) + y(-1) z^-1]
or
Y(z) = 1/(1 - z^-1 - z^-2)
or
Y(z) = z^2/(z^2 - z - 1)
or
Y(z) = z^2/((z - p1)(z - p2))        where p1 = (1 + √5)/2     and     p2 = (1 - √5)/2
or
Y(z) = 1/((1 - p1 z^-1)(1 - p2 z^-1))

Using partial fractions, we get:

Y(z) = A1/(1 - p1 z^-1)  +  A2/(1 - p2 z^-1)
where A1 = 1/(1 - p2 p1^-1) = p1/(p1 - p2) = p1/√5
and     A2 = 1/(1 - p1 p2^-1) = p2/(p2 - p1) = -p2/√5

Now we know (or it can easily be verified) that:
UZ(a^n u(n)) = 1/(1 - a z^-1),  where UZ(.) is the unilateral z-transform.

So we finally get:

y(n) = [A1 (p1)^n + A2 (p2)^n] u(n)

Substituting A1 and A2, we get:

y(n) = (1/√5)[(p1)^(n+1) - (p2)^(n+1)] u(n)    ---------------(2)

where p1 = (1 + √5)/2 and p2 = (1 - √5)/2 (p2 is the conjugate of p1).

Equation (2) above is the formula that gives the nth term of the Fibonacci series.
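A quick MATLAB sanity check of equation (2) against the recurrence (just a sketch to convince yourself; the variable names are arbitrary):

% closed form vs. the recurrence y(n) = y(n-1) + y(n-2), with y(0) = y(1) = 1
p1 = (1 + sqrt(5))/2;
p2 = (1 - sqrt(5))/2;

n = 0:20;
closed_form = (1/sqrt(5)) * (p1.^(n+1) - p2.^(n+1));

recurrence = ones(1, numel(n));   % recurrence(k) holds y(k-1)
for k = 3:numel(n)
    recurrence(k) = recurrence(k-1) + recurrence(k-2);
end

max(abs(closed_form - recurrence))   % tiny (floating-point roundoff), i.e. they agree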