Biometrics and Machine Learning Group
Latest news
We are pleased to announce that Mateusz Trokielewicz defended (with honors) his doctoral dissertation entitled "Iris Recognition Methods Resistant to Biological Changes in the Eye", supervised by Prof. Czajka and Prof. Pacut, on the 18th of July, 2019.
Iris scanner can distinguish dead eyeballs from living ones: MIT Technology Review reports on our recent developments in the field of presentation attack detection for cadaver irises.
We are pleased to announce that Mateusz Trokielewicz received the EAB European Biometrics Research Award 2016 for research on iris recognition reliability, including template aging, the influence of eye diseases, and post-mortem recognition.
Is That Eyeball Dead or Alive? Adam Czajka discusses how to prevent iris sensors from accepting a high-resolution photo of an iris or, in a grislier scenario, an actual eyeball. For the full article, please see IEEE Spectrum.
Introduction to Neural Networks
Materials available here were prepared for students of the University of Notre Dame attending the course in Fall 2016. If you find these notes helpful in your work, please cite them as follows:
"Adam Czajka, Introduction to Neural Networks (CSE 40868/60868), Lecture Notes, Fall 2016, available online http://zbum.ia.pw.edu.pl/EN/node/60"
Progress
Wed. 12/07/2016:
Boltzmann Machines (slides)
Stochastic neurons, architecture of the Boltzmann Machine. Data generation, causal vs non-causal models, thermal equilibrium. Sampling the model. Boltzmann Machine learning, positive and negative phases. Restricted Boltzmann Machine.
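To make the sampling-based learning concrete, below is a minimal numpy sketch of a Restricted Boltzmann Machine updated with one step of contrastive divergence (CD-1). The layer sizes, learning rate, and toy data are illustrative assumptions, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative sizes: 6 binary visible units, 3 binary hidden units.
n_vis, n_hid = 6, 3
W = 0.1 * rng.standard_normal((n_vis, n_hid))
b_vis, b_hid = np.zeros(n_vis), np.zeros(n_hid)
lr = 0.1

def cd1_weight_update(v0):
    """One CD-1 step: positive phase on the data, negative phase on a reconstruction."""
    p_h0 = sigmoid(v0 @ W + b_hid)                    # positive phase
    h0 = (rng.random(n_hid) < p_h0).astype(float)     # sample hidden states
    p_v1 = sigmoid(h0 @ W.T + b_vis)                  # reconstruct visible units
    v1 = (rng.random(n_vis) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_hid)                    # negative phase
    # Learning signal: difference of data-driven and model-driven correlations.
    return lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))

v = rng.integers(0, 2, n_vis).astype(float)           # a toy binary training vector
W += cd1_weight_update(v)
```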
Mon. 12/5/2016:
Guest lecture, Dr. Walter Scheirer: Biological Neural Networks as a Gateway to Better Machine Learning (slides)
Fri. 11/11/2016 --> Fri. 12/02/2016:
Recurrent Neural Networks (RNN) (slides)
Computational graphs, unfolding the RNN. Design patterns, types of recurrences, deep RNN, recurrent Convolutional Neural Networks, bidirectional RNN. Training the RNN, back-propagation through time (BPTT), teacher-forcing training, gradient clipping. Long Short-Term Memory (LSTM). Associative memory, Hopfield nets, energy function, setting the network state, updating the weights, spurious minima, unlearning.
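For the associative-memory part of this topic, here is a minimal numpy sketch of a Hopfield network: Hebbian weight storage, the energy function, and asynchronous state updates. The stored pattern and the number of update steps are illustrative assumptions.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian storage: sum of outer products of the +/-1 patterns, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy: E(s) = -1/2 s^T W s (non-increasing under updates)."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=50, seed=0):
    """Asynchronous updates: re-threshold one randomly chosen unit at a time."""
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Store one toy pattern, then recall it from a corrupted cue.
p = np.array([1., -1., 1., -1., 1., -1.])
W = train_hopfield(p[None, :])
cue = p.copy()
cue[0] = -cue[0]                                  # flip one bit
print(recall(W, cue), energy(W, p))
```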
Fri. 11/4/2016 --> Wed. 11/9/2016:
Radial-Basis-Function (RBF) Networks (slides)
Structure of the RBF for classification and approximation problems. Cover's theorem. Training of the RBF, variants of multiple-phase training.
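Below is a minimal numpy sketch of two-phase RBF training on a toy 1-D approximation problem: centers are chosen in a first phase (here simply a random subset of the training data) and the linear output layer is solved by least squares in a second phase. The basis width sigma and the number of centers are illustrative assumptions.

```python
import numpy as np

def rbf_design(X, centers, sigma):
    """Gaussian basis activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0])                                   # toy 1-D target function

# Phase 1: choose the centers (here: a random subset of the training data).
centers = X[rng.choice(len(X), size=10, replace=False)]
# Phase 2: solve the linear output layer in closed form by least squares.
Phi = rbf_design(X, centers, sigma=0.7)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

y_hat = Phi @ w
print("training MSE:", np.mean((y - y_hat) ** 2))
```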
Wed. 10/12/2016 --> Wed. 11/2/2016:
Convolutional Neural Networks (slides)
CNN as a special case of the MLP. Convolution operation, convolution vs cross-correlation, implementation in the CNN. Components of the CNN, convolutional layers, pooling and subsampling, fully-connected layers. Arrangement of neurons in 3D volumes, layer patterns. Visualization of the CNN, learned kernels, feature map outputs. Transfer learning.
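The convolution vs cross-correlation distinction can be shown in a few lines of numpy. The sketch below is illustrative only, not an efficient CNN-layer implementation; the image and kernel values are toy assumptions.

```python
import numpy as np

def cross_correlate2d(img, k):
    """'Valid' cross-correlation: slide the kernel over the image without flipping it."""
    kh, kw = k.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * k).sum()
    return out

def convolve2d(img, k):
    """True convolution = cross-correlation with the kernel flipped in both axes."""
    return cross_correlate2d(img, k[::-1, ::-1])

img = np.arange(25, dtype=float).reshape(5, 5)
k = np.array([[1., 0.], [0., -1.]])
# The two results differ unless the kernel is symmetric; CNN "convolutional"
# layers usually implement cross-correlation, since learning absorbs the flip.
print(cross_correlate2d(img, k))
print(convolve2d(img, k))
```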
Mon. 9/26/2016 --> Mon. 10/10/2016 (except for Fri. 10/7/2016):
Multi-layer perceptron and weight adaptation techniques (slides)
Layered structures, non-linear input-output transformation. Unconstrained optimization, cost function, minimization of the cost function. First-order algorithms, steepest descent. Batch, on-line, and mini-batch learning, stochastic gradient descent. Momentum, Nesterov Accelerated Gradient, Adagrad, Adadelta, Adam. Second-order methods, Newton's and Levenberg-Marquardt algorithms, quasi-Newton methods, DFP and BFGS. Practical heuristics, weight and bias initialization, avoiding neuron saturation, normalization of input data. Function approximation, MLP as a function approximator.
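A minimal numpy sketch contrasting two of the first-order update rules from the slides, classical momentum and Adam, on a toy quadratic cost. The learning rates and iteration count are illustrative assumptions.

```python
import numpy as np

def momentum_step(w, g, vel, lr=0.01, mu=0.9):
    """Classical momentum: accumulate a velocity, then move along it."""
    vel = mu * vel - lr * g
    return w + vel, vel

def adam_step(w, g, m, v, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first/second moment estimates scale the step."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat, v_hat = m / (1 - b1 ** t), v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize the toy cost f(w) = ||w||^2, whose gradient is 2w.
w1 = np.array([3.0, -2.0]); vel = np.zeros(2)
w2 = w1.copy(); m = np.zeros(2); v = np.zeros(2)
for t in range(1, 201):
    w1, vel = momentum_step(w1, 2 * w1, vel)
    w2, m, v = adam_step(w2, 2 * w2, m, v, t)
print("momentum:", w1, "adam:", w2)        # both approach the minimum at 0
```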
Mon. 9/19/2016 --> Fri. 9/23/2016:
Multi-class classification (slides)
Linear machine, one-vs-all ("winner takes all") approach, learning algorithm. Machine with two outputs, equivalence to Rosenblatt's perceptron. One-vs-one approach. Softmax classification. Cross-validation.
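A minimal numpy sketch of softmax classification: numerically stable class probabilities and the cross-entropy loss, with "winner takes all" decisions taken as the argmax. The scores and labels are toy assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: subtract the row maximum before exponentiating."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, y):
    """Mean negative log-likelihood of the true integer labels y."""
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

scores = np.array([[2.0, 1.0, 0.1],       # linear-machine outputs, 2 samples x 3 classes
                   [0.5, 2.5, 0.2]])
y = np.array([0, 1])
p = softmax(scores)
print(p.argmax(axis=1), cross_entropy(p, y))   # "winner takes all" = argmax
```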
Wed. 9/14/2016 and Fri. 9/16/2016:
Margin classifiers (slides)
Logistic regression, binary classification as an instance of regression. Support Vector Machine for linear classification. Dealing with non-separable data, slack variables, kernel trick.
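A minimal numpy sketch of a linear SVM trained in the primal with a hinge-loss subgradient rule; slack enters implicitly through the hinge term, and the kernel trick is not shown. The regularization constant, learning rate, and toy data are illustrative assumptions.

```python
import numpy as np

def svm_sgd(X, y, lam=0.01, lr=0.1, epochs=100, seed=0):
    """Primal linear SVM: subgradient descent on lam/2 ||w||^2 + mean hinge loss.
    Labels y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            if y[i] * (X[i] @ w + b) < 1:      # inside the margin: hinge is active
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:                              # correct side: only the regularizer acts
                w -= lr * lam * w
    return w, b

# Toy separable data: class = sign(x1 + x2).
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 2))
y = np.sign(X[:, 0] + X[:, 1]); y[y == 0] = 1.0
w, b = svm_sgd(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))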
Fri. 9/9/2016 and Mon. 9/12/2016:
Rosenblatt's perceptron
1. Slides and Matlab code: binary, linear and non-linear classification, oriented hyperplane; Rosenblatt's perceptron as a linear classifier; training of the perceptron, error-correction rule, modifications (a minimal Python sketch of the error-correction rule follows item 3 below).
2. Perceptron training: video (22:00 -> 36:20)
3. Capacity of a single neuron: video (full)
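A minimal numpy sketch of the perceptron error-correction rule on linearly separable toy data. The original course materials used Matlab; this Python version is an illustrative reimplementation, not the course code.

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=20):
    """Rosenblatt's error-correction rule: adjust weights only on misclassified samples."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):               # labels yi in {-1, +1}
            if yi * (xi @ w + b) <= 0:         # wrong side of the oriented hyperplane
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: class = sign(x1 - x2).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
y = np.sign(X[:, 0] - X[:, 1]); y[y == 0] = 1.0
w, b = train_perceptron(X, y)
print("misclassified:", int(np.sum(np.sign(X @ w + b) != y)))
```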
Wed. 9/7/2016:
Guest lecture, Prof. Anderson Rocha: Hand-crafted and data-driven solutions for sensitive media detection (slides available through our closed forum on piazza.com)
Fri. 8/26/2016 --> Mon. 9/5/2016:
Introduction (slides)
Biological inspirations in Computer Science. Definitions of a neural network, useful properties of neural networks. Milestones. The human nervous system in a nutshell. Basic model of a neuron, the McCulloch-Pitts model. Neuron equation, weights, bias, activation functions, layers. Static vs dynamic networks. Knowledge representation, building prior information and invariance into the network. Network learning: supervised, unsupervised, and reinforcement learning. Learning tasks: pattern association, pattern recognition, and function approximation.
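A minimal numpy sketch of the basic neuron equation y = f(w . x + b) and a McCulloch-Pitts threshold unit realizing an AND gate. The weights, bias, and threshold values are illustrative assumptions.

```python
import numpy as np

def neuron(x, w, b, activation=np.tanh):
    """Basic neuron equation: y = f(w . x + b)."""
    return activation(x @ w + b)

def mcculloch_pitts(x, w, theta):
    """McCulloch-Pitts unit: fires (outputs 1) when the weighted sum of
    binary inputs reaches the threshold theta, otherwise outputs 0."""
    return int(x @ w >= theta)

# A single neuron with a tanh activation on a toy input.
print(neuron(np.array([0.5, -1.0]), np.array([1.0, 2.0]), b=0.1))

# AND gate as a McCulloch-Pitts neuron: fires only when both inputs are on.
for a in (0, 1):
    for b_in in (0, 1):
        out = mcculloch_pitts(np.array([a, b_in]), np.array([1, 1]), theta=2)
        print(a, b_in, "->", out)
```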
Wed. 8/24/2016:
Course structure and syllabus (slides)