In this information era, most of our data is secured by computers through different security mechanisms such as passwords, encryption keys, fingerprints, faces, and iris data. Over the last three decades, face recognition has been a pervasive research problem in computer vision due to its wide applicability. Processing such high-dimensional data in real time is computationally expensive. One way to reduce this cost is to use hardware with more processing power. Although high-end CPUs can reduce computation time, GPUs can reduce it far more, because they are specialized for fast arithmetic operations and exploit the parallelism of their streaming multiprocessors.
However, such high-end hardware (CPUs and GPUs) can cost a fortune. We are therefore designing faster algorithms for face recognition that run on commodity hardware. We use Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) as training models. Training time grows linearly on the GPU, whereas on the CPU it grows quadratically. We also use incremental algorithms for PCA and LDA to learn from videos in an online fashion. By sacrificing some frames, we are able to preserve the frame rate and retain a recognition accuracy of around 90%.
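A minimal sketch of this pipeline, assuming scikit-learn and NumPy, is given below. It is not the project's GPU implementation: PCA, LinearDiscriminantAnalysis, and IncrementalPCA come from scikit-learn, the synthetic arrays stand in for aligned face images and video frames, and since scikit-learn has no incremental LDA, the online classification step is only indicated in a comment.

```python
# A rough sketch of PCA + LDA face-recognition training and an
# incremental, online variant; synthetic data replaces real face images.
import numpy as np
from sklearn.decomposition import PCA, IncrementalPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_subjects, imgs_per_subject, h, w = 5, 20, 32, 32

# ---- Batch training: eigenfaces (PCA) followed by fisherfaces (LDA) ----
X = rng.normal(size=(n_subjects * imgs_per_subject, h * w))  # flattened face images
y = np.repeat(np.arange(n_subjects), imgs_per_subject)       # subject labels

pca = PCA(n_components=40).fit(X)                            # leading eigenfaces
lda = LinearDiscriminantAnalysis().fit(pca.transform(X), y)  # separate subjects
print("training accuracy:", lda.score(pca.transform(X), y))

# ---- Online variant: update the subspace from a video stream in mini-batches,
# ---- skipping frames so processing keeps up with the incoming frame rate.
ipca = IncrementalPCA(n_components=40)
frame_skip = 3                                    # sacrifice 2 of every 3 frames
batch = []
for t in range(600):                              # simulated video stream
    if t % frame_skip:
        continue                                  # dropped frame
    face = rng.normal(size=h * w)                 # stand-in for a detected, aligned face
    batch.append(face)
    if len(batch) >= 50:                          # partial_fit needs >= n_components samples
        ipca.partial_fit(np.asarray(batch))       # update the eigenface subspace
        feats = ipca.transform(np.asarray(batch)) # features for an online classifier
        batch = []                                # (incremental LDA in the actual project)
```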
Members
Axat Chaudhary
Mayank Jobanputra
Saumil Shah
Keywords: Data Science, Theoretical Computer Science, Communications and Signal Processing