Announcements | Machine Learning for Signal Processing | Quick Links |
---|---|---|
Coming up in an OBE format! (*OBE is outcome-based education according to the Washington Accord.) 28 Dec. 2021: Classes start on Tue, 4 Jan. 2022. Open to UG3, UG4, PG1 and MS/PhD students only. Limited to 50 seats; apply through ERP by 3 Jan. 2022. Students are expected to be proficient in Python programming. Prerequisite: Digital Signal Processing. | EE60020, Spring 2022. Subject Type: Elective; LTP: 3-0-0; Credits: 3. Online (MS Teams). Time: Slot A, Mon (08:00 AM - 09:55 AM) + Tue (12:00 PM - 12:55 PM). Instructor: Dr. Debdoot Sheet. TA: Srinivas Maddimsetti. Grading: Quizzes 20%, Coding Assignments 20%, Mid-Term 25%, End-Term 35%. | Linear Algebra - Gilbert Strang; A Gentle Introduction to Programming Using Python - Sarina Canelake; Design and Analysis of Algorithms - Dana Moshkovitz and Bruce Tidor. Tools of the Trade: Anaconda Python 3.6, PyTorch, Getting started with PyTorch (see the install check below the table). |
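The Quick Links column points to getting-started material for the tools of the trade. As a quick sanity check of a local Anaconda + PyTorch setup, something like the following can confirm the environment works; this is only an illustrative sketch, not part of the course page, and the exact printed values depend on your installation.

```python
# Quick sanity check of a Python + PyTorch environment (illustrative only).
import sys
import torch

print("Python:", sys.version.split()[0])
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

# A tiny tensor computation to confirm the install works end to end.
x = torch.rand(3, 3)
print((x @ x.t()).shape)  # expected: torch.Size([3, 3])
```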
Why this subject? | |
---|---|
On successful completion of this subject, students would be able to apply concepts from Bayesian decision theory, information theory, linear discriminant analysis, neural networks and deep neural architectures, and generative models to analyse, segment, restore and filter signals, as well as to infer about the processes generating those signals. Students would also be able to explain these methods and their rational basis, and carry them through to practical, computerised implementations.
The subject will focus on demonstrating applications of machine learning to a range of signal processing examples. It will include tutorials with hands-on sessions on implementing machine learning algorithms in Python (a small illustrative sketch of such an exercise appears after the outcome list below). On completion, students are expected to understand the concepts of machine learning and to be able to develop signal processing solutions using them.

Text books:
[1]. S. Theodoridis and K. Koutroumbas, Pattern Recognition, 4th Edition, Elsevier-Academic Press, 2009.
[2]. I. Goodfellow, Y. Bengio, A. Courville, Deep Learning, MIT Press, 2016.
[3]. S. Haykin, Neural Networks and Learning Machines, 3rd Edition, Pearson, 2008.
[4]. T. M. Cover, J. A. Thomas, Elements of Information Theory, 2nd Edition, Wiley, 2006.

Reference books:
[R1]. T. M. Mitchell, Machine Learning, McGraw-Hill Education, 1997.
[R2]. C. M. Bishop, Pattern Recognition and Machine Learning, 2nd Edition, Springer, 2011.
[R3]. C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 1995.
[R4]. R. O. Duda, P. E. Hart, D. G. Stork, Pattern Classification, 2nd Edition, Wiley, 2001.
[R5]. D. Cohen-Or, C. Greif, T. Ju, N. J. Mitra, A. Shamir, O. Sorkine-Hornung, H. Zhang, A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics and Image Processing, CRC Press, 2015.

Measure of Outcome: A student undertaking this subject will be graded on performance in all of the following: (1) regular participation in class activities; (2) timely submission of all quizzes and online assignments; (3) participation in in-class tutorials; (4) appearing for all the exams; (5) attending the practice tutorials and workshops.
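As a flavour of the hands-on tutorial work mentioned above, the sketch below trains a small neural network in PyTorch to denoise short signal windows. It is a minimal illustration, not an actual course assignment: the toy sinusoid data, network size and training settings are assumptions made purely for demonstration.

```python
import math
import torch
from torch import nn

torch.manual_seed(0)

# Toy dataset (assumed for illustration): 256 clean sinusoid windows at varying
# frequency, plus noisy versions of the same windows; the network learns to denoise.
t = torch.linspace(0.0, 1.0, steps=64)
freqs = torch.linspace(1.0, 8.0, steps=256).unsqueeze(1)   # shape (256, 1)
clean = torch.sin(2.0 * math.pi * freqs * t)               # shape (256, 64)
noisy = clean + 0.3 * torch.randn_like(clean)

# A small fully connected denoiser: 64-sample window in, 64-sample window out.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64))
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(500):
    optimiser.zero_grad()
    loss = loss_fn(model(noisy), clean)
    loss.backward()
    optimiser.step()

print(f"Reconstruction MSE after training: {loss.item():.4f}")
```

A real tutorial would typically separate training and validation data and inspect the reconstructed waveforms; this sketch only shows the basic fitting loop.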
Topics covered:
Supervised Learning: Discriminative Modeling, Learning Patterns and Feature Selection
Unsupervised Learning: Clustering and Sequential Algorithms
Representation Learning, Dictionary Learning and Deep Neural Networks
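The unsupervised learning topics listed above can be previewed with a plain NumPy k-means that clusters short windows of a signal. Everything in the sketch (the two-tone synthetic signal, window length and spectral-magnitude features) is an assumption made purely for illustration, not material from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic signal (assumed for illustration): alternating 2 Hz and 20 Hz bursts plus noise.
t = np.arange(2048) / 256.0
signal = np.where(t.astype(int) % 2 == 0,
                  np.sin(2 * np.pi * 2 * t),
                  np.sin(2 * np.pi * 20 * t))
signal = signal + 0.05 * rng.standard_normal(signal.size)

# Cut the signal into non-overlapping 64-sample windows; use |FFT| magnitudes as features.
win = 64
segments = signal[: signal.size // win * win].reshape(-1, win)
features = np.abs(np.fft.rfft(segments, axis=1))

# Plain k-means with k = 2: alternate assignment and centroid-update steps.
k = 2
centroids = features[rng.choice(len(features), size=k, replace=False)]
for _ in range(100):
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    updated = np.array([features[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
    if np.allclose(updated, centroids):
        break
    centroids = updated

# Each window is labelled 0 or 1, ideally separating the low- and high-frequency bursts.
print("Cluster label per window:", labels)
```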