Probabilistic Neural Networks and Deep Learning
About this course
This course is designed to equip students with a basic to advanced understanding of the concepts and applications of Deep Learning. In this course, students will learn the probabilistic foundations that underlie machine learning, including probability theory, standard distributions, and parameter estimation. The discussion continues with single-layer networks for regression and classification, which provide insight into how simple models connect to probability theory and loss functions.
Next, students will explore deep neural networks with a focus on multilayer perceptron (MLP) architecture, non-linear activation functions, and how network depth increases representation capacity. This course also discusses important issues such as the curse of dimensionality, regularization, and decision theory in making optimal predictions. In the final session, students are introduced to the concepts of representation learning, transfer learning, and various error functions relevant to modern model development.
Through a combination of mathematical and probabilistic theory and practical implementation, this course provides comprehensive skills for understanding, designing, and evaluating artificial neural network architectures. By the end of the course, participants are expected to be able to explain the basic principles of deep learning, implement regression and classification models, and understand the benefits of networks in representation learning and knowledge transfer.
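As a taste of the link between simple models, probability theory, and loss functions described above, the sketch below fits a single-layer (linear) regression by least squares; minimizing the squared error is equivalent to maximum-likelihood estimation under Gaussian noise. The data, noise level, and true parameters here are illustrative choices, not course material.

```python
import numpy as np

# Illustrative synthetic data: y = 2x + 1 + Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=200)

# Design matrix with a bias column. The least-squares solution minimizes
# the sum of squared errors, which (up to constants) equals the negative
# log-likelihood when the observation noise is Gaussian.
X = np.column_stack([x, np.ones_like(x)])
(w, b), *_ = np.linalg.lstsq(X, y, rcond=None)

print(round(w, 2), round(b, 2))  # close to the true slope 2 and intercept 1
```

The recovered slope and intercept land near the true values, showing that the familiar mean-squared-error loss is not arbitrary but follows directly from a probabilistic model of the data.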
What you will learn
Providing a conceptual and probabilistic foundation for understanding artificial neural networks.
Equipping students with an understanding of discrete and continuous probability distributions, as well as parameter estimation techniques for model building.
Explaining how regression and classification can be modeled with simple (single-layer) networks, including decision theory and regularization.
Exploring multilayer architecture and the advantages of deep networks in representation, transfer learning, and error functions for various tasks.
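To illustrate why the multilayer architectures above matter, the sketch below trains a minimal MLP with one tanh hidden layer on the XOR problem, which no single-layer linear model can represent. The network sizes, learning rate, and iteration count are illustrative assumptions, not values prescribed by the course.

```python
import numpy as np

# XOR inputs and targets: a classic non-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0.0], [1.0], [1.0], [0.0]])

# Small random initialization (sizes chosen for illustration).
rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)               # non-linear hidden layer
    y = 1 / (1 + np.exp(-(h @ W2 + b2)))   # sigmoid output probability
    return h, y

lr = 0.5
for _ in range(5000):
    h, y = forward(X)
    dy = y - t                             # grad of cross-entropy w.r.t. pre-sigmoid
    dW2 = h.T @ dy; db2 = dy.sum(0)
    dh = dy @ W2.T * (1 - h ** 2)          # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, y = forward(X)
print(np.round(y.ravel()).astype(int))     # ideally recovers the XOR pattern
```

Removing the tanh hidden layer collapses the model to a single linear layer, which provably cannot fit XOR; the non-linear activation is what buys the extra representational capacity discussed in the course.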
Meet your instructors
Stefanus Benhard, S.Kom., M.Kom.
Course Information
Start Date
23 October 2025
End Date
-
Language
English
Category
Computer Science
Duration
2 hours
Enrolled Students
1
Rating
0.0
Reviews
No review yet