Concepts of Neural Networks (CSA203)
This course on Concepts of Neural Networks (CSA203) covers key topics such as important terminologies of ANNs, activation functions, network topologies, supervised, unsupervised, and reinforcement learning, offline and online learning, perceptron learning, backpropagation, radial basis function networks, and self-organizing maps. It is designed for students at Sharda University pursuing artificial intelligence and machine learning.
Syllabus (CSA203 - Theory)
Unit 1: Introduction
- A. Introduction, motivation and history; components of a neuron: synapses, dendrites, cell nucleus, axon.
- B. Important terminologies of ANNs: propagation function, activation function, output function; components of an artificial neural network: common activation functions; network topologies: feedforward, recurrent, and completely linked networks.
- C. Neuron activation order: synchronous and asynchronous activation; communication with the outside world: input and output of data to and from neural networks.
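The common activation functions and the propagation/activation/output pipeline listed above can be sketched in a few lines of NumPy. The function names and the single-neuron wrapper below are illustrative assumptions, not part of the course materials.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: squashes input to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Hyperbolic tangent: squashes input to (-1, 1)."""
    return np.tanh(x)

def relu(x):
    """Rectified linear unit: zero for negative input, identity otherwise."""
    return np.maximum(0.0, x)

def neuron_output(weights, inputs, bias, activation=sigmoid):
    """A single neuron: propagation function (weighted sum) then activation."""
    net = np.dot(weights, inputs) + bias   # propagation function
    return activation(net)                 # activation/output function
```

Swapping the `activation` argument is enough to compare how the different nonlinearities shape a neuron's output for the same net input.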
Unit 2: Learning Paradigms
- A. Learning paradigms and their real-world applications: supervised, unsupervised, and reinforcement learning; offline and online learning, with applications to real-life problems.
- B. Training patterns and teaching inputs, use of training samples, splitting the data set into training, validation, and test data, implications of the split, learning curves and their importance in diagnostics.
- C. Gradient optimization procedures, Hebbian learning rule.
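The training/validation/test split discussed in this unit can be sketched as below. The 60/20/20 proportions and the function name are assumptions for illustration only.

```python
import numpy as np

def split_dataset(X, y, rng, train=0.6, val=0.2):
    """Shuffle a data set, then split it into training, validation,
    and test partitions (remaining fraction goes to the test set)."""
    idx = rng.permutation(len(X))          # shuffle before splitting
    n_train = int(train * len(X))
    n_val = int(val * len(X))
    train_idx = idx[:n_train]
    val_idx = idx[n_train:n_train + n_val]
    test_idx = idx[n_train + n_val:]
    return ((X[train_idx], y[train_idx]),
            (X[val_idx], y[val_idx]),
            (X[test_idx], y[test_idx]))

rng = np.random.default_rng(0)
X = np.arange(100).reshape(100, 1)         # toy data set of 100 samples
y = np.arange(100)
train, val, test = split_dataset(X, y, rng)
```

Shuffling before splitting matters: if the data are ordered (e.g. by class), a naive contiguous split would give partitions with different distributions, which distorts the learning curves used for diagnostics.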
Unit 3: The Perceptron, Backpropagation and its variants
- A. Single-layer perceptron network, the perceptron learning algorithm and convergence theorem, the delta rule as a gradient-based learning strategy, limitations of the single-layer perceptron.
- B. Multilayer Perceptron Network, Backpropagation learning and its applications.
- C. Analysing the effect of the learning rate on the learning process, variants of the backpropagation algorithm.
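The perceptron learning algorithm from this unit can be sketched as follows, trained on the linearly separable AND problem. The learning rate and epoch cap are arbitrary illustrative choices; by the convergence theorem, the loop stops once a separating hyperplane is found.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)    # AND targets (linearly separable)

w = np.zeros(2)                            # weights
b = 0.0                                    # bias
eta = 0.1                                  # learning rate

def predict(x):
    """Threshold unit: fire iff the weighted sum exceeds zero."""
    return 1.0 if np.dot(w, x) + b > 0 else 0.0

for epoch in range(100):
    errors = 0
    for x, target in zip(X, t):
        y = predict(x)
        if y != target:
            # Perceptron rule: w <- w + eta * (t - y) * x
            w += eta * (target - y) * x
            b += eta * (target - y)
            errors += 1
    if errors == 0:                        # converged: no misclassifications
        break
```

Replacing the AND targets with XOR (`[0, 1, 1, 0]`) makes the loop run to the epoch cap without converging, which demonstrates the single-layer limitation listed in point A.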
Unit 4: Radial Basis Function Neural Networks
- A. Components & Structure of an RBF network, Information processing of an RBF network, Information Processing in RBF neurons, analytical thoughts prior to training.
- B. Equation system and gradient strategies for training, Growing RBF Networks, comparison of RBF Networks and Multilayer Perceptrons.
- C. Recurrent Neural Networks: Jordan networks, Elman Networks, Training Recurrent neural networks.
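The information processing and equation-system training of an RBF network (points A and B above) can be sketched in NumPy: Gaussian hidden neurons with centres fixed at the training points, and a linear output layer solved by least squares. The Gaussian width and the choice of centres are assumptions for illustration.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 1, 1, 0], dtype=float)    # XOR: unsolvable for a single-layer perceptron

centres = X.copy()                         # one RBF neuron per training point
sigma = 0.7                                # assumed width of each Gaussian

def rbf_layer(X):
    """Squared distances of every input to every centre -> Gaussian activations."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

Phi = rbf_layer(X)                         # design matrix of hidden activations
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)  # linear output weights via least squares

pred = rbf_layer(X) @ w
```

With one centre per training point the Gaussian design matrix is invertible, so the least-squares solution interpolates the targets exactly; growing-RBF methods instead add fewer centres incrementally and accept an approximation.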
Unit 5: Unsupervised Learning Network Paradigms
- A. Self-organizing feature maps, structure of a self-organizing feature map, Training of SOM, Topology function, common distance and topology functions, relationship between learning rates and neighbourhoods, applications of SOMs.
- B. Introduction to Adaptive Resonance Theory, task and structure of an ART network, learning process of an ART network: top-down and bottom-up learning; extensions: ART2, ART3.
- C. Introduction to the Hopfield network, associative networks (homogeneous & heterogeneous), introduction to the Restricted Boltzmann Machine.
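The SOM training loop of point A can be sketched for a 1-D map trained on 2-D inputs: find the best-matching unit, then pull it and its topological neighbours toward the input. The decay schedules and the Gaussian topology function below are common choices assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.uniform(0, 1, size=(200, 2))    # 2-D inputs on the unit square

n_units = 10
W = rng.uniform(0, 1, size=(n_units, 2))   # one weight vector per map unit
positions = np.arange(n_units)             # 1-D grid topology of the map

epochs = 20
for epoch in range(epochs):
    eta = 0.5 * (1 - epoch / epochs)       # decaying learning rate
    radius = max(1.0, n_units / 2 * (1 - epoch / epochs))  # shrinking neighbourhood
    for x in data:
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))        # best-matching unit
        # Gaussian topology function on the 1-D grid: neighbours of the
        # BMU are updated with strength decaying in grid distance.
        h = np.exp(-((positions - bmu) ** 2) / (2 * radius ** 2))
        W += eta * h[:, None] * (x - W)
```

The coupling between learning rate and neighbourhood radius is the relationship named in the syllabus: a wide radius early on orders the map globally, while the shrinking radius later lets individual units specialise.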
Evaluation Schemes
| Component | Theory (CSA203) |
|---|---|
| Total Marks | 100 (CA: 25 + MSE: 15 + ESE: 60) |
| Continuous Assessment | Assessment 1: 10 Marks (Units 1 & 2) • Assessment 2: 5 Marks (Units 3 & 4) • Assignment 1: 5 Marks (Units 1 & 2) • Assignment 2: 5 Marks (Units 3, 4, and 5) |
| Mid Semester Exam | 15 Marks |
| End Semester Exam | 60 Marks |
Lectures
Theory lecture materials will be uploaded unit-wise. Each unit will include:
Unit 1: Introduction
- Lecture PPT on Introduction to Artificial Neural Networks (ANN)
- Reference book: Neural Networks and Learning Machines, Third Edition, by Simon Haykin
- Additional reference book 1: Neural Network Design by Martin Hagan
- Additional reference book 2: Neural Networks and Deep Learning by Charu C. Aggarwal
- Additional deep learning reference: Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Unit 2: Learning Paradigms
- Lecture PPT on Learning Paradigms
- Reference book: Neural Networks and Learning Machines, Third Edition, by Simon Haykin
- Additional reference book 1: Neural Network Design by Martin Hagan
- Additional reference book 2: Neural Networks and Deep Learning by Charu C. Aggarwal
- Additional deep learning reference: Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Unit 3: The Perceptron, Backpropagation and Its Variants
- Lecture PPT on The Perceptron, Backpropagation and Its Variants
- Reference book: Neural Networks and Learning Machines, Third Edition, by Simon Haykin
- Additional reference book 1: Neural Network Design by Martin Hagan
- Additional reference book 2: Neural Networks and Deep Learning by Charu C. Aggarwal
- Additional deep learning reference: Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Unit 4: Radial Basis Function Neural Networks
- Lecture PPT on Radial Basis Function Neural Networks
- Reference book: Neural Networks and Learning Machines, Third Edition, by Simon Haykin
- Additional reference book 1: Neural Network Design by Martin Hagan
- Additional reference book 2: Neural Networks and Deep Learning by Charu C. Aggarwal
- Additional deep learning reference: Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Unit 5: Unsupervised Learning Network Paradigms
- Lecture PPT on Unsupervised Learning Network Paradigms
- Reference book: Neural Networks and Learning Machines, Third Edition, by Simon Haykin
- Additional reference book 1: Neural Network Design by Martin Hagan
- Additional reference book 2: Neural Networks and Deep Learning by Charu C. Aggarwal
- Additional deep learning reference: Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Assignments
Common Instructions
- All assignments must be submitted before the due date.
- Upload your solution in PDF format.
- Plagiarism will not be tolerated.
- Late submissions may not be accepted.
- All assignments must be handwritten. Answers and solutions should be presented in a clear, step-by-step illustrative manner. Running text format will not be accepted.
- Students may be asked to explain their answers and solutions while obtaining the instructor's signature. Grades will be awarded based on the explanation and understanding demonstrated.
- If any AI tools (such as GPTs) are used in preparing the assignment, students must also submit the complete script or prompt history along with the assignment.
Assignment 1: Based on Units 1 and 2
Due Date: 14th Feb 2026
- All questions are compulsory.
- Prepare and submit as a single PDF file.
- Submit the scanned handwritten PDF document using the link below.
Submit Here
Assignment 2: Based on Units 3 and 4
Due Date: 3rd April 2026
- All questions are compulsory.
- Include clear reasoning steps.
- Submit the scanned handwritten PDF document using the link below.
Submit Here
Post Your Doubts
Please fill in your details and doubt. Your submission will be recorded securely.
Important Dates
- Assessment 1: 19th to 23rd Feb 2026 (Units 1 & 2) - Upcoming
- Assessment 2: 06 Apr to 12 Apr 2026 (Units 3 & 4)
- Assignment 1: 14th Feb 2026 (submission)
- Assignment 2: 20th March 2026 (submission)
- Mid Semester Exam: 9th to 14th March 2026
- End Semester Exam: As per University Schedule
Course Feedback
Your feedback is extremely valuable in improving the course content and teaching effectiveness. Please take a few minutes to share your thoughts and suggestions with me.
Fill Out the Feedback Form

Thanks From Your Course Instructor
Dear Students,
Thank you for your active participation in the course.
Your enthusiasm, curiosity, and commitment make this learning journey inspiring.
Keep asking questions, keep exploring, and never stop learning!
– Dr. Gopal Chandra Jana
(Course Instructor)