# Syllabus for Master of Science (Statistics) Academic Year 2021

1 Semester - 2021 - Batch

| Course Code | Course Title | Course Type | Hours Per Week | Credits | Marks |
|---|---|---|---|---|---|
| MST131 | PROBABILITY THEORY | Core Courses | 5 | 5 | 100 |
| MST132 | DISTRIBUTION THEORY | Core Courses | 5 | 5 | 100 |
| MST133 | MATRIX THEORY AND LINEAR MODELS | Core Courses | 5 | 5 | 100 |
| MST134 | RESEARCH METHODOLOGY AND LATEX | Core Courses | 2 | 2 | 50 |
| MST171 | SAMPLE SURVEY DESIGNS | Core Courses | 6 | 5 | 150 |
| MST172 | STATISTICAL COMPUTING USING R | Core Courses | 5 | 4 | 150 |

2 Semester - 2021 - Batch

| Course Code | Course Title | Course Type | Hours Per Week | Credits | Marks |
|---|---|---|---|---|---|
| MST231 | STATISTICAL INFERENCE-I | Core Courses | 4 | 4 | 100 |
| MST232 | STOCHASTIC PROCESSES | Core Courses | 4 | 4 | 100 |
| MST233 | CATEGORICAL DATA ANALYSIS | Core Courses | 4 | 4 | 100 |
| MST271 | REGRESSION ANALYSIS | Core Courses | 6 | 5 | 150 |
| MST272 | STATISTICAL COMPUTING USING PYTHON | Core Courses | 5 | 4 | 150 |
| MST273A | PRINCIPLES OF DATA SCIENCE AND DATA BASE TECHNIQUES | Discipline Specific Elective | 5 | 4 | 150 |
| MST273B | SURVIVAL ANALYSIS | Discipline Specific Elective | 5 | 4 | 150 |
| MST273C | OPTIMIZATION TECHNIQUES | Discipline Specific Elective | 5 | 4 | 100 |
| MST281 | RESEARCH PROBLEM IDENTIFICATION AND FORMULATION | Core Courses | 2 | 1 | 50 |

3 Semester - 2020 - Batch

| Course Code | Course Title | Course Type | Hours Per Week | Credits | Marks |
|---|---|---|---|---|---|
| MST331 | STATISTICAL INFERENCE II | Core Courses | 4 | 4 | 100 |
| MST332 | MULTIVARIATE ANALYSIS | Core Courses | 4 | 4 | 100 |
| MST371 | TIME SERIES ANALYSIS | Core Courses | 6 | 5 | 150 |
| MST372A | STATISTICAL MACHINE LEARNING | Discipline Specific Elective | 5 | 4 | 150 |
| MST372B | BIOSTATISTICS | Discipline Specific Elective | 5 | 4 | 150 |
| MST372C | RELIABILITY ENGINEERING | Discipline Specific Elective | 5 | 4 | 150 |
| MST373A | NUMERICAL ANALYSIS | Discipline Specific Elective | 5 | 4 | 150 |
| MST373B | NON-PARAMETRIC METHODS | Discipline Specific Elective | 5 | 4 | 150 |
| MST373C | THEORY OF GAMES AND STATISTICAL DECISIONS | Discipline Specific Elective | 5 | 4 | 150 |
| MST381 | RESEARCH MODELING AND IMPLEMENTATION | Core Courses | 8 | 4 | 200 |

4 Semester - 2020 - Batch

| Course Code | Course Title | Course Type | Hours Per Week | Credits | Marks |
|---|---|---|---|---|---|
| MST431 | ADVANCED OPERATIONS RESEARCH | Core Courses | 4 | 4 | 100 |
| MST432 | DESIGN AND ANALYSIS OF EXPERIMENTS | Core Courses | 4 | 4 | 100 |
| MST433 | STATISTICAL QUALITY CONTROL | Core Courses | 4 | 4 | 100 |
| MST471A | NEURAL NETWORKS AND DEEP LEARNING | Discipline Specific Elective | 5 | 4 | 150 |
| MST471B | SPATIAL STATISTICS | Discipline Specific Elective | 5 | 4 | 150 |
| MST471C | BIG DATA ANALYTICS | Discipline Specific Elective | 5 | 4 | 150 |
| MST472A | HIGH DIMENSIONAL STATISTICAL ANALYSIS | Discipline Specific Elective | 5 | 4 | 150 |
| MST472B | STATISTICAL GENETICS | Discipline Specific Elective | 5 | 4 | 150 |
| MST472C | ACTUARIAL METHODS | Discipline Specific Elective | 5 | 4 | 150 |
| MST473A | BAYESIAN STATISTICS | Discipline Specific Elective | 5 | 4 | 150 |
| MST473B | CLINICAL TRIALS | Discipline Specific Elective | 5 | 4 | 150 |
| MST473C | RISK MODELING | Discipline Specific Elective | 5 | 4 | 150 |
| MST481 | SEMINAR PRESENTATION | Skill Enhancement Course | 2 | 1 | 50 |

Department Overview:

Statistics is the body of scientific principles and methodologies used to extract useful and comprehensive information from data and to draw conclusions about a phenomenon; it is the discipline that studies the best ways of dealing with randomness or, more precisely and broadly, variation. The Department of Statistics works at the deep interplay between application, computation and theory, and serves as the backbone of data science. It offers a good combination of pure and applied statistics along with software skills, enabling students to participate successfully in professional life.

Mission Statement:

Department Vision: The departme

Introduction to Program:

The Master of Science in Statistics at CHRIST (Deemed to be University) offers students an amalgam of knowledge of theoretical and applied statistics across a broad spectrum. Further, it intends to impart awareness of the importance of the conceptual framework of statistics across diversified fields, and to provide practical training in statistical methods for carrying out data analysis using programming languages and statistical software such as R, Python, SPSS, Excel, etc. The course curriculum has been designed to cater to the needs of stakeholders in securing placements in industries and institutions on successful completion of the course, and to provide them with ample skills and opportunities to meet the challenges of national-level competitive examinations such as CSIR NET in Mathematical Science, SET, and the Indian Statistical Service (ISS).

Programme Objectives:

- To impart the importance of the role of approximation and mathematical approaches in analysing real problems.
- To strengthen analytical and problem-solving skills through real-time applications.
- To gain practical experience in computational techniques and programming tools used in the statistical arena.
- To provide a strong foundation in the best practices of collating and disseminating information.
- To imbibe quality research and develop solutions to social issues.
- To prepare students to use their skills in interdisciplinary areas such as finance, health, agriculture, government, business and industry.

Programme Outcomes:

By the end of the M.Sc. programme, students will be able to:

PO1: Engage in continuous reflective learning in the context of technological and scientific advancement.

PO2: Identify the need and scope of interdisciplinary research.

PO3: Enhance research culture and uphold scientific integrity and objectivity.

PO4: Understand professional, ethical and social responsibilities.

PO5: Understand the importance and judicious use of technology for the sustainability of the environment.

PO6: Enhance disciplinary competency, employability and leadership skills.

Programme Specific Outcomes:

By the end of the M.Sc. programme in Statistics, students will be able to:

PSO1: Demonstrate analytical and problem-solving skills to identify and apply appropriate principles and methodologies of statistics to real-time problems.

PSO2: Demonstrate the execution of statistical experiments or investigations, analyse and interpret data using appropriate statistical methods, including statistical software, and report the findings of experiments or studies accurately.

PSO3: Become acquainted with contemporary trends in industrial/research settings and innovate novel solutions to existing problems.

PSO4: Enhance competency as a statistician or data scientist to work in a broad range of analytic, scientific, government, financial, health, technical and other fields.

Assessment Pattern: CIA - 50%, ESE - 50%

Examination and Assessments: CIA - 50%, ESE - 50%
MST131 - PROBABILITY THEORY (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:100
Credits:5

Course Objectives/Course Description

Probability is a measure of uncertainty and forms the foundation of statistical methods. This course trains students to use measure-theoretic and analytical techniques for understanding probability concepts.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Relate measure and probability concepts

CO2: Analyse probability concepts using the measure-theoretic approach

CO3: Evaluate conditional distributions and conditional expectations

CO4: Make use of limit theorems in the convergence of random variables
Unit-1
Teaching Hours:15
Probability and Random variable

Sets – functions - Sigma field – Measurable space – Sample space – Measure – Probability as a measure - Inverse function - Measurable functions – Random variable - Induced probability space - Distribution function of a random variable: definition and properties.

Unit-2
Teaching Hours:15
Expectation and Generating functions

Expectation and moments: definition and properties – Probability generating function - Moment generating function - Moment inequalities: Markov's, Chebyshev's, Hölder's, Jensen's and basic inequalities - Characteristic function and properties (idea and statement only).

Unit-3
Teaching Hours:15
Random Vectors

Random vectors – joint distribution function – joint moments - Conditional probabilities - Radon-Nikodym theorem (statement only) - Bayes' theorem – conditional distributions – independence - Conditional expectation and its properties.

Unit-4
Teaching Hours:15
Convergence

Modes of convergence: Convergence in probability, in distribution, in rth mean, almost sure convergence and their inter-relationships - Convergence theorem for expectation

Unit-5
Teaching Hours:15
Limit theorems

Law of large numbers - Convergence of series of independent random variables - Weak laws of large numbers (Khintchine's and Kolmogorov's) - Kolmogorov's strong law of large numbers - Central limit theorems for i.i.d. random variables: Lindeberg-Lévy and Lyapunov CLTs.
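The flavour of these limit theorems can be previewed numerically. Below is a minimal sketch (in Python for illustration; the seed and sample sizes are arbitrary choices, not part of the syllabus) checking that the sample mean of i.i.d. Uniform(0,1) variables concentrates near 1/2, as the weak law of large numbers predicts:

```python
import random

random.seed(42)

def sample_mean(n):
    # Mean of n i.i.d. Uniform(0,1) draws; here E[X] = 0.5.
    return sum(random.random() for _ in range(n)) / n

# By the WLLN, the deviation from 0.5 shrinks as n grows.
means = {n: sample_mean(n) for n in (10, 1000, 100000)}
```

The same experiment, with the running mean standardised by sqrt(n), previews the central limit theorem.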

Text Books And Reference Books:
1. Rohatgi, V.K. and Salah, A.K.E, (2015), An Introduction to Probability and Statistics, 3rd Ed., John Wiley & Sons.
2. Bhat, B.R, (2014), Modern Probability Theory, 4th Ed., New Age International.
Essential Reading / Recommended Reading
1. Feller, W. (2008). An Introduction to Probability Theory and its Applications, Volume I, 3rd Ed., Wiley Eastern.
2. Feller, W. (2008). An Introduction to Probability Theory and its Applications, Volume II, 3rd Ed., Wiley Eastern.
3. Billingsley, P. (2008). Probability and Measure. John Wiley & Sons.
4. Basu, A.K. (2012). Measure Theory and Probability, 2nd Ed., PHI.
5. Durrett, R. (2010). Probability: Theory and Examples, 4th Ed., Cambridge University Press.
Evaluation Pattern
| Component | Marks |
|---|---|
| CIA I | 10 |
| Mid Semester Examination (CIA II) | 25 |
| CIA III | 10 |
| Attendance | 05 |
| End Semester Exam | 50 |
| Total | 100 |

MST132 - DISTRIBUTION THEORY (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:100
Credits:5

Course Objectives/Course Description

Probability distributions are used in many real-life phenomena. This course makes students understand different probability distributions and model real-life problems using them.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Classify different families of probability distributions.

CO2: Analyse well-known probability distributions as special cases of different families of distributions

CO3: Identify different distributions arising from sampling from the normal distribution.

CO4: Apply probability distribution in various statistical problems.

Unit-1
Teaching Hours:15
Discrete Distributions

Modified power series family and properties - Binomial - Negative binomial, Logarithmic series and Lagrangian distributions and their properties as special cases of the results from modified power series family - hypergeometric distribution and its properties.

Unit-2
Teaching Hours:15
Continuous Distributions

Pearsonian system of distributions - Beta, Gamma, Pareto and Normal as special cases of the Pearson family and their properties - Exponential family of distributions.

Unit-3
Teaching Hours:15
Sampling distributions

Sampling distributions of the mean and variance from a normal population - independence of the mean and variance - chi-square, Student's t and F distributions and their non-central forms - Order statistics and their distributions.

Unit-4
Teaching Hours:15
Multivariate distributions

Bivariate Poisson and multinomial distributions - Multivariate normal (definition only) - Gumbel's bivariate exponential distribution - Marshall-Olkin distribution - Dirichlet distribution.

Unit-5
Teaching Hours:15
Quadratic forms

Quadratic forms in normal variables: distribution and properties - Cochran's theorem: applications.

Text Books And Reference Books:
1. Rohatgi, V.K. and Salah, A.K.E. (2015) An Introduction to Probability and Statistics, 3rd Ed., John Wiley & Sons.
2. Krishnamoorthy, K. (2016). Handbook of statistical distributions with applications. CRC Press.
Essential Reading / Recommended Reading
1. Johnson, N.L., Kotz, S. and Kemp, A.W. (2005). Univariate Discrete Distributions, 3rd Ed., John Wiley.
2. Johnson, N.L., Kotz, S. and Balakrishnan, N. (2017). Continuous Univariate Distributions I & II, John Wiley.
3. Johnson, N.L., Kotz, S. and Balakrishnan, N. (2000). Multivariate Distributions, 2nd Ed., John Wiley.
4. Arnold, B.C., Balakrishnan, N. and Nagaraja, H.N. (2012). A First Course in Order Statistics.
5. Elderton, W. P., & Johnson, N. L. (2009). Systems of Frequency Curves. Cambridge University Press.
Evaluation Pattern
| Component | Marks |
|---|---|
| CIA I | 10 |
| Mid Semester Examination (CIA II) | 25 |
| CIA III | 10 |
| Attendance | 05 |
| End Semester Exam | 50 |
| Total | 100 |

MST133 - MATRIX THEORY AND LINEAR MODELS (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:100
Credits:5

Course Objectives/Course Description

This course is offered to make students understand the critical aspects of matrix theory and linear models used in different areas of statistics such as regression analysis, multivariate analysis, design of experiments and stochastic processes.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate the understanding of vector spaces and different operations on them

CO2: Analyse systems of linear equations using a matrix-theoretic approach

CO3: Identify applications of matrix theory in statistical problems

CO4: Apply matrix theory in linear models

Unit-1
Teaching Hours:15
System of linear equations

Matrix operations - Linear equations - Row-reduced and echelon forms - Homogeneous systems of equations - Linear dependence

Unit-2
Teaching Hours:15
Vector Space

Vectors - Operations on vector spaces - Subspaces - Null space and column space - Linearly independent sets - Spanning sets - Bases - Dimension - Rank - Change of basis.

Unit-3
Teaching Hours:15
Linear transformations

Algebra of linear transformations - Matrix representations - Rank-nullity theorem - Determinants - Eigenvalues and eigenvectors - Cayley-Hamilton theorem - Jordan canonical forms - Orthogonalisation process - Orthonormal basis.

Unit-4
Teaching Hours:15
Quadratic forms and special matrices useful in statistics

Reduction and classification of quadratic forms - Special matrices: symmetric matrices, positive definite matrices, idempotent and projection matrices, stochastic matrices, Gramian matrices, dispersion matrices

Unit-5
Teaching Hours:15
Linear models

Fitting the model - Ordinary least squares - Estimability of parametric functions - Gauss-Markov theorem - Applications: regression model, analysis of variance.
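To make the least-squares step concrete, here is a small sketch of the closed-form OLS estimates for a simple linear model (written in Python with made-up data purely for illustration; the course itself develops the general matrix form):

```python
# Ordinary least squares for y = b0 + b1*x + e, using the
# closed-form estimates b1 = Sxy / Sxx and b0 = y_bar - b1*x_bar.
x = [1, 2, 3, 4, 5]            # illustrative predictor values
y = [2.1, 4.0, 6.2, 7.9, 10.1]  # illustrative responses

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
sxx = sum((xi - x_bar) ** 2 for xi in x)

b1 = sxy / sxx                 # slope estimate
b0 = y_bar - b1 * x_bar        # intercept estimate
```

The Gauss-Markov theorem in this unit says these estimators are best linear unbiased under the standard error assumptions.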

Text Books And Reference Books:
1. David C. Lay, Steven R. Lay, Judi J. McDonald (2016) Linear algebra and its applications. Pearson.
2. Lipschutz, S., & Lipson, M. L. (2018). Schaum's Outline of Linear Algebra. McGraw-Hill Education.
Essential Reading / Recommended Reading
1. Searle, S. R., & Khuri, A. I. (2017). Matrix Algebra Useful for Statistics. John Wiley & Sons.
2. Rencher, A. C., & Schaalje, G. B. (2008). Linear Models in Statistics. John Wiley & Sons.
3. Khuri, A. I. (2003). Advanced Calculus with Applications in Statistics. Hoboken, NJ: Wiley-Interscience.
4. Gentle, J. E. (2017). Matrix Algebra: Theory, Computations and Applications in Statistics. Springer Texts in Statistics, Springer, New York.
5. Strang, G. (2006). Linear Algebra and its Applications. Thomson Brooks/Cole, Belmont, CA, USA.
Evaluation Pattern
| Component | Marks |
|---|---|
| CIA I | 10 |
| Mid Semester Examination (CIA II) | 25 |
| CIA III | 10 |
| Attendance | 05 |
| End Semester Exam | 50 |
| Total | 100 |

MST134 - RESEARCH METHODOLOGY AND LATEX (2021 Batch)

Total Teaching Hours for Semester:30
No of Lecture Hours/Week:2
Max Marks:50
Credits:2

Course Objectives/Course Description

To acquaint students with different methodologies in statistical research and to train them to prepare scientific articles using LaTeX.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Define a research problem

CO2: Identify suitable methodology for solving the research problem

CO3: Create scientific articles using LaTeX.

Unit-1
Teaching Hours:15
Fundamentals of research

Objectives - Motivation - Utility - Concept of theory - Empiricism - Deductive and inductive theory - Characteristics of the scientific method - Understanding the language of research: concept, construct, definition, variable - Research process - Problem identification & formulation - Research question - Investigation question - Logic & importance

Unit-2
Teaching Hours:15
Scientific writing

Principles of mathematical writing - LaTeX: installing packages and editors, preparing the title page - Mathematical expressions - Tables - Importing graphics - Bibliography - Writing a research paper - Survey article - Thesis writing - Beamer: preparing presentations

Text Books And Reference Books:
1. Kothari, C. R. and Garg, G. (2014). Research Methodology: Methods and Techniques, 3rd Ed., New Age International.
2. Lamport, L. (2014). LaTeX: A Document Preparation System, 2nd Ed., Addison-Wesley.

Essential Reading / Recommended Reading
1. Grätzer, G. (2013). Math into LaTeX. Springer Science & Business Media.

Evaluation Pattern
CIA - 50% ESE - 50%

MST171 - SAMPLE SURVEY DESIGNS (2021 Batch)

Total Teaching Hours for Semester:90
No of Lecture Hours/Week:6
Max Marks:150
Credits:5

Course Objectives/Course Description

This course aims to impart the concepts of survey sampling theory and the analysis of complex surveys, including methods of sample selection, estimation, sampling variance, standard error of estimation in a finite population, development of sampling theory for use in sample survey problems, and sources of errors in surveys.

Course Outcome

By the end of the course, the learner will be able to:

CO1: List different steps in designing a sample survey.

CO2: Analyse different sample survey designs and find estimators.

CO3: Identify the use of different sample survey designs.

CO4: Apply suitable sample survey designs in real-life problems.
Unit-1
Teaching Hours:18
Random sampling designs

Sampling vs census - Simple random sampling with replacement (SRSWR) and without replacement (SRSWOR) of units - Estimators of mean, total and variance - Determination of sample size - Sampling for proportions - Stratified sampling scheme: estimation and allocation of sample size, comparison with simple random sampling schemes.

Lab Exercises:
1. Drawing samples with SRSWR and SRSWOR and estimation of parameters
2. Estimation of parameters using a sample of proportions
3. Drawing a stratified sample and estimation of parameters

Unit-2
Teaching Hours:18
Ratio and regression estimators

Bias and mean square error - Estimation of variance - Confidence interval - Comparison with the mean-per-unit estimator - Optimum property of the ratio estimator - Unbiased ratio-type estimator - Ratio estimator in stratified random sampling - Difference and regression estimators: difference estimator, regression estimator, comparison of the regression estimator with the mean-per-unit and ratio estimators, regression estimator in stratified random sampling.

Lab Exercises:
4. Estimation using the ratio estimator
5. Estimation using the regression estimator
6. Ratio and regression estimators in stratified sampling

Unit-3
Teaching Hours:18
Varying probability sampling designs

With and without replacement sampling schemes: PPS and PPSWR schemes - Selection of samples - Estimators: ordered and unordered estimators - πps sampling schemes.

Lab Exercises:
7. Exercise on the PPS scheme
8. Exercise on the PPSWR scheme
9. Exercise on the πps sampling scheme

Unit-4
Teaching Hours:18
Advanced sampling designs

Systematic sampling scheme: estimation of population mean and variance, comparison of systematic sampling with SRS and stratified random sampling, circular systematic sampling - Cluster sampling: estimation of population mean, estimation of efficiency by a cluster sample, variance function, determination of optimum cluster size - Multistage sampling: estimation of population total with SRS sampling at both stages - Multiphase sampling (outline only) - Quota sampling - Network sampling - Adaptive sampling: introduction and estimators under adaptive sampling - Introduction to small area estimation.

Lab Exercises:
10. Exercise on the systematic sampling scheme
11. Exercise on cluster sampling
12. Exercise on multi-stage sampling
13. Exercise on small area estimation

Unit-5
Teaching Hours:18
Errors in Sample Survey

Sampling and non-sampling errors - The effect of unit nonresponse on the estimate - Procedures for unit nonresponse

Lab Exercises:
14. Exercise on the sensitivity of efficiency due to sampling errors
15. Procedures for non-response

Text Books And Reference Books:
1. Arnab, R. (2017). Survey Sampling: Theory and Applications. Academic Press.
2. Singh, D. and Chaudharay, F.S. (2018). Theory and Analysis of Sample Survey Designs. New Age International.

Essential Reading / Recommended Reading
1. Cochran, W.G. (2007). Sampling Techniques, 3rd Ed., John Wiley & Sons.
2. Singh, S. (2003). Advanced Sampling: Theory and Practice. Kluwer.
3. Des Raj and Chandhok, P. (2013). Sampling Theory. McGraw Hill.
4. Mukhopadhay, P. (2009). Theory and Methods of Survey Sampling, 2nd Ed., PHI Learning Pvt Ltd., New Delhi.
5. Sampath, S. (2005). Sampling Theory and Methods. Alpha Science International Ltd., India.
6. Lumley, T. (2011). Complex Surveys: A Guide to Analysis Using R. John Wiley & Sons.
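Lab exercise 1 of Unit-1 above can be prototyped in a few lines. A sketch (in Python for brevity, though the labs use R; the population values below are invented for illustration) of SRSWOR estimation of the mean, total, and the variance of the mean:

```python
import random

# Hypothetical finite population (illustrative values only).
population = [12, 15, 9, 22, 18, 30, 7, 14, 25, 11]
N = len(population)
n = 4

random.seed(1)
sample = random.sample(population, n)   # one SRSWOR draw of size n

y_bar = sum(sample) / n                 # estimator of the population mean
total_hat = N * y_bar                   # estimator of the population total

# Unbiased estimator of V(y_bar) under SRSWOR: (1 - n/N) * s^2 / n,
# where s^2 is the sample variance with divisor n - 1.
s2 = sum((y - y_bar) ** 2 for y in sample) / (n - 1)
var_hat = (1 - n / N) * s2 / n
```

The factor (1 - n/N) is the finite population correction discussed in this unit.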
Evaluation Pattern
CIA - 50% ESE - 50%

MST172 - STATISTICAL COMPUTING USING R (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

Programming skill in R helps students perform statistical computations with ease. This course equips students with knowledge of R programming to develop statistical models for real-world problems.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate the understanding of basic concepts of R programming

CO2: Build useful programs with functions

CO3: Analyse various data using R.

CO4: Create visualisations of data using R

CO5: Compare different methods of simulating random numbers
Unit-1
Teaching Hours:15
Introduction

R and RStudio - Variables - Functions - Vectors - Expressions and assignments - Logical expressions - Matrices - The workspace - R Markdown.

Practical Assignments:
1. Demonstrate variables and functions in R
2. Creating vectors and matrices and associated operations in R
3. Logical and arithmetic operations in R

Unit-2
Teaching Hours:15
Basic Programming

Loops: if, for, while - Program flow - Basic debugging - Good programming habits - Input and output: input from a file and output to a file

Practical Assignments:
4. Illustration of control structures: if, else, for
5. Illustration of control structures: while, repeat, break, next and ifelse

Unit-3
Teaching Hours:15
Programming with functions

Functions - Optional arguments and default values - Vector-based programming using functions - Recursive programming - Debugging functions - Sophisticated data structures - Factors - Data frames - Lists - The apply family.

Practical Assignments:
7. Creating user-defined functions and doing vector-based programming
8. Creating lists and data frames and associated operations
9. Demonstration of recursive functions and apply functions in R

Unit-4
Teaching Hours:15
Graphics

Visualising data - Graphical summaries of data: bar chart, pie chart, histogram, box plot, stem-and-leaf plot, frequency table - Plotting of probability distributions and sampling distributions - P-P plot - Q-Q plot - ggplot2 - lattice - 3D plots - par - Graphical augmentation.

Practical Assignments:
10. Visualization of univariate data
11. Visualization of numerical variables in R using 'base R', 'ggplot2' and 'lattice 3D' packages
12. Contingency tables and visualization of categorical variables using 'base R', 'ggplot2' and 'lattice 3D' packages
13. Construction of probability plots and quantile plots in R

Unit-5
Teaching Hours:15
Simulation

Simulating iid uniform samples - Congruential generators - Seeding - Simulating discrete random variables - Inversion method for continuous random variables - Rejection method - Generation of normal variates: rejection with exponential envelope, Box-Muller algorithm.

Practical Assignments:
14. Simulation of discrete variables in R
15. Simulation of continuous variables: inversion method, rejection method

Text Books And Reference Books:
1. Jones, O., Maillardet, R. and Robinson, A. (2014). Introduction to Scientific Programming and Simulation Using R. Chapman & Hall/CRC, The R Series.
2. Matloff, N. (2016). The Art of R Programming: A Tour of Statistical Software Design. No Starch Press.

Essential Reading / Recommended Reading
1. Crawley, M. J. (2012). The R Book, 2nd Ed. John Wiley & Sons.
2. Chambers, J. M. (2008). Software for Data Analysis: Programming with R. Springer-Verlag, New York.

Evaluation Pattern
CIA - 50% ESE - 50%

MST231 - STATISTICAL INFERENCE-I (2021 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

To provide a strong mathematical and conceptual foundation in the methods of parametric estimation and their properties.

Course Outcome

By the end of the course, the learner will be able to:

CO1: List properties of estimators.

CO2: Identify a suitable estimation method.

CO3: Analyse likelihood functions and apply different root-solving methods to find estimators

CO4: Construct confidence intervals for parameters involved in the model.
Unit-1
Teaching Hours:12
Sufficiency

Sufficiency - factorisation theorem - minimal sufficiency - exponential family and completeness - Ancillary statistics and Basu's theorem

Unit-2
Teaching Hours:12
Unbiasedness

UMVUE - Fisher information and Cramer-Rao inequality - Chapman-Robbins and Bhattacharya bounds - Rao-Blackwell theorem - Lehmann-Scheffé theorem - Unbiased estimation

Unit-3
Teaching Hours:12
Consistent estimators

Consistency - Weak and strong consistency - Marginal and joint consistent estimators - CAN estimators - equivariance - Pitman estimators

Unit-4
Teaching Hours:12
Methods of point estimation

Method of moments - Minimum chi-square and its modification - Least squares estimation - Maximum likelihood estimation - Properties of maximum likelihood estimators - Cramer-Huzurbazar theorem - Likelihood equations with multiple roots - Iterative methods - EM algorithm.

Unit-5
Teaching Hours:12
Interval estimation

Large sample confidence intervals - Shortest-length confidence interval - Methods of finding confidence intervals: inversion of the test statistic, pivotal quantities, pivoting the CDF - Evaluation of confidence intervals: size and coverage probability, loss function and test function optimality.
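As a sketch of the pivotal-quantity method in this unit (Python with simulated data; the known sigma, the seed, and the sample size are all illustrative assumptions): the pivot Z = (x_bar - mu) / (sigma / sqrt(n)) has a standard normal distribution, so inverting it gives a 95% interval for the mean.

```python
import math
import random

random.seed(0)
sigma, mu_true, n = 2.0, 5.0, 100          # illustrative values
data = [random.gauss(mu_true, sigma) for _ in range(n)]

x_bar = sum(data) / n
z = 1.96                                   # approx. 97.5% standard normal quantile
half_width = z * sigma / math.sqrt(n)
ci = (x_bar - half_width, x_bar + half_width)  # 95% CI for the mean
```

With sigma unknown, the same construction uses the Student's t pivot instead.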

Text Books And Reference Books:
1.  Kale, B. K. and Muralidharan, K.(2015). Parametric Inference: An Introduction. Alpha Science Int. Ltd.
2. Srivastava, A. K., Khan, A. H. and Srivastava, N. (2014). Statistical Inference: Theory of Estimation. PHI Learning Pvt. Ltd, New Delhi.

Essential Reading / Recommended Reading

1. Casella, G., & Berger, R. L. (2002). Statistical Inference. Pacific Grove, CA: Duxbury.

2. Silvey, S. D. (2017). Statistical Inference. Routledge.

3. Trosset, M. W. (2009). An Introduction to Statistical Inference and its Applications with R. Chapman and Hall/CRC.

4. Dixit, U. J. (2016). Examples in Parametric Inference with R. Springer.

5. Lehmann, E. L., & Casella, G. (2006). Theory of Point Estimation, 2nd Ed. Springer.

6. Robert, C., & Casella, G. (2013). Monte Carlo Statistical Methods. Springer.

Evaluation Pattern
| Component | Marks |
|---|---|
| CIA I | 10 |
| Mid Semester Examination (CIA II) | 25 |
| CIA III | 10 |
| Attendance | 05 |
| End Semester Exam | 50 |
| Total | 100 |

MST232 - STOCHASTIC PROCESSES (2021 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

To equip the students with theoretical and practical knowledge of stochastic models which are used in economics, life sciences, engineering etc.

Course Outcome

By the end of the course, the learner will be able to:

CO1: List different stochastic models.

CO2: Identify ergodic Markov chains.

CO3: Analyse queuing models using continuous-time Markov chains.

CO4: Apply Brownian motion in finance problems.

Unit-1
Teaching Hours:12
Introduction

A sequence of random variables - Definition and classification of stochastic processes - Autoregressive processes - Strict-sense and wide-sense stationary processes.

Unit-2
Teaching Hours:12
Discrete time Markov chains

Markov chains: definition, examples - Transition probability matrix - Chapman-Kolmogorov equations - Classification of states - Limiting and stationary distributions - Ergodicity - Discrete renewal equation and basic limit theorem - Absorption probabilities - Criteria for recurrence - Generic application: hidden Markov models.
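The stationary distribution of an ergodic chain from this unit can be approximated numerically by iterating the transition matrix. A minimal sketch (the two-state chain below is an invented example, not from the syllabus):

```python
# Two-state Markov chain with transition matrix P (rows sum to 1).
P = [[0.9, 0.1],
     [0.4, 0.6]]

pi = [0.5, 0.5]                    # arbitrary initial distribution
for _ in range(200):               # iterate pi_{t+1} = pi_t P
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# Solving pi = pi P directly for this chain gives pi = (0.8, 0.2),
# which the iteration converges to by ergodicity.
```

The same idea, solved as a linear system pi = pi P with sum(pi) = 1, is the usual analytic route.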

Unit-3
Teaching Hours:12
Continuous time Markov chains and Poisson process

Transition probability function - Kolmogorov differential equations - Poisson process: homogeneous process, inter-arrival time distribution, compound process - Birth and death processes - Service applications: queueing models, Markovian models.
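For the queueing models mentioned above, the steady-state measures of the simplest Markovian queue (M/M/1, a birth-death process) can be sketched directly; the arrival and service rates below are illustrative assumptions:

```python
# Steady-state measures of an M/M/1 queue (requires lam < mu).
lam, mu = 2.0, 5.0        # arrival rate and service rate (illustrative)
rho = lam / mu            # traffic intensity
L = rho / (1 - rho)       # mean number of customers in the system
W = 1 / (mu - lam)        # mean time a customer spends in the system

# Little's law ties the two together: L = lam * W.
```

These formulas come from the stationary distribution of the underlying birth-death chain, pi_k = (1 - rho) * rho**k.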

Unit-4
Teaching Hours:12
Branching process

Galton-Watson branching processes - Generating function - Extinction probabilities - Continuous-time branching processes - Extinction probabilities - Branching processes with general variable lifetime.

Unit-5
Teaching Hours:12
Renewal process and Brownian motion

Renewal equation - Renewal theorem - Generalisations and variations of renewal processes -  Brownian motion - Introduction to Markov renewal processes.

Text Books And Reference Books:

1. Karlin, S. and Taylor, H.M. (2014). A First Course in Stochastic Processes. Academic Press.

2. Ross, S. M. (2014). Introduction to Probability Models. Elsevier.

Essential Reading / Recommended Reading

1. Feller, W. (2008). An Introduction to Probability Theory and its Applications, Volumes I & II, 3rd Ed., Wiley Eastern.

2. Medhi, J. (2009). Stochastic Processes, 3rd Ed., New Age International.

3. Dobrow, R.P. (2016). Introduction to Stochastic Processes with R. Wiley Eastern.

4. Cinlar, E. (2013). Introduction to Stochastic Processes. Courier Corporation.

Evaluation Pattern
| Component | Marks |
|---|---|
| CIA I | 10 |
| Mid Semester Examination (CIA II) | 25 |
| CIA III | 10 |
| Attendance | 05 |
| End Semester Exam | 50 |
| Total | 100 |

MST233 - CATEGORICAL DATA ANALYSIS (2021 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

Categorical data analysis deals with the study of information captured through expressions or verbal forms. This course equips students with the theory and methods to analyse categorical responses.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Describe the categorical response.

CO2: Identify tests for contingency tables.

CO3: Apply regression models for categorical response variables.

CO4: Analyse contingency tables using log-linear models.

Unit-1
Teaching Hours:12
Introduction

Categorical response data - Probability distributions for categorical data - Statistical inference for discrete data

Unit-2
Teaching Hours:12
Contingency tables

Probability structure for contingency tables - Comparing proportions with 2x2 tables - The odds ratio - Relative risk - Tests for independence - Association in IxJ tables
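The odds ratio and relative risk from this unit are direct computations on a 2x2 table. A sketch with invented counts (not from the syllabus):

```python
# 2x2 table of counts:        outcome present   outcome absent
a, b = 30, 70              #  exposed group
c, d = 10, 90              #  unexposed group

odds_ratio = (a * d) / (b * c)          # cross-product ratio

risk_exposed = a / (a + b)              # P(outcome | exposed)
risk_unexposed = c / (c + d)            # P(outcome | unexposed)
relative_risk = risk_exposed / risk_unexposed
```

For rare outcomes the two measures are close; here the outcome is common, so the odds ratio (27/7) exceeds the relative risk (3).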

Unit-3
Teaching Hours:12
Generalised linear models

Components of a generalised linear model - GLM for binary and count data - Statistical inference and model checking - Fitting GLMs

Unit-4
Teaching Hours:12
Logistic regression

Interpreting the logistic regression model - Inference for logistic regression -  Logistic regression with categorical predictors - Multiple logistic regression - Summarising effects - Building and applying logistic regression models - Multicategory logit models
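A small sketch of interpreting logistic regression coefficients, as covered in this unit (Python; the coefficients b0 and b1 are hypothetical, not fitted to any data): under logit(p) = b0 + b1*x, a one-unit increase in x multiplies the odds by exp(b1).

```python
import math

b0, b1 = -1.5, 0.8          # hypothetical fitted coefficients

def prob(x):
    # Inverse logit: P(Y = 1 | x) = 1 / (1 + exp(-(b0 + b1*x))).
    return 1 / (1 + math.exp(-(b0 + b1 * x)))

def odds(x):
    return prob(x) / (1 - prob(x))

# The ratio odds(x + 1) / odds(x) equals exp(b1) for every x.
odds_ratio_per_unit = math.exp(b1)
```

This constant-odds-ratio interpretation is what distinguishes the logit link from, say, the probit.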

Unit-5
Teaching Hours:12
Loglinear models for contingency tables

Loglinear models for two-way and three-way tables - Inference for loglinear models - The loglinear-logistic connection - Models for matched pairs: comparing dependent proportions, logistic regression for matched pairs - Comparing margins of square contingency tables - Symmetry issues

Text Books And Reference Books:

1. Agresti, A. (2012). Categorical Data Analysis, 3rd Ed. New York: Wiley.

2. Agresti, A. (2010). Analysis of Ordinal Categorical Data. John Wiley & Sons.

Essential Reading / Recommended Reading

1. Le, C.T. (2009). Applied Categorical Data Analysis and Translational Research, 2nd Ed., John Wiley and Sons.

2. Stokes, M. E., Davis, C. S., & Koch, G. G. (2012). Categorical Data Analysis Using SAS. SAS Institute.

3. Agresti, A. (2018). An Introduction to Categorical Data Analysis. John Wiley & Sons.

4. Bilder, C. R., & Loughin, T. M. (2014). Analysis of Categorical Data with R. Chapman and Hall/CRC.

Evaluation Pattern
CIA I: 10 marks
Mid Semester Examination (CIA II): 25 marks
CIA III: 10 marks
Attendance: 05 marks
End Semester Exam: 50 marks
Total: 100 marks

MST271 - REGRESSION ANALYSIS (2021 Batch)

Total Teaching Hours for Semester:90
No of Lecture Hours/Week:6
Max Marks:150
Credits:5

Course Objectives/Course Description

Regression models are mainly used to establish relationships among variables and to predict future values. They have applications in various domains such as finance, life science, management, and psychology. This course is designed to impart the knowledge of statistical model building using regression techniques.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Formulate simple and multiple regression models

CO2: Identify the correct regression model for the given problem

CO3: Apply non-linear regression in real-life problems.

CO4: Analyse the robustness of the regression model.

Unit-1
Teaching Hours:18
Linear regression model

Linear Regression Model: Simple and multiple - Least squares estimation - Properties of the estimators - Maximum likelihood estimation - Estimation with linear restrictions - Hypothesis testing - Confidence intervals.

Practical Assignments:

1. Build a simple linear model and interpret the data.
2. Construct a confidence interval for the simple linear model.
3. Build a multiple linear model and estimate its parameters.
4. Construct confidence intervals for the multiple linear model.

Unit-2
Teaching Hours:18
Model adequacy

Residual analysis - Departures from underlying assumptions - Effect of outliers - Collinearity - Nonconstant variance and serial correlation - Departures from normality - Diagnostics and remedies.

Practical Assignments:

5. Carry out residual analysis and validate the model assumptions.
6. Construct residual plots for checking outliers, leverage points and influential points.
7. Check the assumption of homoscedasticity and apply its remedial measures.
8. Detect multicollinearity and apply its remedial measures.

Unit-3
Teaching Hours:18
Model selection

Selection of input variables and model selection - Methods of obtaining the best fit - Stepwise regression - Forward selection and backward elimination.

Practical Assignments:

9. Select the best model using stepwise regression.
10. Select the best model using the forward and backward selection procedures.
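The least-squares estimation of Unit-1 can be sketched for the simple linear model y = b0 + b1*x + error; the data below are hypothetical:

```python
# Minimal sketch of least-squares estimation for a simple linear model
# using hypothetical data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.3, 6.2, 8.0, 9.9]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

sxx = sum((xi - x_bar) ** 2 for xi in x)
sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

b1 = sxy / sxx           # slope estimate: Sxy / Sxx
b0 = y_bar - b1 * x_bar  # intercept estimate

# Residual sum of squares and R-squared, useful for model adequacy (Unit-2).
fitted = [b0 + b1 * xi for xi in x]
rss = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
tss = sum((yi - y_bar) ** 2 for yi in y)
r_squared = 1 - rss / tss

print(round(b1, 3), round(b0, 3), round(r_squared, 4))
```

A confidence interval for the slope would follow by estimating the error variance from the residual sum of squares and using the t distribution with n - 2 degrees of freedom.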
Unit-4
Teaching Hours:18
Nonlinear regression

Introduction to general non-linear regression - Least squares in the non-linear case - Estimating the parameters of a non-linear system - Reparametrization of the model - Non-linear growth models.

Practical Assignments:

11. Estimate parameters in non-linear models using the least squares procedure.

Unit-5
Teaching Hours:18
Robust regression

Linear absolute deviation regression - M-estimators - Robust regression with rank residuals - Resampling procedures for regression models, methods and their properties (without proof) - Jackknife techniques and least-squares approach based on M-estimators.

Practical Assignments:

12. Illustrate resampling procedures in regression models.
13. Build a regression model with robust regression procedures.

Text Books And Reference Books:

1. Chatterjee, S., & Hadi, A. S. (2015). Regression analysis by example. John Wiley & Sons.
2. Draper, N. R., & Smith, H. (2014). Applied regression analysis, 3rd Edition. John Wiley & Sons.
3. Montgomery, D. C., Peck, E. A., & Vining, G. G. (2021). Introduction to linear regression analysis. John Wiley & Sons.

Essential Reading / Recommended Reading

1. Seber, G. A., & Lee, A. J. (2012). Linear regression analysis (Vol. 329). John Wiley & Sons.
2. Keith, T. Z. (2014). Multiple regression and beyond: An introduction to multiple regression and structural equation modelling. Routledge.
3. Fox, J. (2015). Applied regression analysis and generalized linear models. Sage Publications.
4. Fox, J., & Weisberg, S. (2018). An R companion to applied regression. Sage Publications.

Evaluation Pattern

CIA - 50%

ESE - 50%

MST272 - STATISTICAL COMPUTING USING PYTHON (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

Python is a general-purpose programming language that is extensively used in data science.
This course equips students with programming skills in Python and its statistical libraries, and with the ability to apply them in data analysis.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate the understanding of the fundamentals of Python programming.

CO2: Implement functions and data modelling.

CO3: Analyse statistical datasets and visualize the results.

CO4: Build statistical models using various statistical libraries in Python.
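CO2 above (implementing functions) can be previewed with a short sketch of user-defined, recursive, and lambda functions; the examples are hypothetical toys, not course material:

```python
# User-defined and recursive functions, plus a lambda function.
def factorial(n):
    """Recursive function: n! = n * (n-1)!, with factorial(0) = factorial(1) = 1."""
    return 1 if n <= 1 else n * factorial(n - 1)

square = lambda v: v * v  # a lambda (anonymous) function bound to a name

values = [1, 2, 3, 4]
squares = [square(v) for v in values]

print(factorial(5))  # 120
print(squares)       # [1, 4, 9, 16]
```

The same pattern extends to the OOP topics of Unit-2: methods are simply functions defined inside a class and invoked on its objects.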
Unit-1
Teaching Hours:15
Introduction

Installing Python - Basic syntax - Interactive shell - Editing, saving and running a script - The concept of data types - Variables - Assignments - Mutable and immutable types - Arithmetic operators and expressions - Comments in the program - Understanding error messages - Control statements - Operators.

Practical Assignments:

1. Lab exercise on data types
2. Lab exercise on arithmetic operators and expressions
3. Lab exercise on control statements

Unit-2
Teaching Hours:15
Design with functions

Introduction to functions - Inbuilt and user-defined functions - Functions with arguments and return values - Formal vs actual arguments - Named arguments - Recursive functions - Lambda functions - OOP concepts - Classes - Objects - Attributes and methods - Defining classes - Inheritance - Polymorphism.

Practical Assignments:

4. Lab exercise on inbuilt and user-defined functions
5. Lab exercise on recursive and lambda functions
6. Lab exercise on OOP concepts

Unit-3
Teaching Hours:15
Statistical Analysis - I using Pandas

Introduction to Pandas - Pandas data series - Pandas data frames - Data handling - Grouping - Descriptive statistical analysis and graphical representation.

Practical Assignments:

7. Lab exercise on Pandas data series, frames, handling and grouping
8. Lab exercise on statistical analysis

Unit-4
Teaching Hours:15
Statistical Analysis - II using Pandas

Hypothesis testing - Data modelling - Linear regression models - Logistic regression models.

Practical Assignments:

9. Lab exercise on hypothesis testing
10. Lab exercise on regression modelling

Unit-5
Teaching Hours:15
Visualization Using Seaborn and Matplotlib

Line graph - Bar chart - Pie chart - Heat map - Histogram - Density plot - Cumulative frequencies - Error bars - Scatter plot - 3D plot.

Practical Assignments:

11. Lab exercise on graphical and diagrammatic representation
12. Lab exercise on the density plot
13. Lab exercise on scatter and 3D plots

Text Books And Reference Books:

1. Lambert, K. A. (2018). Fundamentals of Python: first programs. Cengage Learning.
2. Haslwanter, T. (2016). An Introduction to Statistics with Python. Springer International Publishing.

Essential Reading / Recommended Reading

1. Unpingco, J. (2016). Python for probability, statistics, and machine learning, Vol. 1, Springer International Publishing.
2. Anthony, F. (2015). Mastering pandas. Packt Publishing Ltd.

Evaluation Pattern

CIA - 50%

ESE - 50%

MST273A - PRINCIPLES OF DATA SCIENCE AND DATA BASE TECHNIQUES (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course provides a strong foundation for data science and its application areas, and covers the underlying core concepts and emerging technologies in data science.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Explore the fundamental concepts of data science.

CO2: Apply data analysis techniques for handling large data.

CO3: Demonstrate various databases and compose effective queries.
Unit-1
Teaching Hours:15
Introduction to Data Science

Introduction - Big Data and Data Science - Data science hype - Getting past the hype - The current landscape - Role of the data scientist - Exploratory data analysis - Data science process overview - Defining goals - Retrieving data - Data preparation - Data exploration - Data modelling - Presentation. Problems in handling large data - General techniques for handling large data - Big Data and its importance, the four Vs, drivers for Big Data, Big Data analytics, Big Data applications, algorithms using map-reduce, matrix-vector multiplication by MapReduce. Steps in big data - Distributing data storage and processing with frameworks - Data science ethics - Valuing different aspects of privacy - The five C's of data.

Practical Assignments:

1. Lab exercise for feature engineering
2. Lab exercise for big data processing

Unit-2
Teaching Hours:15
Machine Learning

Machine learning - Modelling process - Training a model - Validating a model - Predicting new observations - Supervised learning algorithms - Unsupervised learning algorithms. Introduction to deep learning - Deep feed-forward networks - Regularization - Optimization of deep learning - Convolutional networks - Recurrent and recursive nets - Applications of deep learning.

Practical Assignments:

3. Lab exercise on linear and logistic discrimination
4. Lab exercise on K-means clustering and hierarchical clustering

Unit-3
Teaching Hours:15
Introduction to Relational Database and Design

Concept and overview of DBMS, data models, database languages, database administrator, database users, three-schema architecture of DBMS. Basic concepts, design issues, mapping constraints, keys, entity-relationship diagram, weak entity sets, functional dependency, different anomalies in designing a database, normalization using functional dependencies: 1NF, 2NF, 3NF and Boyce-Codd Normal Form.

Practical Assignments:

5. Lab exercise on database design
6. Top-down approach
7. Bottom-up approach

Unit-4
Teaching Hours:15
Database Querying and Data Integration

SQL basic structure - DDL, DML, DCL - Integrity constraints: domain constraints, entity constraints, referential integrity constraints - Concept of set operations, joins, aggregate functions, null values, assertions, views, nested subqueries - Procedural extensions - Stored procedures - Functions - Cursors - Intelligent databases - ECA rule - Data integration - ETL process.

Practical Assignments:

8. Lab exercise on SQL
9. Lab exercise on PL/SQL
10. Lab exercise on ETL

Unit-5
Teaching Hours:15
Introduction to Data Warehouse

Data warehousing - Defining features - Data warehouses and data marts - Metadata in the data warehouse - Data design and data preparation - Dimensional modelling - Principles of dimensional modelling - The star schema - Star schema keys - Advantages of the star schema - Updates to the dimension tables - The snowflake schema - Aggregate fact tables - Families of stars - MDX queries - Reporting services.

Practical Assignments:

11. Lab exercise on analysis services
12. Lab exercise on reporting services

Text Books And Reference Books:

1. Davy Cielen, Arno D. B. Meysman, Mohamed Ali (2016), Introducing Data Science, Manning Publications Co.
2. Thomas Cannolly and Carolyn Begg (2007), Database Systems: A Practical Approach to Design, Implementation and Management, 3rd Edition, Pearson Education.

Essential Reading / Recommended Reading

1. Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani (2013), An Introduction to Statistical Learning: with Applications in R, Springer.
2. D J Patil, Hilary Mason, Mike Loukides (2018), Ethics and Data Science, O'Reilly.
3. Lior Rokach and Oded Maimon (2010), Data Mining and Knowledge Discovery Handbook.
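The SQL constructs of Unit-4 (DDL, DML, joins, aggregate functions) can be tried with Python's built-in sqlite3 module. The tables, column names, and data below are hypothetical:

```python
import sqlite3

# Hypothetical in-memory database illustrating DDL, DML, a join and an aggregate.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: create tables; emp.dept_id carries a referential-integrity constraint.
cur.execute("CREATE TABLE dept (dept_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE emp (
    emp_id INTEGER PRIMARY KEY,
    name TEXT,
    salary REAL,
    dept_id INTEGER REFERENCES dept(dept_id))""")

# DML: insert rows with parameterized queries.
cur.executemany("INSERT INTO dept VALUES (?, ?)", [(1, "Stats"), (2, "CS")])
cur.executemany("INSERT INTO emp VALUES (?, ?, ?, ?)",
                [(1, "Asha", 50000, 1),
                 (2, "Ravi", 60000, 1),
                 (3, "Meera", 55000, 2)])

# Join + aggregate: average salary per department.
cur.execute("""SELECT d.name, AVG(e.salary)
               FROM emp e JOIN dept d ON e.dept_id = d.dept_id
               GROUP BY d.name ORDER BY d.name""")
rows = cur.fetchall()
print(rows)  # [('CS', 55000.0), ('Stats', 55000.0)]
```

The same queries run unchanged against a file-backed SQLite database; only the connection string differs.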
Evaluation Pattern

CIA - 50%

ESE - 50%

MST273B - SURVIVAL ANALYSIS (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course provides an introduction to the principles and methods for the analysis of time-to-event data. This type of data occurs extensively in both observational and experimental biomedical and public health studies.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Explore the fundamental concepts of survival models.

CO2: Analyse survival data using various parametric models.

CO3: Identify non-parametric survival techniques for applications to lifetime data.

CO4: Demonstrate the understanding of various competing risks and their effects.
Unit-1
Teaching Hours:15
Basic quantities and censoring

The hazard and survival functions - Mean residual life function - Competing risks - Right, left and interval censoring, truncation - Likelihood for censored and truncated data - Parametric and non-parametric estimation in truncated and censored cases.

Practical Assignments:

1.Lab exercise on the parametric estimation of left and right-censored data

2.Lab exercise on the parametric estimation of truncated data

3.Lab exercise on the non-parametric estimation of censored and truncated data

Unit-2
Teaching Hours:15
Parametric Survival Models

Parametric forms and the distribution of log time - The exponential - Weibull - Gompertz - Gamma - Generalized Gamma - Coale-McNeil and generalized F distributions - The U.S. life table - Approaches to modelling the effects of covariates - Parametric families - Proportional hazards (PH) models - Accelerated failure time (AFT) models - The intersection of PH and AFT - Proportional odds (PO) models - The intersection of PO and AFT - Recidivism in the U.S.

Practical Assignments:

4. Lab exercise on parametric modelling of survival data

5. Lab exercise on the proportional hazard model

6. Lab exercise on AFT models

Unit-3
Teaching Hours:15
Non-Parametric Survival Models

One-sample estimation with censored data - The Kaplan-Meier estimator - Greenwood's formula - The Nelson-Aalen estimator - The expectation of life - Comparison of several groups: Mantel-Haenszel and the log-rank test.

Regression: Cox's model and partial likelihood - The score and information - The problem of ties - Tests of hypotheses - Time-varying covariates - Estimating the baseline survival - Martingale residuals.

Practical Assignments:

7.Lab exercise on Kaplan-Meier estimator and Nelson-Aalen estimator

8.Lab exercise on Mantel-Haenszel and the log-rank test

9.Lab exercise on the Cox model with time-varying covariate
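The product-limit (Kaplan-Meier) estimator from this unit can be sketched in pure Python; the right-censored sample below is hypothetical:

```python
# Hypothetical right-censored data: event = 1 means the failure time was observed,
# event = 0 means the subject was censored at that time.
times  = [3, 5, 5, 7, 8, 10, 12, 12]
events = [1, 1, 0, 1, 0, 1, 1, 0]

# Distinct observed event times, in increasing order.
event_times = sorted({t for t, e in zip(times, events) if e == 1})

survival = 1.0
km = {}
for t in event_times:
    at_risk = sum(1 for ti in times if ti >= t)  # n_i: subjects still at risk
    deaths = sum(1 for ti, ei in zip(times, events)
                 if ti == t and ei == 1)         # d_i: events at time t
    survival *= 1.0 - deaths / at_risk           # product-limit step (1 - d_i/n_i)
    km[t] = survival

for t in event_times:
    print(t, round(km[t], 4))
```

Censored observations contribute to the risk sets n_i but never to the event counts d_i, which is exactly how censoring enters the product-limit formula; Greenwood's formula would then supply pointwise variances for the estimated curve.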

Unit-4
Teaching Hours:15
Models for Discrete Data and Extensions

Cox's discrete logistic model and logistic regression - Modelling grouped continuous data and the complementary log-log transformation - Piece-wise constant hazards and Poisson regression - Current status data versus retrospective data - Open intervals and time since the last event - Backward recurrence times - Interval censoring.

Practical Assignments:

10.Lab exercise on the discrete logistic model for survival data

11.Lab exercise on Poisson regression for survival data

12.Lab exercise on Piece-wise regression for survival data

Unit-5
Teaching Hours:15
Models for Competing Risks

Modelling multiple causes of failure - Research questions of interest - Cause-specific hazards - Overall survival - Cause-specific densities - Estimation: one-sample and the generalized Kaplan-Meier and Nelson-Aalen estimators - The incidence function - Regression models - Weibull regression - Cox regression and partial likelihood - Piece-wise exponential survival and multinomial logits - The identification problem - Multivariate and marginal survival - The Fine-Gray model.

Practical Assignments:

13.Lab exercise on non-parametric modelling of competing risk data

14.Lab exercise on parametric modelling of competing risk data

15.Lab exercise on multivariate survival data

Text Books And Reference Books:

1. Klein, J. P., & Moeschberger, M. L. (2006). Survival analysis: techniques for censored and truncated data. Springer Science & Business Media.

2. Cleves, M.; W. G. Gould, and J. Marchenko (2016). An Introduction to Survival Analysis using Stata. Revised 3rd Ed. College Station, Texas: Stata Press.

3. Kalbfleisch, J. D., & Prentice, R. L. (2011). The statistical analysis of failure time data,2nd Ed. John Wiley & Sons.

4. Moore, D. F. (2016). Applied survival analysis using R. Switzerland: Springer.

Essential Reading / Recommended Reading

1. Singer, J.D. and J. B. Willett (2003). Applied Longitudinal Data Analysis: Modeling Change and Event Occurrence. Oxford: Oxford University Press.

2. Therneau, T. M. and P. M. Grambsch (2000). Modelling Survival Data: Extending the Cox Model, Springer, NY

3. Collett, D. (2015). Modelling survival data in medical research. Chapman and Hall/CRC.

Evaluation Pattern

CIA - 50%

ESE - 50%

MST273C - OPTIMIZATION TECHNIQUES (2021 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:100
Credits:4

Course Objectives/Course Description

This course is designed to train the students to develop their modelling skills in mathematics through various methods of optimization. The course helps the students to understand the theory of optimization methods and algorithms developed for solving various types of optimization problems.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Understand and apply linear programming problems.

CO2: Solve one-dimensional and multidimensional optimization problems.

CO3: Understand multidimensional constrained and unconstrained optimization problems.

CO4: Apply geometric and dynamic programming problems.

CO5: Solve nonlinear problems through their linear approximation.
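As a small illustration of one-dimensional optimization (CO2), a golden-section search can be sketched in Python; the objective function here is a hypothetical unimodal example:

```python
import math

# Golden-section search for the minimum of a unimodal function on [a, b].
def golden_section(f, a, b, tol=1e-8):
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, approximately 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):      # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Hypothetical objective with its minimum at x = 2.
x_min = golden_section(lambda x: (x - 2) ** 2 + 1, 0, 5)
print(round(x_min, 6))  # 2.0
```

Each iteration shrinks the bracketing interval by the constant factor 1/phi while reusing one interior function value, which is what distinguishes this method from naive interval trisection.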

Unit-1
Teaching Hours:15
Linear Programming Problems (LPP)

Introduction to optimization - Convex sets and convex functions - Simplex method: iterative nature of the simplex method - Additional simplex methods: duality concept - Dual simplex method - Generalized simplex algorithm - Revised simplex method: revised simplex algorithm - Development of the optimality and feasibility conditions.

Practical Assignments:

1. Formulate the LPP.
2. Solve the LPP using the simplex method.
3. Solve the LPP using the revised simplex method.

Unit-2
Teaching Hours:15
Integer Linear Programming

Branch and bound algorithm - Cutting plane algorithm - Transportation problem: north-west corner method, least-cost method, Vogel's approximation and method of multipliers - Assignment problem: mathematical statement, Hungarian method, variations of assignment problems.

Practical Assignments:

4. Solve integer LPP by the cutting plane method.
5. Formulate and solve transportation problems.
6. Formulate and solve assignment problems.

Unit-3
Teaching Hours:15
Non-linear Programming

Introduction - Unimodal functions - One-dimensional optimization: Fibonacci method - Golden section method - Quadratic interpolation methods - Cubic interpolation methods - Direct root methods: Newton's method and quasi-Newton methods - Multidimensional unconstrained optimization: univariate method - Hooke and Jeeves method - Fletcher-Reeves method - Newton's method and quasi-Newton methods.

Practical Assignments:

7. Solve a non-linear programming problem.
8. Solve an unconstrained optimization problem by the univariate method.

Unit-4
Teaching Hours:15
Classical optimization techniques

Single-variable optimization - Multivariable optimization with no constraints: semi-definite case and saddle point - Multivariable optimization with equality constraints: direct substitution - Method of constrained variation - Method of Lagrange multipliers - Kuhn-Tucker conditions - Constraint qualification - Convex programming problem.

Practical Assignments:

9. Solve a single-variable optimization problem.
10. Solve multivariable optimization problems with equality constraints.
11. Solve a convex optimization problem.

Unit-5
Teaching Hours:15
Geometric and Dynamic programming

Unconstrained minimization problem - Solution of an unconstrained geometric programming problem using the arithmetic-geometric inequality method - Primal-dual relationship - Constrained minimization - Dynamic programming: dynamic programming algorithm - Solution of a linear programming problem by dynamic programming.

Practical Assignments:

12. Formulate and solve a dynamic programming problem.
13. Solve an LPP through dynamic programming.
14. Solve a geometric programming problem.

Text Books And Reference Books:

1. H. A. Taha (2017), Operations Research - An Introduction, 10th Edition, Prentice-Hall of India, New Delhi.
2. S. S. Rao (2019), Engineering Optimization, 5th Edition, New Age International Pvt. Ltd., Publishers, Delhi.

Essential Reading / Recommended Reading

1. J. K. Sharma (2010), Quantitative Techniques for Managerial Decisions, Macmillan.
2. Hadley, G. (2002), Linear Programming, Addison Wesley.
3. G. Srinivasan (2007), Operations Research: Principles & Applications, Prentice Hall of India, New Delhi, India.

Evaluation Pattern

CIA - 50%

ESE - 50%

MST281 - RESEARCH PROBLEM IDENTIFICATION AND FORMULATION (2021 Batch)

Total Teaching Hours for Semester:30
No of Lecture Hours/Week:2
Max Marks:50
Credits:1

Course Objectives/Course Description

The course inculcates a research culture that enhances students' employability skills.

Course Outcome

CO1: Demonstrate the objective and data collection methodology for a research problem.
Unit-1
Teaching Hours:30
Problem Identification

Students will do the following,

1. Identify a domain for the research project

2. Literature survey

3. Identifying the existing methodology and models

4. Writing a problem statement

5. Project presentation at the end of the process

Text Books And Reference Books:

-

-

Evaluation Pattern

CIA - 50%

ESE - 50%

MST331 - STATISTICAL INFERENCE II (2020 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

 This course is designed to provide the strong conceptual foundations of testing of hypothesis, procedures of testing hypothesis, likelihood ratio tests, sequential tests, and non-parametric tests.

Course Outcome

CO1: Demonstrate the understanding of basic concepts of robust estimation and testing of hypotheses.

CO2: Apply the procedures of testing hypotheses for solving real-life problems.

CO3: Apply various non-parametric tests and draw conclusions to real-life problems.

CO4: Develop appropriate tests for testing specific statistical hypotheses.

CO5: Draw conclusions about the population with the help of various estimation and testing procedures.

Unit-1
Teaching Hours:14
Robust estimation

Robust estimation: The influence curve and empirical influence curve - M-estimation: Median, Trimmed and winsorized mean - Influence curve for M-estimators -  Limiting distribution of M-estimators - Resampling methods: Quenouille’s Jackknife estimation, parametric and non-parametric bootstrap methods.
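Quenouille's jackknife from this unit can be sketched in pure Python. The data are hypothetical, and the estimator under study is the sample mean, for which the jackknife standard error coincides exactly with the classical s/sqrt(n):

```python
# Hypothetical sample; the estimator under study is the sample mean.
data = [4.2, 3.9, 5.1, 4.8, 4.4, 5.3, 4.0, 4.6]
n = len(data)

def mean(xs):
    return sum(xs) / len(xs)

theta_hat = mean(data)

# Leave-one-out ("delete-one") estimates.
loo = [mean(data[:i] + data[i + 1:]) for i in range(n)]
loo_bar = mean(loo)

# Jackknife bias and variance estimates.
jack_bias = (n - 1) * (loo_bar - theta_hat)  # essentially 0: the mean is unbiased
jack_var = (n - 1) / n * sum((t - loo_bar) ** 2 for t in loo)
jack_se = jack_var ** 0.5

print(round(jack_bias, 6), round(jack_se, 4))
```

For a nonlinear statistic such as a ratio or a correlation, the same leave-one-out loop applies unchanged and the bias term is generally nonzero.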

Unit-2
Teaching Hours:10
Neyman-Pearson theory of testing of hypotheses

Basic concepts in statistical hypotheses testing - Simple and composite hypothesis - Critical regions - Type-I and Type-II error - Significance level - p-value and power of a test - Randomised and non-randomized tests - Neyman-Pearson lemma and its applications - Generalization of NP lemma - Construction of tests using NP lemma - Most powerful test - Uniformly most powerful test - Monotone Likelihood Ratio (MLR) property - Testing in one-parameter exponential families - Unbiased and invariant tests - Locally most powerful tests.

Unit-3
Teaching Hours:12
Uniformly most powerful tests

One-sided uniformly most powerful tests - Unbiased and Uniformly Most Powerful Unbiased tests for different two-sided hypothesis - Extension of these results to Pitman family when only upper or lower end depends on the parameters - UMP test from α-similar tests and α-similar tests with Neyman structure.

Unit-4
Teaching Hours:12
Likelihood procedure of testing of hypotheses

Likelihood ratio test (LRT) - asymptotic properties - LRT for the parameters of binomial and normal distributions - Generalized likelihood ratio tests - Chi-Square tests - t-tests - F-tests - Need for sequential tests - Sequential Probability Ratio Test (SPRT) - Wald’s fundamental identity - OC and ASN functions - Applications to Binomial, Poisson, and Normal distributions.

Unit-5
Teaching Hours:12
Basics of non-parametric tests

Non-parametric tests: Sign test - Chi-square tests - Kolmogorov-Smirnov one-sample and two-sample tests - Median test - Wilcoxon signed rank test - Mann-Whitney U-test - Test for randomness - Runs up and runs down test - Wald-Wolfowitz run test for equality of distributions - Kruskal-Wallis one-way analysis of variance - Friedman's two-way analysis of variance - Power and asymptotic relative efficiency.
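The sign test from this unit can be sketched in pure Python with an exact binomial p-value; the sample and hypothesised median are hypothetical:

```python
from math import comb

# Hypothetical sample; H0: median = 5.0 against a two-sided alternative.
sample = [5.2, 4.8, 6.1, 5.9, 6.4, 5.5, 6.8, 6.0, 5.7, 6.3]
m0 = 5.0

diffs = [x - m0 for x in sample if x != m0]  # observations tied with m0 are dropped
n = len(diffs)
s_plus = sum(1 for d in diffs if d > 0)  # test statistic: number of positive signs

# Under H0, s_plus ~ Binomial(n, 1/2); exact two-sided p-value.
k = min(s_plus, n - s_plus)
p_value = min(1.0, 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n)

print(s_plus, round(p_value, 4))  # 9 0.0215
```

Because the p-value is below 0.05, the sign test would reject the hypothesised median here; the Wilcoxon signed rank test would additionally use the magnitudes of the differences.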

Text Books And Reference Books:

1. Rohatgi, V. K. and Saleh, A.K.M. (2015). An Introduction to Probability and Statistics, John Wiley and Sons.

2. Lehmann, E. L. and Romano, J. P. (2005). Testing Statistical Hypotheses, 2/e, John Wiley, New York.

Essential Reading / Recommended Reading

1. Srivastava, M.K., Khan, A.H. and Srivastava, N. (2014). Statistical Inference - Testing of Hypothesis, Prentice Hall India, New Delhi.

2. Rajagopalan, M. and Dhanavanthan, P. (2012). Statistical Inference, PHI Learning Pvt Ltd, New Delhi.

3. Kendall, M.G. and Stuart, A. (1967). The Advanced Theory of Statistics, Vol. 2, 2nd Edition. Mc-Millan, New York.

4. Kale, B. K. and Muralidharan, K. (2015). Parametric Inference: An Introduction. Alpha Science Int. Ltd.

5. Mukhopadhyay, P. (2015). Mathematical Statistics, Books and Allied (P) Ltd., Kolkata.

6. Gibbons, J.K. (1971). Non-Parametric Statistical Inference, McGraw Hill.
Evaluation Pattern
CIA I: 10 marks
Mid Semester Examination (CIA II): 25 marks
CIA III: 10 marks
Attendance: 05 marks
End Semester Exam: 50 marks
Total: 100 marks

MST332 - MULTIVARIATE ANALYSIS (2020 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

The exposure provided to the multivariate data structure, multinomial and multivariate normal distribution, estimation and testing of parameters, various data reduction methods would help the students in having a better understanding of research data, its presentation and analysis. This course helps to understand multivariate data analysis methods and their applications in various research areas.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Describe concepts of multivariate normal distribution.

CO2: Demonstrate the concepts of MANOVA and MANCOVA.

CO3: Identify various classification methods for multivariate data.

CO4: Analyze various data reduction methods for the multivariate data structure.

CO5: Interpret the results of various multivariate methods.

Unit-1
Teaching Hours:12
Multivariate Distributions

Basic concepts of multivariate variables - Multivariate normal distribution - Marginal and conditional distributions - Concept of a random vector - Its expectation and variance - Covariance matrix - Marginal and joint distributions - Conditional distributions and independence of random vectors - Multinomial distribution - Characteristic functions in higher dimensions - Multiple regression and multiple correlation - Partial regression and partial correlation (illustrative examples).

Unit-2
Teaching Hours:12
MANOVA and MANCOVA

Multivariate analysis of variance (MANOVA) and Covariance (MANCOVA) of one and two-way classified data with their interactions - Univariate and Multivariate Two-Way Fixed-effects Model with Interaction.

Unit-3
Teaching Hours:12
Equality of Mean and Variance Vector

Wishart distribution (definition, properties) - Construction of tests - Union-intersection and likelihood ratio principles - Inference on the mean vector - Hotelling's T2 - Comparing mean vectors from two populations - Bartlett's test.

Unit-4
Teaching Hours:12
Classification and Discriminant Procedures

Concepts of discriminant analysis - Computation of linear discriminant function (LDF) - Classification between k multivariate normal populations based on LDF - Fisher’s Method for discriminating two or several populations - Evaluating Classification Functions - Probabilities of misclassification and their estimation - Mahalanobis D2.

Unit-5
Teaching Hours:12
Factor Analysis and Cluster Analysis

Cluster Analysis: Distances and similarity measures - Hierarchical clustering methods - K-means method.
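The K-means method from this unit can be sketched in pure Python on two well-separated hypothetical clusters:

```python
# Hypothetical 2-D points forming two clearly separated clusters.
points = [(1.0, 1.0), (1.5, 2.0), (1.2, 1.8),   # cluster near (1.2, 1.6)
          (8.0, 8.0), (8.5, 8.2), (7.8, 7.7)]   # cluster near (8.1, 8.0)

def dist2(p, q):
    """Squared Euclidean distance between two points."""
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

centroids = [points[0], points[3]]  # crude initialisation from the data
for _ in range(10):
    # Assignment step: each point joins its nearest centroid.
    labels = [min(range(2), key=lambda k: dist2(p, centroids[k])) for p in points]
    # Update step: each centroid becomes the mean of its cluster.
    for k in range(2):
        members = [p for p, l in zip(points, labels) if l == k]
        centroids[k] = (sum(p[0] for p in members) / len(members),
                        sum(p[1] for p in members) / len(members))

print(labels)  # [0, 0, 0, 1, 1, 1]
print(centroids)
```

The two alternating steps each reduce the within-cluster sum of squares, so the algorithm converges, though only to a local optimum that depends on the initial centroids.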

Text Books And Reference Books:

1. Anderson, T.W. (2004). An Introduction to Multivariate Statistical Analysis. John Wiley, New York.

2. Johnson, R.A. and Wichern, D.W. (2018). Applied Multivariate Statistical Analysis, 6th edn. Prentice-Hall, London.

Essential Reading / Recommended Reading

1. Rohatgi, V.K. and Saleh, A.K.M.E. (2015). An Introduction to Probability Theory and Mathematical Statistics, 2nd edn. John Wiley & Sons, New York.

2. Srivastava, M.S. and Khatri, C.G. (1979). An Introduction to Multivariate Statistics. North-Holland.

3. Muirhead, R.J. (1982). Aspects of Multivariate Statistical Theory. John Wiley, New York.

Evaluation Pattern
CIA I: 10 marks
Mid Semester Examination (CIA II): 25 marks
CIA III: 10 marks
Attendance: 05 marks
End Semester Exam: 50 marks
Total: 100 marks

MST371 - TIME SERIES ANALYSIS (2020 Batch)

Total Teaching Hours for Semester:90
No of Lecture Hours/Week:6
Max Marks:150
Credits:5

Course Objectives/Course Description

The course considers statistical techniques for evaluating processes occurring through time. It introduces students to time series methods and the applications of these methods to different types of data in various fields. Time series modelling techniques including AR, MA, ARMA, ARIMA, and SARIMA models will be considered with reference to their use in forecasting. The objective of this course is to equip students with various forecasting techniques and to familiarize them with modern statistical methods for analyzing time-series data.

Course Outcome

CO1: Demonstrate the understanding of basic concepts of analyzing time series, including white noise, trend, seasonality, cyclical component, autocovariance and autocorrelation function.

CO2:  Apply the concept of stationarity to the analysis of time-series data in various contexts.

CO3: Select an appropriate model, fit parameter values, perform residual analysis, and carry out forecasting calculations.

CO4: Apply various techniques of seasonal time series models, including the seasonal autoregressive integrated moving average (SARIMA) models and Winters exponential smoothing.

CO5: Demonstrate the principles behind modern forecasting techniques, which includes obtaining the relevant data and carrying out the necessary computation using R software.
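As a small illustration of the trend concepts in CO1, moving-average smoothing and first differencing can be sketched in a few lines (shown here in Python, though the course itself uses R); the series is hypothetical:

```python
# Hypothetical series with an upward trend.
series = [12, 14, 13, 16, 18, 17, 20, 22, 21, 24]
window = 3  # odd window length gives a simple centred moving average

half = window // 2
trend = [sum(series[i - half:i + half + 1]) / window
         for i in range(half, len(series) - half)]  # smoothed trend estimate

# First differencing, an alternative way to remove a linear trend.
diff = [b - a for a, b in zip(series, series[1:])]

print(trend[0])   # 13.0, the mean of the first three observations
print(diff[:3])   # [2, -1, 3]
```

Seasonal differencing works the same way with lag s instead of lag 1, removing a seasonal component of period s rather than the trend.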

Unit-1
Teaching Hours:20
Basic concepts in time series analysis

Stochastic process - time series as a discrete parameter stochastic process - auto-covariance - autocorrelation and their properties - exploratory time series analysis - graphical analysis - classical decomposition model - concepts of trend, seasonality and cycle - estimation of trend and seasonal components - elimination of trend and seasonality - method of differencing - moving average smoothing - method of seasonal differencing.

Practical Assignments:

1. Graphical representation of time series, plots of ACF and PACF and their interpretation
2. Examples of trend, seasonal and cyclical time series and estimation of trend and seasonal components
3. Exercise on moving average smoothing to eliminate trend and illustration of the method of differencing to eliminate trend and seasonality
4. Exercise on least-squares fitting to estimate and eliminate the trend component
Unit-2
Teaching Hours:20
Stationary time series models

Stationary time series models - concepts of weak and strong stationarity - general linear process - Auto-Regressive (AR), Moving Average (MA) and Auto-Regressive Moving Average (ARMA) processes - their properties - conditions for stationarity and invertibility - model identification based on ACF and PACF - maximum likelihood estimation - Yule-Walker estimation - order selection (AIC and BIC) - residual analysis - Box-Jenkins methodology for the identification of stationary time series models.

Practical Assignments:

5. Exercise on fitting AR model
6. Exercise on fitting MA model
7. Exercise on fitting ARMA model
8. Model identification using ACF and PACF, model selection using AIC and BIC
9. Residual analysis and diagnostic check for AR, MA and ARMA models
Unit-3
Teaching Hours:15
Non-stationary time series models

Concept of non-stationarity - spurious trends and regressions - unit root tests: Dickey-Fuller (DF) test - Augmented Dickey-Fuller (ADF) test - Auto-Regressive Integrated Moving Average (ARIMA(p,d,q)) models - difference equation form of ARIMA - random shock form of ARIMA - inverted form of ARIMA.

Practical Assignments:

10. Exercise on the identification of non-stationary series from various plots
11. Exercise on testing non-stationarity using the ADF test and on fitting ARIMA models
12. Residual analysis and diagnostic check for the ARIMA model
Unit-4
Teaching Hours:15
Seasonal time series models

Analysis of seasonal models - parsimonious models for seasonal time series - seasonal unit root test (HEGY test) - general multiplicative seasonal models - seasonal ARIMA models - estimation - residual analysis for seasonal time series.

Practical Assignments:

13. Exercise on the identification of additive and multiplicative time series
14. Exercise on testing the presence of seasonality and on fitting seasonal ARIMA models
15. Residual analysis and diagnostic check for the seasonal ARIMA model
Unit-5
Teaching Hours:20
Forecasting Techniques

In-sample and out-of-sample forecasts - simple exponential and moving average smoothing - Holt exponential smoothing - Winters exponential smoothing - forecasting trend and seasonality in the Box-Jenkins model: method of minimum mean squared error (MMSE) forecast - their properties - forecast error.

Practical Assignments:

16. Exercise on simple exponential smoothing and Holt exponential smoothing
17. Exercise on Winters exponential smoothing
18. Exercise on forecasting using ARIMA models
19. Exercise on forecasting using seasonal ARIMA models

Text Books And Reference Books:

1. Box, G. E., Jenkins, G. M., Reinsel, G. C., & Ljung, G. M. (2015). Time series analysis: forecasting and control. John Wiley & Sons.
2. Chatfield, C., & Xing, H. (2019). The analysis of time series: an introduction with R. CRC Press.

Essential Reading / Recommended Reading

1. Hamilton, J. D. (2020). Time series analysis. Princeton University Press.
2. Brockwell, P. J., & Davis, R. A. (2016). Introduction to time series and forecasting. Springer.

Evaluation Pattern

CIA - 50%
ESE - 50%

MST372A - STATISTICAL MACHINE LEARNING (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

Machine learning has a wide array of applications in different fields, such as biomedical research, reliability of large structures, space research and digital marketing. This course will equip students with a wide variety of models and algorithms for machine learning and prepare them for research or industry application of machine learning techniques.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate the understanding of basic concepts of statistical machine learning.
CO2: Apply classification algorithms for qualitative data.
CO3: Analyse high-dimensional data using principal component regression learning algorithms.
CO4: Construct classification and regression trees by random forests.
CO5: Create a statistical learning model using support vector machines.
Unit-1
Teaching Hours:15
Statistical learning

Statistical learning: definition - prediction accuracy and model interpretability - supervised and unsupervised learning - assessing model accuracy - important problems in data mining: classification, regression, clustering, ranking, density estimation - concepts: training and testing, cross-validation, overfitting, bias/variance tradeoff, regularized learning equation - simple and multiple linear regression algorithms.

Practical Assignments:

1. Lab exercise on data preparation and using simple linear regression
2. Lab exercise on model assessment for simple linear regression
3. Lab exercise on data preparation with multiple linear regression
Unit-2
Teaching Hours:15
Classification algorithms

Logistic model - training and testing the model - linear discriminant analysis - quadratic discriminant analysis - use of Bayes' theorem - k-nearest neighbours - Naive Bayes - AdaBoost.

Practical Assignments:

4. Lab exercise on the logistic model
5. Lab exercise on discriminant analysis
6. Lab exercise on Naive Bayes and k-NN classifiers
Unit-3
Teaching Hours:15
Linear model selection and regularization

Optimal model - shrinkage methods: ridge and lasso regression - dimension reduction methods: principal component (PC) regression and partial least squares (PLS) regression - non-linear models: regression splines - polynomial - generalized additive models.

Practical Assignments:

7. Lab exercise on ridge regression
8. Lab exercise on lasso regression
9. Lab exercise on PC regression
10. Lab exercise on PLS regression
Unit-4
Teaching Hours:15
Tree-based methods

Decision tree - regression trees - bagging - random forests - boosting - classification trees - trees vs linear models.

Practical Assignments:

11. Lab exercise on decision trees
12. Lab exercise on regression trees
13. Lab exercise on random forests
14. Lab exercise on classification trees
Unit-5
Teaching Hours:15
Support vector machines and resampling procedures

Maximal margin classifier - support vector classifiers - support vector machines - rank boost (ranking algorithm) - hierarchical Bayesian modelling for density estimation - resampling techniques - bootstrap - clustering algorithms: k-means algorithm.

Practical Assignments:

15. Lab exercise on SVM classifier
16. Lab exercise on rank boost algorithm
17. Lab exercise on kernel density estimation
18. Lab exercise on k-means clustering

Text Books And Reference Books:

1. James, G., Witten, D., Hastie, T., & Tibshirani, R. (2013). An introduction to statistical learning (Vol. 112). New York: Springer.

Essential Reading / Recommended Reading

1. Gutierrez, D. D. (2015). Machine learning and data science: an introduction to statistical learning methods with R. Technics Publications.
2. Müller, A. C., & Guido, S. (2016). Introduction to machine learning with Python: a guide for data scientists. O'Reilly Media, Inc.
3. Murphy, K. P. (2012). Machine learning: a probabilistic perspective. MIT Press.

Evaluation Pattern

CIA - 50%
ESE - 50%

MST372B - BIOSTATISTICS (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course provides an understanding of various statistical methods for describing and analyzing biological data. Students will gain an understanding of the applications of statistical hypothesis testing, related concepts and their interpretation for biological data.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate the understanding of basic concepts of biostatistics and the process involved in the scientific method of research.
CO2: Identify how the data can be appropriately organized and displayed.
CO3: Interpret the measures of central tendency and measures of dispersion.
CO4: Interpret the data based on the discrete and continuous probability distributions.
CO5: Apply parametric and non-parametric methods of statistical data analysis.
Unit-1
Teaching Hours:15
Introduction to Biostatistics

Presentation of data - graphical and numerical representations of data - types of variables - measures of location, dispersion and correlation - inferential statistics - probability and distributions - Binomial, Poisson, Negative Binomial, Hypergeometric and Normal distributions.

Practical Assignments:

1. Exercise on the representation of data
2. Exercise on reporting data by descriptive statistics
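The descriptive-statistics exercise above can be sketched with Python's standard `statistics` module; the blood-pressure readings below are hypothetical illustrative data, not part of the syllabus.

```python
# A minimal sketch of reporting descriptive statistics with Python's
# standard "statistics" module (hypothetical data).
import statistics

# Hypothetical sample: systolic blood pressure readings (mmHg)
bp = [118, 122, 130, 125, 119, 134, 128, 121, 127, 126]

mean = statistics.mean(bp)
median = statistics.median(bp)
sd = statistics.stdev(bp)                    # sample standard deviation (n - 1)
q1, q2, q3 = statistics.quantiles(bp, n=4)   # quartiles (exclusive method)

print(f"mean={mean:.1f}, median={median:.1f}, sd={sd:.2f}, IQR={q3 - q1:.2f}")
```

In practice these same summaries would be reported alongside the graphical displays of Unit-1.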
Unit-2
Teaching Hours:15
Parametric and Non - Parametric methods

Parametric methods - one sample t-test - independent sample t-test - paired sample t-test - one-way analysis of variance - two-way analysis of variance - analysis of covariance - repeated measures analysis of variance - Pearson correlation coefficient - non-parametric methods: chi-square test of independence and goodness of fit - Mann-Whitney U test - Wilcoxon signed-rank test - Kruskal-Wallis test - Friedman's test - Spearman's correlation test.

Practical Assignments:

1. Exercise on various parametric methods of analysis
2. Exercise on various non-parametric methods of analysis
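As a sketch of the non-parametric methods above, the Mann-Whitney U statistic can be computed by direct pairwise comparison; the treatment-group data below are hypothetical.

```python
# Minimal sketch: Mann-Whitney U by counting pairs (hypothetical data).
def mann_whitney_u(x, y):
    """U for sample x versus y: number of (xi, yj) pairs with xi > yj,
    counting ties as 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical recovery times (days) in two treatment groups
a = [3, 4, 2, 6, 2, 5]
b = [9, 7, 5, 10, 6, 8]

u_a = mann_whitney_u(a, b)
u_b = mann_whitney_u(b, a)
assert u_a + u_b == len(a) * len(b)   # identity: U_A + U_B = n1 * n2
```

A very small U for one group (here `u_a`) suggests its values tend to be smaller than the other group's; significance would then be assessed from U tables or a normal approximation.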
Unit-3
Teaching Hours:15
Generalized linear models

Review of simple and multiple linear regression - introduction to generalized linear models - parameter estimation of generalized linear models - models with different link functions - binary (logistic) regression - estimation and model fitting - Poisson regression for count data - mixed effect models and hierarchical models with practical examples.

Practical Assignments:

1. Exercise on simple linear and multiple linear regression
2. Exercise on logistic regression
3. Exercise on Poisson regression
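The simple linear regression review above has a closed-form least-squares solution, sketched here with hypothetical data.

```python
# Minimal sketch: closed-form OLS slope and intercept for simple linear
# regression (hypothetical data).
def ols(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # sums of cross-products and squares about the means
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    return intercept, slope

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = ols(x, y)
```

Logistic and Poisson regression generalize this by replacing the identity link with logit and log links, fitted by iteratively reweighted least squares rather than a closed form.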
Unit-4
Teaching Hours:15
Epidemiology

Introduction to epidemiology - measures of epidemiology - observational study designs: case report, case series, correlational studies, cross-sectional studies, retrospective and prospective studies - analytical epidemiological studies: case-control study and cohort study - odds ratio - relative risk - bias in epidemiological studies.

Practical Assignments:

1. Exercise on analysis of observational study data
2. Exercise on analysis of cross-sectional study data
3. Exercise on analysis of case-control study data
4. Exercise on analysis of cohort study data
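The odds ratio and relative risk mentioned above come straight from a 2x2 exposure-disease table; the counts below are hypothetical.

```python
# Sketch: odds ratio and relative risk from a hypothetical 2x2 table.
#              Disease+   Disease-
# Exposed        a = 30     b = 70
# Unexposed      c = 10     d = 90
a, b, c, d = 30, 70, 10, 90

odds_ratio = (a * d) / (b * c)        # (a/b) / (c/d), cross-product ratio
risk_exposed = a / (a + b)
risk_unexposed = c / (c + d)
relative_risk = risk_exposed / risk_unexposed
```

Note the design distinction covered in this unit: the relative risk is estimable in cohort studies, while case-control studies support only the odds ratio.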
Unit-5
Teaching Hours:15
Demography

Introduction to demography - mortality and life tables - infant mortality rate - standardized death rates - fertility: crude and specific rates - migration: definition and concepts - population growth and its measurement: arithmetic, geometric and exponential - population projection and estimation - different methods of population projection - logistic curve - urban population growth - components of urban population growth.

Practical Assignments:

1. Exercise on preparing life tables
2. Exercise on measures of population growth
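The arithmetic, geometric and exponential growth measures above can be sketched as projections from a base population; the base population and growth rate below are hypothetical.

```python
import math

# Sketch: arithmetic, geometric and exponential population projections
# from a hypothetical base population P0 over t years at rate r per year.
P0, r, t = 1_000_000, 0.02, 10

arithmetic = P0 * (1 + r * t)        # linear (arithmetic) growth
geometric = P0 * (1 + r) ** t        # compounded annually (geometric)
exponential = P0 * math.exp(r * t)   # continuous compounding (exponential)
```

For the same rate, exponential growth exceeds geometric, which exceeds arithmetic, since compounding acts over shorter intervals.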
Text Books And Reference Books:
1. Marcello Pagano and Kimberlee Gauvreau (2018), Principles of Biostatistics, 2nd Edition, Chapman and Hall/CRC press
2. David Moore S. and George McCabe P., (2017) Introduction to practice of statistics, 9th Edition, W. H. Freeman.
3. Sundar Rao and Richard J., (2012) Introduction to Biostatistics and research methods, PHI Learning Private limited, New Delhi.
Essential Reading / Recommended Reading

1. Abhaya Indrayan and Rajeev Kumar M., (2018) Medical Biostatistics, 4th Edition, Chapman and Hall/CRC Press.
2. Gordis Leon (2018), Epidemiology, 6th Edition, Elsevier, Philadelphia
3. Ram, F. and Pathak K. B., (2016): Techniques of Demographic Analysis,Himalaya Publishing house, Bombay.
4. Park K., (2019), Park's Textbook of Preventive and Social Medicine, Banarsidas Bhanot, Jabalpur.

Evaluation Pattern

50% Continuous Internal Assessment (CIA).

50% End Semester Examination.

MST372C - RELIABILITY ENGINEERING (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course provides knowledge of different probability models used in the reliability evaluation of a system and its components. Reliability engineering is applied in industry to reduce failures, ensure effective maintenance and optimize repair time.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate the understanding of basic concepts of reliability.
CO2: Analyse system reliability using probability models.
CO3: Evaluate reliability from lifetime data using common estimation procedures.
CO4: Create a stress-strength model for system reliability.

Unit-1
Teaching Hours:15
Basic concepts

Reliability of a system - failure rate - mean, variance and percentile residual life: identities connecting them - notions of ageing - IFR, IFRA, NBU, NBUE, DMRL, HNBUE, NBUC, etc. and their mutual implications - TTT transforms and characterization of ageing classes.

Practical Assignments:

1. Exercise on failure rate function and MRL function
2. Exercise on comparison of ageing classes
Unit-2
Teaching Hours:15
Lifetime models

Non-monotonic failure rates and mean residual life functions - study of lifetime models: exponential, Weibull, lognormal, generalized Pareto and gamma, with reference to basic concepts and ageing characteristics - bathtub and upside-down bathtub failure rate distributions.

Practical Assignments:

3. Exercise on exponential lifetime model
4. Exercise on Weibull lifetime model
5. Exercise on bathtub-shaped lifetime model
Unit-3
Teaching Hours:15
System reliability

Reliability of systems with dependent components: parallel and series systems, k-out-of-n systems - ageing properties with dependent and independent components - concepts and measures of dependence in reliability - RCSI, LCSD, PF2, WPQD.

Practical Assignments:

6. Exercise on reliability evaluation of a series system
7. Exercise on reliability evaluation of a parallel system
8. Exercise on reliability evaluation of a k-out-of-n system
9. Exercise on reliability evaluation of a dependent component system
Unit-4
Teaching Hours:15
Reliability estimation

Reliability estimation using MLE: exponential, Weibull and gamma distributions based on censored and non-censored samples - UMVU estimation of the reliability function - Bayesian reliability estimation of exponential and Weibull models.

Practical Assignments:

10. Exercise on ML estimation under non-censored samples
11. Exercise on ML estimation under censored samples
12. Exercise on Bayesian estimation of reliability
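The series, parallel and k-out-of-n structures above can be sketched under the simplifying assumption of independent components; the component reliabilities below are hypothetical.

```python
import math

# Sketch: reliability of series, parallel and k-out-of-n systems,
# assuming independent components (hypothetical reliabilities).
def series(rel):
    p = 1.0
    for r in rel:
        p *= r                 # all components must work
    return p

def parallel(rel):
    q = 1.0
    for r in rel:
        q *= (1 - r)           # system fails only if all fail
    return 1 - q

def k_out_of_n(k, n, p):
    # at least k of n identical components (each reliability p) working
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

rel = [0.9, 0.95, 0.99]
r_series = series(rel)
r_parallel = parallel(rel)
r_2_of_3 = k_out_of_3 = k_out_of_n(2, 3, 0.9)
```

Dependence between components (the RCSI/LCSD notions in Unit-3) changes these formulas; the product forms hold only under independence.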
Unit-5
Teaching Hours:15
Life testing

Life testing: basics - modelling lifetime - Accelerated Life Time (ALT) models - cumulative exposure models (CEM) - exponential CEM - stress-strength reliability - exponential stress-strength model.

Practical Assignments:

13. Exercise on basic life testing procedure
14. Exercise on exponential CEM model
15. Exercise on stress-strength reliability

Text Books And Reference Books:

1. Birolini, A. (2013). Reliability engineering: theory and practice. Springer Science & Business Media.
2. Bain, L. (2017). Statistical analysis of reliability and life-testing models: theory and methods. Routledge.

Essential Reading / Recommended Reading

1. Barlow, R. E., & Proschan, F. (1975). Statistical theory of reliability and life testing: probability models. Florida State Univ Tallahassee.
2. Tobias, P. A., & Trindade, D. (2011). Applied reliability. CRC Press.

Evaluation Pattern

CIA - 50%
ESE - 50%

MST373A - NUMERICAL ANALYSIS (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course deals with the theory and application of different numerical methods to solve complex problems that arise in the modern world of science. The course highlights how numerical algorithms arrive at solutions that are efficient and stable for large-scale systems.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate the understanding of floating-point numbers and the role of errors and their analysis in numerical methods.
CO2: Identify accuracy, consistency, stability and convergence of numerical methods.
CO3: Derive numerical solutions of algebraic and transcendental equations, ordinary differential equations and boundary value problems.
CO4: Interpret, analyse and evaluate results from numerical computations.
Unit-1
Teaching Hours:15
Error analysis and basics of the solution of algebraic equations

Errors and their analysis - floating point representation of numbers - solution of algebraic and transcendental equations: bisection method, fixed-point iteration method, the method of false position, Newton-Raphson method and Muller's method. The solution of linear systems - matrix inversion method - Gauss elimination method - Gauss-Seidel and Gauss-Jacobi iterative methods.

Practical Assignments:

1. Solutions to algebraic equations using bisection and fixed-point methods
2. Solutions to algebraic and transcendental equations using Newton-Raphson and Muller's methods
3. Solving systems of linear equations using matrix inversion and Gauss elimination methods
4. Finding real roots of a system of linear equations using Gauss-Seidel and Gauss-Jacobi iterative methods
Unit-2
Teaching Hours:15
Advanced methods to solve algebraic and transcendental equations

Convergence criterion - Aitken's Δ² process - Sturm sequence method to identify the number of real roots - Bairstow's method - Graeffe's root squaring method - Birge-Vieta method - solution of linear systems of algebraic equations: LU-decomposition methods (Crout's, Cholesky and Doolittle methods) - consistency and ill-conditioned systems of equations - tridiagonal systems of equations - Thomas algorithm.

Practical Assignments:

5. Identifying real roots of algebraic equations using the Sturm sequence method and Bairstow's method
6. Solving equations using the Birge-Vieta method
7. Solving systems of linear equations using LU decomposition methods
8. Examining consistency of a system of equations using tridiagonal systems and the Thomas algorithm
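The bisection method listed in Unit-1 above can be sketched in a few lines; the example equation x³ - x - 2 = 0 is a hypothetical illustration.

```python
# Sketch of the bisection method for a root of a continuous f on [a, b]
# with f(a) * f(b) < 0 (hypothetical example equation).
def bisection(f, a, b, tol=1e-8, max_iter=200):
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "root must be bracketed"
    for _ in range(max_iter):
        m = (a + b) / 2
        fm = f(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m
        if fa * fm < 0:       # root lies in the left half
            b, fb = m, fm
        else:                 # root lies in the right half
            a, fa = m, fm
    return (a + b) / 2

root = bisection(lambda x: x**3 - x - 2, 1, 2)
```

Each iteration halves the bracket, so the error after n steps is (b - a) / 2ⁿ: guaranteed but only linearly convergent, which motivates the faster Newton-Raphson and Muller methods in the same unit.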
Unit-3
Teaching Hours:15
Finite Differences and Interpolation

Finite differences: forward difference, backward difference and shift operators - separation of symbols - Newton's formula for interpolation - Lagrange's interpolation formulae - numerical differentiation - numerical integration: trapezoidal rule, Simpson's one-third rule and Simpson's three-eighths rule. Numerical ODE: Taylor's series - Picard's method - Euler's method - modified Euler's method - Runge-Kutta method.

Practical Assignments:

9. Exercise on Newton's and Lagrange's interpolation formulae
10. Integration using the trapezoidal rule and Simpson's rules
11. Numerical solutions for ODEs using Taylor's series and Picard's method
12. Numerical solutions for ODEs using Euler's method, modified Euler's method and the Runge-Kutta method
Unit-4
Teaching Hours:15
Advanced Numerical Integration

Lagrange, Hermite and cubic spline methods - with uniqueness and error term - polynomial interpolation: Chebyshev and rational function approximation - Gaussian quadrature - Gauss-Legendre and Gauss-Chebyshev formulas.

Practical Assignments:

13. Integration using Hermite and cubic spline methods
14. Interpolation for polynomial equations using Chebyshev and rational function approximation
15. Interpolation for polynomial equations using Gaussian quadrature and Gauss-Legendre formulas
16. Numerical integration through Gauss-Chebyshev formulas
Unit-5
Teaching Hours:15
Advanced numerical solutions of Ordinary Differential equations

Initial value problems - multistep methods - Adams-Moulton method - stability (convergence and truncation error) - boundary value problems: second order finite difference method - first, second and third types by the shooting method - Rayleigh-Ritz method - Galerkin method.

Practical Assignments:

17. Solution of initial value problems using multistep and Adams-Moulton methods
18. Solving ODEs using shooting, Rayleigh-Ritz and Galerkin methods

Text Books And Reference Books:

1. C. F. Gerald and P. O. Wheatley, Applied Numerical Analysis, 7th Edition, Pearson Publications, reprint 2017.
2. M. K. Jain, S. R. K. Iyengar and R. K. Jain, Numerical Methods: Problems and Solutions, 3rd Edition, New Age Pvt. Pub, New Delhi, reprint 2020.

Essential Reading / Recommended Reading

1. R. L. Burden and J. Douglas Faires, Numerical Analysis, 9th Edition, Boston: Cengage Learning, 2011.
2. S. C. Chopra and P. C. Raymond, Numerical Methods for Engineers, New Delhi: Tata McGraw-Hill, 2010.
3. Graham W. Griffiths, Numerical Analysis using R: Solutions to ODEs and PDEs, Cambridge University Press, 2016.
4. Jaan Kiusalaas, Numerical Methods in Engineering with Python 3, Cambridge University Press, 2013.

Evaluation Pattern

CIA - 50%
ESE - 50%

MST373B - NON-PARAMETRIC METHODS (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course provides the basic theory and computing tools to perform nonparametric tests, including the sign test, Wilcoxon signed-rank test and median test; the Kruskal-Wallis test for one-way layouts and multiple comparisons; linear rank tests for location and scale parameters; and measures of association in bivariate populations. The aim of the course is the in-depth presentation and analysis of the most common methods and techniques of non-parametric statistics.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Compare different nonparametric hypothesis tests in two-sample problems.
CO2: Construct interval estimators for population medians and other population parameters based on rank-based methods.
CO3: Formulate, test and interpret various hypothesis tests for location, scale and independence problems.
CO4: Demonstrate different measures of association for bivariate samples.
Unit-1
Teaching Hours:15
One-Sample and Paired-Sample Procedures

The quantile function - the empirical distribution function - statistical properties of order statistics - confidence interval for a population quantile - hypothesis testing for a population quantile - the sign test and confidence interval for the median - rank-order statistics - treatment of ties in rank tests - Wilcoxon signed-rank test and confidence interval.

Practical Assignments:

1. Exercise on confidence interval estimation and hypothesis test for a population quantile
2. Exercise on sign test and confidence interval for the median
3. Exercise on rank-order statistics and treatment of ties in rank tests
4. Exercise on Wilcoxon signed-rank test and confidence interval
Unit-2
Teaching Hours:15
The General two-sample problem

Wald-Wolfowitz runs test - Kolmogorov-Smirnov two-sample test - median test - the control median test - the Mann-Whitney U test.

Practical Assignments:

5. Exercise on Wald-Wolfowitz runs test
6. Exercise on Kolmogorov-Smirnov two-sample test
7. Exercise on median test and control median test
8. Exercise on Mann-Whitney U test
Unit-3
Teaching Hours:15
Linear Rank Tests for the Location and Scale Problem

Definition of linear rank statistics - Wilcoxon rank-sum test - Mood test - Freund-Ansari-Bradley-David-Barton tests - Siegel-Tukey test.

Practical Assignments:

9. Exercise on Wilcoxon rank-sum test and Mood test
10. Exercise on Freund-Ansari-Bradley-David-Barton tests
11. Exercise on Siegel-Tukey test
Unit-4
Teaching Hours:15
Tests of the Equality of k Independent Samples

Extension of the median test - extension of the control median test - the Kruskal-Wallis one-way ANOVA test and multiple comparisons - tests against ordered alternatives - comparisons with a control - chi-square test for k proportions.

Practical Assignments:

12. Exercise on the extension of the median test and control median test
13. Exercise on Kruskal-Wallis one-way ANOVA test
14. Exercise on chi-square test for k proportions
Unit-5
Teaching Hours:15
Measures of Association for Bivariate Samples

Introduction: definition of measures of association in a bivariate population - Kendall's tau coefficient - Spearman's coefficient of rank correlation - relations between R and T: E(R), t and r.

Practical Assignments:

15. Exercise on Kendall's tau coefficient
16. Exercise on Spearman's coefficient of rank correlation

Text Books And Reference Books:

1. Gibbons, J. D., & Chakraborti, S. (2020). Nonparametric statistical inference. CRC Press.
2. Kloke, J., & McKean, J. W. (2014). Nonparametric statistical methods using R. CRC Press.

Essential Reading / Recommended Reading

1. Hollander, M., Wolfe, D. A., & Chicken, E. (2013). Nonparametric statistical methods (Vol. 751). John Wiley & Sons.
2. Lewis, N. D. C. (2013). 100 Statistical Tests in R. Heather Hills Press.

Evaluation Pattern

CIA - 50%
ESE - 50%

MST373C - THEORY OF GAMES AND STATISTICAL DECISIONS (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

Game theory and decision theory form a branch of mathematics and statistics that enables the study of strategic interactions among rational decision-makers. Traditionally, game-theoretic tools have been applied to solve problems in economics, business, political science, biology, sociology, computer science, logic and ethics. In recent years, applications of game theory have been successfully extended to several areas of engineered/networked systems such as wireline and wireless communications, static and dynamic spectrum auctions, and social and economic networks.

Course Outcome

CO1: Demonstrate the basics of a "game" and translate them into a wide range of conflicts.
CO2: Apply the minimax, randomized, and non-randomized decision rules to real-life problems.
CO3: Infer the importance of essentially complete classes of rules based on sufficient statistics.
CO4: Identify invariant statistical decision problems and their solutions.
CO5: Apply Bayes rules in multiple decision problems and to address slippage problems.
Unit-1
Teaching Hours:15
Game and Decision Theories

Theory of games - zero-sum games - minimax - maximin - dominance strategy - the value of the game - basic elements of game and decision theories - comparison of the two theories - decision function and risk function - randomization and optimal decision rules - form of Bayes rules for estimation.

Practical Assignments:

1. Two-person zero-sum game
2. Game with minimax and maximin strategies
3. Game with dominance rule and value of the game
Unit-2
Teaching Hours:15
Main Theorems of Decision Theory

Admissibility and completeness - fundamental theorems of game and decision theories - admissibility of Bayes rules - existence of Bayes decision rules - existence of a minimal complete class - essential completeness of the class of non-randomized decision rules - minimax theorem - the complete class theorem - methods for finding minimax rules.

Practical Assignments:

4. Admissibility of Bayes rules
5. Examining completeness
6. Problems on the non-randomized decision rule
Unit-3
Teaching Hours:15
Sufficient Statistics

Sufficient statistics and the essentially complete class of rules based on sufficient statistics - complete sufficient statistics - continuity of the risk function.

Practical Assignments:

7. Examining a complete sufficient statistic
8. Rules based on a sufficient statistic
Unit-4
Teaching Hours:15
Invariant Statistical Decision Problems

Invariant decision problems and rules - admissible and minimax invariant rules - minimax estimates of the location parameter - minimax estimates for the parameters of the normal distribution.

Practical Assignments:

9. Minimax estimates of the location parameter
10. Minimax estimates for the normal parameters
11. Invariant minimax rules
Unit-5
Teaching Hours:15
Multiple Decision problems

Monotone multiple decision problems - Bayes rules in multiple decision problems - slippage problems.

Practical Assignments:

12. Bayes rules in multiple decision problems
13. Slippage problems

Text Books And Reference Books:

1. William M. Bolstad (2007), Introduction to Bayesian Statistics, 2nd Edition.
2. J. O. Berger (1985), Statistical Decision Theory and Bayesian Analysis, 2nd Edition, Springer.

Essential Reading / Recommended Reading

1. Robert Winkler (2003), An Introduction to Bayesian Inference and Decision, 2nd Edition.
2. T. S. Ferguson (1967), Mathematical Statistics: A Decision Theoretic Approach, Academic Press.
3. M. H. DeGroot (1976), Optimal Statistical Decisions, McGraw Hill.

Evaluation Pattern

CIA - 50%
ESE - 50%

MST381 - RESEARCH MODELING AND IMPLEMENTATION (2020 Batch)

Total Teaching Hours for Semester:120
No of Lecture Hours/Week:8
Max Marks:200
Credits:4

Course Objectives/Course Description

This course will equip students to apply the statistical methods they have studied in various courses and to present their work through research articles.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Apply statistical techniques to a real-life problem.
CO3: Interpret and conclude the statistical analysis scientifically.
CO4: Present the work done through presentation and research article.
Unit-1
Teaching Hours:120
Modelling and Implementation

1. Apply various statistical methods in solving a real-life problem.
2. Compare with existing models or results.
3. Write a research article.
4. Present the article.

Text Books And Reference Books:
_

Essential Reading / Recommended Reading
_

Evaluation Pattern

CIA - 50%
ESE - 50%

MST431 - ADVANCED OPERATIONS RESEARCH (2020 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

Operations research helps in solving problems in different environments that require decisions. The module includes the topics: linear programming, integer programming, nonlinear programming, simple queueing models and inventory models. The aim of the course is to teach students how to formulate problems as mathematical models and to use appropriate methods to solve them.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Understand mathematical models used in operations research and use solution methods such as simplex, revised simplex and dual simplex for solving linear programming problems.
CO2: Solve integer programming models using cutting plane and branch and bound methods.
CO3: Solve non-linear programming problems with equality and inequality constraints.
CO4: Analyse service-oriented problems using queueing models.
CO5: Understand the methods used by organizations to obtain the right quantities of stock or inventory, and become familiar with inventory management practices.
Unit-1
Teaching Hours:12
Linear Programming problem (LPP)

General Linear programming problem - Formulation -  Solution through graphical, Simplex, Big-M and Two phase methods - Revised Simplex method - Big-M and Two phase Revised Simplex methods - Duality - Primal-dual relationships - Dual Simplex method.
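The graphical method above amounts to evaluating the objective at the vertices of the feasible region; a minimal sketch for a hypothetical two-variable LPP (maximise z = 3x + 5y subject to x ≤ 4, 2y ≤ 12, 3x + 2y ≤ 18, x, y ≥ 0) enumerates constraint intersections directly.

```python
from itertools import combinations

# Sketch: two-variable LPP solved by vertex enumeration (equivalent to
# the graphical method). Problem data are a hypothetical example.
constraints = [   # each row: a*x + b*y <= c
    (1, 0, 4),    # x <= 4
    (0, 2, 12),   # 2y <= 12
    (3, 2, 18),   # 3x + 2y <= 18
    (-1, 0, 0),   # x >= 0
    (0, -1, 0),   # y >= 0
]

def feasible(x, y, eps=1e-9):
    return all(a * x + b * y <= c + eps for a, b, c in constraints)

best = None
for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue  # parallel boundary lines, no intersection
    # Cramer's rule for the intersection of the two boundary lines
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    if feasible(x, y):
        z = 3 * x + 5 * y
        if best is None or z > best[0]:
            best = (z, x, y)

z, x, y = best
```

The simplex method covered in this unit reaches the same optimal vertex without enumerating every intersection, which is what makes it practical beyond two variables.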

Unit-2
Teaching Hours:12
Integer Programming

Gomory’s All-Integer Cutting-Plane Method - Construction of Gomory’s Constraint - Gomory’s Mixed-Integer Cutting-Plane Method -  Construction of Additional Constraint for Mixed-Integer Programming Problem - Branch and Bound Method.

Unit-3
Teaching Hours:12
Nonlinear programming problem (NLPP)

General nonlinear programming problem - constrained optimization with equality constraints - necessary conditions for a generalized NLPP (without proof) - sufficient conditions for a general NLPP with one constraint (without proof) - sufficient conditions for a general problem with m(<n) constraints (without proof) - constrained optimization with inequality constraints - Kuhn-Tucker conditions for general NLPP with m(<n) constraints (without proof).
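The Kuhn-Tucker conditions above can be verified numerically at a candidate point; the problem below (minimise (x-1)² + (y-2)² subject to x + y ≤ 2, with candidate (0.5, 1.5) and multiplier μ = 1) is a hypothetical illustration.

```python
# Sketch: checking the Kuhn-Tucker conditions at a candidate point for a
# hypothetical NLPP: min (x-1)^2 + (y-2)^2  s.t.  g(x, y) = x + y - 2 <= 0.
x, y, mu = 0.5, 1.5, 1.0

grad_f = (2 * (x - 1), 2 * (y - 2))   # gradient of the objective
grad_g = (1.0, 1.0)                   # gradient of the constraint

# Stationarity: grad f + mu * grad g = 0
stationarity = (grad_f[0] + mu * grad_g[0], grad_f[1] + mu * grad_g[1])
g = x + y - 2                          # primal feasibility: g <= 0

assert mu >= 0                         # dual feasibility
assert abs(mu * g) < 1e-12             # complementary slackness
assert all(abs(v) < 1e-12 for v in stationarity)
```

Since the objective is convex and the constraint linear, the Kuhn-Tucker conditions here are also sufficient, so the candidate point is the global minimum.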

Unit-4
Teaching Hours:12
Queueing Theory

Basics of queuing model - Probability distribution in a queueing system - Distribution of arrivals (Pure birth model) - Distribution of departures (Pure death model) -  Poisson queuing model:  (M/M/1) : (GD/∞/∞) - (M/M/1) : (N/FCFS/∞) - (M/M/c) : (∞/FCFS/∞) - (M/M/c) : (N/FCFS/∞).
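The standard performance measures of the (M/M/1):(GD/∞/∞) model above follow directly from the traffic intensity ρ = λ/μ; the arrival and service rates below are hypothetical.

```python
# Sketch: performance measures for an (M/M/1):(GD/inf/inf) queue with
# hypothetical arrival rate lam and service rate mu (requires lam < mu).
lam, mu = 2.0, 3.0

rho = lam / mu                 # traffic intensity
L = rho / (1 - rho)            # expected number in the system
Lq = rho**2 / (1 - rho)        # expected number waiting in the queue
W = 1 / (mu - lam)             # expected time in the system
Wq = rho / (mu - lam)          # expected waiting time in the queue
```

Little's law ties the measures together: L = λW and Lq = λWq, a useful sanity check on any queueing computation.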

Unit-5
Teaching Hours:12
Inventory Models

Deterministic inventory models - Economic Order Quantity (EOQ) models - Classic EOQ models - Problems with no shortages - The fundamental EOQ problems: EOQ problems with several production runs of unequal length - Problems with price breaks - One price break - More than one price break - Probabilistic inventory models - Single period problem without set-up cost.
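
The classic no-shortage EOQ formula Q* = √(2DS/H) is a one-liner to compute (the demand and cost figures below are an invented example):

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Classic EOQ with no shortages: Q* = sqrt(2*D*S / H)."""
    return math.sqrt(2 * demand * order_cost / holding_cost)

# Illustrative data: D = 1200 units/year, S = 100 per order, H = 6 per unit/year.
q_star = eoq(1200, 100, 6)     # optimal order quantity
cycles = 1200 / q_star         # number of orders per year
```

Here Q* = √40000 = 200 units, so the item is ordered six times a year.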

Text Books And Reference Books:

1. Bhunia, A. K., Sahoo, L., & Shaikh, A. A. (2019). Advanced Optimization and Operations Research. Springer.

Essential Reading / Recommended Reading
1. Srinivasan, G. (2017). Operations Research: principles and applications. PHI Learning Pvt. Ltd.

2. Taha, H. A. (2013). Operations research: an introduction. Pearson Education India.

3. Shortle, J. F., Thompson, J. M., Gross, D., & Harris, C. M. (2018). Fundamentals of queueing theory (Vol. 399). John Wiley & Sons.

4. Sharma, J. K. (2016). Operations research: theory and applications. Trinity Press, an imprint of Laxmi Publications Pvt. Limited.

Evaluation Pattern
Component                           Marks
CIA I                               10
Mid Semester Examination (CIA II)   25
CIA III                             10
Attendance                          05
End Semester Exam                   50
Total                               100

MST432 - DESIGN AND ANALYSIS OF EXPERIMENTS (2020 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

This course provides students with the mathematical background of various basic designs involving one-way and two-way elimination of heterogeneity, and their characterization properties. It prepares students to derive the expressions for the analysis of experimental data and to select appropriate designs when planning a scientific experiment.

Course Outcome

CO5: Analyse the Incomplete Block designs.

Unit-1
Teaching Hours:12
Basics of design of experiments

Basic principles of design of experiments - Randomization - Replication and Local control - Uniformity trials - Size and Shape of plots and blocks - Elements of linear estimation - Analysis of variance - Completely Randomized Design (CRD) - Randomized Complete Block Design (RCBD) and Latin Square Design (LSD) -  Missing plot techniques
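
The CRD analysis of variance reduces to a one-way ANOVA, which can be sketched with SciPy (the yield figures are invented for illustration):

```python
from scipy.stats import f_oneway

# Illustrative CRD: yields of plots under three treatments (hypothetical data).
t1 = [20, 22, 19, 21]
t2 = [25, 27, 26, 24]
t3 = [18, 17, 19, 16]

# One-way ANOVA: tests equality of the three treatment means.
f_stat, p_value = f_oneway(t1, t2, t3)
```

A small p-value leads to rejecting the hypothesis of equal treatment means; the same F statistic is obtained from the hand ANOVA table.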

Unit-2
Teaching Hours:12
Analysis of Covariance

Analysis of covariance - Ancillary/Concomitant variable and study variable - Linear model for ANCOVA - Adjustment of treatment sum of squares in ANCOVA - One-way and two-way classification with a single concomitant variable in CRD and RCBD designs.

Unit-3
Teaching Hours:12
Factorial experiments

Factorial experiments - Simple experiment (single factor) vs Factorial experiments - Mixed and Fixed factor experiments - Treatment combination in a factorial experiment - Simple effect - Main effect and Interaction effect in a factorial experiment - Yates method of computing factorial effect totals - Complete and partial confounding in symmetrical factorial experiments (2², 2³, 3³, 2ⁿ and 3ⁿ series) - Gain in the factorial experiments.
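
Yates' method of computing factorial effect totals is a short algorithm: with treatment totals in standard order, repeatedly form pairwise sums followed by pairwise differences, n times for a 2ⁿ design. A sketch (the treatment totals are invented):

```python
import math

def yates(totals):
    """Yates' method for a 2^n factorial.

    `totals` are treatment totals in standard order, e.g. (1), a, b, ab.
    Returns [grand total, A, B, AB, ...] contrast totals.
    """
    n = int(math.log2(len(totals)))
    col = list(totals)
    for _ in range(n):
        sums = [col[i] + col[i + 1] for i in range(0, len(col), 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, len(col), 2)]
        col = sums + diffs        # one Yates column: sums on top, differences below
    return col

# Illustrative 2^2 treatment totals in standard order (1), a, b, ab.
effects = yates([10, 14, 12, 20])
```

For these totals the contrast totals are 56 (grand total), A = 12, B = 8 and AB = 4, matching the direct contrast computation -(1)+a-b+ab etc.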

Unit-4
Teaching Hours:12
Split-Plot and Strip-Plot designs

Split-Plot, Split-Split-Plot and Strip-Plot (Split Block) designs - Situations for the usage of the designs - Layout and analysis of the designs - Difference in the error components in the designs - Selection of factors for allocation in plots (main/sub) - Combined experiments - Cross-Over designs

Unit-5
Teaching Hours:12
Incomplete Block Designs

Balanced Incomplete Block (BIB) designs - General properties and Analysis with and without recovery of information - Construction of BIB designs - Parameter relationship - Intra and inter-block Analysis - Partially Balanced Incomplete Block Design (PBIBD) - Youden square designs - Lattice designs
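
The parameter relationships of a BIB design are simple arithmetic identities and can be checked programmatically (the checker below is an illustrative helper, not from the syllabus):

```python
def is_valid_bibd(v, b, r, k, lam):
    """Check the necessary parameter relations of a BIB design:
    bk = vr, lambda*(v - 1) = r*(k - 1), and Fisher's inequality b >= v.
    These are necessary, not sufficient, for a design to exist."""
    return (b * k == v * r
            and lam * (v - 1) == r * (k - 1)
            and b >= v)

# The symmetric BIBD with (v, b, r, k, lambda) = (7, 7, 3, 3, 1): the Fano plane.
ok = is_valid_bibd(7, 7, 3, 3, 1)
```

Note that satisfying these relations does not guarantee existence; they only rule out impossible parameter sets.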

Text Books And Reference Books:
1. Montgomery, D.C. (2019). Design and Analysis of Experiments. John Wiley and Sons, Inc. New York
Essential Reading / Recommended Reading
1. Gupta, S.C and Kapoor, V.K. (2019). Fundamentals of Applied Statistics. 4th edition (Reprint). Sultan Chand and Sons. India.
2. Cochran, W.G. and Cox, G.M. (1992). Experimental Designs. John Wiley.
3. Dean, A.M. and Voss, D. (1999). Design and Analysis of Experiments. Springer.
4. Das, M.N. and Giri, N.C. (1986). Design and Analysis of Experiments. New Age.
5. Lawson, J. (2015). Design and Analysis of Experiments with R. CRC Press
6. Dey, A. (1986). Theory of Block Designs. Wiley Eastern Ltd.

Evaluation Pattern
Component                           Marks
CIA I                               10
Mid Semester Examination (CIA II)   25
CIA III                             10
Attendance                          05
End Semester Exam                   50
Total                               100

MST433 - STATISTICAL QUALITY CONTROL (2020 Batch)

Total Teaching Hours for Semester:60
No of Lecture Hours/Week:4
Max Marks:100
Credits:4

Course Objectives/Course Description

This course provides an introduction to the application of statistical tools in the industrial environment to study, analyse and control the quality of products.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Understand concepts of control charts in quality improvement

CO2: Analyse process capability using control charts

CO3: Construct modified control charts to monitor the process

CO4: Evaluate the quality of products using various acceptance sampling plans

Unit-1
Teaching Hours:12
Statistical Process Control

Meaning and scope of statistical quality control - Causes of quality variation - Control charts for variables and attributes - Rational subgroups - Construction and operation of X̄, σ, R, np, p, c and u charts - Operating characteristic curves of control charts. Process capability analysis using histogram, probability plotting and control chart - Process capability ratios and their interpretations.
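
Computing the X̄ and R chart limits is direct once the tabulated constants are known; a sketch for subgroup size n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114 from the standard tables; the process figures are invented):

```python
# Control-chart constants for subgroup size n = 5 (standard tables).
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(xbar_bar, r_bar):
    """3-sigma control limits for the X-bar and R charts,
    from the grand mean and the average subgroup range."""
    return {
        "xbar": (xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar),
        "R": (D3 * r_bar, D4 * r_bar),
    }

# Illustrative process: grand mean 50, average range 4 (subgroups of size 5).
limits = xbar_r_limits(50, 4)
```

A point of the subgroup means outside (47.692, 52.308), or a range above 8.456, would signal an assignable cause.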

Unit-2
Teaching Hours:12
Modified and CUSUM Control Charts

Specification limits and tolerance limits - Modified control charts - Basic principles and design of cumulative-sum (CUSUM) control charts - Concept of V-mask procedure - Tabular CUSUM charts - Construction of moving-range, moving-average and geometric moving-average control charts.
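
The tabular CUSUM recursions C⁺ᵢ = max(0, xᵢ − (μ₀ + K) + C⁺ᵢ₋₁) and C⁻ᵢ = max(0, (μ₀ − K) − xᵢ + C⁻ᵢ₋₁) can be sketched in a few lines (the target and observations are invented):

```python
def tabular_cusum(data, mu0, k):
    """One-sided tabular CUSUM statistics C+ and C-
    for target mu0 and reference value k (often half the shift of interest)."""
    c_plus, c_minus = [0.0], [0.0]
    for x in data:
        c_plus.append(max(0.0, x - (mu0 + k) + c_plus[-1]))
        c_minus.append(max(0.0, (mu0 - k) - x + c_minus[-1]))
    return c_plus[1:], c_minus[1:]

# Illustrative target mu0 = 10 with reference value k = 0.5.
cp, cm = tabular_cusum([10, 11, 12, 9], mu0=10, k=0.5)
```

An out-of-control signal is raised when either statistic exceeds the decision interval H, typically chosen as 4 or 5 process standard deviations.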

Unit-3
Teaching Hours:12
Attribute sampling plans

Acceptance sampling: Sampling inspection by attributes - Single, double and multiple sampling plans - Rectifying inspection - Measures of performance: OC, ASN, ATI and AOQ functions - Concepts of AQL, LTPD and IQL - Dodge-Romig and MIL-STD-105D tables
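
For a single sampling plan (n, c) the OC function is the binomial acceptance probability Pₐ(p) = P(X ≤ c); a sketch (the plan parameters are an invented example):

```python
from math import comb

def oc_probability(n, c, p):
    """P(accept lot) for a single sampling plan (n, c)
    when the lot fraction defective is p (binomial model)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(c + 1))

# Illustrative plan: sample n = 50 items, accept if at most c = 2 are defective.
pa_good = oc_probability(50, 2, 0.01)   # near a hypothetical AQL: high acceptance
pa_bad = oc_probability(50, 2, 0.10)    # near a hypothetical LTPD: low acceptance
```

Evaluating Pₐ over a grid of p values traces the OC curve, from which the producer's and consumer's risks of the plan are read off.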

Unit-4
Teaching Hours:12
Variables Sampling Plans

Sampling inspection by variables - Known and unknown sigma variables sampling plans - Merits and limitations of variables sampling plans - Single, double and multiple sampling plans - Derivation of OC curve - Determination of plan parameters.

Unit-5
Teaching Hours:12
Continuous and Cumulative Sampling Plans

Continuous Sampling Plans (CSP): CSP-1 - CSP-2 - CSP-3 - Skip-Lot Sampling Plans (SkSP): SkSP-1 - SkSP-2 with SSP as reference plan - Chain Sampling Plans (ChSP-1) with SSP as reference plan - Tightened-Normal-Tightened (TNT) sampling plan with SSP as reference plan - Decision Lines.

Text Books And Reference Books:

1. Montgomery, D. C. (2019). Introduction to Statistical Quality Control, Eighth Edition, Wiley India, New Delhi.

Essential Reading / Recommended Reading

1. Juran, J.M., and De Feo, J.A. (2010). Juran's Quality Control Handbook: The Complete Guide to Performance Excellence, Sixth Edition, Tata McGraw-Hill, New Delhi.

2. Schilling, E. G., and Neubauer, D.V. (2009). Acceptance Sampling in Quality Control, Second Edition, CRC Press, New York.

3. Duncan, A. J. (2003). Quality Control and Industrial Statistics, Irwin-Illinois, US.

Evaluation Pattern
Component                           Marks
CIA I                               10
Mid Semester Examination (CIA II)   25
CIA III                             10
Attendance                          05
End Semester Exam                   50
Total                               100

MST471A - NEURAL NETWORKS AND DEEP LEARNING (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

The objective of this course is to provide fundamental knowledge of neural networks and deep learning. The course covers the basics of neural networks, shallow and deep neural networks, and related methods that can support various research projects.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Identify the difference between biological and artificial neural networks.

CO2: Demonstrate the different types of supervised learning algorithms.

CO3: Build and train various Convolutional Neural Networks.

CO4: Implement Recurrent Neural Networks and other artificial neural networks for real-time applications.

Unit-1
Teaching Hours:15
Introduction to Artificial Neural Networks

Fundamental concepts of Artificial Neural Networks (ANN) - Biological neural networks - Comparison between biological neuron and artificial neuron - Evolution of neural networks - Scope and limitations of ANN - Basic models of ANN - Learning methods - Activation functions - Important terminologies of ANN: Weights - Bias - Threshold - Learning Rate - Momentum factor - Vigilance parameters.

Practical Assignments:
1. Exercise on construction of Artificial Neural Networks.
2. Exercise on training and testing the data.

Unit-2
Teaching Hours:15
Supervised Learning Algorithms

Concept of supervised learning algorithms - Perceptron networks - Adaptive linear neuron (Adaline) - Multiple adaptive linear neuron - Back-Propagation network: Learning factors - Initial weights - Learning rate α - Momentum factor - Generalization - Training and testing of the data.

Practical Assignments:
3. Exercise on multiple adaptive linear neurons.
4. Exercise on Back-propagation networks.

Unit-3
Teaching Hours:15
Unsupervised Learning Algorithms

Concept of unsupervised learning algorithms - Fixed weight competitive net: Maxnet - Mexican Hat net - Hamming networks - Kohonen self-organizing feature maps - Learning vector quantization.

Practical Assignments:
5. Exercise on Maxnet and Mexican Hat net.
6. Exercise on Hamming networks.
7. Exercise on Kohonen self-organizing feature maps.
8. Exercise on Learning vector quantization.

Unit-4
Teaching Hours:15
Convolutional Neural Networks

Introduction - Components of Convolutional Neural Network (CNN) architecture: Padding - Strides - Rectified linear unit layer - Exponential linear unit - Pooling - Fully connected layers - Local response normalization - Hierarchical feature engineering - Training CNN using Backpropagation through convolutions - Case studies: AlexNet - GoogLeNet.

Practical Assignments:
9. Exercise on building CNN with the rectified linear unit and exponential linear unit.
10. Exercise on Hierarchical feature engineering.
11. Exercise on training CNN using Backpropagation through convolutions.
12. Exercise on feature prediction using AlexNet and GoogLeNet.

Unit-5
Teaching Hours:15
Deep Reinforcement Learning

Stateless algorithms: Naive algorithms - Upper bounding methods - Simple reinforcement learning for Tic-Tac-Toe - Straw-man algorithms - Bootstrapping for value function learning - On-policy versus off-policy methods: SARSA - Policy gradient methods: Finite difference method - Likelihood ratio method - Monte Carlo tree search.

Practical Assignments:
13. Exercise on Naive and upper bounding algorithms for data classification.
14. Exercise on Bootstrapping for value function learning.
15. Exercise on SARSA method.
16. Exercise on Finite difference and likelihood ratio methods.
17. Exercise on Monte Carlo tree search method.

Text Books And Reference Books:

1. Charu C. Aggarwal (2018). Neural Networks and Deep Learning: A Textbook. Springer International Publishing, Switzerland.

Essential Reading / Recommended Reading

1. S.N. Sivanandam and S.N. Deepa (2018). Principles of Soft Computing. Wiley India.

2. S. Lovelyn Rose, L. Ashok Kumar and Karthika Renuka (2019). Deep Learning using Python. Wiley India.

3. Francois Chollet (2017). Deep Learning with Python. Manning Publishing.

4. Andreas C. Muller and Sarah Guido (2017). Introduction to Machine Learning with Python. O'Reilly Media, Inc.

Evaluation Pattern
CIA - 50% ESE - 50%

MST471B - SPATIAL STATISTICS (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course has been conceptualized to build an understanding of the fundamental and applied concepts of spatial statistics, covering the diverse set of methods used to model and analyze the various types of spatial data.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate an understanding of the fundamental concepts of spatial statistical analysis.

CO2: Identify the various types of spatial data by plots.

CO3: Apply the appropriate statistical model to the various types of spatial data.

CO4: Analyze and interpret the spatial data problems of various disciplines.
Unit-1
Teaching Hours:15
Introduction to spatial statistics

Spatial data - Types of spatial data: Geostatistical data, Lattice data, Point pattern data with examples - Visualizing spatial data: Traditional plots, lattice plots and interactive plots - Exploratory spatial data analysis - Intrinsic stationarity - Square-root-differences cloud - The pocket plot - Decomposing the data into large- and small-scale variation - Analysis of residuals - Variogram of residuals.

Practical Assignments:
Exercise on the visualization of spatial data using traditional plots.
Exercise on the visualization of spatial data using lattice and interactive plots.
Exercise on exploratory data analysis.

Unit-2
Teaching Hours:15
Geostatistical data

Stationary processes: Variogram, Covariogram and Correlogram - Estimation of variogram: Comparison of the variogram and covariogram estimation, exact distribution theory of the variogram - Robust estimation of variogram - Spectral representations: Valid covariograms and variograms - Variogram model fitting: Criteria for fitting a variogram model, properties of variogram-parameter estimators, cross-validating the fitted variogram.

Practical Assignments:
Exercise on exploratory variogram analysis.
Exercise on variogram.
Exercise on variogram modelling.
Exercise on residual variogram modelling.

Unit-3
Teaching Hours:15
Spatial prediction and kriging

Scale of variation - Ordinary Kriging: Effect of variogram parameters on Kriging, Lognormal and Trans-Gaussian Kriging, Cokriging - Robust Kriging - Universal Kriging: Estimation of variogram for Universal Kriging - Median-Polish Kriging: Gridded and non-gridded data, Median polishing spatial data, Bias in median-based covariogram estimators - Applications of Geostatistics.

Practical Assignments:
Exercise on Ordinary Kriging.
Exercise on Robust Kriging.
Exercise on Universal Kriging.

Unit-4
Teaching Hours:15
Spatial models on lattice data

Lattices - Spatial data analysis, Trend removal - Conditionally and simultaneously specified spatial Gaussian models - Markov random fields - Conditionally specified spatial models for discrete and continuous data - Parameter estimation for lattice models using Gaussian maximum likelihood estimation - Properties of estimators - Statistical image analysis and remote sensing.

Practical Assignments:
Exercise on the estimation of parameters of lattice models.
Exercise on spatial autocorrelation.
Exercise on the fitting of lattice models.

Unit-5
Teaching Hours:15
Spatial point patterns

Spatial point pattern data analysis: Complete spatial randomness, regularity and clustering - Kernel estimators of the intensity function - Distance methods: Nearest-neighbour methods - Statistical spatial analysis of point processes: Stationary and isotropic point processes - Palm distribution - Models and model fitting: Inhomogeneous Poisson, Cox and Poisson cluster processes.

Practical Assignments:
Exercise on plotting spatial point patterns under the boundary.
Exercise on distance methods.
Exercise on Kernel smoothing.
Exercise on Inhomogeneous Poisson process.

Text Books And Reference Books:

1. Cressie, Noel A.C. (2015). Statistics for Spatial Data. Revised Edition. Wiley Interscience Publication.

Essential Reading / Recommended Reading

1. Bivand, Roger S., Pebesma, Edzer J. and Gomez-Rubio, V. (2013). Applied Spatial Data Analysis with R. 2nd Edition. Springer, New York.

Evaluation Pattern
CIA 50% + ESE 50%

MST471C - BIG DATA ANALYTICS (2020 Batch)

Total Teaching Hours for Semester:75
No of Lecture Hours/Week:5
Max Marks:150
Credits:4

Course Objectives/Course Description

This course has been designed to train students in handling different types of Big data sets and to provide knowledge of the methods for handling these types of data.

Course Outcome

By the end of the course, the learner will be able to:

CO1: Demonstrate an understanding of the basic concepts of Big data.

CO2: Identify different types of Hadoop architecture.

CO3: Illustrate the parallel processing of data using MapReduce techniques.

CO4: Analyze Big data under the Spark architecture.

CO5: Demonstrate the programming of Big data using Hive and Pig environments.
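
The map-shuffle-reduce pattern named in CO3 can be sketched in plain Python; this word-count example is purely illustrative of the paradigm and does not use Hadoop itself:

```python
from collections import defaultdict

def map_phase(documents):
    # Mapper: emit (word, 1) pairs from each input document.
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: aggregate the grouped values (here, sum the counts per word).
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data needs big tools", "data tools"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

In Hadoop the mapper and reducer run in parallel across the cluster and the shuffle is performed by the framework; the single-machine sketch only shows the data flow.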