
RBI-DSIM SYLLABUS






1. Theory of Probability and Probability Distributions

Classical and axiomatic approaches to probability and its properties; Bayes' theorem and its applications; strong and weak laws of large numbers; characteristic functions; central limit theorem; probability inequalities.

Standard probability distributions – Binomial, Poisson, Geometric, Negative binomial, Uniform, Normal, Exponential, Logistic, Log-normal, Beta, Gamma, Weibull, Bivariate normal, etc.

Exact sampling distributions – Chi-square, Student's t, F and Z distributions and their applications. Asymptotic sampling distributions and large-sample tests; association and analysis of contingency tables.

Sampling Theory: Standard sampling methods such as simple random sampling, stratified random sampling, systematic sampling, cluster sampling, two-stage sampling, probability proportional to size, etc. Ratio estimation, regression estimation, non-sampling errors and the problem of non-response, and correspondence and categorical data analysis.
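As a quick illustration of the first topic above, here is a minimal sketch of Bayes' theorem applied to a diagnostic-test problem; the prevalence, sensitivity and specificity figures are invented purely for the example.

```python
# Bayes' theorem:
# P(D | +) = P(+ | D) P(D) / [P(+ | D) P(D) + P(+ | not D) P(not D)]
# All numbers below are hypothetical.

prior = 0.01        # P(D): prevalence of the condition
sensitivity = 0.95  # P(+ | D)
specificity = 0.90  # P(- | not D), so P(+ | not D) = 0.10

evidence = sensitivity * prior + (1 - specificity) * (1 - prior)
posterior = sensitivity * prior / evidence  # P(D | +), about 0.0876
```

Note how a positive test raises the probability of the condition from 1% to only about 8.8% – a standard consequence of a low base rate.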


2. Linear Models and Economic Statistics

Simple linear regression – assumptions, estimation, inference and diagnostic checks; polynomial regression; transformations on Y or X (Box-Cox, square root, log, etc.); method of weighted least squares; inverse regression. Multiple regression – standard Gauss-Markov setup, least squares estimation and related properties, regression analysis with correlated observations. Simultaneous estimation of linear parametric functions; testing of hypotheses; confidence intervals and regions; multicollinearity and ridge regression; LASSO.

Definition and construction of index numbers; standard index numbers; conversion of a chain-base index to fixed-base and vice versa; base shifting, splicing and deflating of index numbers. Measurement of economic inequality: Gini's coefficient, Lorenz curves, etc.
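The simple linear regression topic above can be sketched with the closed-form least-squares estimates; the data points are made up for the illustration.

```python
# Closed-form least-squares fit for y = a + b*x:
#   b = Sxy / Sxx,  a = mean(y) - b * mean(x)
# Data are invented for the example.

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(x)
mx = sum(x) / n
my = sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

b = sxy / sxx    # slope estimate
a = my - b * mx  # intercept estimate
```

These are exactly the estimates the Gauss-Markov setup declares best linear unbiased under the standard assumptions.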

  

3. Statistical Inference: Estimation, Testing of Hypotheses and Non-Parametric Tests

Estimation: Concepts of estimation – unbiasedness, sufficiency, consistency and efficiency. Factorization theorem. Complete statistics, minimum variance unbiased estimator (MVUE), Rao-Blackwell and Lehmann-Scheffé theorems and their applications. Cramér-Rao inequality.

Methods of Estimation: Method of moments, method of maximum likelihood, method of least squares, method of minimum chi-square, basic idea of Bayes estimators.

Principles of Tests of Significance: Type-I and Type-II errors, critical region, level of significance, size and power, best critical region, most powerful test, uniformly most powerful test, Neyman-Pearson theory of hypothesis testing. Likelihood ratio tests, tests of goodness of fit, Bartlett's test for homogeneity of variances.

Non-Parametric Tests: The Kolmogorov-Smirnov test, sign test, Wilcoxon signed-rank test, Wilcoxon rank-sum test, Mann-Whitney U-test, Kruskal-Wallis one-way ANOVA test, Friedman's test, Kendall's tau coefficient, Spearman's coefficient of rank correlation.
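Of the non-parametric tests listed above, the sign test is simple enough to sketch from scratch; the paired differences below are invented for the example.

```python
from math import comb

# Exact two-sided sign test: under H0 the number of positive
# differences is Binomial(n, 1/2). Differences are hypothetical.

diffs = [1.2, 0.4, -0.3, 2.1, 0.8, 1.5, -0.2, 0.9, 1.1, 0.6]
n = len(diffs)                      # assumes no zero differences
k = sum(1 for d in diffs if d > 0)  # number of positive signs

# Two-sided p-value: 2 * P(X >= max(k, n - k)) for X ~ Bin(n, 1/2)
m = max(k, n - k)
p_value = min(1.0, 2 * sum(comb(n, i) for i in range(m, n + 1)) / 2 ** n)
```

With 8 positives out of 10 the exact p-value is 0.109, so at the 5% level the null of a zero median difference is not rejected.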


4. Stochastic Processes

Poisson Processes: Arrival, interarrival and conditional arrival distributions; non-homogeneous processes; law of rare events and the Poisson process; compound Poisson processes.

Markov Chains: Transition probability matrix, Chapman-Kolmogorov equations, regular chains and stationary distributions, periodicity, limit theorems, patterns for recurrent events. Brownian motion – limit of the random walk, its defining characteristics and peculiarities; martingales.


5. Multivariate Analysis

Multivariate normal distribution, its properties and characterization; Wishart matrix, its distribution and properties; Hotelling's T² statistic, its distribution and properties, and its applications in tests on mean vectors; Mahalanobis' D² statistic; canonical correlation analysis, principal components analysis, factor analysis and cluster analysis.
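The stationary-distribution topic under Markov chains can be sketched by iterating pi <- pi P until convergence; the two-state transition matrix is made up for the example.

```python
# Stationary distribution of a two-state Markov chain by repeatedly
# applying pi <- pi P. The transition matrix is hypothetical.

P = [[0.9, 0.1],
     [0.5, 0.5]]

pi = [0.5, 0.5]  # arbitrary starting distribution
for _ in range(200):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

# Detailed balance pi0 * 0.1 = pi1 * 0.5 gives pi = (5/6, 1/6).
```

Because the chain is regular, the limit is the same for any starting distribution – this is the limit theorem the syllabus refers to.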


6. Econometrics and Time Series

General linear model and its extensions; ordinary least squares and generalized least squares estimation and prediction; heteroscedastic disturbances; pure and mixed estimation. Autocorrelation, its consequences and related tests; Theil's BLUS procedure, estimation and prediction; the issue of multicollinearity, its implications and tools for handling it; ridge regression.

Linear regression and stochastic regression; instrumental variable regression; autoregressive linear regression; distributed lag models; estimation of lags by the OLS method. Simultaneous linear equations model and its generalization; identification problem; restrictions on structural parameters; rank and order conditions; different estimation methods for the simultaneous equations model; prediction and simultaneous confidence intervals.

Exploratory analysis of time series; concepts of weak and strong stationarity; AR, MA and ARMA processes and their properties; model identification based on ACF and PACF; model estimation and diagnostic tests; Box-Jenkins models; ARCH/GARCH models.

Inference with Non-Stationary Models: ARIMA models, determination of the order of integration, trend-stationary and difference-stationary processes, tests of non-stationarity.
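The AR-process and autoregressive-regression topics above can be sketched by simulating an AR(1) series and recovering its coefficient by OLS; the true coefficient phi = 0.7, the seed and the sample size are arbitrary choices for the illustration.

```python
import random

# Simulate x_t = phi * x_{t-1} + e_t and recover phi by regressing
# x_t on x_{t-1} (no intercept). All parameters are hypothetical.

random.seed(42)
phi = 0.7
x = [0.0]
for _ in range(2000):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

num = sum(x[t - 1] * x[t] for t in range(1, len(x)))
den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
phi_hat = num / den  # OLS estimate of the AR(1) coefficient
```

For a stationary AR(1) this estimate is consistent, and the sample ACF of the series decays geometrically like phi**k – the pattern used for model identification.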


7. Statistical Computing

Simulation techniques for various probability models; resampling methods: jackknife, bootstrap and cross-validation; techniques for robust linear regression; nonlinear and generalized linear regression problems; tree-structured regression and classification. Analysis of incomplete data – EM algorithm, single and multiple imputation. Markov Chain Monte Carlo and annealing techniques, Gibbs sampling, Metropolis-Hastings algorithm. Neural networks, association rules and learning algorithms.
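Among the resampling methods listed, the non-parametric bootstrap is easy to sketch; the data, seed and number of resamples below are arbitrary.

```python
import random

# Non-parametric bootstrap for the standard error of the sample mean:
# resample the data with replacement, recompute the mean each time,
# and take the standard deviation of those means. Data are invented.

random.seed(1)
data = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.8, 5.0, 4.6]
B = 2000  # number of bootstrap resamples

boot_means = []
for _ in range(B):
    resample = [random.choice(data) for _ in data]
    boot_means.append(sum(resample) / len(resample))

m = sum(boot_means) / B
boot_se = (sum((bm - m) ** 2 for bm in boot_means) / (B - 1)) ** 0.5
```

For this sample the analytical standard error s/sqrt(n) is about 0.22, and the bootstrap estimate lands close to it, as expected.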


8. Data Science, Artificial Intelligence and Machine Learning Techniques

Introduction to supervised and unsupervised pattern classification; unsupervised and reinforcement learning; basics of optimization; model accuracy measures.

Supervised Algorithms: Linear regression, logistic regression, penalized regression, Naïve Bayes, nearest neighbour, decision tree, support vector machine, kernel density estimation and kernel discriminant analysis; classification under a regression framework; neural networks, kernel regression, trees and random forests.

Unsupervised Classification: Hierarchical and non-hierarchical methods – k-means, k-medoids and linkage methods; cluster validation indices: Dunn index, Gap statistic.

Bagging (Random Forest) and Boosting (Adaptive Boosting, Gradient Boosting) techniques; Recurrent Neural Networks (RNN); Convolutional Neural Networks; Natural Language Processing.
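The k-means topic under unsupervised classification can be sketched with a bare-bones Lloyd iteration on one-dimensional data; the points and the initial centres are chosen by hand for the example.

```python
# Bare-bones k-means (Lloyd's algorithm) with k = 2 on 1-D data.
# Data and initial centres are hypothetical.

data = [1.0, 1.5, 2.0, 9.0, 9.5, 10.0]
centres = [0.0, 5.0]  # deliberately poor initial guesses

for _ in range(10):  # a few iterations suffice on this data
    # assignment step: each point goes to its nearest centre
    clusters = [[], []]
    for x in data:
        j = 0 if abs(x - centres[0]) <= abs(x - centres[1]) else 1
        clusters[j].append(x)
    # update step: each centre moves to its cluster mean
    centres = [sum(c) / len(c) if c else centres[i]
               for i, c in enumerate(clusters)]
```

The algorithm converges here to centres 1.5 and 9.5, the means of the two obvious groups; validation indices such as the Dunn index or Gap statistic are what the syllabus lists for choosing k.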
