Neural Network Modeling

Course Details
Code: DMNN42
Tuition (USD):
  • Classroom (2 days): $1,600.00
  • Virtual (2 days): $1,600.00

This course helps you understand and apply two popular artificial neural network algorithms: multi-layer perceptrons and radial basis functions. Both the theoretical and practical issues of fitting neural networks are covered. Specifically, this course teaches you how to choose an appropriate neural network architecture, how to determine the relevant training method, how to implement neural network models in a distributed computing environment, and how to construct custom neural networks using the NEURAL procedure.

Skills Gained

  • construct multilayer perceptron and radial basis function neural networks
  • choose an appropriate network architecture and training method
  • avoid overfitting neural networks
  • perform autoregressive time series analysis using neural networks
  • interpret neural network models
  • implement neural networks in a distributed computing environment

Who Can Benefit

  • Data analysts and modelers with a strong mathematical background


Prerequisites

Before attending this course, you should
  • have an understanding of basic statistical concepts, which you can gain from the Statistics I: Introduction to ANOVA, Regression, and Logistic Regression course.
  • have completed the SAS(R) Programming I: Essentials course or have equivalent knowledge.
  • be familiar with SAS Enterprise Miner software. You can gain this knowledge from the Applied Analytics Using SAS(R) Enterprise Miner(TM) 5.2 course.
  • have completed a college-level calculus course.

Course Outline

Introduction to Neural Networks

  • provide a brief history of neural networks
  • describe key concepts underlying neural networks
  • illustrate traditional approaches to nonlinear modeling

Network Architecture

  • define the linear perceptron neural network
  • describe combination and activation functions
  • show how a linear perceptron is a generalized linear model that can model many target distributions
  • detail multilayer and skip-layer perceptrons
  • detail ordinary and normalized radial basis functions
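As a rough illustration of the unit types listed above (the course itself works in SAS Enterprise Miner and PROC NEURAL, so this Python sketch is purely illustrative): a perceptron unit applies an activation function to a linear combination of its inputs, while an ordinary radial basis function unit responds to its inputs' distance from a center point.

```python
import math

def mlp_unit(inputs, weights, bias):
    """Multilayer-perceptron hidden unit: a linear combination
    of the inputs passed through a tanh activation function."""
    net = bias + sum(w * x for w, x in zip(weights, inputs))
    return math.tanh(net)

def rbf_unit(inputs, center, width):
    """Ordinary radial basis function unit: a Gaussian bump
    centered at `center` with bandwidth `width`."""
    dist2 = sum((x - c) ** 2 for x, c in zip(inputs, center))
    return math.exp(-dist2 / (2 * width ** 2))
```

The combination function (weighted sum for the perceptron, squared distance for the RBF) and the activation function (tanh, Gaussian) are exactly the two architectural choices the outline refers to.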


Network Training

  • describe the problem of local minima
  • describe the parameter estimation methods
  • outline the optimization (training) techniques that are available in the Neural Network node
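The local-minima problem can be seen even in one dimension. The following sketch (illustrative only; the Neural Network node offers more sophisticated optimization techniques) shows plain gradient descent settling into whichever minimum its starting point favors.

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent on a 1-D function. Which minimum
    it reaches depends entirely on the starting point x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = x**4 - 3*x**2 + x has two local minima; f'(x) = 4x^3 - 6x + 1
grad = lambda x: 4 * x ** 3 - 6 * x + 1

left = gradient_descent(grad, -2.0)   # settles near x = -1.30
right = gradient_descent(grad, 1.5)   # settles near x = 1.13
```

Neural network error surfaces are high-dimensional versions of this picture, which is why training is typically restarted from several sets of initial weights.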

The NEURAL Procedure

  • provide an overview of PROC NEURAL
  • perform input selection using PROC NEURAL
  • define sequential network construction (SNC)
  • illustrate the SNC paradigm
  • describe stochastic gradient descent
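Stochastic gradient descent differs from batch training in that the weights are updated after each observation rather than after a full pass over the data. A minimal Python sketch for a single-input linear model (purely illustrative; PROC NEURAL handles this internally):

```python
import random

def sgd_linear(data, lr=0.01, epochs=200, seed=0):
    """Stochastic gradient descent for y ~ w*x + b: the data are
    shuffled each epoch and the weights are updated one
    observation at a time using the squared-error gradient."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

# Noiseless data generated from y = 2x + 1; SGD recovers w and b.
data = [(x, 2 * x + 1) for x in range(-5, 6)]
w, b = sgd_linear(list(data))
```

The per-observation updates make SGD attractive for large data sets, at the cost of a noisier path toward the minimum.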

Augmented Networks

  • implement a time delay neural network
  • interpret a neural network with a continuous target
  • interpret a neural network with a categorical target
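A time delay neural network performs autoregressive modeling by feeding lagged values of the series in as inputs. The data preparation step can be sketched in Python (a hypothetical helper, not course code):

```python
def lagged_inputs(series, n_lags):
    """Build (inputs, target) pairs for a time delay network:
    each target series[t] is predicted from its n_lags
    immediately preceding values."""
    rows, targets = [], []
    for t in range(n_lags, len(series)):
        rows.append(series[t - n_lags:t])
        targets.append(series[t])
    return rows, targets

# With two lags, [10, 11] predicts 13, [11, 13] predicts 12, and so on.
X, y = lagged_inputs([10, 11, 13, 12, 14, 15], n_lags=2)
```

The network itself is then an ordinary multilayer perceptron trained on these lagged rows.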

The HP Neural Node

  • outline the challenge of big data
  • introduce SAS High-Performance Analytics
  • describe the HP Neural node's interface


Supporting Procedures

  • the DMDB procedure
  • the NEURAL procedure

Empirical Partial Residuals

  • generate empirical partial residual plots to guide variable selection
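A partial residual for an input adds that input's fitted effect back onto the overall model residuals; plotted against the input, the result reveals nonlinearity the current model is missing, which is what guides variable selection here. A minimal sketch, assuming a fitted linear coefficient `bj` for the input (a hypothetical helper, not PROC NEURAL output):

```python
def partial_residuals(y, y_hat, xj, bj):
    """Partial residuals for one input xj: overall residuals
    (y - y_hat) plus xj's fitted linear effect bj * xj added back.
    Plotting these against xj exposes unmodeled curvature."""
    return [yi - yhi + bj * x for yi, yhi, x in zip(y, y_hat, xj)]

# With a perfect fit, the partial residuals reduce to bj * xj.
pr = partial_residuals([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], [0.0, 1.0, 2.0], 2.0)
```

The "empirical" variant typically bins or smooths these values before plotting, so that the trend is readable even with noisy data.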