Threshold Logic Units
Introduction
In this tutorial you will learn about:
 The activation and output of a TLU
 Synaptic weights and the threshold
 Input patterns
 The decision boundary
Activation
In a Threshold Logic Unit (TLU) the output of the unit y in
response to a particular input pattern is calculated in two
stages. First the activation is calculated. The activation a is
the weighted sum of the inputs:

(1)   a = \sum_{i=1}^{n} w_i x_i

where x_i is the ith element of the input vector and w_i is the ith
element of the weight vector. The current activation in the TLU in the
demonstration below is represented by the dot on
the green plane in the graph. The green plane shows all the possible
activation values as the inputs vary. The current activation is also
marked on the vertical axis. First set the weights to nonzero
values. Then alter the inputs and watch the activation change. The
activation value a is displayed in green by the diagram of
the TLU. Do this now.
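The weighted sum in equation (1) can be sketched in a few lines of Python. The weight and input values below are illustrative examples, not taken from the demonstration:

```python
# Activation of a TLU: the weighted sum of the inputs (equation 1).

def activation(weights, inputs):
    """Return a = sum_i w_i * x_i."""
    return sum(w * x for w, x in zip(weights, inputs))

# Example weights and input pattern (illustrative values)
w = [0.5, -1.0]   # synaptic weights w_1, w_2
x = [1.0, 0.5]    # inputs x_1, x_2

a = activation(w, x)
print(a)  # 0.5*1.0 + (-1.0)*0.5 = 0.0
```

Changing any input x_i scales its contribution by the corresponding weight w_i, which is exactly what the green activation plane in the demonstration visualises.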
Output
The activation is passed through a threshold function to obtain the
output y:

(2)   y = 1 if a >= h, otherwise y = 0
The red plane in the demonstration represents
the threshold of the unit, h. It cuts across the activation
plane. If the activation is higher than or equal to the threshold then
the output of the unit will be 1, otherwise it will be 0. You can see
the value of y change as the activation moves across the
threshold. Do this now. The threshold function is
also referred to as an activation function, a transfer function, or an
output function.
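The threshold function in equation (2) is a simple step. A minimal sketch, with an illustrative threshold value:

```python
# Threshold (step) function of a TLU (equation 2):
# output 1 if the activation reaches the threshold h, else 0.

def output(a, h):
    """Return y = 1 if a >= h, otherwise 0."""
    return 1 if a >= h else 0

h = 0.5  # example threshold (illustrative value)
print(output(0.7, h))  # activation above the threshold -> 1
print(output(0.5, h))  # activation equal to the threshold -> 1
print(output(0.2, h))  # activation below the threshold -> 0
```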
Input Patterns
The inputs presented to the TLU at the same time are called an input
pattern. Input patterns are also referred to as input vectors,
exemplars, and data points.
The decision boundary
The line where the activation is equal to the threshold is called the
decision boundary of the TLU. This is where the two planes intersect,
marked by a red line. If you alter the synaptic weights and the
threshold then the decision boundary will move. We can alter the TLU's
response to input patterns by moving the decision boundary.
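For a two-input TLU the decision boundary is the line w_1 x_1 + w_2 x_2 = h, which can be computed directly. The weights and threshold below are illustrative values:

```python
# Decision boundary of a two-input TLU: the set of input patterns where
# the activation equals the threshold, i.e. w1*x1 + w2*x2 = h.

def boundary_x2(x1, w1, w2, h):
    """For a given x1, return the x2 value lying on the decision boundary."""
    return (h - w1 * x1) / w2

def classify(x1, x2, w1, w2, h):
    """TLU output for the input pattern (x1, x2)."""
    return 1 if w1 * x1 + w2 * x2 >= h else 0

# Illustrative weights and threshold: the boundary is the line x1 + x2 = 1.
w1, w2, h = 1.0, 1.0, 1.0
print(boundary_x2(0.5, w1, w2, h))    # 0.5: the point (0.5, 0.5) is on the line
print(classify(1.0, 1.0, w1, w2, h))  # above the boundary -> 1
print(classify(0.0, 0.0, w1, w2, h))  # below the boundary -> 0
```

Changing w1, w2, or h moves this line, and with it the set of input patterns that produce an output of 1.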
Now work through the exercises.
Demonstration
Exercises
 Alter the weights in turn using the slider bars. What
happens when both weights are 0?
 What happens when w_{1} is greater than 0? Alter the
corresponding input x_{1} . What happens?
Why?
 Now set one of the weights to be negative. What happens to
the activation surface? If we increase the corresponding input the
activation goes down. What name is given to the type of synapse
represented by this weight?
 Now increase the threshold h. What happens to the output
of the unit?
 Describe in detail, either to a friend or on paper, how the
weights, inputs, activation and output are related.