Threshold Logic Units


In this tutorial you will learn how a Threshold Logic Unit computes its activation, how the threshold function produces the output, what an input pattern is, and what the decision boundary of the unit is.


In a Threshold Logic Unit (TLU) the output of the unit y in response to a particular input pattern is calculated in two stages. First the activation is calculated. The activation a is the weighted sum of the inputs:

\begin{displaymath}
a = \sum_{i=1}^{n} w_{i} \, x_{i} = \vec{w} \cdot \vec{x}
\end{displaymath} (1)

where x_i is the ith element of the input vector and w_i is the ith element of the weight vector. The current activation of the TLU in the demonstration below is represented by the dot on the green plane in the graph. The green plane shows all the possible activation values as the inputs vary. The current activation is also marked on the vertical axis. First set the weights to non-zero values. Then alter the inputs and watch the activation change. The activation value a is displayed in green beside the diagram of the TLU. Do this now.
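The weighted sum in Equation 1 can be sketched in a few lines of code. The weight and input values below are illustrative examples, not taken from the demonstration:

```python
# Activation of a TLU: the weighted sum of the inputs (Equation 1).

def activation(weights, inputs):
    """Return a = sum_i w_i * x_i, the dot product of weights and inputs."""
    return sum(w * x for w, x in zip(weights, inputs))

w = [0.5, -1.0]   # example weight vector
x = [1.0, 0.5]    # example input pattern
a = activation(w, x)
print(a)  # 0.5*1.0 + (-1.0)*0.5 = 0.0
```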


The activation is passed through a threshold function to obtain the output y:

\begin{displaymath}y = \left\{ \begin{array}{ll}
1 & \mbox{if $a \geq h$ } \\
0 & \mbox{if $a < h$ }
\end{array} \right.
\end{displaymath} (2)

The red plane in the demonstration represents the threshold of the unit, h. It cuts across the activation plane. If the activation is greater than or equal to the threshold then the output of the unit will be 1; otherwise it will be 0. You can see the value of y change as the activation moves across the threshold. Do this now. The threshold function is also referred to as an activation function, a transfer function, or an output function.
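The step function in Equation 2 can be sketched as follows. The threshold and activation values are illustrative:

```python
# Threshold (step) function of a TLU (Equation 2): y = 1 if a >= h, else 0.

def output(a, h):
    """Return 1 if the activation a meets or exceeds the threshold h, else 0."""
    return 1 if a >= h else 0

h = 0.5
print(output(0.7, h))  # activation above the threshold -> 1
print(output(0.2, h))  # activation below the threshold -> 0
print(output(0.5, h))  # activation equal to the threshold -> 1
```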

Input Patterns

The inputs presented to the TLU at the same time are called an input pattern. Input patterns are also referred to as input vectors, exemplars, and data points.

The decision boundary

The line where the activation is equal to the threshold is called the decision boundary of the TLU. This is where the two planes intersect, marked by a red line. If you alter the synaptic weights and the threshold then the decision boundary will move. We can alter the TLU's response to input patterns by moving the decision boundary.
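For a two-input TLU the decision boundary is the line w1*x1 + w2*x2 = h, which can be solved for x2 when w2 is non-zero. A minimal sketch, using illustrative weight and threshold values:

```python
# Decision boundary of a two-input TLU: the line w1*x1 + w2*x2 = h.
# Solving for x2 (assuming w2 != 0) gives x2 = (h - w1*x1) / w2.

def boundary_x2(x1, w1, w2, h):
    """Return the x2 value lying on the decision boundary for a given x1."""
    return (h - w1 * x1) / w2

w1, w2, h = 1.0, 2.0, 1.0
# Every point on the boundary gives an activation exactly equal to h:
for x1 in [0.0, 0.5, 1.0]:
    x2 = boundary_x2(x1, w1, w2, h)
    a = w1 * x1 + w2 * x2
    print(x1, x2, a)  # a == h on the boundary
```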

Now work through the exercises.



  1. Alter the weights in turn using the slider bars. What happens when both weights are 0?

  2. What happens when w1 is greater than 0? Alter the corresponding input x1 . What happens? Why?

  3. Now set one of the weights to a negative value. What happens to the activation surface? If we increase the corresponding input the activation goes down. What name is given to the type of synapse represented by this weight?

  4. Now increase the threshold h. What happens to the output of the unit?

  5. Describe in detail, either to a friend or on paper, how the weights, inputs, activation and output are related.
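The behaviour probed by the exercises can also be checked numerically. The sketch below combines Equations 1 and 2 into a complete TLU; all weight, input, and threshold values are illustrative:

```python
# Numerical sketch of the exercises above; all values are illustrative.

def tlu(weights, inputs, h):
    """Full TLU: weighted-sum activation followed by the threshold function."""
    a = sum(w * x for w, x in zip(weights, inputs))
    return a, (1 if a >= h else 0)

x = [1.0, 1.0]

# Exercise 1: with both weights zero the activation is zero for every input.
print(tlu([0.0, 0.0], x, 0.5))            # (0.0, 0)

# Exercise 3: with a negative weight, raising the corresponding input
# lowers the activation.
print(tlu([1.0, -1.0], [1.0, 0.0], 0.5))  # (1.0, 1)
print(tlu([1.0, -1.0], [1.0, 1.0], 0.5))  # (0.0, 0)

# Exercise 4: raising the threshold can switch the output from 1 to 0
# without changing the activation.
print(tlu([1.0, 1.0], x, 0.5))            # (2.0, 1)
print(tlu([1.0, 1.0], x, 2.5))            # (2.0, 0)
```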