2. (10 points) In class, we discussed how to represent XOR-like functions using quadratic
features, since standard linear classifiers (such as perceptrons) are insufficient for this task.
However, here we show that XOR-like functions can indeed be simulated using multi-layer
networks of perceptrons. This example shows a glimpse of the expressive power of "deep
neural networks": merely increasing the depth from 1 to 2 layers can help reproduce nonlinear functions such as XOR.
a. Consider a standard two-variable XOR function, where we have 2-dimensional inputs x1, x2 ∈ {0, 1}, and output y = x1 (XOR) x2 = 0 if x1 = x2, and 1 otherwise.
Geometrically argue why a single perceptron cannot be used to simulate the above function.
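As a complement to the geometric argument, here is a minimal sketch (not part of the problem) that brute-forces candidate perceptron weights over a small grid and never finds a linear separator for the four XOR points. The grid range, resolution, and strict-threshold convention (output 1 iff the weighted sum exceeds 0) are all illustrative assumptions; an exhaustive grid search is evidence, not a proof.

```python
import itertools

# The four XOR input/output pairs.
points = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def separates(w1, w2, b):
    # Step-activation perceptron: output 1 iff w1*x1 + w2*x2 + b > 0.
    return all((w1 * x1 + w2 * x2 + b > 0) == (y == 1) for (x1, x2), y in points)

# Candidate weights/bias in [-2, 2] at step 0.25 (arbitrary illustrative grid).
grid = [i / 4 for i in range(-8, 9)]
found = any(separates(w1, w2, b) for w1, w2, b in itertools.product(grid, repeat=3))
print("linear separator found:", found)  # prints False
```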
b. Graphically depict, and write down the perceptron equation for, the optimal decision region for each of the following logical functions:
(i) x1 (AND) (NOT(x2))
(ii) (NOT(x1)) (AND) x2
(iii) x1 (OR) x2
Make note of the weights corresponding to the optimal decision boundary for each function.
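For reference, the sketch below shows one plausible weight choice per gate; many other weights work equally well, and the specific values (±1 weights, bias -0.5) are illustrative assumptions, not the unique "optimal" boundary.

```python
def perceptron(w1, w2, b):
    # Step-activation perceptron: output 1 iff w1*x1 + w2*x2 + b > 0.
    return lambda x1, x2: int(w1 * x1 + w2 * x2 + b > 0)

and_not = perceptron(1, -1, -0.5)   # (i)   x1 (AND) (NOT(x2)): fires only on (1, 0)
not_and = perceptron(-1, 1, -0.5)   # (ii)  (NOT(x1)) (AND) x2: fires only on (0, 1)
or_gate = perceptron(1, 1, -0.5)    # (iii) x1 (OR) x2: fires unless both inputs are 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", and_not(x1, x2), not_and(x1, x2), or_gate(x1, x2))
```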
c. Using the above information, simulate a two-layer perceptron network for the XOR
operation with the learned weights from Part (b). You can do this by taking the outputs of
the logical functions presented in Part (b) and combining them carefully.
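One such combination uses the identity x1 (XOR) x2 = (x1 (AND) (NOT(x2))) (OR) ((NOT(x1)) (AND) x2): the two AND-style gates form the hidden layer and the OR gate forms the output layer. The sketch below wires this up with the same illustrative weights assumed in Part (b) above.

```python
def perceptron(w1, w2, b):
    # Step-activation perceptron: output 1 iff w1*x1 + w2*x2 + b > 0.
    return lambda x1, x2: int(w1 * x1 + w2 * x2 + b > 0)

# Layer 1 (hidden units from Part (b), illustrative weight choices):
and_not = perceptron(1, -1, -0.5)   # x1 (AND) (NOT(x2))
not_and = perceptron(-1, 1, -0.5)   # (NOT(x1)) (AND) x2
# Layer 2 (output unit): OR over the two hidden activations.
or_gate = perceptron(1, 1, -0.5)

def xor(x1, x2):
    return or_gate(and_not(x1, x2), not_and(x1, x2))

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor(x1, x2))  # reproduces the XOR truth table: 0 1 1 0
```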