mlp
mlp(
@activation 2
@batchsize 50
@hiddenlayers 3 3
@learnrate 0.01
@maxiter 1000
@momentum 0.9
@tapin 0
@tapout -1
@outputactivation 0
@validation 0.2
@mode 0
@useseed 0
) -> llll
Generates a Multilayer Perceptron Neural Network object for either regression or classification tasks.
Arguments
@activation[int]: Activation function for hidden layers (default: 2).
0: Identity (-∞ to ∞, i.e., no activation)
1: Sigmoid (0 to 1)
2: ReLU (0 to ∞)
3: Tanh (-1 to 1)
@batchsize[int]: Batch size (default: 50).
@hiddenlayers[list/int]: A list of integers, each specifying the size of one fully connected hidden layer (default: 3 3).
@learnrate[float]: Learning rate (default: 0.01).
@maxiter[int]: Maximum number of iterations during training (default: 1000).
@momentum[float]: Momentum (default: 0.9).
@tapin[int]: The index of the layer to use as input to the neural network for predict (0-based). The default of 0 is the first layer (the original input layer), 1 is the first hidden layer, etc. This can be used to access different parts of a trained neural network, such as the encoder or decoder of an autoencoder (default: 0).
@tapout[int]: The index of the layer to use as output of the neural network for predict (0-based). The default of -1 is the last layer (the original output layer). This can be used to access different parts of a trained neural network, such as the encoder or decoder of an autoencoder (default: -1).
@outputactivation[int]: Activation function for the output layer. Ignored when @mode is 1 (default: 0).
0: Identity (-∞ to ∞, i.e., no activation)
1: Sigmoid (0 to 1)
2: ReLU (0 to ∞)
3: Tanh (-1 to 1)
@validation[float]: Fraction of the dataset to use for validation during training (default: 0.2).
@mode[int]: MLP prediction mode, determining the type of set required during the fit process, as well as the type of set returned during prediction (default: 0).
0: Regression, meant to be fitted with a dataset as an output, via the fit function
1: Classification, meant to be fitted with a labelset as an output, via the fit function
@useseed[int]: Use a random seed for parameter initialization (default: 0).
0: Off
1: On
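For reference, the four activation choices above correspond to the standard definitions:

```latex
\begin{aligned}
\text{Identity:} &\quad f(x) = x \\
\text{Sigmoid:}  &\quad f(x) = \frac{1}{1 + e^{-x}} \\
\text{ReLU:}     &\quad f(x) = \max(0, x) \\
\text{Tanh:}     &\quad f(x) = \tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}
\end{aligned}
```

The output ranges listed for each option follow directly from these formulas; for example, the sigmoid's range (0 to 1) makes it a natural @outputactivation choice when the expected outputs are probabilities or binary values, as in the usage example below.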
Output
Multilayer perceptron object [llll]
Usage
$indata = null;
$outdata = null;
## generate basic dataset based on "less-than" function
for $i in 1...100 do (
$a = rand(0, 1);
$b = rand(0, 1);
$indata _= [$a $b]; ## input point
$outdata _= [$a < $b] ## expected output (0 or 1)
);
$indataset = dataset($indata); ## input points
$outdataset = dataset($outdata); ## expected outputs
$model = mlp(@outputactivation 1); ## create mlp model
## fit (i.e., train) model to learn input/output mapping from dataset
for $i in 1...10 do ( ## repeat training to minimize loss
print(fit($model, $indataset, $outdataset), 'Loss:')
);
writeobject($model, "./mlp.json"); ## write as JSON for future use (optional)
$model = readobject("./mlp.json"); ## read pre-trained model from JSON (optional)
$xpoint = 0.25 0.75; ## sample point
$pred = predict($model, $xpoint); ## generate prediction
print($pred, "prediction:") ## should be (almost) 1.0, as 0.25 < 0.75 is true
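The example above uses the default regression mode with a sigmoid output. With @mode 1, the same task can instead be framed as classification, fitting against symbolic class labels rather than numeric outputs. The sketch below is illustrative and assumes a labelset constructor analogous to dataset, matching the labelset type named in the @mode description; names and exact syntax should be checked against the labelset documentation.

```
## hedged sketch: classification variant of the example above,
## assuming a labelset() constructor analogous to dataset()
$indata = null;
$outlabels = null;
for $i in 1...100 do (
$a = rand(0, 1);
$b = rand(0, 1);
$indata _= [$a $b]; ## input point
$outlabels _= [if $a < $b then "less" else "geq"] ## symbolic class label
);
$model = mlp(@mode 1); ## classification mode; @outputactivation is ignored
for $i in 1...10 do ( ## repeat training to minimize loss
print(fit($model, dataset($indata), labelset($outlabels)), 'Loss:')
);
print(predict($model, 0.25 0.75), "class:") ## expected label: "less"
```

In classification mode, predict returns the predicted class label rather than a continuous value, so no thresholding of the output is needed.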