Sensors tab
The Sensors window lets you select the robot sensors to be used.
Click on a sensor to choose one of the possible usage options.
Find out more about the different sensors used by the different robots.
You can also choose the number of time steps and, in the case of a simulated robot, change the arena parameters.
Actions tab
The Actions window lets you choose which actions the robot can perform.
Click on an action to add it to the neural network.
⚠️ Adding a new action will delete the robot's experience memory.
Discover the different actions available depending on the robot.
You can also choose the speed, the duration (from 0.1 s to 2 s), and whether the robot should pause between actions, and for how long.
Rewards tab
If you have launched an activity with reinforcement learning, you can use the Rewards tab to choose the type of reward and its configuration.
Speed and blocking
Reward the robot if it goes fast; punish it if it stops. You can customize the reward the robot receives for turning actions, and the penalty applied when it is blocked or backs up.
Color in image
Reward the robot if many of the camera pixels are of a certain color.
You can customize hue, luminance and saturation.
Color and blocking
Reward the robot if many of the camera pixels are of a certain color, and punish it if it stops.
You can customize hue, luminance and saturation.
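The idea behind these two color rewards, scoring a camera frame by the fraction of pixels whose hue matches a target, can be sketched in Python. The function name and the saturation/value thresholds below are illustrative assumptions, not the software's actual implementation:

```python
import colorsys

# Hypothetical helper: fraction of pixels whose hue falls within a band
# around a target hue. `pixels` is a list of (r, g, b) tuples in 0-255;
# the actual software reads these values from the robot's camera.
def color_fraction(pixels, target_hue, tolerance=0.05,
                   min_saturation=0.3, min_value=0.2):
    matches = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        # Skip washed-out or dark pixels, then compare hues circularly
        # (hue wraps around at 1.0).
        if s >= min_saturation and v >= min_value:
            diff = min(abs(h - target_hue), 1 - abs(h - target_hue))
            if diff <= tolerance:
                matches += 1
    return matches / len(pixels)

# A mostly-red frame scores high for target_hue=0.0 (red).
frame = [(250, 10, 10)] * 9 + [(10, 10, 250)]
print(color_fraction(frame, target_hue=0.0))  # 0.9
```

The saturation and value floors correspond to the luminance and saturation settings mentioned above: they prevent gray or dark pixels from being counted as a hue match.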
Line tracking
Reward the robot if it detects black just below it. You can customize the blocking detection threshold.
Python code
Set up your own rewards using a Python code file.
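A custom reward file might look like the sketch below. The function name `reward` and the sensor/action labels are assumptions made for illustration; the actual signature expected by the software is defined by its Python-code template:

```python
# Hypothetical custom reward file. The function name, its arguments, and
# the sensor/action labels used here are illustrative assumptions; check
# the software's template for the real expected signature.
def reward(sensors, action):
    """Return the reward for the last step: encourage moving forward,
    penalize being blocked."""
    if sensors.get("blocked"):   # assumed boolean sensor reading
        return -1.0              # penalty when the robot is stuck
    if action == "forward":      # assumed action label
        return 1.0               # reward for moving forward
    return 0.0                   # neutral otherwise
```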
AI tab
The AI tab lets you choose how the robot is trained, in particular whether to use supervised learning or reinforcement learning (Deep Q-Learning). It also lets you adjust the parameters of the various algorithms.
The parameters may differ from one activity, one type of learning, or one algorithm to another, but the main ones are as follows:
- Type of learning
- Supervised learning
- Reinforcement learning
- None
- Algorithm: Select the AI algorithm used
- Neural network
- K nearest neighbors
- Python code
- Learning speed: Increase for faster learning, but decrease it if divergence errors appear.
- Gamma: Adjusts the importance given to immediate rewards (value close to 0) relative to more distant rewards (value close to 1).
- Exploration: Frequency of exploration (value between 0 and 1).
- Layers of intermediate neurons: Number of neurons in each intermediate layer. Leave blank to connect inputs directly to outputs, or enter "100 50" for two intermediate layers of 100 and 50 neurons respectively.
- Activation function: Choice of activation function for the intermediate layers.
- 2 neurons per binary variable: Check to represent each binary input with 2 neurons (exactly one of which is always activated); uncheck to use a single neuron.
- Neuronal bias: Check to allow neurons to adjust their activation threshold (this is equivalent to considering that all neurons receive a constant input that they can adjust, not represented in the graphical interface).
- Experience memory: Check to ensure that the AI continues to learn from past actions and rewards.
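The learning speed, gamma, and exploration parameters above play the same roles as in standard Q-learning. A minimal tabular sketch follows; the software's Deep Q-Learning replaces the table with the neural network configured above, but the update rule is analogous:

```python
import random

# Minimal tabular Q-learning step illustrating the three main parameters:
# learning speed (alpha), discount factor (gamma), and exploration
# frequency (epsilon). Q maps (state, action) pairs to estimated values.
def q_learning_step(Q, state, actions, reward_fn, next_state,
                    alpha=0.1, gamma=0.9, epsilon=0.2):
    # Epsilon-greedy: with probability epsilon, try a random action.
    if random.random() < epsilon:
        action = random.choice(actions)
    else:
        action = max(actions, key=lambda a: Q.get((state, a), 0.0))
    r = reward_fn(state, action)
    # Bellman update: immediate reward plus discounted best future value.
    best_next = max(Q.get((next_state, a), 0.0) for a in actions)
    old = Q.get((state, action), 0.0)
    Q[(state, action)] = old + alpha * (r + gamma * best_next - old)
    return action
```

With gamma near 0 the update is dominated by the immediate reward `r`; with gamma near 1 the discounted future term `gamma * best_next` matters almost as much, which is exactly the trade-off the Gamma slider controls.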
Visualization tab
The Visualization tab lets you choose what is or is not displayed on the main screen.
- Display type: neural network, state space, or both
- Animation: animating activity in the network (movement from inputs to outputs)
- Connections: Show network connections
- Learning: Show learning in progress
- green: connections strengthening
- red: connections weakening
- Synaptic activity: Display network activity
- yellow: excitatory activity
- blue: inhibitory activity
- Input/output values: Display the values of input and output neurons
- Connection values: Display connection weights. This option automatically activates the input/output display.
- Spacing between neurons: switches input neurons from vertical to horizontal mode
For the AlphAI robot only: you can change the color of its shell. Note: the robot memorizes its new color, which will then appear on the connection screen.