Organism #1¶
class sensorimotor_dependencies.organisms.Organism1(seed=1, retina_size=1.0, M_size=40, E_size=40, nb_joints=4, nb_eyes=2, nb_lights=3, extero=20, proprio=4, nb_generating_motor_commands=50, nb_generating_env_positions=50, neighborhood_size=1e-08, sigma=<ufunc 'tanh'>)[source]¶

Organism 1:
By default:

- the arm has `nb_joints` joints, each of which has `proprio` proprioceptive sensors (whose outputs depend on the position of the joint)
- it has `nb_eyes` eyes (for each of them: 3 spatial and 3 orientation coordinates), on which there are `extero` omnidirectional exteroceptive photosensors
- the motor command is `M_size`-dimensional
- the environment consists of `nb_lights` lights (3 spatial coordinates each, and `nb_lights` luminance values)
Other parameters:

- Random seed: `seed`
- Sensory inputs are generated from `nb_generating_motor_commands` motor commands and `nb_generating_env_positions` environment positions
- Neighborhood size of the linear approximation: `neighborhood_size`, i.e. motor commands/environmental positions are drawn from a normal distribution with mean zero and standard deviation `neighborhood_size` (coordinates differing from 0 by more than the standard deviation are set equal to 0)
- `retina_size`: size of the retina, i.e. the variance of the normal distribution from which the `C[i,k]` (relative position of photosensor `k` within eye `i`) are drawn
**Parameter**|**Value**
-|-
Dimension of motor commands|`M_size`
Dimension of environmental control vector|`E_size`
Number of eyes|`nb_eyes`
Number of joints|`nb_joints`
Dimension of proprioceptive inputs|`proprio*nb_joints`
Dimension of exteroceptive inputs|`extero*nb_eyes`
Diaphragms|None
Number of lights|`nb_lights`
Light luminance|Fixed
get_dimensions(dim_red='PCA', return_eigenvalues=False)[source]¶
Computes and returns the estimated dimension of the rigid group of compensated movements, and the estimated number of parameters needed to describe the variations in the exteroceptive inputs when only the body (resp. the environment, resp. both of them) changes.
The procedure is the following:

- We get rid of proprioceptive inputs by noting that they don't change when no motor command is issued and the environment changes, contrary to exteroceptive inputs.
- We estimate the dimension of the space of sensory inputs obtained through variations of the motor commands only, using a dimension-reduction technique (`utils.PCA` or `utils.MDS`).
- We do the same for sensory inputs obtained through variations of the environment only.
- We reiterate for variations of both the motor commands and the environment.
- Finally, we compute the dimension of the rigid group of compensated movements: it is the sum of the former two minus the latter.
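The steps above can be sketched with a toy PCA-based dimension estimate; `estimate_dimension` and its eigenvalue threshold are illustrative assumptions, not the actual `utils.PCA` implementation:

```python
import numpy as np

def estimate_dimension(X, tol=1e-6):
    """Estimate the dimension of the subspace spanned by the rows of X:
    count the covariance eigenvalues that are non-negligible relative to
    the largest one. Illustrative stand-in for utils.PCA."""
    X = X - X.mean(axis=0)                           # center the data
    eigvals = np.linalg.eigvalsh(np.cov(X.T))[::-1]  # eigenvalues, descending
    return int(np.sum(eigvals > tol * eigvals[0]))

# Sensory inputs lying (up to tiny noise) in a 2-D subspace of R^5:
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))                   # 2 underlying degrees of freedom
S = latent @ rng.normal(size=(2, 5)) + 1e-9 * rng.normal(size=(200, 5))

p = estimate_dimension(S)                            # → 2
# The rigid-group dimension is then the body-only plus environment-only
# dimensions minus the joint one, e.g. 10 + 5 - 11 = 4 in the example below.
```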
At the end, the results:

- are stored in a markdown table string: `self.dim_table`
- are added to the representation string (accessed via `__str__`) of the object
Parameters:

- dim_red ({'PCA', 'MDS'}, optional) – Dimension-reduction algorithm used to compute the number of degrees of freedom (`PCA` by default).
- return_eigenvalues (bool, optional) – If True, also returns the eigenvalues and their ratios \(λ_{i+1}/λ_i\) (`False` by default).
Returns: self.dim_rigid_group, self.dim_extero, self.dim_env, self.dim_env_extero – Estimated dimension of the rigid group of compensated movements (stored in `self.dim_rigid_group`), and number of parameters needed to describe the exteroceptive variations when:

- only the body moves (stored in `self.dim_extero`)
- only the environment changes (stored in `self.dim_env`)
- both the body and the environment change (stored in `self.dim_env_extero`)

Return type: tuple(int, int, int, int)
Examples
>>> O = organisms.Organism1(); O.get_dimensions()
(4, 10, 5, 11)
>>> print(O.dim_table)
**Characteristics**|**Value**
-|-
Dimension for body (p)|10
Dimension for environment (e)|5
Dimension for both (b)|11
Dimension of group of compensated movements|4

>>> print(str(O))
**Characteristics**|**Value**
-|-
Dimension of motor commands|40
Dimension of environmental control vector|40
Dimension of proprioceptive inputs|16
Dimension of exteroceptive inputs|40
Number of eyes|2
Number of joints|4
Diaphragms|None
Number of lights|3
Light luminance|Fixed
Dimension for body (p)|10
Dimension for environment (e)|5
Dimension for both (b)|11
Dimension of group of compensated movements|4
>>> print(organisms.Organism1(extero=1).get_dimensions(return_eigenvalues=True))
(1, 1, 1, 1, [array([ 5.28255070e-33, 1.23259516e-32]), array([ 7.29495097e-33, 8.04960107e-33]), array([ 7.26535685e-33, 7.52677159e-33])], [[2.333333333333333], [1.103448275862069], [1.035980991174474]])
get_proprioception(return_trials=False)[source]¶
Computes a mask indicating which sensory inputs the organism can reliably deem to be proprioceptive, since they remain silent when:

- the motor command is fixed
- the environment changes

Useful to separate proprioceptive inputs from exteroceptive ones.
Examples
>>> O = organisms.Organism1(proprio=1, nb_joints=2, extero=1, nb_eyes=2); O.get_proprioception(); O.mask_proprio
array([ True,  True, False, False], dtype=bool)

>>> from sensorimotor_dependencies import organisms; O = organisms.Organism1(); O.get_proprioception(return_trials=True)
array([[-0.07403972, -0.27175696, -0.57920141, ...,  0.19565543,  0.52651921,  0.43479947],
       [-0.07403972, -0.27175696, -0.57920141, ...,  0.19565542,  0.52651921,  0.43479947],
       [-0.07403972, -0.27175696, -0.57920141, ...,  0.19565543,  0.52651921,  0.43479947],
       ...,
       [-0.07403972, -0.27175696, -0.57920141, ...,  0.19565542,  0.52651921,  0.43479947],
       [-0.07403972, -0.27175696, -0.57920141, ...,  0.19565542,  0.52651921,  0.43479947],
       [-0.07403972, -0.27175696, -0.57920141, ...,  0.19565542,  0.5265192 ,  0.43479947]])
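The underlying idea can be sketched as follows: with the motor command held fixed across several environment changes, any channel that never varies is flagged as proprioceptive. `proprio_mask` and its tolerance are illustrative assumptions, not the method's actual implementation:

```python
import numpy as np

def proprio_mask(trials, atol=1e-10):
    """Each row of `trials` holds the sensory inputs recorded for one
    environment position, with a fixed motor command. Channels that
    never varied are deemed proprioceptive. Illustrative sketch."""
    spread = trials.max(axis=0) - trials.min(axis=0)  # per-channel variation
    return spread < atol

# 2 proprioceptive channels (constant) followed by 2 exteroceptive ones:
trials = np.array([[0.3, -0.1, 0.52, 0.9],
                   [0.3, -0.1, 0.48, 0.7],
                   [0.3, -0.1, 0.55, 0.2]])
print(proprio_mask(trials))   # → [ True  True False False]
```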
get_sensory_inputs(M, E, QPaL=None)[source]¶
Computes sensory inputs for motor command `M` and environment position `E`:

$$\begin{align*} (Q,P,a) &≝ σ(W_1 · σ(W_2 · M − μ_2) − μ_1)\\ L &≝ σ(V_1 · σ(V_2 · E − ν_2) − ν_1)\\ ∀ 1≤ k ≤ p’,\ 1≤ i ≤ p, \quad S^e_{i,k} &≝ d_i \sum\limits_{j} \frac{θ_j}{\Vert P_i + Rot(a_i^θ, a_i^φ, a_i^ψ) \cdot C_{i,k} - L_j \Vert^2}\\ (S^p_i)_{1≤ i ≤ q’q} &≝ σ(U_1 · σ(U_2 · Q − τ_2) − τ_1) \end{align*}$$
where:

- \(W_1, W_2, V_1, V_2, U_1, U_2\) are matrices with coefficients drawn randomly from a uniform distribution between \(−1\) and \(1\)
- so are the vectors \(μ_1, μ_2, ν_1, ν_2, τ_1, τ_2\)
- \(σ\) is an arbitrary nonlinearity (e.g. the hyperbolic tangent function)
- the \(C_{i,k}\) are drawn from a centered normal distribution whose variance (which can be understood as the size of the retina) is such that the sensory changes resulting from a rotation of the eye are of the same order of magnitude as those resulting from a translation of the eye
- \(θ\) and \(d\) are constants drawn at random in the interval \([0.5, 1]\)
Parameters:

- M ((M_size,) array) – Motor command vector
- E ((E_size,) array) – Environmental control vector
- QPaL ({4-tuple of arrays, None}, optional) – \(Q, P, a\) and \(L\) values. If left unspecified, they are computed as above, with the `_get_QPaL` method. This optional argument comes in handy for Organisms 2 and 3, for which we reuse this very method (avoiding heavy overloading).

Returns: Concatenation of proprioceptive and exteroceptive sensory inputs

Return type: (proprio*nb_joints + extero*nb_eyes,) array
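The exteroceptive part of the formula above can be sketched directly in NumPy. The helper name, the identity rotations, and the toy sizes below are illustrative assumptions, not the library's code:

```python
import numpy as np

def extero_inputs(P, R, C, L, theta, d):
    """S^e[i, k] = d[i] * sum_j theta[j] / ||P[i] + R[i] @ C[i, k] - L[j]||^2
    for eye i, photosensor k and light j; R[i] is eye i's rotation matrix."""
    nb_eyes, extero, _ = C.shape
    S = np.empty((nb_eyes, extero))
    for i in range(nb_eyes):
        for k in range(extero):
            pos = P[i] + R[i] @ C[i, k]               # photosensor position in space
            dists2 = np.sum((pos - L) ** 2, axis=1)   # squared distances to the lights
            S[i, k] = d[i] * np.sum(theta / dists2)
    return S

rng = np.random.default_rng(1)
nb_eyes, extero, nb_lights = 2, 3, 3
P = rng.normal(size=(nb_eyes, 3))                     # eye positions
R = np.stack([np.eye(3)] * nb_eyes)                   # identity rotations, for simplicity
C = rng.normal(scale=0.1, size=(nb_eyes, extero, 3))  # photosensor offsets (retina)
L = rng.normal(size=(nb_lights, 3))                   # light positions
theta = rng.uniform(0.5, 1.0, size=nb_lights)         # light luminances
d = rng.uniform(0.5, 1.0, size=nb_eyes)               # per-eye constants
S_e = extero_inputs(P, R, C, L, theta, d)             # shape (nb_eyes, extero)
```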
get_variations()[source]¶
Computes the variations in the exteroceptive inputs when:

- only the environment changes (result stored in `self.env_variations`)
- only the motor commands change (result stored in `self.mot_variations`)
- both change (result stored in `self.env_mot_variations`)
neighborhood_lin_approx(size)[source]¶
Neighborhood linear approximation.

Parameters: size (int) – Dimension of the returned vector

Returns: rand_vect – Random vector drawn from a normal distribution with mean zero and standard deviation `neighborhood_size`, where coordinates differing from 0 by more than the standard deviation have been set equal to 0

Return type: (size,) array
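A minimal sketch of this behaviour, assuming `numpy` and taking the clipping rule literally (not the class's exact code):

```python
import numpy as np

def neighborhood_lin_approx(size, neighborhood_size=1e-8, rng=None):
    """Draw a vector from N(0, neighborhood_size) and set to 0 every
    coordinate farther than one standard deviation from 0."""
    rng = np.random.default_rng() if rng is None else rng
    rand_vect = rng.normal(0.0, neighborhood_size, size)
    rand_vect[np.abs(rand_vect) > neighborhood_size] = 0.0  # clip outliers to 0
    return rand_vect

v = neighborhood_lin_approx(10)
# Every surviving coordinate lies within one standard deviation of 0.
```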