
This function calculates several entropic information measures for two variates (each of which may itself consist of several joint variates): the mutual information, the conditional entropies, and the entropies.

Usage

mutualinfo(
  Y1names,
  Y2names,
  X = NULL,
  learnt,
  nsamples = 3600,
  unit = "Sh",
  parallel = TRUE,
  silent = TRUE
)
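
A minimal sketch of a call (the variate names and the 'learnt' path below are hypothetical placeholders, and the package providing mutualinfo() is assumed to be attached):

## Hypothetical example: all names and paths are placeholders
result <- mutualinfo(
  Y1names = "diagnosis",            # first group: a single variate
  Y2names = c("age", "biomarker"),  # second group: two joint variates
  learnt = "path/to/learnt_dir",    # directory containing 'learnt.rds'
  nsamples = 3600,                  # Monte Carlo sample size
  unit = "Sh"                       # report results in shannons
)
result$MI  # mutual information: value and error estimate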

Arguments

Y1names

String vector: names of the first group of joint variates

Y2names

String vector or NULL: names of the second group of joint variates

X

Matrix, data.frame, or NULL: values of variates on which the computed information measures are to be conditioned.

learnt

Either a string giving the name of a directory or the full path of a 'learnt.rds' object, or such an object itself.

nsamples

Numeric: number of samples used to approximate the mutual information. Default: 3600.

unit

One of 'Sh' for shannons (default), 'Hart' for hartleys, or 'nat' for natural units; alternatively, a positive real number giving the base of the logarithms to be used (a conversion sketch follows this argument list).

parallel

Logical or numeric: whether to use pre-existing parallel workers, or how many workers to create and use.

silent

Logical: whether to suppress warnings and progress updates during the computation (default TRUE).
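
The information units accepted by the unit argument differ only in the base of the logarithm, so a result can be rescaled after the fact. A minimal sketch of the standard conversions (the variable mi_sh is hypothetical):

mi_sh   <- 0.25               # a mutual-information value in shannons (base-2 logarithms)
mi_nat  <- mi_sh * log(2)     # shannons to nats:     1 Sh = ln(2) nat
mi_hart <- mi_sh * log10(2)   # shannons to hartleys: 1 Sh = log10(2) Hart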

Value

A list with elements MI, CondEn12, CondEn21, En1, En2, MImax, unit, Y1names, Y2names. All elements except unit, Y1names, Y2names are two-element vectors giving a value and its error estimate. Element MI is the mutual information between the (joint) variates Y1names and the (joint) variates Y2names. Element CondEn12 is the conditional entropy of the first variate given the second, and vice versa for CondEn21. Elements En1 and En2 are the (differential) entropies of the first and second variates. Element MImax is the maximum possible value of the mutual information. Elements unit, Y1names, Y2names are identical to the corresponding inputs.
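
As an illustration of how the returned list might be inspected (assuming the hypothetical result object from the sketch under Usage; the indexing assumes the value precedes the error in each value-error vector):

result$MI        # mutual information, in the chosen unit
result$CondEn12  # conditional entropy of the first variate given the second
result$En1       # (differential) entropy of the first variate

## a normalized mutual information in [0, 1], using MImax as the scale
nmi <- result$MI[1] / result$MImax[1]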