
pgm_head

Bases: head

A probabilistic graphical model (PGM)-based head for implementing multi-channel modules.

This head supports combinatorial normal expansion and optional parameter reconciliation and residual connections. It is tailored for use with probabilistic graphical models.

Attributes:

| Name | Type | Description |
|------|------|-------------|
| `m` | `int` | Input dimension of the head. |
| `n` | `int` | Output dimension of the head. |
| `d` | `int` | Degree for combinatorial expansion. |
| `with_replacement` | `bool` | Whether combinations are generated with replacement. |
| `channel_num` | `int` | Number of channels for multi-channel processing. |
| `parameters_init_method` | `str` | Initialization method for parameters. |
| `device` | `str` | Device to host the head (e.g., 'cpu' or 'cuda'). |
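The `d` and `with_replacement` attributes together determine how many feature combinations the expansion produces, and hence the width of the expanded input. As a rough sketch (the exact expansion tinybig applies may differ; `num_feature_tuples` is a hypothetical helper, not part of the library), the tuple count over `m` inputs for degrees 1 through `d` can be derived with `itertools`:

```python
from itertools import combinations, combinations_with_replacement

def num_feature_tuples(m: int, d: int, with_replacement: bool) -> int:
    # Count index tuples of sizes 1..d drawn from m input features;
    # this count governs the width of a combinatorial expansion.
    chooser = combinations_with_replacement if with_replacement else combinations
    return sum(1 for k in range(1, d + 1) for _ in chooser(range(m), k))

print(num_feature_tuples(4, 2, with_replacement=False))  # C(4,1) + C(4,2) = 10
print(num_feature_tuples(4, 2, with_replacement=True))   # 4 + C(5,2) = 14
```

With replacement, repeated indices such as `(0, 0)` are allowed, so the expansion grows faster in `d`.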

Methods:

| Name | Description |
|------|-------------|
| `__init__` | Initializes the PGM head with specified configurations. |

Source code in tinybig/head/basic_heads.py
class pgm_head(head):
    """
    A probabilistic graphical model (PGM)-based head for implementing multi-channel modules.

    This head supports combinatorial normal expansion and optional parameter reconciliation
    and residual connections. It is tailored for use with probabilistic graphical models.

    Attributes
    ----------
    m : int
        Input dimension of the head.
    n : int
        Output dimension of the head.
    d : int
        Degree for combinatorial expansion.
    with_replacement : bool
        Whether combinations are generated with replacement.
    channel_num : int
        Number of channels for multi-channel processing.
    parameters_init_method : str
        Initialization method for parameters.
    device : str
        Device to host the head (e.g., 'cpu' or 'cuda').

    Methods
    -------
    __init__(...)
        Initializes the PGM head with specified configurations.
    """
    def __init__(
        self, m: int, n: int,
        name: str = 'perceptron_head',
        distribution: str = 'normal',
        d: int = 2, with_replacement: bool = False,
        enable_bias: bool = False,
        # optional parameters
        with_lorr: bool = False,
        r: int = 3,
        with_residual: bool = False,
        channel_num: int = 1,
        # other parameters
        parameters_init_method: str = 'xavier_normal',
        device: str = 'cpu', *args, **kwargs
    ):
        """
        Initializes the PGM head.

        Parameters
        ----------
        m : int
            Input dimension.
        n : int
            Output dimension.
        name : str, optional
            Name of the head, default is 'perceptron_head'.
        distribution : str, optional
            Distribution type for combinatorial expansion, default is 'normal'.
        d : int, optional
            Degree for combinatorial expansion, default is 2.
        with_replacement : bool, optional
            Whether combinations are generated with replacement, default is False.
        enable_bias : bool, optional
            Whether to enable bias in reconciliation functions, default is False.
        with_lorr : bool, optional
            Whether to use LORR reconciliation, default is False.
        r : int, optional
            Rank for the low-rank (LORR) reconciliation, default is 3.
        with_residual : bool, optional
            Whether to include a residual connection, default is False.
        channel_num : int, optional
            Number of channels for multi-channel processing, default is 1.
        parameters_init_method : str, optional
            Initialization method for parameters, default is 'xavier_normal'.
        device : str, optional
            Device to host the head, default is 'cpu'.

        Returns
        -------
        None
        """
        if distribution == 'normal':
            data_transformation = combinatorial_normal_expansion(
                d=d, with_replacement=with_replacement,
                device=device,
            )
        else:
            raise ValueError('tinybig only supports normal, exponential, cauchy, gamma, laplace or chi2 distributions...')

        if with_lorr:
            parameter_fabrication = lorr_reconciliation(
                r=r,
                enable_bias=enable_bias,
                device=device,
            )
        else:
            parameter_fabrication = identity_reconciliation(
                enable_bias=enable_bias,
                device=device,
            )

        if with_residual:
            remainder = linear_remainder(
                device=device
            )
        else:
            remainder = zero_remainder(
                device=device,
            )

        super().__init__(
            m=m, n=n, name=name,
            data_transformation=data_transformation,
            parameter_fabrication=parameter_fabrication,
            remainder=remainder,
            channel_num=channel_num,
            parameters_init_method=parameters_init_method,
            device=device, *args, **kwargs
        )
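When `with_lorr=True`, the head swaps the identity reconciliation for a low-rank one parameterized by `r`. Assuming the usual low-rank formulation, where the weight matrix is fabricated as the product of an n-by-r and an r-by-D factor (a sketch of the idea, not the actual tinybig internals), the parameter savings look like this:

```python
import numpy as np

n, D, r = 64, 1000, 3               # output dim, expanded input dim, rank
rng = np.random.default_rng(0)
A = rng.standard_normal((n, r))     # learned factor, shape (n, r)
B = rng.standard_normal((r, D))     # learned factor, shape (r, D)
W = A @ B                           # fabricated weight matrix, rank <= r

full_params = n * D                 # a dense W would need n * D entries
lorr_params = r * (n + D)           # the two factors need only r * (n + D)
print(W.shape, full_params, lorr_params)  # (64, 1000) 64000 3192
```

For small `r`, the factorized form needs far fewer learnable parameters than a dense weight matrix of the same shape, which is the point of enabling LORR on wide combinatorial expansions.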

__init__(m, n, name='perceptron_head', distribution='normal', d=2, with_replacement=False, enable_bias=False, with_lorr=False, r=3, with_residual=False, channel_num=1, parameters_init_method='xavier_normal', device='cpu', *args, **kwargs)

Initializes the PGM head.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `m` | `int` | Input dimension. | *required* |
| `n` | `int` | Output dimension. | *required* |
| `name` | `str` | Name of the head. | `'perceptron_head'` |
| `distribution` | `str` | Distribution type for combinatorial expansion. | `'normal'` |
| `d` | `int` | Degree for combinatorial expansion. | `2` |
| `with_replacement` | `bool` | Whether combinations are generated with replacement. | `False` |
| `enable_bias` | `bool` | Whether to enable bias in reconciliation functions. | `False` |
| `with_lorr` | `bool` | Whether to use LORR reconciliation. | `False` |
| `r` | `int` | Rank for the low-rank (LORR) reconciliation. | `3` |
| `with_residual` | `bool` | Whether to include a residual connection. | `False` |
| `channel_num` | `int` | Number of channels for multi-channel processing. | `1` |
| `parameters_init_method` | `str` | Initialization method for parameters. | `'xavier_normal'` |
| `device` | `str` | Device to host the head. | `'cpu'` |

Returns:

| Type | Description |
|------|-------------|
| `None` | |
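Putting the three configured components together: a head of this family computes the expanded input `kappa(x)`, applies the reconciled (fabricated) weights, and adds a remainder that is zero by default or a linear map of the raw input when `with_residual=True`. A minimal, hypothetical numpy sketch of that flow (`head_forward` and the degree-2 stand-in expansion are illustrative only, not the actual tinybig forward pass):

```python
import numpy as np

def head_forward(x, W, W_res=None):
    # Stand-in data expansion: append degree-2 terms to x.
    kappa = np.concatenate([x, x ** 2])
    y = W @ kappa                   # reconciled weights applied to expansion
    if W_res is not None:           # linear remainder (with_residual=True)
        y = y + W_res @ x
    return y                        # zero remainder otherwise

m, n = 3, 2
x = np.ones(m)
W = np.zeros((n, 2 * m))            # zeroed weights isolate the remainder term
W_res = np.eye(n, m)
print(head_forward(x, W))           # zero remainder -> [0. 0.]
print(head_forward(x, W, W_res))    # residual path passes x through -> [1. 1.]
```

With the weights zeroed out, the output is exactly the remainder term, which makes the effect of `with_residual` easy to see in isolation.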