TfELM
Public Member Functions | Static Public Member Functions | Public Attributes | List of all members
ActivationFunction.ActivationFunction Class Reference

Public Member Functions

 __init__ (self, act_name="", act_param=1.0, act_param2=1.0, knots=None)
 
 leaky_relu (self, x)
 
 prelu (self, x)
 
 elu (self, x)
 
 isru (self, x)
 
 isrlu (self, x)
 
 selu (self, x)
 
 ssigmoid (self, x)
 
 swish (self, x)
 
 hardshrink (self, x)
 
 softshrink (self, x)
 
 sqrelu (self, x)
 
 softexp (self, x)
 
 tlu (self, x)
 
 aq (self, x)
 
 imq (self, x)
 
 gswish (self, x)
 
 invgamma (self, x)
 
 softclip (self, x)
 
 soft_exponential (self, x)
 
 srelu (self, x)
 
 aqrelu (self, x)
 
 leaky_softplus (self, x)
 
 smooth_sigmoid (self, x)
 
 soft_clipping (self, x)
 
 hard_shrink (self, x)
 
 smooth_hard_tanh (self, x)
 
 egaulu (self, x)
 
 bent_identity_smoothed (self, x)
 
 inv_multiquadratic (self, x)
 
 asymmetric_gaussian (self, x)
 
 inv_quadratic (self, x)
 
 gaussian_squared (self, x)
 
 symmetric_sigmoid (self, x)
 
 inv_cubic (self, x)
 
 cauchy (self, x)
 
 exponential_quadratic (self, x)
 
 rational_quadratic (self, x)
 
 cubic_spline (self, x)
 
 symmetric_soft_clipping (self, x)
 
 binary_step (self, x)
 
 imrbf (self, x)
 
 nrelu (self, x)
 

Static Public Member Functions

 identity (x)
 
 sigmoid (x)
 
 tanh (x)
 
 relu (x)
 
 softplus (x)
 
 bent_identity (x)
 
 gaussian (x)
 
 sinusoidal (x)
 
 softmax (x)
 
 silu (x)
 
 gelu (x)
 
 log (x)
 
 cube (x)
 
 inverse (x)
 
 mish (x)
 
 bis (x)
 
 gompertz (x)
 
 elliott (x)
 
 isq (x)
 
 sine (x)
 
 arctan (x)
 
 sin_transfer (x)
 
 hsigmoid (x)
 
 tsigmoid (x)
 
 arcsinh (x)
 
 logit (x)
 
 logsigmoid (x)
 
 cosine (x)
 
 relu_cos (x)
 
 cos_sigmoid (x)
 
 triangular (x)
 
 hardtanh (x)
 
 inverse_sine (x)
 
 bezier (x)
 
 bsigmoid (x)
 
 power (x, a=1.0)
 
 inverse_cosine (x)
 
 sinusoid (x)
 
 inv_logit (x)
 
 inverse_tangent (x)
 
 hswish (x)
 
 gelu2 (x)
 
 sinusoid2 (x)
 
 inverse_tanh (x)
 
 gaussian_tangent (x)
 
 exp_cosine (x)
 
 gaussian_cdf (x)
 
 hmish (x)
 
 log_exp (x)
 
 cubic (x)
 
 exp_sine (x)
 
 sym_sigmoid (x)
 
 square (x)
 
 swish_gaussian (x)
 
 bipolar_sigmoid (x)
 
 log_sigmoid (x)
 
 hard_sigmoid (x)
 
 invsqrt (x)
 
 gauss_tanh (x)
 
 logarithm (x)
 
 inv_sine (x)
 
 hard_tanh (x)
 
 pos_softplus (x)
 
 inv_cosine (x)
 
 cloglog (x)
 

Public Attributes

 act_param
 
 act_param2
 
 knots
 

Detailed Description

    A class containing various activation functions.

    Attributes:
    -----------
    - act_param (float): The parameter used by some activation functions. Defaults to 1.0.
    - act_param2 (float): The second parameter used by some activation functions. Defaults to 1.0.
    - knots (list): A list of knots used by the cubic spline function. Defaults to [1, 1, 1, 1, 1].

    Methods:
    -----------
    - identity(x): Identity function.
    - sigmoid(x): Sigmoid function.
    - tanh(x): Hyperbolic tangent function.
    - relu(x): Rectified Linear Unit (ReLU) function.
    - leaky_relu(x): Leaky ReLU function.
    - prelu(x): Parametric ReLU function.
    - elu(x): Exponential Linear Unit (ELU) function.
    - softplus(x): Softplus function.
    - bent_identity(x): Bent Identity function.
    - gaussian(x): Gaussian function.
    - sinusoidal(x): Sinusoidal function.
    - isru(x): Inverse Square Root Unit (ISRU) function.
    - isrlu(x): Inverse Square Root Linear Unit (ISRLU) function.
    - selu(x): Scaled Exponential Linear Unit (SELU) function.
    - softmax(x): Softmax function.
    - ssigmoid(x): Symmetric Sigmoid function.
    - silu(x): SiLU (Swish) function.
    - gelu(x): Gaussian Error Linear Units (GELU) function.
    - log(x): Logarithmic function.
    - cube(x): Cubic function.
    - inverse(x): Inverse function.
    - swish(x): Swish function.
    - mish(x): Mish function.
    - bis(x): Bent Identity Smoothed function.
    - gompertz(x): Gompertz function.
    - elliott(x): Elliott function.
    - isq(x): Inverse Square function.
    - hardshrink(x): Hard Shrink function.
    - softshrink(x): Soft Shrink function.
    - sqrelu(x): Squared Rectified Linear Unit (SQReLU) function.
    - sine(x): Sine function.
    - softexp(x): Soft Exponential function.
    - arctan(x): Arctan function.
    - sin_transfer(x): Sinusoidal Transfer function.
    - hsigmoid(x): Hard Sigmoid function.
    - tsigmoid(x): Tangent Sigmoid function.
    - arcsinh(x): ArcSinH function.
    - logit(x): Logit function.
    - tlu(x): Truncated Linear Unit (TLU) function.
    - aq(x): Asymmetric Quadratic function.
    - logsigmoid(x): Logarithmic Sigmoid function.
    - cosine(x): Cosine function.
    - relu_cos(x): Rectified Cosine function.
    - imq(x): Inverse Multiquadratic function.
    - cos_sigmoid(x): Cosine Sigmoid function.
    - triangular(x): Triangular function.
    - hardtanh(x): Hard Tanh function.
    - inverse_sine(x): Inverse Sine function.
    - bezier(x): Quadratic Bezier function.
    - bsigmoid(x): Bipolar Sigmoid function.
    - power(x, a=1.0): Power function.
    - gswish(x): Gaussian Swish function.
    - invgamma(x): Inverse Gamma function.
    - softclip(x): Soft Clip function.
    - inverse_cosine(x): Inverse Cosine function.
    - sinusoid(x): Sinusoid function.
    - inv_logit(x): Inverse Logit function.
    - soft_exponential(x): Soft Exponential function.
    - srelu(x): Smooth Rectified Linear Unit (SReLU) function.
    - inverse_tangent(x): Inverse Tangent function.
    - hswish(x): Hard Swish function.
    - aqrelu(x): Asymmetric Quadratic ReLU function.
    - gelu2(x): Gaussian Error Linear Units 2 (GELU2) function.
    - sinusoid2(x): Sinusoid 2 function.
    - inverse_tanh(x): Inverse Hyperbolic Tangent function.
    - leaky_softplus(x): Leaky Softplus function.
    - gaussian_tangent(x): Gaussian Tangent function.
    - exp_cosine(x): Exponential Cosine function.
    - gaussian_cdf(x): Gaussian Cumulative Distribution Function (CDF).
    - hmish(x): Hard-Mish function.
    - smooth_sigmoid(x): Smooth Sigmoid function.
    - log_exp(x): Logarithm of Exponential function.
    - cubic(x): Cubic function.
    - exp_sine(x): Exponential Sine function.
    - sym_sigmoid(x): Symmetric Sigmoid function.
    - square(x): Squared function.
    - soft_clipping(x): Soft Clipping function.
    - swish_gaussian(x): Swish-Gaussian function.
    - hard_shrink(x): Hard Shrink function.
    - smooth_hard_tanh(x): Smooth Hard Tanh function.
    - bipolar_sigmoid(x): Bipolar Sigmoid function.
    - log_sigmoid(x): Logarithmic Sigmoid function.
    - hard_sigmoid(x): Hard Sigmoid function.
    - invsqrt(x): Inverse Square Root function.
    - gauss_tanh(x): Gaussian Tangent Hyperbolic function.
    - egaulu(x): EGAULU function.
    - logarithm(x): Logarithm function.
    - inv_sine(x): Inverse Sine function.
    - hard_tanh(x): Hard Tanh function.
    - bent_identity_smoothed(x): Bent Identity Smoothed function.
    - pos_softplus(x): Positive Softplus function.
    - inv_multiquadratic(x): Inverse Multiquadratic function.
    - inv_cosine(x): Inverse Cosine function.
    - asymmetric_gaussian(x): Asymmetric Gaussian function.
    - inv_quadratic(x): Inverse Quadratic function.
    - gaussian_squared(x): Gaussian Squared function.
    - symmetric_sigmoid(x): Symmetric Sigmoid function.
    - inv_cubic(x): Inverse Cubic function.
    - cauchy(x): Cauchy function.
    - exponential_quadratic(x): Exponential Quadratic function.
    - rational_quadratic(x): Rational Quadratic function.
    - cubic_spline(x): Cubic Spline function.
    - symmetric_soft_clipping(x): Symmetric Soft Clipping function.
    - binary_step(x): Binary Step function.
    - imrbf(x): Inverse Multiquadratic Radial Basis Function (IMRBF).
    - cloglog(x): Complementary Log-Log (cLogLog) function.
    - nrelu(x): Noisy Rectified Linear Unit (NReLU) function.

Constructor & Destructor Documentation

◆ __init__()

ActivationFunction.ActivationFunction.__init__ ( self,
act_name = "",
act_param = 1.0,
act_param2 = 1.0,
knots = None )
    Initialize the ActivationFunction.

    Parameters:
    - act_name (str): Name of the activation function. Defaults to "".
    - act_param (float): The parameter used by some activation functions. Defaults to 1.0.
    - act_param2 (float): The second parameter used by some activation functions. Defaults to 1.0.
    - knots (list): A list of knots used by the cubic spline function. Defaults to [1, 1, 1, 1, 1].
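
A minimal usage sketch of the constructor semantics above. The stand-in class name `ActivationFunctionSketch` is hypothetical (the real class lives in the TfELM package and may store additional state); it only mirrors the documented defaults, plus one representative method:

```python
# Stand-in mirroring the documented constructor defaults (illustrative only;
# the real ActivationFunction class is part of the TfELM package).
class ActivationFunctionSketch:
    def __init__(self, act_name="", act_param=1.0, act_param2=1.0, knots=None):
        self.act_name = act_name
        self.act_param = act_param      # first shape parameter
        self.act_param2 = act_param2    # second shape parameter
        # Default knots match the documented [1, 1, 1, 1, 1]
        self.knots = knots if knots is not None else [1, 1, 1, 1, 1]

    def leaky_relu(self, x):
        # Standard Leaky ReLU: x for x >= 0, act_param * x otherwise
        # (assumption: act_param plays the role of the leak slope here)
        return x if x >= 0 else self.act_param * x

af = ActivationFunctionSketch(act_name="leaky_relu", act_param=0.1)
print(af.leaky_relu(-2.0))  # -0.2
```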

Member Function Documentation

◆ aq()

ActivationFunction.ActivationFunction.aq ( self,
x )
Asymmetric Quadratic function.

◆ aqrelu()

ActivationFunction.ActivationFunction.aqrelu ( self,
x )
Asymmetric Quadratic ReLU function.

◆ arcsinh()

ActivationFunction.ActivationFunction.arcsinh ( x)
static
ArcSinH function.

◆ arctan()

ActivationFunction.ActivationFunction.arctan ( x)
static
ArcTan function.

◆ asymmetric_gaussian()

ActivationFunction.ActivationFunction.asymmetric_gaussian ( self,
x )
Asymmetric Gaussian function.

◆ bent_identity()

ActivationFunction.ActivationFunction.bent_identity ( x)
static
Bent Identity function.

◆ bent_identity_smoothed()

ActivationFunction.ActivationFunction.bent_identity_smoothed ( self,
x )
Bent Identity Smoothed Function.

◆ bezier()

ActivationFunction.ActivationFunction.bezier ( x)
static
Quadratic Bezier Function.

◆ binary_step()

ActivationFunction.ActivationFunction.binary_step ( self,
x )
Binary Step Function.

◆ bipolar_sigmoid()

ActivationFunction.ActivationFunction.bipolar_sigmoid ( x)
static
Bipolar Sigmoid Function.

◆ bis()

ActivationFunction.ActivationFunction.bis ( x)
static
Bent Identity Smoothed function.

◆ bsigmoid()

ActivationFunction.ActivationFunction.bsigmoid ( x)
static
Bipolar Sigmoid Function.

◆ cauchy()

ActivationFunction.ActivationFunction.cauchy ( self,
x )
Cauchy Function.

◆ cloglog()

ActivationFunction.ActivationFunction.cloglog ( x)
static
Complementary Log-Log (cLogLog) Function.

◆ cos_sigmoid()

ActivationFunction.ActivationFunction.cos_sigmoid ( x)
static
Cosine Sigmoid Function.

◆ cosine()

ActivationFunction.ActivationFunction.cosine ( x)
static
Cosine function.

◆ cube()

ActivationFunction.ActivationFunction.cube ( x)
static
Cube function.

◆ cubic()

ActivationFunction.ActivationFunction.cubic ( x)
static
Cubic Function.

◆ cubic_spline()

ActivationFunction.ActivationFunction.cubic_spline ( self,
x )
Cubic Spline Function.

◆ egaulu()

ActivationFunction.ActivationFunction.egaulu ( self,
x )
EGAULU Function.

◆ elliott()

ActivationFunction.ActivationFunction.elliott ( x)
static
Elliott function.

◆ elu()

ActivationFunction.ActivationFunction.elu ( self,
x )
Exponential Linear Unit (ELU) function.
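
The standard ELU definition this name conventionally denotes, as a self-contained sketch (assumption: in this class, `act_param` presumably supplies the `alpha` below):

```python
import math

def elu(x, alpha=1.0):
    # Standard ELU: x for x > 0, alpha * (exp(x) - 1) otherwise.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(elu(1.5))             # 1.5 (identity on the positive side)
print(round(elu(-1.0), 4))  # alpha*(e^-1 - 1) ≈ -0.6321
```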

◆ exp_cosine()

ActivationFunction.ActivationFunction.exp_cosine ( x)
static
Exponential Cosine Function.

◆ exp_sine()

ActivationFunction.ActivationFunction.exp_sine ( x)
static
Exponential Sine Function.

◆ exponential_quadratic()

ActivationFunction.ActivationFunction.exponential_quadratic ( self,
x )
Exponential Quadratic Function.

◆ gauss_tanh()

ActivationFunction.ActivationFunction.gauss_tanh ( x)
static
Gaussian Tangent Hyperbolic Function.

◆ gaussian()

ActivationFunction.ActivationFunction.gaussian ( x)
static
Gaussian function.

◆ gaussian_cdf()

ActivationFunction.ActivationFunction.gaussian_cdf ( x)
static
Gaussian Cumulative Distribution Function (CDF).

◆ gaussian_squared()

ActivationFunction.ActivationFunction.gaussian_squared ( self,
x )
Gaussian Squared Function.

◆ gaussian_tangent()

ActivationFunction.ActivationFunction.gaussian_tangent ( x)
static
Gaussian Tangent Function.

◆ gelu()

ActivationFunction.ActivationFunction.gelu ( x)
static
Gaussian Error Linear Units (GELU) function.
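
A sketch of the exact GELU, x·Φ(x), using the Gaussian CDF via the error function (the library may use the common tanh approximation instead; unverified):

```python
import math

def gelu(x):
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

print(gelu(0.0))  # 0.0
```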

◆ gelu2()

ActivationFunction.ActivationFunction.gelu2 ( x)
static
Gaussian Error Linear Unit 2 (GELU2) Function.

◆ gompertz()

ActivationFunction.ActivationFunction.gompertz ( x)
static
Gompertz function.

◆ gswish()

ActivationFunction.ActivationFunction.gswish ( self,
x )
Gaussian Swish Function.

◆ hard_shrink()

ActivationFunction.ActivationFunction.hard_shrink ( self,
x )
Hard Shrink Function.

◆ hard_sigmoid()

ActivationFunction.ActivationFunction.hard_sigmoid ( x)
static
Hard Sigmoid Function.
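
One common piecewise-linear approximation of the sigmoid that this name often denotes (the Keras-style variant `clip(0.2*x + 0.5, 0, 1)`; the exact variant used by this library is unverified):

```python
def hard_sigmoid(x):
    # Keras-style hard sigmoid: linear ramp clipped to [0, 1].
    return min(1.0, max(0.0, 0.2 * x + 0.5))

print(hard_sigmoid(0.0))  # 0.5
```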

◆ hard_tanh()

ActivationFunction.ActivationFunction.hard_tanh ( x)
static
Hard Tanh Function.

◆ hardshrink()

ActivationFunction.ActivationFunction.hardshrink ( self,
x )
Hard Shrink function.

◆ hardtanh()

ActivationFunction.ActivationFunction.hardtanh ( x)
static
Hard-Tanh Function.

◆ hmish()

ActivationFunction.ActivationFunction.hmish ( x)
static
Hard-Mish Function.

◆ hsigmoid()

ActivationFunction.ActivationFunction.hsigmoid ( x)
static
Hard Sigmoid function.

◆ hswish()

ActivationFunction.ActivationFunction.hswish ( x)
static
Hard Swish Function.

◆ identity()

ActivationFunction.ActivationFunction.identity ( x)
static
Identity function.

◆ imq()

ActivationFunction.ActivationFunction.imq ( self,
x )
Inverse Multiquadratic Function.

◆ imrbf()

ActivationFunction.ActivationFunction.imrbf ( self,
x )
Inverse Multiquadratic Radial Basis Function (IMRBF).

◆ inv_cosine()

ActivationFunction.ActivationFunction.inv_cosine ( x)
static
Inverse Cosine Function.

◆ inv_cubic()

ActivationFunction.ActivationFunction.inv_cubic ( self,
x )
Inverse Cubic Function.

◆ inv_logit()

ActivationFunction.ActivationFunction.inv_logit ( x)
static
Inverse Logit Function.

◆ inv_multiquadratic()

ActivationFunction.ActivationFunction.inv_multiquadratic ( self,
x )
Inverse Multiquadratic Function.

◆ inv_quadratic()

ActivationFunction.ActivationFunction.inv_quadratic ( self,
x )
Inverse Quadratic Function.

◆ inv_sine()

ActivationFunction.ActivationFunction.inv_sine ( x)
static
Inverse Sine Function.

◆ inverse()

ActivationFunction.ActivationFunction.inverse ( x)
static
Inverse function.

◆ inverse_cosine()

ActivationFunction.ActivationFunction.inverse_cosine ( x)
static
Inverse Cosine Function.

◆ inverse_sine()

ActivationFunction.ActivationFunction.inverse_sine ( x)
static
Inverse Sine Function.

◆ inverse_tangent()

ActivationFunction.ActivationFunction.inverse_tangent ( x)
static
Inverse Tangent Function.

◆ inverse_tanh()

ActivationFunction.ActivationFunction.inverse_tanh ( x)
static
Inverse Hyperbolic Tangent Function.

◆ invgamma()

ActivationFunction.ActivationFunction.invgamma ( self,
x )
Inverse Gamma Function.

◆ invsqrt()

ActivationFunction.ActivationFunction.invsqrt ( x)
static
Inverse Square Root Function.

◆ isq()

ActivationFunction.ActivationFunction.isq ( x)
static
Inverse Square function.

◆ isrlu()

ActivationFunction.ActivationFunction.isrlu ( self,
x )
Inverse Square Root Linear Unit (ISRLU) function.

◆ isru()

ActivationFunction.ActivationFunction.isru ( self,
x )
Inverse Square Root Unit (ISRU) function.

◆ leaky_relu()

ActivationFunction.ActivationFunction.leaky_relu ( self,
x )
Leaky Rectified Linear Unit (ReLU) function.

◆ leaky_softplus()

ActivationFunction.ActivationFunction.leaky_softplus ( self,
x )
Leaky Softplus Function.

◆ log()

ActivationFunction.ActivationFunction.log ( x)
static
Logarithmic function.

◆ log_exp()

ActivationFunction.ActivationFunction.log_exp ( x)
static
Logarithm of Exponential Function.

◆ log_sigmoid()

ActivationFunction.ActivationFunction.log_sigmoid ( x)
static
Logarithmic Sigmoid Function.

◆ logarithm()

ActivationFunction.ActivationFunction.logarithm ( x)
static
Logarithm Function.

◆ logit()

ActivationFunction.ActivationFunction.logit ( x)
static
Logit function.

◆ logsigmoid()

ActivationFunction.ActivationFunction.logsigmoid ( x)
static
Logarithmic Sigmoid function.

◆ mish()

ActivationFunction.ActivationFunction.mish ( x)
static
Mish function.
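
The standard Mish definition, x·tanh(softplus(x)), as a self-contained sketch:

```python
import math

def mish(x):
    # Mish: x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x).
    return x * math.tanh(math.log1p(math.exp(x)))

print(mish(0.0))  # 0.0
```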

◆ nrelu()

ActivationFunction.ActivationFunction.nrelu ( self,
x )
Noisy Rectified Linear Unit (NReLU) Function.

◆ pos_softplus()

ActivationFunction.ActivationFunction.pos_softplus ( x)
static
Positive Softplus Function.

◆ power()

ActivationFunction.ActivationFunction.power ( x,
a = 1.0 )
static
Power function.

◆ prelu()

ActivationFunction.ActivationFunction.prelu ( self,
x )
Parametric Rectified Linear Unit (PReLU) function.

◆ rational_quadratic()

ActivationFunction.ActivationFunction.rational_quadratic ( self,
x )
Rational Quadratic Function.

◆ relu()

ActivationFunction.ActivationFunction.relu ( x)
static
Rectified Linear Unit (ReLU) function.

◆ relu_cos()

ActivationFunction.ActivationFunction.relu_cos ( x)
static
Rectified Cosine Function.

◆ selu()

ActivationFunction.ActivationFunction.selu ( self,
x )
Scaled Exponential Linear Unit (SELU) function.
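
The standard SELU definition with the self-normalizing constants from Klambauer et al.; whether this class hard-codes these constants or derives them from `act_param`/`act_param2` is an assumption either way:

```python
import math

# Standard self-normalizing constants.
SELU_LAMBDA = 1.0507009873554805
SELU_ALPHA = 1.6732632423543772

def selu(x):
    # SELU: scale * (x for x > 0, alpha * (exp(x) - 1) otherwise).
    return SELU_LAMBDA * (x if x > 0 else SELU_ALPHA * (math.exp(x) - 1.0))
```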

◆ sigmoid()

ActivationFunction.ActivationFunction.sigmoid ( x)
static
Sigmoid function.
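
The logistic sigmoid, written in the numerically stable two-branch form (avoids overflow in `exp` for large negative inputs):

```python
import math

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x)), branch chosen for stability.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

print(sigmoid(0.0))  # 0.5
```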

◆ silu()

ActivationFunction.ActivationFunction.silu ( x)
static
SiLU (Swish) function.

◆ sin_transfer()

ActivationFunction.ActivationFunction.sin_transfer ( x)
static
Sinusoidal Transfer function.

◆ sine()

ActivationFunction.ActivationFunction.sine ( x)
static
Sine function.

◆ sinusoid()

ActivationFunction.ActivationFunction.sinusoid ( x)
static
Sinusoid Function.

◆ sinusoid2()

ActivationFunction.ActivationFunction.sinusoid2 ( x)
static
Sinusoidal Function 2.

◆ sinusoidal()

ActivationFunction.ActivationFunction.sinusoidal ( x)
static
Sinusoidal function.

◆ smooth_hard_tanh()

ActivationFunction.ActivationFunction.smooth_hard_tanh ( self,
x )
Smooth Hard Tanh Function.

◆ smooth_sigmoid()

ActivationFunction.ActivationFunction.smooth_sigmoid ( self,
x )
Smooth Sigmoid Function.

◆ soft_clipping()

ActivationFunction.ActivationFunction.soft_clipping ( self,
x )
Soft Clipping Function.

◆ soft_exponential()

ActivationFunction.ActivationFunction.soft_exponential ( self,
x )
Soft Exponential Function.

◆ softclip()

ActivationFunction.ActivationFunction.softclip ( self,
x )
Soft Clip Function.

◆ softexp()

ActivationFunction.ActivationFunction.softexp ( self,
x )
Soft Exponential function.

◆ softmax()

ActivationFunction.ActivationFunction.softmax ( x)
static
Softmax function.
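
The standard softmax over a vector, in the numerically stable max-shifted form:

```python
import math

def softmax(xs):
    # Stable softmax: subtract the max before exponentiating,
    # then normalize so the outputs sum to 1.
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([1.0, 1.0]))  # [0.5, 0.5]
```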

◆ softplus()

ActivationFunction.ActivationFunction.softplus ( x)
static
SoftPlus function.

◆ softshrink()

ActivationFunction.ActivationFunction.softshrink ( self,
x )
Soft Shrink function.

◆ sqrelu()

ActivationFunction.ActivationFunction.sqrelu ( self,
x )
Squared Rectified Linear Unit (SQReLU) function.

◆ square()

ActivationFunction.ActivationFunction.square ( x)
static
Squared Function.

◆ srelu()

ActivationFunction.ActivationFunction.srelu ( self,
x )
Smooth Rectified Linear Unit (SReLU) Function.

◆ ssigmoid()

ActivationFunction.ActivationFunction.ssigmoid ( self,
x )
Symmetric Sigmoid function.

◆ swish()

ActivationFunction.ActivationFunction.swish ( self,
x )
Swish function.

◆ swish_gaussian()

ActivationFunction.ActivationFunction.swish_gaussian ( x)
static
Swish-Gaussian Function.

◆ sym_sigmoid()

ActivationFunction.ActivationFunction.sym_sigmoid ( x)
static
Symmetric Sigmoid Function.

◆ symmetric_sigmoid()

ActivationFunction.ActivationFunction.symmetric_sigmoid ( self,
x )
Symmetric Sigmoid Function.

◆ symmetric_soft_clipping()

ActivationFunction.ActivationFunction.symmetric_soft_clipping ( self,
x )
Symmetric Soft Clipping Function.

◆ tanh()

ActivationFunction.ActivationFunction.tanh ( x)
static
Hyperbolic tangent function.

◆ tlu()

ActivationFunction.ActivationFunction.tlu ( self,
x )
Truncated Linear Unit (TLU) function.

◆ triangular()

ActivationFunction.ActivationFunction.triangular ( x)
static
Triangular Function.

◆ tsigmoid()

ActivationFunction.ActivationFunction.tsigmoid ( x)
static
Tangent Sigmoid function.

The documentation for this class was generated from the following file: