TfELM
ActivationFunction.ActivationFunction Class Reference
Public Member Functions

__init__(self, act_name="", act_param=1.0, act_param2=1.0, knots=None)
leaky_relu(self, x)
prelu(self, x)
elu(self, x)
isru(self, x)
isrlu(self, x)
selu(self, x)
ssigmoid(self, x)
swish(self, x)
hardshrink(self, x)
softshrink(self, x)
sqrelu(self, x)
softexp(self, x)
tlu(self, x)
aq(self, x)
imq(self, x)
gswish(self, x)
invgamma(self, x)
softclip(self, x)
soft_exponential(self, x)
srelu(self, x)
aqrelu(self, x)
leaky_softplus(self, x)
smooth_sigmoid(self, x)
soft_clipping(self, x)
hard_shrink(self, x)
smooth_hard_tanh(self, x)
egaulu(self, x)
bent_identity_smoothed(self, x)
inv_multiquadratic(self, x)
asymmetric_gaussian(self, x)
inv_quadratic(self, x)
gaussian_squared(self, x)
symmetric_sigmoid(self, x)
inv_cubic(self, x)
cauchy(self, x)
exponential_quadratic(self, x)
rational_quadratic(self, x)
cubic_spline(self, x)
symmetric_soft_clipping(self, x)
binary_step(self, x)
imrbf(self, x)
nrelu(self, x)
Static Public Member Functions

identity(x)
sigmoid(x)
tanh(x)
relu(x)
softplus(x)
bent_identity(x)
gaussian(x)
sinusoidal(x)
softmax(x)
silu(x)
gelu(x)
log(x)
cube(x)
inverse(x)
mish(x)
bis(x)
gompertz(x)
elliott(x)
isq(x)
sine(x)
arctan(x)
sin_transfer(x)
hsigmoid(x)
tsigmoid(x)
arcsinh(x)
logit(x)
logsigmoid(x)
cosine(x)
relu_cos(x)
cos_sigmoid(x)
triangular(x)
hardtanh(x)
inverse_sine(x)
bezier(x)
bsigmoid(x)
power(x, a=1.0)
inverse_cosine(x)
sinusoid(x)
inv_logit(x)
inverse_tangent(x)
hswish(x)
gelu2(x)
sinusoid2(x)
inverse_tanh(x)
gaussian_tangent(x)
exp_cosine(x)
gaussian_cdf(x)
hmish(x)
log_exp(x)
cubic(x)
exp_sine(x)
sym_sigmoid(x)
square(x)
swish_gaussian(x)
bipolar_sigmoid(x)
log_sigmoid(x)
hard_sigmoid(x)
invsqrt(x)
gauss_tanh(x)
logarithm(x)
inv_sine(x)
hard_tanh(x)
pos_softplus(x)
inv_cosine(x)
cloglog(x)
Public Attributes
act_param | |
act_param2 | |
knots | |
A class containing various activation functions.

Attributes:
-----------
- act_param (float): The parameter used by some activation functions. Defaults to 1.0.
- act_param2 (float): The second parameter used by some activation functions. Defaults to 1.0.
- knots (list): A list of knots used by the cubic spline function. Defaults to [1, 1, 1, 1, 1].

Methods:
-----------
- identity(x): Identity function.
- sigmoid(x): Sigmoid function.
- tanh(x): Hyperbolic tangent function.
- relu(x): Rectified Linear Unit (ReLU) function.
- leaky_relu(x): Leaky ReLU function.
- prelu(x): Parametric ReLU function.
- elu(x): Exponential Linear Unit (ELU) function.
- softplus(x): Softplus function.
- bent_identity(x): Bent Identity function.
- gaussian(x): Gaussian function.
- sinusoidal(x): Sinusoidal function.
- isru(x): Inverse Square Root Unit (ISRU) function.
- isrlu(x): Inverse Square Root Linear Unit (ISRLU) function.
- selu(x): Scaled Exponential Linear Unit (SELU) function.
- softmax(x): Softmax function.
- ssigmoid(x): Symmetric Sigmoid function.
- silu(x): SiLU (Swish) function.
- gelu(x): Gaussian Error Linear Units (GELU) function.
- log(x): Logarithmic function.
- cube(x): Cubic function.
- inverse(x): Inverse function.
- swish(x): Swish function.
- mish(x): Mish function.
- bis(x): Bent Identity Smoothed function.
- gompertz(x): Gompertz function.
- elliott(x): Elliott function.
- isq(x): Inverse Square function.
- hardshrink(x): Hard Shrink function.
- softshrink(x): Soft Shrink function.
- sqrelu(x): Squared Rectified Linear Unit (SQReLU) function.
- sine(x): Sine function.
- softexp(x): Soft Exponential function.
- arctan(x): Arctan function.
- sin_transfer(x): Sinusoidal Transfer function.
- hsigmoid(x): Hard Sigmoid function.
- tsigmoid(x): Tangent Sigmoid function.
- arcsinh(x): ArcSinH function.
- logit(x): Logit function.
- tlu(x): Truncated Linear Unit (TLU) function.
- aq(x): Asymmetric Quadratic function.
- logsigmoid(x): Logarithmic Sigmoid function.
- cosine(x): Cosine function.
- relu_cos(x): Rectified Cosine function.
- imq(x): Inverse Multiquadratic function.
- cos_sigmoid(x): Cosine Sigmoid function.
- triangular(x): Triangular function.
- hardtanh(x): Hard Tanh function.
- inverse_sine(x): Inverse Sine function.
- bezier(x): Quadratic Bezier function.
- bsigmoid(x): Bipolar Sigmoid function.
- power(x, a=1.0): Power function.
- gswish(x): Gaussian Swish function.
- invgamma(x): Inverse Gamma function.
- softclip(x): Soft Clip function.
- inverse_cosine(x): Inverse Cosine function.
- sinusoid(x): Sinusoid function.
- inv_logit(x): Inverse Logit function.
- soft_exponential(x): Soft Exponential function.
- srelu(x): Smooth Rectified Linear Unit (SReLU) function.
- inverse_tangent(x): Inverse Tangent function.
- hswish(x): Hard Swish function.
- aqrelu(x): Asymmetric Quadratic ReLU function.
- gelu2(x): Gaussian Error Linear Units 2 (GELU2) function.
- sinusoid2(x): Sinusoid 2 function.
- inverse_tanh(x): Inverse Hyperbolic Tangent function.
- leaky_softplus(x): Leaky Softplus function.
- gaussian_tangent(x): Gaussian Tangent function.
- exp_cosine(x): Exponential Cosine function.
- gaussian_cdf(x): Gaussian Cumulative Distribution Function (CDF) function.
- hmish(x): Hard-Mish function.
- smooth_sigmoid(x): Smooth Sigmoid function.
- log_exp(x): Logarithm of Exponential function.
- cubic(x): Cubic function.
- exp_sine(x): Exponential Sine function.
- sym_sigmoid(x): Symmetric Sigmoid function.
- square(x): Squared function.
- soft_clipping(x): Soft Clipping function.
- swish_gaussian(x): Swish-Gaussian function.
- hard_shrink(x): Hard Shrink function.
- smooth_hard_tanh(x): Smooth Hard Tanh function.
- bipolar_sigmoid(x): Bipolar Sigmoid function.
- log_sigmoid(x): Logarithmic Sigmoid function.
- hard_sigmoid(x): Hard Sigmoid function.
- invsqrt(x): Inverse Square Root function.
- gauss_tanh(x): Gaussian Tangent Hyperbolic function.
- egaulu(x): EGAULU function.
- logarithm(x): Logarithm function.
- inv_sine(x): Inverse Sine function.
- hard_tanh(x): Hard Tanh function.
- bent_identity_smoothed(x): Bent Identity Smoothed function.
- pos_softplus(x): Positive Softplus function.
- inv_multiquadratic(x): Inverse Multiquadratic function.
- inv_cosine(x): Inverse Cosine function.
- asymmetric_gaussian(x): Asymmetric Gaussian function.
- inv_quadratic(x): Inverse Quadratic function.
- gaussian_squared(x): Gaussian Squared function.
- symmetric_sigmoid(x): Symmetric Sigmoid function.
- inv_cubic(x): Inverse Cubic function.
- cauchy(x): Cauchy function.
- exponential_quadratic(x): Exponential Quadratic function.
- rational_quadratic(x): Rational Quadratic function.
- cubic_spline(x): Cubic Spline function.
- symmetric_soft_clipping(x): Symmetric Soft Clipping function.
- binary_step(x): Binary Step function.
- imrbf(x): Inverse Multiquadratic Radial Basis Function (IMRBF) function.
- cloglog(x): Complementary Log-Log (cLogLog) function.
- nrelu(x): Noisy Rectified Linear Unit (NReLU) function.
ActivationFunction.ActivationFunction.__init__(self, act_name="", act_param=1.0, act_param2=1.0, knots=None)

Initialize the ActivationFunction.

Parameters:
- act_name (str): Name of the activation function. Defaults to "".
- act_param (float): The parameter used by some activation functions. Defaults to 1.0.
- act_param2 (float): The second parameter used by some activation functions. Defaults to 1.0.
- knots (list): A list of knots used by the cubic spline function. Defaults to [1, 1, 1, 1, 1].
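A minimal construction sketch. The import path is assumed from the qualified class name above, and the accepted act_name strings are assumed to follow the method names listed in this reference; how the selected function is applied afterwards is not shown here.

    # Sketch only; the module path and act_name values are assumptions, not taken from the TfELM source.
    from ActivationFunction import ActivationFunction

    # Default construction: only the activation name is given.
    act = ActivationFunction(act_name="sigmoid")

    # Parameterised construction: act_param is the extra shape/slope parameter used by
    # functions such as leaky_relu, elu or isru; knots feeds cubic_spline.
    spline_act = ActivationFunction(act_name="cubic_spline",
                                    act_param=1.0,
                                    knots=[1, 1, 1, 1, 1])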
ActivationFunction.ActivationFunction.aq(self, x)
Asymmetric Quadratic function.

ActivationFunction.ActivationFunction.aqrelu(self, x)
Asymmetric Quadratic ReLU function.

ActivationFunction.ActivationFunction.arcsinh(x) [static]
ArcSinH function.

ActivationFunction.ActivationFunction.arctan(x) [static]
ArcTan function.

ActivationFunction.ActivationFunction.asymmetric_gaussian(self, x)
Asymmetric Gaussian function.

ActivationFunction.ActivationFunction.bent_identity(x) [static]
Bent Identity function.

ActivationFunction.ActivationFunction.bent_identity_smoothed(self, x)
Bent Identity Smoothed function.

ActivationFunction.ActivationFunction.bezier(x) [static]
Quadratic Bezier function.

ActivationFunction.ActivationFunction.binary_step(self, x)
Binary Step function.
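For orientation, a reference sketch of the conventional binary step. The documentation does not state the threshold or the output levels, so the common 0/1 form with a threshold at zero is assumed; this is not taken from the TfELM source.

    import tensorflow as tf

    def binary_step_reference(x):
        # Assumed form: 1 where x >= 0, else 0.
        return tf.where(x >= 0.0, tf.ones_like(x), tf.zeros_like(x))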
ActivationFunction.ActivationFunction.bipolar_sigmoid(x) [static]
Bipolar Sigmoid function.

ActivationFunction.ActivationFunction.bis(x) [static]
Bent Identity Smoothed function.

ActivationFunction.ActivationFunction.bsigmoid(x) [static]
Bipolar Sigmoid function.

ActivationFunction.ActivationFunction.cauchy(self, x)
Cauchy function.

ActivationFunction.ActivationFunction.cloglog(x) [static]
Complementary Log-Log (cLogLog) function.

ActivationFunction.ActivationFunction.cos_sigmoid(x) [static]
Cosine Sigmoid function.

ActivationFunction.ActivationFunction.cosine(x) [static]
Cosine function.

ActivationFunction.ActivationFunction.cube(x) [static]
Cube function.

ActivationFunction.ActivationFunction.cubic(x) [static]
Cubic function.

ActivationFunction.ActivationFunction.cubic_spline(self, x)
Cubic Spline function.

ActivationFunction.ActivationFunction.egaulu(self, x)
EGAULU function.

ActivationFunction.ActivationFunction.elliott(x) [static]
Elliott function.

ActivationFunction.ActivationFunction.elu(self, x)
Exponential Linear Unit (ELU) function.
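For reference, a sketch of the standard ELU rule. It is not taken from the TfELM implementation, and the mapping of act_param to the alpha scale below is an assumption.

    import tensorflow as tf

    def elu_reference(x, alpha=1.0):
        # Standard ELU (assumed form): x for x > 0, alpha * (exp(x) - 1) otherwise.
        return tf.where(x > 0.0, x, alpha * (tf.exp(x) - 1.0))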
ActivationFunction.ActivationFunction.exp_cosine(x) [static]
Exponential Cosine function.

ActivationFunction.ActivationFunction.exp_sine(x) [static]
Exponential Sine function.

ActivationFunction.ActivationFunction.exponential_quadratic(self, x)
Exponential Quadratic function.

ActivationFunction.ActivationFunction.gauss_tanh(x) [static]
Gaussian Tangent Hyperbolic function.

ActivationFunction.ActivationFunction.gaussian(x) [static]
Gaussian function.
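For reference, a sketch of the common Gaussian activation. The unit-width form exp(-x^2) is assumed; the actual width used by TfELM is not documented here.

    import tensorflow as tf

    def gaussian_reference(x):
        # Assumed Gaussian bump: exp(-x^2).
        return tf.exp(-tf.square(x))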
ActivationFunction.ActivationFunction.gaussian_cdf(x) [static]
Gaussian Cumulative Distribution Function (CDF).

ActivationFunction.ActivationFunction.gaussian_squared(self, x)
Gaussian Squared function.

ActivationFunction.ActivationFunction.gaussian_tangent(x) [static]
Gaussian Tangent function.

ActivationFunction.ActivationFunction.gelu(x) [static]
Gaussian Error Linear Units (GELU) function.
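For reference, a sketch of GELU in its exact Gaussian-CDF form; the TfELM implementation could equally use the common tanh approximation.

    import tensorflow as tf

    def gelu_reference(x):
        # Exact GELU: x * Phi(x), with Phi the standard normal CDF; 0.7071... = 1/sqrt(2).
        return 0.5 * x * (1.0 + tf.math.erf(x * 0.7071067811865476))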
ActivationFunction.ActivationFunction.gelu2(x) [static]
Gaussian Error Linear Units 2 (GELU2) function.

ActivationFunction.ActivationFunction.gompertz(x) [static]
Gompertz function.

ActivationFunction.ActivationFunction.gswish(self, x)
Gaussian Swish function.

ActivationFunction.ActivationFunction.hard_shrink(self, x)
Hard Shrink function.

ActivationFunction.ActivationFunction.hard_sigmoid(x) [static]
Hard Sigmoid function.

ActivationFunction.ActivationFunction.hard_tanh(x) [static]
Hard Tanh function.

ActivationFunction.ActivationFunction.hardshrink(self, x)
Hard Shrink function.
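For reference, a sketch of the usual hard-shrink rule. The threshold lambda is assumed to come from act_param, which this page does not confirm.

    import tensorflow as tf

    def hardshrink_reference(x, lambd=1.0):
        # Usual hard shrink: keep x where |x| > lambda, zero elsewhere.
        return tf.where(tf.abs(x) > lambd, x, tf.zeros_like(x))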
ActivationFunction.ActivationFunction.hardtanh(x) [static]
Hard Tanh function.

ActivationFunction.ActivationFunction.hmish(x) [static]
Hard-Mish function.

ActivationFunction.ActivationFunction.hsigmoid(x) [static]
Hard Sigmoid function.

ActivationFunction.ActivationFunction.hswish(x) [static]
Hard Swish function.

ActivationFunction.ActivationFunction.identity(x) [static]
Identity function.

ActivationFunction.ActivationFunction.imq(self, x)
Inverse Multiquadratic function.

ActivationFunction.ActivationFunction.imrbf(self, x)
Inverse Multiquadratic Radial Basis Function (IMRBF).

ActivationFunction.ActivationFunction.inv_cosine(x) [static]
Inverse Cosine function.

ActivationFunction.ActivationFunction.inv_cubic(self, x)
Inverse Cubic function.

ActivationFunction.ActivationFunction.inv_logit(x) [static]
Inverse Logit function.

ActivationFunction.ActivationFunction.inv_multiquadratic(self, x)
Inverse Multiquadratic function.

ActivationFunction.ActivationFunction.inv_quadratic(self, x)
Inverse Quadratic function.

ActivationFunction.ActivationFunction.inv_sine(x) [static]
Inverse Sine function.

ActivationFunction.ActivationFunction.inverse(x) [static]
Inverse function.

ActivationFunction.ActivationFunction.inverse_cosine(x) [static]
Inverse Cosine function.

ActivationFunction.ActivationFunction.inverse_sine(x) [static]
Inverse Sine function.

ActivationFunction.ActivationFunction.inverse_tangent(x) [static]
Inverse Tangent function.

ActivationFunction.ActivationFunction.inverse_tanh(x) [static]
Inverse Hyperbolic Tangent function.

ActivationFunction.ActivationFunction.invgamma(self, x)
Inverse Gamma function.

ActivationFunction.ActivationFunction.invsqrt(x) [static]
Inverse Square Root function.

ActivationFunction.ActivationFunction.isq(x) [static]
Inverse Square function.

ActivationFunction.ActivationFunction.isrlu(self, x)
Inverse Square Root Linear Unit (ISRLU) function.

ActivationFunction.ActivationFunction.isru(self, x)
Inverse Square Root Unit (ISRU) function.
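For reference, a sketch of the standard ISRU; the alpha parameter is assumed to map to act_param.

    import tensorflow as tf

    def isru_reference(x, alpha=1.0):
        # Standard ISRU: x / sqrt(1 + alpha * x^2).
        return x / tf.sqrt(1.0 + alpha * tf.square(x))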
ActivationFunction.ActivationFunction.leaky_relu(self, x)
Leaky Rectified Linear Unit (ReLU) function.
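For reference, a sketch of the usual leaky ReLU. The negative-side slope is assumed to be supplied by act_param; the default of 0.01 below is only the conventional choice, not TfELM's default.

    import tensorflow as tf

    def leaky_relu_reference(x, slope=0.01):
        # Usual leaky ReLU: x for x > 0, slope * x otherwise.
        return tf.where(x > 0.0, x, slope * x)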
ActivationFunction.ActivationFunction.leaky_softplus(self, x)
Leaky Softplus function.

ActivationFunction.ActivationFunction.log(x) [static]
Logarithmic function.

ActivationFunction.ActivationFunction.log_exp(x) [static]
Logarithm of Exponential function.

ActivationFunction.ActivationFunction.log_sigmoid(x) [static]
Logarithmic Sigmoid function.

ActivationFunction.ActivationFunction.logarithm(x) [static]
Logarithm function.

ActivationFunction.ActivationFunction.logit(x) [static]
Logit function.

ActivationFunction.ActivationFunction.logsigmoid(x) [static]
Logarithmic Sigmoid function.

ActivationFunction.ActivationFunction.mish(x) [static]
Mish function.
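For reference, a sketch of the standard Mish definition (not taken from the TfELM source).

    import tensorflow as tf

    def mish_reference(x):
        # Standard Mish: x * tanh(softplus(x)).
        return x * tf.tanh(tf.math.softplus(x))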
ActivationFunction.ActivationFunction.nrelu(self, x)
Noisy Rectified Linear Unit (NReLU) function.

ActivationFunction.ActivationFunction.pos_softplus(x) [static]
Positive Softplus function.

ActivationFunction.ActivationFunction.power(x, a=1.0) [static]
Power function.

ActivationFunction.ActivationFunction.prelu(self, x)
Parametric Rectified Linear Unit (PReLU) function.

ActivationFunction.ActivationFunction.rational_quadratic(self, x)
Rational Quadratic function.

ActivationFunction.ActivationFunction.relu(x) [static]
Rectified Linear Unit (ReLU) function.

ActivationFunction.ActivationFunction.relu_cos(x) [static]
Rectified Cosine function.

ActivationFunction.ActivationFunction.selu(self, x)
Scaled Exponential Linear Unit (SELU) function.
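For reference, a sketch of SELU with its standard self-normalizing constants; whether TfELM uses these constants or substitutes act_param / act_param2 is an assumption left open here.

    import tensorflow as tf

    def selu_reference(x, scale=1.0507009873554805, alpha=1.6732632423543772):
        # Standard SELU: scale * (x for x > 0, alpha * (exp(x) - 1) otherwise).
        return scale * tf.where(x > 0.0, x, alpha * (tf.exp(x) - 1.0))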
ActivationFunction.ActivationFunction.sigmoid(x) [static]
Sigmoid function.
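For reference, the logistic sigmoid (a sketch; equivalent to tf.nn.sigmoid).

    import tensorflow as tf

    def sigmoid_reference(x):
        # Logistic sigmoid: 1 / (1 + exp(-x)).
        return 1.0 / (1.0 + tf.exp(-x))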
ActivationFunction.ActivationFunction.silu(x) [static]
SiLU (Swish) function.
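For reference, a sketch of SiLU (Swish with unit beta).

    import tensorflow as tf

    def silu_reference(x):
        # SiLU / Swish-1: x * sigmoid(x).
        return x * tf.nn.sigmoid(x)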
ActivationFunction.ActivationFunction.sin_transfer(x) [static]
Sinusoidal Transfer function.

ActivationFunction.ActivationFunction.sine(x) [static]
Sine function.

ActivationFunction.ActivationFunction.sinusoid(x) [static]
Sinusoid function.

ActivationFunction.ActivationFunction.sinusoid2(x) [static]
Sinusoid 2 function.

ActivationFunction.ActivationFunction.sinusoidal(x) [static]
Sinusoidal function.

ActivationFunction.ActivationFunction.smooth_hard_tanh(self, x)
Smooth Hard Tanh function.

ActivationFunction.ActivationFunction.smooth_sigmoid(self, x)
Smooth Sigmoid function.

ActivationFunction.ActivationFunction.soft_clipping(self, x)
Soft Clipping function.

ActivationFunction.ActivationFunction.soft_exponential(self, x)
Soft Exponential function.

ActivationFunction.ActivationFunction.softclip(self, x)
Soft Clip function.

ActivationFunction.ActivationFunction.softexp(self, x)
Soft Exponential function.

ActivationFunction.ActivationFunction.softmax(x) [static]
Softmax function.
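For reference, a numerically stable softmax over the last axis (a sketch equivalent to tf.nn.softmax, not the TfELM code itself).

    import tensorflow as tf

    def softmax_reference(x):
        # Subtract the row-wise max for numerical stability, then normalise.
        z = x - tf.reduce_max(x, axis=-1, keepdims=True)
        e = tf.exp(z)
        return e / tf.reduce_sum(e, axis=-1, keepdims=True)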
ActivationFunction.ActivationFunction.softplus(x) [static]
Softplus function.
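For reference, the softplus definition (a sketch; tf.math.softplus computes the same thing with better numerical handling).

    import tensorflow as tf

    def softplus_reference(x):
        # Softplus: log(1 + exp(x)), a smooth approximation of ReLU.
        return tf.math.log(1.0 + tf.exp(x))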
ActivationFunction.ActivationFunction.softshrink(self, x)
Soft Shrink function.

ActivationFunction.ActivationFunction.sqrelu(self, x)
Squared Rectified Linear Unit (SQReLU) function.

ActivationFunction.ActivationFunction.square(x) [static]
Squared function.

ActivationFunction.ActivationFunction.srelu(self, x)
Smooth Rectified Linear Unit (SReLU) function.

ActivationFunction.ActivationFunction.ssigmoid(self, x)
Symmetric Sigmoid function.

ActivationFunction.ActivationFunction.swish(self, x)
Swish function.
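For reference, a sketch of the parameterised Swish; beta is assumed to correspond to act_param (with beta = 1 this reduces to SiLU).

    import tensorflow as tf

    def swish_reference(x, beta=1.0):
        # Swish: x * sigmoid(beta * x).
        return x * tf.nn.sigmoid(beta * x)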
ActivationFunction.ActivationFunction.swish_gaussian(x) [static]
Swish-Gaussian function.

ActivationFunction.ActivationFunction.sym_sigmoid(x) [static]
Symmetric Sigmoid function.

ActivationFunction.ActivationFunction.symmetric_sigmoid(self, x)
Symmetric Sigmoid function.

ActivationFunction.ActivationFunction.symmetric_soft_clipping(self, x)
Symmetric Soft Clipping function.

ActivationFunction.ActivationFunction.tanh(x) [static]
Hyperbolic Tangent function.

ActivationFunction.ActivationFunction.tlu(self, x)
Truncated Linear Unit (TLU) function.

ActivationFunction.ActivationFunction.triangular(x) [static]
Triangular function.

ActivationFunction.ActivationFunction.tsigmoid(x) [static]
Tangent Sigmoid function.