Relu

"""
This script demonstrates the implementation of the ReLU function.

It is an activation function defined as the positive part of its argument, used in
the context of neural networks.
The function takes a vector of K real numbers as input and computes max(0, x) element-wise.
After passing through ReLU, each element of the vector is either 0 or a positive real number.

Script inspired by its corresponding Wikipedia article
https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
"""
from __future__ import annotations

import numpy as np


def relu(vector: list[float]) -> np.ndarray:
    """
    Implements the ReLU function.

    Parameters:
        vector (np.array, list, tuple): A numpy array, list or tuple
        of real values

    Returns:
        relu_vec (np.array): The input, as a numpy array, after applying
        ReLU element-wise.

    >>> vec = np.array([-1, 0, 5])
    >>> relu(vec)
    array([0, 0, 5])
    """

    # Compare each element against 0 and return the element-wise maxima.
    return np.maximum(0, vector)


if __name__ == "__main__":
    print(np.array(relu([-1, 0, 5])))  # --> [0, 0, 5]
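
    # A minimal pure-Python sketch of the same element-wise rule, shown only
    # for comparison with np.maximum (this snippet is illustrative and not
    # part of the original script): each component x is replaced by max(0, x).
    example = [-1.0, 0.0, 5.0]
    print([max(0.0, x) for x in example])  # --> [0.0, 0.0, 5.0]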