Logistic Function/Sigmoid Function
- Python for Integrated Circuits -
- An Online Book -
Python for Integrated Circuits                                                                                   http://www.globalsino.com/ICs/        


=================================================================================

A sigmoid function is a type of activation function commonly used in artificial neural networks. The sigmoid function is a mathematical function that maps any real-valued number to a value between 0 and 1. The logistic function is a specific form of the sigmoid function. The logistic function, denoted as σ(x), is defined as:

         σ(x) = 1/(1 + e^(-x)) --------------------------------------------- [3871a]

where,

          x can be any real number.

The logistic function maps any real number x to a value between 0 and 1, making it particularly useful in binary classification problems where you want to model probabilities or make decisions based on a threshold. Figure 3871a plots the sigmoid function given in Equation 3871a. It can be seen that the sigmoid function is an S-shaped curve.


Figure 3871a. Sigmoid function ( Code).

Figure 3871a shows that when σ(x) ≥ 0.5, it implies that x is greater than or equal to 0.
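As a minimal sketch (standard library only; the function name sigmoid is illustrative), the logistic function and the 0.5 decision threshold can be written as:

```python
import math

def sigmoid(x):
    """Logistic function: maps any real x to a value in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# sigmoid(0) = 0.5, so thresholding the output at 0.5 is
# equivalent to checking the sign of x.
print(sigmoid(0.0))         # 0.5
print(sigmoid(4.0) > 0.5)   # True  (x >= 0)
print(sigmoid(-4.0) > 0.5)  # False (x < 0)
```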

The term "sigmoid" is a more general term that refers to any S-shaped curve. The logistic function is an example of a sigmoid function because it has an S-shaped curve when plotted. Other sigmoid functions include the hyperbolic tangent function (tanh) and the arctangent function (arctan), among others (see Figure 3871b).


Figure 3871b. Sigmoid functions ( Code).
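A small comparison sketch (standard library only; the helper name logistic is illustrative) shows that tanh and arctan are also S-shaped but map onto different ranges:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# All three are monotonic S-shaped curves, but their ranges differ:
# logistic -> (0, 1), tanh -> (-1, 1), arctan -> (-pi/2, pi/2).
for x in (-2.0, 0.0, 2.0):
    print(f"x={x:+.1f}  logistic={logistic(x):.3f}  "
          f"tanh={math.tanh(x):+.3f}  arctan={math.atan(x):+.3f}")
```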

The sigmoid function is often used in the output layer of a binary classification model to squash the output into the range [0, 1], representing the probability of belonging to one of the two classes. It's also used in the hidden layers of a neural network when you want to introduce non-linearity to the model.

The sigmoid function has some advantages, such as smooth gradients and clear interpretation as probabilities. However, it also has some drawbacks, such as the vanishing gradient problem, which can make training deep networks challenging.
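The vanishing-gradient drawback can be seen numerically. This sketch (central-difference gradient; function names are illustrative) shows the slope collapsing for large |x|:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def num_grad(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

print(num_grad(sigmoid, 0.0))   # ~0.25, the maximum slope
print(num_grad(sigmoid, 10.0))  # ~4.5e-05 -- almost no gradient to learn from
```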

From Equation 3871a, by using the chain rule and the quotient rule, we have,

         dσ(x)/dx = σ(x)(1 − σ(x)) ---------------------------------- [3871b]

In mathematical notation, if we have,

         f(x) = g(x)/h(x) ---------------------------------- [3871c]

Then, we will have,

         f'(x) = [g'(x)·h(x) − g(x)·h'(x)] / h(x)² ---------------------------------- [3871d]

This is the quotient rule.
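A quick numerical check (an illustrative sketch, standard library only) confirms that the closed form dσ/dx = σ(x)(1 − σ(x)) from Equation 3871b matches a finite-difference derivative of Equation 3871a:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def analytic_grad(x):
    """Closed-form derivative: sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

def numeric_grad(x, h=1e-6):
    """Central-difference approximation of the derivative."""
    return (sigmoid(x + h) - sigmoid(x - h)) / (2.0 * h)

# The two agree to high precision across the curve.
for x in (-3.0, 0.0, 3.0):
    assert abs(analytic_grad(x) - numeric_grad(x)) < 1e-8
print("derivative identity verified")
```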

=================================================================================