tf.keras.layers.Hashing
- Python for Integrated Circuits -
- An Online Book -
http://www.globalsino.com/ICs/



=================================================================================

tf.keras.layers.Hashing is a non-trainable preprocessing layer that hashes and bins categorical features. It implements categorical feature hashing, also known as the "hashing trick": inputs are mapped to a fixed number of bins by a hash function, so no vocabulary needs to be built or stored.
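The idea behind the hashing trick can be sketched in plain Python. This is only a conceptual illustration, not the Keras implementation: tf.keras.layers.Hashing uses FarmHash64 by default (or SipHash64 when a salt is supplied), while MD5 is used here merely as a stand-in deterministic hash; the function name hash_bin is made up for this sketch.

```python
import hashlib

def hash_bin(value, num_bins, salt=None):
    """Map a categorical value to one of num_bins buckets via a stable hash.

    Illustrative stand-in for the hashing trick; MD5 is used only because
    it is deterministic across runs, unlike Python's built-in hash().
    """
    data = (str(salt) + str(value)) if salt is not None else str(value)
    digest = hashlib.md5(data.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_bins

# The same input always lands in the same bin, so no vocabulary is needed,
# at the cost of possible collisions between distinct values.
bins = [hash_bin(token, num_bins=3) for token in ["A", "B", "A", "D"]]
```

Because binning is purely a function of the hash, the layer has no state to train, which is why Hashing is non-trainable.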

The categorical feature preprocessing layers include:
        i) CategoryEncoding layer.
                tf.keras.layers.CategoryEncoding(num_tokens=None, output_mode='multi_hot', sparse=False, **kwargs)
        ii) Hashing layer.
                tf.keras.layers.Hashing(num_bins, mask_value=None, salt=None, output_mode="int", sparse=False, **kwargs)
        iii) StringLookup layer.
                tf.keras.layers.StringLookup(
                max_tokens=None,
                num_oov_indices=1,
                mask_token=None,
                oov_token="[UNK]",
                vocabulary=None,
                idf_weights=None,
                encoding=None,
                invert=False,
                output_mode="int",
                sparse=False,
                pad_to_max_tokens=False,
                **kwargs
                )
        iv) IntegerLookup layer.
                tf.keras.layers.IntegerLookup(
                max_tokens=None,
                num_oov_indices=1,
                mask_token=None,
                oov_token=-1,
                vocabulary=None,
                vocabulary_dtype="int64",
                idf_weights=None,
                invert=False,
                output_mode="int",
                sparse=False,
                pad_to_max_tokens=False,
                **kwargs
                )
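To contrast the lookup layers above with Hashing: StringLookup and IntegerLookup translate tokens through an explicit vocabulary, reserving the first num_oov_indices indices for out-of-vocabulary tokens. A minimal pure-Python analogue of StringLookup with output_mode="int" (a sketch, not the Keras code; string_lookup is a hypothetical helper name) might look like:

```python
def string_lookup(values, vocabulary, num_oov_indices=1, oov_index=0):
    """Minimal analogue of StringLookup with output_mode="int".

    Indices 0..num_oov_indices-1 are reserved for out-of-vocabulary tokens;
    known tokens start at num_oov_indices, mirroring the Keras convention.
    """
    index = {tok: i + num_oov_indices for i, tok in enumerate(vocabulary)}
    return [index.get(v, oov_index) for v in values]

ids = string_lookup(["a", "c", "b", "z"], vocabulary=["a", "b", "c"])
# "z" is not in the vocabulary, so it maps to the reserved OOV index 0.
```

The trade-off is clear from the sketch: lookup layers give collision-free indices but require building and storing a vocabulary, whereas Hashing needs no vocabulary but may collide distinct values into the same bin.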

============================================
