.. index:: pair: page; Softmax
.. _doxid-dev_guide_softmax:

Softmax
=======

:target:`doxid-dev_guide_softmax_1md_doc_primitives_softmax` :ref:`API Reference `

.. _doxid-dev_guide_softmax_1autotoc_md354:

General
~~~~~~~

The softmax primitive performs a forward or backward softmax or logsoftmax operation along a particular axis on data with arbitrary dimensions. All other axes are treated as independent (batch).

.. _doxid-dev_guide_softmax_1autotoc_md355:

Forward
-------

In general form, the operation is defined by the following formulas (the variable names follow the standard :ref:`Naming Conventions `).

Softmax:

.. math::

    \dst(\overline{ou}, c, \overline{in}) =
        \frac
        {e^{\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})}}
        {\sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})}}

Logsoftmax:

.. math::

    \dst(\overline{ou}, c, \overline{in}) =
        \ln\left({\frac
        {e^{\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})}}
        {\sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})}}}\right) =
        \left(\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})\right)
        - \ln\left(\sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})}\right)

Above

* :math:`c` is the axis over which the operation is computed,

* :math:`\overline{ou}` is the outermost index (to the left of the axis),

* :math:`\overline{in}` is the innermost index (to the right of the axis), and

* :math:`\nu` is used to produce numerically stable results and is defined as:

.. math::

    \nu(\overline{ou}, \overline{in}) = \max\limits_{ic} \src(\overline{ou}, ic, \overline{in})

.. _doxid-dev_guide_softmax_1autotoc_md356:

Difference Between Forward Training and Forward Inference
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++

There is no difference between the :ref:`dnnl_forward_training ` and :ref:`dnnl_forward_inference ` propagation kinds.

.. _doxid-dev_guide_softmax_1autotoc_md357:

Backward
--------

The backward propagation computes :math:`\diffsrc(ou, c, in)` based on :math:`\diffdst(ou, c, in)` and :math:`\dst(ou, c, in)`.

.. _doxid-dev_guide_softmax_1autotoc_md358:

Execution Arguments
~~~~~~~~~~~~~~~~~~~

When executed, the inputs and outputs should be mapped to an execution argument index as specified by the following table.

======================= =========================
Primitive input/output  Execution argument index
======================= =========================
:math:`\src`            DNNL_ARG_SRC
:math:`\dst`            DNNL_ARG_DST
:math:`\diffsrc`        DNNL_ARG_DIFF_SRC
:math:`\diffdst`        DNNL_ARG_DIFF_DST
======================= =========================
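For illustration, below is a minimal sketch of creating and executing a forward softmax with the argument indices from the table above. It is not taken verbatim from the library documentation: it assumes the v2.x-style C++ API from ``dnnl.hpp`` (``softmax_forward::desc`` taking a propagation kind, a data descriptor, and the softmax axis), and the engine kind, tensor shape, and variable names are illustrative choices. Newer oneDNN releases construct the primitive descriptor differently.

.. code-block:: cpp

    #include <dnnl.hpp>

    int main() {
        using namespace dnnl;

        engine eng(engine::kind::cpu, 0);
        stream strm(eng);

        // 2D tensor A x B with softmax along axis 1 (B). The dnnl_ab layout
        // keeps the softmax axis physically dense, which is the optimized
        // case (see Performance Tips below).
        memory::desc data_md({8, 1000}, memory::data_type::f32,
                memory::format_tag::ab);

        // v2.x-style operation descriptor: propagation kind, data
        // descriptor, and the softmax axis.
        auto pd = softmax_forward::primitive_desc(
                softmax_forward::desc(prop_kind::forward_inference, data_md,
                        /*axis=*/1),
                eng);

        memory src_mem(data_md, eng);
        memory dst_mem(data_md, eng);
        // ... fill src_mem with source values ...

        // Map inputs and outputs to the execution argument indices from the
        // table above. Passing src_mem as DNNL_ARG_DST as well would execute
        // the primitive in place (see General Notes below).
        softmax_forward(pd).execute(strm,
                {{DNNL_ARG_SRC, src_mem}, {DNNL_ARG_DST, dst_mem}});
        strm.wait();

        return 0;
    }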
.. _doxid-dev_guide_softmax_1autotoc_md359:

Implementation Details
~~~~~~~~~~~~~~~~~~~~~~

.. _doxid-dev_guide_softmax_1autotoc_md360:

General Notes
-------------

#. Both forward and backward propagation support in-place operations, meaning that ``src`` can be used as input and output for forward propagation, and ``diff_dst`` can be used as input and output for backward propagation. In case of an in-place operation, the original data is overwritten. This support is limited to cases when the data types of ``src`` / ``dst`` or ``diff_src`` / ``diff_dst`` are identical.

.. _doxid-dev_guide_softmax_1autotoc_md361:

Post-ops and Attributes
-----------------------

Attributes enable you to modify the behavior of the softmax primitive. The following attributes are supported by the softmax primitive:

=========== ========= ==================== ==================================================== =================================
Propagation Type      Operation            Description                                          Restrictions
=========== ========= ==================== ==================================================== =================================
forward     attribute :ref:`Output scale ` Scales the result of softmax by a given scale factor int8 softmax only, zero mask only
=========== ========= ==================== ==================================================== =================================

.. _doxid-dev_guide_softmax_1autotoc_md362:

Data Type Support
-----------------

The softmax primitive supports the following combinations of data types:

=================== ================== ============
Propagation         Source             Destination
=================== ================== ============
forward / backward  f32, bf16          f32, bf16
forward             f16                f16
forward             f32, bf16, u8, s8  u8, s8
forward             u8, s8             f32, bf16
=================== ================== ============

.. _doxid-dev_guide_softmax_1autotoc_md363:

Data Representation
-------------------

.. _doxid-dev_guide_softmax_1autotoc_md364:

Source, Destination, and Their Gradients
++++++++++++++++++++++++++++++++++++++++

The softmax primitive works with arbitrary data tensors. There is no special meaning associated with any logical dimensions. However, the softmax axis is typically referred to as channels (hence the :math:`c` in the formulas above).

.. _doxid-dev_guide_softmax_1autotoc_md365:

Implementation Limitations
~~~~~~~~~~~~~~~~~~~~~~~~~~

#. Refer to :ref:`Data Types ` for limitations related to data type support.

#. GPU

   * Only tensors of 6 or fewer dimensions are supported.

.. _doxid-dev_guide_softmax_1autotoc_md366:

Performance Tips
~~~~~~~~~~~~~~~~

#. Use in-place operations whenever possible.

#. Currently the softmax primitive is optimized for the cases where the dimension of the softmax axis is physically dense. For instance:

   * Optimized: 2D case, tensor :math:`A \times B`, softmax axis 1 (B), format tag :ref:`dnnl_ab `

   * Optimized: 4D case, tensor :math:`A \times B \times C \times D`, softmax axis 3 (D), format tag :ref:`dnnl_abcd `

   * Optimized: 4D case, tensor :math:`A \times B \times C \times D`, softmax axis 1 (B), format tag :ref:`dnnl_abcd `, and :math:`C = D = 1`

   * Optimized: 4D case, tensor :math:`A \times B \times C \times D`, softmax axis 1 (B), format tag :ref:`dnnl_acdb ` or :ref:`dnnl_aBcd16b `, and :math:`C \cdot D \ne 1`

   * Non-optimized: 2D case, tensor :math:`A \times B`, softmax axis 0 (A), format tag :ref:`dnnl_ab `, and :math:`B \ne 1`

   * Non-optimized: 2D case, tensor :math:`A \times B`, softmax axis 1 (B), format tag :ref:`dnnl_ba `, and :math:`A \ne 1`

   * Non-optimized: 4D case, tensor :math:`A \times B \times C \times D`, softmax axis 2 (C), format tag :ref:`dnnl_acdb `, and :math:`D \cdot B \ne 1`

.. _doxid-dev_guide_softmax_1autotoc_md367:

Example
~~~~~~~

:ref:`Softmax Primitive Example `

Key optimizations included in this example:

* In-place primitive execution;

* Softmax along axis 1 (C) for 2D tensors.
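As a complement to the linked example, the following fragment sketches the output scale attribute described under Post-ops and Attributes above. It is a hypothetical helper, assuming the v2.x-era ``dnnl::primitive_attr::set_output_scales`` API; mask ``0`` corresponds to the "zero mask only" restriction in the attributes table, i.e., a single scale applied to the whole destination tensor.

.. code-block:: cpp

    #include <dnnl.hpp>

    // Hypothetical helper: build a primitive attribute carrying an output
    // scale for an int8 softmax. Mask 0 means one common scale for the
    // whole tensor, the only mask the softmax primitive accepts.
    dnnl::primitive_attr make_int8_softmax_attr(float scale) {
        dnnl::primitive_attr attr;
        attr.set_output_scales(/*mask=*/0, {scale});
        return attr;
    }

The resulting attribute would then be passed to the ``primitive_desc`` constructor alongside the operation descriptor.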