Available on AArch64 or target_arch = "arm64ec" only.
Platform-specific intrinsics for the aarch64 platform.
See the module documentation for more details.
Structs§
- float32x2_t: ARM-specific 64-bit wide vector of two packed f32.
- float32x2x2_t: ARM-specific type containing two float32x2_t vectors.
- float32x2x3_t: ARM-specific type containing three float32x2_t vectors.
- float32x2x4_t: ARM-specific type containing four float32x2_t vectors.
- float32x4_t: ARM-specific 128-bit wide vector of four packed f32.
- float32x4x2_t: ARM-specific type containing two float32x4_t vectors.
- float32x4x3_t: ARM-specific type containing three float32x4_t vectors.
- float32x4x4_t: ARM-specific type containing four float32x4_t vectors.
- float64x1_t: ARM-specific 64-bit wide vector of one packed f64.
- float64x1x2_t: ARM-specific type containing two float64x1_t vectors.
- float64x1x3_t: ARM-specific type containing three float64x1_t vectors.
- float64x1x4_t: ARM-specific type containing four float64x1_t vectors.
- float64x2_t: ARM-specific 128-bit wide vector of two packed f64.
- float64x2x2_t: ARM-specific type containing two float64x2_t vectors.
- float64x2x3_t: ARM-specific type containing three float64x2_t vectors.
- float64x2x4_t: ARM-specific type containing four float64x2_t vectors.
- int8x8_t: ARM-specific 64-bit wide vector of eight packed i8.
- int8x8x2_t: ARM-specific type containing two int8x8_t vectors.
- int8x8x3_t: ARM-specific type containing three int8x8_t vectors.
- int8x8x4_t: ARM-specific type containing four int8x8_t vectors.
- int8x16_t: ARM-specific 128-bit wide vector of sixteen packed i8.
- int8x16x2_t: ARM-specific type containing two int8x16_t vectors.
- int8x16x3_t: ARM-specific type containing three int8x16_t vectors.
- int8x16x4_t: ARM-specific type containing four int8x16_t vectors.
- int16x4_t: ARM-specific 64-bit wide vector of four packed i16.
- int16x4x2_t: ARM-specific type containing two int16x4_t vectors.
- int16x4x3_t: ARM-specific type containing three int16x4_t vectors.
- int16x4x4_t: ARM-specific type containing four int16x4_t vectors.
- int16x8_t: ARM-specific 128-bit wide vector of eight packed i16.
- int16x8x2_t: ARM-specific type containing two int16x8_t vectors.
- int16x8x3_t: ARM-specific type containing three int16x8_t vectors.
- int16x8x4_t: ARM-specific type containing four int16x8_t vectors.
- int32x2_t: ARM-specific 64-bit wide vector of two packed i32.
- int32x2x2_t: ARM-specific type containing two int32x2_t vectors.
- int32x2x3_t: ARM-specific type containing three int32x2_t vectors.
- int32x2x4_t: ARM-specific type containing four int32x2_t vectors.
- int32x4_t: ARM-specific 128-bit wide vector of four packed i32.
- int32x4x2_t: ARM-specific type containing two int32x4_t vectors.
- int32x4x3_t: ARM-specific type containing three int32x4_t vectors.
- int32x4x4_t: ARM-specific type containing four int32x4_t vectors.
- int64x1_t: ARM-specific 64-bit wide vector of one packed i64.
- int64x1x2_t: ARM-specific type containing two int64x1_t vectors.
- int64x1x3_t: ARM-specific type containing three int64x1_t vectors.
- int64x1x4_t: ARM-specific type containing four int64x1_t vectors.
- int64x2_t: ARM-specific 128-bit wide vector of two packed i64.
- int64x2x2_t: ARM-specific type containing two int64x2_t vectors.
- int64x2x3_t: ARM-specific type containing three int64x2_t vectors.
- int64x2x4_t: ARM-specific type containing four int64x2_t vectors.
- poly8x8_t: ARM-specific 64-bit wide polynomial vector of eight packed p8.
- poly8x8x2_t: ARM-specific type containing two poly8x8_t vectors.
- poly8x8x3_t: ARM-specific type containing three poly8x8_t vectors.
- poly8x8x4_t: ARM-specific type containing four poly8x8_t vectors.
- poly8x16_t: ARM-specific 128-bit wide vector of sixteen packed p8.
- poly8x16x2_t: ARM-specific type containing two poly8x16_t vectors.
- poly8x16x3_t: ARM-specific type containing three poly8x16_t vectors.
- poly8x16x4_t: ARM-specific type containing four poly8x16_t vectors.
- poly16x4_t: ARM-specific 64-bit wide vector of four packed p16.
- poly16x4x2_t: ARM-specific type containing two poly16x4_t vectors.
- poly16x4x3_t: ARM-specific type containing three poly16x4_t vectors.
- poly16x4x4_t: ARM-specific type containing four poly16x4_t vectors.
- poly16x8_t: ARM-specific 128-bit wide vector of eight packed p16.
- poly16x8x2_t: ARM-specific type containing two poly16x8_t vectors.
- poly16x8x3_t: ARM-specific type containing three poly16x8_t vectors.
- poly16x8x4_t: ARM-specific type containing four poly16x8_t vectors.
- poly64x1_t: ARM-specific 64-bit wide vector of one packed p64.
- poly64x1x2_t: ARM-specific type containing two poly64x1_t vectors.
- poly64x1x3_t: ARM-specific type containing three poly64x1_t vectors.
- poly64x1x4_t: ARM-specific type containing four poly64x1_t vectors.
- poly64x2_t: ARM-specific 128-bit wide vector of two packed p64.
- poly64x2x2_t: ARM-specific type containing two poly64x2_t vectors.
- poly64x2x3_t: ARM-specific type containing three poly64x2_t vectors.
- poly64x2x4_t: ARM-specific type containing four poly64x2_t vectors.
- uint8x8_t: ARM-specific 64-bit wide vector of eight packed u8.
- uint8x8x2_t: ARM-specific type containing two uint8x8_t vectors.
- uint8x8x3_t: ARM-specific type containing three uint8x8_t vectors.
- uint8x8x4_t: ARM-specific type containing four uint8x8_t vectors.
- uint8x16_t: ARM-specific 128-bit wide vector of sixteen packed u8.
- uint8x16x2_t: ARM-specific type containing two uint8x16_t vectors.
- uint8x16x3_t: ARM-specific type containing three uint8x16_t vectors.
- uint8x16x4_t: ARM-specific type containing four uint8x16_t vectors.
- uint16x4_t: ARM-specific 64-bit wide vector of four packed u16.
- uint16x4x2_t: ARM-specific type containing two uint16x4_t vectors.
- uint16x4x3_t: ARM-specific type containing three uint16x4_t vectors.
- uint16x4x4_t: ARM-specific type containing four uint16x4_t vectors.
- uint16x8_t: ARM-specific 128-bit wide vector of eight packed u16.
- uint16x8x2_t: ARM-specific type containing two uint16x8_t vectors.
- uint16x8x3_t: ARM-specific type containing three uint16x8_t vectors.
- uint16x8x4_t: ARM-specific type containing four uint16x8_t vectors.
- uint32x2_t: ARM-specific 64-bit wide vector of two packed u32.
- uint32x2x2_t: ARM-specific type containing two uint32x2_t vectors.
- uint32x2x3_t: ARM-specific type containing three uint32x2_t vectors.
- uint32x2x4_t: ARM-specific type containing four uint32x2_t vectors.
- uint32x4_t: ARM-specific 128-bit wide vector of four packed u32.
- uint32x4x2_t: ARM-specific type containing two uint32x4_t vectors.
- uint32x4x3_t: ARM-specific type containing three uint32x4_t vectors.
- uint32x4x4_t: ARM-specific type containing four uint32x4_t vectors.
- uint64x1_t: ARM-specific 64-bit wide vector of one packed u64.
- uint64x1x2_t: ARM-specific type containing two uint64x1_t vectors.
- uint64x1x3_t: ARM-specific type containing three uint64x1_t vectors.
- uint64x1x4_t: ARM-specific type containing four uint64x1_t vectors.
- uint64x2_t: ARM-specific 128-bit wide vector of two packed u64.
- uint64x2x2_t: ARM-specific type containing two uint64x2_t vectors.
- uint64x2x3_t: ARM-specific type containing three uint64x2_t vectors.
- uint64x2x4_t: ARM-specific type containing four uint64x2_t vectors.
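Each of these vector types carries the same bits as a fixed-size array of its element type, and the xN container variants simply bundle N such vectors (as returned by multi-vector loads like vld2). A portable sketch using plain Rust arrays as stand-ins, not the actual NEON types:

```rust
fn main() {
    // [f32; 4] stands in for float32x4_t: four packed f32, 128 bits total.
    let v: [f32; 4] = [1.0, 2.0, 3.0, 4.0];
    // [[f32; 4]; 2] stands in for float32x4x2_t: two such vectors bundled.
    let pair: [[f32; 4]; 2] = [v, v];
    assert_eq!(core::mem::size_of_val(&v), 16); // 128 bits
    assert_eq!(core::mem::size_of_val(&pair), 32);
    println!("layout ok");
}
```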
- SY (Experimental): Full system is the required shareability domain; reads and writes are the required access types.
Constants§
- _PREFETCH_LOCALITY0 (Experimental): See prefetch.
- _PREFETCH_LOCALITY1 (Experimental): See prefetch.
- _PREFETCH_LOCALITY2 (Experimental): See prefetch.
- _PREFETCH_LOCALITY3 (Experimental): See prefetch.
- _PREFETCH_READ (Experimental): See prefetch.
- _PREFETCH_WRITE (Experimental): See prefetch.
- _TMFAILURE_CNCL (Experimental): Transaction executed a TCANCEL instruction.
- _TMFAILURE_DBG (Experimental): Transaction aborted due to a debug trap.
- _TMFAILURE_ERR (Experimental): Transaction aborted because a non-permissible operation was attempted.
- _TMFAILURE_IMP (Experimental): Fallback error type for any other reason.
- _TMFAILURE_INT (Experimental): Transaction failed because of an interrupt.
- _TMFAILURE_MEM (Experimental): Transaction aborted because a conflict occurred.
- _TMFAILURE_NEST (Experimental): Transaction aborted because the transactional nesting level was exceeded.
- _TMFAILURE_REASON (Experimental): Extraction mask for the failure reason.
- _TMFAILURE_RTRY (Experimental): Transaction retry is possible.
- _TMFAILURE_SIZE (Experimental): Transaction aborted because the read or write set limit was exceeded.
- _TMFAILURE_TRIVIAL (Experimental): Indicates a TRIVIAL version of TM is available.
- _TMSTART_SUCCESS (Experimental): Transaction successfully started.
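The _TMFAILURE_* constants are bit flags tested against the status word a transaction start returns. A hypothetical sketch of the decoding pattern, with placeholder constant values that are assumptions, not the real _TMFAILURE_* bit positions:

```rust
// Placeholder values for illustration only; the real constants live in
// core::arch::aarch64 and their bit positions may differ.
const TMSTART_SUCCESS: u64 = 0;
const TMFAILURE_RTRY: u64 = 1 << 15; // assumed bit position
const TMFAILURE_REASON: u64 = 0x7fff; // assumed extraction mask

fn classify(status: u64) -> &'static str {
    if status == TMSTART_SUCCESS {
        "started"
    } else if status & TMFAILURE_RTRY != 0 {
        "aborted, retry possible"
    } else {
        "aborted, no retry"
    }
}

fn main() {
    assert_eq!(classify(0), "started");
    assert_eq!(classify(1 << 15), "aborted, retry possible");
    // The REASON mask extracts the failure-reason bits from the status.
    println!("reason bits: {:#x}", ((1u64 << 15) | 0x2a) & TMFAILURE_REASON);
}
```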
Functions§
- vaba_s8 ⚠ neon
- vaba_s16 ⚠ neon
- vaba_s32 ⚠ neon
- vaba_u8 ⚠ neon
- vaba_u16 ⚠ neon
- vaba_u32 ⚠ neon
- vabal_high_s8 ⚠ neon: Signed Absolute difference and Accumulate Long
- vabal_high_s16 ⚠ neon: Signed Absolute difference and Accumulate Long
- vabal_high_s32 ⚠ neon: Signed Absolute difference and Accumulate Long
- vabal_high_u8 ⚠ neon: Unsigned Absolute difference and Accumulate Long
- vabal_high_u16 ⚠ neon: Unsigned Absolute difference and Accumulate Long
- vabal_high_u32 ⚠ neon: Unsigned Absolute difference and Accumulate Long
- vabal_s8 ⚠ neon: Signed Absolute difference and Accumulate Long
- vabal_s16 ⚠ neon: Signed Absolute difference and Accumulate Long
- vabal_s32 ⚠ neon: Signed Absolute difference and Accumulate Long
- vabal_u8 ⚠ neon: Unsigned Absolute difference and Accumulate Long
- vabal_u16 ⚠ neon: Unsigned Absolute difference and Accumulate Long
- vabal_u32 ⚠ neon: Unsigned Absolute difference and Accumulate Long
- vabaq_s8 ⚠ neon
- vabaq_s16 ⚠ neon
- vabaq_s32 ⚠ neon
- vabaq_u8 ⚠ neon
- vabaq_u16 ⚠ neon
- vabaq_u32 ⚠ neon
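The vaba_*/vabal_* family accumulates absolute differences lane by lane. A portable scalar sketch (not the intrinsic itself) of what vaba_s8 computes per lane, r[i] = a[i] + |b[i] - c[i]| with wrapping arithmetic:

```rust
// Per-lane sketch of vaba_s8: accumulate the absolute difference of b and c
// into a, wrapping modulo 2^8 as the hardware does.
fn vaba_s8_sketch(a: [i8; 8], b: [i8; 8], c: [i8; 8]) -> [i8; 8] {
    let mut r = [0i8; 8];
    for i in 0..8 {
        // Compute |b - c| in a wider type so the difference cannot overflow.
        let diff = (b[i] as i16 - c[i] as i16).unsigned_abs() as u8;
        r[i] = a[i].wrapping_add(diff as i8); // wrapping, like the instruction
    }
    r
}

fn main() {
    let r = vaba_s8_sketch([1; 8], [10; 8], [3; 8]);
    assert_eq!(r, [8; 8]); // 1 + |10 - 3|
    println!("{:?}", r);
}
```

The vabal (Long) variants do the same but widen each lane before accumulating, so the result vector has double-width elements.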
- vabd_f32 ⚠ neon: Floating-point absolute difference between the arguments
- vabd_f64 ⚠ neon: Floating-point absolute difference between the arguments
- vabd_s8 ⚠ neon: Absolute difference between the arguments
- vabd_s16 ⚠ neon: Absolute difference between the arguments
- vabd_s32 ⚠ neon: Absolute difference between the arguments
- vabd_u8 ⚠ neon: Absolute difference between the arguments
- vabd_u16 ⚠ neon: Absolute difference between the arguments
- vabd_u32 ⚠ neon: Absolute difference between the arguments
- vabdd_f64 ⚠ neon: Floating-point absolute difference
- vabdl_high_s8 ⚠ neon: Signed Absolute difference Long
- vabdl_high_s16 ⚠ neon: Signed Absolute difference Long
- vabdl_high_s32 ⚠ neon: Signed Absolute difference Long
- vabdl_high_u8 ⚠ neon: Unsigned Absolute difference Long
- vabdl_high_u16 ⚠ neon: Unsigned Absolute difference Long
- vabdl_high_u32 ⚠ neon: Unsigned Absolute difference Long
- vabdl_s8 ⚠ neon: Signed Absolute difference Long
- vabdl_s16 ⚠ neon: Signed Absolute difference Long
- vabdl_s32 ⚠ neon: Signed Absolute difference Long
- vabdl_u8 ⚠ neon: Unsigned Absolute difference Long
- vabdl_u16 ⚠ neon: Unsigned Absolute difference Long
- vabdl_u32 ⚠ neon: Unsigned Absolute difference Long
- vabdq_f32 ⚠ neon: Floating-point absolute difference between the arguments
- vabdq_f64 ⚠ neon: Floating-point absolute difference between the arguments
- vabdq_s8 ⚠ neon: Absolute difference between the arguments
- vabdq_s16 ⚠ neon: Absolute difference between the arguments
- vabdq_s32 ⚠ neon: Absolute difference between the arguments
- vabdq_u8 ⚠ neon: Absolute difference between the arguments
- vabdq_u16 ⚠ neon: Absolute difference between the arguments
- vabdq_u32 ⚠ neon: Absolute difference between the arguments
- vabds_f32 ⚠ neon: Floating-point absolute difference
- vabs_f32 ⚠ neon: Floating-point absolute value
- vabs_f64 ⚠ neon: Floating-point absolute value
- vabs_s8 ⚠ neon: Absolute value (wrapping).
- vabs_s16 ⚠ neon: Absolute value (wrapping).
- vabs_s32 ⚠ neon: Absolute value (wrapping).
- vabs_s64 ⚠ neon: Absolute value (wrapping).
- vabsd_s64 ⚠ neon: Absolute value (wrapping).
- vabsq_f32 ⚠ neon: Floating-point absolute value
- vabsq_f64 ⚠ neon: Floating-point absolute value
- vabsq_s8 ⚠ neon: Absolute value (wrapping).
- vabsq_s16 ⚠ neon: Absolute value (wrapping).
- vabsq_s32 ⚠ neon: Absolute value (wrapping).
- vabsq_s64 ⚠ neon: Absolute value (wrapping).
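"Wrapping" in the vabs_* descriptions matters at the type's minimum value, where the true absolute value does not fit. A portable scalar sketch of the per-lane behavior:

```rust
// Sketch of vabs_s8 per lane: wrapping absolute value, so i8::MIN maps to
// itself (|-128| = 128 does not fit in i8 and wraps back to -128).
fn main() {
    assert_eq!((-5i8).wrapping_abs(), 5);
    assert_eq!(i8::MIN.wrapping_abs(), i8::MIN);
    println!("wrapping abs ok");
}
```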
- vadd_f32 ⚠ neon: Vector add.
- vadd_f64 ⚠ neon: Vector add.
- vadd_p8 ⚠ neon: Bitwise exclusive OR
- vadd_p16 ⚠ neon: Bitwise exclusive OR
- vadd_p64 ⚠ neon: Bitwise exclusive OR
- vadd_s8 ⚠ neon: Vector add.
- vadd_s16 ⚠ neon: Vector add.
- vadd_s32 ⚠ neon: Vector add.
- vadd_s64 ⚠ neon: Vector add.
- vadd_u8 ⚠ neon: Vector add.
- vadd_u16 ⚠ neon: Vector add.
- vadd_u32 ⚠ neon: Vector add.
- vadd_u64 ⚠ neon: Vector add.
- vaddd_s64 ⚠ neon: Vector add.
- vaddd_u64 ⚠ neon: Vector add.
- vaddhn_high_s16 ⚠ neon: Add returning High Narrow (high half).
- vaddhn_high_s32 ⚠ neon: Add returning High Narrow (high half).
- vaddhn_high_s64 ⚠ neon: Add returning High Narrow (high half).
- vaddhn_high_u16 ⚠ neon: Add returning High Narrow (high half).
- vaddhn_high_u32 ⚠ neon: Add returning High Narrow (high half).
- vaddhn_high_u64 ⚠ neon: Add returning High Narrow (high half).
- vaddhn_s16 ⚠ neon: Add returning High Narrow.
- vaddhn_s32 ⚠ neon: Add returning High Narrow.
- vaddhn_s64 ⚠ neon: Add returning High Narrow.
- vaddhn_u16 ⚠ neon: Add returning High Narrow.
- vaddhn_u32 ⚠ neon: Add returning High Narrow.
- vaddhn_u64 ⚠ neon: Add returning High Narrow.
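"Add returning High Narrow" adds full-width lanes and keeps only the upper half of each sum. A portable scalar sketch (not the intrinsic) of one vaddhn_s16 lane:

```rust
// Sketch of a vaddhn_s16 lane: add two 16-bit values, then narrow by
// keeping the high 8 bits of the (wrapping) sum.
fn addhn_lane(a: i16, b: i16) -> i8 {
    (a.wrapping_add(b) >> 8) as i8
}

fn main() {
    assert_eq!(addhn_lane(0x1234, 0x0100), 0x13);
    println!("addhn ok");
}
```

The _high variants write these narrowed results into the upper half of a wider destination vector, keeping the lower half from an existing vector.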
- vaddl_high_s8 ⚠ neon: Signed Add Long (vector, high half).
- vaddl_high_s16 ⚠ neon: Signed Add Long (vector, high half).
- vaddl_high_s32 ⚠ neon: Signed Add Long (vector, high half).
- vaddl_high_u8 ⚠ neon: Unsigned Add Long (vector, high half).
- vaddl_high_u16 ⚠ neon: Unsigned Add Long (vector, high half).
- vaddl_high_u32 ⚠ neon: Unsigned Add Long (vector, high half).
- vaddl_s8 ⚠ neon: Signed Add Long (vector).
- vaddl_s16 ⚠ neon: Signed Add Long (vector).
- vaddl_s32 ⚠ neon: Signed Add Long (vector).
- vaddl_u8 ⚠ neon: Unsigned Add Long (vector).
- vaddl_u16 ⚠ neon: Unsigned Add Long (vector).
- vaddl_u32 ⚠ neon: Unsigned Add Long (vector).
- vaddlv_s8 ⚠ neon: Signed Add Long across Vector
- vaddlv_s16 ⚠ neon: Signed Add Long across Vector
- vaddlv_s32 ⚠ neon: Signed Add Long across Vector
- vaddlv_u8 ⚠ neon: Unsigned Add Long across Vector
- vaddlv_u16 ⚠ neon: Unsigned Add Long across Vector
- vaddlv_u32 ⚠ neon: Unsigned Add Long across Vector
- vaddlvq_s8 ⚠ neon: Signed Add Long across Vector
- vaddlvq_s16 ⚠ neon: Signed Add Long across Vector
- vaddlvq_s32 ⚠ neon: Signed Add Long across Vector
- vaddlvq_u8 ⚠ neon: Unsigned Add Long across Vector
- vaddlvq_u16 ⚠ neon: Unsigned Add Long across Vector
- vaddlvq_u32 ⚠ neon: Unsigned Add Long across Vector
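"Add Long across Vector" widens every lane and sums them into a single scalar, so an 8-lane i8 vector reduces to an i16 without intermediate overflow. A portable scalar sketch of vaddlv_s8:

```rust
// Sketch of vaddlv_s8: widen each i8 lane to i16, then sum across the
// vector into one scalar result.
fn main() {
    let v: [i8; 8] = [100, 100, 100, 100, -1, -1, -1, -1];
    let sum: i16 = v.iter().map(|&x| x as i16).sum();
    // 400 - 4 = 396, which would overflow i8 but fits the long result.
    assert_eq!(sum, 396);
    println!("sum = {sum}");
}
```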
- vaddq_f32 ⚠ neon: Vector add.
- vaddq_f64 ⚠ neon: Vector add.
- vaddq_p8 ⚠ neon: Bitwise exclusive OR
- vaddq_p16 ⚠ neon: Bitwise exclusive OR
- vaddq_p64 ⚠ neon: Bitwise exclusive OR
- vaddq_p128 ⚠ neon: Bitwise exclusive OR
- vaddq_s8 ⚠ neon: Vector add.
- vaddq_s16 ⚠ neon: Vector add.
- vaddq_s32 ⚠ neon: Vector add.
- vaddq_s64 ⚠ neon: Vector add.
- vaddq_u8 ⚠ neon: Vector add.
- vaddq_u16 ⚠ neon: Vector add.
- vaddq_u32 ⚠ neon: Vector add.
- vaddq_u64 ⚠ neon: Vector add.
- vaddv_f32 ⚠ neon: Floating-point add across vector
- vaddv_s8 ⚠ neon: Add across vector
- vaddv_s16 ⚠ neon: Add across vector
- vaddv_s32 ⚠ neon: Add across vector
- vaddv_u8 ⚠ neon: Add across vector
- vaddv_u16 ⚠ neon: Add across vector
- vaddv_u32 ⚠ neon: Add across vector
- vaddvq_f32 ⚠ neon: Floating-point add across vector
- vaddvq_f64 ⚠ neon: Floating-point add across vector
- vaddvq_s8 ⚠ neon: Add across vector
- vaddvq_s16 ⚠ neon: Add across vector
- vaddvq_s32 ⚠ neon: Add across vector
- vaddvq_s64 ⚠ neon: Add across vector
- vaddvq_u8 ⚠ neon: Add across vector
- vaddvq_u16 ⚠ neon: Add across vector
- vaddvq_u32 ⚠ neon: Add across vector
- vaddvq_u64 ⚠ neon: Add across vector
- vaddw_high_s8 ⚠ neon: Signed Add Wide (high half).
- vaddw_high_s16 ⚠ neon: Signed Add Wide (high half).
- vaddw_high_s32 ⚠ neon: Signed Add Wide (high half).
- vaddw_high_u8 ⚠ neon: Unsigned Add Wide (high half).
- vaddw_high_u16 ⚠ neon: Unsigned Add Wide (high half).
- vaddw_high_u32 ⚠ neon: Unsigned Add Wide (high half).
- vaddw_s8 ⚠ neon: Signed Add Wide.
- vaddw_s16 ⚠ neon: Signed Add Wide.
- vaddw_s32 ⚠ neon: Signed Add Wide.
- vaddw_u8 ⚠ neon: Unsigned Add Wide.
- vaddw_u16 ⚠ neon: Unsigned Add Wide.
- vaddw_u32 ⚠ neon: Unsigned Add Wide.
- vaesdq_u8 ⚠ aes: AES single round decryption.
- vaeseq_u8 ⚠ aes: AES single round encryption.
- vaesimcq_u8 ⚠ aes: AES inverse mix columns.
- vaesmcq_u8 ⚠ aes: AES mix columns.
- vand_s8 ⚠ neon: Vector bitwise and
- vand_s16 ⚠ neon: Vector bitwise and
- vand_s32 ⚠ neon: Vector bitwise and
- vand_s64 ⚠ neon: Vector bitwise and
- vand_u8 ⚠ neon: Vector bitwise and
- vand_u16 ⚠ neon: Vector bitwise and
- vand_u32 ⚠ neon: Vector bitwise and
- vand_u64 ⚠ neon: Vector bitwise and
- vandq_s8 ⚠ neon: Vector bitwise and
- vandq_s16 ⚠ neon: Vector bitwise and
- vandq_s32 ⚠ neon: Vector bitwise and
- vandq_s64 ⚠ neon: Vector bitwise and
- vandq_u8 ⚠ neon: Vector bitwise and
- vandq_u16 ⚠ neon: Vector bitwise and
- vandq_u32 ⚠ neon: Vector bitwise and
- vandq_u64 ⚠ neon: Vector bitwise and
- vbcaxq_s8 ⚠ neon,sha3: Bit clear and exclusive OR
- vbcaxq_s16 ⚠ neon,sha3: Bit clear and exclusive OR
- vbcaxq_s32 ⚠ neon,sha3: Bit clear and exclusive OR
- vbcaxq_s64 ⚠ neon,sha3: Bit clear and exclusive OR
- vbcaxq_u8 ⚠ neon,sha3: Bit clear and exclusive OR
- vbcaxq_u16 ⚠ neon,sha3: Bit clear and exclusive OR
- vbcaxq_u32 ⚠ neon,sha3: Bit clear and exclusive OR
- vbcaxq_u64 ⚠ neon,sha3: Bit clear and exclusive OR
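"Bit clear and exclusive OR" fuses the two bitwise steps used in Keccak's chi step. A portable scalar sketch, assuming the common operand order r = a ^ (b & !c):

```rust
// Sketch of one vbcaxq lane (assumed operand order): XOR a with b after
// clearing from b the bits that are set in c.
fn bcax(a: u8, b: u8, c: u8) -> u8 {
    a ^ (b & !c)
}

fn main() {
    // b & !c = 0b0000_1100; a ^ that = 0b1111_1100.
    assert_eq!(bcax(0b1111_0000, 0b0000_1111, 0b0000_0011), 0b1111_1100);
    println!("bcax ok");
}
```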
- vbic_s8 ⚠ neon: Vector bitwise bit clear
- vbic_s16 ⚠ neon: Vector bitwise bit clear
- vbic_s32 ⚠ neon: Vector bitwise bit clear
- vbic_s64 ⚠ neon: Vector bitwise bit clear
- vbic_u8 ⚠ neon: Vector bitwise bit clear
- vbic_u16 ⚠ neon: Vector bitwise bit clear
- vbic_u32 ⚠ neon: Vector bitwise bit clear
- vbic_u64 ⚠ neon: Vector bitwise bit clear
- vbicq_s8 ⚠ neon: Vector bitwise bit clear
- vbicq_s16 ⚠ neon: Vector bitwise bit clear
- vbicq_s32 ⚠ neon: Vector bitwise bit clear
- vbicq_s64 ⚠ neon: Vector bitwise bit clear
- vbicq_u8 ⚠ neon: Vector bitwise bit clear
- vbicq_u16 ⚠ neon: Vector bitwise bit clear
- vbicq_u32 ⚠ neon: Vector bitwise bit clear
- vbicq_u64 ⚠ neon: Vector bitwise bit clear
- vbsl_f32 ⚠ neon: Bitwise Select.
- vbsl_f64 ⚠ neon: Bitwise Select. This instruction sets each bit in the destination SIMD&FP register to the corresponding bit from the first source SIMD&FP register when the original destination bit was 1, otherwise from the second source SIMD&FP register.
- vbsl_p8 ⚠ neon: Bitwise Select.
- vbsl_p16 ⚠ neon: Bitwise Select.
- vbsl_p64 ⚠ neon: Bitwise Select.
- vbsl_s8 ⚠ neon: Bitwise Select. This instruction sets each bit in the destination SIMD&FP register to the corresponding bit from the first source SIMD&FP register when the original destination bit was 1, otherwise from the second source SIMD&FP register.
- vbsl_s16 ⚠ neon: Bitwise Select.
- vbsl_s32 ⚠ neon: Bitwise Select.
- vbsl_s64 ⚠ neon: Bitwise Select.
- vbsl_u8 ⚠ neon: Bitwise Select.
- vbsl_u16 ⚠ neon: Bitwise Select.
- vbsl_u32 ⚠ neon: Bitwise Select.
- vbsl_u64 ⚠ neon: Bitwise Select.
- vbslq_f32 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_f64 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_p8 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_p16 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_p64 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_s8 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_s16 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_s32 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_s64 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_u8 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_u16 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_u32 ⚠ neon: Bitwise Select. (128-bit)
- vbslq_u64 ⚠ neon: Bitwise Select. (128-bit)
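Bitwise Select works bit-by-bit, not lane-by-lane: wherever the mask bit is 1 the result takes the bit from the first operand, otherwise from the second. A portable scalar sketch over one byte:

```rust
// Sketch of vbsl over a single byte: r = (mask & a) | (!mask & b).
fn bsl(mask: u8, a: u8, b: u8) -> u8 {
    (mask & a) | (!mask & b)
}

fn main() {
    // High nibble from a (0xA), low nibble from b (0xD).
    assert_eq!(bsl(0b1111_0000, 0xAB, 0xCD), 0xAD);
    println!("bsl ok");
}
```

Because the comparison intrinsics below return all-ones or all-zeros lanes, vbsl composes naturally with them to implement branch-free per-lane selection.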
- vcage_f32 ⚠ neon: Floating-point absolute compare greater than or equal
- vcage_f64 ⚠ neon: Floating-point absolute compare greater than or equal
- vcaged_f64 ⚠ neon: Floating-point absolute compare greater than or equal
- vcageq_f32 ⚠ neon: Floating-point absolute compare greater than or equal
- vcageq_f64 ⚠ neon: Floating-point absolute compare greater than or equal
- vcages_f32 ⚠ neon: Floating-point absolute compare greater than or equal
- vcagt_f32 ⚠ neon: Floating-point absolute compare greater than
- vcagt_f64 ⚠ neon: Floating-point absolute compare greater than
- vcagtd_f64 ⚠ neon: Floating-point absolute compare greater than
- vcagtq_f32 ⚠ neon: Floating-point absolute compare greater than
- vcagtq_f64 ⚠ neon: Floating-point absolute compare greater than
- vcagts_f32 ⚠ neon: Floating-point absolute compare greater than
- vcale_f32 ⚠ neon: Floating-point absolute compare less than or equal
- vcale_f64 ⚠ neon: Floating-point absolute compare less than or equal
- vcaled_f64 ⚠ neon: Floating-point absolute compare less than or equal
- vcaleq_f32 ⚠ neon: Floating-point absolute compare less than or equal
- vcaleq_f64 ⚠ neon: Floating-point absolute compare less than or equal
- vcales_f32 ⚠ neon: Floating-point absolute compare less than or equal
- vcalt_f32 ⚠ neon: Floating-point absolute compare less than
- vcalt_f64 ⚠ neon: Floating-point absolute compare less than
- vcaltd_f64 ⚠ neon: Floating-point absolute compare less than
- vcaltq_f32 ⚠ neon: Floating-point absolute compare less than
- vcaltq_f64 ⚠ neon: Floating-point absolute compare less than
- vcalts_f32 ⚠ neon: Floating-point absolute compare less than
- vceq_f32 ⚠ neon: Floating-point compare equal
- vceq_f64 ⚠ neon: Floating-point compare equal
- vceq_p8 ⚠ neon: Compare bitwise Equal (vector)
- vceq_p64 ⚠ neon: Compare bitwise Equal (vector)
- vceq_s8 ⚠ neon: Compare bitwise Equal (vector)
- vceq_s16 ⚠ neon: Compare bitwise Equal (vector)
- vceq_s32 ⚠ neon: Compare bitwise Equal (vector)
- vceq_s64 ⚠ neon: Compare bitwise Equal (vector)
- vceq_u8 ⚠ neon: Compare bitwise Equal (vector)
- vceq_u16 ⚠ neon: Compare bitwise Equal (vector)
- vceq_u32 ⚠ neon: Compare bitwise Equal (vector)
- vceq_u64 ⚠ neon: Compare bitwise Equal (vector)
- vceqd_f64 ⚠ neon: Floating-point compare equal
- vceqd_s64 ⚠ neon: Compare bitwise equal
- vceqd_u64 ⚠ neon: Compare bitwise equal
- vceqq_f32 ⚠ neon: Floating-point compare equal
- vceqq_f64 ⚠ neon: Floating-point compare equal
- vceqq_p8 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_p64 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_s8 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_s16 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_s32 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_s64 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_u8 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_u16 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_u32 ⚠ neon: Compare bitwise Equal (vector)
- vceqq_u64 ⚠ neon: Compare bitwise Equal (vector)
- vceqs_f32 ⚠ neon: Floating-point compare equal
- vceqz_f32 ⚠ neon: Floating-point compare bitwise equal to zero
- vceqz_f64 ⚠ neon: Floating-point compare bitwise equal to zero
- vceqz_p8 ⚠ neon: Signed compare bitwise equal to zero
- vceqz_p64 ⚠ neon: Signed compare bitwise equal to zero
- vceqz_s8 ⚠ neon: Signed compare bitwise equal to zero
- vceqz_s16 ⚠ neon: Signed compare bitwise equal to zero
- vceqz_s32 ⚠ neon: Signed compare bitwise equal to zero
- vceqz_s64 ⚠ neon: Signed compare bitwise equal to zero
- vceqz_u8 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqz_u16 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqz_u32 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqz_u64 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqzd_f64 ⚠ neon: Floating-point compare bitwise equal to zero
- vceqzd_s64 ⚠ neon: Compare bitwise equal to zero
- vceqzd_u64 ⚠ neon: Compare bitwise equal to zero
- vceqzq_f32 ⚠ neon: Floating-point compare bitwise equal to zero
- vceqzq_f64 ⚠ neon: Floating-point compare bitwise equal to zero
- vceqzq_p8 ⚠ neon: Signed compare bitwise equal to zero
- vceqzq_p64 ⚠ neon: Signed compare bitwise equal to zero
- vceqzq_s8 ⚠ neon: Signed compare bitwise equal to zero
- vceqzq_s16 ⚠ neon: Signed compare bitwise equal to zero
- vceqzq_s32 ⚠ neon: Signed compare bitwise equal to zero
- vceqzq_s64 ⚠ neon: Signed compare bitwise equal to zero
- vceqzq_u8 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqzq_u16 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqzq_u32 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqzq_u64 ⚠ neon: Unsigned compare bitwise equal to zero
- vceqzs_f32 ⚠ neon: Floating-point compare bitwise equal to zero
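All of these comparisons produce a mask vector rather than a boolean: each lane becomes all-ones when the predicate holds and all-zeros otherwise. A portable scalar sketch of one vceq lane:

```rust
// Sketch of a vceq lane: the result is a mask (all-ones on true,
// all-zeros on false), suitable for feeding into vbsl or vand.
fn ceq_lane(a: u8, b: u8) -> u8 {
    if a == b { u8::MAX } else { 0 }
}

fn main() {
    assert_eq!(ceq_lane(7, 7), 0xFF);
    assert_eq!(ceq_lane(7, 8), 0x00);
    println!("compare mask ok");
}
```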
- vcge_f32 ⚠ neon: Floating-point compare greater than or equal
- vcge_f64 ⚠ neon: Floating-point compare greater than or equal
- vcge_s8 ⚠ neon: Compare signed greater than or equal
- vcge_s16 ⚠ neon: Compare signed greater than or equal
- vcge_s32 ⚠ neon: Compare signed greater than or equal
- vcge_s64 ⚠ neon: Compare signed greater than or equal
- vcge_u8 ⚠ neon: Compare unsigned greater than or equal
- vcge_u16 ⚠ neon: Compare unsigned greater than or equal
- vcge_u32 ⚠ neon: Compare unsigned greater than or equal
- vcge_u64 ⚠ neon: Compare unsigned greater than or equal
- vcged_f64 ⚠ neon: Floating-point compare greater than or equal
- vcged_s64 ⚠ neon: Compare greater than or equal
- vcged_u64 ⚠ neon: Compare greater than or equal
- vcgeq_f32 ⚠ neon: Floating-point compare greater than or equal
- vcgeq_f64 ⚠ neon: Floating-point compare greater than or equal
- vcgeq_s8 ⚠ neon: Compare signed greater than or equal
- vcgeq_s16 ⚠ neon: Compare signed greater than or equal
- vcgeq_s32 ⚠ neon: Compare signed greater than or equal
- vcgeq_s64 ⚠ neon: Compare signed greater than or equal
- vcgeq_u8 ⚠ neon: Compare unsigned greater than or equal
- vcgeq_u16 ⚠ neon: Compare unsigned greater than or equal
- vcgeq_u32 ⚠ neon: Compare unsigned greater than or equal
- vcgeq_u64 ⚠ neon: Compare unsigned greater than or equal
- vcges_f32 ⚠ neon: Floating-point compare greater than or equal
- vcgez_f32 ⚠ neon: Floating-point compare greater than or equal to zero
- vcgez_f64 ⚠ neon: Floating-point compare greater than or equal to zero
- vcgez_s8 ⚠ neon: Compare signed greater than or equal to zero
- vcgez_s16 ⚠ neon: Compare signed greater than or equal to zero
- vcgez_s32 ⚠ neon: Compare signed greater than or equal to zero
- vcgez_s64 ⚠ neon: Compare signed greater than or equal to zero
- vcgezd_f64 ⚠ neon: Floating-point compare greater than or equal to zero
- vcgezd_s64 ⚠ neon: Compare signed greater than or equal to zero
- vcgezq_f32 ⚠ neon: Floating-point compare greater than or equal to zero
- vcgezq_f64 ⚠ neon: Floating-point compare greater than or equal to zero
- vcgezq_s8 ⚠ neon: Compare signed greater than or equal to zero
- vcgezq_s16 ⚠ neon: Compare signed greater than or equal to zero
- vcgezq_s32 ⚠ neon: Compare signed greater than or equal to zero
- vcgezq_s64 ⚠ neon: Compare signed greater than or equal to zero
- vcgezs_f32 ⚠ neon: Floating-point compare greater than or equal to zero
- vcgt_f32 ⚠ neon: Floating-point compare greater than
- vcgt_f64 ⚠ neon: Floating-point compare greater than
- vcgt_s8 ⚠ neon: Compare signed greater than
- vcgt_s16 ⚠ neon: Compare signed greater than
- vcgt_s32 ⚠ neon: Compare signed greater than
- vcgt_s64 ⚠ neon: Compare signed greater than
- vcgt_u8 ⚠ neon: Compare unsigned greater than
- vcgt_u16 ⚠ neon: Compare unsigned greater than
- vcgt_u32 ⚠ neon: Compare unsigned greater than
- vcgt_u64 ⚠ neon: Compare unsigned greater than
- vcgtd_f64 ⚠ neon: Floating-point compare greater than
- vcgtd_s64 ⚠ neon: Compare greater than
- vcgtd_u64 ⚠ neon: Compare greater than
- vcgtq_f32 ⚠ neon: Floating-point compare greater than
- vcgtq_f64 ⚠ neon: Floating-point compare greater than
- vcgtq_s8 ⚠ neon: Compare signed greater than
- vcgtq_s16 ⚠ neon: Compare signed greater than
- vcgtq_s32 ⚠ neon: Compare signed greater than
- vcgtq_s64 ⚠ neon: Compare signed greater than
- vcgtq_u8 ⚠ neon: Compare unsigned greater than
- vcgtq_u16 ⚠ neon: Compare unsigned greater than
- vcgtq_u32 ⚠ neon: Compare unsigned greater than
- vcgtq_u64 ⚠ neon: Compare unsigned greater than
- vcgts_f32 ⚠ neon: Floating-point compare greater than
- vcgtz_f32 ⚠ neon: Floating-point compare greater than zero
- vcgtz_f64 ⚠ neon: Floating-point compare greater than zero
- vcgtz_s8 ⚠ neon: Compare signed greater than zero
- vcgtz_s16 ⚠ neon: Compare signed greater than zero
- vcgtz_s32 ⚠ neon: Compare signed greater than zero
- vcgtz_s64 ⚠ neon: Compare signed greater than zero
- vcgtzd_f64 ⚠ neon: Floating-point compare greater than zero
- vcgtzd_s64 ⚠ neon: Compare signed greater than zero
- vcgtzq_f32 ⚠ neon: Floating-point compare greater than zero
- vcgtzq_f64 ⚠ neon: Floating-point compare greater than zero
- vcgtzq_s8 ⚠ neon: Compare signed greater than zero
- vcgtzq_s16 ⚠ neon: Compare signed greater than zero
- vcgtzq_s32 ⚠ neon: Compare signed greater than zero
- vcgtzq_s64 ⚠ neon: Compare signed greater than zero
- vcgtzs_f32 ⚠ neon: Floating-point compare greater than zero
- vcle_f32 ⚠ neon: Floating-point compare less than or equal
- vcle_f64 ⚠ neon: Floating-point compare less than or equal
- vcle_s8 ⚠ neon: Compare signed less than or equal
- vcle_s16 ⚠ neon: Compare signed less than or equal
- vcle_s32 ⚠ neon: Compare signed less than or equal
- vcle_s64 ⚠ neon: Compare signed less than or equal
- vcle_u8 ⚠ neon: Compare unsigned less than or equal
- vcle_u16 ⚠ neon: Compare unsigned less than or equal
- vcle_u32 ⚠ neon: Compare unsigned less than or equal
- vcle_u64 ⚠ neon: Compare unsigned less than or equal
- vcled_f64 ⚠ neon: Floating-point compare less than or equal
- vcled_s64 ⚠ neon: Compare less than or equal
- vcled_u64 ⚠ neon: Compare less than or equal
- vcleq_f32 ⚠ neon: Floating-point compare less than or equal
- vcleq_f64 ⚠ neon: Floating-point compare less than or equal
- vcleq_s8 ⚠ neon: Compare signed less than or equal
- vcleq_s16 ⚠ neon: Compare signed less than or equal
- vcleq_s32 ⚠ neon: Compare signed less than or equal
- vcleq_s64 ⚠ neon: Compare signed less than or equal
- vcleq_u8 ⚠ neon: Compare unsigned less than or equal
- vcleq_u16 ⚠ neon: Compare unsigned less than or equal
- vcleq_u32 ⚠ neon: Compare unsigned less than or equal
- vcleq_u64 ⚠ neon: Compare unsigned less than or equal
- vcles_f32 ⚠ neon: Floating-point compare less than or equal
- vclez_f32 ⚠ neon: Floating-point compare less than or equal to zero
- vclez_f64 ⚠ neon: Floating-point compare less than or equal to zero
- vclez_s8 ⚠ neon: Compare signed less than or equal to zero
- vclez_s16 ⚠ neon: Compare signed less than or equal to zero
- vclez_s32 ⚠ neon: Compare signed less than or equal to zero
- vclez_s64 ⚠ neon: Compare signed less than or equal to zero
- vclezd_f64 ⚠ neon: Floating-point compare less than or equal to zero
- vclezd_s64 ⚠ neon: Compare less than or equal to zero
- vclezq_f32 ⚠ neon: Floating-point compare less than or equal to zero
- vclezq_f64 ⚠ neon: Floating-point compare less than or equal to zero
- vclezq_s8 ⚠ neon: Compare signed less than or equal to zero
- vclezq_s16 ⚠ neon: Compare signed less than or equal to zero
- vclezq_s32 ⚠ neon: Compare signed less than or equal to zero
- vclezq_s64 ⚠ neon: Compare signed less than or equal to zero
- vclezs_f32 ⚠ neon: Compare signed less than or equal to zero
- vcls_s8 ⚠ neon: Count leading sign bits
- vcls_s16 ⚠ neon: Count leading sign bits
- vcls_s32 ⚠ neon: Count leading sign bits
- vcls_u8 ⚠ neon: Count leading sign bits
- vcls_u16 ⚠ neon: Count leading sign bits
- vcls_u32 ⚠ neon: Count leading sign bits
- vclsq_s8 ⚠ neon: Count leading sign bits
- vclsq_s16 ⚠ neon: Count leading sign bits
- vclsq_s32 ⚠ neon: Count leading sign bits
- vclsq_u8 ⚠ neon: Count leading sign bits
- vclsq_u16 ⚠ neon: Count leading sign bits
- vclsq_u32 ⚠ neon: Count leading sign bits
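"Count leading sign bits" counts how many bits immediately after the sign bit are equal to it (the sign bit itself is not counted). A portable scalar sketch for one i8 lane:

```rust
// Sketch of a vcls_s8 lane: XOR with the arithmetic shift by one turns
// the leading run of sign bits into a leading run of zeros, then the
// sign bit itself is subtracted off.
fn cls_i8(x: i8) -> u32 {
    (x ^ (x >> 1)).leading_zeros() - 1
}

fn main() {
    assert_eq!(cls_i8(0), 7);            // 0000_0000: seven bits match sign 0
    assert_eq!(cls_i8(-1), 7);           // 1111_1111: seven bits match sign 1
    assert_eq!(cls_i8(1), 6);            // 0000_0001
    assert_eq!(cls_i8(0b0100_0000), 0);  // bit after the sign already differs
    println!("cls ok");
}
```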
- vclt_f32 ⚠ neon: Floating-point compare less than
- vclt_f64 ⚠ neon: Floating-point compare less than
- vclt_s8 ⚠ neon: Compare signed less than
- vclt_s16 ⚠ neon: Compare signed less than
- vclt_s32 ⚠ neon: Compare signed less than
- vclt_s64 ⚠ neon: Compare signed less than
- vclt_u8 ⚠ neon: Compare unsigned less than
- vclt_u16 ⚠ neon: Compare unsigned less than
- vclt_u32 ⚠ neon: Compare unsigned less than
- vclt_u64 ⚠ neon: Compare unsigned less than
- vcltd_f64 ⚠ neon: Floating-point compare less than
- vcltd_s64 ⚠ neon: Compare less than
- vcltd_u64 ⚠ neon: Compare less than
- vcltq_f32 ⚠ neon: Floating-point compare less than
- vcltq_f64 ⚠ neon: Floating-point compare less than
- vcltq_s8 ⚠ neon: Compare signed less than
- vcltq_s16 ⚠ neon: Compare signed less than
- vcltq_s32 ⚠ neon: Compare signed less than
- vcltq_s64 ⚠ neon: Compare signed less than
- vcltq_u8 ⚠ neon: Compare unsigned less than
- vcltq_u16 ⚠ neon: Compare unsigned less than
- vcltq_u32 ⚠ neon: Compare unsigned less than
- vcltq_u64 ⚠ neon: Compare unsigned less than
- vclts_f32 ⚠ neon: Floating-point compare less than
- vcltz_f32 ⚠ neon: Floating-point compare less than zero
- vcltz_f64 ⚠ neon: Floating-point compare less than zero
- vcltz_s8 ⚠ neon: Compare signed less than zero
- vcltz_s16 ⚠ neon: Compare signed less than zero
- vcltz_s32 ⚠ neon: Compare signed less than zero
- vcltz_s64 ⚠ neon: Compare signed less than zero
- vcltzd_f64 ⚠ neon: Floating-point compare less than zero
- vcltzd_s64 ⚠ neon: Compare less than zero
- vcltzq_f32 ⚠ neon: Floating-point compare less than zero
- vcltzq_f64 ⚠ neon: Floating-point compare less than zero
- vcltzq_s8 ⚠ neon: Compare signed less than zero
- vcltzq_s16 ⚠ neon: Compare signed less than zero
- vcltzq_s32 ⚠ neon: Compare signed less than zero
- vcltzq_s64 ⚠ neon: Compare signed less than zero
- vcltzs_f32 ⚠ neon: Floating-point compare less than zero
- vclz_s8 ⚠ neon: Count leading zero bits
- vclz_s16 ⚠ neon: Count leading zero bits
- vclz_s32 ⚠ neon: Count leading zero bits
- vclz_u8 ⚠ neon: Count leading zero bits
- vclz_u16 ⚠ neon: Count leading zero bits
- vclz_u32 ⚠ neon: Count leading zero bits
- vclzq_s8 ⚠ neon: Count leading zero bits
- vclzq_s16 ⚠ neon: Count leading zero bits
- vclzq_s32 ⚠ neon: Count leading zero bits
- vclzq_u8 ⚠ neon: Count leading zero bits
- vclzq_u16 ⚠ neon: Count leading zero bits
- vclzq_u32 ⚠ neon: Count leading zero bits
- vcnt_p8 ⚠ neon: Population count per byte.
- vcnt_s8 ⚠ neon: Population count per byte.
- vcnt_u8 ⚠ neon: Population count per byte.
- vcntq_p8 ⚠ neon: Population count per byte.
- vcntq_s8 ⚠ neon: Population count per byte.
- vcntq_u8 ⚠ neon: Population count per byte.
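Per lane, vclz and vcnt match the leading_zeros and count_ones methods on Rust's integer types. A portable scalar sketch of one u8 lane:

```rust
// Sketch of single-lane vclz (count leading zero bits) and vcnt
// (population count) using core's integer methods.
fn main() {
    let x: u8 = 0b0001_0110;
    assert_eq!(x.leading_zeros(), 3); // what a vclz_u8 lane computes
    assert_eq!(x.count_ones(), 3);    // what a vcnt_u8 lane computes
    println!("clz/cnt ok");
}
```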
- vcombine_f32 ⚠ neon: Vector combine
- vcombine_f64 ⚠ neon: Vector combine
- vcombine_p8 ⚠ neon: Vector combine
- vcombine_p16 ⚠ neon: Vector combine
- vcombine_p64 ⚠ neon: Vector combine
- vcombine_s8 ⚠ neon: Vector combine
- vcombine_s16 ⚠ neon: Vector combine
- vcombine_s32 ⚠ neon: Vector combine
- vcombine_s64 ⚠ neon: Vector combine
- vcombine_u8 ⚠ neon: Vector combine
- vcombine_u16 ⚠ neon: Vector combine
- vcombine_u32 ⚠ neon: Vector combine
- vcombine_u64 ⚠ neon: Vector combine
- vcopy_lane_f32 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_f64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopy_lane_p8 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_p16 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_p64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopy_lane_s8 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_s16 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_s32 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_s64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopy_lane_u8 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_u16 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_u32 ⚠ neon: Insert vector element from another vector element
- vcopy_lane_u64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopy_laneq_f32 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_f64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopy_laneq_p8 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_p16 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_p64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopy_laneq_s8 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_s16 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_s32 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_s64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopy_laneq_u8 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_u16 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_u32 ⚠ neon: Insert vector element from another vector element
- vcopy_laneq_u64 ⚠ neon: Duplicate vector element to vector or scalar
- vcopyq_lane_f32 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_f64 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_p8 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_p16 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_p64 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_s8 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_s16 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_s32 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_s64 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_u8 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_u16 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_u32 ⚠ neon: Insert vector element from another vector element
- vcopyq_lane_u64 ⚠ neon: Insert vector element from another vector element
- vcopyq_laneq_f32⚠neonInsert vector element from another vector element
- vcopyq_laneq_f64⚠neonInsert vector element from another vector element
- vcopyq_laneq_p8⚠neonInsert vector element from another vector element
- vcopyq_laneq_p16⚠neonInsert vector element from another vector element
- vcopyq_laneq_p64⚠neonInsert vector element from another vector element
- vcopyq_laneq_s8⚠neonInsert vector element from another vector element
- vcopyq_laneq_s16⚠neonInsert vector element from another vector element
- vcopyq_laneq_s32⚠neonInsert vector element from another vector element
- vcopyq_laneq_s64⚠neonInsert vector element from another vector element
- vcopyq_laneq_u8⚠neonInsert vector element from another vector element
- vcopyq_laneq_u16⚠neonInsert vector element from another vector element
- vcopyq_laneq_u32⚠neonInsert vector element from another vector element
- vcopyq_laneq_u64⚠neonInsert vector element from another vector element
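The vcopy family replaces one lane of the first vector with a chosen lane of the second; both lane indices are const generics, so out-of-range indices are rejected at compile time. A scalar sketch of what `vcopyq_laneq_u32::<1, 3>` does:

```rust
fn main() {
    // vcopyq_laneq_u32::<1, 3>(a, b): lane 1 of `a` gets lane 3 of `b`.
    let a = [10u32, 11, 12, 13];
    let b = [20u32, 21, 22, 23];
    let mut out = a;
    out[1] = b[3];
    assert_eq!(out, [10, 23, 12, 13]);
}
```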
- vcreate_f32⚠neonCreate a vector from a 64-bit pattern
- vcreate_f64⚠neonCreate a vector from a 64-bit pattern
- vcreate_p8⚠neonCreate a vector from a 64-bit pattern
- vcreate_p16⚠neonCreate a vector from a 64-bit pattern
- vcreate_p64⚠neon,aesCreate a vector from a 64-bit pattern
- vcreate_s8⚠neonCreate a vector from a 64-bit pattern
- vcreate_s16⚠neonCreate a vector from a 64-bit pattern
- vcreate_s32⚠neonCreate a vector from a 64-bit pattern
- vcreate_s64⚠neonCreate a vector from a 64-bit pattern
- vcreate_u8⚠neonCreate a vector from a 64-bit pattern
- vcreate_u16⚠neonCreate a vector from a 64-bit pattern
- vcreate_u32⚠neonCreate a vector from a 64-bit pattern
- vcreate_u64⚠neonCreate a vector from a 64-bit pattern
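vcreate reinterprets a `u64` bit pattern as a 64-bit vector; lane 0 comes from the least significant bits. `to_le_bytes` models the byte-lane case:

```rust
fn main() {
    // vcreate_u8(bits) is a bitwise reinterpret of a u64 as uint8x8_t;
    // lane 0 corresponds to the least significant byte.
    let bits: u64 = 0x0706_0504_0302_0100;
    assert_eq!(bits.to_le_bytes(), [0, 1, 2, 3, 4, 5, 6, 7]);
}
```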
- vcvt_f32_f64⚠neonFloating-point convert to lower precision narrow
- vcvt_f32_s32⚠neonFixed-point convert to floating-point
- vcvt_f32_u32⚠neonFixed-point convert to floating-point
- vcvt_f64_f32⚠neonFloating-point convert to higher precision long
- vcvt_f64_s64⚠neonFixed-point convert to floating-point
- vcvt_f64_u64⚠neonFixed-point convert to floating-point
- vcvt_high_f32_f64⚠neonFloating-point convert to lower precision narrow
- vcvt_high_f64_f32⚠neonFloating-point convert to higher precision long
- vcvt_n_f32_s32⚠neonFixed-point convert to floating-point
- vcvt_n_f32_u32⚠neonFixed-point convert to floating-point
- vcvt_n_f64_s64⚠neonFixed-point convert to floating-point
- vcvt_n_f64_u64⚠neonFixed-point convert to floating-point
- vcvt_n_s32_f32⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvt_n_s64_f64⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvt_n_u32_f32⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvt_n_u64_f64⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvt_s32_f32⚠neonFloating-point convert to signed fixed-point, rounding toward zero
- vcvt_s64_f64⚠neonFloating-point convert to signed fixed-point, rounding toward zero
- vcvt_u32_f32⚠neonFloating-point convert to unsigned fixed-point, rounding toward zero
- vcvt_u64_f64⚠neonFloating-point convert to unsigned fixed-point, rounding toward zero
- vcvta_s32_f32⚠neonFloating-point convert to signed integer, rounding to nearest with ties to away
- vcvta_s64_f64⚠neonFloating-point convert to signed integer, rounding to nearest with ties to away
- vcvta_u32_f32⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to away
- vcvta_u64_f64⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to away
- vcvtad_s64_f64⚠neonFloating-point convert to integer, rounding to nearest with ties to away
- vcvtad_u64_f64⚠neonFloating-point convert to integer, rounding to nearest with ties to away
- vcvtaq_s32_f32⚠neonFloating-point convert to signed integer, rounding to nearest with ties to away
- vcvtaq_s64_f64⚠neonFloating-point convert to signed integer, rounding to nearest with ties to away
- vcvtaq_u32_f32⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to away
- vcvtaq_u64_f64⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to away
- vcvtas_s32_f32⚠neonFloating-point convert to integer, rounding to nearest with ties to away
- vcvtas_u32_f32⚠neonFloating-point convert to integer, rounding to nearest with ties to away
- vcvtd_f64_s64⚠neonFixed-point convert to floating-point
- vcvtd_f64_u64⚠neonFixed-point convert to floating-point
- vcvtd_n_f64_s64⚠neonFixed-point convert to floating-point
- vcvtd_n_f64_u64⚠neonFixed-point convert to floating-point
- vcvtd_n_s64_f64⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvtd_n_u64_f64⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvtd_s64_f64⚠neonFixed-point convert to floating-point
- vcvtd_u64_f64⚠neonFixed-point convert to floating-point
- vcvtm_s32_f32⚠neonFloating-point convert to signed integer, rounding toward minus infinity
- vcvtm_s64_f64⚠neonFloating-point convert to signed integer, rounding toward minus infinity
- vcvtm_u32_f32⚠neonFloating-point convert to unsigned integer, rounding toward minus infinity
- vcvtm_u64_f64⚠neonFloating-point convert to unsigned integer, rounding toward minus infinity
- vcvtmd_s64_f64⚠neonFloating-point convert to signed integer, rounding toward minus infinity
- vcvtmd_u64_f64⚠neonFloating-point convert to unsigned integer, rounding toward minus infinity
- vcvtmq_s32_f32⚠neonFloating-point convert to signed integer, rounding toward minus infinity
- vcvtmq_s64_f64⚠neonFloating-point convert to signed integer, rounding toward minus infinity
- vcvtmq_u32_f32⚠neonFloating-point convert to unsigned integer, rounding toward minus infinity
- vcvtmq_u64_f64⚠neonFloating-point convert to unsigned integer, rounding toward minus infinity
- vcvtms_s32_f32⚠neonFloating-point convert to signed integer, rounding toward minus infinity
- vcvtms_u32_f32⚠neonFloating-point convert to unsigned integer, rounding toward minus infinity
- vcvtn_s32_f32⚠neonFloating-point convert to signed integer, rounding to nearest with ties to even
- vcvtn_s64_f64⚠neonFloating-point convert to signed integer, rounding to nearest with ties to even
- vcvtn_u32_f32⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to even
- vcvtn_u64_f64⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to even
- vcvtnd_s64_f64⚠neonFloating-point convert to signed integer, rounding to nearest with ties to even
- vcvtnd_u64_f64⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to even
- vcvtnq_s32_f32⚠neonFloating-point convert to signed integer, rounding to nearest with ties to even
- vcvtnq_s64_f64⚠neonFloating-point convert to signed integer, rounding to nearest with ties to even
- vcvtnq_u32_f32⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to even
- vcvtnq_u64_f64⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to even
- vcvtns_s32_f32⚠neonFloating-point convert to signed integer, rounding to nearest with ties to even
- vcvtns_u32_f32⚠neonFloating-point convert to unsigned integer, rounding to nearest with ties to even
- vcvtp_s32_f32⚠neonFloating-point convert to signed integer, rounding toward plus infinity
- vcvtp_s64_f64⚠neonFloating-point convert to signed integer, rounding toward plus infinity
- vcvtp_u32_f32⚠neonFloating-point convert to unsigned integer, rounding toward plus infinity
- vcvtp_u64_f64⚠neonFloating-point convert to unsigned integer, rounding toward plus infinity
- vcvtpd_s64_f64⚠neonFloating-point convert to signed integer, rounding toward plus infinity
- vcvtpd_u64_f64⚠neonFloating-point convert to unsigned integer, rounding toward plus infinity
- vcvtpq_s32_f32⚠neonFloating-point convert to signed integer, rounding toward plus infinity
- vcvtpq_s64_f64⚠neonFloating-point convert to signed integer, rounding toward plus infinity
- vcvtpq_u32_f32⚠neonFloating-point convert to unsigned integer, rounding toward plus infinity
- vcvtpq_u64_f64⚠neonFloating-point convert to unsigned integer, rounding toward plus infinity
- vcvtps_s32_f32⚠neonFloating-point convert to signed integer, rounding toward plus infinity
- vcvtps_u32_f32⚠neonFloating-point convert to unsigned integer, rounding toward plus infinity
- vcvtq_f32_s32⚠neonFixed-point convert to floating-point
- vcvtq_f32_u32⚠neonFixed-point convert to floating-point
- vcvtq_f64_s64⚠neonFixed-point convert to floating-point
- vcvtq_f64_u64⚠neonFixed-point convert to floating-point
- vcvtq_n_f32_s32⚠neonFixed-point convert to floating-point
- vcvtq_n_f32_u32⚠neonFixed-point convert to floating-point
- vcvtq_n_f64_s64⚠neonFixed-point convert to floating-point
- vcvtq_n_f64_u64⚠neonFixed-point convert to floating-point
- vcvtq_n_s32_f32⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvtq_n_s64_f64⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvtq_n_u32_f32⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvtq_n_u64_f64⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvtq_s32_f32⚠neonFloating-point convert to signed fixed-point, rounding toward zero
- vcvtq_s64_f64⚠neonFloating-point convert to signed fixed-point, rounding toward zero
- vcvtq_u32_f32⚠neonFloating-point convert to unsigned fixed-point, rounding toward zero
- vcvtq_u64_f64⚠neonFloating-point convert to unsigned fixed-point, rounding toward zero
- vcvts_f32_s32⚠neonFixed-point convert to floating-point
- vcvts_f32_u32⚠neonFixed-point convert to floating-point
- vcvts_n_f32_s32⚠neonFixed-point convert to floating-point
- vcvts_n_f32_u32⚠neonFixed-point convert to floating-point
- vcvts_n_s32_f32⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvts_n_u32_f32⚠neonFloating-point convert to fixed-point, rounding toward zero
- vcvts_s32_f32⚠neonFixed-point convert to floating-point
- vcvts_u32_f32⚠neonFixed-point convert to floating-point
- vcvtx_f32_f64⚠neonFloating-point convert to lower precision narrow, rounding to odd
- vcvtx_high_f32_f64⚠neonFloating-point convert to lower precision narrow, rounding to odd
- vcvtxd_f32_f64⚠neonFloating-point convert to lower precision narrow, rounding to odd
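The float-to-integer families above differ only in rounding mode: vcvt truncates toward zero, vcvtm and vcvtp round toward minus and plus infinity, vcvta rounds ties away from zero, and vcvtn rounds ties to even. A scalar sketch (the ties-to-even helper is hand-rolled rather than assuming a particular Rust version):

```rust
// Ties-to-even rounding (vcvtn* behaviour), built from f64::round,
// which rounds ties away from zero (vcvta* behaviour).
fn round_ties_even(x: f64) -> f64 {
    let r = x.round();
    if (x - x.trunc()).abs() == 0.5 && r % 2.0 != 0.0 {
        r - x.signum()
    } else {
        r
    }
}

fn main() {
    let x = 2.5f64;
    assert_eq!(x.trunc(), 2.0); // vcvt*:  toward zero
    assert_eq!(x.floor(), 2.0); // vcvtm*: toward minus infinity
    assert_eq!(x.ceil(), 3.0);  // vcvtp*: toward plus infinity
    assert_eq!(x.round(), 3.0); // vcvta*: ties away from zero
    assert_eq!(round_ties_even(x), 2.0); // vcvtn*: ties to even
    assert_eq!(round_ties_even(3.5), 4.0);
}
```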
- vdiv_f32⚠neonDivide
- vdiv_f64⚠neonDivide
- vdivq_f32⚠neonDivide
- vdivq_f64⚠neonDivide
- vdup_lane_f32⚠neonSet all vector lanes to the same value
- vdup_lane_f64⚠neonSet all vector lanes to the same value
- vdup_lane_p8⚠neonSet all vector lanes to the same value
- vdup_lane_p16⚠neonSet all vector lanes to the same value
- vdup_lane_p64⚠neonSet all vector lanes to the same value
- vdup_lane_s8⚠neonSet all vector lanes to the same value
- vdup_lane_s16⚠neonSet all vector lanes to the same value
- vdup_lane_s32⚠neonSet all vector lanes to the same value
- vdup_lane_s64⚠neonSet all vector lanes to the same value
- vdup_lane_u8⚠neonSet all vector lanes to the same value
- vdup_lane_u16⚠neonSet all vector lanes to the same value
- vdup_lane_u32⚠neonSet all vector lanes to the same value
- vdup_lane_u64⚠neonSet all vector lanes to the same value
- vdup_laneq_f32⚠neonSet all vector lanes to the same value
- vdup_laneq_f64⚠neonSet all vector lanes to the same value
- vdup_laneq_p8⚠neonSet all vector lanes to the same value
- vdup_laneq_p16⚠neonSet all vector lanes to the same value
- vdup_laneq_p64⚠neonSet all vector lanes to the same value
- vdup_laneq_s8⚠neonSet all vector lanes to the same value
- vdup_laneq_s16⚠neonSet all vector lanes to the same value
- vdup_laneq_s32⚠neonSet all vector lanes to the same value
- vdup_laneq_s64⚠neonSet all vector lanes to the same value
- vdup_laneq_u8⚠neonSet all vector lanes to the same value
- vdup_laneq_u16⚠neonSet all vector lanes to the same value
- vdup_laneq_u32⚠neonSet all vector lanes to the same value
- vdup_laneq_u64⚠neonSet all vector lanes to the same value
- vdup_n_f32⚠neonDuplicate vector element to vector or scalar
- vdup_n_f64⚠neonDuplicate vector element to vector or scalar
- vdup_n_p8⚠neonDuplicate vector element to vector or scalar
- vdup_n_p16⚠neonDuplicate vector element to vector or scalar
- vdup_n_p64⚠neonDuplicate vector element to vector or scalar
- vdup_n_s8⚠neonDuplicate vector element to vector or scalar
- vdup_n_s16⚠neonDuplicate vector element to vector or scalar
- vdup_n_s32⚠neonDuplicate vector element to vector or scalar
- vdup_n_s64⚠neonDuplicate vector element to vector or scalar
- vdup_n_u8⚠neonDuplicate vector element to vector or scalar
- vdup_n_u16⚠neonDuplicate vector element to vector or scalar
- vdup_n_u32⚠neonDuplicate vector element to vector or scalar
- vdup_n_u64⚠neonDuplicate vector element to vector or scalar
- vdupb_lane_p8⚠neonSet all vector lanes to the same value
- vdupb_lane_s8⚠neonSet all vector lanes to the same value
- vdupb_lane_u8⚠neonSet all vector lanes to the same value
- vdupb_laneq_p8⚠neonSet all vector lanes to the same value
- vdupb_laneq_s8⚠neonSet all vector lanes to the same value
- vdupb_laneq_u8⚠neonSet all vector lanes to the same value
- vdupd_lane_f64⚠neonSet all vector lanes to the same value
- vdupd_lane_s64⚠neonSet all vector lanes to the same value
- vdupd_lane_u64⚠neonSet all vector lanes to the same value
- vdupd_laneq_f64⚠neonSet all vector lanes to the same value
- vdupd_laneq_s64⚠neonSet all vector lanes to the same value
- vdupd_laneq_u64⚠neonSet all vector lanes to the same value
- vduph_lane_p16⚠neonSet all vector lanes to the same value
- vduph_lane_s16⚠neonSet all vector lanes to the same value
- vduph_lane_u16⚠neonSet all vector lanes to the same value
- vduph_laneq_p16⚠neonSet all vector lanes to the same value
- vduph_laneq_s16⚠neonSet all vector lanes to the same value
- vduph_laneq_u16⚠neonSet all vector lanes to the same value
- vdupq_lane_f32⚠neonSet all vector lanes to the same value
- vdupq_lane_f64⚠neonSet all vector lanes to the same value
- vdupq_lane_p8⚠neonSet all vector lanes to the same value
- vdupq_lane_p16⚠neonSet all vector lanes to the same value
- vdupq_lane_p64⚠neonSet all vector lanes to the same value
- vdupq_lane_s8⚠neonSet all vector lanes to the same value
- vdupq_lane_s16⚠neonSet all vector lanes to the same value
- vdupq_lane_s32⚠neonSet all vector lanes to the same value
- vdupq_lane_s64⚠neonSet all vector lanes to the same value
- vdupq_lane_u8⚠neonSet all vector lanes to the same value
- vdupq_lane_u16⚠neonSet all vector lanes to the same value
- vdupq_lane_u32⚠neonSet all vector lanes to the same value
- vdupq_lane_u64⚠neonSet all vector lanes to the same value
- vdupq_laneq_f32⚠neonSet all vector lanes to the same value
- vdupq_laneq_f64⚠neonSet all vector lanes to the same value
- vdupq_laneq_p8⚠neonSet all vector lanes to the same value
- vdupq_laneq_p16⚠neonSet all vector lanes to the same value
- vdupq_laneq_p64⚠neonSet all vector lanes to the same value
- vdupq_laneq_s8⚠neonSet all vector lanes to the same value
- vdupq_laneq_s16⚠neonSet all vector lanes to the same value
- vdupq_laneq_s32⚠neonSet all vector lanes to the same value
- vdupq_laneq_s64⚠neonSet all vector lanes to the same value
- vdupq_laneq_u8⚠neonSet all vector lanes to the same value
- vdupq_laneq_u16⚠neonSet all vector lanes to the same value
- vdupq_laneq_u32⚠neonSet all vector lanes to the same value
- vdupq_laneq_u64⚠neonSet all vector lanes to the same value
- vdupq_n_f32⚠neonDuplicate vector element to vector or scalar
- vdupq_n_f64⚠neonDuplicate vector element to vector or scalar
- vdupq_n_p8⚠neonDuplicate vector element to vector or scalar
- vdupq_n_p16⚠neonDuplicate vector element to vector or scalar
- vdupq_n_p64⚠neonDuplicate vector element to vector or scalar
- vdupq_n_s8⚠neonDuplicate vector element to vector or scalar
- vdupq_n_s16⚠neonDuplicate vector element to vector or scalar
- vdupq_n_s32⚠neonDuplicate vector element to vector or scalar
- vdupq_n_s64⚠neonDuplicate vector element to vector or scalar
- vdupq_n_u8⚠neonDuplicate vector element to vector or scalar
- vdupq_n_u16⚠neonDuplicate vector element to vector or scalar
- vdupq_n_u32⚠neonDuplicate vector element to vector or scalar
- vdupq_n_u64⚠neonDuplicate vector element to vector or scalar
- vdups_lane_f32⚠neonSet all vector lanes to the same value
- vdups_lane_s32⚠neonSet all vector lanes to the same value
- vdups_lane_u32⚠neonSet all vector lanes to the same value
- vdups_laneq_f32⚠neonSet all vector lanes to the same value
- vdups_laneq_s32⚠neonSet all vector lanes to the same value
- vdups_laneq_u32⚠neonSet all vector lanes to the same value
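The vdup_lane and vdupq_lane forms broadcast one lane (chosen by a const generic) to every lane of the result, while vdup_n and vdupq_n broadcast a scalar. A sketch:

```rust
fn main() {
    // vdupq_laneq_s32::<2>(a): broadcast lane 2 to all four lanes.
    let a = [7i32, 8, 9, 10];
    assert_eq!([a[2]; 4], [9, 9, 9, 9]);
    // vdupq_n_s32(42): broadcast a scalar to all four lanes.
    assert_eq!([42i32; 4], [42, 42, 42, 42]);
}
```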
- veor3q_s8⚠neon,sha3Three-way exclusive OR
- veor3q_s16⚠neon,sha3Three-way exclusive OR
- veor3q_s32⚠neon,sha3Three-way exclusive OR
- veor3q_s64⚠neon,sha3Three-way exclusive OR
- veor3q_u8⚠neon,sha3Three-way exclusive OR
- veor3q_u16⚠neon,sha3Three-way exclusive OR
- veor3q_u32⚠neon,sha3Three-way exclusive OR
- veor3q_u64⚠neon,sha3Three-way exclusive OR
- veor_s8⚠neonVector bitwise exclusive or (vector)
- veor_s16⚠neonVector bitwise exclusive or (vector)
- veor_s32⚠neonVector bitwise exclusive or (vector)
- veor_s64⚠neonVector bitwise exclusive or (vector)
- veor_u8⚠neonVector bitwise exclusive or (vector)
- veor_u16⚠neonVector bitwise exclusive or (vector)
- veor_u32⚠neonVector bitwise exclusive or (vector)
- veor_u64⚠neonVector bitwise exclusive or (vector)
- veorq_s8⚠neonVector bitwise exclusive or (vector)
- veorq_s16⚠neonVector bitwise exclusive or (vector)
- veorq_s32⚠neonVector bitwise exclusive or (vector)
- veorq_s64⚠neonVector bitwise exclusive or (vector)
- veorq_u8⚠neonVector bitwise exclusive or (vector)
- veorq_u16⚠neonVector bitwise exclusive or (vector)
- veorq_u32⚠neonVector bitwise exclusive or (vector)
- veorq_u64⚠neonVector bitwise exclusive or (vector)
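veorq is the ordinary lane-wise XOR; veor3q (which additionally requires the sha3 feature) folds three operands into a single EOR3 instruction. Per lane both reduce to:

```rust
fn main() {
    let (a, b, c) = (0b1100u8, 0b1010u8, 0b0110u8);
    assert_eq!(a ^ b, 0b0110);     // veor_u8, one lane
    assert_eq!(a ^ b ^ c, 0b0000); // veor3q_u8, one lane
}
```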
- vext_f32⚠neonExtract vector from pair of vectors
- vext_f64⚠neonExtract vector from pair of vectors
- vext_p8⚠neonExtract vector from pair of vectors
- vext_p16⚠neonExtract vector from pair of vectors
- vext_p64⚠neonExtract vector from pair of vectors
- vext_s8⚠neonExtract vector from pair of vectors
- vext_s16⚠neonExtract vector from pair of vectors
- vext_s32⚠neonExtract vector from pair of vectors
- vext_s64⚠neonExtract vector from pair of vectors
- vext_u8⚠neonExtract vector from pair of vectors
- vext_u16⚠neonExtract vector from pair of vectors
- vext_u32⚠neonExtract vector from pair of vectors
- vext_u64⚠neonExtract vector from pair of vectors
- vextq_f32⚠neonExtract vector from pair of vectors
- vextq_f64⚠neonExtract vector from pair of vectors
- vextq_p8⚠neonExtract vector from pair of vectors
- vextq_p16⚠neonExtract vector from pair of vectors
- vextq_p64⚠neonExtract vector from pair of vectors
- vextq_s8⚠neonExtract vector from pair of vectors
- vextq_s16⚠neonExtract vector from pair of vectors
- vextq_s32⚠neonExtract vector from pair of vectors
- vextq_s64⚠neonExtract vector from pair of vectors
- vextq_u8⚠neonExtract vector from pair of vectors
- vextq_u16⚠neonExtract vector from pair of vectors
- vextq_u32⚠neonExtract vector from pair of vectors
- vextq_u64⚠neonExtract vector from pair of vectors
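vext reads a full vector's worth of lanes from the concatenation of its two arguments, starting at the const-generic offset N: the top lanes of the first vector followed by the bottom N lanes of the second. A sketch of `vext_u8::<3>`:

```rust
fn main() {
    let a = [0u8, 1, 2, 3, 4, 5, 6, 7];
    let b = [8u8, 9, 10, 11, 12, 13, 14, 15];
    let n = 3; // models the const generic lane offset
    let mut pair = [0u8; 16];
    pair[..8].copy_from_slice(&a);
    pair[8..].copy_from_slice(&b);
    let mut out = [0u8; 8];
    out.copy_from_slice(&pair[n..n + 8]);
    assert_eq!(out, [3, 4, 5, 6, 7, 8, 9, 10]);
}
```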
- vfma_f32⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfma_f64⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfma_lane_f32⚠neonFloating-point fused multiply-add to accumulator
- vfma_lane_f64⚠neonFloating-point fused multiply-add to accumulator
- vfma_laneq_f32⚠neonFloating-point fused multiply-add to accumulator
- vfma_laneq_f64⚠neonFloating-point fused multiply-add to accumulator
- vfma_n_f32⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfma_n_f64⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfmad_lane_f64⚠neonFloating-point fused multiply-add to accumulator
- vfmad_laneq_f64⚠neonFloating-point fused multiply-add to accumulator
- vfmaq_f32⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfmaq_f64⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfmaq_lane_f32⚠neonFloating-point fused multiply-add to accumulator
- vfmaq_lane_f64⚠neonFloating-point fused multiply-add to accumulator
- vfmaq_laneq_f32⚠neonFloating-point fused multiply-add to accumulator
- vfmaq_laneq_f64⚠neonFloating-point fused multiply-add to accumulator
- vfmaq_n_f32⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfmaq_n_f64⚠neonFloating-point fused multiply-add to accumulator (vector)
- vfmas_lane_f32⚠neonFloating-point fused multiply-add to accumulator
- vfmas_laneq_f32⚠neonFloating-point fused multiply-add to accumulator
- vfms_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vfms_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfms_lane_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vfms_lane_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfms_laneq_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vfms_laneq_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfms_n_f32⚠neonFloating-point fused multiply-subtract from accumulator (vector)
- vfms_n_f64⚠neonFloating-point fused multiply-subtract from accumulator (vector)
- vfmsd_lane_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsd_laneq_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsq_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsq_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsq_lane_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsq_lane_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsq_laneq_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsq_laneq_f64⚠neonFloating-point fused multiply-subtract from accumulator
- vfmsq_n_f32⚠neonFloating-point fused multiply-subtract from accumulator (vector)
- vfmsq_n_f64⚠neonFloating-point fused multiply-subtract from accumulator (vector)
- vfmss_lane_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vfmss_laneq_f32⚠neonFloating-point fused multiply-subtract from accumulator
- vget_high_f32⚠neonDuplicate vector element to vector or scalar
- vget_high_f64⚠neonDuplicate vector element to vector or scalar
- vget_high_p8⚠neonDuplicate vector element to vector or scalar
- vget_high_p16⚠neonDuplicate vector element to vector or scalar
- vget_high_p64⚠neonDuplicate vector element to vector or scalar
- vget_high_s8⚠neonDuplicate vector element to vector or scalar
- vget_high_s16⚠neonDuplicate vector element to vector or scalar
- vget_high_s32⚠neonDuplicate vector element to vector or scalar
- vget_high_s64⚠neonDuplicate vector element to vector or scalar
- vget_high_u8⚠neonDuplicate vector element to vector or scalar
- vget_high_u16⚠neonDuplicate vector element to vector or scalar
- vget_high_u32⚠neonDuplicate vector element to vector or scalar
- vget_high_u64⚠neonDuplicate vector element to vector or scalar
- vget_lane_f32⚠neonDuplicate vector element to vector or scalar
- vget_lane_f64⚠neonDuplicate vector element to vector or scalar
- vget_lane_p8⚠neonMove vector element to general-purpose register
- vget_lane_p16⚠neonMove vector element to general-purpose register
- vget_lane_p64⚠neonMove vector element to general-purpose register
- vget_lane_s8⚠neonMove vector element to general-purpose register
- vget_lane_s16⚠neonMove vector element to general-purpose register
- vget_lane_s32⚠neonMove vector element to general-purpose register
- vget_lane_s64⚠neonMove vector element to general-purpose register
- vget_lane_u8⚠neonMove vector element to general-purpose register
- vget_lane_u16⚠neonMove vector element to general-purpose register
- vget_lane_u32⚠neonMove vector element to general-purpose register
- vget_lane_u64⚠neonMove vector element to general-purpose register
- vget_low_f32⚠neonDuplicate vector element to vector or scalar
- vget_low_f64⚠neonDuplicate vector element to vector or scalar
- vget_low_p8⚠neonDuplicate vector element to vector or scalar
- vget_low_p16⚠neonDuplicate vector element to vector or scalar
- vget_low_p64⚠neonDuplicate vector element to vector or scalar
- vget_low_s8⚠neonDuplicate vector element to vector or scalar
- vget_low_s16⚠neonDuplicate vector element to vector or scalar
- vget_low_s32⚠neonDuplicate vector element to vector or scalar
- vget_low_s64⚠neonDuplicate vector element to vector or scalar
- vget_low_u8⚠neonDuplicate vector element to vector or scalar
- vget_low_u16⚠neonDuplicate vector element to vector or scalar
- vget_low_u32⚠neonDuplicate vector element to vector or scalar
- vget_low_u64⚠neonDuplicate vector element to vector or scalar
- vgetq_lane_f32⚠neonDuplicate vector element to vector or scalar
- vgetq_lane_f64⚠neonDuplicate vector element to vector or scalar
- vgetq_lane_p8⚠neonMove vector element to general-purpose register
- vgetq_lane_p16⚠neonMove vector element to general-purpose register
- vgetq_lane_p64⚠neonMove vector element to general-purpose register
- vgetq_lane_s8⚠neonMove vector element to general-purpose register
- vgetq_lane_s16⚠neonMove vector element to general-purpose register
- vgetq_lane_s32⚠neonMove vector element to general-purpose register
- vgetq_lane_s64⚠neonMove vector element to general-purpose register
- vgetq_lane_u8⚠neonMove vector element to general-purpose register
- vgetq_lane_u16⚠neonMove vector element to general-purpose register
- vgetq_lane_u32⚠neonMove vector element to general-purpose register
- vgetq_lane_u64⚠neonMove vector element to general-purpose register
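vget_low and vget_high split a 128-bit vector into its bottom and top halves; vget_lane and vgetq_lane extract a single lane (the index is a const generic, checked at compile time). Over plain arrays:

```rust
fn main() {
    let q: [u8; 16] = core::array::from_fn(|i| i as u8);
    let mut low = [0u8; 8];
    let mut high = [0u8; 8];
    low.copy_from_slice(&q[..8]);  // vget_low_u8
    high.copy_from_slice(&q[8..]); // vget_high_u8
    assert_eq!(low, [0, 1, 2, 3, 4, 5, 6, 7]);
    assert_eq!(high, [8, 9, 10, 11, 12, 13, 14, 15]);
    assert_eq!(q[11], 11); // vgetq_lane_u8::<11>(q)
}
```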
- vhadd_s8⚠neonHalving add
- vhadd_s16⚠neonHalving add
- vhadd_s32⚠neonHalving add
- vhadd_u8⚠neonHalving add
- vhadd_u16⚠neonHalving add
- vhadd_u32⚠neonHalving add
- vhaddq_s8⚠neonHalving add
- vhaddq_s16⚠neonHalving add
- vhaddq_s32⚠neonHalving add
- vhaddq_u8⚠neonHalving add
- vhaddq_u16⚠neonHalving add
- vhaddq_u32⚠neonHalving add
- vhsub_s8⚠neonSigned halving subtract
- vhsub_s16⚠neonSigned halving subtract
- vhsub_s32⚠neonSigned halving subtract
- vhsub_u8⚠neonUnsigned halving subtract
- vhsub_u16⚠neonUnsigned halving subtract
- vhsub_u32⚠neonUnsigned halving subtract
- vhsubq_s8⚠neonSigned halving subtract
- vhsubq_s16⚠neonSigned halving subtract
- vhsubq_s32⚠neonSigned halving subtract
- vhsubq_u8⚠neonUnsigned halving subtract
- vhsubq_u16⚠neonUnsigned halving subtract
- vhsubq_u32⚠neonUnsigned halving subtract
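The halving add and subtract intrinsics compute (a + b) >> 1 and (a - b) >> 1 per lane, with the intermediate evaluated at double width so it cannot overflow. A u8 lane sketch:

```rust
// One lane of vhadd_u8: widen, add, shift, narrow.
fn hadd_u8(a: u8, b: u8) -> u8 {
    ((a as u16 + b as u16) >> 1) as u8
}

// One lane of vhsub_u8: the wide difference may be negative; the
// arithmetic shift plus truncation matches the instruction's wraparound.
fn hsub_u8(a: u8, b: u8) -> u8 {
    ((a as i16 - b as i16) >> 1) as u8
}

fn main() {
    assert_eq!(hadd_u8(255, 255), 255); // no overflow despite 255 + 255
    assert_eq!(hadd_u8(7, 8), 7);       // the .5 is truncated
    assert_eq!(hsub_u8(10, 4), 3);
}
```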
- vld1_dup_f32⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_f64⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_p8⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_p16⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_p64⚠neon,aesLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_s8⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_s16⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_s32⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_s64⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_u8⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_u16⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_u32⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_dup_u64⚠neonLoad one single-element structure and replicate to all lanes (of one register).
- vld1_f32⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_f32_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_f32_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_f32_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_f64⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_f64_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_f64_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_f64_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_lane_f32⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_f64⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_p8⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_p16⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_p64⚠neon,aesLoad one single-element structure to one lane of one register.
- vld1_lane_s8⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_s16⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_s32⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_s64⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_u8⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_u16⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_u32⚠neonLoad one single-element structure to one lane of one register.
- vld1_lane_u64⚠neonLoad one single-element structure to one lane of one register.
- vld1_p8⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_p8_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_p8_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_p8_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_p16⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_p16_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_p16_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_p16_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_p64⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers.
- vld1_p64_x2⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers
- vld1_p64_x3⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers
- vld1_p64_x4⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers
- vld1_s8⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_s8_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s8_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s8_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s16⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_s16_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s16_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s16_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s32⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_s32_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s32_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s32_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s64⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_s64_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s64_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_s64_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u8⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_u8_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u8_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u8_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u16⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_u16_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u16_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u16_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u32⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_u32_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u32_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u32_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u64⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1_u64_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u64_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1_u64_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_dup_f32⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_f64⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_p8⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_p16⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_p64⚠neon,aesLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_s8⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_s16⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_s32⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_s64⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_u8⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_u16⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_u32⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_dup_u64⚠neonLoad one single-element structure and Replicate to all lanes (of one register).
- vld1q_f32⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_f32_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_f32_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_f32_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_f64⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_f64_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_f64_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_f64_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_lane_f32⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_f64⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_p8⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_p16⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_p64⚠neon,aesLoad one single-element structure to one lane of one register.
- vld1q_lane_s8⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_s16⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_s32⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_s64⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_u8⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_u16⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_u32⚠neonLoad one single-element structure to one lane of one register.
- vld1q_lane_u64⚠neonLoad one single-element structure to one lane of one register.
- vld1q_p8⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_p8_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p8_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p8_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p16⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_p16_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p16_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p16_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p64⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_p64_x2⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p64_x3⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers
- vld1q_p64_x4⚠neon,aesLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s8⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_s8_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s8_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s8_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s16⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_s16_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s16_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s16_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s32⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_s32_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s32_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s32_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s64⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_s64_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s64_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_s64_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u8⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_u8_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u8_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u8_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u16⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_u16_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u16_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u16_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u32⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_u32_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u32_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u32_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u64⚠neonLoad multiple single-element structures to one, two, three, or four registers.
- vld1q_u64_x2⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u64_x3⚠neonLoad multiple single-element structures to one, two, three, or four registers
- vld1q_u64_x4⚠neonLoad multiple single-element structures to one, two, three, or four registers
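The vld1 family above performs plain contiguous loads. A minimal sketch of `vld1q_f32` feeding a lane-wise reduction (`vaddvq_f32`, from elsewhere in this module); the non-AArch64 `sum4` is hypothetical fallback glue, added only so the snippet builds on other targets:

```rust
// Sketch: vld1q_f32 loads four contiguous f32 values into a 128-bit
// float32x4_t register; vaddvq_f32 then adds its four lanes.
#[cfg(target_arch = "aarch64")]
fn sum4(data: &[f32; 4]) -> f32 {
    // Safety: `data` provides exactly the four readable f32 values
    // that vld1q_f32 dereferences.
    unsafe {
        use std::arch::aarch64::*;
        let v = vld1q_f32(data.as_ptr());
        vaddvq_f32(v)
    }
}

// Portable scalar fallback so the sketch compiles off AArch64.
#[cfg(not(target_arch = "aarch64"))]
fn sum4(data: &[f32; 4]) -> f32 {
    data.iter().sum()
}

fn main() {
    assert_eq!(sum4(&[1.0, 2.0, 3.0, 4.0]), 10.0);
    println!("ok");
}
```

Note that `vld1q_f32` has no alignment requirement beyond that of `f32`; the `_x2`/`_x3`/`_x4` variants load two, three, or four such registers from one contiguous block.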
- vld2_dup_f32⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_f64⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_p8⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_p16⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_p64⚠neon,aesLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_s8⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_s16⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_s32⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_s64⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_u8⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_u16⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_u32⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_dup_u64⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2_f32⚠neonLoad multiple 2-element structures to two registers
- vld2_f64⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_f32⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_f64⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_p8⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_p16⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_p64⚠neon,aesLoad multiple 2-element structures to two registers
- vld2_lane_s8⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_s16⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_s32⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_s64⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_u8⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_u16⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_u32⚠neonLoad multiple 2-element structures to two registers
- vld2_lane_u64⚠neonLoad multiple 2-element structures to two registers
- vld2_p8⚠neonLoad multiple 2-element structures to two registers
- vld2_p16⚠neonLoad multiple 2-element structures to two registers
- vld2_p64⚠neon,aesLoad multiple 2-element structures to two registers
- vld2_s8⚠neonLoad multiple 2-element structures to two registers
- vld2_s16⚠neonLoad multiple 2-element structures to two registers
- vld2_s32⚠neonLoad multiple 2-element structures to two registers
- vld2_s64⚠neonLoad multiple 2-element structures to two registers
- vld2_u8⚠neonLoad multiple 2-element structures to two registers
- vld2_u16⚠neonLoad multiple 2-element structures to two registers
- vld2_u32⚠neonLoad multiple 2-element structures to two registers
- vld2_u64⚠neonLoad multiple 2-element structures to two registers
- vld2q_dup_f32⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_f64⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_p8⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_p16⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_p64⚠neon,aesLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_s8⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_s16⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_s32⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_s64⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_u8⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_u16⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_u32⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_dup_u64⚠neonLoad single 2-element structure and replicate to all lanes of two registers
- vld2q_f32⚠neonLoad multiple 2-element structures to two registers
- vld2q_f64⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_f32⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_f64⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_p8⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_p16⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_p64⚠neon,aesLoad multiple 2-element structures to two registers
- vld2q_lane_s8⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_s16⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_s32⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_s64⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_u8⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_u16⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_u32⚠neonLoad multiple 2-element structures to two registers
- vld2q_lane_u64⚠neonLoad multiple 2-element structures to two registers
- vld2q_p8⚠neonLoad multiple 2-element structures to two registers
- vld2q_p16⚠neonLoad multiple 2-element structures to two registers
- vld2q_p64⚠neon,aesLoad multiple 2-element structures to two registers
- vld2q_s8⚠neonLoad multiple 2-element structures to two registers
- vld2q_s16⚠neonLoad multiple 2-element structures to two registers
- vld2q_s32⚠neonLoad multiple 2-element structures to two registers
- vld2q_s64⚠neonLoad multiple 2-element structures to two registers
- vld2q_u8⚠neonLoad multiple 2-element structures to two registers
- vld2q_u16⚠neonLoad multiple 2-element structures to two registers
- vld2q_u32⚠neonLoad multiple 2-element structures to two registers
- vld2q_u64⚠neonLoad multiple 2-element structures to two registers
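Unlike vld1, the vld2 loads de-interleave: element 0 of each 2-element structure goes to the first register, element 1 to the second. A hedged sketch splitting interleaved stereo samples into channels; `split_stereo` and its scalar fallback are illustrative names, not part of this module:

```rust
#[cfg(target_arch = "aarch64")]
fn split_stereo(frame: &[f32; 4]) -> ([f32; 2], [f32; 2]) {
    unsafe {
        use std::arch::aarch64::*;
        // vld2_f32 de-interleaves [l0, r0, l1, r1] into a float32x2x2_t:
        // field .0 = [l0, l1] (even elements), field .1 = [r0, r1] (odd).
        let pair = vld2_f32(frame.as_ptr());
        let (mut l, mut r) = ([0.0f32; 2], [0.0f32; 2]);
        vst1_f32(l.as_mut_ptr(), pair.0);
        vst1_f32(r.as_mut_ptr(), pair.1);
        (l, r)
    }
}

// Scalar fallback so the sketch compiles off AArch64.
#[cfg(not(target_arch = "aarch64"))]
fn split_stereo(frame: &[f32; 4]) -> ([f32; 2], [f32; 2]) {
    ([frame[0], frame[2]], [frame[1], frame[3]])
}

fn main() {
    let (l, r) = split_stereo(&[1.0, 2.0, 3.0, 4.0]);
    assert_eq!(l, [1.0, 3.0]);
    assert_eq!(r, [2.0, 4.0]);
    println!("ok");
}
```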
- vld3_dup_f32⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_f64⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_p8⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_p16⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_p64⚠neon,aesLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_s8⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_s16⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_s32⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_s64⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_u8⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_u16⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_u32⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_dup_u64⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3_f32⚠neonLoad multiple 3-element structures to three registers
- vld3_f64⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_f32⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_f64⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_p8⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_p16⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_p64⚠neon,aesLoad multiple 3-element structures to three registers
- vld3_lane_s8⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_s16⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_s32⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_s64⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_u8⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_u16⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_u32⚠neonLoad multiple 3-element structures to three registers
- vld3_lane_u64⚠neonLoad multiple 3-element structures to three registers
- vld3_p8⚠neonLoad multiple 3-element structures to three registers
- vld3_p16⚠neonLoad multiple 3-element structures to three registers
- vld3_p64⚠neon,aesLoad multiple 3-element structures to three registers
- vld3_s8⚠neonLoad multiple 3-element structures to three registers
- vld3_s16⚠neonLoad multiple 3-element structures to three registers
- vld3_s32⚠neonLoad multiple 3-element structures to three registers
- vld3_s64⚠neonLoad multiple 3-element structures to three registers
- vld3_u8⚠neonLoad multiple 3-element structures to three registers
- vld3_u16⚠neonLoad multiple 3-element structures to three registers
- vld3_u32⚠neonLoad multiple 3-element structures to three registers
- vld3_u64⚠neonLoad multiple 3-element structures to three registers
- vld3q_dup_f32⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_f64⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_p8⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_p16⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_p64⚠neon,aesLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_s8⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_s16⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_s32⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_s64⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_u8⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_u16⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_u32⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_dup_u64⚠neonLoad single 3-element structure and replicate to all lanes of three registers
- vld3q_f32⚠neonLoad multiple 3-element structures to three registers
- vld3q_f64⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_f32⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_f64⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_p8⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_p16⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_p64⚠neon,aesLoad multiple 3-element structures to three registers
- vld3q_lane_s8⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_s16⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_s32⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_s64⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_u8⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_u16⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_u32⚠neonLoad multiple 3-element structures to three registers
- vld3q_lane_u64⚠neonLoad multiple 3-element structures to three registers
- vld3q_p8⚠neonLoad multiple 3-element structures to three registers
- vld3q_p16⚠neonLoad multiple 3-element structures to three registers
- vld3q_p64⚠neon,aesLoad multiple 3-element structures to three registers
- vld3q_s8⚠neonLoad multiple 3-element structures to three registers
- vld3q_s16⚠neonLoad multiple 3-element structures to three registers
- vld3q_s32⚠neonLoad multiple 3-element structures to three registers
- vld3q_s64⚠neonLoad multiple 3-element structures to three registers
- vld3q_u8⚠neonLoad multiple 3-element structures to three registers
- vld3q_u16⚠neonLoad multiple 3-element structures to three registers
- vld3q_u32⚠neonLoad multiple 3-element structures to three registers
- vld3q_u64⚠neonLoad multiple 3-element structures to three registers
- vld4_dup_f32⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_f64⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_p8⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_p16⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_p64⚠neon,aesLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_s8⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_s16⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_s32⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_s64⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_u8⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_u16⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_u32⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_dup_u64⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4_f32⚠neonLoad multiple 4-element structures to four registers
- vld4_f64⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_f32⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_f64⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_p8⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_p16⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_p64⚠neon,aesLoad multiple 4-element structures to four registers
- vld4_lane_s8⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_s16⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_s32⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_s64⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_u8⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_u16⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_u32⚠neonLoad multiple 4-element structures to four registers
- vld4_lane_u64⚠neonLoad multiple 4-element structures to four registers
- vld4_p8⚠neonLoad multiple 4-element structures to four registers
- vld4_p16⚠neonLoad multiple 4-element structures to four registers
- vld4_p64⚠neon,aesLoad multiple 4-element structures to four registers
- vld4_s8⚠neonLoad multiple 4-element structures to four registers
- vld4_s16⚠neonLoad multiple 4-element structures to four registers
- vld4_s32⚠neonLoad multiple 4-element structures to four registers
- vld4_s64⚠neonLoad multiple 4-element structures to four registers
- vld4_u8⚠neonLoad multiple 4-element structures to four registers
- vld4_u16⚠neonLoad multiple 4-element structures to four registers
- vld4_u32⚠neonLoad multiple 4-element structures to four registers
- vld4_u64⚠neonLoad multiple 4-element structures to four registers
- vld4q_dup_f32⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_f64⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_p8⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_p16⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_p64⚠neon,aesLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_s8⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_s16⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_s32⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_s64⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_u8⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_u16⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_u32⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_dup_u64⚠neonLoad single 4-element structure and replicate to all lanes of four registers
- vld4q_f32⚠neonLoad multiple 4-element structures to four registers
- vld4q_f64⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_f32⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_f64⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_p8⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_p16⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_p64⚠neon,aesLoad multiple 4-element structures to four registers
- vld4q_lane_s8⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_s16⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_s32⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_s64⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_u8⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_u16⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_u32⚠neonLoad multiple 4-element structures to four registers
- vld4q_lane_u64⚠neonLoad multiple 4-element structures to four registers
- vld4q_p8⚠neonLoad multiple 4-element structures to four registers
- vld4q_p16⚠neonLoad multiple 4-element structures to four registers
- vld4q_p64⚠neon,aesLoad multiple 4-element structures to four registers
- vld4q_s8⚠neonLoad multiple 4-element structures to four registers
- vld4q_s16⚠neonLoad multiple 4-element structures to four registers
- vld4q_s32⚠neonLoad multiple 4-element structures to four registers
- vld4q_s64⚠neonLoad multiple 4-element structures to four registers
- vld4q_u8⚠neonLoad multiple 4-element structures to four registers
- vld4q_u16⚠neonLoad multiple 4-element structures to four registers
- vld4q_u32⚠neonLoad multiple 4-element structures to four registers
- vld4q_u64⚠neonLoad multiple 4-element structures to four registers
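The vld3 and vld4 families extend the same de-interleaving pattern to 3- and 4-element structures. A sketch extracting the alpha plane from interleaved RGBA pixels with `vld4q_u8`; the function name and fallback are hypothetical:

```rust
#[cfg(target_arch = "aarch64")]
fn alpha_plane(pixels: &[u8; 64]) -> [u8; 16] {
    unsafe {
        use std::arch::aarch64::*;
        // vld4q_u8 de-interleaves 16 RGBA pixels into a uint8x16x4_t:
        // fields .0..=.3 hold the R, G, B, and A planes respectively.
        let planes = vld4q_u8(pixels.as_ptr());
        let mut a = [0u8; 16];
        vst1q_u8(a.as_mut_ptr(), planes.3);
        a
    }
}

// Scalar fallback so the sketch compiles off AArch64.
#[cfg(not(target_arch = "aarch64"))]
fn alpha_plane(pixels: &[u8; 64]) -> [u8; 16] {
    let mut a = [0u8; 16];
    for i in 0..16 {
        a[i] = pixels[4 * i + 3];
    }
    a
}

fn main() {
    // Build 16 pixels whose alpha byte equals the pixel index.
    let mut pixels = [0u8; 64];
    for i in 0..16 {
        pixels[4 * i + 3] = i as u8;
    }
    let a = alpha_plane(&pixels);
    for i in 0..16 {
        assert_eq!(a[i], i as u8);
    }
    println!("ok");
}
```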
- vldrq_p128⚠neonLoad SIMD&FP register (immediate offset)
- vmax_f32⚠neonMaximum (vector)
- vmax_f64⚠neonMaximum (vector)
- vmax_s8⚠neonMaximum (vector)
- vmax_s16⚠neonMaximum (vector)
- vmax_s32⚠neonMaximum (vector)
- vmax_u8⚠neonMaximum (vector)
- vmax_u16⚠neonMaximum (vector)
- vmax_u32⚠neonMaximum (vector)
- vmaxnm_f32⚠neonFloating-point Maximum Number (vector)
- vmaxnm_f64⚠neonFloating-point Maximum Number (vector)
- vmaxnmq_f32⚠neonFloating-point Maximum Number (vector)
- vmaxnmq_f64⚠neonFloating-point Maximum Number (vector)
- vmaxnmv_f32⚠neonFloating-point maximum number across vector
- vmaxnmvq_f32⚠neonFloating-point maximum number across vector
- vmaxnmvq_f64⚠neonFloating-point maximum number across vector
- vmaxq_f32⚠neonMaximum (vector)
- vmaxq_f64⚠neonMaximum (vector)
- vmaxq_s8⚠neonMaximum (vector)
- vmaxq_s16⚠neonMaximum (vector)
- vmaxq_s32⚠neonMaximum (vector)
- vmaxq_u8⚠neonMaximum (vector)
- vmaxq_u16⚠neonMaximum (vector)
- vmaxq_u32⚠neonMaximum (vector)
- vmaxv_f32⚠neonHorizontal vector max.
- vmaxv_s8⚠neonHorizontal vector max.
- vmaxv_s16⚠neonHorizontal vector max.
- vmaxv_s32⚠neonHorizontal vector max.
- vmaxv_u8⚠neonHorizontal vector max.
- vmaxv_u16⚠neonHorizontal vector max.
- vmaxv_u32⚠neonHorizontal vector max.
- vmaxvq_f32⚠neonHorizontal vector max.
- vmaxvq_f64⚠neonHorizontal vector max.
- vmaxvq_s8⚠neonHorizontal vector max.
- vmaxvq_s16⚠neonHorizontal vector max.
- vmaxvq_s32⚠neonHorizontal vector max.
- vmaxvq_u8⚠neonHorizontal vector max.
- vmaxvq_u16⚠neonHorizontal vector max.
- vmaxvq_u32⚠neonHorizontal vector max.
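The vmaxv family reduces a whole vector to its largest lane. A brief sketch combining a vld1 load with `vmaxvq_f32`; `max4` is an illustrative wrapper, and the scalar fallback matches the intrinsic only for NaN-free input (vmax/vmaxv follow fmax semantics, where a NaN lane yields NaN, while the vmaxnm variants follow IEEE maxNum and prefer a number over a quiet NaN):

```rust
#[cfg(target_arch = "aarch64")]
fn max4(data: &[f32; 4]) -> f32 {
    unsafe {
        use std::arch::aarch64::*;
        // vmaxvq_f32 reduces a float32x4_t to its largest lane.
        vmaxvq_f32(vld1q_f32(data.as_ptr()))
    }
}

// Scalar fallback so the sketch compiles off AArch64 (NaN-free inputs only).
#[cfg(not(target_arch = "aarch64"))]
fn max4(data: &[f32; 4]) -> f32 {
    data.iter().copied().fold(f32::NEG_INFINITY, f32::max)
}

fn main() {
    assert_eq!(max4(&[1.0, 7.0, -3.0, 4.0]), 7.0);
    println!("ok");
}
```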
- vmin_f32⚠neonMinimum (vector)
- vmin_f64⚠neonMinimum (vector)
- vmin_s8⚠neonMinimum (vector)
- vmin_s16⚠neonMinimum (vector)
- vmin_s32⚠neonMinimum (vector)
- vmin_u8⚠neonMinimum (vector)
- vmin_u16⚠neonMinimum (vector)
- vmin_u32⚠neonMinimum (vector)
- vminnm_f32⚠neonFloating-point Minimum Number (vector)
- vminnm_f64⚠neonFloating-point Minimum Number (vector)
- vminnmq_f32⚠neonFloating-point Minimum Number (vector)
- vminnmq_f64⚠neonFloating-point Minimum Number (vector)
- vminnmv_f32⚠neonFloating-point minimum number across vector
- vminnmvq_f32⚠neonFloating-point minimum number across vector
- vminnmvq_f64⚠neonFloating-point minimum number across vector
- vminq_f32⚠neonMinimum (vector)
- vminq_f64⚠neonMinimum (vector)
- vminq_s8⚠neonMinimum (vector)
- vminq_s16⚠neonMinimum (vector)
- vminq_s32⚠neonMinimum (vector)
- vminq_u8⚠neonMinimum (vector)
- vminq_u16⚠neonMinimum (vector)
- vminq_u32⚠neonMinimum (vector)
- vminv_f32⚠neonHorizontal vector min.
- vminv_s8⚠neonHorizontal vector min.
- vminv_s16⚠neonHorizontal vector min.
- vminv_s32⚠neonHorizontal vector min.
- vminv_u8⚠neonHorizontal vector min.
- vminv_u16⚠neonHorizontal vector min.
- vminv_u32⚠neonHorizontal vector min.
- vminvq_f32⚠neonHorizontal vector min.
- vminvq_f64⚠neonHorizontal vector min.
- vminvq_s8⚠neonHorizontal vector min.
- vminvq_s16⚠neonHorizontal vector min.
- vminvq_s32⚠neonHorizontal vector min.
- vminvq_u8⚠neonHorizontal vector min.
- vminvq_u16⚠neonHorizontal vector min.
- vminvq_u32⚠neonHorizontal vector min.
- vmla_f32⚠neonFloating-point multiply-add to accumulator
- vmla_f64⚠neonFloating-point multiply-add to accumulator
- vmla_lane_f32⚠neonVector multiply accumulate with scalar
- vmla_lane_s16⚠neonVector multiply accumulate with scalar
- vmla_lane_s32⚠neonVector multiply accumulate with scalar
- vmla_lane_u16⚠neonVector multiply accumulate with scalar
- vmla_lane_u32⚠neonVector multiply accumulate with scalar
- vmla_laneq_f32⚠neonVector multiply accumulate with scalar
- vmla_laneq_s16⚠neonVector multiply accumulate with scalar
- vmla_laneq_s32⚠neonVector multiply accumulate with scalar
- vmla_laneq_u16⚠neonVector multiply accumulate with scalar
- vmla_laneq_u32⚠neonVector multiply accumulate with scalar
- vmla_n_f32⚠neonVector multiply accumulate with scalar
- vmla_n_s16⚠neonVector multiply accumulate with scalar
- vmla_n_s32⚠neonVector multiply accumulate with scalar
- vmla_n_u16⚠neonVector multiply accumulate with scalar
- vmla_n_u32⚠neonVector multiply accumulate with scalar
- vmla_s8⚠neonMultiply-add to accumulator
- vmla_s16⚠neonMultiply-add to accumulator
- vmla_s32⚠neonMultiply-add to accumulator
- vmla_u8⚠neonMultiply-add to accumulator
- vmla_u16⚠neonMultiply-add to accumulator
- vmla_u32⚠neonMultiply-add to accumulator
- vmlal_high_lane_s16⚠neonMultiply-add long
- vmlal_high_lane_s32⚠neonMultiply-add long
- vmlal_high_lane_u16⚠neonMultiply-add long
- vmlal_high_lane_u32⚠neonMultiply-add long
- vmlal_high_laneq_s16⚠neonMultiply-add long
- vmlal_high_laneq_s32⚠neonMultiply-add long
- vmlal_high_laneq_u16⚠neonMultiply-add long
- vmlal_high_laneq_u32⚠neonMultiply-add long
- vmlal_high_n_s16⚠neonMultiply-add long
- vmlal_high_n_s32⚠neonMultiply-add long
- vmlal_high_n_u16⚠neonMultiply-add long
- vmlal_high_n_u32⚠neonMultiply-add long
- vmlal_high_s8⚠neonSigned multiply-add long
- vmlal_high_s16⚠neonSigned multiply-add long
- vmlal_high_s32⚠neonSigned multiply-add long
- vmlal_high_u8⚠neonUnsigned multiply-add long
- vmlal_high_u16⚠neonUnsigned multiply-add long
- vmlal_high_u32⚠neonUnsigned multiply-add long
- vmlal_lane_s16⚠neonVector widening multiply accumulate with scalar
- vmlal_lane_s32⚠neonVector widening multiply accumulate with scalar
- vmlal_lane_u16⚠neonVector widening multiply accumulate with scalar
- vmlal_lane_u32⚠neonVector widening multiply accumulate with scalar
- vmlal_laneq_s16⚠neonVector widening multiply accumulate with scalar
- vmlal_laneq_s32⚠neonVector widening multiply accumulate with scalar
- vmlal_laneq_u16⚠neonVector widening multiply accumulate with scalar
- vmlal_laneq_u32⚠neonVector widening multiply accumulate with scalar
- vmlal_n_s16⚠neonVector widening multiply accumulate with scalar
- vmlal_n_s32⚠neonVector widening multiply accumulate with scalar
- vmlal_n_u16⚠neonVector widening multiply accumulate with scalar
- vmlal_n_u32⚠neonVector widening multiply accumulate with scalar
- vmlal_s8⚠neonSigned multiply-add long
- vmlal_s16⚠neonSigned multiply-add long
- vmlal_s32⚠neonSigned multiply-add long
- vmlal_u8⚠neonUnsigned multiply-add long
- vmlal_u16⚠neonUnsigned multiply-add long
- vmlal_u32⚠neonUnsigned multiply-add long
- vmlaq_f32⚠neonFloating-point multiply-add to accumulator
- vmlaq_f64⚠neonFloating-point multiply-add to accumulator
- vmlaq_lane_f32⚠neonVector multiply accumulate with scalar
- vmlaq_lane_s16⚠neonVector multiply accumulate with scalar
- vmlaq_lane_s32⚠neonVector multiply accumulate with scalar
- vmlaq_lane_u16⚠neonVector multiply accumulate with scalar
- vmlaq_lane_u32⚠neonVector multiply accumulate with scalar
- vmlaq_laneq_f32⚠neonVector multiply accumulate with scalar
- vmlaq_laneq_s16⚠neonVector multiply accumulate with scalar
- vmlaq_laneq_s32⚠neonVector multiply accumulate with scalar
- vmlaq_laneq_u16⚠neonVector multiply accumulate with scalar
- vmlaq_laneq_u32⚠neonVector multiply accumulate with scalar
- vmlaq_n_f32⚠neonVector multiply accumulate with scalar
- vmlaq_n_s16⚠neonVector multiply accumulate with scalar
- vmlaq_n_s32⚠neonVector multiply accumulate with scalar
- vmlaq_n_u16⚠neonVector multiply accumulate with scalar
- vmlaq_n_u32⚠neonVector multiply accumulate with scalar
- vmlaq_s8⚠neonMultiply-add to accumulator
- vmlaq_s16⚠neonMultiply-add to accumulator
- vmlaq_s32⚠neonMultiply-add to accumulator
- vmlaq_u8⚠neonMultiply-add to accumulator
- vmlaq_u16⚠neonMultiply-add to accumulator
- vmlaq_u32⚠neonMultiply-add to accumulator
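The `vmla*` family computes `acc + b * c` per lane with modular integer arithmetic, and the `vmlal*` family widens the operands first so the product cannot overflow. A portable sketch of one lane of each (helper names are illustrative, not part of the crate):

```rust
/// One lane of the non-saturating vmla_* forms: product and sum wrap
/// on overflow, exactly like the NEON MLA instruction.
fn mla_lane(acc: i16, b: i16, c: i16) -> i16 {
    acc.wrapping_add(b.wrapping_mul(c))
}

/// One lane of the widening vmlal_* forms: the i16 operands are
/// multiplied in i32, so the product itself never overflows.
fn mlal_lane(acc: i32, b: i16, c: i16) -> i32 {
    acc.wrapping_add(i32::from(b) * i32::from(c))
}
```

The `_lane`/`_laneq`/`_n` suffixes in the index select the second multiplicand from a vector lane or a broadcast scalar; the arithmetic per lane is the same.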
- vmls_f32⚠neonFloating-point multiply-subtract from accumulator
- vmls_f64⚠neonFloating-point multiply-subtract from accumulator
- vmls_lane_f32⚠neonVector multiply subtract with scalar
- vmls_lane_s16⚠neonVector multiply subtract with scalar
- vmls_lane_s32⚠neonVector multiply subtract with scalar
- vmls_lane_u16⚠neonVector multiply subtract with scalar
- vmls_lane_u32⚠neonVector multiply subtract with scalar
- vmls_laneq_f32⚠neonVector multiply subtract with scalar
- vmls_laneq_s16⚠neonVector multiply subtract with scalar
- vmls_laneq_s32⚠neonVector multiply subtract with scalar
- vmls_laneq_u16⚠neonVector multiply subtract with scalar
- vmls_laneq_u32⚠neonVector multiply subtract with scalar
- vmls_n_f32⚠neonVector multiply subtract with scalar
- vmls_n_s16⚠neonVector multiply subtract with scalar
- vmls_n_s32⚠neonVector multiply subtract with scalar
- vmls_n_u16⚠neonVector multiply subtract with scalar
- vmls_n_u32⚠neonVector multiply subtract with scalar
- vmls_s8⚠neonMultiply-subtract from accumulator
- vmls_s16⚠neonMultiply-subtract from accumulator
- vmls_s32⚠neonMultiply-subtract from accumulator
- vmls_u8⚠neonMultiply-subtract from accumulator
- vmls_u16⚠neonMultiply-subtract from accumulator
- vmls_u32⚠neonMultiply-subtract from accumulator
- vmlsl_high_lane_s16⚠neonMultiply-subtract long
- vmlsl_high_lane_s32⚠neonMultiply-subtract long
- vmlsl_high_lane_u16⚠neonMultiply-subtract long
- vmlsl_high_lane_u32⚠neonMultiply-subtract long
- vmlsl_high_laneq_s16⚠neonMultiply-subtract long
- vmlsl_high_laneq_s32⚠neonMultiply-subtract long
- vmlsl_high_laneq_u16⚠neonMultiply-subtract long
- vmlsl_high_laneq_u32⚠neonMultiply-subtract long
- vmlsl_high_n_s16⚠neonMultiply-subtract long
- vmlsl_high_n_s32⚠neonMultiply-subtract long
- vmlsl_high_n_u16⚠neonMultiply-subtract long
- vmlsl_high_n_u32⚠neonMultiply-subtract long
- vmlsl_high_s8⚠neonSigned multiply-subtract long
- vmlsl_high_s16⚠neonSigned multiply-subtract long
- vmlsl_high_s32⚠neonSigned multiply-subtract long
- vmlsl_high_u8⚠neonUnsigned multiply-subtract long
- vmlsl_high_u16⚠neonUnsigned multiply-subtract long
- vmlsl_high_u32⚠neonUnsigned multiply-subtract long
- vmlsl_lane_s16⚠neonVector widening multiply subtract with scalar
- vmlsl_lane_s32⚠neonVector widening multiply subtract with scalar
- vmlsl_lane_u16⚠neonVector widening multiply subtract with scalar
- vmlsl_lane_u32⚠neonVector widening multiply subtract with scalar
- vmlsl_laneq_s16⚠neonVector widening multiply subtract with scalar
- vmlsl_laneq_s32⚠neonVector widening multiply subtract with scalar
- vmlsl_laneq_u16⚠neonVector widening multiply subtract with scalar
- vmlsl_laneq_u32⚠neonVector widening multiply subtract with scalar
- vmlsl_n_s16⚠neonVector widening multiply subtract with scalar
- vmlsl_n_s32⚠neonVector widening multiply subtract with scalar
- vmlsl_n_u16⚠neonVector widening multiply subtract with scalar
- vmlsl_n_u32⚠neonVector widening multiply subtract with scalar
- vmlsl_s8⚠neonSigned multiply-subtract long
- vmlsl_s16⚠neonSigned multiply-subtract long
- vmlsl_s32⚠neonSigned multiply-subtract long
- vmlsl_u8⚠neonUnsigned multiply-subtract long
- vmlsl_u16⚠neonUnsigned multiply-subtract long
- vmlsl_u32⚠neonUnsigned multiply-subtract long
- vmlsq_f32⚠neonFloating-point multiply-subtract from accumulator
- vmlsq_f64⚠neonFloating-point multiply-subtract from accumulator
- vmlsq_lane_f32⚠neonVector multiply subtract with scalar
- vmlsq_lane_s16⚠neonVector multiply subtract with scalar
- vmlsq_lane_s32⚠neonVector multiply subtract with scalar
- vmlsq_lane_u16⚠neonVector multiply subtract with scalar
- vmlsq_lane_u32⚠neonVector multiply subtract with scalar
- vmlsq_laneq_f32⚠neonVector multiply subtract with scalar
- vmlsq_laneq_s16⚠neonVector multiply subtract with scalar
- vmlsq_laneq_s32⚠neonVector multiply subtract with scalar
- vmlsq_laneq_u16⚠neonVector multiply subtract with scalar
- vmlsq_laneq_u32⚠neonVector multiply subtract with scalar
- vmlsq_n_f32⚠neonVector multiply subtract with scalar
- vmlsq_n_s16⚠neonVector multiply subtract with scalar
- vmlsq_n_s32⚠neonVector multiply subtract with scalar
- vmlsq_n_u16⚠neonVector multiply subtract with scalar
- vmlsq_n_u32⚠neonVector multiply subtract with scalar
- vmlsq_s8⚠neonMultiply-subtract from accumulator
- vmlsq_s16⚠neonMultiply-subtract from accumulator
- vmlsq_s32⚠neonMultiply-subtract from accumulator
- vmlsq_u8⚠neonMultiply-subtract from accumulator
- vmlsq_u16⚠neonMultiply-subtract from accumulator
- vmlsq_u32⚠neonMultiply-subtract from accumulator
- vmov_n_f32⚠neonDuplicate vector element to vector or scalar
- vmov_n_f64⚠neonDuplicate vector element to vector or scalar
- vmov_n_p8⚠neonDuplicate vector element to vector or scalar
- vmov_n_p16⚠neonDuplicate vector element to vector or scalar
- vmov_n_p64⚠neonDuplicate vector element to vector or scalar
- vmov_n_s8⚠neonDuplicate vector element to vector or scalar
- vmov_n_s16⚠neonDuplicate vector element to vector or scalar
- vmov_n_s32⚠neonDuplicate vector element to vector or scalar
- vmov_n_s64⚠neonDuplicate vector element to vector or scalar
- vmov_n_u8⚠neonDuplicate vector element to vector or scalar
- vmov_n_u16⚠neonDuplicate vector element to vector or scalar
- vmov_n_u32⚠neonDuplicate vector element to vector or scalar
- vmov_n_u64⚠neonDuplicate vector element to vector or scalar
- vmovl_high_s8⚠neonVector move
- vmovl_high_s16⚠neonVector move
- vmovl_high_s32⚠neonVector move
- vmovl_high_u8⚠neonVector move
- vmovl_high_u16⚠neonVector move
- vmovl_high_u32⚠neonVector move
- vmovl_s8⚠neonVector long move.
- vmovl_s16⚠neonVector long move.
- vmovl_s32⚠neonVector long move.
- vmovl_u8⚠neonVector long move.
- vmovl_u16⚠neonVector long move.
- vmovl_u32⚠neonVector long move.
- vmovn_high_s16⚠neonExtract narrow
- vmovn_high_s32⚠neonExtract narrow
- vmovn_high_s64⚠neonExtract narrow
- vmovn_high_u16⚠neonExtract narrow
- vmovn_high_u32⚠neonExtract narrow
- vmovn_high_u64⚠neonExtract narrow
- vmovn_s16⚠neonVector narrow integer.
- vmovn_s32⚠neonVector narrow integer.
- vmovn_s64⚠neonVector narrow integer.
- vmovn_u16⚠neonVector narrow integer.
- vmovn_u32⚠neonVector narrow integer.
- vmovn_u64⚠neonVector narrow integer.
- vmovq_n_f32⚠neonDuplicate vector element to vector or scalar
- vmovq_n_f64⚠neonDuplicate vector element to vector or scalar
- vmovq_n_p8⚠neonDuplicate vector element to vector or scalar
- vmovq_n_p16⚠neonDuplicate vector element to vector or scalar
- vmovq_n_p64⚠neonDuplicate vector element to vector or scalar
- vmovq_n_s8⚠neonDuplicate vector element to vector or scalar
- vmovq_n_s16⚠neonDuplicate vector element to vector or scalar
- vmovq_n_s32⚠neonDuplicate vector element to vector or scalar
- vmovq_n_s64⚠neonDuplicate vector element to vector or scalar
- vmovq_n_u8⚠neonDuplicate vector element to vector or scalar
- vmovq_n_u16⚠neonDuplicate vector element to vector or scalar
- vmovq_n_u32⚠neonDuplicate vector element to vector or scalar
- vmovq_n_u64⚠neonDuplicate vector element to vector or scalar
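The `vmov_n`/`vmovq_n` entries broadcast one scalar to every lane, `vmovl*` widens each lane by sign or zero extension, and `vmovn*` narrows by plain truncation (no saturation, unlike the `vqmovn*` family later in this index). A portable sketch of the widen/narrow pair (helper names are illustrative):

```rust
/// vmovl_s8-style widening: each lane is sign-extended to i16.
fn movl(v: &[i8]) -> Vec<i16> {
    v.iter().map(|&x| i16::from(x)).collect()
}

/// vmovn_s16-style narrowing: each lane keeps only its low 8 bits.
fn movn(v: &[i16]) -> Vec<i8> {
    v.iter().map(|&x| x as i8).collect()
}
```

Truncation can silently change values: `movn(&[300])` produces `[44]`, since 300 mod 256 = 44.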
- vmul_f32⚠neonMultiply
- vmul_f64⚠neonMultiply
- vmul_lane_f32⚠neonFloating-point multiply
- vmul_lane_f64⚠neonFloating-point multiply
- vmul_lane_s16⚠neonMultiply
- vmul_lane_s32⚠neonMultiply
- vmul_lane_u16⚠neonMultiply
- vmul_lane_u32⚠neonMultiply
- vmul_laneq_f32⚠neonFloating-point multiply
- vmul_laneq_f64⚠neonFloating-point multiply
- vmul_laneq_s16⚠neonMultiply
- vmul_laneq_s32⚠neonMultiply
- vmul_laneq_u16⚠neonMultiply
- vmul_laneq_u32⚠neonMultiply
- vmul_n_f32⚠neonVector multiply by scalar
- vmul_n_f64⚠neonVector multiply by scalar
- vmul_n_s16⚠neonVector multiply by scalar
- vmul_n_s32⚠neonVector multiply by scalar
- vmul_n_u16⚠neonVector multiply by scalar
- vmul_n_u32⚠neonVector multiply by scalar
- vmul_p8⚠neonPolynomial multiply
- vmul_s8⚠neonMultiply
- vmul_s16⚠neonMultiply
- vmul_s32⚠neonMultiply
- vmul_u8⚠neonMultiply
- vmul_u16⚠neonMultiply
- vmul_u32⚠neonMultiply
- vmuld_lane_f64⚠neonFloating-point multiply
- vmuld_laneq_f64⚠neonFloating-point multiply
- vmull_high_lane_s16⚠neonMultiply long
- vmull_high_lane_s32⚠neonMultiply long
- vmull_high_lane_u16⚠neonMultiply long
- vmull_high_lane_u32⚠neonMultiply long
- vmull_high_laneq_s16⚠neonMultiply long
- vmull_high_laneq_s32⚠neonMultiply long
- vmull_high_laneq_u16⚠neonMultiply long
- vmull_high_laneq_u32⚠neonMultiply long
- vmull_high_n_s16⚠neonMultiply long
- vmull_high_n_s32⚠neonMultiply long
- vmull_high_n_u16⚠neonMultiply long
- vmull_high_n_u32⚠neonMultiply long
- vmull_high_p8⚠neonPolynomial multiply long
- vmull_high_p64⚠neon,aesPolynomial multiply long
- vmull_high_s8⚠neonSigned multiply long
- vmull_high_s16⚠neonSigned multiply long
- vmull_high_s32⚠neonSigned multiply long
- vmull_high_u8⚠neonUnsigned multiply long
- vmull_high_u16⚠neonUnsigned multiply long
- vmull_high_u32⚠neonUnsigned multiply long
- vmull_lane_s16⚠neonVector long multiply by scalar
- vmull_lane_s32⚠neonVector long multiply by scalar
- vmull_lane_u16⚠neonVector long multiply by scalar
- vmull_lane_u32⚠neonVector long multiply by scalar
- vmull_laneq_s16⚠neonVector long multiply by scalar
- vmull_laneq_s32⚠neonVector long multiply by scalar
- vmull_laneq_u16⚠neonVector long multiply by scalar
- vmull_laneq_u32⚠neonVector long multiply by scalar
- vmull_n_s16⚠neonVector long multiply with scalar
- vmull_n_s32⚠neonVector long multiply with scalar
- vmull_n_u16⚠neonVector long multiply with scalar
- vmull_n_u32⚠neonVector long multiply with scalar
- vmull_p8⚠neonPolynomial multiply long
- vmull_p64⚠neon,aesPolynomial multiply long
- vmull_s8⚠neonSigned multiply long
- vmull_s16⚠neonSigned multiply long
- vmull_s32⚠neonSigned multiply long
- vmull_u8⚠neonUnsigned multiply long
- vmull_u16⚠neonUnsigned multiply long
- vmull_u32⚠neonUnsigned multiply long
- vmulq_f32⚠neonMultiply
- vmulq_f64⚠neonMultiply
- vmulq_lane_f32⚠neonFloating-point multiply
- vmulq_lane_f64⚠neonFloating-point multiply
- vmulq_lane_s16⚠neonMultiply
- vmulq_lane_s32⚠neonMultiply
- vmulq_lane_u16⚠neonMultiply
- vmulq_lane_u32⚠neonMultiply
- vmulq_laneq_f32⚠neonFloating-point multiply
- vmulq_laneq_f64⚠neonFloating-point multiply
- vmulq_laneq_s16⚠neonMultiply
- vmulq_laneq_s32⚠neonMultiply
- vmulq_laneq_u16⚠neonMultiply
- vmulq_laneq_u32⚠neonMultiply
- vmulq_n_f32⚠neonVector multiply by scalar
- vmulq_n_f64⚠neonVector multiply by scalar
- vmulq_n_s16⚠neonVector multiply by scalar
- vmulq_n_s32⚠neonVector multiply by scalar
- vmulq_n_u16⚠neonVector multiply by scalar
- vmulq_n_u32⚠neonVector multiply by scalar
- vmulq_p8⚠neonPolynomial multiply
- vmulq_s8⚠neonMultiply
- vmulq_s16⚠neonMultiply
- vmulq_s32⚠neonMultiply
- vmulq_u8⚠neonMultiply
- vmulq_u16⚠neonMultiply
- vmulq_u32⚠neonMultiply
- vmuls_lane_f32⚠neonFloating-point multiply
- vmuls_laneq_f32⚠neonFloating-point multiply
- vmulx_f32⚠neonFloating-point multiply extended
- vmulx_f64⚠neonFloating-point multiply extended
- vmulx_lane_f32⚠neonFloating-point multiply extended
- vmulx_lane_f64⚠neonFloating-point multiply extended
- vmulx_laneq_f32⚠neonFloating-point multiply extended
- vmulx_laneq_f64⚠neonFloating-point multiply extended
- vmulxd_f64⚠neonFloating-point multiply extended
- vmulxd_lane_f64⚠neonFloating-point multiply extended
- vmulxd_laneq_f64⚠neonFloating-point multiply extended
- vmulxq_f32⚠neonFloating-point multiply extended
- vmulxq_f64⚠neonFloating-point multiply extended
- vmulxq_lane_f32⚠neonFloating-point multiply extended
- vmulxq_lane_f64⚠neonFloating-point multiply extended
- vmulxq_laneq_f32⚠neonFloating-point multiply extended
- vmulxq_laneq_f64⚠neonFloating-point multiply extended
- vmulxs_f32⚠neonFloating-point multiply extended
- vmulxs_lane_f32⚠neonFloating-point multiply extended
- vmulxs_laneq_f32⚠neonFloating-point multiply extended
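The `vmulx*` entries map to the FMULX instruction, which behaves like an ordinary floating-point multiply except that (±0) × (±∞) is defined as ±2.0 rather than NaN (the sign is the XOR of the operand signs). A portable sketch of one lane (the `mulx` helper name is illustrative):

```rust
/// FMULX-style multiply, as used by vmulx_*: identical to a * b except
/// for the zero-times-infinity case, which yields +/-2.0.
fn mulx(a: f32, b: f32) -> f32 {
    if (a == 0.0 && b.is_infinite()) || (a.is_infinite() && b == 0.0) {
        if a.is_sign_negative() ^ b.is_sign_negative() { -2.0 } else { 2.0 }
    } else {
        a * b
    }
}
```

This special case exists so that reciprocal-estimate sequences can recover gracefully when an estimate is exactly zero or infinite.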
- vmvn_p8⚠neonVector bitwise not.
- vmvn_s8⚠neonVector bitwise not.
- vmvn_s16⚠neonVector bitwise not.
- vmvn_s32⚠neonVector bitwise not.
- vmvn_u8⚠neonVector bitwise not.
- vmvn_u16⚠neonVector bitwise not.
- vmvn_u32⚠neonVector bitwise not.
- vmvnq_p8⚠neonVector bitwise not.
- vmvnq_s8⚠neonVector bitwise not.
- vmvnq_s16⚠neonVector bitwise not.
- vmvnq_s32⚠neonVector bitwise not.
- vmvnq_u8⚠neonVector bitwise not.
- vmvnq_u16⚠neonVector bitwise not.
- vmvnq_u32⚠neonVector bitwise not.
- vneg_f32⚠neonNegate
- vneg_f64⚠neonNegate
- vneg_s8⚠neonNegate
- vneg_s16⚠neonNegate
- vneg_s32⚠neonNegate
- vneg_s64⚠neonNegate
- vnegd_s64⚠neonNegate
- vnegq_f32⚠neonNegate
- vnegq_f64⚠neonNegate
- vnegq_s8⚠neonNegate
- vnegq_s16⚠neonNegate
- vnegq_s32⚠neonNegate
- vnegq_s64⚠neonNegate
- vorn_s8⚠neonVector bitwise inclusive OR NOT
- vorn_s16⚠neonVector bitwise inclusive OR NOT
- vorn_s32⚠neonVector bitwise inclusive OR NOT
- vorn_s64⚠neonVector bitwise inclusive OR NOT
- vorn_u8⚠neonVector bitwise inclusive OR NOT
- vorn_u16⚠neonVector bitwise inclusive OR NOT
- vorn_u32⚠neonVector bitwise inclusive OR NOT
- vorn_u64⚠neonVector bitwise inclusive OR NOT
- vornq_s8⚠neonVector bitwise inclusive OR NOT
- vornq_s16⚠neonVector bitwise inclusive OR NOT
- vornq_s32⚠neonVector bitwise inclusive OR NOT
- vornq_s64⚠neonVector bitwise inclusive OR NOT
- vornq_u8⚠neonVector bitwise inclusive OR NOT
- vornq_u16⚠neonVector bitwise inclusive OR NOT
- vornq_u32⚠neonVector bitwise inclusive OR NOT
- vornq_u64⚠neonVector bitwise inclusive OR NOT
- vorr_s8⚠neonVector bitwise or (immediate, inclusive)
- vorr_s16⚠neonVector bitwise or (immediate, inclusive)
- vorr_s32⚠neonVector bitwise or (immediate, inclusive)
- vorr_s64⚠neonVector bitwise or (immediate, inclusive)
- vorr_u8⚠neonVector bitwise or (immediate, inclusive)
- vorr_u16⚠neonVector bitwise or (immediate, inclusive)
- vorr_u32⚠neonVector bitwise or (immediate, inclusive)
- vorr_u64⚠neonVector bitwise or (immediate, inclusive)
- vorrq_s8⚠neonVector bitwise or (immediate, inclusive)
- vorrq_s16⚠neonVector bitwise or (immediate, inclusive)
- vorrq_s32⚠neonVector bitwise or (immediate, inclusive)
- vorrq_s64⚠neonVector bitwise or (immediate, inclusive)
- vorrq_u8⚠neonVector bitwise or (immediate, inclusive)
- vorrq_u16⚠neonVector bitwise or (immediate, inclusive)
- vorrq_u32⚠neonVector bitwise or (immediate, inclusive)
- vorrq_u64⚠neonVector bitwise or (immediate, inclusive)
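The `vorn*` entries OR the first operand with the bitwise complement of the second, while `vorr*` is a plain inclusive OR. One lane of the ORN form, sketched portably (helper name is illustrative):

```rust
/// vorn_u8-style lane: a OR (NOT b).
fn orn(a: u8, b: u8) -> u8 {
    a | !b
}
```

For example, `orn(x, u8::MAX)` is just `x`, because the complement of all-ones contributes no bits.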
- vpadal_s8⚠neonSigned Add and Accumulate Long Pairwise.
- vpadal_s16⚠neonSigned Add and Accumulate Long Pairwise.
- vpadal_s32⚠neonSigned Add and Accumulate Long Pairwise.
- vpadal_u8⚠neonUnsigned Add and Accumulate Long Pairwise.
- vpadal_u16⚠neonUnsigned Add and Accumulate Long Pairwise.
- vpadal_u32⚠neonUnsigned Add and Accumulate Long Pairwise.
- vpadalq_s8⚠neonSigned Add and Accumulate Long Pairwise.
- vpadalq_s16⚠neonSigned Add and Accumulate Long Pairwise.
- vpadalq_s32⚠neonSigned Add and Accumulate Long Pairwise.
- vpadalq_u8⚠neonUnsigned Add and Accumulate Long Pairwise.
- vpadalq_u16⚠neonUnsigned Add and Accumulate Long Pairwise.
- vpadalq_u32⚠neonUnsigned Add and Accumulate Long Pairwise.
- vpadd_f32⚠neonFloating-point add pairwise
- vpadd_s8⚠neonAdd pairwise.
- vpadd_s16⚠neonAdd pairwise.
- vpadd_s32⚠neonAdd pairwise.
- vpadd_u8⚠neonAdd pairwise.
- vpadd_u16⚠neonAdd pairwise.
- vpadd_u32⚠neonAdd pairwise.
- vpaddd_f64⚠neonFloating-point add pairwise
- vpaddd_s64⚠neonAdd pairwise
- vpaddd_u64⚠neonAdd pairwise
- vpaddl_s8⚠neonSigned Add Long Pairwise.
- vpaddl_s16⚠neonSigned Add Long Pairwise.
- vpaddl_s32⚠neonSigned Add Long Pairwise.
- vpaddl_u8⚠neonUnsigned Add Long Pairwise.
- vpaddl_u16⚠neonUnsigned Add Long Pairwise.
- vpaddl_u32⚠neonUnsigned Add Long Pairwise.
- vpaddlq_s8⚠neonSigned Add Long Pairwise.
- vpaddlq_s16⚠neonSigned Add Long Pairwise.
- vpaddlq_s32⚠neonSigned Add Long Pairwise.
- vpaddlq_u8⚠neonUnsigned Add Long Pairwise.
- vpaddlq_u16⚠neonUnsigned Add Long Pairwise.
- vpaddlq_u32⚠neonUnsigned Add Long Pairwise.
- vpaddq_f32⚠neonFloating-point add pairwise
- vpaddq_f64⚠neonFloating-point add pairwise
- vpaddq_s8⚠neonAdd pairwise
- vpaddq_s16⚠neonAdd pairwise
- vpaddq_s32⚠neonAdd pairwise
- vpaddq_s64⚠neonAdd pairwise
- vpaddq_u8⚠neonAdd pairwise
- vpaddq_u16⚠neonAdd pairwise
- vpaddq_u32⚠neonAdd pairwise
- vpaddq_u64⚠neonAdd pairwise
- vpadds_f32⚠neonFloating-point add pairwise
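The pairwise-add families differ in where the pairs come from: `vpadd*` concatenates its two inputs and adds adjacent pairs at the same width, `vpaddl*` adds adjacent pairs within one vector into a double-width result, and `vpadal*` additionally accumulates that widened sum into a third operand. A portable sketch of the first two (helper names are illustrative):

```rust
/// vpadd-style pairwise add: concatenate a and b, then add adjacent
/// pairs with wrapping arithmetic, like the integer forms.
fn padd(a: &[i16], b: &[i16]) -> Vec<i16> {
    let cat: Vec<i16> = a.iter().chain(b.iter()).copied().collect();
    cat.chunks(2).map(|p| p[0].wrapping_add(p[1])).collect()
}

/// vpaddl-style widening pairwise add within a single vector; the
/// double-width result type means the pair sum cannot overflow.
fn paddl(v: &[i8]) -> Vec<i16> {
    v.chunks(2).map(|p| i16::from(p[0]) + i16::from(p[1])).collect()
}
```

`vpadal*` would then be `acc + paddl(v)` per lane, with wrapping addition into the accumulator.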
- vpmax_f32⚠neonFolding maximum of adjacent pairs
- vpmax_s8⚠neonFolding maximum of adjacent pairs
- vpmax_s16⚠neonFolding maximum of adjacent pairs
- vpmax_s32⚠neonFolding maximum of adjacent pairs
- vpmax_u8⚠neonFolding maximum of adjacent pairs
- vpmax_u16⚠neonFolding maximum of adjacent pairs
- vpmax_u32⚠neonFolding maximum of adjacent pairs
- vpmaxnm_f32⚠neonFloating-point Maximum Number Pairwise (vector).
- vpmaxnmq_f32⚠neonFloating-point Maximum Number Pairwise (vector).
- vpmaxnmq_f64⚠neonFloating-point Maximum Number Pairwise (vector).
- vpmaxnmqd_f64⚠neonFloating-point maximum number pairwise
- vpmaxnms_f32⚠neonFloating-point maximum number pairwise
- vpmaxq_f32⚠neonFolding maximum of adjacent pairs
- vpmaxq_f64⚠neonFolding maximum of adjacent pairs
- vpmaxq_s8⚠neonFolding maximum of adjacent pairs
- vpmaxq_s16⚠neonFolding maximum of adjacent pairs
- vpmaxq_s32⚠neonFolding maximum of adjacent pairs
- vpmaxq_u8⚠neonFolding maximum of adjacent pairs
- vpmaxq_u16⚠neonFolding maximum of adjacent pairs
- vpmaxq_u32⚠neonFolding maximum of adjacent pairs
- vpmaxqd_f64⚠neonFloating-point maximum pairwise
- vpmaxs_f32⚠neonFloating-point maximum pairwise
- vpmin_f32⚠neonFolding minimum of adjacent pairs
- vpmin_s8⚠neonFolding minimum of adjacent pairs
- vpmin_s16⚠neonFolding minimum of adjacent pairs
- vpmin_s32⚠neonFolding minimum of adjacent pairs
- vpmin_u8⚠neonFolding minimum of adjacent pairs
- vpmin_u16⚠neonFolding minimum of adjacent pairs
- vpmin_u32⚠neonFolding minimum of adjacent pairs
- vpminnm_f32⚠neonFloating-point Minimum Number Pairwise (vector).
- vpminnmq_f32⚠neonFloating-point Minimum Number Pairwise (vector).
- vpminnmq_f64⚠neonFloating-point Minimum Number Pairwise (vector).
- vpminnmqd_f64⚠neonFloating-point minimum number pairwise
- vpminnms_f32⚠neonFloating-point minimum number pairwise
- vpminq_f32⚠neonFolding minimum of adjacent pairs
- vpminq_f64⚠neonFolding minimum of adjacent pairs
- vpminq_s8⚠neonFolding minimum of adjacent pairs
- vpminq_s16⚠neonFolding minimum of adjacent pairs
- vpminq_s32⚠neonFolding minimum of adjacent pairs
- vpminq_u8⚠neonFolding minimum of adjacent pairs
- vpminq_u16⚠neonFolding minimum of adjacent pairs
- vpminq_u32⚠neonFolding minimum of adjacent pairs
- vpminqd_f64⚠neonFloating-point minimum pairwise
- vpmins_f32⚠neonFloating-point minimum pairwise
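"Folding maximum of adjacent pairs" means the two inputs are concatenated and the larger element of each adjacent pair is kept, halving the element count relative to the concatenation. A portable sketch (the `pmax` helper name is illustrative; `vpmin*` is the same shape with `min`):

```rust
/// vpmax-style folding maximum: concatenate a and b, then keep the
/// larger element of each adjacent pair.
fn pmax(a: &[i8], b: &[i8]) -> Vec<i8> {
    let cat: Vec<i8> = a.iter().chain(b.iter()).copied().collect();
    cat.chunks(2).map(|p| p[0].max(p[1])).collect()
}
```

Repeatedly folding a vector against itself is one classic way to build a full horizontal reduction, which is what the `vmaxv*`/`vminv*` entries above provide directly.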
- vqabs_s8⚠neonSigned saturating Absolute value
- vqabs_s16⚠neonSigned saturating Absolute value
- vqabs_s32⚠neonSigned saturating Absolute value
- vqabs_s64⚠neonSigned saturating Absolute value
- vqabsb_s8⚠neonSigned saturating absolute value
- vqabsd_s64⚠neonSigned saturating absolute value
- vqabsh_s16⚠neonSigned saturating absolute value
- vqabsq_s8⚠neonSigned saturating Absolute value
- vqabsq_s16⚠neonSigned saturating Absolute value
- vqabsq_s32⚠neonSigned saturating Absolute value
- vqabsq_s64⚠neonSigned saturating Absolute value
- vqabss_s32⚠neonSigned saturating absolute value
- vqadd_s8⚠neonSaturating add
- vqadd_s16⚠neonSaturating add
- vqadd_s32⚠neonSaturating add
- vqadd_s64⚠neonSaturating add
- vqadd_u8⚠neonSaturating add
- vqadd_u16⚠neonSaturating add
- vqadd_u32⚠neonSaturating add
- vqadd_u64⚠neonSaturating add
- vqaddb_s8⚠neonSaturating add
- vqaddb_u8⚠neonSaturating add
- vqaddd_s64⚠neonSaturating add
- vqaddd_u64⚠neonSaturating add
- vqaddh_s16⚠neonSaturating add
- vqaddh_u16⚠neonSaturating add
- vqaddq_s8⚠neonSaturating add
- vqaddq_s16⚠neonSaturating add
- vqaddq_s32⚠neonSaturating add
- vqaddq_s64⚠neonSaturating add
- vqaddq_u8⚠neonSaturating add
- vqaddq_u16⚠neonSaturating add
- vqaddq_u32⚠neonSaturating add
- vqaddq_u64⚠neonSaturating add
- vqadds_s32⚠neonSaturating add
- vqadds_u32⚠neonSaturating add
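The `vqadd*` entries clamp to the type's range on overflow instead of wrapping; Rust's `saturating_add` has exactly these semantics, so one lane can be sketched portably (the `qadd_b` helper name is illustrative):

```rust
/// vqadd_s8-style lane: on overflow the sum saturates to i8::MAX or
/// i8::MIN rather than wrapping.
fn qadd_b(a: i8, b: i8) -> i8 {
    a.saturating_add(b)
}
```

For example, `qadd_b(120, 20)` saturates to 127 rather than wrapping to -116.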
- vqdmlal_high_lane_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlal_high_lane_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlal_high_laneq_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlal_high_laneq_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlal_high_n_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlal_high_n_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlal_high_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlal_high_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlal_lane_s16⚠neonVector widening saturating doubling multiply accumulate with scalar
- vqdmlal_lane_s32⚠neonVector widening saturating doubling multiply accumulate with scalar
- vqdmlal_laneq_s16⚠neonVector widening saturating doubling multiply accumulate with scalar
- vqdmlal_laneq_s32⚠neonVector widening saturating doubling multiply accumulate with scalar
- vqdmlal_n_s16⚠neonVector widening saturating doubling multiply accumulate with scalar
- vqdmlal_n_s32⚠neonVector widening saturating doubling multiply accumulate with scalar
- vqdmlal_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlal_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlalh_lane_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlalh_laneq_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlalh_s16⚠neonSigned saturating doubling multiply-add long
- vqdmlals_lane_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlals_laneq_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlals_s32⚠neonSigned saturating doubling multiply-add long
- vqdmlsl_high_lane_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_high_lane_s32⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_high_laneq_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_high_laneq_s32⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_high_n_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_high_n_s32⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_high_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_high_s32⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_lane_s16⚠neonVector widening saturating doubling multiply subtract with scalar
- vqdmlsl_lane_s32⚠neonVector widening saturating doubling multiply subtract with scalar
- vqdmlsl_laneq_s16⚠neonVector widening saturating doubling multiply subtract with scalar
- vqdmlsl_laneq_s32⚠neonVector widening saturating doubling multiply subtract with scalar
- vqdmlsl_n_s16⚠neonVector widening saturating doubling multiply subtract with scalar
- vqdmlsl_n_s32⚠neonVector widening saturating doubling multiply subtract with scalar
- vqdmlsl_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsl_s32⚠neonSigned saturating doubling multiply-subtract long
- vqdmlslh_lane_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlslh_laneq_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlslh_s16⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsls_lane_s32⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsls_laneq_s32⚠neonSigned saturating doubling multiply-subtract long
- vqdmlsls_s32⚠neonSigned saturating doubling multiply-subtract long
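The `vqdmlal*` entries compose two saturating steps: a doubling long multiply (the `vqdmull*` family below) followed by a saturating add into the accumulator; `vqdmlsl*` subtracts instead. A portable sketch of one lane (helper names are illustrative):

```rust
/// vqdmull_s16-style lane: double the wide product and saturate to
/// i32. Only i16::MIN * i16::MIN can actually overflow after doubling.
fn qdmull_h(a: i16, b: i16) -> i32 {
    (2 * i64::from(a) * i64::from(b))
        .clamp(i64::from(i32::MIN), i64::from(i32::MAX)) as i32
}

/// vqdmlal_s16-style lane: the doubling long product is added to the
/// accumulator with saturation; vqdmlsl uses saturating_sub instead.
fn qdmlal_h(acc: i32, a: i16, b: i16) -> i32 {
    acc.saturating_add(qdmull_h(a, b))
}
```

The doubling exists because these are Q15 fixed-point primitives: doubling the product re-normalizes a Q15 × Q15 result back to Q31.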
- vqdmulh_lane_s16⚠neonVector saturating doubling multiply high by scalar
- vqdmulh_lane_s32⚠neonVector saturating doubling multiply high by scalar
- vqdmulh_laneq_s16⚠neonVector saturating doubling multiply high by scalar
- vqdmulh_laneq_s32⚠neonVector saturating doubling multiply high by scalar
- vqdmulh_n_s16⚠neonVector saturating doubling multiply high with scalar
- vqdmulh_n_s32⚠neonVector saturating doubling multiply high with scalar
- vqdmulh_s16⚠neonSigned saturating doubling multiply returning high half
- vqdmulh_s32⚠neonSigned saturating doubling multiply returning high half
- vqdmulhh_lane_s16⚠neonSigned saturating doubling multiply returning high half
- vqdmulhh_laneq_s16⚠neonSigned saturating doubling multiply returning high half
- vqdmulhh_s16⚠neonSigned saturating doubling multiply returning high half
- vqdmulhq_lane_s16⚠neonVector saturating doubling multiply high by scalar
- vqdmulhq_lane_s32⚠neonVector saturating doubling multiply high by scalar
- vqdmulhq_laneq_s16⚠neonVector saturating doubling multiply high by scalar
- vqdmulhq_laneq_s32⚠neonVector saturating doubling multiply high by scalar
- vqdmulhq_n_s16⚠neonVector saturating doubling multiply high with scalar
- vqdmulhq_n_s32⚠neonVector saturating doubling multiply high with scalar
- vqdmulhq_s16⚠neonSigned saturating doubling multiply returning high half
- vqdmulhq_s32⚠neonSigned saturating doubling multiply returning high half
- vqdmulhs_lane_s32⚠neonSigned saturating doubling multiply returning high half
- vqdmulhs_laneq_s32⚠neonSigned saturating doubling multiply returning high half
- vqdmulhs_s32⚠neonSigned saturating doubling multiply returning high half
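"Doubling multiply returning high half" means the product is computed in wide arithmetic, doubled, saturated, and only the high half kept; in Q15 fixed-point terms this is a fractional multiply. A portable sketch of one s16 lane (the `qdmulh_h` helper name is illustrative):

```rust
/// vqdmulh_s16-style lane: double the product in wide arithmetic,
/// saturate to 32 bits, and return the high 16 bits.
fn qdmulh_h(a: i16, b: i16) -> i16 {
    let d = (2 * i64::from(a) * i64::from(b))
        .clamp(i64::from(i32::MIN), i64::from(i32::MAX));
    (d >> 16) as i16
}
```

Interpreting the lanes as Q15 fractions, `qdmulh_h(16384, 16384)` computes 0.5 × 0.5 = 0.25, i.e. 8192; the only input pair that actually saturates is `(i16::MIN, i16::MIN)`.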
- vqdmull_high_lane_s16⚠neonSigned saturating doubling multiply long
- vqdmull_high_lane_s32⚠neonSigned saturating doubling multiply long
- vqdmull_high_laneq_s16⚠neonSigned saturating doubling multiply long
- vqdmull_high_laneq_s32⚠neonSigned saturating doubling multiply long
- vqdmull_high_n_s16⚠neonSigned saturating doubling multiply long
- vqdmull_high_n_s32⚠neonSigned saturating doubling multiply long
- vqdmull_high_s16⚠neonSigned saturating doubling multiply long
- vqdmull_high_s32⚠neonSigned saturating doubling multiply long
- vqdmull_lane_s16⚠neonVector saturating doubling long multiply by scalar
- vqdmull_lane_s32⚠neonVector saturating doubling long multiply by scalar
- vqdmull_laneq_s16⚠neonVector saturating doubling long multiply by scalar
- vqdmull_laneq_s32⚠neonVector saturating doubling long multiply by scalar
- vqdmull_n_s16⚠neonVector saturating doubling long multiply with scalar
- vqdmull_n_s32⚠neonVector saturating doubling long multiply with scalar
- vqdmull_s16⚠neonSigned saturating doubling multiply long
- vqdmull_s32⚠neonSigned saturating doubling multiply long
- vqdmullh_lane_s16⚠neonSigned saturating doubling multiply long
- vqdmullh_laneq_s16⚠neonSigned saturating doubling multiply long
- vqdmullh_s16⚠neonSigned saturating doubling multiply long
- vqdmulls_lane_s32⚠neonSigned saturating doubling multiply long
- vqdmulls_laneq_s32⚠neonSigned saturating doubling multiply long
- vqdmulls_s32⚠neonSigned saturating doubling multiply long
- vqmovn_high_s16⚠neonSigned saturating extract narrow
- vqmovn_high_s32⚠neonSigned saturating extract narrow
- vqmovn_high_s64⚠neonSigned saturating extract narrow
- vqmovn_high_u16⚠neonSigned saturating extract narrow
- vqmovn_high_u32⚠neonSigned saturating extract narrow
- vqmovn_high_u64⚠neonSigned saturating extract narrow
- vqmovn_s16⚠neonSigned saturating extract narrow
- vqmovn_s32⚠neonSigned saturating extract narrow
- vqmovn_s64⚠neonSigned saturating extract narrow
- vqmovn_u16⚠neonUnsigned saturating extract narrow
- vqmovn_u32⚠neonUnsigned saturating extract narrow
- vqmovn_u64⚠neonUnsigned saturating extract narrow
- vqmovnd_s64⚠neonSaturating extract narrow
- vqmovnd_u64⚠neonSaturating extract narrow
- vqmovnh_s16⚠neonSaturating extract narrow
- vqmovnh_u16⚠neonSaturating extract narrow
- vqmovns_s32⚠neonSaturating extract narrow
- vqmovns_u32⚠neonSaturating extract narrow
- vqmovun_high_s16⚠neonSigned saturating extract unsigned narrow
- vqmovun_high_s32⚠neonSigned saturating extract unsigned narrow
- vqmovun_high_s64⚠neonSigned saturating extract unsigned narrow
- vqmovun_s16⚠neonSigned saturating extract unsigned narrow
- vqmovun_s32⚠neonSigned saturating extract unsigned narrow
- vqmovun_s64⚠neonSigned saturating extract unsigned narrow
- vqmovund_s64⚠neonSigned saturating extract unsigned narrow
- vqmovunh_s16⚠neonSigned saturating extract unsigned narrow
- vqmovuns_s32⚠neonSigned saturating extract unsigned narrow
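The narrowing pair above differs in the destination type: vqmovn* narrows with signed (or unsigned) saturation to the same-signedness half-width type, while vqmovun* takes signed input and saturates to an unsigned result, clamping negatives to zero. A one-lane sketch (the helper names are ours):

```rust
// vqmovn_s16: i16 -> i8 with signed saturation.
fn qmovn(x: i16) -> i8 {
    x.clamp(i8::MIN as i16, i8::MAX as i16) as i8
}
// vqmovun_s16: i16 -> u8; negative inputs clamp to 0.
fn qmovun(x: i16) -> u8 {
    x.clamp(0, u8::MAX as i16) as u8
}

fn main() {
    assert_eq!(qmovn(300), 127);   // above i8::MAX: saturate high
    assert_eq!(qmovn(-300), -128); // below i8::MIN: saturate low
    assert_eq!(qmovun(-5), 0);     // unsigned narrow clamps negatives
    assert_eq!(qmovun(300), 255);

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        let v = vdupq_n_s16(300);
        assert_eq!(vget_lane_s8::<0>(vqmovn_s16(v)), 127);
        assert_eq!(vget_lane_u8::<0>(vqmovun_s16(v)), 255);
    }
}
```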
- vqneg_s8⚠neonSigned saturating negate
- vqneg_s16⚠neonSigned saturating negate
- vqneg_s32⚠neonSigned saturating negate
- vqneg_s64⚠neonSigned saturating negate
- vqnegb_s8⚠neonSigned saturating negate
- vqnegd_s64⚠neonSigned saturating negate
- vqnegh_s16⚠neonSigned saturating negate
- vqnegq_s8⚠neonSigned saturating negate
- vqnegq_s16⚠neonSigned saturating negate
- vqnegq_s32⚠neonSigned saturating negate
- vqnegq_s64⚠neonSigned saturating negate
- vqnegs_s32⚠neonSigned saturating negate
- vqrdmlah_lane_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlah_lane_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlah_laneq_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlah_laneq_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlah_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlah_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahh_lane_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahh_laneq_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahh_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahq_lane_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahq_lane_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahq_laneq_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahq_laneq_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahq_s16⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahq_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahs_lane_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahs_laneq_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlahs_s32⚠rdmSigned saturating rounding doubling multiply accumulate returning high half
- vqrdmlsh_lane_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlsh_lane_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlsh_laneq_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlsh_laneq_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlsh_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlsh_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshh_lane_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshh_laneq_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshh_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshq_lane_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshq_lane_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshq_laneq_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshq_laneq_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshq_s16⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshq_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshs_lane_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshs_laneq_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
- vqrdmlshs_s32⚠rdmSigned saturating rounding doubling multiply subtract returning high half
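Per the Arm reference pseudocode, the vqrdmlah* entries above fuse a rounding doubling multiply-high with a saturating accumulate: for 16-bit lanes, `sat(((a << 16) + 2*b*c + 2^15) >> 16)` (vqrdmlsh* subtracts the product instead). A one-lane sketch, assuming that formula; the helper name is ours:

```rust
// One-lane model of vqrdmlahh_s16 (element size 16): accumulate a with
// the rounded high half of the doubling product 2*b*c, saturating.
fn qrdmlah(a: i16, b: i16, c: i16) -> i16 {
    let acc = ((a as i64) << 16) + 2 * b as i64 * c as i64 + (1 << 15);
    (acc >> 16).clamp(i16::MIN as i64, i16::MAX as i64) as i16
}

fn main() {
    // high half of 2 * 0.5 * 0.5 in Q15 fixed point is 0.25 -> 8192
    assert_eq!(qrdmlah(0, 16384, 16384), 8192);
    // the accumulate saturates instead of wrapping
    assert_eq!(qrdmlah(i16::MAX, 16384, 16384), i16::MAX);

    // rdm is not in the baseline AArch64 feature set, so gate on it too.
    #[cfg(all(target_arch = "aarch64", target_feature = "rdm"))]
    unsafe {
        use core::arch::aarch64::*;
        assert_eq!(vqrdmlahh_s16(0, 16384, 16384), 8192);
    }
}
```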
- vqrdmulh_lane_s16⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulh_lane_s32⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulh_laneq_s16⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulh_laneq_s32⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulh_n_s16⚠neonVector saturating rounding doubling multiply high with scalar
- vqrdmulh_n_s32⚠neonVector saturating rounding doubling multiply high with scalar
- vqrdmulh_s16⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulh_s32⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhh_lane_s16⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhh_laneq_s16⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhh_s16⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhq_lane_s16⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulhq_lane_s32⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulhq_laneq_s16⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulhq_laneq_s32⚠neonVector rounding saturating doubling multiply high by scalar
- vqrdmulhq_n_s16⚠neonVector saturating rounding doubling multiply high with scalar
- vqrdmulhq_n_s32⚠neonVector saturating rounding doubling multiply high with scalar
- vqrdmulhq_s16⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhq_s32⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhs_lane_s32⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhs_laneq_s32⚠neonSigned saturating rounding doubling multiply returning high half
- vqrdmulhs_s32⚠neonSigned saturating rounding doubling multiply returning high half
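vqrdmulh* is the classic Q15/Q31 fixed-point multiply: the rounded high half of `2*a*b`, which saturates only when both operands are the type's minimum. A one-lane sketch (helper name ours):

```rust
// One-lane model of vqrdmulhh_s16: rounded high half of 2*a*b.
fn qrdmulh(a: i16, b: i16) -> i16 {
    let v = (2 * a as i64 * b as i64 + (1 << 15)) >> 16;
    v.clamp(i16::MIN as i64, i16::MAX as i64) as i16
}

fn main() {
    assert_eq!(qrdmulh(16384, 16384), 8192);           // 0.5 * 0.5 = 0.25 in Q15
    assert_eq!(qrdmulh(i16::MIN, i16::MIN), i16::MAX); // the one saturating case

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        assert_eq!(vqrdmulhh_s16(16384, 16384), 8192);
    }
}
```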
- vqrshl_s8⚠neonSigned saturating rounding shift left
- vqrshl_s16⚠neonSigned saturating rounding shift left
- vqrshl_s32⚠neonSigned saturating rounding shift left
- vqrshl_s64⚠neonSigned saturating rounding shift left
- vqrshl_u8⚠neonUnsigned saturating rounding shift left
- vqrshl_u16⚠neonUnsigned saturating rounding shift left
- vqrshl_u32⚠neonUnsigned saturating rounding shift left
- vqrshl_u64⚠neonUnsigned saturating rounding shift left
- vqrshlb_s8⚠neonSigned saturating rounding shift left
- vqrshlb_u8⚠neonUnsigned saturating rounding shift left
- vqrshld_s64⚠neonSigned saturating rounding shift left
- vqrshld_u64⚠neonUnsigned saturating rounding shift left
- vqrshlh_s16⚠neonSigned saturating rounding shift left
- vqrshlh_u16⚠neonUnsigned saturating rounding shift left
- vqrshlq_s8⚠neonSigned saturating rounding shift left
- vqrshlq_s16⚠neonSigned saturating rounding shift left
- vqrshlq_s32⚠neonSigned saturating rounding shift left
- vqrshlq_s64⚠neonSigned saturating rounding shift left
- vqrshlq_u8⚠neonUnsigned saturating rounding shift left
- vqrshlq_u16⚠neonUnsigned saturating rounding shift left
- vqrshlq_u32⚠neonUnsigned saturating rounding shift left
- vqrshlq_u64⚠neonUnsigned saturating rounding shift left
- vqrshls_s32⚠neonSigned saturating rounding shift left
- vqrshls_u32⚠neonUnsigned saturating rounding shift left
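The vqrshl* shift amount is itself signed and per-lane: a positive value shifts left with saturation, while a negative value is a rounding shift right. A one-lane sketch of the signed byte variant (helper name ours):

```rust
// One-lane model of vqrshlb_s8.
fn qrshl(x: i8, shift: i8) -> i8 {
    if shift >= 0 {
        ((x as i64) << shift).clamp(i8::MIN as i64, i8::MAX as i64) as i8
    } else {
        let s = (-shift) as u32;
        // rounding right shift: add half the discarded weight first
        ((x as i64 + (1 << (s - 1))) >> s) as i8
    }
}

fn main() {
    assert_eq!(qrshl(100, 1), 127); // 200 saturates to i8::MAX
    assert_eq!(qrshl(5, -1), 3);    // 2.5 rounds up to 3
    assert_eq!(qrshl(-5, -1), -2);  // -2.5 rounds toward +inf to -2

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        assert_eq!(vqrshlb_s8(100, 1), 127);
        assert_eq!(vqrshlb_s8(5, -1), 3);
    }
}
```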
- vqrshrn_high_n_s16⚠neonSigned saturating rounded shift right narrow
- vqrshrn_high_n_s32⚠neonSigned saturating rounded shift right narrow
- vqrshrn_high_n_s64⚠neonSigned saturating rounded shift right narrow
- vqrshrn_high_n_u16⚠neonUnsigned saturating rounded shift right narrow
- vqrshrn_high_n_u32⚠neonUnsigned saturating rounded shift right narrow
- vqrshrn_high_n_u64⚠neonUnsigned saturating rounded shift right narrow
- vqrshrn_n_s16⚠neonSigned saturating rounded shift right narrow
- vqrshrn_n_s32⚠neonSigned saturating rounded shift right narrow
- vqrshrn_n_s64⚠neonSigned saturating rounded shift right narrow
- vqrshrn_n_u16⚠neonUnsigned saturating rounded shift right narrow
- vqrshrn_n_u32⚠neonUnsigned saturating rounded shift right narrow
- vqrshrn_n_u64⚠neonUnsigned saturating rounded shift right narrow
- vqrshrnd_n_s64⚠neonSigned saturating rounded shift right narrow
- vqrshrnd_n_u64⚠neonUnsigned saturating rounded shift right narrow
- vqrshrnh_n_s16⚠neonSigned saturating rounded shift right narrow
- vqrshrnh_n_u16⚠neonUnsigned saturating rounded shift right narrow
- vqrshrns_n_s32⚠neonSigned saturating rounded shift right narrow
- vqrshrns_n_u32⚠neonUnsigned saturating rounded shift right narrow
- vqrshrun_high_n_s16⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshrun_high_n_s32⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshrun_high_n_s64⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshrun_n_s16⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshrun_n_s32⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshrun_n_s64⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshrund_n_s64⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshrunh_n_s16⚠neonSigned saturating rounded shift right unsigned narrow
- vqrshruns_n_s32⚠neonSigned saturating rounded shift right unsigned narrow
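vqrshrn* combines three steps in one instruction: shift right by the immediate `N`, round to nearest, then narrow with saturation. A one-lane sketch of the 16-to-8-bit signed case (helper name ours):

```rust
// One-lane model of vqrshrnh_n_s16::<N>: rounded right shift, then
// narrow i16 -> i8 with signed saturation.
fn qrshrn(x: i16, n: u32) -> i8 {
    let r = (x as i64 + (1 << (n - 1))) >> n;
    r.clamp(i8::MIN as i64, i8::MAX as i64) as i8
}

fn main() {
    assert_eq!(qrshrn(10, 2), 3);     // 10/4 = 2.5 rounds to 3
    assert_eq!(qrshrn(1000, 2), 127); // 250 saturates to i8::MAX

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        assert_eq!(vqrshrnh_n_s16::<2>(10), 3);
        assert_eq!(vqrshrnh_n_s16::<2>(1000), 127);
    }
}
```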
- vqshl_n_s8⚠neonSigned saturating shift left
- vqshl_n_s16⚠neonSigned saturating shift left
- vqshl_n_s32⚠neonSigned saturating shift left
- vqshl_n_s64⚠neonSigned saturating shift left
- vqshl_n_u8⚠neonUnsigned saturating shift left
- vqshl_n_u16⚠neonUnsigned saturating shift left
- vqshl_n_u32⚠neonUnsigned saturating shift left
- vqshl_n_u64⚠neonUnsigned saturating shift left
- vqshl_s8⚠neonSigned saturating shift left
- vqshl_s16⚠neonSigned saturating shift left
- vqshl_s32⚠neonSigned saturating shift left
- vqshl_s64⚠neonSigned saturating shift left
- vqshl_u8⚠neonUnsigned saturating shift left
- vqshl_u16⚠neonUnsigned saturating shift left
- vqshl_u32⚠neonUnsigned saturating shift left
- vqshl_u64⚠neonUnsigned saturating shift left
- vqshlb_n_s8⚠neonSigned saturating shift left
- vqshlb_n_u8⚠neonUnsigned saturating shift left
- vqshlb_s8⚠neonSigned saturating shift left
- vqshlb_u8⚠neonUnsigned saturating shift left
- vqshld_n_s64⚠neonSigned saturating shift left
- vqshld_n_u64⚠neonUnsigned saturating shift left
- vqshld_s64⚠neonSigned saturating shift left
- vqshld_u64⚠neonUnsigned saturating shift left
- vqshlh_n_s16⚠neonSigned saturating shift left
- vqshlh_n_u16⚠neonUnsigned saturating shift left
- vqshlh_s16⚠neonSigned saturating shift left
- vqshlh_u16⚠neonUnsigned saturating shift left
- vqshlq_n_s8⚠neonSigned saturating shift left
- vqshlq_n_s16⚠neonSigned saturating shift left
- vqshlq_n_s32⚠neonSigned saturating shift left
- vqshlq_n_s64⚠neonSigned saturating shift left
- vqshlq_n_u8⚠neonUnsigned saturating shift left
- vqshlq_n_u16⚠neonUnsigned saturating shift left
- vqshlq_n_u32⚠neonUnsigned saturating shift left
- vqshlq_n_u64⚠neonUnsigned saturating shift left
- vqshlq_s8⚠neonSigned saturating shift left
- vqshlq_s16⚠neonSigned saturating shift left
- vqshlq_s32⚠neonSigned saturating shift left
- vqshlq_s64⚠neonSigned saturating shift left
- vqshlq_u8⚠neonUnsigned saturating shift left
- vqshlq_u16⚠neonUnsigned saturating shift left
- vqshlq_u32⚠neonUnsigned saturating shift left
- vqshlq_u64⚠neonUnsigned saturating shift left
- vqshls_n_s32⚠neonSigned saturating shift left
- vqshls_n_u32⚠neonUnsigned saturating shift left
- vqshls_s32⚠neonSigned saturating shift left
- vqshls_u32⚠neonUnsigned saturating shift left
- vqshlu_n_s8⚠neonSigned saturating shift left unsigned
- vqshlu_n_s16⚠neonSigned saturating shift left unsigned
- vqshlu_n_s32⚠neonSigned saturating shift left unsigned
- vqshlu_n_s64⚠neonSigned saturating shift left unsigned
- vqshlub_n_s8⚠neonSigned saturating shift left unsigned
- vqshlud_n_s64⚠neonSigned saturating shift left unsigned
- vqshluh_n_s16⚠neonSigned saturating shift left unsigned
- vqshluq_n_s8⚠neonSigned saturating shift left unsigned
- vqshluq_n_s16⚠neonSigned saturating shift left unsigned
- vqshluq_n_s32⚠neonSigned saturating shift left unsigned
- vqshluq_n_s64⚠neonSigned saturating shift left unsigned
- vqshlus_n_s32⚠neonSigned saturating shift left unsigned
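Within the vqshl* group above, the `_n` forms shift by an immediate, and the vqshlu* variants take signed input but saturate to an unsigned result, so negatives clamp to 0 while the positive range doubles. A one-lane sketch (helper names ours):

```rust
// vqshlb_n_s8::<N>: saturating left shift, signed result.
fn qshl_n(x: i8, n: u32) -> i8 {
    ((x as i64) << n).clamp(i8::MIN as i64, i8::MAX as i64) as i8
}
// vqshlub_n_s8::<N>: signed input, unsigned saturated result.
fn qshlu_n(x: i8, n: u32) -> u8 {
    ((x as i64) << n).clamp(0, u8::MAX as i64) as u8
}

fn main() {
    assert_eq!(qshl_n(64, 1), 127);  // 128 overflows i8
    assert_eq!(qshlu_n(64, 1), 128); // but fits in u8
    assert_eq!(qshlu_n(-1, 3), 0);   // negative input clamps to 0

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        assert_eq!(vqshlb_n_s8::<1>(64), 127);
        assert_eq!(vqshlub_n_s8::<1>(64), 128);
    }
}
```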
- vqshrn_high_n_s16⚠neonSigned saturating shift right narrow
- vqshrn_high_n_s32⚠neonSigned saturating shift right narrow
- vqshrn_high_n_s64⚠neonSigned saturating shift right narrow
- vqshrn_high_n_u16⚠neonUnsigned saturating shift right narrow
- vqshrn_high_n_u32⚠neonUnsigned saturating shift right narrow
- vqshrn_high_n_u64⚠neonUnsigned saturating shift right narrow
- vqshrn_n_s16⚠neonSigned saturating shift right narrow
- vqshrn_n_s32⚠neonSigned saturating shift right narrow
- vqshrn_n_s64⚠neonSigned saturating shift right narrow
- vqshrn_n_u16⚠neonUnsigned saturating shift right narrow
- vqshrn_n_u32⚠neonUnsigned saturating shift right narrow
- vqshrn_n_u64⚠neonUnsigned saturating shift right narrow
- vqshrnd_n_s64⚠neonSigned saturating shift right narrow
- vqshrnd_n_u64⚠neonUnsigned saturating shift right narrow
- vqshrnh_n_s16⚠neonSigned saturating shift right narrow
- vqshrnh_n_u16⚠neonUnsigned saturating shift right narrow
- vqshrns_n_s32⚠neonSigned saturating shift right narrow
- vqshrns_n_u32⚠neonUnsigned saturating shift right narrow
- vqshrun_high_n_s16⚠neonSigned saturating shift right unsigned narrow
- vqshrun_high_n_s32⚠neonSigned saturating shift right unsigned narrow
- vqshrun_high_n_s64⚠neonSigned saturating shift right unsigned narrow
- vqshrun_n_s16⚠neonSigned saturating shift right unsigned narrow
- vqshrun_n_s32⚠neonSigned saturating shift right unsigned narrow
- vqshrun_n_s64⚠neonSigned saturating shift right unsigned narrow
- vqshrund_n_s64⚠neonSigned saturating shift right unsigned narrow
- vqshrunh_n_s16⚠neonSigned saturating shift right unsigned narrow
- vqshruns_n_s32⚠neonSigned saturating shift right unsigned narrow
- vqsub_s8⚠neonSaturating subtract
- vqsub_s16⚠neonSaturating subtract
- vqsub_s32⚠neonSaturating subtract
- vqsub_s64⚠neonSaturating subtract
- vqsub_u8⚠neonSaturating subtract
- vqsub_u16⚠neonSaturating subtract
- vqsub_u32⚠neonSaturating subtract
- vqsub_u64⚠neonSaturating subtract
- vqsubb_s8⚠neonSaturating subtract
- vqsubb_u8⚠neonSaturating subtract
- vqsubd_s64⚠neonSaturating subtract
- vqsubd_u64⚠neonSaturating subtract
- vqsubh_s16⚠neonSaturating subtract
- vqsubh_u16⚠neonSaturating subtract
- vqsubq_s8⚠neonSaturating subtract
- vqsubq_s16⚠neonSaturating subtract
- vqsubq_s32⚠neonSaturating subtract
- vqsubq_s64⚠neonSaturating subtract
- vqsubq_u8⚠neonSaturating subtract
- vqsubq_u16⚠neonSaturating subtract
- vqsubq_u32⚠neonSaturating subtract
- vqsubq_u64⚠neonSaturating subtract
- vqsubs_s32⚠neonSaturating subtract
- vqsubs_u32⚠neonSaturating subtract
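The vqsub* family is element-wise saturating subtraction; Rust's built-in `saturating_sub` models a single lane exactly, which makes the semantics easy to check:

```rust
// Saturating subtraction: signed results pin at i8::MIN/i8::MAX,
// unsigned results floor at 0 instead of wrapping.
fn main() {
    assert_eq!(i8::MIN.saturating_sub(1), i8::MIN); // no wrap to +127
    assert_eq!(0u8.saturating_sub(1), 0);           // no wrap to 255

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        assert_eq!(vqsubb_s8(i8::MIN, 1), i8::MIN);
        let r = vqsub_u8(vdup_n_u8(0), vdup_n_u8(1));
        assert_eq!(vget_lane_u8::<0>(r), 0);
    }
}
```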
- vqtbl1_p8⚠neonTable look-up
- vqtbl1_s8⚠neonTable look-up
- vqtbl1_u8⚠neonTable look-up
- vqtbl1q_p8⚠neonTable look-up
- vqtbl1q_s8⚠neonTable look-up
- vqtbl1q_u8⚠neonTable look-up
- vqtbl2_p8⚠neonTable look-up
- vqtbl2_s8⚠neonTable look-up
- vqtbl2_u8⚠neonTable look-up
- vqtbl2q_p8⚠neonTable look-up
- vqtbl2q_s8⚠neonTable look-up
- vqtbl2q_u8⚠neonTable look-up
- vqtbl3_p8⚠neonTable look-up
- vqtbl3_s8⚠neonTable look-up
- vqtbl3_u8⚠neonTable look-up
- vqtbl3q_p8⚠neonTable look-up
- vqtbl3q_s8⚠neonTable look-up
- vqtbl3q_u8⚠neonTable look-up
- vqtbl4_p8⚠neonTable look-up
- vqtbl4_s8⚠neonTable look-up
- vqtbl4_u8⚠neonTable look-up
- vqtbl4q_p8⚠neonTable look-up
- vqtbl4q_s8⚠neonTable look-up
- vqtbl4q_u8⚠neonTable look-up
- vqtbx1_p8⚠neonExtended table look-up
- vqtbx1_s8⚠neonExtended table look-up
- vqtbx1_u8⚠neonExtended table look-up
- vqtbx1q_p8⚠neonExtended table look-up
- vqtbx1q_s8⚠neonExtended table look-up
- vqtbx1q_u8⚠neonExtended table look-up
- vqtbx2_p8⚠neonExtended table look-up
- vqtbx2_s8⚠neonExtended table look-up
- vqtbx2_u8⚠neonExtended table look-up
- vqtbx2q_p8⚠neonExtended table look-up
- vqtbx2q_s8⚠neonExtended table look-up
- vqtbx2q_u8⚠neonExtended table look-up
- vqtbx3_p8⚠neonExtended table look-up
- vqtbx3_s8⚠neonExtended table look-up
- vqtbx3_u8⚠neonExtended table look-up
- vqtbx3q_p8⚠neonExtended table look-up
- vqtbx3q_s8⚠neonExtended table look-up
- vqtbx3q_u8⚠neonExtended table look-up
- vqtbx4_p8⚠neonExtended table look-up
- vqtbx4_s8⚠neonExtended table look-up
- vqtbx4_u8⚠neonExtended table look-up
- vqtbx4q_p8⚠neonExtended table look-up
- vqtbx4q_s8⚠neonExtended table look-up
- vqtbx4q_u8⚠neonExtended table look-up
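The table lookups above differ only in out-of-range handling: vqtbl* yields 0 for an index past the end of the table, while the extended vqtbx* forms keep the corresponding element of the destination vector instead. A one-lane sketch (helper names ours):

```rust
// vqtbl1: out-of-range index -> 0.
fn tbl(table: &[u8; 16], idx: u8) -> u8 {
    *table.get(idx as usize).unwrap_or(&0)
}
// vqtbx1: out-of-range index -> keep the fallback element.
fn tbx(fallback: u8, table: &[u8; 16], idx: u8) -> u8 {
    *table.get(idx as usize).unwrap_or(&fallback)
}

fn main() {
    let mut t = [0u8; 16];
    for (i, e) in t.iter_mut().enumerate() {
        *e = 10 + i as u8; // table holds 10..=25
    }
    assert_eq!(tbl(&t, 2), 12);
    assert_eq!(tbl(&t, 200), 0);    // out of range -> 0
    assert_eq!(tbx(7, &t, 200), 7); // out of range -> fallback

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        let table = vld1q_u8(t.as_ptr());
        let r = vqtbl1_u8(table, vdup_n_u8(2));
        assert_eq!(vget_lane_u8::<0>(r), 12);
    }
}
```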
- vraddhn_high_s16⚠neonRounding Add returning High Narrow (high half).
- vraddhn_high_s32⚠neonRounding Add returning High Narrow (high half).
- vraddhn_high_s64⚠neonRounding Add returning High Narrow (high half).
- vraddhn_high_u16⚠neonRounding Add returning High Narrow (high half).
- vraddhn_high_u32⚠neonRounding Add returning High Narrow (high half).
- vraddhn_high_u64⚠neonRounding Add returning High Narrow (high half).
- vraddhn_s16⚠neonRounding Add returning High Narrow.
- vraddhn_s32⚠neonRounding Add returning High Narrow.
- vraddhn_s64⚠neonRounding Add returning High Narrow.
- vraddhn_u16⚠neonRounding Add returning High Narrow.
- vraddhn_u32⚠neonRounding Add returning High Narrow.
- vraddhn_u64⚠neonRounding Add returning High Narrow.
- vrax1q_u64⚠neon,sha3Rotate and exclusive OR
- vrbit_p8⚠neonReverse bit order
- vrbit_s8⚠neonReverse bit order
- vrbit_u8⚠neonReverse bit order
- vrbitq_p8⚠neonReverse bit order
- vrbitq_s8⚠neonReverse bit order
- vrbitq_u8⚠neonReverse bit order
- vrecpe_f32⚠neonReciprocal estimate.
- vrecpe_f64⚠neonReciprocal estimate.
- vrecpe_u32⚠neonUnsigned reciprocal estimate
- vrecped_f64⚠neonReciprocal estimate.
- vrecpeq_f32⚠neonReciprocal estimate.
- vrecpeq_f64⚠neonReciprocal estimate.
- vrecpeq_u32⚠neonUnsigned reciprocal estimate
- vrecpes_f32⚠neonReciprocal estimate.
- vrecps_f32⚠neonFloating-point reciprocal step
- vrecps_f64⚠neonFloating-point reciprocal step
- vrecpsd_f64⚠neonFloating-point reciprocal step
- vrecpsq_f32⚠neonFloating-point reciprocal step
- vrecpsq_f64⚠neonFloating-point reciprocal step
- vrecpss_f32⚠neonFloating-point reciprocal step
- vrecpxd_f64⚠neonFloating-point reciprocal exponent
- vrecpxs_f32⚠neonFloating-point reciprocal exponent
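vrecpe* gives a coarse reciprocal estimate and vrecps* computes the Newton-Raphson correction factor `2 - x*est`; multiplying the estimate by that factor roughly squares its accuracy each step. A sketch using a plain f32 model alongside the scalar intrinsics:

```rust
// Portable model of the vrecps correction factor.
fn recps(x: f32, est: f32) -> f32 {
    2.0 - x * est
}

fn main() {
    let x = 3.0f32;
    let mut est = 0.3f32; // stand-in for the hardware vrecpe estimate
    for _ in 0..3 {
        est *= recps(x, est); // error squares on each refinement
    }
    assert!((est - 1.0 / 3.0).abs() < 1e-6);

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        let mut e = vrecpes_f32(x);
        for _ in 0..2 {
            e *= vrecpss_f32(x, e);
        }
        assert!((e - 1.0 / 3.0).abs() < 1e-5);
    }
}
```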
- vreinterpret_f32_f64⚠neonVector reinterpret cast operation
- vreinterpret_f32_p8⚠neonVector reinterpret cast operation
- vreinterpret_f32_p16⚠neonVector reinterpret cast operation
- vreinterpret_f32_p64⚠neonVector reinterpret cast operation
- vreinterpret_f32_s8⚠neonVector reinterpret cast operation
- vreinterpret_f32_s16⚠neonVector reinterpret cast operation
- vreinterpret_f32_s32⚠neonVector reinterpret cast operation
- vreinterpret_f32_s64⚠neonVector reinterpret cast operation
- vreinterpret_f32_u8⚠neonVector reinterpret cast operation
- vreinterpret_f32_u16⚠neonVector reinterpret cast operation
- vreinterpret_f32_u32⚠neonVector reinterpret cast operation
- vreinterpret_f32_u64⚠neonVector reinterpret cast operation
- vreinterpret_f64_f32⚠neonVector reinterpret cast operation
- vreinterpret_f64_p8⚠neonVector reinterpret cast operation
- vreinterpret_f64_p16⚠neonVector reinterpret cast operation
- vreinterpret_f64_p64⚠neonVector reinterpret cast operation
- vreinterpret_f64_s8⚠neonVector reinterpret cast operation
- vreinterpret_f64_s16⚠neonVector reinterpret cast operation
- vreinterpret_f64_s32⚠neonVector reinterpret cast operation
- vreinterpret_f64_s64⚠neonVector reinterpret cast operation
- vreinterpret_f64_u8⚠neonVector reinterpret cast operation
- vreinterpret_f64_u16⚠neonVector reinterpret cast operation
- vreinterpret_f64_u32⚠neonVector reinterpret cast operation
- vreinterpret_f64_u64⚠neonVector reinterpret cast operation
- vreinterpret_p8_f32⚠neonVector reinterpret cast operation
- vreinterpret_p8_f64⚠neonVector reinterpret cast operation
- vreinterpret_p8_p16⚠neonVector reinterpret cast operation
- vreinterpret_p8_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_p8_s8⚠neonVector reinterpret cast operation
- vreinterpret_p8_s16⚠neonVector reinterpret cast operation
- vreinterpret_p8_s32⚠neonVector reinterpret cast operation
- vreinterpret_p8_s64⚠neonVector reinterpret cast operation
- vreinterpret_p8_u8⚠neonVector reinterpret cast operation
- vreinterpret_p8_u16⚠neonVector reinterpret cast operation
- vreinterpret_p8_u32⚠neonVector reinterpret cast operation
- vreinterpret_p8_u64⚠neonVector reinterpret cast operation
- vreinterpret_p16_f32⚠neonVector reinterpret cast operation
- vreinterpret_p16_f64⚠neonVector reinterpret cast operation
- vreinterpret_p16_p8⚠neonVector reinterpret cast operation
- vreinterpret_p16_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_p16_s8⚠neonVector reinterpret cast operation
- vreinterpret_p16_s16⚠neonVector reinterpret cast operation
- vreinterpret_p16_s32⚠neonVector reinterpret cast operation
- vreinterpret_p16_s64⚠neonVector reinterpret cast operation
- vreinterpret_p16_u8⚠neonVector reinterpret cast operation
- vreinterpret_p16_u16⚠neonVector reinterpret cast operation
- vreinterpret_p16_u32⚠neonVector reinterpret cast operation
- vreinterpret_p16_u64⚠neonVector reinterpret cast operation
- vreinterpret_p64_f32⚠neonVector reinterpret cast operation
- vreinterpret_p64_f64⚠neonVector reinterpret cast operation
- vreinterpret_p64_p8⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_p16⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_s8⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_s16⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_s32⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_s64⚠neonVector reinterpret cast operation
- vreinterpret_p64_u8⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_u16⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_u32⚠neon,aesVector reinterpret cast operation
- vreinterpret_p64_u64⚠neonVector reinterpret cast operation
- vreinterpret_s8_f32⚠neonVector reinterpret cast operation
- vreinterpret_s8_f64⚠neonVector reinterpret cast operation
- vreinterpret_s8_p8⚠neonVector reinterpret cast operation
- vreinterpret_s8_p16⚠neonVector reinterpret cast operation
- vreinterpret_s8_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_s8_s16⚠neonVector reinterpret cast operation
- vreinterpret_s8_s32⚠neonVector reinterpret cast operation
- vreinterpret_s8_s64⚠neonVector reinterpret cast operation
- vreinterpret_s8_u8⚠neonVector reinterpret cast operation
- vreinterpret_s8_u16⚠neonVector reinterpret cast operation
- vreinterpret_s8_u32⚠neonVector reinterpret cast operation
- vreinterpret_s8_u64⚠neonVector reinterpret cast operation
- vreinterpret_s16_f32⚠neonVector reinterpret cast operation
- vreinterpret_s16_f64⚠neonVector reinterpret cast operation
- vreinterpret_s16_p8⚠neonVector reinterpret cast operation
- vreinterpret_s16_p16⚠neonVector reinterpret cast operation
- vreinterpret_s16_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_s16_s8⚠neonVector reinterpret cast operation
- vreinterpret_s16_s32⚠neonVector reinterpret cast operation
- vreinterpret_s16_s64⚠neonVector reinterpret cast operation
- vreinterpret_s16_u8⚠neonVector reinterpret cast operation
- vreinterpret_s16_u16⚠neonVector reinterpret cast operation
- vreinterpret_s16_u32⚠neonVector reinterpret cast operation
- vreinterpret_s16_u64⚠neonVector reinterpret cast operation
- vreinterpret_s32_f32⚠neonVector reinterpret cast operation
- vreinterpret_s32_f64⚠neonVector reinterpret cast operation
- vreinterpret_s32_p8⚠neonVector reinterpret cast operation
- vreinterpret_s32_p16⚠neonVector reinterpret cast operation
- vreinterpret_s32_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_s32_s8⚠neonVector reinterpret cast operation
- vreinterpret_s32_s16⚠neonVector reinterpret cast operation
- vreinterpret_s32_s64⚠neonVector reinterpret cast operation
- vreinterpret_s32_u8⚠neonVector reinterpret cast operation
- vreinterpret_s32_u16⚠neonVector reinterpret cast operation
- vreinterpret_s32_u32⚠neonVector reinterpret cast operation
- vreinterpret_s32_u64⚠neonVector reinterpret cast operation
- vreinterpret_s64_f32⚠neonVector reinterpret cast operation
- vreinterpret_s64_f64⚠neonVector reinterpret cast operation
- vreinterpret_s64_p8⚠neonVector reinterpret cast operation
- vreinterpret_s64_p16⚠neonVector reinterpret cast operation
- vreinterpret_s64_p64⚠neonVector reinterpret cast operation
- vreinterpret_s64_s8⚠neonVector reinterpret cast operation
- vreinterpret_s64_s16⚠neonVector reinterpret cast operation
- vreinterpret_s64_s32⚠neonVector reinterpret cast operation
- vreinterpret_s64_u8⚠neonVector reinterpret cast operation
- vreinterpret_s64_u16⚠neonVector reinterpret cast operation
- vreinterpret_s64_u32⚠neonVector reinterpret cast operation
- vreinterpret_s64_u64⚠neonVector reinterpret cast operation
- vreinterpret_u8_f32⚠neonVector reinterpret cast operation
- vreinterpret_u8_f64⚠neonVector reinterpret cast operation
- vreinterpret_u8_p8⚠neonVector reinterpret cast operation
- vreinterpret_u8_p16⚠neonVector reinterpret cast operation
- vreinterpret_u8_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_u8_s8⚠neonVector reinterpret cast operation
- vreinterpret_u8_s16⚠neonVector reinterpret cast operation
- vreinterpret_u8_s32⚠neonVector reinterpret cast operation
- vreinterpret_u8_s64⚠neonVector reinterpret cast operation
- vreinterpret_u8_u16⚠neonVector reinterpret cast operation
- vreinterpret_u8_u32⚠neonVector reinterpret cast operation
- vreinterpret_u8_u64⚠neonVector reinterpret cast operation
- vreinterpret_u16_f32⚠neonVector reinterpret cast operation
- vreinterpret_u16_f64⚠neonVector reinterpret cast operation
- vreinterpret_u16_p8⚠neonVector reinterpret cast operation
- vreinterpret_u16_p16⚠neonVector reinterpret cast operation
- vreinterpret_u16_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_u16_s8⚠neonVector reinterpret cast operation
- vreinterpret_u16_s16⚠neonVector reinterpret cast operation
- vreinterpret_u16_s32⚠neonVector reinterpret cast operation
- vreinterpret_u16_s64⚠neonVector reinterpret cast operation
- vreinterpret_u16_u8⚠neonVector reinterpret cast operation
- vreinterpret_u16_u32⚠neonVector reinterpret cast operation
- vreinterpret_u16_u64⚠neonVector reinterpret cast operation
- vreinterpret_u32_f32⚠neonVector reinterpret cast operation
- vreinterpret_u32_f64⚠neonVector reinterpret cast operation
- vreinterpret_u32_p8⚠neonVector reinterpret cast operation
- vreinterpret_u32_p16⚠neonVector reinterpret cast operation
- vreinterpret_u32_p64⚠neon,aesVector reinterpret cast operation
- vreinterpret_u32_s8⚠neonVector reinterpret cast operation
- vreinterpret_u32_s16⚠neonVector reinterpret cast operation
- vreinterpret_u32_s32⚠neonVector reinterpret cast operation
- vreinterpret_u32_s64⚠neonVector reinterpret cast operation
- vreinterpret_u32_u8⚠neonVector reinterpret cast operation
- vreinterpret_u32_u16⚠neonVector reinterpret cast operation
- vreinterpret_u32_u64⚠neonVector reinterpret cast operation
- vreinterpret_u64_f32⚠neonVector reinterpret cast operation
- vreinterpret_u64_f64⚠neonVector reinterpret cast operation
- vreinterpret_u64_p8⚠neonVector reinterpret cast operation
- vreinterpret_u64_p16⚠neonVector reinterpret cast operation
- vreinterpret_u64_p64⚠neonVector reinterpret cast operation
- vreinterpret_u64_s8⚠neonVector reinterpret cast operation
- vreinterpret_u64_s16⚠neonVector reinterpret cast operation
- vreinterpret_u64_s32⚠neonVector reinterpret cast operation
- vreinterpret_u64_s64⚠neonVector reinterpret cast operation
- vreinterpret_u64_u8⚠neonVector reinterpret cast operation
- vreinterpret_u64_u16⚠neonVector reinterpret cast operation
- vreinterpret_u64_u32⚠neonVector reinterpret cast operation
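The vreinterpret* functions above (and the vreinterpretq* forms that follow) are pure bit casts between equal-width vector types; no value conversion happens. A portable scalar analogue via `to_ne_bytes`/`from_ne_bytes`:

```rust
// A reinterpret cast preserves the bit pattern, only the type changes.
fn main() {
    let x: u64 = 0x0102_0304_0506_0708;
    let y = i64::from_ne_bytes(x.to_ne_bytes()); // same bits, new type
    assert_eq!(y as u64, x);

    #[cfg(target_arch = "aarch64")]
    unsafe {
        use core::arch::aarch64::*;
        let v = vdup_n_u64(x);
        let s = vreinterpret_s64_u64(v);
        assert_eq!(vget_lane_s64::<0>(s) as u64, x);
    }
}
```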
- vreinterpretq_f32_f64⚠neonVector reinterpret cast operation
- vreinterpretq_f32_p8⚠neonVector reinterpret cast operation
- vreinterpretq_f32_p16⚠neonVector reinterpret cast operation
- vreinterpretq_f32_p64⚠neonVector reinterpret cast operation
- vreinterpretq_f32_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_f32_s8⚠neonVector reinterpret cast operation
- vreinterpretq_f32_s16⚠neonVector reinterpret cast operation
- vreinterpretq_f32_s32⚠neonVector reinterpret cast operation
- vreinterpretq_f32_s64⚠neonVector reinterpret cast operation
- vreinterpretq_f32_u8⚠neonVector reinterpret cast operation
- vreinterpretq_f32_u16⚠neonVector reinterpret cast operation
- vreinterpretq_f32_u32⚠neonVector reinterpret cast operation
- vreinterpretq_f32_u64⚠neonVector reinterpret cast operation
- vreinterpretq_f64_f32⚠neonVector reinterpret cast operation
- vreinterpretq_f64_p8⚠neonVector reinterpret cast operation
- vreinterpretq_f64_p16⚠neonVector reinterpret cast operation
- vreinterpretq_f64_p64⚠neonVector reinterpret cast operation
- vreinterpretq_f64_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_f64_s8⚠neonVector reinterpret cast operation
- vreinterpretq_f64_s16⚠neonVector reinterpret cast operation
- vreinterpretq_f64_s32⚠neonVector reinterpret cast operation
- vreinterpretq_f64_s64⚠neonVector reinterpret cast operation
- vreinterpretq_f64_u8⚠neonVector reinterpret cast operation
- vreinterpretq_f64_u16⚠neonVector reinterpret cast operation
- vreinterpretq_f64_u32⚠neonVector reinterpret cast operation
- vreinterpretq_f64_u64⚠neonVector reinterpret cast operation
- vreinterpretq_p8_f32⚠neonVector reinterpret cast operation
- vreinterpretq_p8_f64⚠neonVector reinterpret cast operation
- vreinterpretq_p8_p16⚠neonVector reinterpret cast operation
- vreinterpretq_p8_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p8_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p8_s8⚠neonVector reinterpret cast operation
- vreinterpretq_p8_s16⚠neonVector reinterpret cast operation
- vreinterpretq_p8_s32⚠neonVector reinterpret cast operation
- vreinterpretq_p8_s64⚠neonVector reinterpret cast operation
- vreinterpretq_p8_u8⚠neonVector reinterpret cast operation
- vreinterpretq_p8_u16⚠neonVector reinterpret cast operation
- vreinterpretq_p8_u32⚠neonVector reinterpret cast operation
- vreinterpretq_p8_u64⚠neonVector reinterpret cast operation
- vreinterpretq_p16_f32⚠neonVector reinterpret cast operation
- vreinterpretq_p16_f64⚠neonVector reinterpret cast operation
- vreinterpretq_p16_p8⚠neonVector reinterpret cast operation
- vreinterpretq_p16_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p16_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p16_s8⚠neonVector reinterpret cast operation
- vreinterpretq_p16_s16⚠neonVector reinterpret cast operation
- vreinterpretq_p16_s32⚠neonVector reinterpret cast operation
- vreinterpretq_p16_s64⚠neonVector reinterpret cast operation
- vreinterpretq_p16_u8⚠neonVector reinterpret cast operation
- vreinterpretq_p16_u16⚠neonVector reinterpret cast operation
- vreinterpretq_p16_u32⚠neonVector reinterpret cast operation
- vreinterpretq_p16_u64⚠neonVector reinterpret cast operation
- vreinterpretq_p64_f32⚠neonVector reinterpret cast operation
- vreinterpretq_p64_f64⚠neonVector reinterpret cast operation
- vreinterpretq_p64_p8⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_p16⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_s8⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_s16⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_s32⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_s64⚠neonVector reinterpret cast operation
- vreinterpretq_p64_u8⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_u16⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_u32⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p64_u64⚠neonVector reinterpret cast operation
- vreinterpretq_p128_f32⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_f64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_p8⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_p16⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_s8⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_s16⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_s32⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_s64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_u8⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_u16⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_u32⚠neon,aesVector reinterpret cast operation
- vreinterpretq_p128_u64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s8_f32⚠neonVector reinterpret cast operation
- vreinterpretq_s8_f64⚠neonVector reinterpret cast operation
- vreinterpretq_s8_p8⚠neonVector reinterpret cast operation
- vreinterpretq_s8_p16⚠neonVector reinterpret cast operation
- vreinterpretq_s8_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s8_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s8_s16⚠neonVector reinterpret cast operation
- vreinterpretq_s8_s32⚠neonVector reinterpret cast operation
- vreinterpretq_s8_s64⚠neonVector reinterpret cast operation
- vreinterpretq_s8_u8⚠neonVector reinterpret cast operation
- vreinterpretq_s8_u16⚠neonVector reinterpret cast operation
- vreinterpretq_s8_u32⚠neonVector reinterpret cast operation
- vreinterpretq_s8_u64⚠neonVector reinterpret cast operation
- vreinterpretq_s16_f32⚠neonVector reinterpret cast operation
- vreinterpretq_s16_f64⚠neonVector reinterpret cast operation
- vreinterpretq_s16_p8⚠neonVector reinterpret cast operation
- vreinterpretq_s16_p16⚠neonVector reinterpret cast operation
- vreinterpretq_s16_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s16_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s16_s8⚠neonVector reinterpret cast operation
- vreinterpretq_s16_s32⚠neonVector reinterpret cast operation
- vreinterpretq_s16_s64⚠neonVector reinterpret cast operation
- vreinterpretq_s16_u8⚠neonVector reinterpret cast operation
- vreinterpretq_s16_u16⚠neonVector reinterpret cast operation
- vreinterpretq_s16_u32⚠neonVector reinterpret cast operation
- vreinterpretq_s16_u64⚠neonVector reinterpret cast operation
- vreinterpretq_s32_f32⚠neonVector reinterpret cast operation
- vreinterpretq_s32_f64⚠neonVector reinterpret cast operation
- vreinterpretq_s32_p8⚠neonVector reinterpret cast operation
- vreinterpretq_s32_p16⚠neonVector reinterpret cast operation
- vreinterpretq_s32_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s32_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s32_s8⚠neonVector reinterpret cast operation
- vreinterpretq_s32_s16⚠neonVector reinterpret cast operation
- vreinterpretq_s32_s64⚠neonVector reinterpret cast operation
- vreinterpretq_s32_u8⚠neonVector reinterpret cast operation
- vreinterpretq_s32_u16⚠neonVector reinterpret cast operation
- vreinterpretq_s32_u32⚠neonVector reinterpret cast operation
- vreinterpretq_s32_u64⚠neonVector reinterpret cast operation
- vreinterpretq_s64_f32⚠neonVector reinterpret cast operation
- vreinterpretq_s64_f64⚠neonVector reinterpret cast operation
- vreinterpretq_s64_p8⚠neonVector reinterpret cast operation
- vreinterpretq_s64_p16⚠neonVector reinterpret cast operation
- vreinterpretq_s64_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s64_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_s64_s8⚠neonVector reinterpret cast operation
- vreinterpretq_s64_s16⚠neonVector reinterpret cast operation
- vreinterpretq_s64_s32⚠neonVector reinterpret cast operation
- vreinterpretq_s64_u8⚠neonVector reinterpret cast operation
- vreinterpretq_s64_u16⚠neonVector reinterpret cast operation
- vreinterpretq_s64_u32⚠neonVector reinterpret cast operation
- vreinterpretq_s64_u64⚠neonVector reinterpret cast operation
- vreinterpretq_u8_f32⚠neonVector reinterpret cast operation
- vreinterpretq_u8_f64⚠neonVector reinterpret cast operation
- vreinterpretq_u8_p8⚠neonVector reinterpret cast operation
- vreinterpretq_u8_p16⚠neonVector reinterpret cast operation
- vreinterpretq_u8_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u8_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u8_s8⚠neonVector reinterpret cast operation
- vreinterpretq_u8_s16⚠neonVector reinterpret cast operation
- vreinterpretq_u8_s32⚠neonVector reinterpret cast operation
- vreinterpretq_u8_s64⚠neonVector reinterpret cast operation
- vreinterpretq_u8_u16⚠neonVector reinterpret cast operation
- vreinterpretq_u8_u32⚠neonVector reinterpret cast operation
- vreinterpretq_u8_u64⚠neonVector reinterpret cast operation
- vreinterpretq_u16_f32⚠neonVector reinterpret cast operation
- vreinterpretq_u16_f64⚠neonVector reinterpret cast operation
- vreinterpretq_u16_p8⚠neonVector reinterpret cast operation
- vreinterpretq_u16_p16⚠neonVector reinterpret cast operation
- vreinterpretq_u16_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u16_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u16_s8⚠neonVector reinterpret cast operation
- vreinterpretq_u16_s16⚠neonVector reinterpret cast operation
- vreinterpretq_u16_s32⚠neonVector reinterpret cast operation
- vreinterpretq_u16_s64⚠neonVector reinterpret cast operation
- vreinterpretq_u16_u8⚠neonVector reinterpret cast operation
- vreinterpretq_u16_u32⚠neonVector reinterpret cast operation
- vreinterpretq_u16_u64⚠neonVector reinterpret cast operation
- vreinterpretq_u32_f32⚠neonVector reinterpret cast operation
- vreinterpretq_u32_f64⚠neonVector reinterpret cast operation
- vreinterpretq_u32_p8⚠neonVector reinterpret cast operation
- vreinterpretq_u32_p16⚠neonVector reinterpret cast operation
- vreinterpretq_u32_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u32_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u32_s8⚠neonVector reinterpret cast operation
- vreinterpretq_u32_s16⚠neonVector reinterpret cast operation
- vreinterpretq_u32_s32⚠neonVector reinterpret cast operation
- vreinterpretq_u32_s64⚠neonVector reinterpret cast operation
- vreinterpretq_u32_u8⚠neonVector reinterpret cast operation
- vreinterpretq_u32_u16⚠neonVector reinterpret cast operation
- vreinterpretq_u32_u64⚠neonVector reinterpret cast operation
- vreinterpretq_u64_f32⚠neonVector reinterpret cast operation
- vreinterpretq_u64_f64⚠neonVector reinterpret cast operation
- vreinterpretq_u64_p8⚠neonVector reinterpret cast operation
- vreinterpretq_u64_p16⚠neonVector reinterpret cast operation
- vreinterpretq_u64_p64⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u64_p128⚠neon,aesVector reinterpret cast operation
- vreinterpretq_u64_s8⚠neonVector reinterpret cast operation
- vreinterpretq_u64_s16⚠neonVector reinterpret cast operation
- vreinterpretq_u64_s32⚠neonVector reinterpret cast operation
- vreinterpretq_u64_s64⚠neonVector reinterpret cast operation
- vreinterpretq_u64_u8⚠neonVector reinterpret cast operation
- vreinterpretq_u64_u16⚠neonVector reinterpret cast operation
- vreinterpretq_u64_u32⚠neonVector reinterpret cast operation
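All of the `vreinterpret*` entries above are pure bit casts: only the element type changes, never the underlying 64 or 128 bits. A portable scalar sketch of one combination, `f32` lanes viewed as `u32` lanes (the helper name is hypothetical, not the intrinsic itself):

```rust
// Model of vreinterpretq_u32_f32: each f32 lane is viewed as its raw
// u32 bit pattern; no conversion or rounding takes place.
fn reinterpret_u32_f32(v: [f32; 4]) -> [u32; 4] {
    [v[0].to_bits(), v[1].to_bits(), v[2].to_bits(), v[3].to_bits()]
}

fn main() {
    let bits = reinterpret_u32_f32([1.0, -2.0, 0.5, 0.0]);
    assert_eq!(bits, [0x3f80_0000, 0xc000_0000, 0x3f00_0000, 0x0000_0000]);
}
```

The `neon,aes` rows differ only in the required target features; the bit-cast semantics are identical.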
- vrev16_p8⚠neonReversing vector elements (swap endianness)
- vrev16_s8⚠neonReversing vector elements (swap endianness)
- vrev16_u8⚠neonReversing vector elements (swap endianness)
- vrev16q_p8⚠neonReversing vector elements (swap endianness)
- vrev16q_s8⚠neonReversing vector elements (swap endianness)
- vrev16q_u8⚠neonReversing vector elements (swap endianness)
- vrev32_p8⚠neonReversing vector elements (swap endianness)
- vrev32_p16⚠neonReversing vector elements (swap endianness)
- vrev32_s8⚠neonReversing vector elements (swap endianness)
- vrev32_s16⚠neonReversing vector elements (swap endianness)
- vrev32_u8⚠neonReversing vector elements (swap endianness)
- vrev32_u16⚠neonReversing vector elements (swap endianness)
- vrev32q_p8⚠neonReversing vector elements (swap endianness)
- vrev32q_p16⚠neonReversing vector elements (swap endianness)
- vrev32q_s8⚠neonReversing vector elements (swap endianness)
- vrev32q_s16⚠neonReversing vector elements (swap endianness)
- vrev32q_u8⚠neonReversing vector elements (swap endianness)
- vrev32q_u16⚠neonReversing vector elements (swap endianness)
- vrev64_f32⚠neonReversing vector elements (swap endianness)
- vrev64_p8⚠neonReversing vector elements (swap endianness)
- vrev64_p16⚠neonReversing vector elements (swap endianness)
- vrev64_s8⚠neonReversing vector elements (swap endianness)
- vrev64_s16⚠neonReversing vector elements (swap endianness)
- vrev64_s32⚠neonReversing vector elements (swap endianness)
- vrev64_u8⚠neonReversing vector elements (swap endianness)
- vrev64_u16⚠neonReversing vector elements (swap endianness)
- vrev64_u32⚠neonReversing vector elements (swap endianness)
- vrev64q_f32⚠neonReversing vector elements (swap endianness)
- vrev64q_p8⚠neonReversing vector elements (swap endianness)
- vrev64q_p16⚠neonReversing vector elements (swap endianness)
- vrev64q_s8⚠neonReversing vector elements (swap endianness)
- vrev64q_s16⚠neonReversing vector elements (swap endianness)
- vrev64q_s32⚠neonReversing vector elements (swap endianness)
- vrev64q_u8⚠neonReversing vector elements (swap endianness)
- vrev64q_u16⚠neonReversing vector elements (swap endianness)
- vrev64q_u32⚠neonReversing vector elements (swap endianness)
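The `vrev` family reverses element order within fixed-size containers: `vrev16` within each 16 bits, `vrev32` within each 32, `vrev64` within each 64. A portable scalar model of `vrev32_u8` (hypothetical helper name):

```rust
// Model of vrev32_u8: reverse the order of the 8-bit lanes inside
// each 32-bit container of a 64-bit vector.
fn rev32_u8(v: [u8; 8]) -> [u8; 8] {
    let mut out = v;
    for chunk in out.chunks_mut(4) {
        chunk.reverse();
    }
    out
}

fn main() {
    assert_eq!(rev32_u8([0, 1, 2, 3, 4, 5, 6, 7]), [3, 2, 1, 0, 7, 6, 5, 4]);
}
```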
- vrhadd_s8⚠neonRounding halving add
- vrhadd_s16⚠neonRounding halving add
- vrhadd_s32⚠neonRounding halving add
- vrhadd_u8⚠neonRounding halving add
- vrhadd_u16⚠neonRounding halving add
- vrhadd_u32⚠neonRounding halving add
- vrhaddq_s8⚠neonRounding halving add
- vrhaddq_s16⚠neonRounding halving add
- vrhaddq_s32⚠neonRounding halving add
- vrhaddq_u8⚠neonRounding halving add
- vrhaddq_u16⚠neonRounding halving add
- vrhaddq_u32⚠neonRounding halving add
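Rounding halving add computes `(a + b + 1) >> 1` per lane, i.e. an average with round-half-up and no intermediate overflow. A one-lane model (hypothetical helper name):

```rust
// Model of vrhadd_u8 on a single lane: widen, add, add the rounding
// bit, then halve; the u16 intermediate cannot overflow.
fn rhadd_u8(a: u8, b: u8) -> u8 {
    ((a as u16 + b as u16 + 1) >> 1) as u8
}

fn main() {
    assert_eq!(rhadd_u8(1, 2), 2);       // (1 + 2 + 1) >> 1
    assert_eq!(rhadd_u8(255, 255), 255); // wide sum avoids overflow
}
```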
- vrnd_f32⚠neonFloating-point round to integral, toward zero
- vrnd_f64⚠neonFloating-point round to integral, toward zero
- vrnda_f32⚠neonFloating-point round to integral, to nearest with ties to away
- vrnda_f64⚠neonFloating-point round to integral, to nearest with ties to away
- vrndaq_f32⚠neonFloating-point round to integral, to nearest with ties to away
- vrndaq_f64⚠neonFloating-point round to integral, to nearest with ties to away
- vrndi_f32⚠neonFloating-point round to integral, using current rounding mode
- vrndi_f64⚠neonFloating-point round to integral, using current rounding mode
- vrndiq_f32⚠neonFloating-point round to integral, using current rounding mode
- vrndiq_f64⚠neonFloating-point round to integral, using current rounding mode
- vrndm_f32⚠neonFloating-point round to integral, toward minus infinity
- vrndm_f64⚠neonFloating-point round to integral, toward minus infinity
- vrndmq_f32⚠neonFloating-point round to integral, toward minus infinity
- vrndmq_f64⚠neonFloating-point round to integral, toward minus infinity
- vrndn_f32⚠neonFloating-point round to integral, to nearest with ties to even
- vrndn_f64⚠neonFloating-point round to integral, to nearest with ties to even
- vrndnq_f32⚠neonFloating-point round to integral, to nearest with ties to even
- vrndnq_f64⚠neonFloating-point round to integral, to nearest with ties to even
- vrndns_f32⚠neonFloating-point round to integral, to nearest with ties to even
- vrndp_f32⚠neonFloating-point round to integral, toward plus infinity
- vrndp_f64⚠neonFloating-point round to integral, toward plus infinity
- vrndpq_f32⚠neonFloating-point round to integral, toward plus infinity
- vrndpq_f64⚠neonFloating-point round to integral, toward plus infinity
- vrndq_f32⚠neonFloating-point round to integral, toward zero
- vrndq_f64⚠neonFloating-point round to integral, toward zero
- vrndx_f32⚠neonFloating-point round to integral exact, using current rounding mode
- vrndx_f64⚠neonFloating-point round to integral exact, using current rounding mode
- vrndxq_f32⚠neonFloating-point round to integral exact, using current rounding mode
- vrndxq_f64⚠neonFloating-point round to integral exact, using current rounding mode
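The `vrnd*` variants differ only in rounding mode: `vrnd` toward zero, `vrndm` toward minus infinity, `vrndp` toward plus infinity, `vrnda` to nearest with ties away, `vrndn` to nearest with ties to even. Ties-to-even is the subtle case; a scalar model of one lane (hypothetical helper name, written out because `f32::round` rounds ties away from zero):

```rust
// Model of vrndn_f32's per-lane rounding: to nearest, ties to even.
fn round_ties_even(x: f32) -> f32 {
    let t = x.trunc();
    if (x - t).abs() == 0.5 {
        // Halfway case: pick the neighbor with an even integer value.
        if (t as i64) % 2 == 0 { t } else { t + x.signum() }
    } else {
        x.round()
    }
}

fn main() {
    assert_eq!(round_ties_even(2.5), 2.0); // tie resolved to the even side
    assert_eq!(round_ties_even(3.5), 4.0);
    assert_eq!(round_ties_even(-2.5), -2.0);
    assert_eq!(round_ties_even(2.4), 2.0);
}
```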
- vrshl_s8⚠neonSigned rounding shift left
- vrshl_s16⚠neonSigned rounding shift left
- vrshl_s32⚠neonSigned rounding shift left
- vrshl_s64⚠neonSigned rounding shift left
- vrshl_u8⚠neonUnsigned rounding shift left
- vrshl_u16⚠neonUnsigned rounding shift left
- vrshl_u32⚠neonUnsigned rounding shift left
- vrshl_u64⚠neonUnsigned rounding shift left
- vrshld_s64⚠neonSigned rounding shift left
- vrshld_u64⚠neonUnsigned rounding shift left
- vrshlq_s8⚠neonSigned rounding shift left
- vrshlq_s16⚠neonSigned rounding shift left
- vrshlq_s32⚠neonSigned rounding shift left
- vrshlq_s64⚠neonSigned rounding shift left
- vrshlq_u8⚠neonUnsigned rounding shift left
- vrshlq_u16⚠neonUnsigned rounding shift left
- vrshlq_u32⚠neonUnsigned rounding shift left
- vrshlq_u64⚠neonUnsigned rounding shift left
- vrshr_n_s8⚠neonSigned rounding shift right
- vrshr_n_s16⚠neonSigned rounding shift right
- vrshr_n_s32⚠neonSigned rounding shift right
- vrshr_n_s64⚠neonSigned rounding shift right
- vrshr_n_u8⚠neonUnsigned rounding shift right
- vrshr_n_u16⚠neonUnsigned rounding shift right
- vrshr_n_u32⚠neonUnsigned rounding shift right
- vrshr_n_u64⚠neonUnsigned rounding shift right
- vrshrd_n_s64⚠neonSigned rounding shift right
- vrshrd_n_u64⚠neonUnsigned rounding shift right
- vrshrn_high_n_s16⚠neonRounding shift right narrow
- vrshrn_high_n_s32⚠neonRounding shift right narrow
- vrshrn_high_n_s64⚠neonRounding shift right narrow
- vrshrn_high_n_u16⚠neonRounding shift right narrow
- vrshrn_high_n_u32⚠neonRounding shift right narrow
- vrshrn_high_n_u64⚠neonRounding shift right narrow
- vrshrn_n_s16⚠neonRounding shift right narrow
- vrshrn_n_s32⚠neonRounding shift right narrow
- vrshrn_n_s64⚠neonRounding shift right narrow
- vrshrn_n_u16⚠neonRounding shift right narrow
- vrshrn_n_u32⚠neonRounding shift right narrow
- vrshrn_n_u64⚠neonRounding shift right narrow
- vrshrq_n_s8⚠neonSigned rounding shift right
- vrshrq_n_s16⚠neonSigned rounding shift right
- vrshrq_n_s32⚠neonSigned rounding shift right
- vrshrq_n_s64⚠neonSigned rounding shift right
- vrshrq_n_u8⚠neonUnsigned rounding shift right
- vrshrq_n_u16⚠neonUnsigned rounding shift right
- vrshrq_n_u32⚠neonUnsigned rounding shift right
- vrshrq_n_u64⚠neonUnsigned rounding shift right
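Rounding right shifts (`vrshr_n_*`, and `vrshl` when its per-lane count is negative) add the rounding constant 2^(N-1) before shifting, so each lane rounds to nearest rather than truncating toward minus infinity. A one-lane model (hypothetical helper name):

```rust
// Model of vrshr_n_s32 on a single lane: add 2^(N-1), then shift
// right arithmetically by N; the i64 intermediate avoids overflow.
fn rshr_n_s32(a: i32, n: u32) -> i32 {
    assert!((1..=32).contains(&n));
    ((a as i64 + (1i64 << (n - 1))) >> n) as i32
}

fn main() {
    assert_eq!(rshr_n_s32(7, 2), 2);   // 7/4 = 1.75 rounds to 2
    assert_eq!(7 >> 2, 1);             // a plain shift truncates
    assert_eq!(rshr_n_s32(-7, 2), -2); // -1.75 rounds to -2
}
```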
- vrsqrte_f32⚠neonReciprocal square-root estimate.
- vrsqrte_f64⚠neonReciprocal square-root estimate.
- vrsqrte_u32⚠neonUnsigned reciprocal square root estimate
- vrsqrted_f64⚠neonReciprocal square-root estimate.
- vrsqrteq_f32⚠neonReciprocal square-root estimate.
- vrsqrteq_f64⚠neonReciprocal square-root estimate.
- vrsqrteq_u32⚠neonUnsigned reciprocal square root estimate
- vrsqrtes_f32⚠neonReciprocal square-root estimate.
- vrsqrts_f32⚠neonFloating-point reciprocal square root step
- vrsqrts_f64⚠neonFloating-point reciprocal square root step
- vrsqrtsd_f64⚠neonFloating-point reciprocal square root step
- vrsqrtsq_f32⚠neonFloating-point reciprocal square root step
- vrsqrtsq_f64⚠neonFloating-point reciprocal square root step
- vrsqrtss_f32⚠neonFloating-point reciprocal square root step
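`vrsqrts` computes the Newton-Raphson step value `(3 - a*b) / 2`; combined with the `vrsqrte` estimate it refines an approximation of `1/sqrt(d)`. A scalar sketch of that iteration (hypothetical helper name, with a fixed initial guess standing in for the hardware estimate):

```rust
// Model of vrsqrts_f32 on a single lane: the FRSQRTS step value.
fn rsqrts(a: f32, b: f32) -> f32 {
    (3.0 - a * b) / 2.0
}

fn main() {
    assert_eq!(rsqrts(1.0, 1.0), 1.0); // a correct estimate is a fixed point

    // Refine a crude estimate of 1/sqrt(4.0) = 0.5.
    let d = 4.0f32;
    let mut x = 0.4f32; // vrsqrte would supply this initial estimate
    for _ in 0..4 {
        x *= rsqrts(d * x, x); // x <- x * (3 - d*x*x) / 2
    }
    assert!((x - 0.5).abs() < 1e-5);
}
```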
- vrsra_n_s8⚠neonSigned rounding shift right and accumulate
- vrsra_n_s16⚠neonSigned rounding shift right and accumulate
- vrsra_n_s32⚠neonSigned rounding shift right and accumulate
- vrsra_n_s64⚠neonSigned rounding shift right and accumulate
- vrsra_n_u8⚠neonUnsigned rounding shift right and accumulate
- vrsra_n_u16⚠neonUnsigned rounding shift right and accumulate
- vrsra_n_u32⚠neonUnsigned rounding shift right and accumulate
- vrsra_n_u64⚠neonUnsigned rounding shift right and accumulate
- vrsrad_n_s64⚠neonSigned rounding shift right and accumulate.
- vrsrad_n_u64⚠neonUnsigned rounding shift right and accumulate.
- vrsraq_n_s8⚠neonSigned rounding shift right and accumulate
- vrsraq_n_s16⚠neonSigned rounding shift right and accumulate
- vrsraq_n_s32⚠neonSigned rounding shift right and accumulate
- vrsraq_n_s64⚠neonSigned rounding shift right and accumulate
- vrsraq_n_u8⚠neonUnsigned rounding shift right and accumulate
- vrsraq_n_u16⚠neonUnsigned rounding shift right and accumulate
- vrsraq_n_u32⚠neonUnsigned rounding shift right and accumulate
- vrsraq_n_u64⚠neonUnsigned rounding shift right and accumulate
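`vrsra_n` fuses the rounding right shift with an accumulate: each result lane is `a + ((b + 2^(N-1)) >> N)`. A one-lane model (hypothetical helper name):

```rust
// Model of vrsra_n_u8 on a single lane: rounding-shift b right by N,
// then add it to the accumulator a (wrapping, as the hardware does).
fn rsra_n_u8(a: u8, b: u8, n: u32) -> u8 {
    assert!((1..=8).contains(&n));
    a.wrapping_add((((b as u16) + (1 << (n - 1))) >> n) as u8)
}

fn main() {
    assert_eq!(rsra_n_u8(1, 7, 2), 3);     // 1 + round(7/4)
    assert_eq!(rsra_n_u8(0, 255, 1), 128); // 0 + round(255/2)
}
```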
- vrsubhn_high_s16⚠neonRounding subtract returning high narrow
- vrsubhn_high_s32⚠neonRounding subtract returning high narrow
- vrsubhn_high_s64⚠neonRounding subtract returning high narrow
- vrsubhn_high_u16⚠neonRounding subtract returning high narrow
- vrsubhn_high_u32⚠neonRounding subtract returning high narrow
- vrsubhn_high_u64⚠neonRounding subtract returning high narrow
- vrsubhn_s16⚠neonRounding subtract returning high narrow
- vrsubhn_s32⚠neonRounding subtract returning high narrow
- vrsubhn_s64⚠neonRounding subtract returning high narrow
- vrsubhn_u16⚠neonRounding subtract returning high narrow
- vrsubhn_u32⚠neonRounding subtract returning high narrow
- vrsubhn_u64⚠neonRounding subtract returning high narrow
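`vrsubhn` subtracts, rounds, and keeps the high half of each lane: for 16-bit inputs the result lane is `(a - b + 2^7) >> 8`, narrowed to 8 bits. A one-lane model (hypothetical helper name):

```rust
// Model of vrsubhn_s16 on a single lane: rounded difference, high half.
fn rsubhn_s16(a: i16, b: i16) -> i8 {
    ((a as i32 - b as i32 + (1 << 7)) >> 8) as i8
}

fn main() {
    assert_eq!(rsubhn_s16(1000, 200), 3); // 800/256 = 3.125 rounds to 3
    assert_eq!(rsubhn_s16(0, -256), 1);   // 256/256 = 1
}
```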
- vset_lane_f32⚠neonInsert vector element from another vector element
- vset_lane_f64⚠neonInsert vector element from another vector element
- vset_lane_p8⚠neonInsert vector element from another vector element
- vset_lane_p16⚠neonInsert vector element from another vector element
- vset_lane_p64⚠neon,aesInsert vector element from another vector element
- vset_lane_s8⚠neonInsert vector element from another vector element
- vset_lane_s16⚠neonInsert vector element from another vector element
- vset_lane_s32⚠neonInsert vector element from another vector element
- vset_lane_s64⚠neonInsert vector element from another vector element
- vset_lane_u8⚠neonInsert vector element from another vector element
- vset_lane_u16⚠neonInsert vector element from another vector element
- vset_lane_u32⚠neonInsert vector element from another vector element
- vset_lane_u64⚠neonInsert vector element from another vector element
- vsetq_lane_f32⚠neonInsert vector element from another vector element
- vsetq_lane_f64⚠neonInsert vector element from another vector element
- vsetq_lane_p8⚠neonInsert vector element from another vector element
- vsetq_lane_p16⚠neonInsert vector element from another vector element
- vsetq_lane_p64⚠neon,aesInsert vector element from another vector element
- vsetq_lane_s8⚠neonInsert vector element from another vector element
- vsetq_lane_s16⚠neonInsert vector element from another vector element
- vsetq_lane_s32⚠neonInsert vector element from another vector element
- vsetq_lane_s64⚠neonInsert vector element from another vector element
- vsetq_lane_u8⚠neonInsert vector element from another vector element
- vsetq_lane_u16⚠neonInsert vector element from another vector element
- vsetq_lane_u32⚠neonInsert vector element from another vector element
- vsetq_lane_u64⚠neonInsert vector element from another vector element
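`vset_lane`/`vsetq_lane` replace a single lane of a vector with a scalar, leaving the other lanes intact; in the real intrinsics the lane index is a compile-time constant. A scalar model (hypothetical helper name):

```rust
// Model of vset_lane_u8: copy the vector with one lane overwritten.
fn set_lane_u8(value: u8, v: [u8; 8], lane: usize) -> [u8; 8] {
    let mut out = v;
    out[lane] = value;
    out
}

fn main() {
    assert_eq!(set_lane_u8(9, [0; 8], 3), [0, 0, 0, 9, 0, 0, 0, 0]);
}
```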
- vsha1cq_u32⚠sha2SHA1 hash update accelerator, choose.
- vsha1h_u32⚠sha2SHA1 fixed rotate.
- vsha1mq_u32⚠sha2SHA1 hash update accelerator, majority.
- vsha1pq_u32⚠sha2SHA1 hash update accelerator, parity.
- vsha1su0q_u32⚠sha2SHA1 schedule update accelerator, first part.
- vsha1su1q_u32⚠sha2SHA1 schedule update accelerator, second part.
- vsha256h2q_u32⚠sha2SHA256 hash update accelerator, upper part.
- vsha256hq_u32⚠sha2SHA256 hash update accelerator.
- vsha256su0q_u32⚠sha2SHA256 schedule update accelerator, first part.
- vsha256su1q_u32⚠sha2SHA256 schedule update accelerator, second part.
- vsha512h2q_u64⚠neon,sha3SHA512 hash update part 2
- vsha512hq_u64⚠neon,sha3SHA512 hash update part 1
- vsha512su0q_u64⚠neon,sha3SHA512 schedule update 0
- vsha512su1q_u64⚠neon,sha3SHA512 schedule update 1
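Most of the SHA accelerators above combine whole hash-state updates and have no short scalar equivalent, but `vsha1h_u32` (SHA1 fixed rotate) is exactly a 30-bit left rotation of one 32-bit word (hypothetical helper name):

```rust
// Model of vsha1h_u32: the SHA1 fixed rotate, a left rotation by 30.
fn sha1h(x: u32) -> u32 {
    x.rotate_left(30)
}

fn main() {
    assert_eq!(sha1h(1), 0x4000_0000);
    assert_eq!(sha1h(0x4000_0000), 0x1000_0000);
}
```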
- vshl_n_s8⚠neonShift left
- vshl_n_s16⚠neonShift left
- vshl_n_s32⚠neonShift left
- vshl_n_s64⚠neonShift left
- vshl_n_u8⚠neonShift left
- vshl_n_u16⚠neonShift left
- vshl_n_u32⚠neonShift left
- vshl_n_u64⚠neonShift left
- vshl_s8⚠neonSigned Shift left
- vshl_s16⚠neonSigned Shift left
- vshl_s32⚠neonSigned Shift left
- vshl_s64⚠neonSigned Shift left
- vshl_u8⚠neonUnsigned Shift left
- vshl_u16⚠neonUnsigned Shift left
- vshl_u32⚠neonUnsigned Shift left
- vshl_u64⚠neonUnsigned Shift left
- vshld_n_s64⚠neonShift left
- vshld_n_u64⚠neonShift left
- vshld_s64⚠neonSigned Shift left
- vshld_u64⚠neonUnsigned Shift left
- vshll_high_n_s8⚠neonSigned shift left long
- vshll_high_n_s16⚠neonSigned shift left long
- vshll_high_n_s32⚠neonSigned shift left long
- vshll_high_n_u8⚠neonUnsigned shift left long
- vshll_high_n_u16⚠neonUnsigned shift left long
- vshll_high_n_u32⚠neonUnsigned shift left long
- vshll_n_s8⚠neonSigned shift left long
- vshll_n_s16⚠neonSigned shift left long
- vshll_n_s32⚠neonSigned shift left long
- vshll_n_u8⚠neonUnsigned shift left long
- vshll_n_u16⚠neonUnsigned shift left long
- vshll_n_u32⚠neonUnsigned shift left long
- vshlq_n_s8⚠neonShift left
- vshlq_n_s16⚠neonShift left
- vshlq_n_s32⚠neonShift left
- vshlq_n_s64⚠neonShift left
- vshlq_n_u8⚠neonShift left
- vshlq_n_u16⚠neonShift left
- vshlq_n_u32⚠neonShift left
- vshlq_n_u64⚠neonShift left
- vshlq_s8⚠neonSigned Shift left
- vshlq_s16⚠neonSigned Shift left
- vshlq_s32⚠neonSigned Shift left
- vshlq_s64⚠neonSigned Shift left
- vshlq_u8⚠neonUnsigned Shift left
- vshlq_u16⚠neonUnsigned Shift left
- vshlq_u32⚠neonUnsigned Shift left
- vshlq_u64⚠neonUnsigned Shift left
- vshr_n_s8⚠neonShift right
- vshr_n_s16⚠neonShift right
- vshr_n_s32⚠neonShift right
- vshr_n_s64⚠neonShift right
- vshr_n_u8⚠neonShift right
- vshr_n_u16⚠neonShift right
- vshr_n_u32⚠neonShift right
- vshr_n_u64⚠neonShift right
- vshrd_n_s64⚠neonSigned shift right
- vshrd_n_u64⚠neonUnsigned shift right
- vshrn_high_n_s16⚠neonShift right narrow
- vshrn_high_n_s32⚠neonShift right narrow
- vshrn_high_n_s64⚠neonShift right narrow
- vshrn_high_n_u16⚠neonShift right narrow
- vshrn_high_n_u32⚠neonShift right narrow
- vshrn_high_n_u64⚠neonShift right narrow
- vshrn_n_s16⚠neonShift right narrow
- vshrn_n_s32⚠neonShift right narrow
- vshrn_n_s64⚠neonShift right narrow
- vshrn_n_u16⚠neonShift right narrow
- vshrn_n_u32⚠neonShift right narrow
- vshrn_n_u64⚠neonShift right narrow
- vshrq_n_s8⚠neonShift right
- vshrq_n_s16⚠neonShift right
- vshrq_n_s32⚠neonShift right
- vshrq_n_s64⚠neonShift right
- vshrq_n_u8⚠neonShift right
- vshrq_n_u16⚠neonShift right
- vshrq_n_u32⚠neonShift right
- vshrq_n_u64⚠neonShift right
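Two shift flavors appear above: the `_n` forms take an immediate count, while forms like `vshl_s8` take a per-lane signed count, where a negative count shifts right. A one-lane model of the register form (hypothetical helper name; out-of-range counts behave differently in hardware and are simply clamped here):

```rust
// Model of vshl_s8 on a single lane: signed count, negative = right.
fn shl_s8(a: i8, shift: i8) -> i8 {
    if shift >= 0 {
        ((a as i32) << (shift as u32).min(7)) as i8 // excess bits drop off
    } else {
        (a as i32 >> ((-(shift as i32)) as u32).min(7)) as i8
    }
}

fn main() {
    assert_eq!(shl_s8(1, 3), 8);
    assert_eq!(shl_s8(16, -2), 4);   // negative count shifts right
    assert_eq!(shl_s8(-16, -2), -4); // arithmetic (sign-preserving)
}
```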
- vsli_n_p8⚠neonShift Left and Insert (immediate)
- vsli_n_p16⚠neonShift Left and Insert (immediate)
- vsli_n_p64⚠neon,aesShift Left and Insert (immediate)
- vsli_n_s8⚠neonShift Left and Insert (immediate)
- vsli_n_s16⚠neonShift Left and Insert (immediate)
- vsli_n_s32⚠neonShift Left and Insert (immediate)
- vsli_n_s64⚠neonShift Left and Insert (immediate)
- vsli_n_u8⚠neonShift Left and Insert (immediate)
- vsli_n_u16⚠neonShift Left and Insert (immediate)
- vsli_n_u32⚠neonShift Left and Insert (immediate)
- vsli_n_u64⚠neonShift Left and Insert (immediate)
- vslid_n_s64⚠neonShift left and insert
- vslid_n_u64⚠neonShift left and insert
- vsliq_n_p8⚠neonShift Left and Insert (immediate)
- vsliq_n_p16⚠neonShift Left and Insert (immediate)
- vsliq_n_p64⚠neon,aesShift Left and Insert (immediate)
- vsliq_n_s8⚠neonShift Left and Insert (immediate)
- vsliq_n_s16⚠neonShift Left and Insert (immediate)
- vsliq_n_s32⚠neonShift Left and Insert (immediate)
- vsliq_n_s64⚠neonShift Left and Insert (immediate)
- vsliq_n_u8⚠neonShift Left and Insert (immediate)
- vsliq_n_u16⚠neonShift Left and Insert (immediate)
- vsliq_n_u32⚠neonShift Left and Insert (immediate)
- vsliq_n_u64⚠neonShift Left and Insert (immediate)
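Shift Left and Insert (`vsli_n`) shifts the second operand left by N and inserts it into the first, preserving only the low N bits of the destination. A one-lane model (hypothetical helper name):

```rust
// Model of vsli_n_u8 on a single lane:
// result = (b << n) | (low n bits of a).
fn sli_n_u8(a: u8, b: u8, n: u32) -> u8 {
    assert!(n < 8);
    let low_mask = ((1u16 << n) - 1) as u8;
    (b << n) | (a & low_mask)
}

fn main() {
    assert_eq!(sli_n_u8(0xff, 0x01, 4), 0x1f); // 0x10 over a's low 0x0f
    assert_eq!(sli_n_u8(0x03, 0x01, 1), 0x03); // 0b10 | 0b1
}
```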
- vsqadd_u8⚠neonUnsigned saturating Accumulate of Signed value.
- vsqadd_u16⚠neonUnsigned saturating Accumulate of Signed value.
- vsqadd_u32⚠neonUnsigned saturating Accumulate of Signed value.
- vsqadd_u64⚠neonUnsigned saturating Accumulate of Signed value.
- vsqaddb_u8⚠neonUnsigned saturating accumulate of signed value
- vsqaddd_u64⚠neonUnsigned saturating accumulate of signed value
- vsqaddh_u16⚠neonUnsigned saturating accumulate of signed value
- vsqaddq_u8⚠neonUnsigned saturating Accumulate of Signed value.
- vsqaddq_u16⚠neonUnsigned saturating Accumulate of Signed value.
- vsqaddq_u32⚠neonUnsigned saturating Accumulate of Signed value.
- vsqaddq_u64⚠neonUnsigned saturating Accumulate of Signed value.
- vsqadds_u32⚠neonUnsigned saturating accumulate of signed value
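`vsqadd` (USQADD) adds a *signed* value to an unsigned accumulator and saturates the result to the unsigned range, which neither wrapping nor same-signedness saturating adds can express. A one-lane model (hypothetical helper name):

```rust
// Model of vsqadd_u8 on a single lane: signed addend, unsigned
// saturating result.
fn sqadd_u8(a: u8, b: i8) -> u8 {
    (a as i16 + b as i16).clamp(0, u8::MAX as i16) as u8
}

fn main() {
    assert_eq!(sqadd_u8(10, -20), 0);   // saturates at zero
    assert_eq!(sqadd_u8(250, 10), 255); // saturates at u8::MAX
    assert_eq!(sqadd_u8(100, -50), 50);
}
```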
- vsqrt_f32⚠neonCalculates the square root of each lane.
- vsqrt_f64⚠neonCalculates the square root of each lane.
- vsqrtq_f32⚠neonCalculates the square root of each lane.
- vsqrtq_f64⚠neonCalculates the square root of each lane.
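`vsqrt` is a straightforward per-lane IEEE square root; a scalar model of the two-lane form (hypothetical helper name):

```rust
// Model of vsqrt_f32: square root applied independently to each lane.
fn sqrt_f32x2(v: [f32; 2]) -> [f32; 2] {
    [v[0].sqrt(), v[1].sqrt()]
}

fn main() {
    assert_eq!(sqrt_f32x2([4.0, 9.0]), [2.0, 3.0]);
}
```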
- vsra_n_s8⚠neonSigned shift right and accumulate
- vsra_n_s16⚠neonSigned shift right and accumulate
- vsra_n_s32⚠neonSigned shift right and accumulate
- vsra_n_s64⚠neonSigned shift right and accumulate
- vsra_n_u8⚠neonUnsigned shift right and accumulate
- vsra_n_u16⚠neonUnsigned shift right and accumulate
- vsra_n_u32⚠neonUnsigned shift right and accumulate
- vsra_n_u64⚠neonUnsigned shift right and accumulate
- vsrad_n_s64⚠neonSigned shift right and accumulate
- vsrad_n_u64⚠neonUnsigned shift right and accumulate
- vsraq_n_s8⚠neonSigned shift right and accumulate
- vsraq_n_s16⚠neonSigned shift right and accumulate
- vsraq_n_s32⚠neonSigned shift right and accumulate
- vsraq_n_s64⚠neonSigned shift right and accumulate
- vsraq_n_u8⚠neonUnsigned shift right and accumulate
- vsraq_n_u16⚠neonUnsigned shift right and accumulate
- vsraq_n_u32⚠neonUnsigned shift right and accumulate
- vsraq_n_u64⚠neonUnsigned shift right and accumulate
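`vsra_n` is the non-rounding counterpart of `vrsra_n`: shift right by N (truncating), then accumulate. A one-lane model (hypothetical helper name):

```rust
// Model of vsra_n_u8 on a single lane: a + (b >> n), wrapping on add.
fn sra_n_u8(a: u8, b: u8, n: u32) -> u8 {
    assert!((1..=8).contains(&n));
    a.wrapping_add(((b as u16) >> n) as u8)
}

fn main() {
    assert_eq!(sra_n_u8(1, 16, 2), 5);     // 1 + 4
    assert_eq!(sra_n_u8(200, 255, 1), 71); // 200 + 127 wraps past 255
}
```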
- vsri_n_p8⚠neonShift Right and Insert (immediate)
- vsri_n_p16⚠neonShift Right and Insert (immediate)
- vsri_n_p64⚠neon,aesShift Right and Insert (immediate)
- vsri_n_s8⚠neonShift Right and Insert (immediate)
- vsri_n_s16⚠neonShift Right and Insert (immediate)
- vsri_n_s32⚠neonShift Right and Insert (immediate)
- vsri_n_s64⚠neonShift Right and Insert (immediate)
- vsri_n_u8⚠neonShift Right and Insert (immediate)
- vsri_n_u16⚠neonShift Right and Insert (immediate)
- vsri_n_u32⚠neonShift Right and Insert (immediate)
- vsri_n_u64⚠neonShift Right and Insert (immediate)
- vsrid_n_s64⚠neonShift right and insert
- vsrid_n_u64⚠neonShift right and insert
- vsriq_n_p8⚠neonShift Right and Insert (immediate)
- vsriq_n_p16⚠neonShift Right and Insert (immediate)
- vsriq_n_p64⚠neon,aesShift Right and Insert (immediate)
- vsriq_n_s8⚠neonShift Right and Insert (immediate)
- vsriq_n_s16⚠neonShift Right and Insert (immediate)
- vsriq_n_s32⚠neonShift Right and Insert (immediate)
- vsriq_n_s64⚠neonShift Right and Insert (immediate)
- vsriq_n_u8⚠neonShift Right and Insert (immediate)
- vsriq_n_u16⚠neonShift Right and Insert (immediate)
- vsriq_n_u32⚠neonShift Right and Insert (immediate)
- vsriq_n_u64⚠neonShift Right and Insert (immediate)
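Shift Right and Insert (`vsri_n`) mirrors `vsli_n`: shift the second operand right by N and insert it into the first, preserving only the high N bits of the destination. A one-lane model (hypothetical helper name):

```rust
// Model of vsri_n_u8 on a single lane:
// result = (b >> n) | (high n bits of a).
fn sri_n_u8(a: u8, b: u8, n: u32) -> u8 {
    assert!((1..=8).contains(&n));
    let high_mask = (!(0x00ffu16 >> n)) as u8;
    ((b as u16 >> n) as u8) | (a & high_mask)
}

fn main() {
    assert_eq!(sri_n_u8(0xff, 0x80, 4), 0xf8); // 0x08 under a's high 0xf0
    assert_eq!(sri_n_u8(0x0f, 0xf0, 8), 0x0f); // n = 8 leaves a unchanged
}
```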
- vst1_f32⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_f32_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_f32_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_f32_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_f64⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_f64_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_f64_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_f64_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_f32⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_f64⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_p8⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_p16⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_p64⚠neon,aesStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_s8⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_s16⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_s32⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_s64⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_u8⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_u16⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_u32⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_lane_u64⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_p8⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_p8_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_p8_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_p8_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_p16⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_p16_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_p16_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_p16_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_p64⚠neon,aesStore multiple single-element structures from one, two, three, or four registers.
- vst1_p64_x2⚠neon,aesStore multiple single-element structures from one, two, three, or four registers
- vst1_p64_x3⚠neon,aesStore multiple single-element structures from one, two, three, or four registers
- vst1_p64_x4⚠neon,aesStore multiple single-element structures from one, two, three, or four registers
- vst1_s8⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_s8_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s8_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s8_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s16⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_s16_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s16_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s16_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s32⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_s32_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s32_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s32_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s64⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_s64_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s64_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_s64_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u8⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_u8_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u8_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u8_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u16⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_u16_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u16_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u16_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u32⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_u32_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u32_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u32_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u64⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1_u64_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u64_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1_u64_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_f32⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_f32_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_f32_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_f32_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_f64⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_f64_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_f64_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_f64_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_f32⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_f64⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_p8⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_p16⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_p64⚠neon,aesStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_s8⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_s16⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_s32⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_s64⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_u8⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_u16⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_u32⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_lane_u64⚠neonStore multiple single-element structures from one, two, three, or four registers
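The `vst1q_lane_*` family stores a single chosen lane of a vector to memory. A portable scalar sketch of that semantics (the helper name `st1_lane` is illustrative, not part of the API; the real intrinsics compile to an `ST1 {Vt.<T>}[lane], [Xn]` instruction):

```rust
// Sketch of vst1q_lane semantics: write exactly one lane of a vector
// (selected by a const LANE parameter) to a memory location.
fn st1_lane<T: Copy, const LANE: usize>(dst: &mut T, v: &[T]) {
    *dst = v[LANE];
}

fn main() {
    let v = [10u32, 20, 30, 40]; // models a uint32x4_t
    let mut out = 0u32;
    st1_lane::<u32, 2>(&mut out, &v);
    assert_eq!(out, 30); // lane 2 was stored
}
```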
- vst1q_p8⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_p8_x2⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_p8_x3⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_p8_x4⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_p16⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_p16_x2⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_p16_x3⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_p16_x4⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_p64⚠neon,aesStore multiple single-element structures from one, two, three, or four registers.
- vst1q_p64_x2⚠neon,aesStore multiple single-element structures to one, two, three, or four registers
- vst1q_p64_x3⚠neon,aesStore multiple single-element structures to one, two, three, or four registers
- vst1q_p64_x4⚠neon,aesStore multiple single-element structures to one, two, three, or four registers
- vst1q_s8⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_s8_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s8_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s8_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s16⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_s16_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s16_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s16_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s32⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_s32_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s32_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s32_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s64⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_s64_x2⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s64_x3⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_s64_x4⚠neonStore multiple single-element structures from one, two, three, or four registers
- vst1q_u8⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_u8_x2⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u8_x3⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u8_x4⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u16⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_u16_x2⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u16_x3⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u16_x4⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u32⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_u32_x2⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u32_x3⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u32_x4⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u64⚠neonStore multiple single-element structures from one, two, three, or four registers.
- vst1q_u64_x2⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u64_x3⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst1q_u64_x4⚠neonStore multiple single-element structures to one, two, three, or four registers
- vst2_f32⚠neonStore multiple 2-element structures from two registers
- vst2_f64⚠neonStore multiple 2-element structures from two registers
- vst2_lane_f32⚠neonStore multiple 2-element structures from two registers
- vst2_lane_f64⚠neonStore multiple 2-element structures from two registers
- vst2_lane_p8⚠neonStore multiple 2-element structures from two registers
- vst2_lane_p16⚠neonStore multiple 2-element structures from two registers
- vst2_lane_p64⚠neon,aesStore multiple 2-element structures from two registers
- vst2_lane_s8⚠neonStore multiple 2-element structures from two registers
- vst2_lane_s16⚠neonStore multiple 2-element structures from two registers
- vst2_lane_s32⚠neonStore multiple 2-element structures from two registers
- vst2_lane_s64⚠neonStore multiple 2-element structures from two registers
- vst2_lane_u8⚠neonStore multiple 2-element structures from two registers
- vst2_lane_u16⚠neonStore multiple 2-element structures from two registers
- vst2_lane_u32⚠neonStore multiple 2-element structures from two registers
- vst2_lane_u64⚠neonStore multiple 2-element structures from two registers
- vst2_p8⚠neonStore multiple 2-element structures from two registers
- vst2_p16⚠neonStore multiple 2-element structures from two registers
- vst2_p64⚠neon,aesStore multiple 2-element structures from two registers
- vst2_s8⚠neonStore multiple 2-element structures from two registers
- vst2_s16⚠neonStore multiple 2-element structures from two registers
- vst2_s32⚠neonStore multiple 2-element structures from two registers
- vst2_s64⚠neonStore multiple 2-element structures from two registers
- vst2_u8⚠neonStore multiple 2-element structures from two registers
- vst2_u16⚠neonStore multiple 2-element structures from two registers
- vst2_u32⚠neonStore multiple 2-element structures from two registers
- vst2_u64⚠neonStore multiple 2-element structures from two registers
- vst2q_f32⚠neonStore multiple 2-element structures from two registers
- vst2q_f64⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_f32⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_f64⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_p8⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_p16⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_p64⚠neon,aesStore multiple 2-element structures from two registers
- vst2q_lane_s8⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_s16⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_s32⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_s64⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_u8⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_u16⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_u32⚠neonStore multiple 2-element structures from two registers
- vst2q_lane_u64⚠neonStore multiple 2-element structures from two registers
- vst2q_p8⚠neonStore multiple 2-element structures from two registers
- vst2q_p16⚠neonStore multiple 2-element structures from two registers
- vst2q_p64⚠neon,aesStore multiple 2-element structures from two registers
- vst2q_s8⚠neonStore multiple 2-element structures from two registers
- vst2q_s16⚠neonStore multiple 2-element structures from two registers
- vst2q_s32⚠neonStore multiple 2-element structures from two registers
- vst2q_s64⚠neonStore multiple 2-element structures from two registers
- vst2q_u8⚠neonStore multiple 2-element structures from two registers
- vst2q_u16⚠neonStore multiple 2-element structures from two registers
- vst2q_u32⚠neonStore multiple 2-element structures from two registers
- vst2q_u64⚠neonStore multiple 2-element structures from two registers
- vst3_f32⚠neonStore multiple 3-element structures from three registers
- vst3_f64⚠neonStore multiple 3-element structures from three registers
- vst3_lane_f32⚠neonStore multiple 3-element structures from three registers
- vst3_lane_f64⚠neonStore multiple 3-element structures from three registers
- vst3_lane_p8⚠neonStore multiple 3-element structures from three registers
- vst3_lane_p16⚠neonStore multiple 3-element structures from three registers
- vst3_lane_p64⚠neon,aesStore multiple 3-element structures from three registers
- vst3_lane_s8⚠neonStore multiple 3-element structures from three registers
- vst3_lane_s16⚠neonStore multiple 3-element structures from three registers
- vst3_lane_s32⚠neonStore multiple 3-element structures from three registers
- vst3_lane_s64⚠neonStore multiple 3-element structures from three registers
- vst3_lane_u8⚠neonStore multiple 3-element structures from three registers
- vst3_lane_u16⚠neonStore multiple 3-element structures from three registers
- vst3_lane_u32⚠neonStore multiple 3-element structures from three registers
- vst3_lane_u64⚠neonStore multiple 3-element structures from three registers
- vst3_p8⚠neonStore multiple 3-element structures from three registers
- vst3_p16⚠neonStore multiple 3-element structures from three registers
- vst3_p64⚠neon,aesStore multiple 3-element structures from three registers
- vst3_s8⚠neonStore multiple 3-element structures from three registers
- vst3_s16⚠neonStore multiple 3-element structures from three registers
- vst3_s32⚠neonStore multiple 3-element structures from three registers
- vst3_s64⚠neonStore multiple 3-element structures from three registers
- vst3_u8⚠neonStore multiple 3-element structures from three registers
- vst3_u16⚠neonStore multiple 3-element structures from three registers
- vst3_u32⚠neonStore multiple 3-element structures from three registers
- vst3_u64⚠neonStore multiple 3-element structures from three registers
- vst3q_f32⚠neonStore multiple 3-element structures from three registers
- vst3q_f64⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_f32⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_f64⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_p8⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_p16⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_p64⚠neon,aesStore multiple 3-element structures from three registers
- vst3q_lane_s8⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_s16⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_s32⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_s64⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_u8⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_u16⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_u32⚠neonStore multiple 3-element structures from three registers
- vst3q_lane_u64⚠neonStore multiple 3-element structures from three registers
- vst3q_p8⚠neonStore multiple 3-element structures from three registers
- vst3q_p16⚠neonStore multiple 3-element structures from three registers
- vst3q_p64⚠neon,aesStore multiple 3-element structures from three registers
- vst3q_s8⚠neonStore multiple 3-element structures from three registers
- vst3q_s16⚠neonStore multiple 3-element structures from three registers
- vst3q_s32⚠neonStore multiple 3-element structures from three registers
- vst3q_s64⚠neonStore multiple 3-element structures from three registers
- vst3q_u8⚠neonStore multiple 3-element structures from three registers
- vst3q_u16⚠neonStore multiple 3-element structures from three registers
- vst3q_u32⚠neonStore multiple 3-element structures from three registers
- vst3q_u64⚠neonStore multiple 3-element structures from three registers
- vst4_f32⚠neonStore multiple 4-element structures from four registers
- vst4_f64⚠neonStore multiple 4-element structures from four registers
- vst4_lane_f32⚠neonStore multiple 4-element structures from four registers
- vst4_lane_f64⚠neonStore multiple 4-element structures from four registers
- vst4_lane_p8⚠neonStore multiple 4-element structures from four registers
- vst4_lane_p16⚠neonStore multiple 4-element structures from four registers
- vst4_lane_p64⚠neon,aesStore multiple 4-element structures from four registers
- vst4_lane_s8⚠neonStore multiple 4-element structures from four registers
- vst4_lane_s16⚠neonStore multiple 4-element structures from four registers
- vst4_lane_s32⚠neonStore multiple 4-element structures from four registers
- vst4_lane_s64⚠neonStore multiple 4-element structures from four registers
- vst4_lane_u8⚠neonStore multiple 4-element structures from four registers
- vst4_lane_u16⚠neonStore multiple 4-element structures from four registers
- vst4_lane_u32⚠neonStore multiple 4-element structures from four registers
- vst4_lane_u64⚠neonStore multiple 4-element structures from four registers
- vst4_p8⚠neonStore multiple 4-element structures from four registers
- vst4_p16⚠neonStore multiple 4-element structures from four registers
- vst4_p64⚠neon,aesStore multiple 4-element structures from four registers
- vst4_s8⚠neonStore multiple 4-element structures from four registers
- vst4_s16⚠neonStore multiple 4-element structures from four registers
- vst4_s32⚠neonStore multiple 4-element structures from four registers
- vst4_s64⚠neonStore multiple 4-element structures from four registers
- vst4_u8⚠neonStore multiple 4-element structures from four registers
- vst4_u16⚠neonStore multiple 4-element structures from four registers
- vst4_u32⚠neonStore multiple 4-element structures from four registers
- vst4_u64⚠neonStore multiple 4-element structures from four registers
- vst4q_f32⚠neonStore multiple 4-element structures from four registers
- vst4q_f64⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_f32⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_f64⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_p8⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_p16⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_p64⚠neon,aesStore multiple 4-element structures from four registers
- vst4q_lane_s8⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_s16⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_s32⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_s64⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_u8⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_u16⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_u32⚠neonStore multiple 4-element structures from four registers
- vst4q_lane_u64⚠neonStore multiple 4-element structures from four registers
- vst4q_p8⚠neonStore multiple 4-element structures from four registers
- vst4q_p16⚠neonStore multiple 4-element structures from four registers
- vst4q_p64⚠neon,aesStore multiple 4-element structures from four registers
- vst4q_s8⚠neonStore multiple 4-element structures from four registers
- vst4q_s16⚠neonStore multiple 4-element structures from four registers
- vst4q_s32⚠neonStore multiple 4-element structures from four registers
- vst4q_s64⚠neonStore multiple 4-element structures from four registers
- vst4q_u8⚠neonStore multiple 4-element structures from four registers
- vst4q_u16⚠neonStore multiple 4-element structures from four registers
- vst4q_u32⚠neonStore multiple 4-element structures from four registers
- vst4q_u64⚠neonStore multiple 4-element structures from four registers
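"Store multiple N-element structures" means the lanes of the N source vectors are interleaved in memory: `vst2` writes `a[0], b[0], a[1], b[1], …` (the `ST2` instruction), and `vst3`/`vst4` generalize this to three- and four-way interleaving. A portable scalar sketch (the helper name `st2` is illustrative):

```rust
// Sketch of vst2's interleaving store: element i of each source vector
// lands at consecutive memory slots, structure by structure.
fn st2(dst: &mut [u8], a: &[u8], b: &[u8]) {
    for i in 0..a.len() {
        dst[2 * i] = a[i];
        dst[2 * i + 1] = b[i];
    }
}

fn main() {
    let a = [1u8, 2, 3, 4];
    let b = [9u8, 8, 7, 6];
    let mut out = [0u8; 8];
    st2(&mut out, &a, &b);
    assert_eq!(out, [1, 9, 2, 8, 3, 7, 4, 6]);
}
```

This layout is why the `vst2`/`vld2` pairs are the idiomatic way to write and read arrays of small structs (e.g. interleaved stereo samples or x/y coordinate pairs).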
- vstrq_p128⚠neonStore SIMD&FP register (immediate offset)
- vsub_f32⚠neonSubtract
- vsub_f64⚠neonSubtract
- vsub_s8⚠neonSubtract
- vsub_s16⚠neonSubtract
- vsub_s32⚠neonSubtract
- vsub_s64⚠neonSubtract
- vsub_u8⚠neonSubtract
- vsub_u16⚠neonSubtract
- vsub_u32⚠neonSubtract
- vsub_u64⚠neonSubtract
- vsubd_s64⚠neonSubtract
- vsubd_u64⚠neonSubtract
- vsubhn_high_s16⚠neonSubtract returning high narrow
- vsubhn_high_s32⚠neonSubtract returning high narrow
- vsubhn_high_s64⚠neonSubtract returning high narrow
- vsubhn_high_u16⚠neonSubtract returning high narrow
- vsubhn_high_u32⚠neonSubtract returning high narrow
- vsubhn_high_u64⚠neonSubtract returning high narrow
- vsubhn_s16⚠neonSubtract returning high narrow
- vsubhn_s32⚠neonSubtract returning high narrow
- vsubhn_s64⚠neonSubtract returning high narrow
- vsubhn_u16⚠neonSubtract returning high narrow
- vsubhn_u32⚠neonSubtract returning high narrow
- vsubhn_u64⚠neonSubtract returning high narrow
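"Subtract returning high narrow" (`vsubhn`) subtracts in the wide element type and keeps only the upper half of each result element. A portable scalar sketch for the `s16 -> s8` case (the helper name is illustrative):

```rust
// Sketch of vsubhn_s16: wrapping 16-bit subtract, then keep the high
// byte of each lane as the narrowed i8 result.
fn subhn_s16(a: &[i16; 8], b: &[i16; 8]) -> [i8; 8] {
    core::array::from_fn(|i| (a[i].wrapping_sub(b[i]) >> 8) as i8)
}

fn main() {
    let a = [0x1234i16; 8];
    let b = [0x0034i16; 8];
    // 0x1234 - 0x0034 = 0x1200; the high half is 0x12.
    assert_eq!(subhn_s16(&a, &b)[0], 0x12);
}
```

The `*_high` variants write the narrowed result into the upper half of a 128-bit destination, so two calls can fill one full vector.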
- vsubl_high_s8⚠neonSigned Subtract Long
- vsubl_high_s16⚠neonSigned Subtract Long
- vsubl_high_s32⚠neonSigned Subtract Long
- vsubl_high_u8⚠neonUnsigned Subtract Long
- vsubl_high_u16⚠neonUnsigned Subtract Long
- vsubl_high_u32⚠neonUnsigned Subtract Long
- vsubl_s8⚠neonSigned Subtract Long
- vsubl_s16⚠neonSigned Subtract Long
- vsubl_s32⚠neonSigned Subtract Long
- vsubl_u8⚠neonUnsigned Subtract Long
- vsubl_u16⚠neonUnsigned Subtract Long
- vsubl_u32⚠neonUnsigned Subtract Long
- vsubq_f32⚠neonSubtract
- vsubq_f64⚠neonSubtract
- vsubq_s8⚠neonSubtract
- vsubq_s16⚠neonSubtract
- vsubq_s32⚠neonSubtract
- vsubq_s64⚠neonSubtract
- vsubq_u8⚠neonSubtract
- vsubq_u16⚠neonSubtract
- vsubq_u32⚠neonSubtract
- vsubq_u64⚠neonSubtract
- vsubw_high_s8⚠neonSigned Subtract Wide
- vsubw_high_s16⚠neonSigned Subtract Wide
- vsubw_high_s32⚠neonSigned Subtract Wide
- vsubw_high_u8⚠neonUnsigned Subtract Wide
- vsubw_high_u16⚠neonUnsigned Subtract Wide
- vsubw_high_u32⚠neonUnsigned Subtract Wide
- vsubw_s8⚠neonSigned Subtract Wide
- vsubw_s16⚠neonSigned Subtract Wide
- vsubw_s32⚠neonSigned Subtract Wide
- vsubw_u8⚠neonUnsigned Subtract Wide
- vsubw_u16⚠neonUnsigned Subtract Wide
- vsubw_u32⚠neonUnsigned Subtract Wide
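The widening subtracts differ in where the widening happens: `vsubl` widens both narrow operands before subtracting (`i8 - i8 -> i16`), while `vsubw` subtracts a narrow vector from an already-wide one (`i16 - i8 -> i16`). A portable scalar sketch (helper names illustrative):

```rust
// Sketch of vsubl_s8: widen both operands to i16, then subtract,
// so the result cannot wrap at 8-bit width.
fn subl_s8(a: &[i8; 8], b: &[i8; 8]) -> [i16; 8] {
    core::array::from_fn(|i| a[i] as i16 - b[i] as i16)
}

// Sketch of vsubw_s8: the first operand is already wide; only the
// second operand is widened before the subtraction.
fn subw_s8(a: &[i16; 8], b: &[i8; 8]) -> [i16; 8] {
    core::array::from_fn(|i| a[i].wrapping_sub(b[i] as i16))
}

fn main() {
    // -128 - 127 would wrap in i8; at 16-bit width it is exact.
    assert_eq!(subl_s8(&[-128i8; 8], &[127i8; 8])[0], -255);
    assert_eq!(subw_s8(&[1000i16; 8], &[100i8; 8])[0], 900);
}
```

The `*_high` variants apply the same operation to the upper half of a 128-bit narrow source.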
- vtbl1_p8⚠neonTable look-up
- vtbl1_s8⚠neonTable look-up
- vtbl1_u8⚠neonTable look-up
- vtbl2_p8⚠neonTable look-up
- vtbl2_s8⚠neonTable look-up
- vtbl2_u8⚠neonTable look-up
- vtbl3_p8⚠neonTable look-up
- vtbl3_s8⚠neonTable look-up
- vtbl3_u8⚠neonTable look-up
- vtbl4_p8⚠neonTable look-up
- vtbl4_s8⚠neonTable look-up
- vtbl4_u8⚠neonTable look-up
- vtbx1_p8⚠neonExtended table look-up
- vtbx1_s8⚠neonExtended table look-up
- vtbx1_u8⚠neonExtended table look-up
- vtbx2_p8⚠neonExtended table look-up
- vtbx2_s8⚠neonExtended table look-up
- vtbx2_u8⚠neonExtended table look-up
- vtbx3_p8⚠neonExtended table look-up
- vtbx3_s8⚠neonExtended table look-up
- vtbx3_u8⚠neonExtended table look-up
- vtbx4_p8⚠neonExtended table look-up
- vtbx4_s8⚠neonExtended table look-up
- vtbx4_u8⚠neonExtended table look-up
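The table lookups treat one vector as per-lane byte indices into a table built from one to four vectors. Out-of-range indices yield 0 for `vtbl`, but leave the corresponding lane of a fallback vector unchanged for the extended `vtbx` form. A portable scalar sketch (helper names illustrative):

```rust
// Sketch of vtbl: each index selects a byte from the table;
// out-of-range indices produce 0.
fn tbl(table: &[u8], idx: &[u8; 8]) -> [u8; 8] {
    core::array::from_fn(|i| *table.get(idx[i] as usize).unwrap_or(&0))
}

// Sketch of vtbx: same lookup, but out-of-range indices keep the
// matching lane of the fallback vector instead of writing 0.
fn tbx(fallback: &[u8; 8], table: &[u8], idx: &[u8; 8]) -> [u8; 8] {
    core::array::from_fn(|i| *table.get(idx[i] as usize).unwrap_or(&fallback[i]))
}

fn main() {
    let table = [10u8, 20, 30, 40, 50, 60, 70, 80];
    let idx = [0u8, 7, 255, 3, 1, 1, 200, 2];
    assert_eq!(tbl(&table, &idx), [10, 80, 0, 40, 20, 20, 0, 30]);
    let fb = [9u8; 8];
    assert_eq!(tbx(&fb, &table, &idx)[2], 9); // out-of-range lane kept
}
```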
- vtrn1_f32⚠neonTranspose vectors
- vtrn1_p8⚠neonTranspose vectors
- vtrn1_p16⚠neonTranspose vectors
- vtrn1_s8⚠neonTranspose vectors
- vtrn1_s16⚠neonTranspose vectors
- vtrn1_s32⚠neonTranspose vectors
- vtrn1_u8⚠neonTranspose vectors
- vtrn1_u16⚠neonTranspose vectors
- vtrn1_u32⚠neonTranspose vectors
- vtrn1q_f32⚠neonTranspose vectors
- vtrn1q_f64⚠neonTranspose vectors
- vtrn1q_p8⚠neonTranspose vectors
- vtrn1q_p16⚠neonTranspose vectors
- vtrn1q_p64⚠neonTranspose vectors
- vtrn1q_s8⚠neonTranspose vectors
- vtrn1q_s16⚠neonTranspose vectors
- vtrn1q_s32⚠neonTranspose vectors
- vtrn1q_s64⚠neonTranspose vectors
- vtrn1q_u8⚠neonTranspose vectors
- vtrn1q_u16⚠neonTranspose vectors
- vtrn1q_u32⚠neonTranspose vectors
- vtrn1q_u64⚠neonTranspose vectors
- vtrn2_f32⚠neonTranspose vectors
- vtrn2_p8⚠neonTranspose vectors
- vtrn2_p16⚠neonTranspose vectors
- vtrn2_s8⚠neonTranspose vectors
- vtrn2_s16⚠neonTranspose vectors
- vtrn2_s32⚠neonTranspose vectors
- vtrn2_u8⚠neonTranspose vectors
- vtrn2_u16⚠neonTranspose vectors
- vtrn2_u32⚠neonTranspose vectors
- vtrn2q_f32⚠neonTranspose vectors
- vtrn2q_f64⚠neonTranspose vectors
- vtrn2q_p8⚠neonTranspose vectors
- vtrn2q_p16⚠neonTranspose vectors
- vtrn2q_p64⚠neonTranspose vectors
- vtrn2q_s8⚠neonTranspose vectors
- vtrn2q_s16⚠neonTranspose vectors
- vtrn2q_s32⚠neonTranspose vectors
- vtrn2q_s64⚠neonTranspose vectors
- vtrn2q_u8⚠neonTranspose vectors
- vtrn2q_u16⚠neonTranspose vectors
- vtrn2q_u32⚠neonTranspose vectors
- vtrn2q_u64⚠neonTranspose vectors
- vtrn_f32⚠neonTranspose elements
- vtrn_p8⚠neonTranspose elements
- vtrn_p16⚠neonTranspose elements
- vtrn_s8⚠neonTranspose elements
- vtrn_s16⚠neonTranspose elements
- vtrn_s32⚠neonTranspose elements
- vtrn_u8⚠neonTranspose elements
- vtrn_u16⚠neonTranspose elements
- vtrn_u32⚠neonTranspose elements
- vtrnq_f32⚠neonTranspose elements
- vtrnq_p8⚠neonTranspose elements
- vtrnq_p16⚠neonTranspose elements
- vtrnq_s8⚠neonTranspose elements
- vtrnq_s16⚠neonTranspose elements
- vtrnq_s32⚠neonTranspose elements
- vtrnq_u8⚠neonTranspose elements
- vtrnq_u16⚠neonTranspose elements
- vtrnq_u32⚠neonTranspose elements
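`vtrn1`/`vtrn2` transpose adjacent lane pairs of the two inputs as 2x2 matrices: `vtrn1` combines the even-indexed lanes of both vectors, `vtrn2` the odd-indexed ones. A portable scalar sketch of the TRN1/TRN2 lane mapping (helper names illustrative):

```rust
// Sketch of TRN1: result[2i] = a[2i], result[2i+1] = b[2i].
fn trn1(a: &[u8; 8], b: &[u8; 8]) -> [u8; 8] {
    core::array::from_fn(|i| if i % 2 == 0 { a[i] } else { b[i - 1] })
}

// Sketch of TRN2: result[2i] = a[2i+1], result[2i+1] = b[2i+1].
fn trn2(a: &[u8; 8], b: &[u8; 8]) -> [u8; 8] {
    core::array::from_fn(|i| if i % 2 == 0 { a[i + 1] } else { b[i] })
}

fn main() {
    let a = [0u8, 1, 2, 3, 4, 5, 6, 7];
    let b = [10u8, 11, 12, 13, 14, 15, 16, 17];
    assert_eq!(trn1(&a, &b), [0, 10, 2, 12, 4, 14, 6, 16]);
    assert_eq!(trn2(&a, &b), [1, 11, 3, 13, 5, 15, 7, 17]);
}
```

The older `vtrn_*` intrinsics return both halves at once as a two-vector struct.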
- vtst_p8⚠neonSigned compare bitwise test bits nonzero
- vtst_p16⚠neonSigned compare bitwise test bits nonzero
- vtst_p64⚠neonSigned compare bitwise test bits nonzero
- vtst_s8⚠neonSigned compare bitwise test bits nonzero
- vtst_s16⚠neonSigned compare bitwise test bits nonzero
- vtst_s32⚠neonSigned compare bitwise test bits nonzero
- vtst_s64⚠neonSigned compare bitwise test bits nonzero
- vtst_u8⚠neonUnsigned compare bitwise test bits nonzero
- vtst_u16⚠neonUnsigned compare bitwise test bits nonzero
- vtst_u32⚠neonUnsigned compare bitwise test bits nonzero
- vtst_u64⚠neonUnsigned compare bitwise test bits nonzero
- vtstd_s64⚠neonCompare bitwise test bits nonzero
- vtstd_u64⚠neonCompare bitwise test bits nonzero
- vtstq_p8⚠neonSigned compare bitwise test bits nonzero
- vtstq_p16⚠neonSigned compare bitwise test bits nonzero
- vtstq_p64⚠neonSigned compare bitwise test bits nonzero
- vtstq_s8⚠neonSigned compare bitwise test bits nonzero
- vtstq_s16⚠neonSigned compare bitwise test bits nonzero
- vtstq_s32⚠neonSigned compare bitwise test bits nonzero
- vtstq_s64⚠neonSigned compare bitwise test bits nonzero
- vtstq_u8⚠neonUnsigned compare bitwise test bits nonzero
- vtstq_u16⚠neonUnsigned compare bitwise test bits nonzero
- vtstq_u32⚠neonUnsigned compare bitwise test bits nonzero
- vtstq_u64⚠neonUnsigned compare bitwise test bits nonzero
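"Compare bitwise test bits nonzero" (`vtst`, the CMTST instruction) sets each result lane to all ones when `a & b` has any bit set, and to all zeros otherwise. A portable scalar sketch (helper name illustrative):

```rust
// Sketch of vtst_u8: lane-wise (a & b) != 0, producing a byte mask.
fn tst_u8(a: &[u8; 8], b: &[u8; 8]) -> [u8; 8] {
    core::array::from_fn(|i| if a[i] & b[i] != 0 { 0xFF } else { 0 })
}

fn main() {
    let a = [0b1010u8, 0b0100, 0x00, 0xFF, 1, 2, 4, 8];
    let b = [0b0101u8, 0b0100, 0xFF, 0x00, 1, 1, 1, 8];
    assert_eq!(tst_u8(&a, &b), [0, 0xFF, 0, 0, 0xFF, 0, 0, 0xFF]);
}
```

The all-ones/all-zeros masks compose directly with `vbsl` and the other bitwise-select operations.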
- vuqadd_s8⚠neonSigned saturating accumulate of unsigned value
- vuqadd_s16⚠neonSigned saturating accumulate of unsigned value
- vuqadd_s32⚠neonSigned saturating accumulate of unsigned value
- vuqadd_s64⚠neonSigned saturating accumulate of unsigned value
- vuqaddb_s8⚠neonSigned saturating accumulate of unsigned value
- vuqaddd_s64⚠neonSigned saturating accumulate of unsigned value
- vuqaddh_s16⚠neonSigned saturating accumulate of unsigned value
- vuqaddq_s8⚠neonSigned saturating accumulate of unsigned value
- vuqaddq_s16⚠neonSigned saturating accumulate of unsigned value
- vuqaddq_s32⚠neonSigned saturating accumulate of unsigned value
- vuqaddq_s64⚠neonSigned saturating accumulate of unsigned value
- vuqadds_s32⚠neonSigned saturating accumulate of unsigned value
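"Signed saturating accumulate of unsigned value" (`vuqadd`, the USQADD instruction) adds an unsigned operand to a signed accumulator and clamps the sum to the signed lane's range. A portable scalar sketch for the 8-bit case (helper name illustrative):

```rust
// Sketch of vuqaddb_s8: form the sum at a wider width, then saturate
// to the signed 8-bit range.
fn uqadd_s8(a: i8, b: u8) -> i8 {
    (a as i16 + b as i16).clamp(i8::MIN as i16, i8::MAX as i16) as i8
}

fn main() {
    assert_eq!(uqadd_s8(100, 100), 127); // saturates at i8::MAX
    assert_eq!(uqadd_s8(-100, 50), -50); // in range, exact
}
```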
- vuzp1_f32⚠neonUnzip vectors
- vuzp1_p8⚠neonUnzip vectors
- vuzp1_p16⚠neonUnzip vectors
- vuzp1_s8⚠neonUnzip vectors
- vuzp1_s16⚠neonUnzip vectors
- vuzp1_s32⚠neonUnzip vectors
- vuzp1_u8⚠neonUnzip vectors
- vuzp1_u16⚠neonUnzip vectors
- vuzp1_u32⚠neonUnzip vectors
- vuzp1q_f32⚠neonUnzip vectors
- vuzp1q_f64⚠neonUnzip vectors
- vuzp1q_p8⚠neonUnzip vectors
- vuzp1q_p16⚠neonUnzip vectors
- vuzp1q_p64⚠neonUnzip vectors
- vuzp1q_s8⚠neonUnzip vectors
- vuzp1q_s16⚠neonUnzip vectors
- vuzp1q_s32⚠neonUnzip vectors
- vuzp1q_s64⚠neonUnzip vectors
- vuzp1q_u8⚠neonUnzip vectors
- vuzp1q_u16⚠neonUnzip vectors
- vuzp1q_u32⚠neonUnzip vectors
- vuzp1q_u64⚠neonUnzip vectors
- vuzp2_f32⚠neonUnzip vectors
- vuzp2_p8⚠neonUnzip vectors
- vuzp2_p16⚠neonUnzip vectors
- vuzp2_s8⚠neonUnzip vectors
- vuzp2_s16⚠neonUnzip vectors
- vuzp2_s32⚠neonUnzip vectors
- vuzp2_u8⚠neonUnzip vectors
- vuzp2_u16⚠neonUnzip vectors
- vuzp2_u32⚠neonUnzip vectors
- vuzp2q_f32⚠neonUnzip vectors
- vuzp2q_f64⚠neonUnzip vectors
- vuzp2q_p8⚠neonUnzip vectors
- vuzp2q_p16⚠neonUnzip vectors
- vuzp2q_p64⚠neonUnzip vectors
- vuzp2q_s8⚠neonUnzip vectors
- vuzp2q_s16⚠neonUnzip vectors
- vuzp2q_s32⚠neonUnzip vectors
- vuzp2q_s64⚠neonUnzip vectors
- vuzp2q_u8⚠neonUnzip vectors
- vuzp2q_u16⚠neonUnzip vectors
- vuzp2q_u32⚠neonUnzip vectors
- vuzp2q_u64⚠neonUnzip vectors
- vuzp_f32⚠neonUnzip vectors
- vuzp_p8⚠neonUnzip vectors
- vuzp_p16⚠neonUnzip vectors
- vuzp_s8⚠neonUnzip vectors
- vuzp_s16⚠neonUnzip vectors
- vuzp_s32⚠neonUnzip vectors
- vuzp_u8⚠neonUnzip vectors
- vuzp_u16⚠neonUnzip vectors
- vuzp_u32⚠neonUnzip vectors
- vuzpq_f32⚠neonUnzip vectors
- vuzpq_p8⚠neonUnzip vectors
- vuzpq_p16⚠neonUnzip vectors
- vuzpq_s8⚠neonUnzip vectors
- vuzpq_s16⚠neonUnzip vectors
- vuzpq_s32⚠neonUnzip vectors
- vuzpq_u8⚠neonUnzip vectors
- vuzpq_u16⚠neonUnzip vectors
- vuzpq_u32⚠neonUnzip vectors
- vxarq_u64⚠neon,sha3Exclusive OR and rotate
- vzip1_f32⚠neonZip vectors
- vzip1_p8⚠neonZip vectors
- vzip1_p16⚠neonZip vectors
- vzip1_s8⚠neonZip vectors
- vzip1_s16⚠neonZip vectors
- vzip1_s32⚠neonZip vectors
- vzip1_u8⚠neonZip vectors
- vzip1_u16⚠neonZip vectors
- vzip1_u32⚠neonZip vectors
- vzip1q_f32⚠neonZip vectors
- vzip1q_f64⚠neonZip vectors
- vzip1q_p8⚠neonZip vectors
- vzip1q_p16⚠neonZip vectors
- vzip1q_p64⚠neonZip vectors
- vzip1q_s8⚠neonZip vectors
- vzip1q_s16⚠neonZip vectors
- vzip1q_s32⚠neonZip vectors
- vzip1q_s64⚠neonZip vectors
- vzip1q_u8⚠neonZip vectors
- vzip1q_u16⚠neonZip vectors
- vzip1q_u32⚠neonZip vectors
- vzip1q_u64⚠neonZip vectors
- vzip2_f32⚠neonZip vectors
- vzip2_p8⚠neonZip vectors
- vzip2_p16⚠neonZip vectors
- vzip2_s8⚠neonZip vectors
- vzip2_s16⚠neonZip vectors
- vzip2_s32⚠neonZip vectors
- vzip2_u8⚠neonZip vectors
- vzip2_u16⚠neonZip vectors
- vzip2_u32⚠neonZip vectors
- vzip2q_f32⚠neonZip vectors
- vzip2q_f64⚠neonZip vectors
- vzip2q_p8⚠neonZip vectors
- vzip2q_p16⚠neonZip vectors
- vzip2q_p64⚠neonZip vectors
- vzip2q_s8⚠neonZip vectors
- vzip2q_s16⚠neonZip vectors
- vzip2q_s32⚠neonZip vectors
- vzip2q_s64⚠neonZip vectors
- vzip2q_u8⚠neonZip vectors
- vzip2q_u16⚠neonZip vectors
- vzip2q_u32⚠neonZip vectors
- vzip2q_u64⚠neonZip vectors
- vzip_f32⚠neonZip vectors
- vzip_p8⚠neonZip vectors
- vzip_p16⚠neonZip vectors
- vzip_s8⚠neonZip vectors
- vzip_s16⚠neonZip vectors
- vzip_s32⚠neonZip vectors
- vzip_u8⚠neonZip vectors
- vzip_u16⚠neonZip vectors
- vzip_u32⚠neonZip vectors
- vzipq_f32⚠neonZip vectors
- vzipq_p8⚠neonZip vectors
- vzipq_p16⚠neonZip vectors
- vzipq_s8⚠neonZip vectors
- vzipq_s16⚠neonZip vectors
- vzipq_s32⚠neonZip vectors
- vzipq_u8⚠neonZip vectors
- vzipq_u16⚠neonZip vectors
- vzipq_u32⚠neonZip vectors
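Zip and unzip are inverses: `vzip1`/`vzip2` interleave the low or high halves of two vectors, while `vuzp1`/`vuzp2` gather the even- or odd-indexed lanes of the concatenation of both inputs. A portable scalar sketch of the ZIP1 and UZP1 lane mappings (helper names illustrative):

```rust
// Sketch of ZIP1: interleave the low halves of a and b.
fn zip1(a: &[u8; 8], b: &[u8; 8]) -> [u8; 8] {
    core::array::from_fn(|i| if i % 2 == 0 { a[i / 2] } else { b[i / 2] })
}

// Sketch of UZP1: take the even-indexed lanes of concat(a, b).
fn uzp1(a: &[u8; 8], b: &[u8; 8]) -> [u8; 8] {
    core::array::from_fn(|i| if i < 4 { a[2 * i] } else { b[2 * (i - 4)] })
}

fn main() {
    let a = [0u8, 1, 2, 3, 4, 5, 6, 7];
    let b = [10u8, 11, 12, 13, 14, 15, 16, 17];
    assert_eq!(zip1(&a, &b), [0, 10, 1, 11, 2, 12, 3, 13]);
    assert_eq!(uzp1(&a, &b), [0, 2, 4, 6, 10, 12, 14, 16]);
}
```

`vzip2`/`vuzp2` follow the same patterns over the high halves and the odd-indexed lanes respectively.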
- __crc32b⚠crcCRC32 single round checksum for bytes (8 bits).
- __crc32cb⚠crcCRC32-C single round checksum for bytes (8 bits).
- __crc32ch⚠crcCRC32-C single round checksum for half words (16 bits).
- __crc32cw⚠crcCRC32-C single round checksum for words (32 bits).
- __crc32h⚠crcCRC32 single round checksum for half words (16 bits).
- __crc32w⚠crcCRC32 single round checksum for words (32 bits).
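A CRC32 "single round" folds one input unit into a 32-bit accumulator using the reflected CRC-32 polynomial `0xEDB88320` (the CRC32-C variants use `0x82F63B78`). The instructions apply no initial or final inversion; those belong to the surrounding protocol. A portable bit-by-bit sketch of the byte-sized round (helper name illustrative):

```rust
// Sketch of one CRC32 byte round: xor the byte into the accumulator,
// then run eight shift-and-conditionally-xor steps with the reflected
// CRC-32 polynomial.
fn crc32_byte(mut crc: u32, byte: u8) -> u32 {
    crc ^= byte as u32;
    for _ in 0..8 {
        crc = if crc & 1 != 0 { (crc >> 1) ^ 0xEDB8_8320 } else { crc >> 1 };
    }
    crc
}

fn main() {
    // Standard CRC-32 wraps the per-byte rounds with an initial and
    // final inversion; the check value for "123456789" is 0xCBF43926.
    let crc = b"123456789".iter().fold(!0u32, |c, &b| crc32_byte(c, b));
    assert_eq!(!crc, 0xCBF4_3926);
}
```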
- __dmb⚠ExperimentalGenerates a DMB (data memory barrier) instruction or equivalent CP15 instruction.
- __dsb⚠ExperimentalGenerates a DSB (data synchronization barrier) instruction or equivalent CP15 instruction.
- __isb⚠ExperimentalGenerates an ISB (instruction synchronization barrier) instruction or equivalent CP15 instruction.
- __nop⚠ExperimentalGenerates an unspecified no-op instruction.
- __sev⚠ExperimentalGenerates a SEV (send a global event) hint instruction.
- __sevl⚠ExperimentalGenerates a SEVL (send a local event) hint instruction.
- __tcancel⚠ExperimentalCancels the current transaction and discards all state modifications that were performed transactionally.
- __tcommit⚠ExperimentalCommits the current transaction. For a nested transaction, the only effect is that the transactional nesting depth is decreased. For an outer transaction, the state modifications performed transactionally are committed to the architectural state.
- __tstart⚠ExperimentalStarts a new transaction. When the transaction starts successfully the return value is 0. If the transaction fails, all state modifications are discarded and a cause of the failure is encoded in the return value.
- __ttest⚠ExperimentalTests if executing inside a transaction. If no transaction is currently executing, the return value is 0. Otherwise, this intrinsic returns the depth of the transaction.
- __wfe⚠ExperimentalGenerates a WFE (wait for event) hint instruction, or nothing.
- __wfi⚠ExperimentalGenerates a WFI (wait for interrupt) hint instruction, or nothing.
- __yield⚠ExperimentalGenerates a YIELD hint instruction.
- _prefetch⚠ExperimentalFetch the cache line that contains address p using the given RW and LOCALITY.
- Floating-point complex add
- Floating-point complex add
- Floating-point complex add
- Floating-point complex add
- Floating-point complex add
- Floating-point complex add
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Floating-point complex multiply accumulate
- Dot product arithmetic (indexed)
- Dot product arithmetic (indexed)
- Dot product arithmetic (indexed)
- Dot product arithmetic (indexed)
- Dot product arithmetic (vector)
- Dot product arithmetic (vector)
- Dot product arithmetic (indexed)
- Dot product arithmetic (indexed)
- Dot product arithmetic (indexed)
- Dot product arithmetic (indexed)
- Dot product arithmetic (vector)
- Dot product arithmetic (vector)
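In the dot-product intrinsics (SDOT/UDOT), each 32-bit result lane accumulates the sum of four 8-bit products taken from the matching four-byte groups of the inputs; the indexed forms reuse one selected four-byte group of the second operand for every lane. A portable scalar sketch of the vector form (helper name illustrative):

```rust
// Sketch of the unsigned dot product: for each 32-bit lane, multiply
// four byte pairs, sum the products, and add into the accumulator.
fn udot(acc: &[u32; 2], a: &[u8; 8], b: &[u8; 8]) -> [u32; 2] {
    core::array::from_fn(|i| {
        let dot: u32 = (0..4)
            .map(|j| a[4 * i + j] as u32 * b[4 * i + j] as u32)
            .sum();
        acc[i].wrapping_add(dot)
    })
}

fn main() {
    let acc = [5u32, 0];
    let a = [1u8, 2, 3, 4, 10, 10, 10, 10];
    let b = [1u8, 1, 1, 1, 2, 2, 2, 2];
    // lane 0: 5 + (1+2+3+4) = 15; lane 1: 0 + 4*(10*2) = 80
    assert_eq!(udot(&acc, &a, &b), [15, 80]);
}
```

The mixed-sign `vsudot`/`vusdot` entries below apply the same pattern with one signed and one unsigned operand.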
- 8-bit integer matrix multiply-accumulate
- 8-bit integer matrix multiply-accumulate
- Floating-point round to 32-bit integer, using current rounding mode
- Floating-point round to 32-bit integer, using current rounding mode
- Floating-point round to 32-bit integer, using current rounding mode
- Floating-point round to 32-bit integer, using current rounding mode
- Floating-point round to 32-bit integer toward zero
- Floating-point round to 32-bit integer toward zero
- Floating-point round to 32-bit integer toward zero
- Floating-point round to 32-bit integer toward zero
- Floating-point round to 64-bit integer, using current rounding mode
- Floating-point round to 64-bit integer, using current rounding mode
- Floating-point round to 64-bit integer, using current rounding mode
- Floating-point round to 64-bit integer, using current rounding mode
- Floating-point round to 64-bit integer toward zero
- Floating-point round to 64-bit integer toward zero
- Floating-point round to 64-bit integer toward zero
- Floating-point round to 64-bit integer toward zero
- SM3PARTW1
- SM3PARTW2
- SM3SS1
- SM3TT1A
- SM3TT1B
- SM3TT2A
- SM3TT2B
- SM4 key
- SM4 encode
- Dot product index form with signed and unsigned integers
- Dot product index form with signed and unsigned integers
- Dot product index form with signed and unsigned integers
- Dot product index form with signed and unsigned integers
- Dot product index form with unsigned and signed integers
- Dot product index form with unsigned and signed integers
- Dot product vector form with unsigned and signed integers
- Dot product index form with unsigned and signed integers
- Dot product index form with unsigned and signed integers
- Dot product vector form with unsigned and signed integers
- Unsigned and signed 8-bit integer matrix multiply-accumulate