From patchwork Wed Feb 21 17:39:18 2024
X-Patchwork-Submitter: "Andre Vieira (lists)"
X-Patchwork-Id: 1902285
From: Andre Vieira
To: gcc-patches@gcc.gnu.org
Cc: stam.markianos-wright@arm.com, richard.earnshaw@arm.com
Subject: [PATCH v4 1/5] arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
Date: Wed, 21 Feb 2024 17:39:18 +0000
Message-Id: <20240221173922.19137-2-andre.simoesdiasvieira@arm.com>
In-Reply-To: <20240221173922.19137-1-andre.simoesdiasvieira@arm.com>
References: <20240221173922.19137-1-andre.simoesdiasvieira@arm.com>
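Not part of the patch itself, but as an illustration of how the new attribute is meant to be consumed: the description below explains that each predicable MVE pattern records the icode of its unpredicated form in the new mve_unpredicated_insn attribute, and that arm.h gains three macros built on the generated get_attr_mve_unpredicated_insn accessor. Here is a minimal sketch of a consumer, assuming GCC's usual internal headers; the helper name is invented for illustration:

  /* Sketch only: such a helper would live in the arm backend, next to the
     macros this patch adds to arm.h.  */
  static int
  mve_unpredicated_icode_for (rtx_insn *insn)
  {
    /* Not a recognised, predicable MVE insn: nothing to map.  */
    if (!MVE_VPT_PREDICABLE_INSN_P (insn))
      return CODE_FOR_nothing;

    /* For a VPT-predicated insn the attribute holds the icode of its
       unpredicated twin (e.g. mve_vldrbq_z_... -> mve_vldrbq_...); for an
       unpredicated but predicable insn it is the insn's own icode.  */
    return get_attr_mve_unpredicated_insn (insn);
  }

MVE_VPT_PREDICATED_INSN_P and MVE_VPT_UNPREDICATED_INSN_P then simply distinguish whether that icode differs from recog_memoized (insn), which is presumably what the later patches in this series key off when rewriting a loop body into its tail-predicated form.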
This patch adds an attribute to the mve md patterns so that we can identify predicable MVE instructions and their predicated and unpredicated variants. This attribute is used to encode the icode of the unpredicated variant of an instruction in its predicated variant. This will make it possible for us to transform VPT-predicated insns in the insn chain into their unpredicated equivalents when transforming the loop into an MVE Tail-Predicated Low Overhead Loop. For example: `mve_vldrbq_z_ -> mve_vldrbq_`.

gcc/ChangeLog:

* config/arm/arm.md (mve_unpredicated_insn): New attribute. * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define. (MVE_VPT_UNPREDICATED_INSN_P): Likewise. (MVE_VPT_PREDICABLE_INSN_P): Likewise. * config/arm/vec-common.md (mve_vshlq_): Add attribute. * config/arm/mve.md (arm_vcx1q_p_v16qi): Add attribute. (arm_vcx1qv16qi): Likewise. (arm_vcx1qav16qi): Likewise. (arm_vcx1qv16qi): Likewise. (arm_vcx2q_p_v16qi): Likewise. (arm_vcx2qv16qi): Likewise. (arm_vcx2qav16qi): Likewise. (arm_vcx2qv16qi): Likewise. (arm_vcx3q_p_v16qi): Likewise. (arm_vcx3qv16qi): Likewise. (arm_vcx3qav16qi): Likewise. (arm_vcx3qv16qi): Likewise. (@mve_q_): Likewise. (@mve_q_int_): Likewise. (@mve_q_v4si): Likewise. (@mve_q_n_): Likewise. (@mve_q_r_): Likewise. (@mve_q_f): Likewise. (@mve_q_m_): Likewise. (@mve_q_m_n_): Likewise. (@mve_q_m_r_): Likewise. (@mve_q_m_f): Likewise. (@mve_q_int_m_): Likewise. (@mve_q_p_v4si): Likewise. (@mve_q_p_): Likewise. (@mve_q_): Likewise. (@mve_q_f): Likewise. (@mve_q_m_): Likewise. (@mve_q_m_f): Likewise. (mve_vq_f): Likewise. (mve_q): Likewise. (mve_q_f): Likewise. (mve_vadciq_v4si): Likewise. (mve_vadciq_m_v4si): Likewise. (mve_vadcq_v4si): Likewise. (mve_vadcq_m_v4si): Likewise. (mve_vandq_): Likewise. (mve_vandq_f): Likewise. (mve_vandq_m_): Likewise. (mve_vandq_m_f): Likewise. (mve_vandq_s): Likewise. (mve_vandq_u): Likewise. (mve_vbicq_): Likewise. (mve_vbicq_f): Likewise. (mve_vbicq_m_): Likewise. (mve_vbicq_m_f): Likewise. (mve_vbicq_m_n_): Likewise. (mve_vbicq_n_): Likewise. (mve_vbicq_s): Likewise. (mve_vbicq_u): Likewise. (@mve_vclzq_s): Likewise. (mve_vclzq_u): Likewise. (@mve_vcmp_q_): Likewise. (@mve_vcmp_q_n_): Likewise. (@mve_vcmp_q_f): Likewise. (@mve_vcmp_q_n_f): Likewise. (@mve_vcmp_q_m_f): Likewise. (@mve_vcmp_q_m_n_): Likewise. (@mve_vcmp_q_m_): Likewise. (@mve_vcmp_q_m_n_f): Likewise. (mve_vctpq): Likewise. (mve_vctpq_m): Likewise. (mve_vcvtaq_): Likewise. (mve_vcvtaq_m_): Likewise. (mve_vcvtbq_f16_f32v8hf): Likewise. (mve_vcvtbq_f32_f16v4sf): Likewise. (mve_vcvtbq_m_f16_f32v8hf): Likewise. (mve_vcvtbq_m_f32_f16v4sf): Likewise. (mve_vcvtmq_): Likewise. (mve_vcvtmq_m_): Likewise. (mve_vcvtnq_): Likewise. (mve_vcvtnq_m_): Likewise. (mve_vcvtpq_): Likewise. (mve_vcvtpq_m_): Likewise. (mve_vcvtq_from_f_): Likewise. (mve_vcvtq_m_from_f_): Likewise. (mve_vcvtq_m_n_from_f_): Likewise. (mve_vcvtq_m_n_to_f_): Likewise. (mve_vcvtq_m_to_f_): Likewise. (mve_vcvtq_n_from_f_): Likewise. (mve_vcvtq_n_to_f_): Likewise. (mve_vcvtq_to_f_): Likewise. (mve_vcvttq_f16_f32v8hf): Likewise. (mve_vcvttq_f32_f16v4sf): Likewise. (mve_vcvttq_m_f16_f32v8hf): Likewise. (mve_vcvttq_m_f32_f16v4sf): Likewise. 
(mve_vdwdupq_m_wb_u_insn): Likewise. (mve_vdwdupq_wb_u_insn): Likewise. (mve_veorq_s>): Likewise. (mve_veorq_u>): Likewise. (mve_veorq_f): Likewise. (mve_vidupq_m_wb_u_insn): Likewise. (mve_vidupq_u_insn): Likewise. (mve_viwdupq_m_wb_u_insn): Likewise. (mve_viwdupq_wb_u_insn): Likewise. (mve_vldrbq_): Likewise. (mve_vldrbq_gather_offset_): Likewise. (mve_vldrbq_gather_offset_z_): Likewise. (mve_vldrbq_z_): Likewise. (mve_vldrdq_gather_base_v2di): Likewise. (mve_vldrdq_gather_base_wb_v2di_insn): Likewise. (mve_vldrdq_gather_base_wb_z_v2di_insn): Likewise. (mve_vldrdq_gather_base_z_v2di): Likewise. (mve_vldrdq_gather_offset_v2di): Likewise. (mve_vldrdq_gather_offset_z_v2di): Likewise. (mve_vldrdq_gather_shifted_offset_v2di): Likewise. (mve_vldrdq_gather_shifted_offset_z_v2di): Likewise. (mve_vldrhq_): Likewise. (mve_vldrhq_fv8hf): Likewise. (mve_vldrhq_gather_offset_): Likewise. (mve_vldrhq_gather_offset_fv8hf): Likewise. (mve_vldrhq_gather_offset_z_): Likewise. (mve_vldrhq_gather_offset_z_fv8hf): Likewise. (mve_vldrhq_gather_shifted_offset_): Likewise. (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise. (mve_vldrhq_gather_shifted_offset_z_): Likewise. (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise. (mve_vldrhq_z_): Likewise. (mve_vldrhq_z_fv8hf): Likewise. (mve_vldrwq_v4si): Likewise. (mve_vldrwq_fv4sf): Likewise. (mve_vldrwq_gather_base_v4si): Likewise. (mve_vldrwq_gather_base_fv4sf): Likewise. (mve_vldrwq_gather_base_wb_v4si_insn): Likewise. (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise. (mve_vldrwq_gather_base_wb_z_v4si_insn): Likewise. (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise. (mve_vldrwq_gather_base_z_v4si): Likewise. (mve_vldrwq_gather_base_z_fv4sf): Likewise. (mve_vldrwq_gather_offset_v4si): Likewise. (mve_vldrwq_gather_offset_fv4sf): Likewise. (mve_vldrwq_gather_offset_z_v4si): Likewise. (mve_vldrwq_gather_offset_z_fv4sf): Likewise. (mve_vldrwq_gather_shifted_offset_v4si): Likewise. (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise. (mve_vldrwq_gather_shifted_offset_z_v4si): Likewise. (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise. (mve_vldrwq_z_v4si): Likewise. (mve_vldrwq_z_fv4sf): Likewise. (mve_vmvnq_s): Likewise. (mve_vmvnq_u): Likewise. (mve_vornq_): Likewise. (mve_vornq_f): Likewise. (mve_vornq_m_): Likewise. (mve_vornq_m_f): Likewise. (mve_vornq_s): Likewise. (mve_vornq_u): Likewise. (mve_vorrq_): Likewise. (mve_vorrq_f): Likewise. (mve_vorrq_m_): Likewise. (mve_vorrq_m_f): Likewise. (mve_vorrq_m_n_): Likewise. (mve_vorrq_n_): Likewise. (mve_vorrq_s): Likewise. (mve_vorrq_s): Likewise. (mve_vsbciq_v4si): Likewise. (mve_vsbciq_m_v4si): Likewise. (mve_vsbcq_v4si): Likewise. (mve_vsbcq_m_v4si): Likewise. (mve_vshlcq_): Likewise. (mve_vshlcq_m_): Likewise. (mve_vshrq_m_n_): Likewise. (mve_vshrq_n_): Likewise. (mve_vstrbq_): Likewise. (mve_vstrbq_p_): Likewise. (mve_vstrbq_scatter_offset__insn): Likewise. (mve_vstrbq_scatter_offset_p__insn): Likewise. (mve_vstrdq_scatter_base_v2di): Likewise. (mve_vstrdq_scatter_base_p_v2di): Likewise. (mve_vstrdq_scatter_base_wb_v2di): Likewise. (mve_vstrdq_scatter_base_wb_p_v2di): Likewise. (mve_vstrdq_scatter_offset_v2di_insn): Likewise. (mve_vstrdq_scatter_offset_p_v2di_insn): Likewise. (mve_vstrdq_scatter_shifted_offset_v2di_insn): Likewise. (mve_vstrdq_scatter_shifted_offset_p_v2di_insn): Likewise. (mve_vstrhq_): Likewise. (mve_vstrhq_fv8hf): Likewise. (mve_vstrhq_p_): Likewise. (mve_vstrhq_p_fv8hf): Likewise. (mve_vstrhq_scatter_offset__insn): Likewise. (mve_vstrhq_scatter_offset_fv8hf_insn): Likewise. 
(mve_vstrhq_scatter_offset_p__insn): Likewise. (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise. (mve_vstrhq_scatter_shifted_offset__insn): Likewise. (mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise. (mve_vstrhq_scatter_shifted_offset_p__insn): Likewise. (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise. (mve_vstrwq_v4si): Likewise. (mve_vstrwq_fv4sf): Likewise. (mve_vstrwq_p_v4si): Likewise. (mve_vstrwq_p_fv4sf): Likewise. (mve_vstrwq_scatter_base_v4si): Likewise. (mve_vstrwq_scatter_base_fv4sf): Likewise. (mve_vstrwq_scatter_base_p_v4si): Likewise. (mve_vstrwq_scatter_base_p_fv4sf): Likewise. (mve_vstrwq_scatter_base_wb_v4si): Likewise. (mve_vstrwq_scatter_base_wb_fv4sf): Likewise. (mve_vstrwq_scatter_base_wb_p_v4si): Likewise. (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise. (mve_vstrwq_scatter_offset_v4si_insn): Likewise. (mve_vstrwq_scatter_offset_fv4sf_insn): Likewise. (mve_vstrwq_scatter_offset_p_v4si_insn): Likewise. (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise. (mve_vstrwq_scatter_shifted_offset_v4si_insn): Likewise. (mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise. (mve_vstrwq_scatter_shifted_offset_p_v4si_insn): Likewise. (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise. diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h index 2a2207c0ba1..449e6935b32 100644 --- a/gcc/config/arm/arm.h +++ b/gcc/config/arm/arm.h @@ -2375,6 +2375,21 @@ extern int making_const_table; else if (TARGET_THUMB1) \ thumb1_final_prescan_insn (INSN) +/* These defines are useful to refer to the value of the mve_unpredicated_insn + insn attribute. Note that, because these use the get_attr_* function, these + will change recog_data if (INSN) isn't current_insn. */ +#define MVE_VPT_PREDICABLE_INSN_P(INSN) \ + (recog_memoized (INSN) >= 0 \ + && get_attr_mve_unpredicated_insn (INSN) != CODE_FOR_nothing) + +#define MVE_VPT_PREDICATED_INSN_P(INSN) \ + (MVE_VPT_PREDICABLE_INSN_P (INSN) \ + && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN)) + +#define MVE_VPT_UNPREDICATED_INSN_P(INSN) \ + (MVE_VPT_PREDICABLE_INSN_P (INSN) \ + && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN)) + #define ARM_SIGN_EXTEND(x) ((HOST_WIDE_INT) \ (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x) \ : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\ diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md index 4a98f2d7b62..3f2863adf44 100644 --- a/gcc/config/arm/arm.md +++ b/gcc/config/arm/arm.md @@ -124,6 +124,12 @@ (define_attr "fpu" "none,vfp" ; and not all ARM insns do. (define_attr "predicated" "yes,no" (const_string "no")) +; An attribute that encodes the CODE_FOR_ of the MVE VPT unpredicated +; version of a VPT-predicated instruction. For unpredicated instructions +; that are predicable, encode the same pattern's CODE_FOR_ as a way to +; encode that it is a predicable instruction. +(define_attr "mve_unpredicated_insn" "" (symbol_ref "CODE_FOR_nothing")) + ; LENGTH of an instruction (in bytes) (define_attr "length" "" (const_int 4)) diff --git a/gcc/config/arm/iterators.md b/gcc/config/arm/iterators.md index 547d87f3bc8..7600bf62531 100644 --- a/gcc/config/arm/iterators.md +++ b/gcc/config/arm/iterators.md @@ -2311,6 +2311,7 @@ (define_int_attr simd32_op [(UNSPEC_QADD8 "qadd8") (UNSPEC_QSUB8 "qsub8") (define_int_attr mmla_sfx [(UNSPEC_MATMUL_S "s8") (UNSPEC_MATMUL_U "u8") (UNSPEC_MATMUL_US "s8")]) + ;;MVE int attribute. 
(define_int_attr supf [(VCVTQ_TO_F_S "s") (VCVTQ_TO_F_U "u") (VREV16Q_S "s") (VREV16Q_U "u") (VMVNQ_N_S "s") (VMVNQ_N_U "u") diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md index 0fabbaa931b..8aa0bded7f0 100644 --- a/gcc/config/arm/mve.md +++ b/gcc/config/arm/mve.md @@ -17,7 +17,7 @@ ;; along with GCC; see the file COPYING3. If not see ;; . -(define_insn "*mve_mov" +(define_insn "mve_mov" [(set (match_operand:MVE_types 0 "nonimmediate_operand" "=w,w,r,w , w, r,Ux,w") (match_operand:MVE_types 1 "general_operand" " w,r,w,DnDm,UxUi,r,w, Ul"))] "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT" @@ -81,18 +81,27 @@ (define_insn "*mve_mov" return ""; } } - [(set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load") + [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_mov") + (symbol_ref "CODE_FOR_nothing") + (symbol_ref "CODE_FOR_nothing") + (symbol_ref "CODE_FOR_mve_mov") + (symbol_ref "CODE_FOR_mve_mov") + (symbol_ref "CODE_FOR_nothing") + (symbol_ref "CODE_FOR_mve_mov") + (symbol_ref "CODE_FOR_nothing")]) + (set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load") (set_attr "length" "4,8,8,4,4,8,4,8") (set_attr "thumb2_pool_range" "*,*,*,*,1018,*,*,*") (set_attr "neg_pool_range" "*,*,*,*,996,*,*,*")]) -(define_insn "*mve_vdup" +(define_insn "mve_vdup" [(set (match_operand:MVE_vecs 0 "s_register_operand" "=w") (vec_duplicate:MVE_vecs (match_operand: 1 "s_register_operand" "r")))] "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT" "vdup.\t%q0, %1" - [(set_attr "length" "4") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdup")) + (set_attr "length" "4") (set_attr "type" "mve_move")]) ;; @@ -145,7 +154,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -159,7 +169,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -173,7 +184,8 @@ (define_insn "mve_vq_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "v.f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -187,7 +199,8 @@ (define_insn "@mve_q_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".%#\t%q0, %1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -201,7 +214,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; ;; [vcvttq_f32_f16]) @@ -214,7 +228,8 @@ (define_insn "mve_vcvttq_f32_f16v4sf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtt.f32.f16\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf")) + (set_attr "type" "mve_move") ]) ;; @@ -228,7 +243,8 @@ (define_insn "mve_vcvtbq_f32_f16v4sf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtb.f32.f16\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf")) + (set_attr "type" "mve_move") ]) ;; @@ -242,7 
+258,8 @@ (define_insn "mve_vcvtq_to_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt.f%#.%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_")) + (set_attr "type" "mve_move") ]) ;; @@ -256,7 +273,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -270,7 +288,8 @@ (define_insn "mve_vcvtq_from_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt.%#.f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_")) + (set_attr "type" "mve_move") ]) ;; @@ -284,7 +303,8 @@ (define_insn "mve_vq_s" ] "TARGET_HAVE_MVE" "v.s%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -297,7 +317,8 @@ (define_insn "mve_vmvnq_u" ] "TARGET_HAVE_MVE" "vmvn\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vmvnq_s" [ @@ -318,7 +339,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -331,7 +353,8 @@ (define_insn "@mve_vclzq_s" ] "TARGET_HAVE_MVE" "vclz.i%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vclzq_u" [ @@ -354,7 +377,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -368,7 +392,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -382,7 +407,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -397,7 +423,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -411,7 +438,8 @@ (define_insn "mve_vcvtpq_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtp.%#.f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_")) + (set_attr "type" "mve_move") ]) ;; @@ -425,7 +453,8 @@ (define_insn "mve_vcvtnq_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtn.%#.f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_")) + (set_attr "type" "mve_move") ]) ;; @@ -439,7 +468,8 @@ (define_insn "mve_vcvtmq_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtm.%#.f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_")) + (set_attr "type" "mve_move") ]) ;; @@ -453,7 +483,8 @@ (define_insn "mve_vcvtaq_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvta.%#.f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_")) + (set_attr "type" "mve_move") ]) ;; @@ -467,7 +498,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".i%#\t%q0, %1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -481,7 +513,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -495,7 +528,8 @@ (define_insn "@mve_q_v4si" ] "TARGET_HAVE_MVE" ".32\t%Q0, %R0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -509,7 +543,8 @@ (define_insn "mve_vctpq" ] "TARGET_HAVE_MVE" "vctp.\t%1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctpq")) + (set_attr "type" "mve_move") ]) ;; @@ -523,7 +558,8 @@ (define_insn "mve_vpnotv16bi" ] "TARGET_HAVE_MVE" "vpnot" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpnotv16bi")) + (set_attr "type" "mve_move") ]) ;; @@ -538,7 +574,8 @@ (define_insn "@mve_q_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -553,7 +590,8 @@ (define_insn "mve_vcvtq_n_to_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt.f.\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_")) + (set_attr "type" "mve_move") ]) ;; [vcreateq_f]) @@ -599,7 +637,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; Versions that take constant vectors as operand 2 (with all elements @@ -617,7 +656,8 @@ (define_insn "mve_vshrq_n_s_imm" VALID_NEON_QREG_MODE (mode), true); } - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_s_imm")) + (set_attr "type" "mve_move") ]) (define_insn "mve_vshrq_n_u_imm" [ @@ -632,7 +672,8 @@ (define_insn "mve_vshrq_n_u_imm" VALID_NEON_QREG_MODE (mode), true); } - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_u_imm")) + (set_attr "type" "mve_move") ]) ;; @@ -647,7 +688,8 @@ (define_insn "mve_vcvtq_n_from_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt..f\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_")) + (set_attr "type" "mve_move") ]) ;; @@ -662,8 +704,9 @@ (define_insn "@mve_q_p_v4si" ] "TARGET_HAVE_MVE" "vpst\;t.32\t%Q0, %R0, %q1" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_]) @@ -676,7 +719,8 @@ (define_insn "@mve_vcmpq_" ] "TARGET_HAVE_MVE" "vcmp.%#\t, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_")) + (set_attr "type" "mve_move") ]) ;; @@ -691,7 +735,8 @@ (define_insn "@mve_vcmpq_n_" ] "TARGET_HAVE_MVE" "vcmp.%# , %q1, %2" - [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -722,7 +767,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -739,7 +785,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".i%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -754,7 +801,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -769,7 +817,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -789,8 +838,11 @@ (define_insn "mve_vandq_u" "@ vand\t%q0, %q1, %q2 * return neon_output_logic_immediate (\"vand\", &operands[2], mode, 1, VALID_NEON_QREG_MODE (mode));" - [(set_attr "type" "mve_move") + [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_vandq_u") + (symbol_ref "CODE_FOR_nothing")]) + (set_attr "type" "mve_move") ]) + (define_expand "mve_vandq_s" [ (set (match_operand:MVE_2 0 "s_register_operand") @@ -811,7 +863,8 @@ (define_insn "mve_vbicq_u" ] "TARGET_HAVE_MVE" "vbic\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vbicq_s" @@ -835,7 +888,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -853,7 +907,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %q2, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; Auto vectorizer pattern for int vcadd @@ -876,7 +931,8 @@ (define_insn "mve_veorq_u" ] "TARGET_HAVE_MVE" "veor\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u")) + (set_attr "type" "mve_move") ]) (define_expand "mve_veorq_s" [ @@ -904,7 +960,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -920,7 +977,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".s%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -935,7 +993,8 @@ (define_insn "mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) @@ -954,7 +1013,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -972,7 +1032,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -988,7 +1049,8 @@ (define_insn "@mve_q_int_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_int_")) + (set_attr "type" "mve_move") ]) ;; @@ -1004,7 +1066,8 @@ (define_insn "mve_q" ] "TARGET_HAVE_MVE" ".i%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q")) + (set_attr "type" "mve_move") ]) ;; @@ -1018,7 +1081,8 @@ (define_insn "mve_vornq_s" ] "TARGET_HAVE_MVE" "vorn\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vornq_u" @@ -1047,7 +1111,8 @@ (define_insn "mve_vorrq_s" "@ vorr\t%q0, %q1, %q2 * return neon_output_logic_immediate (\"vorr\", &operands[2], mode, 0, VALID_NEON_QREG_MODE (mode));" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vorrq_u" [ @@ -1071,7 +1136,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1087,7 +1153,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1103,7 +1170,8 @@ (define_insn "@mve_q_r_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_r_")) + (set_attr "type" "mve_move") ]) ;; @@ -1118,7 +1186,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1133,7 +1202,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1148,7 +1218,8 @@ (define_insn "@mve_q_v4si" ] "TARGET_HAVE_MVE" ".32\t%Q0, %R0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -1165,7 +1236,8 @@ (define_insn "@mve_q_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1179,7 +1251,8 @@ (define_insn "mve_vandq_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vand\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1193,7 +1266,8 @@ (define_insn "mve_vbicq_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vbic\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1209,7 +1283,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q1, %q2, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" 
"mve_move") ]) ;; @@ -1223,7 +1298,8 @@ (define_insn "@mve_vcmpq_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcmp.f%# , %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1238,7 +1314,8 @@ (define_insn "@mve_vcmpq_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcmp.f%# , %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1253,8 +1330,10 @@ (define_insn "mve_vctpq_m" ] "TARGET_HAVE_MVE" "vpst\;vctpt.\t%1" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctpq")) + (set_attr "type" "mve_move") + (set_attr "length""8") +]) ;; ;; [vcvtbq_f16_f32]) @@ -1268,7 +1347,8 @@ (define_insn "mve_vcvtbq_f16_f32v8hf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtb.f16.f32\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf")) + (set_attr "type" "mve_move") ]) ;; @@ -1283,7 +1363,8 @@ (define_insn "mve_vcvttq_f16_f32v8hf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtt.f16.f32\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf")) + (set_attr "type" "mve_move") ]) ;; @@ -1297,7 +1378,8 @@ (define_insn "mve_veorq_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "veor\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1313,7 +1395,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1331,7 +1414,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1346,7 +1430,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1364,7 +1449,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -1384,7 +1470,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -1400,7 +1487,8 @@ (define_insn "mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1414,7 +1502,8 @@ (define_insn "mve_vornq_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vorn\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1428,7 +1517,8 @@ (define_insn "mve_vorrq_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vorr\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + 
[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1444,7 +1534,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".i%# %q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1460,7 +1551,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".s%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1476,7 +1568,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".s%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -1494,7 +1587,8 @@ (define_insn "@mve_q_v4si" ] "TARGET_HAVE_MVE" ".32\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -1510,7 +1604,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1526,7 +1621,8 @@ (define_insn "@mve_q_poly_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_poly_")) + (set_attr "type" "mve_move") ]) ;; @@ -1547,8 +1643,9 @@ (define_insn "@mve_vcmpq_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%#\t, %q1, %q2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_f")) + (set_attr "length""8")]) + ;; ;; [vcvtaq_m_u, vcvtaq_m_s]) ;; @@ -1562,8 +1659,10 @@ (define_insn "mve_vcvtaq_m_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtat.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) + ;; ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u]) ;; @@ -1577,8 +1676,9 @@ (define_insn "mve_vcvtq_m_to_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.f%#.%#\t%q0, %q2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vqrshrnbq_n_u, vqrshrnbq_n_s] @@ -1604,7 +1704,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1623,7 +1724,8 @@ (define_insn "@mve_q_v4si" ] "TARGET_HAVE_MVE" ".32\t%Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -1639,7 +1741,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -1685,7 +1788,10 @@ (define_insn "mve_vshlcq_" (match_dup 4)] VSHLCQ))] "TARGET_HAVE_MVE" - "vshlc\t%q0, %1, %4") + "vshlc\t%q0, %1, %4" + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_")) + (set_attr "type" "mve_move") +]) ;; ;; [vabsq_m_s] @@ -1705,7 +1811,8 @@ (define_insn 
"@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1721,7 +1828,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1744,7 +1852,8 @@ (define_insn "@mve_vcmpq_m_n_" ] "TARGET_HAVE_MVE" "vpst\;vcmpt.%#\t, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1767,7 +1876,8 @@ (define_insn "@mve_vcmpq_m_" ] "TARGET_HAVE_MVE" "vpst\;vcmpt.%#\t, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1783,7 +1893,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1800,7 +1911,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.s%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1819,7 +1931,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1838,7 +1951,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -1857,7 +1971,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1878,7 +1993,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1894,7 +2010,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1910,7 +2027,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" "\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -1933,7 +2051,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -1950,7 +2069,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1967,7 +2087,8 @@ (define_insn "@mve_q_m_r_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %2" - [(set_attr "type" "mve_move") + 
[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_r_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1983,7 +2104,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1999,7 +2121,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -2015,7 +2138,8 @@ (define_insn "@mve_q_n_" ] "TARGET_HAVE_MVE" ".%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -2038,7 +2162,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2054,7 +2179,8 @@ (define_insn "@mve_q_p_v4si" ] "TARGET_HAVE_MVE" "vpst\;t.32\t%Q0, %R0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270]) @@ -2072,7 +2198,9 @@ (define_insn "@mve_q_f" "@ vcmul.f%# %q0, %q2, %q3, # vcmla.f%# %q0, %q2, %q3, #" - [(set_attr "type" "mve_move") + [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_q_f") + (symbol_ref "CODE_FOR_mve_q_f")]) + (set_attr "type" "mve_move") ]) ;; @@ -2093,7 +2221,8 @@ (define_insn "@mve_vcmpq_m_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%#\t, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2109,7 +2238,8 @@ (define_insn "mve_vcvtbq_m_f16_f32v8hf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtbt.f16.f32\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2125,7 +2255,8 @@ (define_insn "mve_vcvtbq_m_f32_f16v4sf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtbt.f32.f16\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2141,7 +2272,8 @@ (define_insn "mve_vcvttq_m_f16_f32v8hf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvttt.f16.f32\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2157,8 +2289,9 @@ (define_insn "mve_vcvttq_m_f32_f16v4sf" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvttt.f32.f16\t%q0, %q2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vdupq_m_n_f]) @@ -2173,7 +2306,8 @@ (define_insn "@mve_q_m_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_q_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2190,7 +2324,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2207,7 +2342,8 @@ (define_insn "@mve_q_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" ".f%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2224,7 +2360,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2243,7 +2380,8 @@ (define_insn "@mve_q_p_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.f%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2262,7 +2400,8 @@ (define_insn "@mve_q_" ] "TARGET_HAVE_MVE" ".%#\t%Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") ]) ;; @@ -2281,7 +2420,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2298,7 +2438,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2319,7 +2460,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2335,7 +2477,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.i%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2352,7 +2495,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.i%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2368,7 +2512,8 @@ (define_insn "@mve_q_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2384,7 +2529,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2400,7 +2546,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2416,7 +2563,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.%#\t%q0, 
%q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2435,7 +2583,8 @@ (define_insn "@mve_q_p_v4si" ] "TARGET_HAVE_MVE" "vpst\;t.32\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2451,7 +2600,8 @@ (define_insn "mve_vcvtmq_m_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtmt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2467,7 +2617,8 @@ (define_insn "mve_vcvtpq_m_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtpt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2483,7 +2634,8 @@ (define_insn "mve_vcvtnq_m_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtnt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2500,7 +2652,8 @@ (define_insn "mve_vcvtq_m_n_from_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2516,7 +2669,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2532,8 +2686,9 @@ (define_insn "mve_vcvtq_m_from_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vabavq_p_s, vabavq_p_u]) @@ -2549,7 +2704,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -2566,8 +2722,9 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\n\tt.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") - (set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") + (set_attr "length" "8")]) ;; ;; [vsriq_m_n_s, vsriq_m_n_u]) @@ -2583,8 +2740,9 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") - (set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") + (set_attr "length" "8")]) ;; ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s]) @@ -2600,7 +2758,8 @@ (define_insn "mve_vcvtq_m_n_to_f_" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.f%#.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2640,7 +2799,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" 
"vpst\;t.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2659,8 +2819,9 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.i%# %q0, %q2, %3" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vaddq_m_u, vaddq_m_s] @@ -2678,7 +2839,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.i%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2698,7 +2860,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2715,8 +2878,9 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vcaddq_rot90_m_u, vcaddq_rot90_m_s] @@ -2735,7 +2899,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %q3, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2763,7 +2928,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2784,7 +2950,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2802,7 +2969,8 @@ (define_insn "@mve_q_int_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_int_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2819,7 +2987,8 @@ (define_insn "mve_vornq_m_" ] "TARGET_HAVE_MVE" "vpst\;vornt\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2837,7 +3006,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2855,7 +3025,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2872,7 +3043,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2892,7 +3064,8 @@ (define_insn "@mve_q_p_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%Q0, %R0, %q2, %q3" - [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2920,7 +3093,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2940,7 +3114,8 @@ (define_insn "@mve_q_p_v4si" ] "TARGET_HAVE_MVE" "vpst\;t.32\t%Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_v4si")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2958,7 +3133,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2976,7 +3152,8 @@ (define_insn "@mve_q_poly_m_" ] "TARGET_HAVE_MVE" "vpst\;t.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_poly_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2994,7 +3171,8 @@ (define_insn "@mve_q_m_n_" ] "TARGET_HAVE_MVE" "vpst\;t.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3012,7 +3190,8 @@ (define_insn "@mve_q_m_" ] "TARGET_HAVE_MVE" "vpst\;t.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3036,7 +3215,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.f%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3057,7 +3237,8 @@ (define_insn "@mve_q_m_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.f%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3077,7 +3258,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3094,7 +3276,8 @@ (define_insn "@mve_q_m_n_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3116,7 +3299,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.f%#\t%q0, %q2, %q3, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3136,7 +3320,8 @@ (define_insn "@mve_q_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;t.f%#\t%q0, %q2, %q3, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3153,7 +3338,8 @@ (define_insn "mve_vornq_m_f" ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vornt\t%q0, %q2, 
%q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3173,7 +3359,8 @@ (define_insn "mve_vstrbq_" output_asm_insn("vstrb.\t%q1, %E0",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_")) + (set_attr "length" "4")]) ;; ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u] @@ -3201,7 +3388,8 @@ (define_insn "mve_vstrbq_scatter_offset__insn" VSTRBSOQ))] "TARGET_HAVE_MVE" "vstrb.\t%q2, [%0, %q1]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset__insn")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u] @@ -3223,7 +3411,8 @@ (define_insn "mve_vstrwq_scatter_base_v4si" output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_v4si")) + (set_attr "length" "4")]) ;; ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u] @@ -3246,7 +3435,8 @@ (define_insn "mve_vldrbq_gather_offset_" output_asm_insn ("vldrb.\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_")) + (set_attr "length" "4")]) ;; ;; [vldrbq_s vldrbq_u] @@ -3268,7 +3458,8 @@ (define_insn "mve_vldrbq_" output_asm_insn ("vldrb.\t%q0, %E1",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_")) + (set_attr "length" "4")]) ;; ;; [vldrwq_gather_base_s vldrwq_gather_base_u] @@ -3288,7 +3479,8 @@ (define_insn "mve_vldrwq_gather_base_v4si" output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_v4si")) + (set_attr "length" "4")]) ;; ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u] @@ -3320,7 +3512,8 @@ (define_insn "mve_vstrbq_scatter_offset_p__insn" VSTRBSOQ))] "TARGET_HAVE_MVE" "vpst\;vstrbt.\t%q2, [%0, %q1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset__insn")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u] @@ -3343,7 +3536,8 @@ (define_insn "mve_vstrwq_scatter_base_p_v4si" output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_v4si")) + (set_attr "length" "8")]) (define_insn "mve_vstrbq_p_" [(set (match_operand: 0 "mve_memory_operand" "=Ux") @@ -3361,7 +3555,8 @@ (define_insn "mve_vstrbq_p_" output_asm_insn ("vpst\;vstrbt.\t%q1, %E0",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_")) + (set_attr "length" "8")]) ;; ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u] @@ -3386,7 +3581,8 @@ (define_insn "mve_vldrbq_gather_offset_z_" output_asm_insn ("vpst\n\tvldrbt.\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_")) + (set_attr "length" "8")]) ;; ;; [vldrbq_z_s vldrbq_z_u] @@ -3409,7 +3605,8 @@ (define_insn "mve_vldrbq_z_" output_asm_insn ("vpst\;vldrbt.\t%q0, %E1",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_")) + (set_attr "length" "8")]) ;; ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u] @@ -3430,7 +3627,8 @@ (define_insn "mve_vldrwq_gather_base_z_v4si" output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_v4si")) + (set_attr "length" "8")]) ;; ;; [vldrhq_f] @@ -3449,7 +3647,8 @@ (define_insn "mve_vldrhq_fv8hf" output_asm_insn ("vldrh.16\t%q0, %E1",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf")) + (set_attr "length" "4")]) ;; ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u] @@ -3472,7 +3671,8 @@ (define_insn "mve_vldrhq_gather_offset_" output_asm_insn ("vldrh.\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_")) + (set_attr "length" "4")]) ;; ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u] @@ -3497,7 +3697,8 @@ (define_insn "mve_vldrhq_gather_offset_z_" output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_")) + (set_attr "length" "8")]) ;; ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u] @@ -3520,7 +3721,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_" output_asm_insn ("vldrh.\t%q0, [%m1, %q2, uxtw #1]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_")) + (set_attr "length" "4")]) ;; ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shited_offset_z_u] @@ -3545,7 +3747,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_z_" output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2, uxtw #1]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_")) + (set_attr "length" "8")]) ;; ;; [vldrhq_s, vldrhq_u] @@ -3567,7 +3770,8 @@ (define_insn "mve_vldrhq_" output_asm_insn ("vldrh.\t%q0, %E1",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_")) + (set_attr "length" "4")]) ;; ;; [vldrhq_z_f] @@ -3587,7 +3791,8 @@ (define_insn "mve_vldrhq_z_fv8hf" output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf")) + (set_attr "length" "8")]) ;; ;; [vldrhq_z_s vldrhq_z_u] @@ -3610,7 +3815,8 @@ (define_insn "mve_vldrhq_z_" output_asm_insn ("vpst\;vldrht.\t%q0, %E1",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_")) + (set_attr "length" "8")]) ;; ;; [vldrwq_f] @@ -3629,7 +3835,8 @@ (define_insn "mve_vldrwq_fv4sf" output_asm_insn ("vldrw.32\t%q0, %E1",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vldrwq_s vldrwq_u] @@ -3648,7 +3855,8 @@ (define_insn "mve_vldrwq_v4si" output_asm_insn ("vldrw.32\t%q0, %E1",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_v4si")) + (set_attr "length" "4")]) ;; ;; [vldrwq_z_f] @@ -3668,7 +3876,8 @@ (define_insn "mve_vldrwq_z_fv4sf" 
output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vldrwq_z_s vldrwq_z_u] @@ -3688,7 +3897,8 @@ (define_insn "mve_vldrwq_z_v4si" output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_v4si")) + (set_attr "length" "8")]) (define_expand "@mve_vld1q_f" [(match_operand:MVE_0 0 "s_register_operand") @@ -3728,7 +3938,8 @@ (define_insn "mve_vldrdq_gather_base_v2di" output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_v2di")) + (set_attr "length" "4")]) ;; ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u] @@ -3749,7 +3960,8 @@ (define_insn "mve_vldrdq_gather_base_z_v2di" output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_v2di")) + (set_attr "length" "8")]) ;; ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u] @@ -3769,7 +3981,8 @@ (define_insn "mve_vldrdq_gather_offset_v2di" output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_v2di")) + (set_attr "length" "4")]) ;; ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u] @@ -3790,7 +4003,8 @@ (define_insn "mve_vldrdq_gather_offset_z_v2di" output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_v2di")) + (set_attr "length" "8")]) ;; ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u] @@ -3810,7 +4024,8 @@ (define_insn "mve_vldrdq_gather_shifted_offset_v2di" output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_v2di")) + (set_attr "length" "4")]) ;; ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u] @@ -3831,7 +4046,8 @@ (define_insn "mve_vldrdq_gather_shifted_offset_z_v2di" output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_v2di")) + (set_attr "length" "8")]) ;; ;; [vldrhq_gather_offset_f] @@ -3851,7 +4067,8 @@ (define_insn "mve_vldrhq_gather_offset_fv8hf" output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf")) + (set_attr "length" "4")]) ;; ;; [vldrhq_gather_offset_z_f] @@ -3873,7 +4090,8 @@ (define_insn "mve_vldrhq_gather_offset_z_fv8hf" output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf")) + (set_attr "length" "8")]) ;; ;; [vldrhq_gather_shifted_offset_f] @@ -3893,7 +4111,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_fv8hf" output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf")) + (set_attr "length" "4")]) ;; ;; [vldrhq_gather_shifted_offset_z_f] @@ -3915,7 +4134,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_z_fv8hf" output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf")) + (set_attr "length" "8")]) ;; ;; [vldrwq_gather_base_f] @@ -3935,7 +4155,8 @@ (define_insn "mve_vldrwq_gather_base_fv4sf" output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vldrwq_gather_base_z_f] @@ -3956,7 +4177,8 @@ (define_insn "mve_vldrwq_gather_base_z_fv4sf" output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vldrwq_gather_offset_f] @@ -3976,7 +4198,8 @@ (define_insn "mve_vldrwq_gather_offset_fv4sf" output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u] @@ -3996,7 +4219,8 @@ (define_insn "mve_vldrwq_gather_offset_v4si" output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_v4si")) + (set_attr "length" "4")]) ;; ;; [vldrwq_gather_offset_z_f] @@ -4018,7 +4242,8 @@ (define_insn "mve_vldrwq_gather_offset_z_fv4sf" output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u] @@ -4040,7 +4265,8 @@ (define_insn "mve_vldrwq_gather_offset_z_v4si" output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_v4si")) + (set_attr "length" "8")]) ;; ;; [vldrwq_gather_shifted_offset_f] @@ -4060,7 +4286,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_fv4sf" output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u] @@ -4080,7 +4307,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_v4si" output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_v4si")) + (set_attr "length" "4")]) ;; ;; [vldrwq_gather_shifted_offset_z_f] @@ -4102,7 +4330,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_z_fv4sf" output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vldrwq_gather_shifted_offset_z_s 
vldrwq_gather_shifted_offset_z_u] @@ -4124,7 +4353,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_z_v4si" output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_v4si")) + (set_attr "length" "8")]) ;; ;; [vstrhq_f] @@ -4143,7 +4373,8 @@ (define_insn "mve_vstrhq_fv8hf" output_asm_insn ("vstrh.16\t%q1, %E0",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf")) + (set_attr "length" "4")]) ;; ;; [vstrhq_p_f] @@ -4164,7 +4395,8 @@ (define_insn "mve_vstrhq_p_fv8hf" output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf")) + (set_attr "length" "8")]) ;; ;; [vstrhq_p_s vstrhq_p_u] @@ -4186,7 +4418,8 @@ (define_insn "mve_vstrhq_p_" output_asm_insn ("vpst\;vstrht.\t%q1, %E0",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_")) + (set_attr "length" "8")]) ;; ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u] @@ -4218,7 +4451,8 @@ (define_insn "mve_vstrhq_scatter_offset_p__insn" VSTRHSOQ))] "TARGET_HAVE_MVE" "vpst\;vstrht.\t%q2, [%0, %q1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset__insn")) + (set_attr "length" "8")]) ;; ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u] @@ -4246,7 +4480,8 @@ (define_insn "mve_vstrhq_scatter_offset__insn" VSTRHSOQ))] "TARGET_HAVE_MVE" "vstrh.\t%q2, [%0, %q1]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset__insn")) + (set_attr "length" "4")]) ;; ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u] @@ -4278,7 +4513,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_p__insn" VSTRHSSOQ))] "TARGET_HAVE_MVE" "vpst\;vstrht.\t%q2, [%0, %q1, uxtw #1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset__insn")) + (set_attr "length" "8")]) ;; ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u] @@ -4307,7 +4543,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset__insn" VSTRHSSOQ))] "TARGET_HAVE_MVE" "vstrh.\t%q2, [%0, %q1, uxtw #1]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset__insn")) + (set_attr "length" "4")]) ;; ;; [vstrhq_s, vstrhq_u] @@ -4326,7 +4563,8 @@ (define_insn "mve_vstrhq_" output_asm_insn ("vstrh.\t%q1, %E0",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_")) + (set_attr "length" "4")]) ;; ;; [vstrwq_f] @@ -4345,7 +4583,8 @@ (define_insn "mve_vstrwq_fv4sf" output_asm_insn ("vstrw.32\t%q1, %E0",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vstrwq_p_f] @@ -4366,7 +4605,8 @@ (define_insn "mve_vstrwq_p_fv4sf" output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vstrwq_p_s vstrwq_p_u] @@ -4387,7 +4627,8 @@ (define_insn "mve_vstrwq_p_v4si" output_asm_insn ("vpst\;vstrwt.32\t%q1, 
%E0",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_v4si")) + (set_attr "length" "8")]) ;; ;; [vstrwq_s vstrwq_u] @@ -4406,7 +4647,8 @@ (define_insn "mve_vstrwq_v4si" output_asm_insn ("vstrw.32\t%q1, %E0",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_v4si")) + (set_attr "length" "4")]) (define_expand "@mve_vst1q_f" [(match_operand: 0 "mve_memory_operand") @@ -4449,7 +4691,8 @@ (define_insn "mve_vstrdq_scatter_base_p_v2di" output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_v2di")) + (set_attr "length" "8")]) ;; ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u] @@ -4471,7 +4714,8 @@ (define_insn "mve_vstrdq_scatter_base_v2di" output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_v2di")) + (set_attr "length" "4")]) ;; ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u] @@ -4502,7 +4746,8 @@ (define_insn "mve_vstrdq_scatter_offset_p_v2di_insn" VSTRDSOQ))] "TARGET_HAVE_MVE" "vpst\;vstrdt.64\t%q2, [%0, %q1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_v2di_insn")) + (set_attr "length" "8")]) ;; ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u] @@ -4530,7 +4775,8 @@ (define_insn "mve_vstrdq_scatter_offset_v2di_insn" VSTRDSOQ))] "TARGET_HAVE_MVE" "vstrd.64\t%q2, [%0, %q1]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_v2di_insn")) + (set_attr "length" "4")]) ;; ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u] @@ -4562,7 +4808,8 @@ (define_insn "mve_vstrdq_scatter_shifted_offset_p_v2di_insn" VSTRDSSOQ))] "TARGET_HAVE_MVE" "vpst\;vstrdt.64\t%q2, [%0, %q1, uxtw #3]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_v2di_insn")) + (set_attr "length" "8")]) ;; ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u] @@ -4591,7 +4838,8 @@ (define_insn "mve_vstrdq_scatter_shifted_offset_v2di_insn" VSTRDSSOQ))] "TARGET_HAVE_MVE" "vstrd.64\t%q2, [%0, %q1, uxtw #3]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_v2di_insn")) + (set_attr "length" "4")]) ;; ;; [vstrhq_scatter_offset_f] @@ -4619,7 +4867,8 @@ (define_insn "mve_vstrhq_scatter_offset_fv8hf_insn" VSTRHQSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vstrh.16\t%q2, [%0, %q1]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn")) + (set_attr "length" "4")]) ;; ;; [vstrhq_scatter_offset_p_f] @@ -4650,7 +4899,8 @@ (define_insn "mve_vstrhq_scatter_offset_p_fv8hf_insn" VSTRHQSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vstrht.16\t%q2, [%0, %q1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn")) + (set_attr "length" "8")]) ;; ;; [vstrhq_scatter_shifted_offset_f] @@ -4678,7 +4928,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_fv8hf_insn" VSTRHQSSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vstrh.16\t%q2, [%0, %q1, uxtw #1]" - [(set_attr "length" 
"4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn")) + (set_attr "length" "4")]) ;; ;; [vstrhq_scatter_shifted_offset_p_f] @@ -4710,7 +4961,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn" VSTRHQSSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_base_f] @@ -4732,7 +4984,8 @@ (define_insn "mve_vstrwq_scatter_base_fv4sf" output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_base_p_f] @@ -4755,7 +5008,8 @@ (define_insn "mve_vstrwq_scatter_base_p_fv4sf" output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_offset_f] @@ -4783,7 +5037,8 @@ (define_insn "mve_vstrwq_scatter_offset_fv4sf_insn" VSTRWQSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vstrw.32\t%q2, [%0, %q1]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_offset_p_f] @@ -4814,7 +5069,8 @@ (define_insn "mve_vstrwq_scatter_offset_p_fv4sf_insn" VSTRWQSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vstrwt.32\t%q2, [%0, %q1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u] @@ -4845,7 +5101,8 @@ (define_insn "mve_vstrwq_scatter_offset_p_v4si_insn" VSTRWSOQ))] "TARGET_HAVE_MVE" "vpst\;vstrwt.32\t%q2, [%0, %q1]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_v4si_insn")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u] @@ -4873,7 +5130,8 @@ (define_insn "mve_vstrwq_scatter_offset_v4si_insn" VSTRWSOQ))] "TARGET_HAVE_MVE" "vstrw.32\t%q2, [%0, %q1]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_v4si_insn")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_shifted_offset_f] @@ -4901,7 +5159,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_fv4sf_insn" VSTRWQSSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vstrw.32\t%q2, [%0, %q1, uxtw #2]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_shifted_offset_p_f] @@ -4933,7 +5192,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn" VSTRWQSSO_F))] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u] @@ -4965,7 +5225,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_p_v4si_insn" VSTRWSSOQ))] "TARGET_HAVE_MVE" 
"vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]" - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_v4si_insn")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u] @@ -4994,7 +5255,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_v4si_insn" VSTRWSSOQ))] "TARGET_HAVE_MVE" "vstrw.32\t%q2, [%0, %q1, uxtw #2]" - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_v4si_insn")) + (set_attr "length" "4")]) ;; ;; [vidupq_n_u]) @@ -5062,7 +5324,8 @@ (define_insn "mve_vidupq_m_wb_u_insn" (match_operand:SI 6 "immediate_operand" "i")))] "TARGET_HAVE_MVE" "vpst\;\tvidupt.u%#\t%q0, %2, %4" - [(set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u_insn")) + (set_attr "length""8")]) ;; ;; [vddupq_n_u]) @@ -5130,7 +5393,8 @@ (define_insn "mve_vddupq_m_wb_u_insn" (match_operand:SI 6 "immediate_operand" "i")))] "TARGET_HAVE_MVE" "vpst\;vddupt.u%#\t%q0, %2, %4" - [(set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u_insn")) + (set_attr "length""8")]) ;; ;; [vdwdupq_n_u]) @@ -5246,8 +5510,9 @@ (define_insn "mve_vdwdupq_m_wb_u_insn" ] "TARGET_HAVE_MVE" "vpst\;vdwdupt.u%#\t%q2, %3, %R4, %5" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u_insn")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [viwdupq_n_u]) @@ -5363,7 +5628,8 @@ (define_insn "mve_viwdupq_m_wb_u_insn" ] "TARGET_HAVE_MVE" "vpst\;\tviwdupt.u%#\t%q2, %3, %R4, %5" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u_insn")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5389,7 +5655,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_v4si" output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_v4si")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u] @@ -5415,7 +5682,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_p_v4si" output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_v4si")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_base_wb_f] @@ -5440,7 +5708,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_fv4sf" output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_base_wb_p_f] @@ -5466,7 +5735,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_p_fv4sf" output_asm_insn ("vpst\;vstrwt.u32\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u] @@ -5491,7 +5761,8 @@ (define_insn "mve_vstrdq_scatter_base_wb_v2di" output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_v2di")) + (set_attr "length" "4")]) ;; ;; 
[vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u] @@ -5517,7 +5788,8 @@ (define_insn "mve_vstrdq_scatter_base_wb_p_v2di" output_asm_insn ("vpst\;vstrdt.u64\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_v2di")) + (set_attr "length" "8")]) (define_expand "mve_vldrwq_gather_base_wb_v4si" [(match_operand:V4SI 0 "s_register_operand") @@ -5569,7 +5841,8 @@ (define_insn "mve_vldrwq_gather_base_wb_v4si_insn" output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_v4si_insn")) + (set_attr "length" "4")]) (define_expand "mve_vldrwq_gather_base_wb_z_v4si" [(match_operand:V4SI 0 "s_register_operand") @@ -5625,7 +5898,8 @@ (define_insn "mve_vldrwq_gather_base_wb_z_v4si_insn" output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_v4si_insn")) + (set_attr "length" "8")]) (define_expand "mve_vldrwq_gather_base_wb_fv4sf" [(match_operand:V4SI 0 "s_register_operand") @@ -5677,7 +5951,8 @@ (define_insn "mve_vldrwq_gather_base_wb_fv4sf_insn" output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn")) + (set_attr "length" "4")]) (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf" [(match_operand:V4SI 0 "s_register_operand") @@ -5734,7 +6009,8 @@ (define_insn "mve_vldrwq_gather_base_wb_z_fv4sf_insn" output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn")) + (set_attr "length" "8")]) (define_expand "mve_vldrdq_gather_base_wb_v2di" [(match_operand:V2DI 0 "s_register_operand") @@ -5787,7 +6063,8 @@ (define_insn "mve_vldrdq_gather_base_wb_v2di_insn" output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_v2di_insn")) + (set_attr "length" "4")]) (define_expand "mve_vldrdq_gather_base_wb_z_v2di" [(match_operand:V2DI 0 "s_register_operand") @@ -5826,7 +6103,7 @@ (define_insn "get_fpscr_nzcvqc" (unspec_volatile:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))] "TARGET_HAVE_MVE" "vmrs\\t%0, FPSCR_nzcvqc" - [(set_attr "type" "mve_move")]) + [(set_attr "type" "mve_move")]) (define_insn "set_fpscr_nzcvqc" [(set (reg:SI VFPCC_REGNUM) @@ -5834,7 +6111,7 @@ (define_insn "set_fpscr_nzcvqc" VUNSPEC_SET_FPSCR_NZCVQC))] "TARGET_HAVE_MVE" "vmsr\\tFPSCR_nzcvqc, %0" - [(set_attr "type" "mve_move")]) + [(set_attr "type" "mve_move")]) ;; ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u] @@ -5859,7 +6136,8 @@ (define_insn "mve_vldrdq_gather_base_wb_z_v2di_insn" output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_v2di_insn")) + (set_attr "length" "8")]) ;; ;; [vadciq_m_s, vadciq_m_u]) ;; @@ -5876,7 +6154,8 @@ (define_insn "mve_vadciq_m_v4si" ] "TARGET_HAVE_MVE" "vpst\;vadcit.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_v4si")) + (set_attr "type" 
"mve_move") (set_attr "length" "8")]) ;; @@ -5893,7 +6172,8 @@ (define_insn "mve_vadciq_v4si" ] "TARGET_HAVE_MVE" "vadci.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4")]) ;; @@ -5912,7 +6192,8 @@ (define_insn "mve_vadcq_m_v4si" ] "TARGET_HAVE_MVE" "vpst\;vadct.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -5929,7 +6210,8 @@ (define_insn "mve_vadcq_v4si" ] "TARGET_HAVE_MVE" "vadc.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4") (set_attr "conds" "set")]) @@ -5949,7 +6231,8 @@ (define_insn "mve_vsbciq_m_v4si" ] "TARGET_HAVE_MVE" "vpst\;vsbcit.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -5966,7 +6249,8 @@ (define_insn "mve_vsbciq_v4si" ] "TARGET_HAVE_MVE" "vsbci.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4")]) ;; @@ -5985,7 +6269,8 @@ (define_insn "mve_vsbcq_m_v4si" ] "TARGET_HAVE_MVE" "vpst\;vsbct.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -6002,7 +6287,8 @@ (define_insn "mve_vsbcq_v4si" ] "TARGET_HAVE_MVE" "vsbc.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4")]) ;; @@ -6031,7 +6317,7 @@ (define_insn "mve_vst2q" "vst21.\t{%q0, %q1}, %3", ops); return ""; } - [(set_attr "length" "8")]) + [(set_attr "length" "8")]) ;; ;; [vld2q]) @@ -6059,7 +6345,7 @@ (define_insn "mve_vld2q" "vld21.\t{%q0, %q1}, %3", ops); return ""; } - [(set_attr "length" "8")]) + [(set_attr "length" "8")]) ;; ;; [vld4q]) @@ -6402,7 +6688,8 @@ (define_insn "mve_vshlcq_m_" ] "TARGET_HAVE_MVE" "vpst\;vshlct\t%q0, %1, %4" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; CDE instructions on MVE registers. 
@@ -6414,7 +6701,8 @@ (define_insn "arm_vcx1qv16qi" UNSPEC_VCDE))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx1\\tp%c1, %q0, #%c2" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx1qav16qi" @@ -6425,7 +6713,8 @@ (define_insn "arm_vcx1qav16qi" UNSPEC_VCDEA))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx1a\\tp%c1, %q0, #%c3" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx2qv16qi" @@ -6436,7 +6725,8 @@ (define_insn "arm_vcx2qv16qi" UNSPEC_VCDE))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx2\\tp%c1, %q0, %q2, #%c3" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx2qav16qi" @@ -6448,7 +6738,8 @@ (define_insn "arm_vcx2qav16qi" UNSPEC_VCDEA))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx2a\\tp%c1, %q0, %q3, #%c4" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx3qv16qi" @@ -6460,7 +6751,8 @@ (define_insn "arm_vcx3qv16qi" UNSPEC_VCDE))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx3\\tp%c1, %q0, %q2, %q3, #%c4" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx3qav16qi" @@ -6473,7 +6765,8 @@ (define_insn "arm_vcx3qav16qi" UNSPEC_VCDEA))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx1q_p_v16qi" @@ -6485,7 +6778,8 @@ (define_insn "arm_vcx1q_p_v16qi" CDE_VCX))] "TARGET_CDE && TARGET_HAVE_MVE" "vpst\;vcx1t\\tp%c1, %q0, #%c3" - [(set_attr "type" "coproc") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi")) + (set_attr "type" "coproc") (set_attr "length" "8")] ) @@ -6499,7 +6793,8 @@ (define_insn "arm_vcx2q_p_v16qi" CDE_VCX))] "TARGET_CDE && TARGET_HAVE_MVE" "vpst\;vcx2t\\tp%c1, %q0, %q3, #%c4" - [(set_attr "type" "coproc") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi")) + (set_attr "type" "coproc") (set_attr "length" "8")] ) @@ -6514,11 +6809,12 @@ (define_insn "arm_vcx3q_p_v16qi" CDE_VCX))] "TARGET_CDE && TARGET_HAVE_MVE" "vpst\;vcx3t\\tp%c1, %q0, %q3, %q4, #%c5" - [(set_attr "type" "coproc") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi")) + (set_attr "type" "coproc") (set_attr "length" "8")] ) -(define_insn "*movmisalign_mve_store" +(define_insn "movmisalign_mve_store" [(set (match_operand:MVE_VLD_ST 0 "mve_memory_operand" "=Ux") (unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "s_register_operand" " w")] UNSPEC_MISALIGNED_ACCESS))] @@ -6526,11 +6822,12 @@ (define_insn "*movmisalign_mve_store" || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (mode))) && !BYTES_BIG_ENDIAN && unaligned_access" "vstr.\t%q1, %E0" - [(set_attr "type" "mve_store")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign_mve_store")) + (set_attr "type" "mve_store")] ) -(define_insn "*movmisalign_mve_load" +(define_insn "movmisalign_mve_load" [(set (match_operand:MVE_VLD_ST 0 "s_register_operand" "=w") (unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "mve_memory_operand" " Ux")] UNSPEC_MISALIGNED_ACCESS))] @@ -6538,7 
+6835,8 @@ (define_insn "*movmisalign_mve_load" || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (mode))) && !BYTES_BIG_ENDIAN && unaligned_access" "vldr.\t%q0, %E1" - [(set_attr "type" "mve_load")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign_mve_load")) + (set_attr "type" "mve_load")] ) ;; Expander for VxBI moves diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md index dac888535cb..ff1c27a0d71 100644 --- a/gcc/config/arm/vec-common.md +++ b/gcc/config/arm/vec-common.md @@ -366,7 +366,8 @@ (define_insn "@mve_q_" "@ .%#\t%0, %1, %2 * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], mode, VALID_NEON_QREG_MODE (mode), true);" - [(set_attr "type" "neon_shift_reg, neon_shift_imm")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_q_")) + (set_attr "type" "neon_shift_reg, neon_shift_imm")] ) (define_expand "vashl3"