From patchwork Fri Sep 8 01:03:13 2023
X-Patchwork-Submitter: Tsukasa OI
X-Patchwork-Id: 1831172
To: Tsukasa OI, Kito Cheng, Palmer Dabbelt, Andrew Waterman, Jim Wilson, Jeff Law
Cc: gcc-patches@gcc.gnu.org
Subject: [RFC PATCH 1/1] RISC-V: Make SHA-256, SM3 and SM4 builtins operate on uint32_t
Date: Fri, 8 Sep 2023 01:03:13 +0000
Message-ID: <59e884e254718724df55d9970f8049811081b130.1694134824.git.research_trasio@irq.a4lg.com>
From: Tsukasa OI

This is in parity with the LLVM commit 599421ae36c3 ("[RISCV] Re-define
sha256, Zksed, and Zksh intrinsics to use i32 types.").
The SHA-256, SM3 and SM4 instructions operate on 32-bit integers, and on RV64
the upper 32 bits have no effect on the result (the output is sign-extended
from the original 32-bit value).  In that sense, making those intrinsics
operate only on uint32_t is much more natural than operating on XLEN-bit-wide
integers.

This commit reworks the instructions and expansions following how 32-bit
instructions (such as ADDW) are handled on RV64.

Before:

riscv_<op>_si: For RV32, operate on uint32_t
riscv_<op>_di: For RV64, operate on uint64_t

After:

*riscv_<op>_si: For RV32, operate on uint32_t
riscv_<op>_di_extended: For RV64, the input is uint32_t and the output is a
                        sign-extended int64_t.
riscv_<op>_si: Common expansion that expands to either of the above.  On
               RV64, it extracts the lower 32 bits of the int64_t result.

It also refines the definitions of the SHA-256, SM3 and SM4 intrinsics
(a short usage sketch follows the ChangeLog below).

gcc/ChangeLog:

        * config/riscv/crypto.md (riscv_sha256sig0_<mode>,
        riscv_sha256sig1_<mode>, riscv_sha256sum0_<mode>,
        riscv_sha256sum1_<mode>, riscv_sm3p0_<mode>, riscv_sm3p1_<mode>,
        riscv_sm4ed_<mode>, riscv_sm4ks_<mode>): Remove and replace with
        new insn/expansions.
        (SHA256_OP, SM3_OP, SM4_OP): New iterators.
        (sha256_op, sm3_op, sm4_op): New attributes for iteration.
        (*riscv_<sha256_op>_si): New raw instruction for RV32.
        (*riscv_<sm3_op>_si): Ditto.
        (*riscv_<sm4_op>_si): Ditto.
        (riscv_<sha256_op>_di_extended): New base instruction for RV64.
        (riscv_<sm3_op>_di_extended): Ditto.
        (riscv_<sm4_op>_di_extended): Ditto.
        (riscv_<sha256_op>_si): New common instruction expansion.
        (riscv_<sm3_op>_si): Ditto.
        (riscv_<sm4_op>_si): Ditto.
        * config/riscv/riscv-builtins.cc: Add availability "crypto_zknh",
        "crypto_zksh" and "crypto_zksed".  Remove availability
        "crypto_zksh{32,64}" and "crypto_zksed{32,64}".
        * config/riscv/riscv-ftypes.def: Remove unused function type.
        * config/riscv/riscv-scalar-crypto.def: Make the SHA-256, SM3 and
        SM4 intrinsics operate on uint32_t.

gcc/testsuite/ChangeLog:

        * gcc.target/riscv/zknh-sha256.c: Moved to...
        * gcc.target/riscv/zknh-sha256-64.c: ...here.  Test RV64.
        * gcc.target/riscv/zknh-sha256-32.c: New test for RV32.
        * gcc.target/riscv/zksh64.c: Change the type.
        * gcc.target/riscv/zksed64.c: Ditto.
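To make the new interface concrete, here is a minimal usage sketch.  It
mirrors the updated tests; the function names and the exact -march/-mabi
strings are placeholders, and the same uint32_t-typed code now builds
unchanged for both RV32 and RV64:

    /* Example only; build with something like:
       riscv64-unknown-elf-gcc -O2 -march=rv64gc_zknh_zksed_zksh -mabi=lp64 -S  */
    #include <stdint.h>

    uint32_t sig0 (uint32_t rs1)
    {
      /* Zknh: expands to sha256sig0; on RV64 the expander narrows the
         sign-extended DImode result back to a 32-bit value.  */
      return __builtin_riscv_sha256sig0 (rs1);
    }

    uint32_t p0 (uint32_t rs1)
    {
      /* Zksh: expands to sm3p0.  */
      return __builtin_riscv_sm3p0 (rs1);
    }

    uint32_t ks (uint32_t rs1, uint32_t rs2, unsigned bs)
    {
      /* Zksed: expands to sm4ks; bs selects the byte position.  */
      return __builtin_riscv_sm4ks (rs1, rs2, bs);
    }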
---
 gcc/config/riscv/crypto.md                    | 161 ++++++++++++------
 gcc/config/riscv/riscv-builtins.cc            |   7 +-
 gcc/config/riscv/riscv-ftypes.def             |   1 -
 gcc/config/riscv/riscv-scalar-crypto.def      |  24 +--
 .../gcc.target/riscv/zknh-sha256-32.c         |  10 ++
 .../riscv/{zknh-sha256.c => zknh-sha256-64.c} |   8 +-
 gcc/testsuite/gcc.target/riscv/zksed64.c      |   4 +-
 gcc/testsuite/gcc.target/riscv/zksh64.c       |   4 +-
 8 files changed, 139 insertions(+), 80 deletions(-)
 create mode 100644 gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c
 rename gcc/testsuite/gcc.target/riscv/{zknh-sha256.c => zknh-sha256-64.c} (78%)

diff --git a/gcc/config/riscv/crypto.md b/gcc/config/riscv/crypto.md
index e4b7f0190dfe..03a1d03397d9 100644
--- a/gcc/config/riscv/crypto.md
+++ b/gcc/config/riscv/crypto.md
@@ -250,36 +250,47 @@
 
 ;; ZKNH - SHA256
 
-(define_insn "riscv_sha256sig0_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SIG0))]
-  "TARGET_ZKNH"
-  "sha256sig0\t%0,%1"
-  [(set_attr "type" "crypto")])
-
-(define_insn "riscv_sha256sig1_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SIG1))]
-  "TARGET_ZKNH"
-  "sha256sig1\t%0,%1"
+(define_int_iterator SHA256_OP [
+  UNSPEC_SHA_256_SIG0 UNSPEC_SHA_256_SIG1
+  UNSPEC_SHA_256_SUM0 UNSPEC_SHA_256_SUM1])
+(define_int_attr sha256_op [
+  (UNSPEC_SHA_256_SIG0 "sha256sig0") (UNSPEC_SHA_256_SIG1 "sha256sig1")
+  (UNSPEC_SHA_256_SUM0 "sha256sum0") (UNSPEC_SHA_256_SUM1 "sha256sum1")])
+
+(define_insn "*riscv_<sha256_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SHA256_OP))]
+  "TARGET_ZKNH && !TARGET_64BIT"
+  "<sha256_op>\t%0,%1"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sha256sum0_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SUM0))]
-  "TARGET_ZKNH"
-  "sha256sum0\t%0,%1"
+(define_insn "riscv_<sha256_op>_di_extended"
+  [(set (match_operand:DI 0 "register_operand" "=r")
+        (sign_extend:DI
+            (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                       SHA256_OP)))]
+  "TARGET_ZKNH && TARGET_64BIT"
+  "<sha256_op>\t%0,%1"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sha256sum1_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SUM1))]
+(define_expand "riscv_<sha256_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SHA256_OP))]
   "TARGET_ZKNH"
-  "sha256sum1\t%0,%1"
+  {
+    if (TARGET_64BIT)
+      {
+        rtx t = gen_reg_rtx (DImode);
+        emit_insn (gen_riscv_<sha256_op>_di_extended (t, operands[1]));
+        t = gen_lowpart (SImode, t);
+        SUBREG_PROMOTED_VAR_P (t) = 1;
+        SUBREG_PROMOTED_SET (t, SRP_SIGNED);
+        emit_move_insn (operands[0], t);
+        DONE;
+      }
+  }
   [(set_attr "type" "crypto")])
 
 ;; ZKNH - SHA512
@@ -372,40 +383,88 @@
 
 ;; ZKSH
 
-(define_insn "riscv_sm3p0_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SM3_P0))]
-  "TARGET_ZKSH"
-  "sm3p0\t%0,%1"
+(define_int_iterator SM3_OP [UNSPEC_SM3_P0 UNSPEC_SM3_P1])
+(define_int_attr sm3_op [(UNSPEC_SM3_P0 "sm3p0") (UNSPEC_SM3_P1 "sm3p1")])
+
+(define_insn "*riscv_<sm3_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SM3_OP))]
+  "TARGET_ZKSH && !TARGET_64BIT"
+  "<sm3_op>\t%0,%1"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sm3p1_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SM3_P1))]
+(define_insn "riscv_<sm3_op>_di_extended"
+  [(set (match_operand:DI 0 "register_operand" "=r")
+        (sign_extend:DI
+            (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                       SM3_OP)))]
+  "TARGET_ZKSH && TARGET_64BIT"
+  "<sm3_op>\t%0,%1"
+  [(set_attr "type" "crypto")])
+
+(define_expand "riscv_<sm3_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SM3_OP))]
   "TARGET_ZKSH"
-  "sm3p1\t%0,%1"
+  {
+    if (TARGET_64BIT)
+      {
+        rtx t = gen_reg_rtx (DImode);
+        emit_insn (gen_riscv_<sm3_op>_di_extended (t, operands[1]));
+        t = gen_lowpart (SImode, t);
+        SUBREG_PROMOTED_VAR_P (t) = 1;
+        SUBREG_PROMOTED_SET (t, SRP_SIGNED);
+        emit_move_insn (operands[0], t);
+        DONE;
+      }
+  }
   [(set_attr "type" "crypto")])
 
 ;; ZKSED
 
-(define_insn "riscv_sm4ed_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")
-                  (match_operand:X 2 "register_operand" "r")
-                  (match_operand:SI 3 "register_operand" "D03")]
-                  UNSPEC_SM4_ED))]
-  "TARGET_ZKSED"
-  "sm4ed\t%0,%1,%2,%3"
+(define_int_iterator SM4_OP [UNSPEC_SM4_ED UNSPEC_SM4_KS])
+(define_int_attr sm4_op [(UNSPEC_SM4_ED "sm4ed") (UNSPEC_SM4_KS "sm4ks")])
+
+(define_insn "*riscv_<sm4_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")
+                   (match_operand:SI 2 "register_operand" "r")
+                   (match_operand:SI 3 "register_operand" "D03")]
+                   SM4_OP))]
+  "TARGET_ZKSED && !TARGET_64BIT"
+  "<sm4_op>\t%0,%1,%2,%3"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sm4ks_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")
-                  (match_operand:X 2 "register_operand" "r")
-                  (match_operand:SI 3 "register_operand" "D03")]
-                  UNSPEC_SM4_KS))]
+(define_insn "riscv_<sm4_op>_di_extended"
+  [(set (match_operand:DI 0 "register_operand" "=r")
+        (sign_extend:DI
+            (unspec:SI [(match_operand:SI 1 "register_operand" "r")
+                       (match_operand:SI 2 "register_operand" "r")
+                       (match_operand:SI 3 "register_operand" "D03")]
+                       SM4_OP)))]
+  "TARGET_ZKSED && TARGET_64BIT"
+  "<sm4_op>\t%0,%1,%2,%3"
+  [(set_attr "type" "crypto")])
+
+(define_expand "riscv_<sm4_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")
+                   (match_operand:SI 2 "register_operand" "r")
+                   (match_operand:SI 3 "register_operand" "D03")]
+                   SM4_OP))]
   "TARGET_ZKSED"
-  "sm4ks\t%0,%1,%2,%3"
+  {
+    if (TARGET_64BIT)
+      {
+        rtx t = gen_reg_rtx (DImode);
+        emit_insn (gen_riscv_<sm4_op>_di_extended (t, operands[1], operands[2], operands[3]));
+        t = gen_lowpart (SImode, t);
+        SUBREG_PROMOTED_VAR_P (t) = 1;
+        SUBREG_PROMOTED_SET (t, SRP_SIGNED);
+        emit_move_insn (operands[0], t);
+        DONE;
+      }
+  }
   [(set_attr "type" "crypto")])
diff --git a/gcc/config/riscv/riscv-builtins.cc b/gcc/config/riscv/riscv-builtins.cc
index f6b06b3c16ac..3fe3a89dcc25 100644
--- a/gcc/config/riscv/riscv-builtins.cc
+++ b/gcc/config/riscv/riscv-builtins.cc
@@ -112,12 +112,11 @@ AVAIL (crypto_zknd64, TARGET_ZKND && TARGET_64BIT)
 AVAIL (crypto_zkne32, TARGET_ZKNE && !TARGET_64BIT)
 AVAIL (crypto_zkne64, TARGET_ZKNE && TARGET_64BIT)
 AVAIL (crypto_zkne_or_zknd, (TARGET_ZKNE || TARGET_ZKND) && TARGET_64BIT)
+AVAIL (crypto_zknh, TARGET_ZKNH)
 AVAIL (crypto_zknh32, TARGET_ZKNH && !TARGET_64BIT)
 AVAIL (crypto_zknh64, TARGET_ZKNH && TARGET_64BIT)
-AVAIL (crypto_zksh32, TARGET_ZKSH && !TARGET_64BIT)
-AVAIL (crypto_zksh64, TARGET_ZKSH && TARGET_64BIT)
-AVAIL (crypto_zksed32, TARGET_ZKSED && !TARGET_64BIT)
-AVAIL (crypto_zksed64, TARGET_ZKSED && TARGET_64BIT)
+AVAIL (crypto_zksh, TARGET_ZKSH)
+AVAIL (crypto_zksed, TARGET_ZKSED)
 AVAIL (clmul_zbkc32_or_zbc32, (TARGET_ZBKC || TARGET_ZBC) && !TARGET_64BIT)
 AVAIL (clmul_zbkc64_or_zbc64, (TARGET_ZBKC || TARGET_ZBC) && TARGET_64BIT)
 AVAIL (clmulr_zbc32, TARGET_ZBC && !TARGET_64BIT)
diff --git a/gcc/config/riscv/riscv-ftypes.def b/gcc/config/riscv/riscv-ftypes.def
index 366861ce640e..33620c57ca06 100644
--- a/gcc/config/riscv/riscv-ftypes.def
+++ b/gcc/config/riscv/riscv-ftypes.def
@@ -41,4 +41,3 @@ DEF_RISCV_FTYPE (2, (UDI, USI, USI))
 DEF_RISCV_FTYPE (2, (UDI, UDI, USI))
 DEF_RISCV_FTYPE (2, (UDI, UDI, UDI))
 DEF_RISCV_FTYPE (3, (USI, USI, USI, USI))
-DEF_RISCV_FTYPE (3, (UDI, UDI, UDI, USI))
diff --git a/gcc/config/riscv/riscv-scalar-crypto.def b/gcc/config/riscv/riscv-scalar-crypto.def
index db86ec9fd78a..3db9ed4a03e5 100644
--- a/gcc/config/riscv/riscv-scalar-crypto.def
+++ b/gcc/config/riscv/riscv-scalar-crypto.def
@@ -54,14 +54,10 @@ DIRECT_BUILTIN (aes64es, RISCV_UDI_FTYPE_UDI_UDI, crypto_zkne64),
 DIRECT_BUILTIN (aes64esm, RISCV_UDI_FTYPE_UDI_UDI, crypto_zkne64),
 
 // ZKNH
-RISCV_BUILTIN (sha256sig0_si, "sha256sig0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sig0_di, "sha256sig0", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
-RISCV_BUILTIN (sha256sig1_si, "sha256sig1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sig1_di, "sha256sig1", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
-RISCV_BUILTIN (sha256sum0_si, "sha256sum0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sum0_di, "sha256sum0", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
-RISCV_BUILTIN (sha256sum1_si, "sha256sum1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sum1_di, "sha256sum1", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
+RISCV_BUILTIN (sha256sig0_si, "sha256sig0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
+RISCV_BUILTIN (sha256sig1_si, "sha256sig1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
+RISCV_BUILTIN (sha256sum0_si, "sha256sum0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
+RISCV_BUILTIN (sha256sum1_si, "sha256sum1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
 
 DIRECT_BUILTIN (sha512sig0h, RISCV_USI_FTYPE_USI_USI, crypto_zknh32),
 DIRECT_BUILTIN (sha512sig0l, RISCV_USI_FTYPE_USI_USI, crypto_zknh32),
@@ -76,13 +72,9 @@ DIRECT_BUILTIN (sha512sum0, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
 DIRECT_BUILTIN (sha512sum1, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
 
 // ZKSH
-RISCV_BUILTIN (sm3p0_si, "sm3p0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh32),
-RISCV_BUILTIN (sm3p0_di, "sm3p0", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zksh64),
-RISCV_BUILTIN (sm3p1_si, "sm3p1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh32),
-RISCV_BUILTIN (sm3p1_di, "sm3p1", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zksh64),
+RISCV_BUILTIN (sm3p0_si, "sm3p0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh),
+RISCV_BUILTIN (sm3p1_si, "sm3p1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh),
 
 // ZKSED
-RISCV_BUILTIN (sm4ed_si, "sm4ed", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed32),
-RISCV_BUILTIN (sm4ed_di, "sm4ed", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI_UDI_USI, crypto_zksed64),
-RISCV_BUILTIN (sm4ks_si, "sm4ks", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed32),
-RISCV_BUILTIN (sm4ks_di, "sm4ks", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI_UDI_USI, crypto_zksed64),
+RISCV_BUILTIN (sm4ed_si, "sm4ed", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed),
+RISCV_BUILTIN (sm4ks_si, "sm4ks", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed),
diff --git a/gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c b/gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c
new file mode 100644
index 000000000000..c51b143a8a5c
--- /dev/null
+++ b/gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c
@@ -0,0 +1,10 @@
+/* { dg-do compile } */
+/* { dg-options "-O2 -march=rv32gc_zknh -mabi=ilp32d" } */
+/* { dg-skip-if "" { *-*-* } { "-g" "-flto"} } */
+
+#include "zknh-sha256-64.c"
+
+/* { dg-final { scan-assembler-times "sha256sig0" 1 } } */
+/* { dg-final { scan-assembler-times "sha256sig1" 1 } } */
+/* { dg-final { scan-assembler-times "sha256sum0" 1 } } */
+/* { dg-final { scan-assembler-times "sha256sum1" 1 } } */
diff --git a/gcc/testsuite/gcc.target/riscv/zknh-sha256.c b/gcc/testsuite/gcc.target/riscv/zknh-sha256-64.c
similarity index 78%
rename from gcc/testsuite/gcc.target/riscv/zknh-sha256.c
rename to gcc/testsuite/gcc.target/riscv/zknh-sha256-64.c
index 952d611cd0b9..2ef37601e6fb 100644
--- a/gcc/testsuite/gcc.target/riscv/zknh-sha256.c
+++ b/gcc/testsuite/gcc.target/riscv/zknh-sha256-64.c
@@ -2,22 +2,22 @@
 /* { dg-options "-O2 -march=rv64gc_zknh -mabi=lp64" } */
 /* { dg-skip-if "" { *-*-* } { "-g" "-flto"} } */
 
-unsigned long foo1(unsigned long rs1)
+unsigned int foo1(unsigned int rs1)
 {
     return __builtin_riscv_sha256sig0(rs1);
 }
 
-unsigned long foo2(unsigned long rs1)
+unsigned int foo2(unsigned int rs1)
 {
     return __builtin_riscv_sha256sig1(rs1);
 }
 
-unsigned long foo3(unsigned long rs1)
+unsigned int foo3(unsigned int rs1)
 {
     return __builtin_riscv_sha256sum0(rs1);
 }
 
-unsigned long foo4(unsigned long rs1)
+unsigned int foo4(unsigned int rs1)
 {
     return __builtin_riscv_sha256sum1(rs1);
 }
diff --git a/gcc/testsuite/gcc.target/riscv/zksed64.c b/gcc/testsuite/gcc.target/riscv/zksed64.c
index 3485adf9cd88..913e7be4e4d9 100644
--- a/gcc/testsuite/gcc.target/riscv/zksed64.c
+++ b/gcc/testsuite/gcc.target/riscv/zksed64.c
@@ -4,12 +4,12 @@
 
 #include <stdint.h>
 
-uint64_t foo1(uint64_t rs1, uint64_t rs2, unsigned bs)
+uint32_t foo1(uint32_t rs1, uint32_t rs2, unsigned bs)
 {
     return __builtin_riscv_sm4ks(rs1,rs2,bs);
 }
 
-uint64_t foo2(uint64_t rs1, uint64_t rs2, unsigned bs)
+uint32_t foo2(uint32_t rs1, uint32_t rs2, unsigned bs)
 {
     return __builtin_riscv_sm4ed(rs1,rs2,bs);
 }
diff --git a/gcc/testsuite/gcc.target/riscv/zksh64.c b/gcc/testsuite/gcc.target/riscv/zksh64.c
index bdd137872785..30bb1bdeeeb7 100644
--- a/gcc/testsuite/gcc.target/riscv/zksh64.c
+++ b/gcc/testsuite/gcc.target/riscv/zksh64.c
@@ -4,12 +4,12 @@
 
 #include <stdint.h>
 
-uint64_t foo1(uint64_t rs1)
+uint32_t foo1(uint32_t rs1)
 {
     return __builtin_riscv_sm3p0(rs1);
 }
 
-uint64_t foo2(uint64_t rs1)
+uint32_t foo2(uint32_t rs1)
 {
     return __builtin_riscv_sm3p1(rs1);
 }