
[2/2] RISC-V: Make SHA-256, SM3 and SM4 builtins operate on uint32_t

Message ID 20f656ae84cf4312a901eebb9bb784a651066e55.1694482087.git.research_trasio@irq.a4lg.com
State New
Series RISC-V: Change RISC-V bit manipulation / scalar crypto builtin types

Commit Message

Tsukasa OI Sept. 12, 2023, 1:28 a.m. UTC
From: Tsukasa OI <research_trasio@irq.a4lg.com>

This brings GCC to parity with the LLVM commit a64b3e92c7cb ("[RISCV] Re-define
sha256, Zksed, and Zksh intrinsics to use i32 types.").

SHA-256, SM3 and SM4 instructions operate on 32-bit integers, and the upper
32 bits have no effect on RV64 (the output is sign-extended from the
original 32-bit value).  In that sense, making those intrinsics operate
only on uint32_t is much more natural than using XLEN-bit-wide integers.

This commit reworks the relevant instructions and expansions, modeling them
on how RV64 handles 32-bit instructions (such as ADDW).

Before:
   riscv_<op>_si: For RV32, fully operates on uint32_t.
   riscv_<op>_di: For RV64, fully operates on uint64_t.
After:
  *riscv_<op>_si: For RV32, fully operates on uint32_t.
   riscv_<op>_di_extended:
                  For RV64.  Input is uint32_t and output is int64_t,
                  sign-extended from the int32_t result
                  (represents a part of <op> behavior).
   riscv_<op>_si: Common (fully operates on uint32_t).
                  On RV32, "expands" to *riscv_<op>_si.
                  On RV64, initially expands to riscv_<op>_di_extended *and*
                  extracts the lower 32 bits from the int64_t result.

It also refines the definitions of the SHA-256, SM3 and SM4 intrinsics.
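
For reference, a minimal usage sketch with the new types (the
__builtin_riscv_* names and uint32_t signatures match the updated tests;
the wrapper function names here are only illustrative):

    #include <stdint.h>

    uint32_t sig0 (uint32_t rs1)
    {
      /* On RV32 this maps directly to the SImode insn; on RV64 it expands
         to the *_di_extended insn and then takes the lower 32 bits of the
         sign-extended result.  */
      return __builtin_riscv_sha256sig0 (rs1);
    }

    uint32_t sm4_enc_step (uint32_t rs1, uint32_t rs2, unsigned bs)
    {
      /* bs selects a byte position (0-3), as in the updated zksed64.c.  */
      return __builtin_riscv_sm4ed (rs1, rs2, bs);
    }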

gcc/ChangeLog:

	* config/riscv/crypto.md (riscv_sha256sig0_<mode>,
	riscv_sha256sig1_<mode>, riscv_sha256sum0_<mode>,
	riscv_sha256sum1_<mode>, riscv_sm3p0_<mode>, riscv_sm3p1_<mode>,
	riscv_sm4ed_<mode>, riscv_sm4ks_<mode>): Remove and replace with
	new insn/expansions.
	(SHA256_OP, SM3_OP, SM4_OP): New iterators.
	(sha256_op, sm3_op, sm4_op): New attributes for iteration.
	(*riscv_<sha256_op>_si): New raw instruction for RV32.
	(*riscv_<sm3_op>_si): Ditto.
	(*riscv_<sm4_op>_si): Ditto.
	(riscv_<sha256_op>_di_extended): New base instruction for RV64.
	(riscv_<sm3_op>_di_extended): Ditto.
	(riscv_<sm4_op>_di_extended): Ditto.
	(riscv_<sha256_op>_si): New common instruction expansion.
	(riscv_<sm3_op>_si): Ditto.
	(riscv_<sm4_op>_si): Ditto.
	* config/riscv/riscv-builtins.cc: Add availability "crypto_zknh",
	"crypto_zksh" and "crypto_zksed".  Remove availability
	"crypto_zksh{32,64}" and "crypto_zksed{32,64}".
	* config/riscv/riscv-ftypes.def: Remove unused function type.
	* config/riscv/riscv-scalar-crypto.def: Make SHA-256, SM3 and SM4
	intrinsics operate on uint32_t.

gcc/testsuite/ChangeLog:

	* gcc.target/riscv/zknh-sha256.c: Moved to...
	* gcc.target/riscv/zknh-sha256-64.c: ...here.  Test RV64.
	* gcc.target/riscv/zknh-sha256-32.c: New test for RV32.
	* gcc.target/riscv/zksh64.c: Change the type.
	* gcc.target/riscv/zksed64.c: Ditto.
---
 gcc/config/riscv/crypto.md                    | 161 ++++++++++++------
 gcc/config/riscv/riscv-builtins.cc            |   7 +-
 gcc/config/riscv/riscv-ftypes.def             |   1 -
 gcc/config/riscv/riscv-scalar-crypto.def      |  24 +--
 .../gcc.target/riscv/zknh-sha256-32.c         |  10 ++
 .../riscv/{zknh-sha256.c => zknh-sha256-64.c} |   8 +-
 gcc/testsuite/gcc.target/riscv/zksed64.c      |   4 +-
 gcc/testsuite/gcc.target/riscv/zksh64.c       |   4 +-
 8 files changed, 139 insertions(+), 80 deletions(-)
 create mode 100644 gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c
 rename gcc/testsuite/gcc.target/riscv/{zknh-sha256.c => zknh-sha256-64.c} (78%)

Comments

Kito Cheng Sept. 12, 2023, 2:44 a.m. UTC | #1
LGTM.  It is really unfortunate that LLVM and GCC are inconsistent for those
intrinsic APIs... so I really appreciate making those APIs align :)
And do you have a plan to add riscv_crypto.h after updating/fixing all the builtins?

Tsukasa OI Sept. 12, 2023, 3:20 a.m. UTC | #2
On 2023/09/12 11:44, Kito Cheng wrote:
> LGTM.  It is really unfortunate that LLVM and GCC are inconsistent for those
> intrinsic APIs... so I really appreciate making those APIs align :)

I take it you mean LGTM for this patch set (given the context).
If so, I will commit this patch set later.

> And do you have a plan to add riscv_crypto.h after updating/fixing all the builtins?

Like riscv_vector.h?  I don't have a plan yet, and I think we first need to
discuss what to include in it, for example in the Toolchain SIG.
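
For illustration only, a sketch of what such a header could provide (the
header name and the __riscv_* wrapper names below are placeholders, not a
settled API; only the __builtin_riscv_* builtins exist today):

    /* riscv_crypto.h (hypothetical sketch) */
    #include <stdint.h>

    static inline uint32_t __riscv_sha256sig0 (uint32_t rs1)
    {
      return __builtin_riscv_sha256sig0 (rs1);
    }

    static inline uint32_t __riscv_sm3p0 (uint32_t rs1)
    {
      return __builtin_riscv_sm3p0 (rs1);
    }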

Additionally, we need to discuss what to do with the XLEN-specific builtins
that are specific to LLVM (my idea is to add XLEN-specific builtins like
LLVM's, but not to remove the XLEN-independent ones).

Thanks,
Tsukasa

> Excerpt from the latest clang/include/clang/Basic/BuiltinsRISCV.def:
> 
> // Zbb extension
> TARGET_BUILTIN(__builtin_riscv_orc_b_32, "UiUi", "nc", "zbb")
> TARGET_BUILTIN(__builtin_riscv_orc_b_64, "UWiUWi", "nc", "zbb,64bit")
> TARGET_BUILTIN(__builtin_riscv_clz_32, "UiUi", "nc", "zbb|xtheadbb")
> TARGET_BUILTIN(__builtin_riscv_clz_64, "UiUWi", "nc", "zbb|xtheadbb,64bit")
> TARGET_BUILTIN(__builtin_riscv_ctz_32, "UiUi", "nc", "zbb")
> TARGET_BUILTIN(__builtin_riscv_ctz_64, "UiUWi", "nc", "zbb,64bit")
> 
> // Zbc or Zbkc extension
> TARGET_BUILTIN(__builtin_riscv_clmul_32, "UiUiUi", "nc", "zbc|zbkc")
> TARGET_BUILTIN(__builtin_riscv_clmul_64, "UWiUWiUWi", "nc", "zbc|zbkc,64bit")
> TARGET_BUILTIN(__builtin_riscv_clmulh_32, "UiUiUi", "nc", "zbc|zbkc,32bit")
> TARGET_BUILTIN(__builtin_riscv_clmulh_64, "UWiUWiUWi", "nc", "zbc|zbkc,64bit")
> TARGET_BUILTIN(__builtin_riscv_clmulr_32, "UiUiUi", "nc", "zbc,32bit")
> TARGET_BUILTIN(__builtin_riscv_clmulr_64, "UWiUWiUWi", "nc", "zbc,64bit")
> 
> // Zbkx
> TARGET_BUILTIN(__builtin_riscv_xperm4_32, "UiUiUi", "nc", "zbkx,32bit")
> TARGET_BUILTIN(__builtin_riscv_xperm4_64, "UWiUWiUWi", "nc", "zbkx,64bit")
> TARGET_BUILTIN(__builtin_riscv_xperm8_32, "UiUiUi", "nc", "zbkx,32bit")
> TARGET_BUILTIN(__builtin_riscv_xperm8_64, "UWiUWiUWi", "nc", "zbkx,64bit")
> 
> // Zbkb extension
> TARGET_BUILTIN(__builtin_riscv_brev8_32, "UiUi", "nc", "zbkb")
> TARGET_BUILTIN(__builtin_riscv_brev8_64, "UWiUWi", "nc", "zbkb,64bit")
> TARGET_BUILTIN(__builtin_riscv_zip_32, "UiUi", "nc", "zbkb,32bit")
> TARGET_BUILTIN(__builtin_riscv_unzip_32, "UiUi", "nc", "zbkb,32bit")

Patch

diff --git a/gcc/config/riscv/crypto.md b/gcc/config/riscv/crypto.md
index e4b7f0190dfe..03a1d03397d9 100644
--- a/gcc/config/riscv/crypto.md
+++ b/gcc/config/riscv/crypto.md
@@ -250,36 +250,47 @@ 
 
 ;; ZKNH - SHA256
 
-(define_insn "riscv_sha256sig0_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SIG0))]
-  "TARGET_ZKNH"
-  "sha256sig0\t%0,%1"
-  [(set_attr "type" "crypto")])
-
-(define_insn "riscv_sha256sig1_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SIG1))]
-  "TARGET_ZKNH"
-  "sha256sig1\t%0,%1"
+(define_int_iterator SHA256_OP [
+  UNSPEC_SHA_256_SIG0 UNSPEC_SHA_256_SIG1
+  UNSPEC_SHA_256_SUM0 UNSPEC_SHA_256_SUM1])
+(define_int_attr sha256_op [
+  (UNSPEC_SHA_256_SIG0 "sha256sig0") (UNSPEC_SHA_256_SIG1 "sha256sig1")
+  (UNSPEC_SHA_256_SUM0 "sha256sum0") (UNSPEC_SHA_256_SUM1 "sha256sum1")])
+
+(define_insn "*riscv_<sha256_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SHA256_OP))]
+  "TARGET_ZKNH && !TARGET_64BIT"
+  "<sha256_op>\t%0,%1"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sha256sum0_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SUM0))]
-  "TARGET_ZKNH"
-  "sha256sum0\t%0,%1"
+(define_insn "riscv_<sha256_op>_di_extended"
+  [(set (match_operand:DI 0 "register_operand" "=r")
+        (sign_extend:DI
+             (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                        SHA256_OP)))]
+  "TARGET_ZKNH && TARGET_64BIT"
+  "<sha256_op>\t%0,%1"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sha256sum1_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SHA_256_SUM1))]
+(define_expand "riscv_<sha256_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SHA256_OP))]
   "TARGET_ZKNH"
-  "sha256sum1\t%0,%1"
+  {
+    if (TARGET_64BIT)
+      {
+        rtx t = gen_reg_rtx (DImode);
+        emit_insn (gen_riscv_<sha256_op>_di_extended (t, operands[1]));
+        t = gen_lowpart (SImode, t);
+        SUBREG_PROMOTED_VAR_P (t) = 1;
+        SUBREG_PROMOTED_SET (t, SRP_SIGNED);
+        emit_move_insn (operands[0], t);
+        DONE;
+      }
+  }
   [(set_attr "type" "crypto")])
 
 ;; ZKNH - SHA512
@@ -372,40 +383,88 @@ 
 
  ;; ZKSH
 
-(define_insn "riscv_sm3p0_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SM3_P0))]
-  "TARGET_ZKSH"
-  "sm3p0\t%0,%1"
+(define_int_iterator SM3_OP [UNSPEC_SM3_P0 UNSPEC_SM3_P1])
+(define_int_attr sm3_op [(UNSPEC_SM3_P0 "sm3p0") (UNSPEC_SM3_P1 "sm3p1")])
+
+(define_insn "*riscv_<sm3_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SM3_OP))]
+  "TARGET_ZKSH && !TARGET_64BIT"
+  "<sm3_op>\t%0,%1"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sm3p1_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")]
-                  UNSPEC_SM3_P1))]
+(define_insn "riscv_<sm3_op>_di_extended"
+  [(set (match_operand:DI 0 "register_operand" "=r")
+        (sign_extend:DI
+             (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                        SM3_OP)))]
+  "TARGET_ZKSH && TARGET_64BIT"
+  "<sm3_op>\t%0,%1"
+  [(set_attr "type" "crypto")])
+
+(define_expand "riscv_<sm3_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")]
+                   SM3_OP))]
   "TARGET_ZKSH"
-  "sm3p1\t%0,%1"
+  {
+    if (TARGET_64BIT)
+      {
+        rtx t = gen_reg_rtx (DImode);
+        emit_insn (gen_riscv_<sm3_op>_di_extended (t, operands[1]));
+        t = gen_lowpart (SImode, t);
+        SUBREG_PROMOTED_VAR_P (t) = 1;
+        SUBREG_PROMOTED_SET (t, SRP_SIGNED);
+        emit_move_insn (operands[0], t);
+        DONE;
+      }
+  }
   [(set_attr "type" "crypto")])
 
 ;; ZKSED
 
-(define_insn "riscv_sm4ed_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")
-                  (match_operand:X 2 "register_operand" "r")
-                  (match_operand:SI 3 "register_operand" "D03")]
-                  UNSPEC_SM4_ED))]
-  "TARGET_ZKSED"
-  "sm4ed\t%0,%1,%2,%3"
+(define_int_iterator SM4_OP [UNSPEC_SM4_ED UNSPEC_SM4_KS])
+(define_int_attr sm4_op [(UNSPEC_SM4_ED "sm4ed") (UNSPEC_SM4_KS "sm4ks")])
+
+(define_insn "*riscv_<sm4_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")
+                   (match_operand:SI 2 "register_operand" "r")
+                   (match_operand:SI 3 "register_operand" "D03")]
+                   SM4_OP))]
+  "TARGET_ZKSED && !TARGET_64BIT"
+  "<sm4_op>\t%0,%1,%2,%3"
   [(set_attr "type" "crypto")])
 
-(define_insn "riscv_sm4ks_<mode>"
-  [(set (match_operand:X 0 "register_operand" "=r")
-        (unspec:X [(match_operand:X 1 "register_operand" "r")
-                  (match_operand:X 2 "register_operand" "r")
-                  (match_operand:SI 3 "register_operand" "D03")]
-                  UNSPEC_SM4_KS))]
+(define_insn "riscv_<sm4_op>_di_extended"
+  [(set (match_operand:DI 0 "register_operand" "=r")
+        (sign_extend:DI
+             (unspec:SI [(match_operand:SI 1 "register_operand" "r")
+                        (match_operand:SI 2 "register_operand" "r")
+                        (match_operand:SI 3 "register_operand" "D03")]
+                        SM4_OP)))]
+  "TARGET_ZKSED && TARGET_64BIT"
+  "<sm4_op>\t%0,%1,%2,%3"
+  [(set_attr "type" "crypto")])
+
+(define_expand "riscv_<sm4_op>_si"
+  [(set (match_operand:SI 0 "register_operand" "=r")
+        (unspec:SI [(match_operand:SI 1 "register_operand" "r")
+                   (match_operand:SI 2 "register_operand" "r")
+                   (match_operand:SI 3 "register_operand" "D03")]
+                   SM4_OP))]
   "TARGET_ZKSED"
-  "sm4ks\t%0,%1,%2,%3"
+  {
+    if (TARGET_64BIT)
+      {
+        rtx t = gen_reg_rtx (DImode);
+        emit_insn (gen_riscv_<sm4_op>_di_extended (t, operands[1], operands[2], operands[3]));
+        t = gen_lowpart (SImode, t);
+        SUBREG_PROMOTED_VAR_P (t) = 1;
+        SUBREG_PROMOTED_SET (t, SRP_SIGNED);
+        emit_move_insn (operands[0], t);
+        DONE;
+      }
+  }
   [(set_attr "type" "crypto")])
diff --git a/gcc/config/riscv/riscv-builtins.cc b/gcc/config/riscv/riscv-builtins.cc
index f6b06b3c16ac..3fe3a89dcc25 100644
--- a/gcc/config/riscv/riscv-builtins.cc
+++ b/gcc/config/riscv/riscv-builtins.cc
@@ -112,12 +112,11 @@  AVAIL (crypto_zknd64, TARGET_ZKND && TARGET_64BIT)
 AVAIL (crypto_zkne32, TARGET_ZKNE && !TARGET_64BIT)
 AVAIL (crypto_zkne64, TARGET_ZKNE && TARGET_64BIT)
 AVAIL (crypto_zkne_or_zknd, (TARGET_ZKNE || TARGET_ZKND) && TARGET_64BIT)
+AVAIL (crypto_zknh, TARGET_ZKNH)
 AVAIL (crypto_zknh32, TARGET_ZKNH && !TARGET_64BIT)
 AVAIL (crypto_zknh64, TARGET_ZKNH && TARGET_64BIT)
-AVAIL (crypto_zksh32, TARGET_ZKSH && !TARGET_64BIT)
-AVAIL (crypto_zksh64, TARGET_ZKSH && TARGET_64BIT)
-AVAIL (crypto_zksed32, TARGET_ZKSED && !TARGET_64BIT)
-AVAIL (crypto_zksed64, TARGET_ZKSED && TARGET_64BIT)
+AVAIL (crypto_zksh, TARGET_ZKSH)
+AVAIL (crypto_zksed, TARGET_ZKSED)
 AVAIL (clmul_zbkc32_or_zbc32, (TARGET_ZBKC || TARGET_ZBC) && !TARGET_64BIT)
 AVAIL (clmul_zbkc64_or_zbc64, (TARGET_ZBKC || TARGET_ZBC) && TARGET_64BIT)
 AVAIL (clmulr_zbc32, TARGET_ZBC && !TARGET_64BIT)
diff --git a/gcc/config/riscv/riscv-ftypes.def b/gcc/config/riscv/riscv-ftypes.def
index 366861ce640e..33620c57ca06 100644
--- a/gcc/config/riscv/riscv-ftypes.def
+++ b/gcc/config/riscv/riscv-ftypes.def
@@ -41,4 +41,3 @@  DEF_RISCV_FTYPE (2, (UDI, USI, USI))
 DEF_RISCV_FTYPE (2, (UDI, UDI, USI))
 DEF_RISCV_FTYPE (2, (UDI, UDI, UDI))
 DEF_RISCV_FTYPE (3, (USI, USI, USI, USI))
-DEF_RISCV_FTYPE (3, (UDI, UDI, UDI, USI))
diff --git a/gcc/config/riscv/riscv-scalar-crypto.def b/gcc/config/riscv/riscv-scalar-crypto.def
index db86ec9fd78a..3db9ed4a03e5 100644
--- a/gcc/config/riscv/riscv-scalar-crypto.def
+++ b/gcc/config/riscv/riscv-scalar-crypto.def
@@ -54,14 +54,10 @@  DIRECT_BUILTIN (aes64es, RISCV_UDI_FTYPE_UDI_UDI, crypto_zkne64),
 DIRECT_BUILTIN (aes64esm, RISCV_UDI_FTYPE_UDI_UDI, crypto_zkne64),
 
 // ZKNH
-RISCV_BUILTIN (sha256sig0_si, "sha256sig0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sig0_di, "sha256sig0", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
-RISCV_BUILTIN (sha256sig1_si, "sha256sig1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sig1_di, "sha256sig1", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
-RISCV_BUILTIN (sha256sum0_si, "sha256sum0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sum0_di, "sha256sum0", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
-RISCV_BUILTIN (sha256sum1_si, "sha256sum1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh32),
-RISCV_BUILTIN (sha256sum1_di, "sha256sum1", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
+RISCV_BUILTIN (sha256sig0_si, "sha256sig0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
+RISCV_BUILTIN (sha256sig1_si, "sha256sig1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
+RISCV_BUILTIN (sha256sum0_si, "sha256sum0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
+RISCV_BUILTIN (sha256sum1_si, "sha256sum1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zknh),
 
 DIRECT_BUILTIN (sha512sig0h, RISCV_USI_FTYPE_USI_USI, crypto_zknh32),
 DIRECT_BUILTIN (sha512sig0l, RISCV_USI_FTYPE_USI_USI, crypto_zknh32),
@@ -76,13 +72,9 @@  DIRECT_BUILTIN (sha512sum0, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
 DIRECT_BUILTIN (sha512sum1, RISCV_UDI_FTYPE_UDI, crypto_zknh64),
 
 // ZKSH
-RISCV_BUILTIN (sm3p0_si, "sm3p0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh32),
-RISCV_BUILTIN (sm3p0_di, "sm3p0", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zksh64),
-RISCV_BUILTIN (sm3p1_si, "sm3p1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh32),
-RISCV_BUILTIN (sm3p1_di, "sm3p1", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI, crypto_zksh64),
+RISCV_BUILTIN (sm3p0_si, "sm3p0", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh),
+RISCV_BUILTIN (sm3p1_si, "sm3p1", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI, crypto_zksh),
 
 // ZKSED
-RISCV_BUILTIN (sm4ed_si, "sm4ed", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed32),
-RISCV_BUILTIN (sm4ed_di, "sm4ed", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI_UDI_USI, crypto_zksed64),
-RISCV_BUILTIN (sm4ks_si, "sm4ks", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed32),
-RISCV_BUILTIN (sm4ks_di, "sm4ks", RISCV_BUILTIN_DIRECT, RISCV_UDI_FTYPE_UDI_UDI_USI, crypto_zksed64),
+RISCV_BUILTIN (sm4ed_si, "sm4ed", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed),
+RISCV_BUILTIN (sm4ks_si, "sm4ks", RISCV_BUILTIN_DIRECT, RISCV_USI_FTYPE_USI_USI_USI, crypto_zksed),
diff --git a/gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c b/gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c
new file mode 100644
index 000000000000..c51b143a8a5c
--- /dev/null
+++ b/gcc/testsuite/gcc.target/riscv/zknh-sha256-32.c
@@ -0,0 +1,10 @@ 
+/* { dg-do compile } */
+/* { dg-options "-O2 -march=rv32gc_zknh -mabi=ilp32d" } */
+/* { dg-skip-if "" { *-*-* } { "-g" "-flto"} } */
+
+#include "zknh-sha256-64.c"
+
+/* { dg-final { scan-assembler-times "sha256sig0" 1 } } */
+/* { dg-final { scan-assembler-times "sha256sig1" 1 } } */
+/* { dg-final { scan-assembler-times "sha256sum0" 1 } } */
+/* { dg-final { scan-assembler-times "sha256sum1" 1 } } */
diff --git a/gcc/testsuite/gcc.target/riscv/zknh-sha256.c b/gcc/testsuite/gcc.target/riscv/zknh-sha256-64.c
similarity index 78%
rename from gcc/testsuite/gcc.target/riscv/zknh-sha256.c
rename to gcc/testsuite/gcc.target/riscv/zknh-sha256-64.c
index 952d611cd0b9..2ef37601e6fb 100644
--- a/gcc/testsuite/gcc.target/riscv/zknh-sha256.c
+++ b/gcc/testsuite/gcc.target/riscv/zknh-sha256-64.c
@@ -2,22 +2,22 @@ 
 /* { dg-options "-O2 -march=rv64gc_zknh -mabi=lp64" } */
 /* { dg-skip-if "" { *-*-* } { "-g" "-flto"} } */
 
-unsigned long foo1(unsigned long rs1)
+unsigned int foo1(unsigned int rs1)
 {
     return __builtin_riscv_sha256sig0(rs1);
 }
 
-unsigned long foo2(unsigned long rs1)
+unsigned int foo2(unsigned int rs1)
 {
     return __builtin_riscv_sha256sig1(rs1);
 }
 
-unsigned long foo3(unsigned long rs1)
+unsigned int foo3(unsigned int rs1)
 {
     return __builtin_riscv_sha256sum0(rs1);
 }
 
-unsigned long foo4(unsigned long rs1)
+unsigned int foo4(unsigned int rs1)
 {
     return __builtin_riscv_sha256sum1(rs1);
 }
diff --git a/gcc/testsuite/gcc.target/riscv/zksed64.c b/gcc/testsuite/gcc.target/riscv/zksed64.c
index 3485adf9cd88..913e7be4e4d9 100644
--- a/gcc/testsuite/gcc.target/riscv/zksed64.c
+++ b/gcc/testsuite/gcc.target/riscv/zksed64.c
@@ -4,12 +4,12 @@ 
 
 #include <stdint-gcc.h>
 
-uint64_t foo1(uint64_t rs1, uint64_t rs2, unsigned bs)
+uint32_t foo1(uint32_t rs1, uint32_t rs2, unsigned bs)
 {
     return __builtin_riscv_sm4ks(rs1,rs2,bs);
 }
 
-uint64_t foo2(uint64_t rs1, uint64_t rs2, unsigned bs)
+uint32_t foo2(uint32_t rs1, uint32_t rs2, unsigned bs)
 {
     return __builtin_riscv_sm4ed(rs1,rs2,bs);
 }
diff --git a/gcc/testsuite/gcc.target/riscv/zksh64.c b/gcc/testsuite/gcc.target/riscv/zksh64.c
index bdd137872785..30bb1bdeeeb7 100644
--- a/gcc/testsuite/gcc.target/riscv/zksh64.c
+++ b/gcc/testsuite/gcc.target/riscv/zksh64.c
@@ -4,12 +4,12 @@ 
 
 #include <stdint-gcc.h>
 
-uint64_t foo1(uint64_t rs1)
+uint32_t foo1(uint32_t rs1)
 {
     return __builtin_riscv_sm3p0(rs1);
 }
 
-uint64_t foo2(uint64_t rs1)
+uint32_t foo2(uint32_t rs1)
 {
     return __builtin_riscv_sm3p1(rs1);
 }