From patchwork Mon Jan 30 07:35:24 2023
X-Patchwork-Submitter: "tianjia.zhang" <tianjia.zhang@linux.alibaba.com>
X-Patchwork-Id: 13120445
X-Patchwork-Delegate: herbert@gondor.apana.org.au
From: Tianjia Zhang <tianjia.zhang@linux.alibaba.com>
To: Herbert Xu <herbert@gondor.apana.org.au>,
    "David S. Miller" <davem@davemloft.net>,
    Catalin Marinas <catalin.marinas@arm.com>,
    Will Deacon <will@kernel.org>,
    linux-crypto@vger.kernel.org,
    linux-arm-kernel@lists.infradead.org,
    linux-kernel@vger.kernel.org
Cc: Tianjia Zhang <tianjia.zhang@linux.alibaba.com>
Subject: [PATCH v2] crypto: arm64/sm4 - Fix possible crash in GCM cryption
Date: Mon, 30 Jan 2023 15:35:24 +0800
Message-Id: <20230130073524.68266-1-tianjia.zhang@linux.alibaba.com>
X-Mailer: git-send-email 2.24.3 (Apple Git-128)
In-Reply-To: <20230118141928.48136-1-tianjia.zhang@linux.alibaba.com>
References: <20230118141928.48136-1-tianjia.zhang@linux.alibaba.com>
X-Mailing-List: linux-crypto@vger.kernel.org

When the total cryption length is zero, GCM cryption still calls
skcipher_walk_done(), which causes an unexpected crash. Skip the call
to skcipher_walk_done() when walk->nbytes is zero to avoid this crash.

Fixes: ae1b83c7d572 ("crypto: arm64/sm4 - add CE implementation for GCM mode")
Signed-off-by: Tianjia Zhang <tianjia.zhang@linux.alibaba.com>
---
 arch/arm64/crypto/sm4-ce-gcm-glue.c | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/arch/arm64/crypto/sm4-ce-gcm-glue.c b/arch/arm64/crypto/sm4-ce-gcm-glue.c
index c450a2025ca9..29aa7470281d 100644
--- a/arch/arm64/crypto/sm4-ce-gcm-glue.c
+++ b/arch/arm64/crypto/sm4-ce-gcm-glue.c
@@ -178,6 +178,9 @@ static int gcm_crypt(struct aead_request *req, struct skcipher_walk *walk,
 
 		kernel_neon_end();
 
+		if (unlikely(!walk->nbytes))
+			break;
+
 		err = skcipher_walk_done(walk, tail);
 		if (err)
 			return err;
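
For context, after this change the walk loop in gcm_crypt() is shaped
roughly as below. This is a condensed sketch for illustration only: the
setup code and the SM4-CE PMULL crypt step are elided, and the details
around the tail handling are assumed rather than quoted from
sm4-ce-gcm-glue.c.

	/*
	 * Sketch of the gcm_crypt() walk loop with the fix applied.
	 * The actual crypt call is elided ("..." below).
	 */
	do {
		unsigned int tail = walk->nbytes % SM4_BLOCK_SIZE;

		/* last chunk: process everything, including any partial tail */
		if (walk->nbytes == walk->total)
			tail = 0;

		/* ... SM4-CE GCM crypt of walk->nbytes - tail bytes ... */

		kernel_neon_end();

		/*
		 * For a zero-length request the walk starts (and stays) at
		 * nbytes == 0. Calling skcipher_walk_done() on such a
		 * finished walk is what crashed, so bail out before it.
		 */
		if (unlikely(!walk->nbytes))
			break;

		err = skcipher_walk_done(walk, tail);
		if (err)
			return err;

		if (walk->nbytes)
			kernel_neon_begin();
	} while (walk->nbytes > 0);

The point of the added break is simply that a finished, zero-byte walk
is never handed back to skcipher_walk_done(); the zero-length path has
already folded the lengths block into the tag by that point.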