From: Eric Biggers
Now that all users of xts_crypt() have been removed in favor of the XTS
template wrapping an ECB mode algorithm, remove xts_crypt().
Signed-off-by: Eric Biggers
---
crypto/xts.c | 72
include/crypto/xts.h | 17
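For context, a minimal sketch (not from the patch; the algorithm name and the surrounding function are illustrative) of how XTS is now obtained through the generic template, which wraps an ECB implementation of the cipher:

#include <crypto/skcipher.h>
#include <linux/err.h>

/* Illustrative only: with no native xts driver registered, "xts(aes)"
 * resolves to the generic xts template wrapping an ecb(aes) implementation. */
static int xts_template_example(void)
{
        struct crypto_skcipher *tfm;

        tfm = crypto_alloc_skcipher("xts(aes)", 0, 0);
        if (IS_ERR(tfm))
                return PTR_ERR(tfm);

        /* ... crypto_skcipher_setkey() with a double-length XTS key, then
         * skcipher_request_alloc()/crypto_skcipher_encrypt() as usual ... */

        crypto_free_skcipher(tfm);
        return 0;
}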
From: Eric Biggers
Now that all users of lrw_crypt() have been removed in favor of the LRW
template wrapping an ECB mode algorithm, remove lrw_crypt(). Also
remove crypto/lrw.h as that is no longer needed either; and fold
'struct lrw_table_ctx' into 'struct priv', lrw_init_table() into
setkey(), and lrw_free_table() into exit_tfm().
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-twofish-avx algorithm which did this. Users who request
lrw(twofish) and previously would have gotten lrw-twofish-avx will now
get the LRW template wrapping ecb-twofish-avx instead.
From: Eric Biggers
Convert the x86 asm implementation of Camellia from the (deprecated)
blkcipher interface over to the skcipher interface.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/camellia_glue.c | 162
crypto/Kconfig | 2 +-
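For readers unfamiliar with the two interfaces, the rough shape of such a conversion is sketched below; this is illustrative only, and every identifier, size, and value in it is a placeholder rather than the patch's actual code:

#include <crypto/internal/skcipher.h>
#include <linux/module.h>

/* Placeholder context and handlers standing in for the real glue code. */
struct camellia_example_ctx { u32 key_table[68]; };

static int example_setkey(struct crypto_skcipher *tfm, const u8 *key,
                          unsigned int keylen)
{
        return 0;       /* real code expands the key into the tfm context */
}

static int example_ecb_crypt(struct skcipher_request *req)
{
        return 0;       /* real code walks req and calls the asm routines */
}

/* The blkcipher_alg definitions become an array of skcipher_alg entries. */
static struct skcipher_alg example_algs[] = { {
        .base.cra_name          = "ecb(camellia)",
        .base.cra_driver_name   = "ecb-camellia-asm",
        .base.cra_priority      = 300,
        .base.cra_blocksize     = 16,
        .base.cra_ctxsize       = sizeof(struct camellia_example_ctx),
        .base.cra_module        = THIS_MODULE,
        .min_keysize            = 16,
        .max_keysize            = 32,
        .setkey                 = example_setkey,
        .encrypt                = example_ecb_crypt,
        .decrypt                = example_ecb_crypt,    /* stub for the sketch */
} };

static int __init example_mod_init(void)
{
        return crypto_register_skciphers(example_algs, ARRAY_SIZE(example_algs));
}

static void __exit example_mod_exit(void)
{
        crypto_unregister_skciphers(example_algs, ARRAY_SIZE(example_algs));
}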
From: Eric Biggers
There are no users of the original glue_fpu_begin() anymore, so rename
glue_skwalk_fpu_begin() to glue_fpu_begin() so that it matches
glue_fpu_end() again.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/cast5_avx_glue.c | 4 ++--
arch/x86/crypto/glue_helper.c
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-camellia-aesni-avx2 algorithm which did this. Users who
request lrw(camellia) and previously would have gotten
lrw-camellia-aesni-avx2 will now get the LRW template wrapping
ecb-camellia-aesni-avx2 instead.
From: Eric Biggers
Convert the AVX implementation of Twofish from the (deprecated)
ablkcipher and blkcipher interfaces over to the skcipher interface.
Note that this includes replacing the use of ablk_helper with
crypto_simd.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/twofish_avx_glue.c |
From: Eric Biggers
All users of ablk_helper have been converted over to crypto_simd, so
remove ablk_helper.
Signed-off-by: Eric Biggers
---
crypto/Kconfig | 4 --
crypto/Makefile | 1 -
crypto/ablk_helper.c | 150 --
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-cast6-avx algorithm which did this. Users who request
lrw(cast6) and previously would have gotten lrw-cast6-avx will now get
the LRW template wrapping ecb-cast6-avx instead.
From: Eric Biggers
The XTS template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic XTS code themselves via xts_crypt().
Remove the xts-camellia-asm algorithm which did this. Users who request
xts(camellia) and previously would have gotten xts-camellia-asm will now
get the XTS template wrapping ecb-camellia-asm instead.
From: Eric Biggers
Now that all glue_helper users have been switched from the blkcipher
interface over to the skcipher interface, remove the versions of the
glue_helper functions that handled the blkcipher interface.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/glue_helper.c | 3
From: Eric Biggers
Convert the AVX implementation of CAST5 from the (deprecated) ablkcipher
and blkcipher interfaces over to the skcipher interface. Note that this
includes replacing the use of ablk_helper with crypto_simd.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/cast5_avx_glue.c | 35
From: Eric Biggers
Convert the x86 asm implementation of Blowfish from the (deprecated)
blkcipher interface over to the skcipher interface.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/blowfish_glue.c | 230
crypto/Kconfig | 2 +-
From: Eric Biggers
Convert the x86 asm implementation of Triple DES from the (deprecated)
blkcipher interface over to the skcipher interface.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/des3_ede_glue.c | 238
crypto/Kconfig | 2 +-
From: Eric Biggers
Convert the AESNI AVX and AESNI AVX2 implementations of Camellia from
the (deprecated) ablkcipher and blkcipher interfaces over to the
skcipher interface. Note that this includes replacing the use of
ablk_helper with crypto_simd.
Signed-off-by: Eric Biggers
---
arch/x86/cry
From: Eric Biggers
Convert the AVX implementation of CAST6 from the (deprecated) ablkcipher
and blkcipher interfaces over to the skcipher interface. Note that this
includes replacing the use of ablk_helper with crypto_simd.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/cast6_avx_glue.c | 31
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-camellia-asm algorithm which did this. Users who request
lrw(camellia) and previously would have gotten lrw-camellia-asm will now
get the LRW template wrapping ecb-camellia-asm instead.
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-camellia-aesni algorithm which did this. Users who
request lrw(camellia) and previously would have gotten
lrw-camellia-aesni will now get the LRW template wrapping
ecb-camellia-aesni instead.
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-serpent-sse2 algorithm which did this. Users who request
lrw(serpent) and previously would have gotten lrw-serpent-sse2 will now
get the LRW template wrapping ecb-serpent-sse2 instead.
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-serpent-avx algorithm which did this. Users who request
lrw(serpent) and previously would have gotten lrw-serpent-avx will now
get the LRW template wrapping ecb-serpent-avx instead.
From: Eric Biggers
Add ECB, CBC, and CTR functions to glue_helper which use skcipher_walk
rather than blkcipher_walk. This will allow converting the remaining
x86 algorithms from the blkcipher interface over to the skcipher
interface, after which we'll be able to remove the blkcipher_walk
versions of these functions.
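A rough sketch of what an skcipher_walk-based ECB helper looks like (the function and parameter names are illustrative, not the actual glue_helper additions):

#include <crypto/internal/skcipher.h>

/* Illustrative only: process an skcipher_request one contiguous span at a
 * time with skcipher_walk, calling a block cipher function 'fn' on each
 * full block of size 'bsize'. */
static int ecb_walk_sketch(struct skcipher_request *req, const void *ctx,
                           unsigned int bsize,
                           void (*fn)(const void *ctx, u8 *dst, const u8 *src))
{
        struct skcipher_walk walk;
        unsigned int nbytes;
        int err;

        err = skcipher_walk_virt(&walk, req, false);

        while ((nbytes = walk.nbytes) != 0) {
                const u8 *src = walk.src.virt.addr;
                u8 *dst = walk.dst.virt.addr;

                while (nbytes >= bsize) {
                        fn(ctx, dst, src);
                        src += bsize;
                        dst += bsize;
                        nbytes -= bsize;
                }
                /* hand any unprocessed tail back to the walk machinery */
                err = skcipher_walk_done(&walk, nbytes);
        }
        return err;
}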
From: Eric Biggers
With ecb-cast5-avx, if a 128+ byte scatterlist element followed a
shorter one, then the algorithm accidentally encrypted/decrypted only 8
bytes instead of the expected 128 bytes. Fix it by setting the
encryption/decryption 'fn' correctly.
Fixes: c12ab20b162c ("crypto: cast5/avx - avoid using temporary stack buffers")
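A standalone illustration of the bug class, with made-up names and sizes rather than the driver's code: the bulk-vs-single routine choice has to be made for every element, not once up front.

#include <stddef.h>

typedef void (*crypt_fn)(unsigned char *buf, size_t len);

static void crypt_bulk(unsigned char *buf, size_t len)
{
        (void)buf; (void)len;   /* stands in for the 128-bytes-at-a-time path */
}

static void crypt_single(unsigned char *buf, size_t len)
{
        (void)buf; (void)len;   /* stands in for the 8-bytes-at-a-time path */
}

/* Each (chunk, len) pair stands in for one scatterlist element of the walk. */
static void process_chunks(unsigned char **chunks, const size_t *lens, size_t n)
{
        for (size_t i = 0; i < n; i++) {
                /* The fix: select fn here, per chunk.  The buggy code picked
                 * it once before the loop, so a short chunk that forced the
                 * single-block path left fn stuck there for a later long
                 * chunk, and only 8 of its 128 bytes were processed. */
                crypt_fn fn = (lens[i] >= 128) ? crypt_bulk : crypt_single;

                fn(chunks[i], lens[i]);
        }
}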
From: Eric Biggers
Convert the SSE2 implementation of Serpent from the (deprecated)
ablkcipher and blkcipher interfaces over to the skcipher interface.
Note that this includes replacing the use of ablk_helper with
crypto_simd.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/serpent_sse2_glue.c
From: Eric Biggers
Convert the 3-way implementation of Twofish from the (deprecated)
blkcipher interface over to the skcipher interface.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/twofish_glue_3way.c | 151
crypto/Kconfig | 2 +-
From: Eric Biggers
The XTS template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic XTS code themselves via xts_crypt().
Remove the xts-serpent-sse2 algorithm which did this. Users who request
xts(serpent) and previously would have gotten xts-serpent-sse2 will now
get the XTS template wrapping ecb-serpent-sse2 instead.
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-twofish-3way algorithm which did this. Users who request
lrw(twofish) and previously would have gotten lrw-twofish-3way will now
get the LRW template wrapping ecb-twofish-3way instead.
From: Eric Biggers
The XTS template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic XTS code themselves via xts_crypt().
Remove the xts-twofish-3way algorithm which did this. Users who request
xts(twofish) and previously would have gotten xts-twofish-3way will now
get the XTS template wrapping ecb-twofish-3way instead.
From: Eric Biggers
The LRW template now wraps an ECB mode algorithm rather than the block
cipher directly. Therefore it is now redundant for crypto modules to
wrap their ECB code with generic LRW code themselves via lrw_crypt().
Remove the lrw-serpent-avx2 algorithm which did this. Users who request
lrw(serpent) and previously would have gotten lrw-serpent-avx2 will now
get the LRW template wrapping ecb-serpent-avx2 instead.
Hi,
I got tired of seeing some block cipher implementations using the old
API (blkcipher or ablkcipher) and some using the new API (skcipher)...
so this series makes a dent in the problem by converting all the
remaining x86 glue code over to the skcipher API.
Besides the conversion to use 'skcipher', this series also removes the
now-redundant LRW and XTS wrapper algorithms in favor of the generic lrw
and xts templates, and replaces ablk_helper with crypto_simd.
From: Eric Biggers
Add a function to crypto_simd that registers an array of skcipher
algorithms, then allocates and registers the simd wrapper algorithms for
them. It assumes the naming scheme where the names of the underlying
algorithms are prefixed with two underscores.
Also add the corresponding 'unregister' function.
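A minimal sketch of how a glue module might use these helpers, assuming they are named simd_register_skciphers_compat() and simd_unregister_skciphers(); everything else here is a placeholder:

#include <crypto/internal/simd.h>
#include <crypto/internal/skcipher.h>
#include <linux/module.h>

/* Placeholder internal algorithm: note the "__" prefix on the names, which
 * the simd wrapper drops when creating the user-visible algorithm. */
static struct skcipher_alg example_algs[] = { {
        .base.cra_name          = "__ecb(serpent)",
        .base.cra_driver_name   = "__ecb-serpent-avx",
        .base.cra_flags         = CRYPTO_ALG_INTERNAL,
        /* ... remaining fields as in the underlying implementation ... */
} };

static struct simd_skcipher_alg *example_simd_algs[ARRAY_SIZE(example_algs)];

static int __init example_init(void)
{
        return simd_register_skciphers_compat(example_algs,
                                              ARRAY_SIZE(example_algs),
                                              example_simd_algs);
}

static void __exit example_exit(void)
{
        simd_unregister_skciphers(example_algs, ARRAY_SIZE(example_algs),
                                  example_simd_algs);
}

module_init(example_init);
module_exit(example_exit);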
From: Eric Biggers
Convert the AVX and AVX2 implementations of Serpent from the
(deprecated) ablkcipher and blkcipher interfaces over to the skcipher
interface. Note that this includes replacing the use of ablk_helper
with crypto_simd.
Signed-off-by: Eric Biggers
---
arch/x86/crypto/serpent_a
On Mon, Feb 19, 2018 at 02:51:22PM, Gilad Ben-Yossef wrote:
> Add device tree bindings for Arm CryptoCell 710 and 630p hardware
> revisions.
>
> Signed-off-by: Gilad Ben-Yossef
> ---
> Documentation/devicetree/bindings/crypto/arm-cryptocell.txt | 3 ++-
> 1 file changed, 2 insertions(+), 1 deletion(-)
Remove enum definitions which are not used by the REE interface
driver.
Signed-off-by: Gilad Ben-Yossef
---
drivers/crypto/ccree/cc_crypto_ctx.h | 20
1 file changed, 20 deletions(-)
diff --git a/drivers/crypto/ccree/cc_crypto_ctx.h
b/drivers/crypto/ccree/cc_crypto_ctx.h
in
This patch set introduces backward-compatible support for the older
CryptoCell hardware revisions 710 and 630, along with some minor code
cleanups.
Gilad Ben-Yossef (4):
crypto: ccree: remove unused definitions
dt-bindings: Add DT bindings for ccree 710 and 630p
crypto: ccree: add support for older HW revs
Add support for the legacy CryptoCell 630 and 710 revs.
Signed-off-by: Gilad Ben-Yossef
---
drivers/crypto/Kconfig | 6 +-
drivers/crypto/ccree/cc_aead.c | 34 ++--
drivers/crypto/ccree/cc_cipher.c| 25 +-
drivers/crypto/ccree/cc_crypto_ctx.h| 1
Add device tree bindings for Arm CryptoCell 710 and 630p hardware
revisions.
Signed-off-by: Gilad Ben-Yossef
---
Documentation/devicetree/bindings/crypto/arm-cryptocell.txt | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/Documentation/devicetree/bindings/crypto/arm-cryptoce
Replace a memset to zero followed by kfree with kzfree, for
simplicity.
Signed-off-by: Gilad Ben-Yossef
---
drivers/crypto/ccree/cc_request_mgr.c | 3 +--
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/drivers/crypto/ccree/cc_request_mgr.c
b/drivers/crypto/ccree/cc_request_mgr.c
index 2
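The transformation itself is a one-liner; as an illustration (the helper functions and buffer are hypothetical, not code from cc_request_mgr.c):

#include <linux/slab.h>
#include <linux/string.h>

static void example_free_old(void *buf, size_t len)
{
        /* before: zero the sensitive buffer, then free it */
        memset(buf, 0, len);
        kfree(buf);
}

static void example_free_new(void *buf)
{
        /* after: kzfree() zeroes the whole allocation before freeing it */
        kzfree(buf);
}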
The inclusion of dma-direct.h was only needed temporarily to prevent
breakage from the DMA API rework, since the actual CESA fix making it
redundant was merged in parallel. Now that both have landed, it can go.
Signed-off-by: Robin Murphy
---
drivers/crypto/marvell/cesa.c | 1 -
1 file changed,
On 2/19/2018 11:14 AM, Christophe LEROY wrote:
> On 19/02/2018 at 09:30, Horia Geantă wrote:
>> On 2/19/2018 9:58 AM, Christophe LEROY wrote:
>>> On 18/02/2018 at 18:14, Horia Geantă wrote:
There is no ahash_exit() callback mirroring ahash_init().
The clean-up of request ctx should be done in the last states of the hash flows
On Sunday, 18 February 2018 at 15:25:27 CET, Gilad Ben-Yossef wrote:
Hi Gilad,
> On Sat, Feb 10, 2018 at 12:04 AM, Stephan Müller wrote:
> > Crypto drivers may implement a streamlined serialization support for AIO
> > requests that is reported by the CRYPTO_ALG_SERIALIZES_IV_ACCESS flag to
> >
On 19/02/2018 at 09:30, Horia Geantă wrote:
On 2/19/2018 9:58 AM, Christophe LEROY wrote:
On 18/02/2018 at 18:14, Horia Geantă wrote:
There is no ahash_exit() callback mirroring ahash_init().
The clean-up of request ctx should be done in the last states of the hash flows
described here:
On 2/19/2018 9:58 AM, Christophe LEROY wrote:
> On 18/02/2018 at 18:14, Horia Geantă wrote:
>> There is no ahash_exit() callback mirroring ahash_init().
>>
>> The clean-up of request ctx should be done in the last states of the hash
>> flows
>> described here:
>> https://www.kernel.org/doc/html/