UPSTREAM: kasan, arm64: use ARCH_SLAB_MINALIGN instead of manual aligning

Upstream commit eb214f2dda31ffa989033b1e0f848ba0d3cb6188.

Instead of changing cache->align to be aligned to KASAN_SHADOW_SCALE_SIZE
in kasan_cache_create(), we can reuse the ARCH_SLAB_MINALIGN macro.

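The change works because the slab allocator already clamps every cache's
alignment to at least ARCH_SLAB_MINALIGN when the cache is set up, so
defining that macro as the shadow granule size (1 << KASAN_SHADOW_SCALE_SHIFT)
under CONFIG_KASAN_SW_TAGS gives the same guarantee the removed round_up()
call used to provide. Below is a minimal userspace sketch of that clamping
idea; align_up() and cache_alignment() are illustrative helpers, not the
kernel's own functions:

/*
 * Illustrative userspace sketch, not kernel code: mimics how a slab-style
 * allocator folds ARCH_SLAB_MINALIGN into a cache's final alignment.
 */
#include <stdio.h>

#define CONFIG_KASAN_SW_TAGS 1
#define KASAN_SHADOW_SCALE_SHIFT 4	/* 16-byte granules, as on arm64 */

#ifdef CONFIG_KASAN_SW_TAGS
#define ARCH_SLAB_MINALIGN (1UL << KASAN_SHADOW_SCALE_SHIFT)
#else
#define ARCH_SLAB_MINALIGN __alignof__(unsigned long long)
#endif

/* round x up to the next multiple of a (a is a power of two) */
static unsigned long align_up(unsigned long x, unsigned long a)
{
	return (x + a - 1) & ~(a - 1);
}

/* hypothetical stand-in for the allocator's alignment calculation */
static unsigned long cache_alignment(unsigned long requested_align)
{
	unsigned long align = requested_align;

	if (align < ARCH_SLAB_MINALIGN)
		align = ARCH_SLAB_MINALIGN;

	return align_up(align, sizeof(void *));
}

int main(void)
{
	/* small requests get bumped to the 16-byte shadow granule */
	printf("requested 8  -> %lu\n", cache_alignment(8));	/* 16 */
	printf("requested 64 -> %lu\n", cache_alignment(64));	/* 64 */
	return 0;
}

On a 64-bit host an 8-byte alignment request comes back as 16, matching the
16-byte granule that tag-based KASAN needs, without kasan_cache_create()
having to round cache->align up itself.
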
Link: http://lkml.kernel.org/r/52ddd881916bcc153a9924c154daacde78522227.1546540962.git.andreyknvl@google.com
Signed-off-by: Andrey Konovalov <andreyknvl@google.com>
Suggested-by: Vincenzo Frascino <vincenzo.frascino@arm.com>
Cc: Andrey Ryabinin <aryabinin@virtuozzo.com>
Cc: Christoph Lameter <cl@linux.com>
Cc: Dmitry Vyukov <dvyukov@google.com>
Cc: Mark Rutland <mark.rutland@arm.com>
Cc: Vincenzo Frascino <vincenzo.frascino@arm.com>
Cc: Will Deacon <will.deacon@arm.com>
Signed-off-by: Andrew Morton <akpm@linux-foundation.org>
Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
Signed-off-by: Andrey Konovalov <andreyknvl@google.com>
Change-Id: I9ab03979c58c5f59d2c78854ec51c4b5f30605e6
Bug: 128674696
 arch/arm64/include/asm/cache.h | 6 ++++++
 mm/kasan/common.c              | 2 --
 2 files changed, 6 insertions(+), 2 deletions(-)

--- a/arch/arm64/include/asm/cache.h
+++ b/arch/arm64/include/asm/cache.h
@@ -46,6 +46,12 @@
  */
 #define ARCH_DMA_MINALIGN	L1_CACHE_BYTES
 
+#ifdef CONFIG_KASAN_SW_TAGS
+#define ARCH_SLAB_MINALIGN	(1ULL << KASAN_SHADOW_SCALE_SHIFT)
+#else
+#define ARCH_SLAB_MINALIGN	__alignof__(unsigned long long)
+#endif
+
 #ifndef __ASSEMBLY__
 #include <linux/bitops.h>
 
--- a/mm/kasan/common.c
+++ b/mm/kasan/common.c
@@ -298,8 +298,6 @@ void kasan_cache_create(struct kmem_cache *cache, unsigned int *size,
 		return;
 	}
 
-	cache->align = round_up(cache->align, KASAN_SHADOW_SCALE_SIZE);
-
 	*flags |= SLAB_KASAN;
 }
 
