Message ID | a01785b993e2b39864ee0cab09695ae23a02b2f5.1567077734.git.viresh.kumar@linaro.org (mailing list archive) |
---|---|
State | New, archived |
Series | V4.4 backport of arm64 Spectre patches |
On Thu, Aug 29, 2019 at 05:03:47PM +0530, Viresh Kumar wrote:
> From: Robin Murphy <robin.murphy@arm.com>
>
> commit 022620eed3d0bc4bf2027326f599f5ad71c2ea3f upstream.
>
> Provide an optimised, assembly implementation of array_index_mask_nospec()
> for arm64 so that the compiler is not in a position to transform the code
> in ways which affect its ability to inhibit speculation (e.g. by introducing
> conditional branches).
>
> This is similar to the sequence used by x86, modulo architectural differences
> in the carry/borrow flags.
>
> Reviewed-by: Mark Rutland <mark.rutland@arm.com>
> Signed-off-by: Robin Murphy <robin.murphy@arm.com>
> Signed-off-by: Will Deacon <will.deacon@arm.com>
> Signed-off-by: Catalin Marinas <catalin.marinas@arm.com>
> Signed-off-by: Viresh Kumar <viresh.kumar@linaro.org>

Reviewed-by: Mark Rutland <mark.rutland@arm.com> [v4.4 backport]

Mark.

> ---
>  arch/arm64/include/asm/barrier.h | 21 +++++++++++++++++++++
>  1 file changed, 21 insertions(+)
>
> diff --git a/arch/arm64/include/asm/barrier.h b/arch/arm64/include/asm/barrier.h
> index 574486634c62..7c25e3e11b6d 100644
> --- a/arch/arm64/include/asm/barrier.h
> +++ b/arch/arm64/include/asm/barrier.h
> @@ -37,6 +37,27 @@
>  #define dma_rmb()	dmb(oshld)
>  #define dma_wmb()	dmb(oshst)
>
> +/*
> + * Generate a mask for array_index__nospec() that is ~0UL when 0 <= idx < sz
> + * and 0 otherwise.
> + */
> +#define array_index_mask_nospec array_index_mask_nospec
> +static inline unsigned long array_index_mask_nospec(unsigned long idx,
> +						    unsigned long sz)
> +{
> +	unsigned long mask;
> +
> +	asm volatile(
> +	"	cmp	%1, %2\n"
> +	"	sbc	%0, xzr, xzr\n"
> +	: "=r" (mask)
> +	: "r" (idx), "Ir" (sz)
> +	: "cc");
> +
> +	csdb();
> +	return mask;
> +}
> +
>  #define smp_mb()	dmb(ish)
>  #define smp_rmb()	dmb(ishld)
>  #define smp_wmb()	dmb(ishst)
> --
> 2.21.0.rc0.269.g1a574e7a288b
>
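For reference on the carry/borrow point above: on AArch64, "cmp idx, sz" sets the carry flag when idx >= sz (no borrow), and "sbc %0, xzr, xzr" then yields C - 1, i.e. ~0UL when idx < sz and 0 otherwise. The portable C fallback this asm overrides achieves the same result with plain arithmetic rather than a branch; the snippet below is a sketch modelled on the upstream include/linux/nospec.h definition, not code from this patch, and the helper name generic_index_mask is illustrative only.

#include <limits.h>	/* CHAR_BIT */

/*
 * Sketch of the portable fallback: returns ~0UL when 0 <= idx < sz and
 * 0 otherwise, using only arithmetic so the compiler has no conditional
 * branch to speculate past.
 */
static inline unsigned long generic_index_mask(unsigned long idx,
					       unsigned long sz)
{
	/*
	 * When idx < sz (and sz <= LONG_MAX), both idx and sz - 1 - idx
	 * have the sign bit clear, so the complement is negative and the
	 * arithmetic right shift smears the sign bit into an all-ones mask.
	 */
	return ~(long)(idx | (sz - 1UL - idx)) >>
	       (CHAR_BIT * sizeof(long) - 1);
}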
diff --git a/arch/arm64/include/asm/barrier.h b/arch/arm64/include/asm/barrier.h
index 574486634c62..7c25e3e11b6d 100644
--- a/arch/arm64/include/asm/barrier.h
+++ b/arch/arm64/include/asm/barrier.h
@@ -37,6 +37,27 @@
 #define dma_rmb()	dmb(oshld)
 #define dma_wmb()	dmb(oshst)
 
+/*
+ * Generate a mask for array_index__nospec() that is ~0UL when 0 <= idx < sz
+ * and 0 otherwise.
+ */
+#define array_index_mask_nospec array_index_mask_nospec
+static inline unsigned long array_index_mask_nospec(unsigned long idx,
+						    unsigned long sz)
+{
+	unsigned long mask;
+
+	asm volatile(
+	"	cmp	%1, %2\n"
+	"	sbc	%0, xzr, xzr\n"
+	: "=r" (mask)
+	: "r" (idx), "Ir" (sz)
+	: "cc");
+
+	csdb();
+	return mask;
+}
+
 #define smp_mb()	dmb(ish)
 #define smp_rmb()	dmb(ishld)
 #define smp_wmb()	dmb(ishst)
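The mask produced by array_index_mask_nospec() is meant to be ANDed into the index before it is used to address memory, so that an out-of-bounds value observed under speculation collapses to zero. A minimal usage sketch follows; the real consumer is the generic array_index_nospec() macro, and the function, array and parameter names here are hypothetical, not from this patch.

/*
 * Hypothetical example: read arr[idx] only when idx < size, clamping the
 * index so it stays in bounds even on a mispredicted path.
 */
static unsigned char read_clamped(const unsigned char *arr,
				  unsigned long size, unsigned long idx)
{
	if (idx < size) {
		/* mask is ~0UL in bounds, 0 otherwise, even speculatively */
		idx &= array_index_mask_nospec(idx, size);
		return arr[idx];
	}
	return 0;
}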