From patchwork Sat May 5 08:11:00 2018
X-Patchwork-Submitter: Ingo Molnar
X-Patchwork-Id: 10382005
Date: Sat, 5 May 2018 10:11:00 +0200
From: Ingo Molnar
To: Mark Rutland
Cc: Peter Zijlstra, catalin.marinas@arm.com, boqun.feng@gmail.com,
 will.deacon@arm.com, linux-kernel@vger.kernel.org, dvyukov@google.com,
 aryabinin@virtuozzo.com, linux-arm-kernel@lists.infradead.org
Subject: [PATCH] locking/atomics: Clean up the atomic.h maze of #defines
Message-ID: <20180505081100.nsyrqrpzq2vd27bk@gmail.com>
References: <20180504173937.25300-1-mark.rutland@arm.com>
 <20180504173937.25300-2-mark.rutland@arm.com>
 <20180504180105.GS12217@hirez.programming.kicks-ass.net>
 <20180504180909.dnhfflibjwywnm4l@lakrids.cambridge.arm.com>
In-Reply-To: <20180504180909.dnhfflibjwywnm4l@lakrids.cambridge.arm.com>

* Mark Rutland wrote:

> On Fri, May 04, 2018 at 08:01:05PM +0200, Peter Zijlstra wrote:
> > On Fri, May 04, 2018 at 06:39:32PM +0100, Mark Rutland wrote:
> > > Currently <asm-generic/atomic-instrumented.h> only instruments the
> > > fully ordered variants of atomic functions, ignoring the
> > > {relaxed,acquire,release} ordering variants.
> > >
> > > This patch reworks the header to instrument all ordering variants of
> > > the atomic functions, so that architectures implementing these are
> > > instrumented appropriately.
> > >
> > > To minimise repetition, a macro is used to generate each variant from
> > > a common template. The {full,relaxed,acquire,release} order variants
> > > respectively are then built using this template, where the
> > > architecture provides an implementation.
> >
> >  include/asm-generic/atomic-instrumented.h | 1195 ++++++++++++++++++++++++-----
> >  1 file changed, 1008 insertions(+), 187 deletions(-)
> >
> > Is there really no way to either generate or further macro compress this?
>
> I can definitely macro compress this somewhat, but the bulk of the
> repetition will be the ifdeffery, which can't be macro'd away IIUC.

The thing is, the existing #ifdeffery is suboptimal to begin with.
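(To make the tradeoff concrete: each wrapper in atomic-instrumented.h is a
thin shim around the arch_ primitive - something along these lines, quoting
from memory rather than from Mark's patch, so treat the exact
kasan_check_write() call as my assumption about what the instrumentation does:

	static __always_inline int atomic_add_return(int i, atomic_t *v)
	{
		/* Report the access to KASAN before doing the real work: */
		kasan_check_write(v, sizeof(*v));

		/* ... then forward to the architecture's implementation: */
		return arch_atomic_add_return(i, v);
	}

I.e. the shim bodies are trivial; it's the surrounding #ifdef scaffolding,
deciding which ordering variants exist per operation, that balloons the
line count.)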
I just did the following cleanups (patch attached):

 include/linux/atomic.h | 1275 +++++++++++++++++++++---------------------------
 1 file changed, 543 insertions(+), 732 deletions(-)

The gist of the changes is the following simplification of the main construct:

Before:

#ifndef atomic_fetch_dec_relaxed

#ifndef atomic_fetch_dec
#define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
#define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
#define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
#define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
#else /* atomic_fetch_dec */
#define atomic_fetch_dec_relaxed	atomic_fetch_dec
#define atomic_fetch_dec_acquire	atomic_fetch_dec
#define atomic_fetch_dec_release	atomic_fetch_dec
#endif /* atomic_fetch_dec */

#else /* atomic_fetch_dec_relaxed */

#ifndef atomic_fetch_dec_acquire
#define atomic_fetch_dec_acquire(...) \
	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
#endif

#ifndef atomic_fetch_dec_release
#define atomic_fetch_dec_release(...) \
	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
#endif

#ifndef atomic_fetch_dec
#define atomic_fetch_dec(...) \
	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
#endif
#endif /* atomic_fetch_dec_relaxed */

After:

#ifndef atomic_fetch_dec_relaxed
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
# else
#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
#  define atomic_fetch_dec_acquire		atomic_fetch_dec
#  define atomic_fetch_dec_release		atomic_fetch_dec
# endif
#else
# ifndef atomic_fetch_dec_acquire
#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec_release
#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
# endif
#endif

The new variant is readable at a glance, and the hierarchy of defines is very
obvious as well.

And I think we could do even better - there's absolutely no reason why _every_
operation has to be made conditional on a fine-grained level - they are
overridden in API groups. In fact allowing individual override is arguably a
fragility.

So we could do the following simplification on top of that:

#ifndef atomic_fetch_dec_relaxed
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
# else
#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
#  define atomic_fetch_dec_acquire		atomic_fetch_dec
#  define atomic_fetch_dec_release		atomic_fetch_dec
# endif
#else
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
# endif
#endif

Note how the grouping of APIs based on 'atomic_fetch_dec' is already an
assumption in the primary !atomic_fetch_dec_relaxed branch. I much prefer
such clear constructs of API mapping versus magic macros.
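BTW., for completeness: the __atomic_op_*() helpers that these mappings rely
on are defined near the top of the header, built out of the _relaxed variant
plus explicit barriers - roughly like this (quoting from memory, so treat it
as a sketch rather than the exact header contents):

#define __atomic_op_acquire(op, args...)				\
({									\
	typeof(op##_relaxed(args)) __ret = op##_relaxed(args);		\
	/* Order the relaxed op before all later memory accesses: */	\
	smp_mb__after_atomic();						\
	__ret;								\
})

#define __atomic_op_release(op, args...)				\
({									\
	/* Order all earlier memory accesses before the relaxed op: */	\
	smp_mb__before_atomic();					\
	op##_relaxed(args);						\
})

#define __atomic_op_fence(op, args...)					\
({									\
	typeof(op##_relaxed(args)) __ret;				\
	smp_mb__before_atomic();					\
	__ret = op##_relaxed(args);					\
	smp_mb__after_atomic();						\
	__ret;								\
})

So on an architecture that only provides atomic_fetch_dec_relaxed(), the
#else branch above turns atomic_fetch_dec_acquire(v) into
atomic_fetch_dec_relaxed(v) followed by smp_mb__after_atomic() - which is
exactly the per-operation grouping the proposed construct makes explicit.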
Thanks,

	Ingo

=============================>
From 0171d4ed840d25befaedcf03e834bb76acb400c0 Mon Sep 17 00:00:00 2001
From: Ingo Molnar
Date: Sat, 5 May 2018 09:57:02 +0200
Subject: [PATCH] locking/atomics: Clean up the atomic.h maze of #defines

Use structured defines to make it all much more readable.

Before:

 #ifndef atomic_fetch_dec_relaxed

 #ifndef atomic_fetch_dec
 #define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
 #define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
 #define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
 #define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
 #else /* atomic_fetch_dec */
 #define atomic_fetch_dec_relaxed	atomic_fetch_dec
 #define atomic_fetch_dec_acquire	atomic_fetch_dec
 #define atomic_fetch_dec_release	atomic_fetch_dec
 #endif /* atomic_fetch_dec */

 #else /* atomic_fetch_dec_relaxed */

 #ifndef atomic_fetch_dec_acquire
 #define atomic_fetch_dec_acquire(...) \
	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
 #endif

 #ifndef atomic_fetch_dec_release
 #define atomic_fetch_dec_release(...) \
	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 #endif

 #ifndef atomic_fetch_dec
 #define atomic_fetch_dec(...) \
	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
 #endif
 #endif /* atomic_fetch_dec_relaxed */

After:

 #ifndef atomic_fetch_dec_relaxed
 # ifndef atomic_fetch_dec
 #  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
 #  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
 #  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
 #  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
 # else
 #  define atomic_fetch_dec_relaxed		atomic_fetch_dec
 #  define atomic_fetch_dec_acquire		atomic_fetch_dec
 #  define atomic_fetch_dec_release		atomic_fetch_dec
 # endif
 #else
 # ifndef atomic_fetch_dec_acquire
 #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
 # endif
 # ifndef atomic_fetch_dec_release
 #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 # endif
 # ifndef atomic_fetch_dec
 #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
 # endif
 #endif

Beyond the line count reduction this also makes it easier to follow the
various conditions.

Also clean up a few other minor details and make the code more consistent
throughout.

No change in functionality.

Cc: Peter Zijlstra
Cc: Linus Torvalds
Cc: Andrew Morton
Cc: Thomas Gleixner
Cc: Paul E. McKenney
Cc: Will Deacon
Cc: linux-kernel@vger.kernel.org
Signed-off-by: Ingo Molnar
---
 include/linux/atomic.h | 1275 +++++++++++++++++++++---------------------------
 1 file changed, 543 insertions(+), 732 deletions(-)

diff --git a/include/linux/atomic.h b/include/linux/atomic.h
index 01ce3997cb42..dc157c092ae5 100644
--- a/include/linux/atomic.h
+++ b/include/linux/atomic.h
@@ -24,11 +24,11 @@
  */

 #ifndef atomic_read_acquire
-#define atomic_read_acquire(v) smp_load_acquire(&(v)->counter)
+# define atomic_read_acquire(v) smp_load_acquire(&(v)->counter)
 #endif

 #ifndef atomic_set_release
-#define atomic_set_release(v, i) smp_store_release(&(v)->counter, (i))
+# define atomic_set_release(v, i) smp_store_release(&(v)->counter, (i))
 #endif

 /*
@@ -71,454 +71,351 @@
 })
 #endif

-/* atomic_add_return_relaxed */
-#ifndef atomic_add_return_relaxed
-#define atomic_add_return_relaxed atomic_add_return
-#define atomic_add_return_acquire atomic_add_return
-#define atomic_add_return_release atomic_add_return
-
-#else /* atomic_add_return_relaxed */
-
-#ifndef atomic_add_return_acquire
-#define atomic_add_return_acquire(...) \
-	__atomic_op_acquire(atomic_add_return, __VA_ARGS__)
-#endif
+/* atomic_add_return_relaxed() et al: */

-#ifndef atomic_add_return_release
-#define atomic_add_return_release(...) \
-	__atomic_op_release(atomic_add_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_add_return
-#define atomic_add_return(...) \
-	__atomic_op_fence(atomic_add_return, __VA_ARGS__)
-#endif
-#endif /* atomic_add_return_relaxed */
+#ifndef atomic_add_return_relaxed
+# define atomic_add_return_relaxed atomic_add_return
+# define atomic_add_return_acquire atomic_add_return
+# define atomic_add_return_release atomic_add_return
+#else
+# ifndef atomic_add_return_acquire
+#  define atomic_add_return_acquire(...) __atomic_op_acquire(atomic_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic_add_return_release
+#  define atomic_add_return_release(...) __atomic_op_release(atomic_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic_add_return
+#  define atomic_add_return(...) __atomic_op_fence(atomic_add_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_inc_return_relaxed() et al: */

-/* atomic_inc_return_relaxed */
 #ifndef atomic_inc_return_relaxed
-#define atomic_inc_return_relaxed atomic_inc_return
-#define atomic_inc_return_acquire atomic_inc_return
-#define atomic_inc_return_release atomic_inc_return
-
-#else /* atomic_inc_return_relaxed */
-
-#ifndef atomic_inc_return_acquire
-#define atomic_inc_return_acquire(...) \
-	__atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_inc_return_release
-#define atomic_inc_return_release(...) \
-	__atomic_op_release(atomic_inc_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_inc_return
-#define atomic_inc_return(...) \
-	__atomic_op_fence(atomic_inc_return, __VA_ARGS__)
-#endif
-#endif /* atomic_inc_return_relaxed */
+# define atomic_inc_return_relaxed atomic_inc_return
+# define atomic_inc_return_acquire atomic_inc_return
+# define atomic_inc_return_release atomic_inc_return
+#else
+# ifndef atomic_inc_return_acquire
+#  define atomic_inc_return_acquire(...) __atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
+# endif
+# ifndef atomic_inc_return_release
+#  define atomic_inc_return_release(...) __atomic_op_release(atomic_inc_return, __VA_ARGS__)
+# endif
+# ifndef atomic_inc_return
+#  define atomic_inc_return(...) __atomic_op_fence(atomic_inc_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_sub_return_relaxed() et al: */

-/* atomic_sub_return_relaxed */
 #ifndef atomic_sub_return_relaxed
-#define atomic_sub_return_relaxed atomic_sub_return
-#define atomic_sub_return_acquire atomic_sub_return
-#define atomic_sub_return_release atomic_sub_return
-
-#else /* atomic_sub_return_relaxed */
-
-#ifndef atomic_sub_return_acquire
-#define atomic_sub_return_acquire(...) \
-	__atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_sub_return_release
-#define atomic_sub_return_release(...) \
-	__atomic_op_release(atomic_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_sub_return
-#define atomic_sub_return(...) \
-	__atomic_op_fence(atomic_sub_return, __VA_ARGS__)
-#endif
-#endif /* atomic_sub_return_relaxed */
+# define atomic_sub_return_relaxed atomic_sub_return
+# define atomic_sub_return_acquire atomic_sub_return
+# define atomic_sub_return_release atomic_sub_return
+#else
+# ifndef atomic_sub_return_acquire
+#  define atomic_sub_return_acquire(...) __atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic_sub_return_release
+#  define atomic_sub_return_release(...) __atomic_op_release(atomic_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic_sub_return
+#  define atomic_sub_return(...) __atomic_op_fence(atomic_sub_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_dec_return_relaxed() et al: */

-/* atomic_dec_return_relaxed */
 #ifndef atomic_dec_return_relaxed
-#define atomic_dec_return_relaxed atomic_dec_return
-#define atomic_dec_return_acquire atomic_dec_return
-#define atomic_dec_return_release atomic_dec_return
-
-#else /* atomic_dec_return_relaxed */
-
-#ifndef atomic_dec_return_acquire
-#define atomic_dec_return_acquire(...) \
-	__atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
-#endif
+# define atomic_dec_return_relaxed atomic_dec_return
+# define atomic_dec_return_acquire atomic_dec_return
+# define atomic_dec_return_release atomic_dec_return
+#else
+# ifndef atomic_dec_return_acquire
+#  define atomic_dec_return_acquire(...) __atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic_dec_return_release
+#  define atomic_dec_return_release(...) __atomic_op_release(atomic_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic_dec_return
+#  define atomic_dec_return(...) __atomic_op_fence(atomic_dec_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_add_relaxed() et al: */

-#ifndef atomic_dec_return_release
-#define atomic_dec_return_release(...) \
-	__atomic_op_release(atomic_dec_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_dec_return
-#define atomic_dec_return(...) \
-	__atomic_op_fence(atomic_dec_return, __VA_ARGS__)
-#endif
-#endif /* atomic_dec_return_relaxed */
-
-
-/* atomic_fetch_add_relaxed */
 #ifndef atomic_fetch_add_relaxed
-#define atomic_fetch_add_relaxed atomic_fetch_add
-#define atomic_fetch_add_acquire atomic_fetch_add
-#define atomic_fetch_add_release atomic_fetch_add
-
-#else /* atomic_fetch_add_relaxed */
-
-#ifndef atomic_fetch_add_acquire
-#define atomic_fetch_add_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_add_release
-#define atomic_fetch_add_release(...) \
-	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
-#endif
+# define atomic_fetch_add_relaxed atomic_fetch_add
+# define atomic_fetch_add_acquire atomic_fetch_add
+# define atomic_fetch_add_release atomic_fetch_add
+#else
+# ifndef atomic_fetch_add_acquire
+#  define atomic_fetch_add_acquire(...) __atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_add_release
+#  define atomic_fetch_add_release(...) __atomic_op_release(atomic_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_add
+#  define atomic_fetch_add(...) __atomic_op_fence(atomic_fetch_add, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_inc_relaxed() et al: */

-#ifndef atomic_fetch_add
-#define atomic_fetch_add(...) \
-	__atomic_op_fence(atomic_fetch_add, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_add_relaxed */
-
-/* atomic_fetch_inc_relaxed */
 #ifndef atomic_fetch_inc_relaxed
+# ifndef atomic_fetch_inc
+#  define atomic_fetch_inc(v) atomic_fetch_add(1, (v))
+#  define atomic_fetch_inc_relaxed(v) atomic_fetch_add_relaxed(1, (v))
+#  define atomic_fetch_inc_acquire(v) atomic_fetch_add_acquire(1, (v))
+#  define atomic_fetch_inc_release(v) atomic_fetch_add_release(1, (v))
+# else
+#  define atomic_fetch_inc_relaxed atomic_fetch_inc
+#  define atomic_fetch_inc_acquire atomic_fetch_inc
+#  define atomic_fetch_inc_release atomic_fetch_inc
+# endif
+#else
+# ifndef atomic_fetch_inc_acquire
+#  define atomic_fetch_inc_acquire(...) __atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_inc_release
+#  define atomic_fetch_inc_release(...) __atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_inc
+#  define atomic_fetch_inc(...) __atomic_op_fence(atomic_fetch_inc, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_sub_relaxed() et al: */

-#ifndef atomic_fetch_inc
-#define atomic_fetch_inc(v) atomic_fetch_add(1, (v))
-#define atomic_fetch_inc_relaxed(v) atomic_fetch_add_relaxed(1, (v))
-#define atomic_fetch_inc_acquire(v) atomic_fetch_add_acquire(1, (v))
-#define atomic_fetch_inc_release(v) atomic_fetch_add_release(1, (v))
-#else /* atomic_fetch_inc */
-#define atomic_fetch_inc_relaxed atomic_fetch_inc
-#define atomic_fetch_inc_acquire atomic_fetch_inc
-#define atomic_fetch_inc_release atomic_fetch_inc
-#endif /* atomic_fetch_inc */
-
-#else /* atomic_fetch_inc_relaxed */
-
-#ifndef atomic_fetch_inc_acquire
-#define atomic_fetch_inc_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_inc_release
-#define atomic_fetch_inc_release(...) \
-	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_inc
-#define atomic_fetch_inc(...) \
-	__atomic_op_fence(atomic_fetch_inc, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_inc_relaxed */
-
-/* atomic_fetch_sub_relaxed */
 #ifndef atomic_fetch_sub_relaxed
-#define atomic_fetch_sub_relaxed atomic_fetch_sub
-#define atomic_fetch_sub_acquire atomic_fetch_sub
-#define atomic_fetch_sub_release atomic_fetch_sub
-
-#else /* atomic_fetch_sub_relaxed */
-
-#ifndef atomic_fetch_sub_acquire
-#define atomic_fetch_sub_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
-#endif
+# define atomic_fetch_sub_relaxed atomic_fetch_sub
+# define atomic_fetch_sub_acquire atomic_fetch_sub
+# define atomic_fetch_sub_release atomic_fetch_sub
+#else
+# ifndef atomic_fetch_sub_acquire
+#  define atomic_fetch_sub_acquire(...) __atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_sub_release
+#  define atomic_fetch_sub_release(...) __atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_sub
+#  define atomic_fetch_sub(...) __atomic_op_fence(atomic_fetch_sub, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_dec_relaxed() et al: */

-#ifndef atomic_fetch_sub_release
-#define atomic_fetch_sub_release(...) \
-	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_sub
-#define atomic_fetch_sub(...) \
-	__atomic_op_fence(atomic_fetch_sub, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_sub_relaxed */
-
-/* atomic_fetch_dec_relaxed */
 #ifndef atomic_fetch_dec_relaxed
+# ifndef atomic_fetch_dec
+#  define atomic_fetch_dec(v) atomic_fetch_sub(1, (v))
+#  define atomic_fetch_dec_relaxed(v) atomic_fetch_sub_relaxed(1, (v))
+#  define atomic_fetch_dec_acquire(v) atomic_fetch_sub_acquire(1, (v))
+#  define atomic_fetch_dec_release(v) atomic_fetch_sub_release(1, (v))
+# else
+#  define atomic_fetch_dec_relaxed atomic_fetch_dec
+#  define atomic_fetch_dec_acquire atomic_fetch_dec
+#  define atomic_fetch_dec_release atomic_fetch_dec
+# endif
+#else
+# ifndef atomic_fetch_dec_acquire
+#  define atomic_fetch_dec_acquire(...) __atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_dec_release
+#  define atomic_fetch_dec_release(...) __atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_dec
+#  define atomic_fetch_dec(...) __atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_or_relaxed() et al: */

-#ifndef atomic_fetch_dec
-#define atomic_fetch_dec(v) atomic_fetch_sub(1, (v))
-#define atomic_fetch_dec_relaxed(v) atomic_fetch_sub_relaxed(1, (v))
-#define atomic_fetch_dec_acquire(v) atomic_fetch_sub_acquire(1, (v))
-#define atomic_fetch_dec_release(v) atomic_fetch_sub_release(1, (v))
-#else /* atomic_fetch_dec */
-#define atomic_fetch_dec_relaxed atomic_fetch_dec
-#define atomic_fetch_dec_acquire atomic_fetch_dec
-#define atomic_fetch_dec_release atomic_fetch_dec
-#endif /* atomic_fetch_dec */
-
-#else /* atomic_fetch_dec_relaxed */
-
-#ifndef atomic_fetch_dec_acquire
-#define atomic_fetch_dec_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_dec_release
-#define atomic_fetch_dec_release(...) \
-	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_dec
-#define atomic_fetch_dec(...) \
-	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_dec_relaxed */
-
-/* atomic_fetch_or_relaxed */
 #ifndef atomic_fetch_or_relaxed
-#define atomic_fetch_or_relaxed atomic_fetch_or
-#define atomic_fetch_or_acquire atomic_fetch_or
-#define atomic_fetch_or_release atomic_fetch_or
-
-#else /* atomic_fetch_or_relaxed */
-
-#ifndef atomic_fetch_or_acquire
-#define atomic_fetch_or_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_or_release
-#define atomic_fetch_or_release(...) \
-	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_or
-#define atomic_fetch_or(...) \
-	__atomic_op_fence(atomic_fetch_or, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_or_relaxed */
+# define atomic_fetch_or_relaxed atomic_fetch_or
+# define atomic_fetch_or_acquire atomic_fetch_or
+# define atomic_fetch_or_release atomic_fetch_or
+#else
+# ifndef atomic_fetch_or_acquire
+#  define atomic_fetch_or_acquire(...) __atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_or_release
+#  define atomic_fetch_or_release(...) __atomic_op_release(atomic_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_or
+#  define atomic_fetch_or(...) __atomic_op_fence(atomic_fetch_or, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_and_relaxed() et al: */

-/* atomic_fetch_and_relaxed */
 #ifndef atomic_fetch_and_relaxed
-#define atomic_fetch_and_relaxed atomic_fetch_and
-#define atomic_fetch_and_acquire atomic_fetch_and
-#define atomic_fetch_and_release atomic_fetch_and
-
-#else /* atomic_fetch_and_relaxed */
-
-#ifndef atomic_fetch_and_acquire
-#define atomic_fetch_and_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_and_release
-#define atomic_fetch_and_release(...) \
-	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_and
-#define atomic_fetch_and(...) \
-	__atomic_op_fence(atomic_fetch_and, __VA_ARGS__)
+# define atomic_fetch_and_relaxed atomic_fetch_and
+# define atomic_fetch_and_acquire atomic_fetch_and
+# define atomic_fetch_and_release atomic_fetch_and
+#else
+# ifndef atomic_fetch_and_acquire
+#  define atomic_fetch_and_acquire(...) __atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_and_release
+#  define atomic_fetch_and_release(...) __atomic_op_release(atomic_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_and
+#  define atomic_fetch_and(...) __atomic_op_fence(atomic_fetch_and, __VA_ARGS__)
+# endif
 #endif
-#endif /* atomic_fetch_and_relaxed */

 #ifdef atomic_andnot
-/* atomic_fetch_andnot_relaxed */
-#ifndef atomic_fetch_andnot_relaxed
-#define atomic_fetch_andnot_relaxed atomic_fetch_andnot
-#define atomic_fetch_andnot_acquire atomic_fetch_andnot
-#define atomic_fetch_andnot_release atomic_fetch_andnot
-
-#else /* atomic_fetch_andnot_relaxed */
-#ifndef atomic_fetch_andnot_acquire
-#define atomic_fetch_andnot_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
-#endif
+/* atomic_fetch_andnot_relaxed() et al: */

-#ifndef atomic_fetch_andnot_release
-#define atomic_fetch_andnot_release(...) \
-	__atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
+#ifndef atomic_fetch_andnot_relaxed
+# define atomic_fetch_andnot_relaxed atomic_fetch_andnot
+# define atomic_fetch_andnot_acquire atomic_fetch_andnot
+# define atomic_fetch_andnot_release atomic_fetch_andnot
+#else
+# ifndef atomic_fetch_andnot_acquire
+#  define atomic_fetch_andnot_acquire(...) __atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_andnot_release
+#  define atomic_fetch_andnot_release(...) __atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_andnot
+#  define atomic_fetch_andnot(...) __atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic_fetch_andnot
-#define atomic_fetch_andnot(...) \
-	__atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_andnot_relaxed */
 #endif /* atomic_andnot */

-/* atomic_fetch_xor_relaxed */
-#ifndef atomic_fetch_xor_relaxed
-#define atomic_fetch_xor_relaxed atomic_fetch_xor
-#define atomic_fetch_xor_acquire atomic_fetch_xor
-#define atomic_fetch_xor_release atomic_fetch_xor
-
-#else /* atomic_fetch_xor_relaxed */
+/* atomic_fetch_xor_relaxed() et al: */

-#ifndef atomic_fetch_xor_acquire
-#define atomic_fetch_xor_acquire(...) \
-	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_xor_release
-#define atomic_fetch_xor_release(...) \
-	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
+#ifndef atomic_fetch_xor_relaxed
+# define atomic_fetch_xor_relaxed atomic_fetch_xor
+# define atomic_fetch_xor_acquire atomic_fetch_xor
+# define atomic_fetch_xor_release atomic_fetch_xor
+#else
+# ifndef atomic_fetch_xor_acquire
+#  define atomic_fetch_xor_acquire(...) __atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_xor_release
+#  define atomic_fetch_xor_release(...) __atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_xor
+#  define atomic_fetch_xor(...) __atomic_op_fence(atomic_fetch_xor, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic_fetch_xor
-#define atomic_fetch_xor(...) \
-	__atomic_op_fence(atomic_fetch_xor, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_xor_relaxed */
+/* atomic_xchg_relaxed() et al: */

-/* atomic_xchg_relaxed */
 #ifndef atomic_xchg_relaxed
-#define atomic_xchg_relaxed atomic_xchg
-#define atomic_xchg_acquire atomic_xchg
-#define atomic_xchg_release atomic_xchg
-
-#else /* atomic_xchg_relaxed */
-
-#ifndef atomic_xchg_acquire
-#define atomic_xchg_acquire(...) \
-	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic_xchg_release
-#define atomic_xchg_release(...) \
-	__atomic_op_release(atomic_xchg, __VA_ARGS__)
-#endif
+#define atomic_xchg_relaxed atomic_xchg
+#define atomic_xchg_acquire atomic_xchg
+#define atomic_xchg_release atomic_xchg
+#else
+# ifndef atomic_xchg_acquire
+#  define atomic_xchg_acquire(...) __atomic_op_acquire(atomic_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic_xchg_release
+#  define atomic_xchg_release(...) __atomic_op_release(atomic_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic_xchg
+#  define atomic_xchg(...) __atomic_op_fence(atomic_xchg, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_cmpxchg_relaxed() et al: */

-#ifndef atomic_xchg
-#define atomic_xchg(...) \
-	__atomic_op_fence(atomic_xchg, __VA_ARGS__)
-#endif
-#endif /* atomic_xchg_relaxed */
-
-/* atomic_cmpxchg_relaxed */
 #ifndef atomic_cmpxchg_relaxed
-#define atomic_cmpxchg_relaxed atomic_cmpxchg
-#define atomic_cmpxchg_acquire atomic_cmpxchg
-#define atomic_cmpxchg_release atomic_cmpxchg
-
-#else /* atomic_cmpxchg_relaxed */
-
-#ifndef atomic_cmpxchg_acquire
-#define atomic_cmpxchg_acquire(...) \
-	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
+# define atomic_cmpxchg_relaxed atomic_cmpxchg
+# define atomic_cmpxchg_acquire atomic_cmpxchg
+# define atomic_cmpxchg_release atomic_cmpxchg
+#else
+# ifndef atomic_cmpxchg_acquire
+#  define atomic_cmpxchg_acquire(...) __atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic_cmpxchg_release
+#  define atomic_cmpxchg_release(...) __atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic_cmpxchg
+#  define atomic_cmpxchg(...) __atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic_cmpxchg_release
-#define atomic_cmpxchg_release(...) \
-	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic_cmpxchg
-#define atomic_cmpxchg(...) \
-	__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
-#endif
-#endif /* atomic_cmpxchg_relaxed */
-
 #ifndef atomic_try_cmpxchg
-
-#define __atomic_try_cmpxchg(type, _p, _po, _n) \
-({ \
+# define __atomic_try_cmpxchg(type, _p, _po, _n) \
+ ({ \
	typeof(_po) __po = (_po); \
	typeof(*(_po)) __r, __o = *__po; \
	__r = atomic_cmpxchg##type((_p), __o, (_n)); \
	if (unlikely(__r != __o)) \
		*__po = __r; \
	likely(__r == __o); \
-})
-
-#define atomic_try_cmpxchg(_p, _po, _n) __atomic_try_cmpxchg(, _p, _po, _n)
-#define atomic_try_cmpxchg_relaxed(_p, _po, _n) __atomic_try_cmpxchg(_relaxed, _p, _po, _n)
-#define atomic_try_cmpxchg_acquire(_p, _po, _n) __atomic_try_cmpxchg(_acquire, _p, _po, _n)
-#define atomic_try_cmpxchg_release(_p, _po, _n) __atomic_try_cmpxchg(_release, _p, _po, _n)
-
-#else /* atomic_try_cmpxchg */
-#define atomic_try_cmpxchg_relaxed atomic_try_cmpxchg
-#define atomic_try_cmpxchg_acquire atomic_try_cmpxchg
-#define atomic_try_cmpxchg_release atomic_try_cmpxchg
-#endif /* atomic_try_cmpxchg */
-
-/* cmpxchg_relaxed */
-#ifndef cmpxchg_relaxed
-#define cmpxchg_relaxed cmpxchg
-#define cmpxchg_acquire cmpxchg
-#define cmpxchg_release cmpxchg
-
-#else /* cmpxchg_relaxed */
-
-#ifndef cmpxchg_acquire
-#define cmpxchg_acquire(...) \
-	__atomic_op_acquire(cmpxchg, __VA_ARGS__)
+ })
+# define atomic_try_cmpxchg(_p, _po, _n) __atomic_try_cmpxchg(, _p, _po, _n)
+# define atomic_try_cmpxchg_relaxed(_p, _po, _n) __atomic_try_cmpxchg(_relaxed, _p, _po, _n)
+# define atomic_try_cmpxchg_acquire(_p, _po, _n) __atomic_try_cmpxchg(_acquire, _p, _po, _n)
+# define atomic_try_cmpxchg_release(_p, _po, _n) __atomic_try_cmpxchg(_release, _p, _po, _n)
+#else
+# define atomic_try_cmpxchg_relaxed atomic_try_cmpxchg
+# define atomic_try_cmpxchg_acquire atomic_try_cmpxchg
+# define atomic_try_cmpxchg_release atomic_try_cmpxchg
 #endif

-#ifndef cmpxchg_release
-#define cmpxchg_release(...) \
-	__atomic_op_release(cmpxchg, __VA_ARGS__)
-#endif
+/* cmpxchg_relaxed() et al: */

-#ifndef cmpxchg
-#define cmpxchg(...) \
-	__atomic_op_fence(cmpxchg, __VA_ARGS__)
-#endif
-#endif /* cmpxchg_relaxed */
+#ifndef cmpxchg_relaxed
+# define cmpxchg_relaxed cmpxchg
+# define cmpxchg_acquire cmpxchg
+# define cmpxchg_release cmpxchg
+#else
+# ifndef cmpxchg_acquire
+#  define cmpxchg_acquire(...) __atomic_op_acquire(cmpxchg, __VA_ARGS__)
+# endif
+# ifndef cmpxchg_release
+#  define cmpxchg_release(...) __atomic_op_release(cmpxchg, __VA_ARGS__)
+# endif
+# ifndef cmpxchg
+#  define cmpxchg(...) __atomic_op_fence(cmpxchg, __VA_ARGS__)
+# endif
+#endif
+
+/* cmpxchg64_relaxed() et al: */

-/* cmpxchg64_relaxed */
 #ifndef cmpxchg64_relaxed
-#define cmpxchg64_relaxed cmpxchg64
-#define cmpxchg64_acquire cmpxchg64
-#define cmpxchg64_release cmpxchg64
-
-#else /* cmpxchg64_relaxed */
-
-#ifndef cmpxchg64_acquire
-#define cmpxchg64_acquire(...) \
-	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
-#endif
-
-#ifndef cmpxchg64_release
-#define cmpxchg64_release(...) \
-	__atomic_op_release(cmpxchg64, __VA_ARGS__)
-#endif
-
-#ifndef cmpxchg64
-#define cmpxchg64(...) \
-	__atomic_op_fence(cmpxchg64, __VA_ARGS__)
-#endif
-#endif /* cmpxchg64_relaxed */
+# define cmpxchg64_relaxed cmpxchg64
+# define cmpxchg64_acquire cmpxchg64
+# define cmpxchg64_release cmpxchg64
+#else
+# ifndef cmpxchg64_acquire
+#  define cmpxchg64_acquire(...) __atomic_op_acquire(cmpxchg64, __VA_ARGS__)
+# endif
+# ifndef cmpxchg64_release
+#  define cmpxchg64_release(...) __atomic_op_release(cmpxchg64, __VA_ARGS__)
+# endif
+# ifndef cmpxchg64
+#  define cmpxchg64(...) __atomic_op_fence(cmpxchg64, __VA_ARGS__)
+# endif
+#endif
+
+/* xchg_relaxed() et al: */

-/* xchg_relaxed */
 #ifndef xchg_relaxed
-#define xchg_relaxed xchg
-#define xchg_acquire xchg
-#define xchg_release xchg
-
-#else /* xchg_relaxed */
-
-#ifndef xchg_acquire
-#define xchg_acquire(...) __atomic_op_acquire(xchg, __VA_ARGS__)
-#endif
-
-#ifndef xchg_release
-#define xchg_release(...) __atomic_op_release(xchg, __VA_ARGS__)
+# define xchg_relaxed xchg
+# define xchg_acquire xchg
+# define xchg_release xchg
+#else
+# ifndef xchg_acquire
+#  define xchg_acquire(...) __atomic_op_acquire(xchg, __VA_ARGS__)
+# endif
+# ifndef xchg_release
+#  define xchg_release(...) __atomic_op_release(xchg, __VA_ARGS__)
+# endif
+# ifndef xchg
+#  define xchg(...) __atomic_op_fence(xchg, __VA_ARGS__)
+# endif
 #endif

-#ifndef xchg
-#define xchg(...) __atomic_op_fence(xchg, __VA_ARGS__)
-#endif
-#endif /* xchg_relaxed */
-
 /**
  * atomic_add_unless - add unless the number is already a given value
  * @v: pointer of type atomic_t
@@ -541,7 +438,7 @@ static inline int atomic_add_unless(atomic_t *v, int a, int u)
  * Returns non-zero if @v was non-zero, and zero otherwise.
  */
 #ifndef atomic_inc_not_zero
-#define atomic_inc_not_zero(v) atomic_add_unless((v), 1, 0)
+# define atomic_inc_not_zero(v) atomic_add_unless((v), 1, 0)
 #endif

 #ifndef atomic_andnot
@@ -607,6 +504,7 @@ static inline int atomic_inc_not_zero_hint(atomic_t *v, int hint)
 static inline int atomic_inc_unless_negative(atomic_t *p)
 {
	int v, v1;
+
	for (v = 0; v >= 0; v = v1) {
		v1 = atomic_cmpxchg(p, v, v + 1);
		if (likely(v1 == v))
@@ -620,6 +518,7 @@ static inline int atomic_inc_unless_negative(atomic_t *p)
 static inline int atomic_dec_unless_positive(atomic_t *p)
 {
	int v, v1;
+
	for (v = 0; v <= 0; v = v1) {
		v1 = atomic_cmpxchg(p, v, v - 1);
		if (likely(v1 == v))
@@ -640,6 +539,7 @@ static inline int atomic_dec_unless_positive(atomic_t *p)
 static inline int atomic_dec_if_positive(atomic_t *v)
 {
	int c, old, dec;
+
	c = atomic_read(v);
	for (;;) {
		dec = c - 1;
@@ -654,400 +554,311 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 }
 #endif

-#define atomic_cond_read_relaxed(v, c) smp_cond_load_relaxed(&(v)->counter, (c))
-#define atomic_cond_read_acquire(v, c) smp_cond_load_acquire(&(v)->counter, (c))
+#define atomic_cond_read_relaxed(v, c)	smp_cond_load_relaxed(&(v)->counter, (c))
+#define atomic_cond_read_acquire(v, c)	smp_cond_load_acquire(&(v)->counter, (c))

 #ifdef CONFIG_GENERIC_ATOMIC64
 #include <asm-generic/atomic64.h>
 #endif

 #ifndef atomic64_read_acquire
-#define atomic64_read_acquire(v) smp_load_acquire(&(v)->counter)
+# define atomic64_read_acquire(v) smp_load_acquire(&(v)->counter)
 #endif

 #ifndef atomic64_set_release
-#define atomic64_set_release(v, i) smp_store_release(&(v)->counter, (i))
-#endif
-
-/* atomic64_add_return_relaxed */
-#ifndef atomic64_add_return_relaxed
-#define atomic64_add_return_relaxed atomic64_add_return
-#define atomic64_add_return_acquire atomic64_add_return
-#define atomic64_add_return_release atomic64_add_return
-
-#else /* atomic64_add_return_relaxed */
-
-#ifndef atomic64_add_return_acquire
-#define atomic64_add_return_acquire(...) \
-	__atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
+# define atomic64_set_release(v, i) smp_store_release(&(v)->counter, (i))
 #endif

-#ifndef atomic64_add_return_release
-#define atomic64_add_return_release(...) \
-	__atomic_op_release(atomic64_add_return, __VA_ARGS__)
-#endif
+/* atomic64_add_return_relaxed() et al: */

-#ifndef atomic64_add_return
-#define atomic64_add_return(...) \
-	__atomic_op_fence(atomic64_add_return, __VA_ARGS__)
-#endif
-#endif /* atomic64_add_return_relaxed */
+#ifndef atomic64_add_return_relaxed
+# define atomic64_add_return_relaxed atomic64_add_return
+# define atomic64_add_return_acquire atomic64_add_return
+# define atomic64_add_return_release atomic64_add_return
+#else
+# ifndef atomic64_add_return_acquire
+#  define atomic64_add_return_acquire(...) __atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_add_return_release
+#  define atomic64_add_return_release(...) __atomic_op_release(atomic64_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_add_return
+#  define atomic64_add_return(...) __atomic_op_fence(atomic64_add_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_inc_return_relaxed() et al: */

-/* atomic64_inc_return_relaxed */
 #ifndef atomic64_inc_return_relaxed
-#define atomic64_inc_return_relaxed atomic64_inc_return
-#define atomic64_inc_return_acquire atomic64_inc_return
-#define atomic64_inc_return_release atomic64_inc_return
-
-#else /* atomic64_inc_return_relaxed */
-
-#ifndef atomic64_inc_return_acquire
-#define atomic64_inc_return_acquire(...) \
-	__atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_inc_return_release
-#define atomic64_inc_return_release(...) \
-	__atomic_op_release(atomic64_inc_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_inc_return
-#define atomic64_inc_return(...) \
-	__atomic_op_fence(atomic64_inc_return, __VA_ARGS__)
-#endif
-#endif /* atomic64_inc_return_relaxed */
-
+# define atomic64_inc_return_relaxed atomic64_inc_return
+# define atomic64_inc_return_acquire atomic64_inc_return
+# define atomic64_inc_return_release atomic64_inc_return
+#else
+# ifndef atomic64_inc_return_acquire
+#  define atomic64_inc_return_acquire(...) __atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_inc_return_release
+#  define atomic64_inc_return_release(...) __atomic_op_release(atomic64_inc_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_inc_return
+#  define atomic64_inc_return(...) __atomic_op_fence(atomic64_inc_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_sub_return_relaxed() et al: */

-/* atomic64_sub_return_relaxed */
 #ifndef atomic64_sub_return_relaxed
-#define atomic64_sub_return_relaxed atomic64_sub_return
-#define atomic64_sub_return_acquire atomic64_sub_return
-#define atomic64_sub_return_release atomic64_sub_return
+# define atomic64_sub_return_relaxed atomic64_sub_return
+# define atomic64_sub_return_acquire atomic64_sub_return
+# define atomic64_sub_return_release atomic64_sub_return
+#else
+# ifndef atomic64_sub_return_acquire
+#  define atomic64_sub_return_acquire(...) __atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_sub_return_release
+#  define atomic64_sub_return_release(...) __atomic_op_release(atomic64_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_sub_return
+#  define atomic64_sub_return(...) __atomic_op_fence(atomic64_sub_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_dec_return_relaxed() et al: */

-#else /* atomic64_sub_return_relaxed */
-
-#ifndef atomic64_sub_return_acquire
-#define atomic64_sub_return_acquire(...) \
-	__atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_sub_return_release
-#define atomic64_sub_return_release(...) \
-	__atomic_op_release(atomic64_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_sub_return
-#define atomic64_sub_return(...) \
-	__atomic_op_fence(atomic64_sub_return, __VA_ARGS__)
-#endif
-#endif /* atomic64_sub_return_relaxed */
-
-/* atomic64_dec_return_relaxed */
 #ifndef atomic64_dec_return_relaxed
-#define atomic64_dec_return_relaxed atomic64_dec_return
-#define atomic64_dec_return_acquire atomic64_dec_return
-#define atomic64_dec_return_release atomic64_dec_return
-
-#else /* atomic64_dec_return_relaxed */
-
-#ifndef atomic64_dec_return_acquire
-#define atomic64_dec_return_acquire(...) \
-	__atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_dec_return_release
-#define atomic64_dec_return_release(...) \
-	__atomic_op_release(atomic64_dec_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_dec_return
-#define atomic64_dec_return(...) \
-	__atomic_op_fence(atomic64_dec_return, __VA_ARGS__)
-#endif
-#endif /* atomic64_dec_return_relaxed */
+# define atomic64_dec_return_relaxed atomic64_dec_return
+# define atomic64_dec_return_acquire atomic64_dec_return
+# define atomic64_dec_return_release atomic64_dec_return
+#else
+# ifndef atomic64_dec_return_acquire
+#  define atomic64_dec_return_acquire(...) __atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_dec_return_release
+#  define atomic64_dec_return_release(...) __atomic_op_release(atomic64_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_dec_return
+#  define atomic64_dec_return(...) __atomic_op_fence(atomic64_dec_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_add_relaxed() et al: */

-
-/* atomic64_fetch_add_relaxed */
 #ifndef atomic64_fetch_add_relaxed
-#define atomic64_fetch_add_relaxed atomic64_fetch_add
-#define atomic64_fetch_add_acquire atomic64_fetch_add
-#define atomic64_fetch_add_release atomic64_fetch_add
-
-#else /* atomic64_fetch_add_relaxed */
-
-#ifndef atomic64_fetch_add_acquire
-#define atomic64_fetch_add_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
-#endif
+# define atomic64_fetch_add_relaxed atomic64_fetch_add
+# define atomic64_fetch_add_acquire atomic64_fetch_add
+# define atomic64_fetch_add_release atomic64_fetch_add
+#else
+# ifndef atomic64_fetch_add_acquire
+#  define atomic64_fetch_add_acquire(...) __atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_add_release
+#  define atomic64_fetch_add_release(...) __atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_add
+#  define atomic64_fetch_add(...) __atomic_op_fence(atomic64_fetch_add, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_inc_relaxed() et al: */

-#ifndef atomic64_fetch_add_release
-#define atomic64_fetch_add_release(...) \
-	__atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_add
-#define atomic64_fetch_add(...) \
-	__atomic_op_fence(atomic64_fetch_add, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_add_relaxed */
-
-/* atomic64_fetch_inc_relaxed */
 #ifndef atomic64_fetch_inc_relaxed
+# ifndef atomic64_fetch_inc
+#  define atomic64_fetch_inc(v) atomic64_fetch_add(1, (v))
+#  define atomic64_fetch_inc_relaxed(v) atomic64_fetch_add_relaxed(1, (v))
+#  define atomic64_fetch_inc_acquire(v) atomic64_fetch_add_acquire(1, (v))
+#  define atomic64_fetch_inc_release(v) atomic64_fetch_add_release(1, (v))
+# else
+#  define atomic64_fetch_inc_relaxed atomic64_fetch_inc
+#  define atomic64_fetch_inc_acquire atomic64_fetch_inc
+#  define atomic64_fetch_inc_release atomic64_fetch_inc
+# endif
+#else
+# ifndef atomic64_fetch_inc_acquire
+#  define atomic64_fetch_inc_acquire(...) __atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_inc_release
+#  define atomic64_fetch_inc_release(...) __atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_inc
+#  define atomic64_fetch_inc(...) __atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_sub_relaxed() et al: */

-#ifndef atomic64_fetch_inc
-#define atomic64_fetch_inc(v) atomic64_fetch_add(1, (v))
-#define atomic64_fetch_inc_relaxed(v) atomic64_fetch_add_relaxed(1, (v))
-#define atomic64_fetch_inc_acquire(v) atomic64_fetch_add_acquire(1, (v))
-#define atomic64_fetch_inc_release(v) atomic64_fetch_add_release(1, (v))
-#else /* atomic64_fetch_inc */
-#define atomic64_fetch_inc_relaxed atomic64_fetch_inc
-#define atomic64_fetch_inc_acquire atomic64_fetch_inc
-#define atomic64_fetch_inc_release atomic64_fetch_inc
-#endif /* atomic64_fetch_inc */
-
-#else /* atomic64_fetch_inc_relaxed */
-
-#ifndef atomic64_fetch_inc_acquire
-#define atomic64_fetch_inc_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_inc_release
-#define atomic64_fetch_inc_release(...) \
-	__atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_inc
-#define atomic64_fetch_inc(...) \
-	__atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_inc_relaxed */
-
-/* atomic64_fetch_sub_relaxed */
 #ifndef atomic64_fetch_sub_relaxed
-#define atomic64_fetch_sub_relaxed atomic64_fetch_sub
-#define atomic64_fetch_sub_acquire atomic64_fetch_sub
-#define atomic64_fetch_sub_release atomic64_fetch_sub
-
-#else /* atomic64_fetch_sub_relaxed */
-
-#ifndef atomic64_fetch_sub_acquire
-#define atomic64_fetch_sub_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_sub_release
-#define atomic64_fetch_sub_release(...) \
-	__atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_sub
-#define atomic64_fetch_sub(...) \
-	__atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_sub_relaxed */
+# define atomic64_fetch_sub_relaxed atomic64_fetch_sub
+# define atomic64_fetch_sub_acquire atomic64_fetch_sub
+# define atomic64_fetch_sub_release atomic64_fetch_sub
+#else
+# ifndef atomic64_fetch_sub_acquire
+#  define atomic64_fetch_sub_acquire(...) __atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_sub_release
+#  define atomic64_fetch_sub_release(...) __atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_sub
+#  define atomic64_fetch_sub(...) __atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_dec_relaxed() et al: */

-/* atomic64_fetch_dec_relaxed */
 #ifndef atomic64_fetch_dec_relaxed
+# ifndef atomic64_fetch_dec
+#  define atomic64_fetch_dec(v) atomic64_fetch_sub(1, (v))
+#  define atomic64_fetch_dec_relaxed(v) atomic64_fetch_sub_relaxed(1, (v))
+#  define atomic64_fetch_dec_acquire(v) atomic64_fetch_sub_acquire(1, (v))
+#  define atomic64_fetch_dec_release(v) atomic64_fetch_sub_release(1, (v))
+# else
+#  define atomic64_fetch_dec_relaxed atomic64_fetch_dec
+#  define atomic64_fetch_dec_acquire atomic64_fetch_dec
+#  define atomic64_fetch_dec_release atomic64_fetch_dec
+# endif
+#else
+# ifndef atomic64_fetch_dec_acquire
+#  define atomic64_fetch_dec_acquire(...) __atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_dec_release
+#  define atomic64_fetch_dec_release(...) __atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_dec
+#  define atomic64_fetch_dec(...) __atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_or_relaxed() et al: */

-#ifndef atomic64_fetch_dec
-#define atomic64_fetch_dec(v) atomic64_fetch_sub(1, (v))
-#define atomic64_fetch_dec_relaxed(v) atomic64_fetch_sub_relaxed(1, (v))
-#define atomic64_fetch_dec_acquire(v) atomic64_fetch_sub_acquire(1, (v))
-#define atomic64_fetch_dec_release(v) atomic64_fetch_sub_release(1, (v))
-#else /* atomic64_fetch_dec */
-#define atomic64_fetch_dec_relaxed atomic64_fetch_dec
-#define atomic64_fetch_dec_acquire atomic64_fetch_dec
-#define atomic64_fetch_dec_release atomic64_fetch_dec
-#endif /* atomic64_fetch_dec */
-
-#else /* atomic64_fetch_dec_relaxed */
-
-#ifndef atomic64_fetch_dec_acquire
-#define atomic64_fetch_dec_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_dec_release
-#define atomic64_fetch_dec_release(...) \
-	__atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_dec
-#define atomic64_fetch_dec(...) \
-	__atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_dec_relaxed */
-
-/* atomic64_fetch_or_relaxed */
 #ifndef atomic64_fetch_or_relaxed
-#define atomic64_fetch_or_relaxed atomic64_fetch_or
-#define atomic64_fetch_or_acquire atomic64_fetch_or
-#define atomic64_fetch_or_release atomic64_fetch_or
-
-#else /* atomic64_fetch_or_relaxed */
-
-#ifndef atomic64_fetch_or_acquire
-#define atomic64_fetch_or_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
+# define atomic64_fetch_or_relaxed atomic64_fetch_or
+# define atomic64_fetch_or_acquire atomic64_fetch_or
+# define atomic64_fetch_or_release atomic64_fetch_or
+#else
+# ifndef atomic64_fetch_or_acquire
+#  define atomic64_fetch_or_acquire(...) __atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_or_release
+#  define atomic64_fetch_or_release(...) __atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_or
+#  define atomic64_fetch_or(...) __atomic_op_fence(atomic64_fetch_or, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic64_fetch_or_release
-#define atomic64_fetch_or_release(...) \
-	__atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
-#endif

-#ifndef atomic64_fetch_or
-#define atomic64_fetch_or(...) \
-	__atomic_op_fence(atomic64_fetch_or, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_or_relaxed */
+/* atomic64_fetch_and_relaxed() et al: */

-/* atomic64_fetch_and_relaxed */
 #ifndef atomic64_fetch_and_relaxed
-#define atomic64_fetch_and_relaxed atomic64_fetch_and
-#define atomic64_fetch_and_acquire atomic64_fetch_and
-#define atomic64_fetch_and_release atomic64_fetch_and
-
-#else /* atomic64_fetch_and_relaxed */
-
-#ifndef atomic64_fetch_and_acquire
-#define atomic64_fetch_and_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
+# define atomic64_fetch_and_relaxed atomic64_fetch_and
+# define atomic64_fetch_and_acquire atomic64_fetch_and
+# define atomic64_fetch_and_release atomic64_fetch_and
+#else
+# ifndef atomic64_fetch_and_acquire
+#  define atomic64_fetch_and_acquire(...) __atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_and_release
+#  define atomic64_fetch_and_release(...) __atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_and
+#  define atomic64_fetch_and(...) __atomic_op_fence(atomic64_fetch_and, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic64_fetch_and_release
-#define atomic64_fetch_and_release(...) \
-	__atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_and
-#define atomic64_fetch_and(...) \
-	__atomic_op_fence(atomic64_fetch_and, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_and_relaxed */
-
 #ifdef atomic64_andnot
-/* atomic64_fetch_andnot_relaxed */
-#ifndef atomic64_fetch_andnot_relaxed
-#define atomic64_fetch_andnot_relaxed atomic64_fetch_andnot
-#define atomic64_fetch_andnot_acquire atomic64_fetch_andnot
-#define atomic64_fetch_andnot_release atomic64_fetch_andnot
-
-#else /* atomic64_fetch_andnot_relaxed */
-#ifndef atomic64_fetch_andnot_acquire
-#define atomic64_fetch_andnot_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
-#endif
+/* atomic64_fetch_andnot_relaxed() et al: */

-#ifndef atomic64_fetch_andnot_release
-#define atomic64_fetch_andnot_release(...) \
-	__atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
+#ifndef atomic64_fetch_andnot_relaxed
+# define atomic64_fetch_andnot_relaxed atomic64_fetch_andnot
+# define atomic64_fetch_andnot_acquire atomic64_fetch_andnot
+# define atomic64_fetch_andnot_release atomic64_fetch_andnot
+#else
+# ifndef atomic64_fetch_andnot_acquire
+#  define atomic64_fetch_andnot_acquire(...) __atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_andnot_release
+#  define atomic64_fetch_andnot_release(...) __atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_andnot
+#  define atomic64_fetch_andnot(...) __atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic64_fetch_andnot
-#define atomic64_fetch_andnot(...) \
-	__atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_andnot_relaxed */
 #endif /* atomic64_andnot */

-/* atomic64_fetch_xor_relaxed */
-#ifndef atomic64_fetch_xor_relaxed
-#define atomic64_fetch_xor_relaxed atomic64_fetch_xor
-#define atomic64_fetch_xor_acquire atomic64_fetch_xor
-#define atomic64_fetch_xor_release atomic64_fetch_xor
-
-#else /* atomic64_fetch_xor_relaxed */
+/* atomic64_fetch_xor_relaxed() et al: */

-#ifndef atomic64_fetch_xor_acquire
-#define atomic64_fetch_xor_acquire(...) \
-	__atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_xor_release
-#define atomic64_fetch_xor_release(...) \
-	__atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
+#ifndef atomic64_fetch_xor_relaxed
+# define atomic64_fetch_xor_relaxed atomic64_fetch_xor
+# define atomic64_fetch_xor_acquire atomic64_fetch_xor
+# define atomic64_fetch_xor_release atomic64_fetch_xor
+#else
+# ifndef atomic64_fetch_xor_acquire
+#  define atomic64_fetch_xor_acquire(...) __atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_xor_release
+#  define atomic64_fetch_xor_release(...) __atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_xor
+#  define atomic64_fetch_xor(...) __atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__)
+# endif
 #endif
-
-#ifndef atomic64_fetch_xor
-#define atomic64_fetch_xor(...) \
-	__atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__)
 #endif
-#endif /* atomic64_fetch_xor_relaxed */
+/* atomic64_xchg_relaxed() et al: */

-/* atomic64_xchg_relaxed */
 #ifndef atomic64_xchg_relaxed
-#define atomic64_xchg_relaxed atomic64_xchg
-#define atomic64_xchg_acquire atomic64_xchg
-#define atomic64_xchg_release atomic64_xchg
-
-#else /* atomic64_xchg_relaxed */
-
-#ifndef atomic64_xchg_acquire
-#define atomic64_xchg_acquire(...) \
-	__atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
-#endif
+# define atomic64_xchg_relaxed atomic64_xchg
+# define atomic64_xchg_acquire atomic64_xchg
+# define atomic64_xchg_release atomic64_xchg
+#else
+# ifndef atomic64_xchg_acquire
+#  define atomic64_xchg_acquire(...) __atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_xchg_release
+#  define atomic64_xchg_release(...) __atomic_op_release(atomic64_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_xchg
+#  define atomic64_xchg(...) __atomic_op_fence(atomic64_xchg, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_cmpxchg_relaxed() et al: */

-#ifndef atomic64_xchg_release
-#define atomic64_xchg_release(...) \
-	__atomic_op_release(atomic64_xchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_xchg
-#define atomic64_xchg(...) \
-	__atomic_op_fence(atomic64_xchg, __VA_ARGS__)
-#endif
-#endif /* atomic64_xchg_relaxed */
-
-/* atomic64_cmpxchg_relaxed */
 #ifndef atomic64_cmpxchg_relaxed
-#define atomic64_cmpxchg_relaxed atomic64_cmpxchg
-#define atomic64_cmpxchg_acquire atomic64_cmpxchg
-#define atomic64_cmpxchg_release atomic64_cmpxchg
-
-#else /* atomic64_cmpxchg_relaxed */
-
-#ifndef atomic64_cmpxchg_acquire
-#define atomic64_cmpxchg_acquire(...) \
-	__atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_cmpxchg_release
-#define atomic64_cmpxchg_release(...) \
-	__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_cmpxchg
-#define atomic64_cmpxchg(...) \
-	__atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__)
+# define atomic64_cmpxchg_relaxed atomic64_cmpxchg
+# define atomic64_cmpxchg_acquire atomic64_cmpxchg
+# define atomic64_cmpxchg_release atomic64_cmpxchg
+#else
+# ifndef atomic64_cmpxchg_acquire
+#  define atomic64_cmpxchg_acquire(...) __atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_cmpxchg_release
+#  define atomic64_cmpxchg_release(...) __atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_cmpxchg
+#  define atomic64_cmpxchg(...) __atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__)
+# endif
 #endif
-#endif /* atomic64_cmpxchg_relaxed */

 #ifndef atomic64_try_cmpxchg
-
-#define __atomic64_try_cmpxchg(type, _p, _po, _n) \
-({ \
+# define __atomic64_try_cmpxchg(type, _p, _po, _n) \
+ ({ \
	typeof(_po) __po = (_po); \
	typeof(*(_po)) __r, __o = *__po; \
	__r = atomic64_cmpxchg##type((_p), __o, (_n)); \
	if (unlikely(__r != __o)) \
		*__po = __r; \
	likely(__r == __o); \
-})
-
-#define atomic64_try_cmpxchg(_p, _po, _n) __atomic64_try_cmpxchg(, _p, _po, _n)
-#define atomic64_try_cmpxchg_relaxed(_p, _po, _n) __atomic64_try_cmpxchg(_relaxed, _p, _po, _n)
-#define atomic64_try_cmpxchg_acquire(_p, _po, _n) __atomic64_try_cmpxchg(_acquire, _p, _po, _n)
-#define atomic64_try_cmpxchg_release(_p, _po, _n) __atomic64_try_cmpxchg(_release, _p, _po, _n)
-
-#else /* atomic64_try_cmpxchg */
-#define atomic64_try_cmpxchg_relaxed atomic64_try_cmpxchg
-#define atomic64_try_cmpxchg_acquire atomic64_try_cmpxchg
-#define atomic64_try_cmpxchg_release atomic64_try_cmpxchg
-#endif /* atomic64_try_cmpxchg */
+ })
+# define atomic64_try_cmpxchg(_p, _po, _n) __atomic64_try_cmpxchg(, _p, _po, _n)
+# define atomic64_try_cmpxchg_relaxed(_p, _po, _n) __atomic64_try_cmpxchg(_relaxed, _p, _po, _n)
+# define atomic64_try_cmpxchg_acquire(_p, _po, _n) __atomic64_try_cmpxchg(_acquire, _p, _po, _n)
+# define atomic64_try_cmpxchg_release(_p, _po, _n) __atomic64_try_cmpxchg(_release, _p, _po, _n)
+#else
+# define atomic64_try_cmpxchg_relaxed atomic64_try_cmpxchg
+# define atomic64_try_cmpxchg_acquire atomic64_try_cmpxchg
+# define atomic64_try_cmpxchg_release atomic64_try_cmpxchg
+#endif

 #ifndef atomic64_andnot
 static inline void atomic64_andnot(long long i, atomic64_t *v)