Date: Sat, 5 May 2018 10:36:35 +0200
From: Ingo Molnar
To: Mark Rutland
Cc: Peter Zijlstra, catalin.marinas@arm.com, boqun.feng@gmail.com,
	will.deacon@arm.com, linux-kernel@vger.kernel.org,
	"Paul E. McKenney", dvyukov@google.com, aryabinin@virtuozzo.com,
	Andrew Morton, Linus Torvalds, Thomas Gleixner,
	linux-arm-kernel@lists.infradead.org
Subject: [PATCH] locking/atomics: Simplify the op definitions in atomic.h some more
Message-ID: <20180505083635.622xmcvb42dw5xxh@gmail.com>
References: <20180504173937.25300-1-mark.rutland@arm.com>
	<20180504173937.25300-2-mark.rutland@arm.com>
	<20180504180105.GS12217@hirez.programming.kicks-ass.net>
	<20180504180909.dnhfflibjwywnm4l@lakrids.cambridge.arm.com>
	<20180505081100.nsyrqrpzq2vd27bk@gmail.com>
In-Reply-To: <20180505081100.nsyrqrpzq2vd27bk@gmail.com>

* Ingo Molnar wrote:

> Before:
>
> #ifndef atomic_fetch_dec_relaxed
>
> #ifndef atomic_fetch_dec
> #define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
> #define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
> #define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
> #define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
> #else /* atomic_fetch_dec */
> #define atomic_fetch_dec_relaxed	atomic_fetch_dec
> #define atomic_fetch_dec_acquire	atomic_fetch_dec
> #define atomic_fetch_dec_release	atomic_fetch_dec
> #endif /* atomic_fetch_dec */
>
> #else /* atomic_fetch_dec_relaxed */
>
> #ifndef atomic_fetch_dec_acquire
> #define atomic_fetch_dec_acquire(...)				\
> 	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> #endif
>
> #ifndef atomic_fetch_dec_release
> #define atomic_fetch_dec_release(...)				\
> 	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> #endif
>
> #ifndef atomic_fetch_dec
> #define atomic_fetch_dec(...)					\
> 	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> #endif
>
> #endif /* atomic_fetch_dec_relaxed */
>
> After:
>
> #ifndef atomic_fetch_dec_relaxed
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
> #  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
> #  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
> #  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
> # else
> #  define atomic_fetch_dec_relaxed		atomic_fetch_dec
> #  define atomic_fetch_dec_acquire		atomic_fetch_dec
> #  define atomic_fetch_dec_release		atomic_fetch_dec
> # endif
> #else
> # ifndef atomic_fetch_dec_acquire
> #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> # endif
> # ifndef atomic_fetch_dec_release
> #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> # endif
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> # endif
> #endif
>
> The new variant is readable at a glance, and the hierarchy of defines is very
> obvious as well.
>
> And I think we could do even better - there's absolutely no reason why _every_
> operation has to be made conditional on a fine-grained level - they are
> overridden in API groups. In fact allowing individual override is arguably a
> fragility.
>
> So we could do the following simplification on top of that:
>
> #ifndef atomic_fetch_dec_relaxed
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
> #  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
> #  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
> #  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
> # else
> #  define atomic_fetch_dec_relaxed		atomic_fetch_dec
> #  define atomic_fetch_dec_acquire		atomic_fetch_dec
> #  define atomic_fetch_dec_release		atomic_fetch_dec
> # endif
> #else
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> # endif
> #endif

The attached patch implements this, which gives us another healthy simplification:

 include/linux/atomic.h | 312 ++++++++++---------------------------------------
 1 file changed, 62 insertions(+), 250 deletions(-)

Note that the simplest definition block is now:

#ifndef atomic_cmpxchg_relaxed
# define atomic_cmpxchg_relaxed		atomic_cmpxchg
# define atomic_cmpxchg_acquire		atomic_cmpxchg
# define atomic_cmpxchg_release		atomic_cmpxchg
#else
# ifndef atomic_cmpxchg
#  define atomic_cmpxchg(...)		__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
# endif
#endif

... which is very readable!

The total line-count reduction of the two patches is pretty significant as well:

 include/linux/atomic.h | 1063 ++++++++++++++++--------------------------------
 1 file changed, 343 insertions(+), 720 deletions(-)

Note that I kept the second patch separate, because technically it changes
the way we use the defines - it should not break anything, unless I missed
some detail.

Please keep this kind of clarity and simplicity in new instrumentation patches!
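( Side note: for anyone who wants to play with this grouping pattern outside
  the kernel, below is a minimal stand-alone sketch of it. Everything in it
  is made-up demo code - the my_op*() names and the printf() based "barriers"
  are not kernel APIs - and it relies on GCC statement expressions, just like
  the kernel's __atomic_op_*() wrappers do. The point is only to show how an
  "architecture" that implements just the _relaxed variant gets the
  acquire/release/fence variants generated for it as a group: )

#include <stdio.h>

/* Stand-ins for smp_mb__{before,after}_atomic() - demo only: */
#define barrier_before()	printf("  barrier before\n")
#define barrier_after()		printf("  barrier after\n")

/* "Architecture" code: only the relaxed variant is implemented: */
static int my_op_relaxed(int *v)
{
	return (*v)++;
}
#define my_op_relaxed my_op_relaxed

/* Generic wrappers, in the style of __atomic_op_{acquire,release,fence}(): */
#define __my_op_acquire(op, ...)			\
({							\
	int __ret = op##_relaxed(__VA_ARGS__);		\
	barrier_after();				\
	__ret;						\
})
#define __my_op_release(op, ...)			\
({							\
	barrier_before();				\
	op##_relaxed(__VA_ARGS__);			\
})
#define __my_op_fence(op, ...)				\
({							\
	int __ret;					\
	barrier_before();				\
	__ret = op##_relaxed(__VA_ARGS__);		\
	barrier_after();				\
	__ret;						\
})

/* The grouped-override block, in the simplified style proposed above: */
#ifndef my_op_relaxed
# define my_op_relaxed			my_op
# define my_op_acquire			my_op
# define my_op_release			my_op
#else
# ifndef my_op
#  define my_op(...)			__my_op_fence(my_op, __VA_ARGS__)
#  define my_op_acquire(...)		__my_op_acquire(my_op, __VA_ARGS__)
#  define my_op_release(...)		__my_op_release(my_op, __VA_ARGS__)
# endif
#endif

int main(void)
{
	int v = 0, ret;

	printf("my_op():\n");
	ret = my_op(&v);
	printf("  ret=%d v=%d\n", ret, v);

	printf("my_op_acquire():\n");
	ret = my_op_acquire(&v);
	printf("  ret=%d v=%d\n", ret, v);

	printf("my_op_release():\n");
	ret = my_op_release(&v);
	printf("  ret=%d v=%d\n", ret, v);

	printf("my_op_relaxed():\n");
	ret = my_op_relaxed(&v);
	printf("  ret=%d v=%d\n", ret, v);

	return 0;
}

Built with 'gcc demo.c', the fence variant prints both barriers, acquire only
the trailing one, release only the leading one, and relaxed neither - which is
exactly the grouping that the #ifndef blocks encode.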
Thanks,

	Ingo

==================>
From 5affbf7e91901143f84f1b2ca64f4afe70e210fd Mon Sep 17 00:00:00 2001
From: Ingo Molnar
Date: Sat, 5 May 2018 10:23:23 +0200
Subject: [PATCH] locking/atomics: Simplify the op definitions in atomic.h some more

Before:

#ifndef atomic_fetch_dec_relaxed
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
# else
#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
#  define atomic_fetch_dec_acquire		atomic_fetch_dec
#  define atomic_fetch_dec_release		atomic_fetch_dec
# endif
#else
# ifndef atomic_fetch_dec_acquire
#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec_release
#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
# endif
#endif

After:

#ifndef atomic_fetch_dec_relaxed
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
# else
#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
#  define atomic_fetch_dec_acquire		atomic_fetch_dec
#  define atomic_fetch_dec_release		atomic_fetch_dec
# endif
#else
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
# endif
#endif

The idea is that because we already group these APIs by certain defines
such as atomic_fetch_dec_relaxed and atomic_fetch_dec in the primary
branches - we can do the same in the secondary branch as well.

( Also remove some unnecessarily duplicated comments, as the API
  group defines are now pretty much self-documenting. )

No change in functionality.

Cc: Peter Zijlstra
Cc: Linus Torvalds
Cc: Andrew Morton
Cc: Thomas Gleixner
Cc: Paul E. McKenney
Cc: Will Deacon
Cc: linux-kernel@vger.kernel.org
Signed-off-by: Ingo Molnar
---
 include/linux/atomic.h | 312 ++++++++++---------------------------------------
 1 file changed, 62 insertions(+), 250 deletions(-)

diff --git a/include/linux/atomic.h b/include/linux/atomic.h
index 67aaafba256b..352ecc72d7f5 100644
--- a/include/linux/atomic.h
+++ b/include/linux/atomic.h
@@ -71,98 +71,66 @@
 })
 #endif
 
-/* atomic_add_return_relaxed() et al: */
-
 #ifndef atomic_add_return_relaxed
 # define atomic_add_return_relaxed	atomic_add_return
 # define atomic_add_return_acquire	atomic_add_return
 # define atomic_add_return_release	atomic_add_return
 #else
-# ifndef atomic_add_return_acquire
-#  define atomic_add_return_acquire(...)	__atomic_op_acquire(atomic_add_return, __VA_ARGS__)
-# endif
-# ifndef atomic_add_return_release
-#  define atomic_add_return_release(...)	__atomic_op_release(atomic_add_return, __VA_ARGS__)
-# endif
 # ifndef atomic_add_return
 #  define atomic_add_return(...)		__atomic_op_fence(atomic_add_return, __VA_ARGS__)
+#  define atomic_add_return_acquire(...)	__atomic_op_acquire(atomic_add_return, __VA_ARGS__)
+#  define atomic_add_return_release(...)	__atomic_op_release(atomic_add_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_inc_return_relaxed() et al: */
-
 #ifndef atomic_inc_return_relaxed
 # define atomic_inc_return_relaxed	atomic_inc_return
 # define atomic_inc_return_acquire	atomic_inc_return
 # define atomic_inc_return_release	atomic_inc_return
 #else
-# ifndef atomic_inc_return_acquire
-#  define atomic_inc_return_acquire(...)	__atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
-# endif
-# ifndef atomic_inc_return_release
-#  define atomic_inc_return_release(...)	__atomic_op_release(atomic_inc_return, __VA_ARGS__)
-# endif
 # ifndef atomic_inc_return
 #  define atomic_inc_return(...)		__atomic_op_fence(atomic_inc_return, __VA_ARGS__)
+#  define atomic_inc_return_acquire(...)	__atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
+#  define atomic_inc_return_release(...)	__atomic_op_release(atomic_inc_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_sub_return_relaxed() et al: */
-
 #ifndef atomic_sub_return_relaxed
 # define atomic_sub_return_relaxed	atomic_sub_return
 # define atomic_sub_return_acquire	atomic_sub_return
 # define atomic_sub_return_release	atomic_sub_return
 #else
-# ifndef atomic_sub_return_acquire
-#  define atomic_sub_return_acquire(...)	__atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
-# endif
-# ifndef atomic_sub_return_release
-#  define atomic_sub_return_release(...)	__atomic_op_release(atomic_sub_return, __VA_ARGS__)
-# endif
 # ifndef atomic_sub_return
 #  define atomic_sub_return(...)		__atomic_op_fence(atomic_sub_return, __VA_ARGS__)
+#  define atomic_sub_return_acquire(...)	__atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
+#  define atomic_sub_return_release(...)	__atomic_op_release(atomic_sub_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_dec_return_relaxed() et al: */
-
 #ifndef atomic_dec_return_relaxed
 # define atomic_dec_return_relaxed	atomic_dec_return
 # define atomic_dec_return_acquire	atomic_dec_return
 # define atomic_dec_return_release	atomic_dec_return
 #else
-# ifndef atomic_dec_return_acquire
-#  define atomic_dec_return_acquire(...)	__atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
-# endif
-# ifndef atomic_dec_return_release
-#  define atomic_dec_return_release(...)	__atomic_op_release(atomic_dec_return, __VA_ARGS__)
-# endif
 # ifndef atomic_dec_return
 #  define atomic_dec_return(...)		__atomic_op_fence(atomic_dec_return, __VA_ARGS__)
+#  define atomic_dec_return_acquire(...)	__atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
+#  define atomic_dec_return_release(...)	__atomic_op_release(atomic_dec_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_add_relaxed() et al: */
-
 #ifndef atomic_fetch_add_relaxed
 # define atomic_fetch_add_relaxed	atomic_fetch_add
 # define atomic_fetch_add_acquire	atomic_fetch_add
 # define atomic_fetch_add_release	atomic_fetch_add
 #else
-# ifndef atomic_fetch_add_acquire
-#  define atomic_fetch_add_acquire(...)	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_add_release
-#  define atomic_fetch_add_release(...)	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_add
 #  define atomic_fetch_add(...)		__atomic_op_fence(atomic_fetch_add, __VA_ARGS__)
+#  define atomic_fetch_add_acquire(...)	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
+#  define atomic_fetch_add_release(...)	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_inc_relaxed() et al: */
-
 #ifndef atomic_fetch_inc_relaxed
 # ifndef atomic_fetch_inc
 #  define atomic_fetch_inc(v)	atomic_fetch_add(1, (v))
@@ -175,37 +143,25 @@
 #  define atomic_fetch_inc_release	atomic_fetch_inc
 # endif
 #else
-# ifndef atomic_fetch_inc_acquire
-#  define atomic_fetch_inc_acquire(...)	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_inc_release
-#  define atomic_fetch_inc_release(...)	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_inc
 #  define atomic_fetch_inc(...)		__atomic_op_fence(atomic_fetch_inc, __VA_ARGS__)
+#  define atomic_fetch_inc_acquire(...)	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
+#  define atomic_fetch_inc_release(...)	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_sub_relaxed() et al: */
-
 #ifndef atomic_fetch_sub_relaxed
 # define atomic_fetch_sub_relaxed	atomic_fetch_sub
 # define atomic_fetch_sub_acquire	atomic_fetch_sub
 # define atomic_fetch_sub_release	atomic_fetch_sub
 #else
-# ifndef atomic_fetch_sub_acquire
-#  define atomic_fetch_sub_acquire(...)	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_sub_release
-#  define atomic_fetch_sub_release(...)	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_sub
 #  define atomic_fetch_sub(...)		__atomic_op_fence(atomic_fetch_sub, __VA_ARGS__)
+#  define atomic_fetch_sub_acquire(...)	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
+#  define atomic_fetch_sub_release(...)	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_dec_relaxed() et al: */
-
 #ifndef atomic_fetch_dec_relaxed
 # ifndef atomic_fetch_dec
 #  define atomic_fetch_dec(v)	atomic_fetch_sub(1, (v))
@@ -218,127 +174,86 @@
 #  define atomic_fetch_dec_release	atomic_fetch_dec
 # endif
 #else
-# ifndef atomic_fetch_dec_acquire
-#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_dec_release
-#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_dec
 #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
+#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
+#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_or_relaxed() et al: */
-
 #ifndef atomic_fetch_or_relaxed
 # define atomic_fetch_or_relaxed	atomic_fetch_or
 # define atomic_fetch_or_acquire	atomic_fetch_or
 # define atomic_fetch_or_release	atomic_fetch_or
 #else
-# ifndef atomic_fetch_or_acquire
-#  define atomic_fetch_or_acquire(...)	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_or_release
-#  define atomic_fetch_or_release(...)	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_or
 #  define atomic_fetch_or(...)		__atomic_op_fence(atomic_fetch_or, __VA_ARGS__)
+#  define atomic_fetch_or_acquire(...)	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
+#  define atomic_fetch_or_release(...)	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_and_relaxed() et al: */
-
 #ifndef atomic_fetch_and_relaxed
 # define atomic_fetch_and_relaxed	atomic_fetch_and
 # define atomic_fetch_and_acquire	atomic_fetch_and
 # define atomic_fetch_and_release	atomic_fetch_and
 #else
-# ifndef atomic_fetch_and_acquire
-#  define atomic_fetch_and_acquire(...)	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_and_release
-#  define atomic_fetch_and_release(...)	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_and
 #  define atomic_fetch_and(...)		__atomic_op_fence(atomic_fetch_and, __VA_ARGS__)
+#  define atomic_fetch_and_acquire(...)	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
+#  define atomic_fetch_and_release(...)	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
 # endif
 #endif
 
 #ifdef atomic_andnot
 
-/* atomic_fetch_andnot_relaxed() et al: */
-
 #ifndef atomic_fetch_andnot_relaxed
 # define atomic_fetch_andnot_relaxed	atomic_fetch_andnot
 # define atomic_fetch_andnot_acquire	atomic_fetch_andnot
 # define atomic_fetch_andnot_release	atomic_fetch_andnot
 #else
-# ifndef atomic_fetch_andnot_acquire
-#  define atomic_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_andnot_release
-#  define atomic_fetch_andnot_release(...)	__atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_andnot
 #  define atomic_fetch_andnot(...)		__atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__)
+#  define atomic_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
+#  define atomic_fetch_andnot_release(...)	__atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
 # endif
 #endif
 
 #endif /* atomic_andnot */
 
-/* atomic_fetch_xor_relaxed() et al: */
-
 #ifndef atomic_fetch_xor_relaxed
 # define atomic_fetch_xor_relaxed	atomic_fetch_xor
 # define atomic_fetch_xor_acquire	atomic_fetch_xor
 # define atomic_fetch_xor_release	atomic_fetch_xor
 #else
-# ifndef atomic_fetch_xor_acquire
-#  define atomic_fetch_xor_acquire(...)	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_xor_release
-#  define atomic_fetch_xor_release(...)	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_xor
 #  define atomic_fetch_xor(...)		__atomic_op_fence(atomic_fetch_xor, __VA_ARGS__)
+#  define atomic_fetch_xor_acquire(...)	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
+#  define atomic_fetch_xor_release(...)	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
 # endif
 #endif
 
-
-/* atomic_xchg_relaxed() et al: */
-
 #ifndef atomic_xchg_relaxed
 #define atomic_xchg_relaxed		atomic_xchg
 #define atomic_xchg_acquire		atomic_xchg
 #define atomic_xchg_release		atomic_xchg
 #else
-# ifndef atomic_xchg_acquire
-#  define atomic_xchg_acquire(...)	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
-# endif
-# ifndef atomic_xchg_release
-#  define atomic_xchg_release(...)	__atomic_op_release(atomic_xchg, __VA_ARGS__)
-# endif
 # ifndef atomic_xchg
 #  define atomic_xchg(...)		__atomic_op_fence(atomic_xchg, __VA_ARGS__)
+#  define atomic_xchg_acquire(...)	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
+#  define atomic_xchg_release(...)	__atomic_op_release(atomic_xchg, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_cmpxchg_relaxed() et al: */
-
 #ifndef atomic_cmpxchg_relaxed
 # define atomic_cmpxchg_relaxed		atomic_cmpxchg
 # define atomic_cmpxchg_acquire		atomic_cmpxchg
 # define atomic_cmpxchg_release		atomic_cmpxchg
 #else
-# ifndef atomic_cmpxchg_acquire
-#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
-# endif
-# ifndef atomic_cmpxchg_release
-#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
-# endif
 # ifndef atomic_cmpxchg
 #  define atomic_cmpxchg(...)		__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
+#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
++#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
 # endif
 #endif
 
@@ -362,57 +277,39 @@
 # define atomic_try_cmpxchg_release	atomic_try_cmpxchg
 #endif
 
-/* cmpxchg_relaxed() et al: */
-
 #ifndef cmpxchg_relaxed
 # define cmpxchg_relaxed		cmpxchg
 # define cmpxchg_acquire		cmpxchg
 # define cmpxchg_release		cmpxchg
 #else
-# ifndef cmpxchg_acquire
-#  define cmpxchg_acquire(...)		__atomic_op_acquire(cmpxchg, __VA_ARGS__)
-# endif
-# ifndef cmpxchg_release
-#  define cmpxchg_release(...)		__atomic_op_release(cmpxchg, __VA_ARGS__)
-# endif
 # ifndef cmpxchg
 #  define cmpxchg(...)			__atomic_op_fence(cmpxchg, __VA_ARGS__)
+#  define cmpxchg_acquire(...)		__atomic_op_acquire(cmpxchg, __VA_ARGS__)
+#  define cmpxchg_release(...)		__atomic_op_release(cmpxchg, __VA_ARGS__)
 # endif
 #endif
 
-/* cmpxchg64_relaxed() et al: */
-
 #ifndef cmpxchg64_relaxed
 # define cmpxchg64_relaxed		cmpxchg64
 # define cmpxchg64_acquire		cmpxchg64
 # define cmpxchg64_release		cmpxchg64
 #else
-# ifndef cmpxchg64_acquire
-#  define cmpxchg64_acquire(...)	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
-# endif
-# ifndef cmpxchg64_release
-#  define cmpxchg64_release(...)	__atomic_op_release(cmpxchg64, __VA_ARGS__)
-# endif
 # ifndef cmpxchg64
 #  define cmpxchg64(...)		__atomic_op_fence(cmpxchg64, __VA_ARGS__)
+#  define cmpxchg64_acquire(...)	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
+#  define cmpxchg64_release(...)	__atomic_op_release(cmpxchg64, __VA_ARGS__)
 # endif
 #endif
 
-/* xchg_relaxed() et al: */
-
 #ifndef xchg_relaxed
 # define xchg_relaxed			xchg
 # define xchg_acquire			xchg
 # define xchg_release			xchg
 #else
-# ifndef xchg_acquire
-#  define xchg_acquire(...)		__atomic_op_acquire(xchg, __VA_ARGS__)
-# endif
-# ifndef xchg_release
-#  define xchg_release(...)		__atomic_op_release(xchg, __VA_ARGS__)
-# endif
 # ifndef xchg
 #  define xchg(...)			__atomic_op_fence(xchg, __VA_ARGS__)
+#  define xchg_acquire(...)		__atomic_op_acquire(xchg, __VA_ARGS__)
+#  define xchg_release(...)		__atomic_op_release(xchg, __VA_ARGS__)
 # endif
 #endif
 
@@ -569,98 +466,66 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 # define atomic64_set_release(v, i)	smp_store_release(&(v)->counter, (i))
 #endif
 
-/* atomic64_add_return_relaxed() et al: */
-
 #ifndef atomic64_add_return_relaxed
 # define atomic64_add_return_relaxed	atomic64_add_return
 # define atomic64_add_return_acquire	atomic64_add_return
 # define atomic64_add_return_release	atomic64_add_return
 #else
-# ifndef atomic64_add_return_acquire
-#  define atomic64_add_return_acquire(...)	__atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_add_return_release
-#  define atomic64_add_return_release(...)	__atomic_op_release(atomic64_add_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_add_return
 #  define atomic64_add_return(...)		__atomic_op_fence(atomic64_add_return, __VA_ARGS__)
+#  define atomic64_add_return_acquire(...)	__atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
+#  define atomic64_add_return_release(...)	__atomic_op_release(atomic64_add_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_inc_return_relaxed() et al: */
-
 #ifndef atomic64_inc_return_relaxed
 # define atomic64_inc_return_relaxed	atomic64_inc_return
 # define atomic64_inc_return_acquire	atomic64_inc_return
 # define atomic64_inc_return_release	atomic64_inc_return
 #else
-# ifndef atomic64_inc_return_acquire
-#  define atomic64_inc_return_acquire(...)	__atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_inc_return_release
-#  define atomic64_inc_return_release(...)	__atomic_op_release(atomic64_inc_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_inc_return
 #  define atomic64_inc_return(...)		__atomic_op_fence(atomic64_inc_return, __VA_ARGS__)
+#  define atomic64_inc_return_acquire(...)	__atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
+#  define atomic64_inc_return_release(...)	__atomic_op_release(atomic64_inc_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_sub_return_relaxed() et al: */
-
 #ifndef atomic64_sub_return_relaxed
 # define atomic64_sub_return_relaxed	atomic64_sub_return
 # define atomic64_sub_return_acquire	atomic64_sub_return
 # define atomic64_sub_return_release	atomic64_sub_return
 #else
-# ifndef atomic64_sub_return_acquire
-#  define atomic64_sub_return_acquire(...)	__atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_sub_return_release
-#  define atomic64_sub_return_release(...)	__atomic_op_release(atomic64_sub_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_sub_return
 #  define atomic64_sub_return(...)		__atomic_op_fence(atomic64_sub_return, __VA_ARGS__)
+#  define atomic64_sub_return_acquire(...)	__atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
+#  define atomic64_sub_return_release(...)	__atomic_op_release(atomic64_sub_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_dec_return_relaxed() et al: */
-
 #ifndef atomic64_dec_return_relaxed
 # define atomic64_dec_return_relaxed	atomic64_dec_return
 # define atomic64_dec_return_acquire	atomic64_dec_return
 # define atomic64_dec_return_release	atomic64_dec_return
 #else
-# ifndef atomic64_dec_return_acquire
-#  define atomic64_dec_return_acquire(...)	__atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_dec_return_release
-#  define atomic64_dec_return_release(...)	__atomic_op_release(atomic64_dec_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_dec_return
 #  define atomic64_dec_return(...)		__atomic_op_fence(atomic64_dec_return, __VA_ARGS__)
+#  define atomic64_dec_return_acquire(...)	__atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
+#  define atomic64_dec_return_release(...)	__atomic_op_release(atomic64_dec_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_add_relaxed() et al: */
-
 #ifndef atomic64_fetch_add_relaxed
 # define atomic64_fetch_add_relaxed	atomic64_fetch_add
 # define atomic64_fetch_add_acquire	atomic64_fetch_add
 # define atomic64_fetch_add_release	atomic64_fetch_add
 #else
-# ifndef atomic64_fetch_add_acquire
-#  define atomic64_fetch_add_acquire(...)	__atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_add_release
-#  define atomic64_fetch_add_release(...)	__atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_add
 #  define atomic64_fetch_add(...)		__atomic_op_fence(atomic64_fetch_add, __VA_ARGS__)
+#  define atomic64_fetch_add_acquire(...)	__atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
+#  define atomic64_fetch_add_release(...)	__atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_inc_relaxed() et al: */
-
 #ifndef atomic64_fetch_inc_relaxed
 # ifndef atomic64_fetch_inc
 #  define atomic64_fetch_inc(v)	atomic64_fetch_add(1, (v))
@@ -673,37 +538,25 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 #  define atomic64_fetch_inc_release	atomic64_fetch_inc
 # endif
 #else
-# ifndef atomic64_fetch_inc_acquire
-#  define atomic64_fetch_inc_acquire(...)	__atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_inc_release
-#  define atomic64_fetch_inc_release(...)	__atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_inc
 #  define atomic64_fetch_inc(...)		__atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__)
+#  define atomic64_fetch_inc_acquire(...)	__atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
+#  define atomic64_fetch_inc_release(...)	__atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_sub_relaxed() et al: */
-
 #ifndef atomic64_fetch_sub_relaxed
 # define atomic64_fetch_sub_relaxed	atomic64_fetch_sub
 # define atomic64_fetch_sub_acquire	atomic64_fetch_sub
 # define atomic64_fetch_sub_release	atomic64_fetch_sub
 #else
-# ifndef atomic64_fetch_sub_acquire
-#  define atomic64_fetch_sub_acquire(...)	__atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_sub_release
-#  define atomic64_fetch_sub_release(...)	__atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_sub
 #  define atomic64_fetch_sub(...)		__atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__)
+#  define atomic64_fetch_sub_acquire(...)	__atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
+#  define atomic64_fetch_sub_release(...)	__atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_dec_relaxed() et al: */
-
 #ifndef atomic64_fetch_dec_relaxed
 # ifndef atomic64_fetch_dec
 #  define atomic64_fetch_dec(v)	atomic64_fetch_sub(1, (v))
@@ -716,127 +569,86 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 #  define atomic64_fetch_dec_release	atomic64_fetch_dec
 # endif
 #else
-# ifndef atomic64_fetch_dec_acquire
-#  define atomic64_fetch_dec_acquire(...)	__atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_dec_release
-#  define atomic64_fetch_dec_release(...)	__atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_dec
 #  define atomic64_fetch_dec(...)		__atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__)
+#  define atomic64_fetch_dec_acquire(...)	__atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
+#  define atomic64_fetch_dec_release(...)	__atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_or_relaxed() et al: */
-
 #ifndef atomic64_fetch_or_relaxed
 # define atomic64_fetch_or_relaxed	atomic64_fetch_or
 # define atomic64_fetch_or_acquire	atomic64_fetch_or
 # define atomic64_fetch_or_release	atomic64_fetch_or
 #else
-# ifndef atomic64_fetch_or_acquire
-#  define atomic64_fetch_or_acquire(...)	__atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_or_release
-#  define atomic64_fetch_or_release(...)	__atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_or
 #  define atomic64_fetch_or(...)		__atomic_op_fence(atomic64_fetch_or, __VA_ARGS__)
+#  define atomic64_fetch_or_acquire(...)	__atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
+#  define atomic64_fetch_or_release(...)	__atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
 # endif
 #endif
 
-
-/* atomic64_fetch_and_relaxed() et al: */
-
 #ifndef atomic64_fetch_and_relaxed
 # define atomic64_fetch_and_relaxed	atomic64_fetch_and
 # define atomic64_fetch_and_acquire	atomic64_fetch_and
 # define atomic64_fetch_and_release	atomic64_fetch_and
 #else
-# ifndef atomic64_fetch_and_acquire
-#  define atomic64_fetch_and_acquire(...)	__atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_and_release
-#  define atomic64_fetch_and_release(...)	__atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_and
 #  define atomic64_fetch_and(...)		__atomic_op_fence(atomic64_fetch_and, __VA_ARGS__)
+#  define atomic64_fetch_and_acquire(...)	__atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
+#  define atomic64_fetch_and_release(...)	__atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
 # endif
 #endif
 
 #ifdef atomic64_andnot
 
-/* atomic64_fetch_andnot_relaxed() et al: */
-
 #ifndef atomic64_fetch_andnot_relaxed
 # define atomic64_fetch_andnot_relaxed	atomic64_fetch_andnot
 # define atomic64_fetch_andnot_acquire	atomic64_fetch_andnot
 # define atomic64_fetch_andnot_release	atomic64_fetch_andnot
 #else
-# ifndef atomic64_fetch_andnot_acquire
-#  define atomic64_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_andnot_release
-#  define atomic64_fetch_andnot_release(...)	__atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_andnot
 #  define atomic64_fetch_andnot(...)		__atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__)
+#  define atomic64_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
+#  define atomic64_fetch_andnot_release(...)	__atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
 # endif
 #endif
 
 #endif /* atomic64_andnot */
 
-/* atomic64_fetch_xor_relaxed() et al: */
-
 #ifndef atomic64_fetch_xor_relaxed
 # define atomic64_fetch_xor_relaxed	atomic64_fetch_xor
 # define atomic64_fetch_xor_acquire	atomic64_fetch_xor
 # define atomic64_fetch_xor_release	atomic64_fetch_xor
 #else
-# ifndef atomic64_fetch_xor_acquire
-#  define atomic64_fetch_xor_acquire(...)	__atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_xor_release
-#  define atomic64_fetch_xor_release(...)	__atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_xor
 #  define atomic64_fetch_xor(...)		__atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__)
+#  define atomic64_fetch_xor_acquire(...)	__atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
+#  define atomic64_fetch_xor_release(...)	__atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_xchg_relaxed() et al: */
-
 #ifndef atomic64_xchg_relaxed
 # define atomic64_xchg_relaxed		atomic64_xchg
 # define atomic64_xchg_acquire		atomic64_xchg
 # define atomic64_xchg_release		atomic64_xchg
 #else
-# ifndef atomic64_xchg_acquire
-#  define atomic64_xchg_acquire(...)	__atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
-# endif
-# ifndef atomic64_xchg_release
-#  define atomic64_xchg_release(...)	__atomic_op_release(atomic64_xchg, __VA_ARGS__)
-# endif
 # ifndef atomic64_xchg
 #  define atomic64_xchg(...)		__atomic_op_fence(atomic64_xchg, __VA_ARGS__)
+#  define atomic64_xchg_acquire(...)	__atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
+#  define atomic64_xchg_release(...)	__atomic_op_release(atomic64_xchg, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_cmpxchg_relaxed() et al: */
-
 #ifndef atomic64_cmpxchg_relaxed
 # define atomic64_cmpxchg_relaxed	atomic64_cmpxchg
 # define atomic64_cmpxchg_acquire	atomic64_cmpxchg
 # define atomic64_cmpxchg_release	atomic64_cmpxchg
 #else
-# ifndef atomic64_cmpxchg_acquire
-#  define atomic64_cmpxchg_acquire(...)	__atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
-# endif
-# ifndef atomic64_cmpxchg_release
-#  define atomic64_cmpxchg_release(...)	__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
-# endif
 # ifndef atomic64_cmpxchg
 #  define atomic64_cmpxchg(...)		__atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__)
+#  define atomic64_cmpxchg_acquire(...)	__atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
+#  define atomic64_cmpxchg_release(...)	__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
 # endif
 #endif
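
( For completeness: the __atomic_op_*() wrappers that all of the above
  expands to are defined earlier in include/linux/atomic.h and are left
  untouched by this patch. They look roughly like this - quoted from
  memory, so the file itself is authoritative:

	#define __atomic_op_acquire(op, args...)			\
	({								\
		typeof(op##_relaxed(args)) __ret = op##_relaxed(args);	\
		smp_mb__after_atomic();					\
		__ret;							\
	})

	#define __atomic_op_release(op, args...)			\
	({								\
		smp_mb__before_atomic();				\
		op##_relaxed(args);					\
	})

	#define __atomic_op_fence(op, args...)				\
	({								\
		typeof(op##_relaxed(args)) __ret;			\
		smp_mb__before_atomic();				\
		__ret = op##_relaxed(args);				\
		smp_mb__after_atomic();					\
		__ret;							\
	})

  I.e. every derived variant is built from the _relaxed op plus the
  appropriate full barriers, which is what makes the grouped fallback
  definitions above safe. )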