From patchwork Fri Apr 8 05:08:23 2011
X-Patchwork-Submitter: Lucas Meneghel Rodrigues
X-Patchwork-Id: 693991
From: Lucas Meneghel Rodrigues
To: autotest@test.kernel.org
Cc: kvm@vger.kernel.org, crosa@redhat.com, ehabkost@redhat.com,
    mgoldish@redhat.com, jadmanski@google.com,
    Lucas Meneghel Rodrigues
Subject: [PATCH 2/4] KVM test: Fail a test right away if 'dependency_fail = yes' is on params
Date: Fri, 8 Apr 2011 02:08:23 -0300
Message-Id: <1302239305-15786-3-git-send-email-lmr@redhat.com>
In-Reply-To: <1302239305-15786-1-git-send-email-lmr@redhat.com>
References: <1302239305-15786-1-git-send-email-lmr@redhat.com>

When the
KVM config file parser generates the list of tests, it generates a full list
of dicts, each of which maps to a test to be executed. However, due to the
design of our dependency system, we skip running tests that had a dependency
failure. While fair, this also masks the fact that the tests that were not
executed are in fact failures (the test couldn't run because a dependency
failed). So test jobs that had a very serious problem (say, the KVM build
failed, so every other test failed in sequence) will yield fairly reasonable
PASS rates that can fool developers.

So, here's what we are going to do to solve this:

* When it comes time to execute a test whose dependency has failed, don't
  just skip it. Execute it in a way that it will always raise a TestNA
  exception. In order to do that:
* Introduce an extra parameter 'dependency_failed = yes' on the dependent
  test's 'params' dict.
* Make the test preprocessing code fail the test right away with TestNA
  whenever params['dependency_failed'] is 'yes'.

Signed-off-by: Lucas Meneghel Rodrigues
---
 client/tests/kvm/kvm_preprocessing.py |    6 +++++-
 client/tests/kvm/kvm_utils.py         |    6 +++++-
 2 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/client/tests/kvm/kvm_preprocessing.py b/client/tests/kvm/kvm_preprocessing.py
index 515e3a5..47c29d4 100644
--- a/client/tests/kvm/kvm_preprocessing.py
+++ b/client/tests/kvm/kvm_preprocessing.py
@@ -193,8 +193,12 @@ def preprocess(test, params, env):
     @param params: A dict containing all VM and image parameters.
     @param env: The environment (a dict-like object).
     """
-    error.context("preprocessing")
+    # If a dependency test prior to this test has failed, then let's 'run' this
+    # test, but fail it right away as TestNA.
+    if params.get("dependency_failed") == 'yes':
+        raise error.TestNA("Test dependency failed")
 
+    error.context("preprocessing")
     # Start tcpdump if it isn't already running
     if "address_cache" not in env:
         env["address_cache"] = {}

diff --git a/client/tests/kvm/kvm_utils.py b/client/tests/kvm/kvm_utils.py
index 5ecbd4a..ff9ee17 100644
--- a/client/tests/kvm/kvm_utils.py
+++ b/client/tests/kvm/kvm_utils.py
@@ -1173,7 +1173,11 @@ def run_tests(parser, job):
                 if not current_status:
                     failed = True
             else:
-                current_status = False
+                # We will force the test to fail as TestNA during preprocessing
+                dict['dependency_failed'] = 'yes'
+                current_status = job.run_test("kvm", params=dict, tag=test_tag,
+                                              iterations=test_iterations,
+                                              profile_only=bool(profilers) or None)
         status_dict[dict.get("name")] = current_status

     return not failed
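For readers outside the autotest tree, the mechanism the two hunks combine to produce can be sketched in isolation. This is only an illustration, not part of the patch: TestNA and the preprocess logic are stubbed here, and the params dict contents are made up for the example.

```python
class TestNA(Exception):
    """Stand-in for autotest's error.TestNA ('not applicable') exception."""


def preprocess(params):
    # Mirrors the check this patch adds to kvm_preprocessing.preprocess():
    # a dependent test is 'run', but fails immediately as TestNA when a
    # prior dependency test has already failed.
    if params.get("dependency_failed") == 'yes':
        raise TestNA("Test dependency failed")
    return "preprocessing continues"


# run_tests() marks the dependent test's params dict instead of skipping it:
params = {"name": "migrate", "dependency_failed": "yes"}
try:
    preprocess(params)
    status = "PASS"
except TestNA:
    status = "TEST_NA"

print(status)  # prints: TEST_NA
```

The point of the design is visible in the last line: the dependent test now shows up in the results as TEST_NA rather than silently inflating the PASS rate.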