US-12625796-B2 - Testing automation for open standard cloud services applications

US 12625796 B2

Abstract

A method performed by a processing system including at least one processor includes monitoring a software repository for code changes, detecting a code change in a software instance that is stored in the software repository, generating a container image of the software instance in response to the detecting, creating a container for the container image, wherein the container encapsulates software needed to run a test suite on the software instance, configuring a testing platform for a development environment used to create the software instance, executing the test suite for the software instance by running the software encapsulated in the container on the testing platform, and publishing a test output of the test suite as a human-readable scorecard for the software instance.
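The claimed flow — monitor the repository, build a container image on a detected change, run the encapsulated test suite, and publish a scorecard — can be sketched as a minimal, purely illustrative pipeline. Every name below (`detect_change`, `build_container_image`, `run_test_suite`, `publish`) is an assumption for illustration, not the patented implementation:

```python
from dataclasses import dataclass

# Minimal sketch of the claimed flow; every name here is an illustrative
# assumption, not the patented implementation.

@dataclass
class Scorecard:
    suite: str
    run: int
    skipped: int
    passed: int
    failed: int

def detect_change(repo, last_seen):
    """Monitoring step: report any commit newer than the last one seen."""
    head = repo["head"]
    return head if head != last_seen else None

def build_container_image(repo, commit):
    """Generate a container image of the software instance at the commit."""
    return {"image": f"{repo['name']}:{commit}", "software": repo["software"]}

def run_test_suite(image, suite):
    """Execute the suite against the software encapsulated in the image."""
    results = [case(image["software"]) for case in suite["cases"]]
    return Scorecard(
        suite=suite["name"],
        run=len(results),
        skipped=0,
        passed=sum(results),
        failed=len(results) - sum(results),
    )

def publish(scorecard):
    """Publish the test output as a human-readable scorecard."""
    return (f"[{scorecard.suite}] ran={scorecard.run} passed={scorecard.passed} "
            f"failed={scorecard.failed} skipped={scorecard.skipped}")

repo = {"name": "cloud-app", "head": "abc123",
        "software": {"add": lambda a, b: a + b}}
suite = {"name": "unit", "cases": [lambda sw: sw["add"](2, 2) == 4]}

commit = detect_change(repo, last_seen="000000")
if commit:
    image = build_container_image(repo, commit)
    card = run_test_suite(image, suite)
    print(publish(card))  # [unit] ran=1 passed=1 failed=0 skipped=0
```

In a real deployment the monitoring would be a repository webhook or poller, and the image build and suite execution would run on a container platform rather than in-process.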

Inventors

  • Andrew Leasck
  • Douglas Paul Schveninger
  • Oded Le'Sage
  • Bryan Strassner
  • Kelly Carter

Assignees

  • AT&T INTELLECTUAL PROPERTY I, L.P.

Dates

Publication Date
2026-05-12
Application Date
2023-12-22

Claims (20)

  1. A method comprising: monitoring, by a processing system including at least one processor, a software repository using a software program that checks for code changes; detecting, by the processing system during the monitoring, a code change in a software instance that is stored in the software repository; generating, by the processing system in response to the detecting, a container image of the software instance including the code change; creating, by the processing system, a container for the container image, wherein the container encapsulates software needed to run a test suite on the software instance; configuring, by the processing system, a testing platform for a development environment used to create the software instance; executing, by the processing system, the test suite for the software instance by running the software encapsulated in the container on the testing platform, wherein the executing is automated as a quality gate during a stage of a development of the software instance in a continuous integration/continuous delivery pipeline, and wherein the quality gate is designed to verify that the software instance meets a predefined standard before the development is permitted to proceed to a next stage of the development that follows the stage; and publishing, by the processing system, a test output of the test suite as a human-readable scorecard for the software instance, wherein the human-readable scorecard identifies at least one of: the test suite, a number of test cases that were run during the executing of the test suite, a number of test cases that were skipped during the executing of the test suite, an amount of time for which the test suite was executed, a number of successful test cases run during the executing of the test suite, or a number of failing test cases run during the executing of the test suite.
  2. The method of claim 1, wherein the software instance is part of an open standard cloud services application.
  3. The method of claim 2, wherein the test suite is configured to exercise a specific layer of the container image, and wherein the container image comprises a plurality of layers, and each layer of the plurality of layers represents a different area of an application environment of the open standard cloud services application.
  4. The method of claim 3, wherein the test suite is configured to verify that the specific layer is functional and operating as expected.
  5. The method of claim 3, wherein the test suite is one of a plurality of test suites, and wherein the plurality of test suites are configured to test respective layers of the plurality of layers in a consistent manner.
  6. The method of claim 5, wherein the container comprises a standard container interface that is configured to generate the scorecard in a format that is consistent across the plurality of test suites.
  7. The method of claim 6, wherein the standard container interface comprises a suite container which contains data to execute the test suite and a publisher container to publish the test output.
  8. The method of claim 7, wherein the suite container comprises a testing framework, a test container, and a file system.
  9. The method of claim 8, wherein the testing framework comprises programs for setup, run, and teardown of the test suite.
  10. The method of claim 9, wherein the programs call functions in the test container.
  11. The method of claim 10, wherein the functions in the test container write data to the file system.
  12. The method of claim 8, wherein the file system stores parameters of the testing framework.
  13. The method of claim 12, wherein the parameters of the testing framework include at least one of: a correlation identifier, an invocation identifier, or a targetsite.
  14. The method of claim 1, wherein the human-readable scorecard further identifies at least one of: a correlation identifier of a testing framework, an invocation identifier of the testing framework, a targetsite of the testing framework, a type of the targetsite, a site of the executing, or a source that invoked the test suite.
  15. The method of claim 1, wherein the container packages the software instance as an executable package with runtime, system tools, system libraries, and settings to execute the test suite on the software instance.
  16. A non-transitory computer-readable medium storing instructions which, when executed by a processing system including at least one processor, cause the processing system to perform operations, the operations comprising: monitoring a software repository using a software program that checks for code changes; detecting, during the monitoring, a code change in a software instance that is stored in the software repository; generating, in response to the detecting, a container image of the software instance including the code change; creating a container for the container image, wherein the container encapsulates software needed to run a test suite on the software instance; configuring a testing platform for a development environment used to create the software instance; executing the test suite for the software instance by running the software encapsulated in the container on the testing platform, wherein the executing is automated as a quality gate during a stage of a development of the software instance in a continuous integration/continuous delivery pipeline, and wherein the quality gate is designed to verify that the software instance meets a predefined standard before the development is permitted to proceed to a next stage of the development that follows the stage; and publishing a test output of the test suite as a human-readable scorecard for the software instance, wherein the human-readable scorecard identifies at least one of: the test suite, a number of test cases that were run during the executing of the test suite, a number of test cases that were skipped during the executing of the test suite, an amount of time for which the test suite was executed, a number of successful test cases run during the executing of the test suite, or a number of failing test cases run during the executing of the test suite.
  17. The non-transitory computer-readable medium of claim 16, wherein the software instance is part of an open standard cloud services application.
  18. The non-transitory computer-readable medium of claim 17, wherein the test suite is configured to exercise a specific layer of the container image, and wherein the container image comprises a plurality of layers, and each layer of the plurality of layers represents a different area of an application environment of the open standard cloud services application.
  19. The non-transitory computer-readable medium of claim 18, wherein the test suite is configured to verify that the specific layer is functional and operating as expected.
  20. A device comprising: a processing system including at least one processor; and a non-transitory computer-readable medium storing instructions which, when executed by the processing system, cause the processing system to perform operations, the operations comprising: monitoring a software repository using a software program that checks for code changes; detecting, during the monitoring, a code change in a software instance that is stored in the software repository; generating, in response to the detecting, a container image of the software instance including the code change; creating a container for the container image, wherein the container encapsulates software needed to run a test suite on the software instance; configuring a testing platform for a development environment used to create the software instance; executing the test suite for the software instance by running the software encapsulated in the container on the testing platform, wherein the executing is automated as a quality gate during a stage of a development of the software instance in a continuous integration/continuous delivery pipeline, and wherein the quality gate is designed to verify that the software instance meets a predefined standard before the development is permitted to proceed to a next stage of the development that follows the stage; and publishing a test output of the test suite as a human-readable scorecard for the software instance, wherein the human-readable scorecard identifies at least one of: the test suite, a number of test cases that were run during the executing of the test suite, a number of test cases that were skipped during the executing of the test suite, an amount of time for which the test suite was executed, a number of successful test cases run during the executing of the test suite, or a number of failing test cases run during the executing of the test suite.
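Claims 1, 13, and 14 enumerate the fields a human-readable scorecard may carry (suite identity, case counts, duration, correlation and invocation identifiers, targetsite). A minimal sketch of such a record, with field names that are illustrative assumptions only:

```python
from dataclasses import dataclass, asdict

# Hypothetical scorecard record collecting fields enumerated in the claims;
# the field names are assumptions for illustration, not the patent's schema.

@dataclass
class Scorecard:
    suite: str            # the test suite that was executed
    cases_run: int        # test cases run
    cases_skipped: int    # test cases skipped
    cases_passed: int     # successful test cases
    cases_failed: int     # failing test cases
    duration_s: float     # amount of time the suite executed
    correlation_id: str   # correlation identifier of the testing framework
    invocation_id: str    # invocation identifier of the testing framework
    targetsite: str       # targetsite of the testing framework

    def render(self) -> str:
        """Format the record as a human-readable scorecard."""
        return "\n".join(f"{k}: {v}" for k, v in asdict(self).items())

card = Scorecard("smoke", 12, 1, 11, 1, 8.4, "corr-42", "inv-7", "dev-cluster")
print(card.render())
```

Because every suite emits the same record shape, the "consistent format across the plurality of test suites" of claim 6 falls out of the shared data structure.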

Description

This application is a continuation of U.S. patent application Ser. No. 17/348,439, filed Jun. 15, 2021, now U.S. Pat. No. 11,853,197, which is herein incorporated by reference in its entirety.

The present disclosure relates generally to software engineering, and relates more particularly to devices, non-transitory computer-readable media, and methods for automating testing of software applications built using open source cloud computing platforms.

BACKGROUND

Continuous integration/continuous delivery (CI/CD) is a pipeline that bridges the gap between development and operation in software engineering by automating the building, testing, and deployment of applications, allowing code changes to be delivered more frequently and more reliably. Continuous integration establishes an automated and consistent manner for building, packaging, and testing applications, while continuous delivery automates the delivery of applications to the intended infrastructure environments (e.g., production, development, testing, etc.). Thus, code changes occurring during continuous integration can be pushed to the intended environments in an automated manner.

SUMMARY

The present disclosure broadly discloses methods, computer-readable media, and systems for automating testing for open standard cloud services applications.
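The claims frame the automated test run as a quality gate: development proceeds to the next CI/CD stage only if the software instance meets a predefined standard. A minimal sketch of such a gate, where the stage names and the zero-failure threshold are assumptions for illustration:

```python
# Illustrative quality gate: the pipeline advances to the next stage only
# when the scorecard meets a predefined standard (here: zero failing cases).
# Stage names and threshold are assumptions, not taken from the patent.

STAGES = ["build", "test", "deploy"]

def quality_gate(scorecard: dict, max_failures: int = 0) -> bool:
    """Return True if development may proceed to the next stage."""
    return scorecard["failed"] <= max_failures

def advance(stage: str, scorecard: dict) -> str:
    """Move to the next stage if the gate passes; otherwise hold."""
    next_stage = STAGES[STAGES.index(stage) + 1]
    return next_stage if quality_gate(scorecard) else stage

print(advance("test", {"failed": 0}))  # deploy
print(advance("test", {"failed": 2}))  # test
```

Real pipelines express the same check declaratively (e.g., a failing job blocks downstream jobs), but the gating logic reduces to this comparison.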
In one example, a method performed by a processing system including at least one processor includes monitoring a software repository for code changes, detecting a code change in a software instance that is stored in the software repository, generating a container image of the software instance in response to the detecting, creating a container for the container image, wherein the container encapsulates software needed to run a test suite on the software instance, configuring a testing platform for a development environment used to create the software instance, executing the test suite for the software instance by running the software encapsulated in the container on the testing platform, and publishing a test output of the test suite as a human-readable scorecard for the software instance.

In another example, a non-transitory computer-readable medium may store instructions which, when executed by a processing system including at least one processor, cause the processing system to perform operations. The operations may include monitoring a software repository for code changes, detecting a code change in a software instance that is stored in the software repository, generating a container image of the software instance in response to the detecting, creating a container for the container image, wherein the container encapsulates software needed to run a test suite on the software instance, configuring a testing platform for a development environment used to create the software instance, executing the test suite for the software instance by running the software encapsulated in the container on the testing platform, and publishing a test output of the test suite as a human-readable scorecard for the software instance.
In another example, a device may include a processing system including at least one processor and a non-transitory computer-readable medium storing instructions which, when executed by the processing system, cause the processing system to perform operations. The operations may include monitoring a software repository for code changes, detecting a code change in a software instance that is stored in the software repository, generating a container image of the software instance in response to the detecting, creating a container for the container image, wherein the container encapsulates software needed to run a test suite on the software instance, configuring a testing platform for a development environment used to create the software instance, executing the test suite for the software instance by running the software encapsulated in the container on the testing platform, and publishing a test output of the test suite as a human-readable scorecard for the software instance.

BRIEF DESCRIPTION OF THE DRAWINGS

The teachings of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which: FIG. 1 illustrates an example system in which examples of the present disclosure for automating testing for open standard cloud services applications may operate; FIG. 2 illustrates a flowchart of an example method for testing an open standard cloud services application in an automated manner, in accordance with the present disclosure; FIG. 3 illustrates a plurality of example container images that may be created in response to a detected code change; FIG. 4 illustrates an example standard container interface which may allow a test suite to be wrapped and executed; and FIG. 5 illustrates an example of a computing device, or computing system, specifically programmed to perform the steps, functions, blocks, and/or operations
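Claims 6 through 11 (and the standard container interface of FIG. 4) describe a suite container whose testing framework runs setup, run, and teardown programs that call functions in a test container and write data to a file system, plus a publisher container that turns those results into a scorecard. A minimal sketch of that interface, with all class and method names as illustrative assumptions:

```python
import json
import pathlib
import tempfile

# Sketch of the standard container interface from the claims: a suite
# container with setup/run/teardown programs that call into a test container
# and write to a file system, and a publisher container that publishes the
# test output. All names here are illustrative assumptions.

class TestContainer:
    def check(self):
        """A test-container function invoked by the framework's programs."""
        return {"case": "layer-check", "passed": True}

class SuiteContainer:
    def __init__(self, test_container, fs_dir):
        self.tests = test_container
        self.fs = pathlib.Path(fs_dir)

    def setup(self):
        self.fs.mkdir(parents=True, exist_ok=True)

    def run(self):
        result = self.tests.check()  # call a function in the test container
        # the test-container function's data is written to the file system
        (self.fs / "results.json").write_text(json.dumps([result]))

    def teardown(self):
        pass  # release any resources acquired in setup

class PublisherContainer:
    def publish(self, fs_dir):
        """Read results from the file system and emit a scorecard line."""
        raw = (pathlib.Path(fs_dir) / "results.json").read_text()
        results = json.loads(raw)
        passed = sum(r["passed"] for r in results)
        return f"ran={len(results)} passed={passed} failed={len(results) - passed}"

workdir = tempfile.mkdtemp()
suite = SuiteContainer(TestContainer(), workdir)
suite.setup()
suite.run()
suite.teardown()
print(PublisherContainer().publish(workdir))  # ran=1 passed=1 failed=0
```

Splitting execution (suite container) from reporting (publisher container) is what lets the scorecard format stay consistent across suites: any suite that writes results in the shared shape gets the same published output.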