US-12620158-B2 - Systems and methods of rendering effects during gameplay

US12620158B2

Abstract

Systems and methods of rendering effects are described herein. A computing device can receive a user selection of an effect to be applied during gameplay of a computer game. The computing device can initiate a framework with which to apply the effect during the gameplay. The computing device can determine scenes during the gameplay to apply the effect through the framework. The computing device can determine graphical elements in the scenes that correspond to non-user interface elements through the framework. The computing device can apply the effect to the graphical elements through the framework.

Inventors

  • Hongyu Sun
  • Chen Li
  • Chengeng Li
  • Qiang Qiu
  • Huihui Xu
  • Steven Jackson
  • Andrew Pham

Assignees

  • INNOPEAK TECHNOLOGY, INC.

Dates

Publication Date
2026-05-05
Application Date
2023-09-25

Claims (18)

  1. A computer-implemented method for rendering effects on a screen of a computing device during gameplay of a computer game or a mobile game, performed by the computing device, the method comprising: receiving a user selection of an effect to be applied during the gameplay; initiating a framework with which to apply the effect during the gameplay; determining, through the framework, scenes during the gameplay to apply the effect; determining, through the framework, graphical elements in the scenes based on detection of user interface elements; and applying, through the framework, the effect to the graphical elements; wherein determining the scenes during the gameplay to apply the effect comprises: detecting at least one texture depicted in the scenes during the gameplay, determining a frame signature based on a call sequence, or determining whether a texture value is within a texture value range; wherein the at least one texture is detected based on a texture scanning technique using a graphics debugging tool, and wherein the graphics debugging tool associates the at least one texture to a category associated with a game engine.
  2. The computer-implemented method of claim 1, wherein the user selection is received through a user interface provided by the computing device, and wherein the user interface is provided upon launching of the computer game or the mobile game from the computing device.
  3. The computer-implemented method of claim 2, wherein the user interface includes one or more effects selectable by a user operating the computing device for application to graphics during the gameplay.
  4. The computer-implemented method of claim 3, wherein the one or more effects selectable by the user include at least one of: a color invert effect, a night vision effect, a pixelized effect, a high dynamic range effect, an old movie effect, or a cel shading effect.
  5. The computer-implemented method of claim 1, wherein the framework is embedded in an application programming interface (API) layer of an operating system running on the computing device.
  6. The computer-implemented method of claim 5, wherein the framework intercepts OpenGL API calls initiated by the computer game or the mobile game during the gameplay.
  7. The computer-implemented method of claim 1, wherein the detection of the user interface elements is performed during a user interface pass of OpenGL graphics rendering on the computing device, and the graphical elements correspond to non-user interface elements.
  8. The computer-implemented method of claim 1, wherein applying the effect to the graphical elements comprises: storing context information associated with OpenGL states, capturing graphics data of framebuffers, modifying the graphics data of the framebuffers to change pixel color of the graphical elements during the gameplay, and restoring the context information to the OpenGL states.
  9. The computer-implemented method of claim 1, wherein applying the effect to the graphical elements comprises frame shader optimization, wherein the frame shader optimization includes saving read operations for image frames based on a framebuffer fetch function.
  10. The computer-implemented method of claim 1, wherein applying the effect to the graphical elements comprises modifying an OpenGL shader with the effect and recompiling the OpenGL shader.
  11. A computing system comprising: at least one processor; and a non-transitory memory storing instructions that, when executed by the at least one processor, cause the computing system to perform a method for rendering effects on a screen of a computing device during gameplay of a computer game, the method comprising: receiving a user selection of an effect to be applied during the gameplay; initiating a framework with which to apply the effect during the gameplay; determining, through the framework, scenes during the gameplay to apply the effect; determining, through the framework, graphical elements in the scenes that correspond to non-user interface elements; and applying, through the framework, the effect to the graphical elements; wherein determining the scenes during the gameplay to apply the effect comprises: detecting at least one texture depicted in the scenes during the gameplay, determining a frame signature based on a call sequence, or determining whether a texture value is within a texture value range; wherein the at least one texture is detected based on a texture scanning technique using a graphics debugging tool, and wherein the graphics debugging tool associates the at least one texture to a category associated with a game engine.
  12. The computing system of claim 11, wherein the user selection is received through a user interface provided by the computing device, and wherein the user interface is provided upon launching of the computer game from the computing device.
  13. The computing system of claim 12, wherein the user interface includes one or more effects selectable by a user operating the computing device for application to graphics during the gameplay.
  14. The computing system of claim 13, wherein the one or more effects selectable by the user include at least one of: a color invert effect, a night vision effect, a pixelized effect, a high dynamic range effect, an old movie effect, or a cel shading effect.
  15. A non-transitory storage medium of a computing system storing instructions that, when executed by at least one processor of the computing system, cause the computing system to perform a method for rendering effects on a screen of a computing device during gameplay of a computer game, the method comprising: receiving a user selection of an effect to be applied during the gameplay; initiating a framework with which to apply the effect during the gameplay; determining, through the framework, scenes during the gameplay to apply the effect; determining, through the framework, graphical elements in the scenes that correspond to non-user interface elements; and applying, through the framework, the effect to the graphical elements; wherein determining the scenes during the gameplay to apply the effect comprises: detecting at least one texture depicted in the scenes during the gameplay, determining a frame signature based on a call sequence, or determining whether a texture value is within a texture value range; wherein the at least one texture is detected based on a texture scanning technique using a graphics debugging tool, and wherein the graphics debugging tool associates the at least one texture to a category associated with a game engine.
  16. The non-transitory storage medium of claim 15, wherein the user selection is received through a user interface provided by the computing device, and wherein the user interface is provided upon launching of the computer game from the computing device.
  17. The non-transitory storage medium of claim 16, wherein the user interface includes one or more effects selectable by a user operating the computing system for application to graphics during the gameplay.
  18. The non-transitory storage medium of claim 17, wherein the one or more effects selectable by the user include at least one of: a color invert effect, a night vision effect, a pixelized effect, a high dynamic range effect, an old movie effect, or a cel shading effect.
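
Claim 8's capture-modify-restore flow is easiest to see with a concrete effect. Below is a minimal, hypothetical sketch (not taken from the patent) of the pixel-modification step for the color invert effect recited in claim 4; the function name and the assumption of captured RGBA8 framebuffer data are illustrative:

```c
#include <stddef.h>
#include <stdint.h>

/* Apply a color-invert effect (one of the selectable effects in claim 4)
 * to captured RGBA8 framebuffer data, leaving alpha untouched. In the
 * claimed flow this modification would run after the OpenGL context
 * information is stored and the framebuffer is captured, and before the
 * context information is restored. */
static void apply_color_invert(uint8_t *pixels, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        uint8_t *px = pixels + i * 4;
        px[0] = 255 - px[0];  /* R */
        px[1] = 255 - px[1];  /* G */
        px[2] = 255 - px[2];  /* B */
        /* px[3] (alpha) is preserved */
    }
}
```

In the claimed framework, this modification would be restricted to graphical elements identified as non-user-interface elements during the user interface pass, so that HUD and menu pixels keep their original colors.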

Description

CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of PCT/US2021/024222, filed on Mar. 25, 2021, the contents of which are incorporated herein in their entirety.

TECHNICAL FIELD

The present disclosure relates to the field of rendering techniques in game engines, and in particular to a computer-implemented method, a computing system, and a non-transitory storage medium.

BACKGROUND

Game development has evolved drastically in recent years. An increasing number of game engines, such as Unreal Engine, Unity, and CryEngine, are available for game development on an increasing number of platforms, including mobile platforms. With this growth, game development faces several technical challenges. For example, modifying a game developed using one game engine for one platform so that the game can be implemented on other game engines for other platforms can be cumbersome and inefficient. Any later additions to the game would also need to be modified to be implemented on the other game engines and the other platforms. These technical challenges become exacerbated as game development continues to evolve and the number of available game engines and platforms continues to increase. Thus, the evolution of game development has created various technical challenges.

SUMMARY

According to one aspect of the present disclosure, a computer-implemented method for rendering effects is provided.
The method may include: receiving a user selection of an effect to be applied during gameplay of a computer game or a mobile game; initiating a framework with which to apply the effect during the gameplay; determining, through the framework, scenes during the gameplay to apply the effect; determining, through the framework, graphical elements in the scenes based on detection of user interface elements; and applying, through the framework, the effect to the graphical elements.

According to another aspect of the present disclosure, a computing system is provided. The computing system may include: at least one processor; and a memory storing instructions that, when executed by the at least one processor, cause the computing system to perform a method for rendering effects, the method comprising: receiving a user selection of an effect to be applied during gameplay of a computer game; initiating a framework with which to apply the effect during the gameplay; determining, through the framework, scenes during the gameplay to apply the effect; determining, through the framework, graphical elements in the scenes that correspond to non-user interface elements; and applying, through the framework, the effect to the graphical elements.

According to another aspect of the present disclosure, a non-transitory storage medium of a computing system is provided, storing instructions that, when executed by at least one processor of the computing system, cause the computing system to perform a method for rendering effects.
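
The framework described above is embedded in the API layer and intercepts the game's OpenGL calls (claims 5-6). The save-and-forward pattern behind such interception can be sketched as follows; the function-pointer table below is a hypothetical stand-in for a real OpenGL dispatch, and all names are illustrative rather than taken from the patent:

```c
/* Sketch of API-layer interception of a graphics call. The framework
 * saves the original entry point and installs a wrapper that runs its
 * per-frame effect work before forwarding to the driver. */

typedef void (*swap_buffers_fn)(void);

static void driver_swap_buffers(void) { /* real driver work here */ }

static swap_buffers_fn real_swap_buffers = driver_swap_buffers;
static int frames_processed = 0;

/* Wrapper installed by the framework: do effect work (e.g. scene
 * detection, effect pass), then forward to the saved entry point. */
static void hooked_swap_buffers(void)
{
    ++frames_processed;
    real_swap_buffers();
}

/* Entry point the game actually calls through the API layer. */
static swap_buffers_fn swap_buffers_entry = driver_swap_buffers;

static void install_hook(void)
{
    real_swap_buffers = swap_buffers_entry;   /* save original */
    swap_buffers_entry = hooked_swap_buffers; /* install wrapper */
}
```

On a real Android deployment, interception of this kind is more plausibly done by interposing on EGL/GLES entry points (for example via an OpenGL ES layer or `dlsym(RTLD_NEXT, ...)`); the sketch only illustrates how the original entry point is saved and calls are forwarded after the effect work runs.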
The method may include: receiving a user selection of an effect to be applied during gameplay of a computer game; initiating a framework with which to apply the effect during the gameplay; determining, through the framework, scenes during the gameplay to apply the effect; determining, through the framework, graphical elements in the scenes that correspond to non-user interface elements; and applying, through the framework, the effect to the graphical elements.

BRIEF DESCRIPTION OF THE DRAWINGS

The present disclosure, in accordance with one or more various embodiments, is described in detail with reference to the following figures. The figures are provided for purposes of illustration only and merely depict typical or exemplary embodiments.

FIG. 1A illustrates an example computer architecture, according to various embodiments of the present disclosure.
FIG. 1B illustrates an example framework, according to various embodiments of the present disclosure.
FIG. 1C illustrates an example computer architecture, according to various embodiments of the present disclosure.
FIG. 2A illustrates an example graphics rendering pipeline, according to various embodiments of the present disclosure.
FIG. 2B illustrates an example graphics rendering pipeline, according to various embodiments of the present disclosure.
FIG. 3 illustrates an example workflow, according to various embodiments of the present disclosure.
FIG. 4A illustrates a block diagram for operation of a framework, according to various embodiments of the present disclosure.
FIG. 4B illustrates a block diagram for post-processing rendering, according to various embodiments of the present disclosure.
FIG. 4C illustrates a block diagram for user interface (UI) detection, according to various embodiments of the present disclosure.
FIG. 4D illustrates a block diagram for general UI detection, according