
US-12626019-B2 - Systems and methods for applying secrecy settings on a user device

US 12626019 B2

Abstract

A method of applying secrecy settings on a user device is provided. The method includes monitoring, by the user device, user activity data based on usage of one or more applications of the user device, determining, by the user device, whether a user is exhibiting a secretive behavior based on the user activity data and historical behavior of the user, extracting, by the user device, contextual information from other applications, determining, by the user device, whether the contextual information is related to the user activity data, processing, by the user device based on the secretive behavior, the contextual information to provide at least one data stream and at least one attribute associated with the contextual information, determining, by the user device, a predefined secrecy type based on an analysis of the at least one data stream and the at least one attribute, and applying, by the user device, secrecy settings on the user device based on the predefined secrecy type.
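As a rough illustration only, the abstract's sequence of steps (monitor activity, detect secretive behavior, extract and relate context, derive data streams and attributes, pick a secrecy type) can be sketched as plain functions. Every helper below (`detect_secretive_behavior`, `choose_secrecy_type`, the dictionary keys, and the numeric type codes) is a hypothetical placeholder, not the claimed neural-network or analysis components.

```python
# Hedged sketch of the abstract's pipeline; all names and rules are
# illustrative stand-ins, not the patented implementation.

def detect_secretive_behavior(activity, history):
    # stand-in for the claimed neural-network behavior check
    return any(a.get("incognito_hint") for a in activity)

def extract_context(other_apps):
    return [ctx for app in other_apps for ctx in app.get("context", [])]

def related(context, activity):
    topics = {a["topic"] for a in activity}
    return [c for c in context if c["topic"] in topics]

def choose_secrecy_type(streams, attributes):
    # e.g. a private data stream or a deletion history -> strongest secrecy
    if attributes.get("deletion_history") or "private" in streams:
        return 3
    return 2

def apply_secrecy_pipeline(activity, history, other_apps):
    if not detect_secretive_behavior(activity, history):
        return 1                 # normal mode, no secrecy applied
    ctx = related(extract_context(other_apps), activity)
    if not ctx:
        return 3                 # fall back on the behavior alone
    streams = ["private" if c.get("private") else "public" for c in ctx]
    attributes = {"deletion_history": any(c.get("deleted") for c in ctx)}
    return choose_secrecy_type(streams, attributes)

activity = [{"topic": "flights", "incognito_hint": True}]
apps = [{"context": [{"topic": "flights", "private": True}]}]
print(apply_secrecy_pipeline(activity, [], apps))   # -> 3
```

The sketch only shows the control flow the abstract describes: secrecy is applied only when secretive behavior is detected, and the specific type depends on whether related context, its streams, and its attributes are available.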

Inventors

  • Ajay Sharma
  • Arihant Jain
  • Prakhar SHRIVASTAV
  • Rahul Yadav
  • Vipul Gupta

Assignees

  • SAMSUNG ELECTRONICS CO., LTD.

Dates

Publication Date
2026-05-12
Application Date
2024-08-12
Priority Date
2022-05-10

Claims (17)

  1. A method of applying secrecy settings on a user device, the method comprising: monitoring, by the user device, user activity data based on usage of one or more applications of the user device; determining, by the user device via a neural network model, whether a user is exhibiting a secretive behavior based on the user activity data and historical behavior of the user; extracting, by the user device, contextual information from other applications, wherein the contextual information is associated with the user activity data; determining, by the user device, whether the contextual information is related to the user activity data; in response to determining that the contextual information is related to the user activity data, processing, by the user device based on the secretive behavior, the contextual information to provide at least one data stream and at least one attribute associated with the contextual information; in response to determining to process the user activity data according to one of a plurality of predefined secrecy types, determining, by the user device, a predefined secrecy type from among the plurality of predefined secrecy types based on an analysis of the at least one data stream and the at least one attribute; and applying, by the user device, secrecy settings on the user device based on the predefined secrecy type, wherein the method further comprises: clustering, by the user device, the user activity data into at least one predefined category; and determining, by the user device, whether the user activity data of one of the at least one predefined category is to be processed according to one of the plurality of predefined secrecy types based on a usage pattern of the user device, and wherein the plurality of predefined secrecy types corresponds to the secretive behavior.
  2. The method of claim 1, wherein the secretive behavior comprises one of a normal behavior or a secret behavior.
  3. The method of claim 1, wherein the user activity data comprises at least one of application activity data, user actions, the usage pattern of the user device, demographics information, a call history, or messages.
  4. The method of claim 1, wherein the at least one attribute associated with the contextual information comprises deletion history and application activity.
  5. The method of claim 1, wherein the contextual information comprises at least one of conversations, application data, application activities, application usage, or search patterns at the user device, and wherein the at least one data stream comprises at least one of an audio data stream, an image data stream, or a video data stream captured on the user device.
  6. The method of claim 1, further comprising: analyzing, by the user device, the at least one data stream and the at least one attribute to determine whether the user activity data should be processed according to one of the plurality of predefined secrecy types.
  7. The method of claim 6, wherein the analyzing of the at least one data stream and the at least one attribute comprises: analyzing, by the user device via one or more neural network models, the at least one data stream and the at least one attribute to determine whether the at least one data stream is one of public or private; and based on a result of determining whether the at least one data stream is one of public or private, determining whether the user activity data should be processed according to one of the plurality of predefined secrecy types.
  8. The method of claim 1, wherein the applying of the secrecy settings based on the predefined secrecy type comprises modifying at least one of display settings, tracking settings, cookie settings, download settings, or history settings at the user device.
  9. The method of claim 1, wherein the applying of the secrecy settings based on the predefined secrecy type comprises at least one of: modifying, by the user device, current user interface content at the user device; modifying, by the user device, a tracking status of the one or more applications of the user device; modifying, by the user device, a cookie status of the one or more applications of the user device; modifying, by the user device, a downloading status of the one or more applications of the user device; or modifying, by the user device, history storage settings of the one or more applications of the user device.
  10. The method of claim 1, further comprising: monitoring, by the user device, the user activity data during a specific time period based on the one or more applications of the user device, wherein the monitoring of the user activity data comprises tracking a sequence of events in the one or more applications of the user device; clustering, by the user device, the user activity data into at least one category; determining, by the user device, whether the user activity data of one of the at least one category is processed according to one of the plurality of predefined secrecy types based on the usage pattern of the user device; and training, by the user device, the neural network model with the user activity data, the at least one category, and the plurality of predefined secrecy types.
  11. The method of claim 1, further comprising: in response to determining that the contextual information is not related to the user activity data, determining, by the user device, the predefined secrecy type among the plurality of predefined secrecy types based on the secretive behavior.
  12. The method of claim 1, wherein the at least one data stream and the at least one attribute are indicative of at least one of current user actions or data from the other applications.
  13. An electronic device for applying secrecy settings, the electronic device comprising: memory storing one or more computer programs; and one or more processors communicatively coupled to the memory; wherein the one or more computer programs include computer-executable instructions that, when executed by the one or more processors, cause the electronic device to: monitor user activity data based on usage of one or more applications of the electronic device, determine, via a neural network model, whether a user is exhibiting a secretive behavior based on the user activity data and historical behavior of the user, extract contextual information from other applications, wherein the contextual information is associated with the user activity data, determine whether the contextual information is related to the user activity data, in response to determining that the contextual information is related to the user activity data, process, based on the secretive behavior, the contextual information to provide at least one data stream and at least one attribute associated with the contextual information, in response to determining to process the user activity data according to one of a plurality of predefined secrecy types, determine a predefined secrecy type from among the plurality of predefined secrecy types based on an analysis of the at least one data stream and the at least one attribute, and apply secrecy settings on the electronic device based on the predefined secrecy type, wherein the one or more computer programs further include computer-executable instructions that, when executed by the one or more processors, cause the electronic device to: cluster the user activity data into at least one predefined category, and determine whether the user activity data of one of the at least one predefined category is to be processed according to one of the plurality of predefined secrecy types based on a usage pattern of the electronic device, and wherein the plurality of predefined secrecy types corresponds to the secretive behavior.
  14. The electronic device of claim 13, wherein the one or more computer programs further include computer-executable instructions that, when executed by the one or more processors, cause the electronic device to: create the usage pattern for the electronic device corresponding to a received sequence of actions using frequent pattern mining algorithms, and wherein the frequent pattern mining algorithms include a frequent pattern growth (FP-growth) scalable technique in which the FP-growth is provided as an input to K-pattern clustering and similar patterns are grouped by using frequent activity patterns' mining to cluster the user activity data into one or more predefined categories.
  15. The electronic device of claim 13, wherein the one or more computer programs further include computer-executable instructions that, when executed by the one or more processors, cause the electronic device to: based on the predefined secrecy type being a second predefined secrecy type, perform downloads in a public space of the electronic device, and based on the predefined secrecy type being a third predefined secrecy type, perform the downloads in a private space of the electronic device, and wherein files downloaded in the private space are opened/viewed only in a particular application and only by an authenticated user and all other applications are blocked from accessing the files downloaded in the private space.
  16. The electronic device of claim 13, wherein the plurality of predefined secrecy types include at least three predefined secrecy types comprising a first predefined secrecy type associated with no secrecy or normal mode, a second predefined secrecy type associated with secrecy from network, and a third predefined secrecy type associated with secrecy from other users/applications.
  17. One or more non-transitory computer-readable storage media storing one or more computer programs including computer-executable instructions that, when executed by one or more processors of a user device, cause the user device to perform operations, the operations comprising: monitoring, by the user device, user activity data based on usage of one or more applications of the user device; determining, by the user device via a neural network model, whether a user is exhibiting a secretive behavior based on the user activity data and historical behavior of the user; extracting, by the user device, contextual information from other applications, wherein the contextual information is associated with the user activity data; determining, by the user device, whether the contextual information is related to the user activity data; in response to determining that the contextual information is related to the user activity data, processing, by the user device based on the secretive behavior, the contextual information to provide at least one data stream and at least one attribute associated with the contextual information; in response to determining to process the user activity data according to one of a plurality of predefined secrecy types, determining, by the user device, a predefined secrecy type from among the plurality of predefined secrecy types based on an analysis of the at least one data stream and the at least one attribute; and applying, by the user device, secrecy settings on the user device based on the predefined secrecy type, wherein the operations further comprise: clustering, by the user device, the user activity data into at least one predefined category; and determining, by the user device, whether the user activity data of one of the at least one predefined category is to be processed according to one of the plurality of predefined secrecy types based on a usage pattern of the user device, and wherein the plurality of predefined secrecy types corresponds to the secretive behavior.
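Claim 14 names FP-growth feeding K-pattern clustering to group similar activity patterns into categories. As a rough, self-contained illustration of that idea (not the patented algorithm), the sketch below uses simple pair-frequency counting as a stand-in for FP-growth and greedy Jaccard grouping as a stand-in for K-pattern clustering; all thresholds and session data are invented for the example.

```python
from collections import Counter
from itertools import combinations

# Toy stand-in for the claimed FP-growth + K-pattern clustering step:
# count frequent action pairs across sessions, then group sessions whose
# frequent patterns overlap. Thresholds are illustrative only.

def frequent_pairs(sessions, min_support=2):
    """Return action pairs that co-occur in at least `min_support` sessions."""
    counts = Counter()
    for actions in sessions:
        for pair in combinations(sorted(set(actions)), 2):
            counts[pair] += 1
    return {pair for pair, c in counts.items() if c >= min_support}

def cluster_sessions(sessions, min_support=2, min_overlap=0.5):
    """Greedily group sessions by Jaccard similarity of their frequent pairs."""
    patterns = frequent_pairs(sessions, min_support)
    session_pairs = [
        {p for p in combinations(sorted(set(s)), 2) if p in patterns}
        for s in sessions
    ]
    clusters = []   # each cluster: (representative pattern set, member indices)
    for i, pairs in enumerate(session_pairs):
        for rep, members in clusters:
            union = rep | pairs
            if union and len(rep & pairs) / len(union) >= min_overlap:
                members.append(i)
                rep |= pairs   # widen the representative pattern set in place
                break
        else:
            clusters.append((set(pairs), [i]))
    return [members for _, members in clusters]

sessions = [
    ["open_browser", "search_flights", "view_prices"],
    ["open_browser", "search_flights", "book_ticket"],
    ["open_gallery", "view_photo", "delete_photo"],
    ["open_gallery", "view_photo", "share_photo"],
]
print(cluster_sessions(sessions))   # -> [[0, 1], [2, 3]]
```

The browsing sessions land in one category and the gallery sessions in another, mirroring the claimed clustering of user activity data into predefined categories before a secrecy type is chosen per category.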

Description

CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is a continuation application, claiming priority under 35 U.S.C. § 365(c), of an International application No. PCT/KR2023/002099, filed on Feb. 14, 2023, which is based on and claims the benefit of an Indian Provisional patent application No. 202211007796, filed on Feb. 14, 2022, in the Indian Intellectual Property Office, and of an Indian Complete patent application No. 202211007796, filed on May 10, 2022, in the Indian Intellectual Property Office, the disclosure of each of which is incorporated by reference herein in its entirety.

BACKGROUND

1. Field

The disclosure relates to providing secure device usage. More particularly, the disclosure relates to enhancing a safe device usage experience in which the user can browse and download from the internet in an intelligent way, based on automatic switching to an incognito/secret mode using a machine learning model.

2. Description of Related Art

Browsing a website or using any network application on a user's phone can lead to leakage of data in various ways. Users usually either forget to enable a hidden/incognito mode or have no control over such leaks. There are multiple ways in which a user's data is leaked and user experience or privacy is hampered. For example, when data leaks to the network, the user may be spammed with related advertisements. Specifically, when a user searches or browses for something (e.g., flight tickets), the data is leaked to network applications and websites. This data is later used to spam the user, who then starts receiving related advertisements, hampering the user experience.

Further, the user's data may be leaked to other users. Various applications maintain usage history, and such history data may be leaked to other users. Specifically, when a user performs a browsing or search action, the applications may maintain its history. While the user who searched may not want other people to know the search history, other users may get to know about this history.

Additionally, the user's data may be leaked to other applications. Downloaded content is visible not only to the user who is using it but also to all applications that are able to open such content, even if it is private.

While there are some existing mechanisms to avoid such data leakages, such mechanisms are mostly based on manual settings. With machine learning advancements, the technology may be leveraged to prevent such data leakages automatically. Accordingly, there is a need to address the above challenges and provide an intelligent mechanism for safeguarding a user's search/browsing history and preventing leakage of data to other users, the network, and other applications.

The above information is presented as background information only to assist with an understanding of the disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the disclosure.

SUMMARY

Aspects of the disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the disclosure is to provide a system and method for applying secrecy settings on a user device.

Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments.

In accordance with an aspect of the disclosure, a method of applying secrecy settings on a user device is provided.
The method includes monitoring, by the user device, user activity data based on usage of one or more applications of the user device, determining, by the user device via a neural network model, whether a user is exhibiting a secretive behavior based on the user activity data and historical behavior of the user, extracting, by the user device, contextual information from other applications, wherein the contextual information is associated with the user activity data, determining, by the user device, whether the contextual information is related to the user activity data, in response to determining that the contextual information is related to the user activity data, processing, by the user device based on the secretive behavior, the contextual information to provide at least one data stream and at least one attribute associated with the contextual information, in response to determining to process the user activity data according to one of a plurality of predefined secrecy types, determining, by the user device, a predefined secrecy type from among the plurality of predefined secrecy types based on an analysis of the at least one data stream and the at least one attribute, and applying, by the user device, secrecy settings on the user device based on the predefined secrecy type.
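Claims 9, 15, and 16 describe three secrecy types and the device settings each one adjusts (tracking, cookies, history, and the public/private download space). As a minimal sketch of that mapping, the code below toggles a handful of invented setting fields per type; the field names and the exact settings toggled per type are illustrative assumptions, not the patented behavior.

```python
from dataclasses import dataclass

# Illustrative secrecy-type codes matching claim 16's three types.
NORMAL, NETWORK_SECRECY, USER_APP_SECRECY = 1, 2, 3

@dataclass
class DeviceSettings:
    # Hypothetical settings; real devices expose these differently.
    tracking_enabled: bool = True
    cookies_enabled: bool = True
    history_enabled: bool = True
    download_space: str = "public"   # "public" or "private"

def apply_secrecy(settings: DeviceSettings, secrecy_type: int) -> DeviceSettings:
    if secrecy_type == NETWORK_SECRECY:
        # secrecy from the network: block tracking/cookies, downloads stay public
        settings.tracking_enabled = False
        settings.cookies_enabled = False
    elif secrecy_type == USER_APP_SECRECY:
        # secrecy from other users/apps: also hide history, download privately
        settings.tracking_enabled = False
        settings.cookies_enabled = False
        settings.history_enabled = False
        settings.download_space = "private"
    return settings        # NORMAL leaves everything unchanged

s = apply_secrecy(DeviceSettings(), USER_APP_SECRECY)
print(s.download_space)    # -> private
```

This mirrors the claimed escalation: the second type shields the user from the network while keeping downloads public, and the third type additionally hides history and confines downloads to a private space accessible only to an authenticated user.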