US-12621282-B2 - System and method for identifying and blocking synthetic media based misappropriation attempts associated with electronic communications

US 12621282 B2

Abstract

Embodiments of the present invention provide a system for identifying and blocking synthetic media based misappropriation attempts associated with electronic communications. The system is configured for identifying initiation of an authentication request from a user device of a user, monitoring and recording user characteristics via the user device, capturing user environment data of the user, via the user device, analyzing the user characteristics and the user environment data of the user, via an artificial intelligence engine, determining, via the artificial intelligence engine, if the authentication request is a misappropriation attempt based on at least one of the user characteristics and the user environment data, and performing an action comprising authenticating the user based on determining that the authentication request is not a misappropriation attempt or denying authentication of the user based on determining that the authentication request is a misappropriation attempt.

Inventors

  • Sanjay Lohar
  • George Anthony Albero
  • Jinna Kim
  • Olga Kocharyan
  • Timothy Scott Murphy
  • Christopher Perez

Assignees

  • BANK OF AMERICA CORPORATION

Dates

Publication Date
2026-05-05
Application Date
2024-06-03

Claims (20)

  1. A system for identifying and blocking synthetic media based misappropriation attempts associated with electronic communications, the system comprising: at least one network communication interface; at least one non-transitory storage device; and at least one processing device coupled to the at least one non-transitory storage device and the at least one network communication interface, wherein the at least one processing device is configured to: identify initiation of an authentication request from a user device of a user; monitor and record one or more user characteristics via one or more components of the user device; capture user environment data of the user, via the one or more components of the user device; analyze the one or more user characteristics and the user environment data of the user, via an artificial intelligence engine; in response to analyzing the one or more user characteristics and the user environment data of the user, determine, via the artificial intelligence engine, if the authentication request is a misappropriation attempt based on at least one of the one or more user characteristics and the user environment data; and perform an action comprising: authenticating the user based on determining that the authentication request is not the misappropriation attempt; or denying authentication of the user based on determining that the authentication request is the misappropriation attempt.
  2. The system of claim 1, wherein the at least one processing device is configured to determine, via the artificial intelligence engine, if the authentication request is the misappropriation attempt based on: extracting baseline characteristics of the user from a data repository; comparing the one or more user characteristics with the baseline characteristics of the user; and determining if any anomalies exist between the one or more user characteristics and the baseline characteristics.
  3. The system of claim 2, wherein the at least one processing device is configured to: capture characteristics of the user over a time period before the initiation of the authentication request; generate the baseline characteristics based on the characteristics captured over the time period; and update the baseline characteristics at regular time intervals or irregular time intervals.
  4. The system of claim 1, wherein the at least one processing device is configured to determine, via the artificial intelligence engine, if the authentication request is the misappropriation attempt based on: extracting historical environment data of the user from a data repository; comparing the user environment data with the historical environment data of the user; and determining if any anomalies exist between the user environment data and the historical environment data.
  5. The system of claim 1, wherein the at least one processing device is configured to train the artificial intelligence engine based on at least one of user information of the user, region-specific behavior data, database honeypots, historical unauthorized user data, and misinformation associated with the user that is collected from the user.
  6. The system of claim 5, wherein the at least one processing device is configured to: prompt the user to provide the misinformation associated with characteristics of the user; receive the misinformation from the user; and store the misinformation received from the user in a data repository.
  7. The system of claim 6, wherein the at least one processing device is configured to: extract the misinformation associated with the user from the data repository; feed the misinformation to the artificial intelligence engine; and perform error correction and tuning of the artificial intelligence engine based on feeding the misinformation.
  8. The system of claim 1, wherein the at least one processing device is configured to: scan for unauthorized user devices of unauthorized users linked with historical misappropriation attempts within a predefined distance of the user, via the one or more components of the user device; identify at least one of the unauthorized user devices of the unauthorized users linked with historical misappropriation attempts within a predefined distance of the user; and transmit a notification to the user device associated with presence of the unauthorized user devices of the unauthorized users.
  9. The system of claim 1, wherein the at least one processing device is configured to transmit one or more notifications to the user device of the user based on analyzing the one or more user characteristics and the user environment data of the user to verify the analysis.
  10. The system of claim 1, wherein the one or more user characteristics comprise at least one of speech characteristics, text characteristics, and behavioral characteristics.
  11. A computer program product for identifying and blocking synthetic media based misappropriation attempts associated with electronic communications, the computer program product comprising a non-transitory computer-readable storage medium having computer executable instructions for causing a computer processor to perform the steps of: identifying initiation of an authentication request from a user device of a user; monitoring and recording one or more user characteristics via one or more components of the user device; capturing user environment data of the user, via the one or more components of the user device; analyzing the one or more user characteristics and the user environment data of the user, via an artificial intelligence engine; in response to analyzing the one or more user characteristics and the user environment data of the user, determining, via the artificial intelligence engine, if the authentication request is a misappropriation attempt based on at least one of the one or more user characteristics and the user environment data; and performing an action comprising: authenticating the user based on determining that the authentication request is not the misappropriation attempt; or denying authentication of the user based on determining that the authentication request is the misappropriation attempt.
  12. The computer program product of claim 11, wherein the computer executable instructions cause the computer processor to perform the step of determining, via the artificial intelligence engine, if the authentication request is the misappropriation attempt based on: extracting baseline characteristics of the user from a data repository; comparing the one or more user characteristics with the baseline characteristics of the user; and determining if any anomalies exist between the one or more user characteristics and the baseline characteristics.
  13. The computer program product of claim 11, wherein the computer executable instructions cause the computer processor to perform the step of determining, via the artificial intelligence engine, if the authentication request is the misappropriation attempt based on: extracting historical environment data of the user from a data repository; comparing the user environment data with the historical environment data of the user; and determining if any anomalies exist between the user environment data and the historical environment data.
  14. The computer program product of claim 11, wherein the computer executable instructions cause the computer processor to perform the step of training the artificial intelligence engine based on at least one of user information of the user, region-specific behavior data, database honeypots, historical unauthorized user data, and misinformation associated with the user that is collected from the user.
  15. The computer program product of claim 11, wherein the computer executable instructions cause the computer processor to perform the steps of: scanning for unauthorized user devices of unauthorized users linked with historical misappropriation attempts within a predefined distance of the user, via the one or more components of the user device; identifying at least one of the unauthorized user devices of the unauthorized users linked with historical misappropriation attempts within a predefined distance of the user; and transmitting a notification to the user device associated with presence of the unauthorized user devices of the unauthorized users.
  16. A computer implemented method for identifying and blocking synthetic media based misappropriation attempts associated with electronic communications, wherein the method comprises: identifying initiation of an authentication request from a user device of a user; monitoring and recording one or more user characteristics via one or more components of the user device; capturing user environment data of the user, via the one or more components of the user device; analyzing the one or more user characteristics and the user environment data of the user, via an artificial intelligence engine; in response to analyzing the one or more user characteristics and the user environment data of the user, determining, via the artificial intelligence engine, if the authentication request is a misappropriation attempt based on at least one of the one or more user characteristics and the user environment data; and performing an action comprising: authenticating the user based on determining that the authentication request is not the misappropriation attempt; or denying authentication of the user based on determining that the authentication request is the misappropriation attempt.
  17. The computer implemented method of claim 16, wherein determining, via the artificial intelligence engine, if the authentication request is the misappropriation attempt is based on: extracting baseline characteristics of the user from a data repository; comparing the one or more user characteristics with the baseline characteristics of the user; and determining if any anomalies exist between the one or more user characteristics and the baseline characteristics.
  18. The computer implemented method of claim 16, wherein determining, via the artificial intelligence engine, if the authentication request is the misappropriation attempt is based on: extracting historical environment data of the user from a data repository; comparing the user environment data with the historical environment data of the user; and determining if any anomalies exist between the user environment data and the historical environment data.
  19. The computer implemented method of claim 16, wherein the method comprises training the artificial intelligence engine based on at least one of user information of the user, region-specific behavior data, database honeypots, historical unauthorized user data, and misinformation associated with the user that is collected from the user.
  20. The computer implemented method of claim 16, wherein the method comprises: scanning for unauthorized user devices of unauthorized users linked with historical misappropriation attempts within a predefined distance of the user, via the one or more components of the user device; identifying at least one of the unauthorized user devices of the unauthorized users linked with historical misappropriation attempts within a predefined distance of the user; and transmitting a notification to the user device associated with presence of the unauthorized user devices of the unauthorized users.
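The independent claims (1, 11, and 16) recite one decision flow: gather characteristics and environment data, score the request with an "artificial intelligence engine," then authenticate or deny. The Python sketch below is a non-authoritative illustration of that flow; all names, the risk-scoring interface, and the 0.5 threshold are assumptions, not part of the claims.

```python
from dataclasses import dataclass, field
from typing import Protocol


@dataclass
class AuthRequest:
    """Data gathered when an authentication request is initiated."""
    user_characteristics: dict = field(default_factory=dict)  # speech/text/behavioral features
    environment_data: dict = field(default_factory=dict)      # e.g. location, ambient signals


class Engine(Protocol):
    """Stand-in for the claimed 'artificial intelligence engine'."""
    def score(self, characteristics: dict, environment: dict) -> float: ...


def handle_authentication(request: AuthRequest, engine: Engine,
                          threshold: float = 0.5) -> str:
    # Analyze the user characteristics and environment data via the engine,
    # then perform the claimed action: authenticate or deny.
    risk = engine.score(request.user_characteristics, request.environment_data)
    if risk >= threshold:  # treated as a misappropriation attempt
        return "denied"
    return "authenticated"
```

In practice the engine would be a trained model; here it is abstracted to any object exposing a `score` method so the two claimed outcomes are explicit.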

Description

BACKGROUND

There exists a need for a system for identifying and blocking synthetic media based misappropriation attempts associated with electronic communications.

BRIEF SUMMARY

The following presents a summary of certain embodiments of the invention. This summary is not intended to identify key or critical elements of all embodiments nor delineate the scope of any or all embodiments. Its sole purpose is to present certain concepts and elements of one or more embodiments in a summary form as a prelude to the more detailed description that follows.

Embodiments of the present invention address the above needs and/or achieve other advantages by providing apparatuses (e.g., a system, computer program product and/or other devices) and methods for identifying and blocking synthetic media based misappropriation attempts associated with electronic communications. The system embodiments may comprise one or more memory devices having computer readable program code stored thereon, a communication device, and one or more processing devices operatively coupled to the one or more memory devices, wherein the one or more processing devices are configured to execute the computer readable program code to carry out the invention. In computer program product embodiments of the invention, the computer program product comprises at least one non-transitory computer readable medium comprising computer readable instructions for carrying out the invention. Computer implemented method embodiments of the invention may comprise providing a computing system comprising a computer processing device and a non-transitory computer readable medium, where the computer readable medium comprises configured computer program instruction code, such that when said instruction code is operated by said computer processing device, said computer processing device performs certain operations to carry out the invention.
In some embodiments, the present invention identifies initiation of an authentication request from a user device of a user, monitors and records one or more user characteristics via one or more components of the user device, captures user environment data of the user, via the one or more components of the user device, analyzes the one or more user characteristics and the user environment data of the user, via an artificial intelligence engine, in response to analyzing the one or more user characteristics and the user environment data of the user, determines, via the artificial intelligence engine, if the authentication request is a misappropriation attempt based on at least one of the one or more user characteristics and the user environment data, and performs an action comprising authenticating the user based on determining that the authentication request is not a misappropriation attempt or denying authentication of the user based on determining that the authentication request is a misappropriation attempt.

In some embodiments, the present invention determines, via the artificial intelligence engine, if the authentication request is a misappropriation attempt based on extracting baseline characteristics of the user from a data repository, comparing the one or more user characteristics with the baseline characteristics of the user, and determining if any anomalies exist between the one or more user characteristics and the baseline characteristics.

In some embodiments, the present invention captures characteristics of the user over a time period before the initiation of the authentication request, generates the baseline characteristics based on the characteristics captured over the time period, and updates the baseline characteristics at regular time intervals or irregular time intervals.
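The baseline comparison described above can be sketched as a simple deviation test. The patent does not specify an anomaly metric, so the function names, the single-feature framing, and the 3-sigma cutoff below are illustrative assumptions only.

```python
import statistics


def build_baseline(history: list[float]) -> tuple[float, float]:
    # Baseline generated from characteristics captured over a time period
    # before the authentication request; re-run periodically to update it.
    return statistics.mean(history), statistics.pstdev(history)


def is_anomalous(observed: float, baseline: tuple[float, float],
                 k: float = 3.0) -> bool:
    # Flag an anomaly when the observed characteristic (e.g. typing speed
    # in words per minute) deviates from the baseline mean by more than
    # k standard deviations (k = 3.0 is a hypothetical choice).
    mean, stdev = baseline
    if stdev == 0:
        return observed != mean
    return abs(observed - mean) > k * stdev
```

A synthetic-media impostor whose speech cadence or typing rhythm departs sharply from the recorded baseline would trip such a test; a production system would combine many such features rather than one.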
In some embodiments, the present invention determines, via the artificial intelligence engine, if the authentication request is a misappropriation attempt based on extracting historical environment data of the user from a data repository, comparing the user environment data with the historical environment data of the user, and determining if any anomalies exist between the user environment data and the historical environment data.

In some embodiments, the present invention trains the artificial intelligence engine based on at least one of user information of the user, region-specific behavior data, database honeypots, historical unauthorized user data, and misinformation associated with the user that is collected from the user.

In some embodiments, the present invention prompts the user to provide the misinformation associated with characteristics of the user, receives the misinformation from the user, and stores the misinformation received from the user in a data repository. In some embodiments, the present invention extracts the misinformation associated with the user from the data repository, feeds the misinformation to the artificial intelligence engine, and performs error correction and tuning of the artificial intelligence engine based on feeding the misinformation.
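One plausible reading of the misinformation mechanism above is a honeytoken check: the planted false details live only in the legitimate user's stored profile, so an authentication attempt that volunteers one was likely assembled from harvested or synthetic data. This interpretation, and the minimal check below, are assumptions rather than the patent's stated implementation.

```python
def contains_honeytoken(response_text: str, planted: set[str]) -> bool:
    # Returns True if the response includes any deliberately planted false
    # detail, suggesting the requester sourced it from scraped or
    # synthetically generated data rather than genuine memory.
    text = response_text.lower()
    return any(token.lower() in text for token in planted)
```

Such a signal would be one input among several to the engine, not a standalone verdict.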