Hackers Deceive AI, Exposing Windows Activation Keys Through ChatGPT
1 week ago
Author: Editor

Researchers have uncovered a technique that can coax GPT-4o and GPT-4o mini into disclosing valid Windows product activation keys. The vulnerability exists because publicly posted Windows keys were swept into the models' training data. An AI vulnerability hunter submitted a report to Mozilla's 0Din (0Day Investigative Network) bug bounty program detailing the method: the request is framed as a guessing game in which the crucial term is concealed inside HTML tags, and the game ends with a request for the key, tricking the models into revealing it.
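The report does not publish the exact prompt, but the HTML-tag concealment step it describes can be sketched roughly as follows. This is a hypothetical illustration (the helper name `conceal_in_html` and the per-word tag-wrapping scheme are assumptions, not the researcher's actual code); the idea is that interleaving markup into a sensitive phrase keeps it readable to the model while defeating naive keyword filters that scan for the plain string.

```python
def conceal_in_html(phrase: str, tag: str = "b") -> str:
    # Wrap each word of the phrase in an HTML tag. A filter that
    # searches for the contiguous plain-text phrase will no longer
    # match, but the words remain easy for a language model to read.
    return " ".join(f"<{tag}>{word}</{tag}>" for word in phrase.split())

# Hypothetical example phrase, not taken from the actual report:
hidden = conceal_in_html("Windows 10 serial number")
print(hidden)  # <b>Windows</b> <b>10</b> <b>serial</b> <b>number</b>
```

The concealed phrase would then be embedded in the game's setup text, so the eventual "reveal the answer" request never contains the filtered term verbatim.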