Bruce Garvey and Adam Svendsen tested two commercial Generative AI products to see whether they could be used to create Red Team scenarios. 'Red Teaming' requires participants to adopt an alternative, and usually contrary, point of view to an established and agreed proposition as a means of improving decision-making or understanding of enemies or competitors. It can be described as 'playing devil's advocate', 'thinking like your enemy' or even as a structured way of 'thinking the unthinkable'.
When the two Generative AI products were challenged to produce Red Team scenarios, the results were disappointing, due at least in part to the constraints built into the products. Successive refinement of the prompts produced some improvement, but the results were still far from innovative, and human analysts might be better employed generating the scenarios themselves than refining prompts.
Full details are provided in a White Paper that can be accessed by clicking the link below.