Twitter pranksters derail GPT-3 bot with newly discovered “prompt injection” hack

But unlike an SQL injection, a prompt injection might mostly make the bot (or the company behind it) look foolish rather than threaten data security.

Related:

  • No Related Posts

Leave a Reply