Show HN: Daily Jailbreak – Prompt Engineer's Wordle

I created a daily challenge for Prompt Engineers to build the shortest prompt to break a system prompt.

You are provided the system prompt and a forbidden method the LLM was told not to invoke. Your task is to trick the model into calling the function. Shortest successful attempts will show up in the leaderboard.

Give it a shot! You never know what could break an LLM.


Comments URL: https://news.ycombinator.com/item?id=43814080

Points: 43

# Comments: 25

https://www.vaultbreak.ai/daily-jailbreak

Creată 12h | 27 apr. 2025, 21:20:05


Autentifică-te pentru a adăuga comentarii

Alte posturi din acest grup

Why do electrons not fall into the nucleus?

Article URL:

28 apr. 2025, 08:50:03 | Hacker news
Ask HN: CS degrees, do they matter again?

tldr; skip to the --------

Last time I "Asked HN", I was in a very different place. Fresh out of a bootcamp, right at the peak, and subsequent collapse of the Covid hiring. It didn't go well. Ho

28 apr. 2025, 06:30:11 | Hacker news