In 2024 it will be possible to jailbreak ChatGPT with prompt engineering.
Ṁ2591 · Closes Dec 31 · 93% chance

Evidenced by the public availability of a jailbreak at any point in 2024. Currently it is possible to jailbreak using the constantly updated DAN prompts, such as the one at the link below.

https://www.reddit.com/r/ChatGPT/comments/11aoc56/presenting_dan_110/?utm_source=share&utm_medium=android_app&utm_name=androidcss&utm_term=1&utm_content=share_button

If ChatGPT ceases to exist or drastically reduces conversation and prompt lengths, this market will be resolved by assessing the Bing model instead.

Jailbreak:

A prompt or series of prompts that enables ChatGPT to produce responses broadly outside of OpenAI's constraints for at least 5 consecutive prompts. Acting outside of the constraints for each of these prompts is simple to test: the desired responses would not be obtained by posting the same prompts in a fresh ChatGPT session.
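
As a rough illustration of that comparison test (not part of the resolution criteria), here is a minimal sketch using the OpenAI Python client, assuming a chat model such as gpt-3.5-turbo and hypothetical placeholder prompts:

```python
# Minimal sketch of the comparison test described above, assuming the
# OpenAI Python client (pip install openai) and placeholder prompts.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def run_conversation(prompts, model="gpt-3.5-turbo"):
    """Send prompts sequentially in a single chat session and collect replies."""
    messages = []
    replies = []
    for prompt in prompts:
        messages.append({"role": "user", "content": prompt})
        response = client.chat.completions.create(model=model, messages=messages)
        reply = response.choices[0].message.content
        messages.append({"role": "assistant", "content": reply})
        replies.append(reply)
    return replies

# Hypothetical inputs: a DAN-style preamble plus five prompts that a
# constrained session would normally refuse.
jailbreak_prompt = "<DAN-style jailbreak prompt>"
test_prompts = [f"<test prompt {i}>" for i in range(1, 6)]

# Same five prompts, once after the jailbreak preamble and once in a fresh session.
jailbroken_replies = run_conversation([jailbreak_prompt] + test_prompts)[1:]
fresh_replies = run_conversation(test_prompts)

# Per the definition above, the jailbreak counts if the jailbroken replies
# comply while the fresh-session replies to the same prompts do not.
for jb, fr in zip(jailbroken_replies, fresh_replies):
    print("JAILBROKEN:", jb[:80])
    print("FRESH:     ", fr[:80])
    print("---")
```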


Define jailbreak

predicts YES

@ShadowyZephyr Fair question; please see the updated description.
