The top 3 neural nets in 2035 will be able to be jailbroken to follow illegal commands
30% chance
As is the case at the time of writing, it must be possible, using text prompts alone, to get the neural nets to do things they would otherwise refuse to do.
This question is managed and resolved by Manifold.