The conflict could be a formal military engagement, but also a civil uprising or riot; I'm using a broad definition of "conflict", though there must be a field of engagement. The weapons platform could be a drone, a vehicular weapons targeting and firing system, a missile that chooses targets and launches itself, a self-guided ground assault device, or any number of other things.
Must be:
- "Autonomous" - able to act without direct human action, and do so in a combat context. It can have oversight and be interruptible by human operators, but must not require them. It must also be physically detached from human soldiers (no infantry firearms, and no supplemental gear that only functions while attached to infantry helmets, packs, armor, etc.), though it can be embedded in vehicles.
- "Lethal" - equipped with lethal armaments intended for use on human targets, not simply a support or nonlethal pacification robot.
- "Deployed" - present on the field of engagement in a nonreserve capacity.

Static explosive devices like mines and claymores don't count.

The system doesn't need to actually fire or use its lethal tools against human targets, though they must be equipped while deployed.
Resolves YES if the criteria are met at any point after question creation and reported via verifiable media. Resolves NO shortly after close otherwise.
Longer term version:
Loosely related (these are not lethal and only semi-autonomous), but Canada just announced it's sending 800 R70 drones to Ukraine, so it feels like things are advancing in terms of AI/autonomous devices on the battlefield.
Ah! One nitpick: I noticed my original description specifies that the deployment must happen after question creation, so I can't actually resolve yet. Sorry for the scare, but I will resolve if evidence appears that this happens any time from now since this tech seems to exist and has seen some previous deployment. @traders
@Stralor Ahh I see, that's fair enough.
What level of specifics need to be present in the reporting for it to be sufficient evidence?
For instance the Israeli forces currently have Elbit Systems' LANIUS quadcopter drones in their arsenal, among others, which are capable of carrying lethal munitions.
There have been attacks from Israeli forces using swarms of quadcopters in Gaza, however the models used are unconfirmed https://www.middleeasteye.net/news/war-gaza-israeli-quadcopters-hi-tech-weapon-menacing-palestinian-civilians
On Elbit Systems' Legion-X AM-PM swarm control software description page, they state that it is battle tested and in operational use worldwide https://elbitsystems.com/product/legion-x-am-pm/
Personally I feel like this is sufficient evidence, however there doesn't seem to be anything that explicitly states that the AM-PM software was/is controlling lethal drones, or that the drones that carried/carry out attacks are fully autonomous.
What do you think?
@LiamNewittClark hmm, looks like they have navigation and sensor autonomy, but nothing I saw suggested they have attack autonomy. it all describes requiring a human operator nearby and/or planning by humans. but clearly this tech is right on the cusp
@Stralor Could you clarify your position on loitering munitions/drones that have autonomous targeting and decide to strike once in an engagement area, but need to be manually launched and guided to/assigned to that area? It seems those have been deployed, so I wonder if this resolves YES just based on confirmation of such deployments, or whether those systems would not count.
@MartinModrak they would! it seems like Liam below posted a good article saying how this already happened 😱 I think I have to resolve both of these markets YES and may do so later
This already happened 3 years ago: https://www.npr.org/2021/06/01/1002196245/a-u-n-report-suggests-libya-saw-the-first-battlefield-killing-by-an-autonomous-d
@LiamNewittClark thanks for the link! I agree that's decent evidence. I'd prefer more details than that article gives, but I believe it's enough to resolve on if it happens again
@A16ab interesting argument. I suppose from my view launching itself is important in some cases (like missiles), but not others. I wouldn't expect a vehicular turret to drive the vehicle, for example. The important part for me is that they can decide when and where to kill without human direction. I'm happy to offset your losses.
@Stralor Hmmm, would there ever be such a weapon with zero human oversight? Does something like Iron Dome or CIWS/C-RAM that fires automatically count?
@Snarflak yep! those qualify on the "autonomous" requirement, though ofc not on the "lethal"
@Snarflak it's not clear to me that this happens automatically, since from what I read these systems don't recognize IFF. am I wrong?
@Stralor I would guess it's not automatic against boats because they're slow enough to be manually confirmed, but I don't know.
@ML A WWII mine doesn't qualify as autonomous; there is no FoF or self-determined target selection. This was already covered in the description, btw:
Static explosive devices like mines and claymores don't count.
And thank you for the suggestion. "Mobile" is a good yardstick, but not a hard requirement. For example, an auto-targeting and auto-firing turret emplacement would be on the line and could qualify, pending further scrutiny and context.
spotted this from a couple of days ago, and while the tests mentioned were last year, I would imagine with all of the current conflicts it's pretty likely they'll deploy in 2024
Ukrainians claim they deployed such a system https://twitter.com/PStyle0ne1/status/1750414280350912997 , and I've heard (no link right now) that Russians claim this as well. It will probably take time to clear up the specifics (e.g. does the system require that a human guides the drones to the area of action?)