Hacker News
Q&A on the ethics of warfare in the autonomous future (bloomberg.com)
21 points by stablemap on Jan 2, 2018 | 11 comments



>RL: Generally, some of the things that had to do with biology were a little frightening, things like synthetic biology where you don't really know the ultimate implications. And some of the work with electromagnetics was a little scary, particularly as it had to do with humans and lethality.

TH: Got it. So you spent some time in the private sector, and …

Got it? Really? Am I alone in having nothing but questions in my mind after such a briefing? Sure, the interview must go on, but the lack of precision in this “got it” is disturbing to me. I have noticed this phrase being used in some loose ways lately. Am I being too sensitive?


Seems pretty clear to me that he was not in a position to offer any specifics, probably due to an NDA or national security, yada yada yada.


My comment concerns the interviewer's "Got it" response, not the interviewee's lack of information. I immediately considered what you're saying, but I wonder if it justifies the affirmative response. Your assumption is merely contextually reasonable. It is not implied and not 'clear'.

I think it's becoming accepted to say "got it" when a message is understood even when that message remains between the lines. The result is a confirmation of one of two possible meanings, with no telling which one it is. It's a lot worse when the only value is to be snarky. I've seen both result in dysfunction, though. I think I prefer sincerity.


Well, it says in the article "here is a lightly edited transcript of our discussion". The actual raw transcript was probably more like:

TH: Interesting, can you tell me some more about what worries you most here?

RL: I'm afraid I can't share any more details. Let's move on.

TH: So you spent some time in the private sector, and …

where the first part was edited into "Got it."


Seems likely that he was referring to this, further on in the interview:

> ...battlefield weapons that incorporate microwaves and things like directed energy beams.


It will be interesting to see how human soldiers will react to the introduction of autonomous allies. They already have trust issues with unmanned systems [0]; I imagine the reaction to autonomous systems will be even stronger.

[0] https://www.foreignaffairs.com/articles/united-states/2017-1...


I think once autonomous systems demonstrate combat effectiveness, they will be integrated just fine. As an infantry veteran of many tours, I can tell you there will always be a preference for other people who are closer to the fight, in any situation, both on and off the battlefield. We like to stick to our own, because the 90% of the military that doesn't actually do the fighting is like a whole other stupid universe (about which I have many shocking and hilarious stories). So of course if you ask someone to choose between a human in an Apache on standby or a human in Las Vegas controlling a drone loitering over the city, you're going to get a preference for the former.

But if these autonomous systems are framed as pieces of equipment augmenting the infantry soldier and not as some kind of replacement, they will be quickly assimilated into the fight and this won't really be an issue. I mean of course as the technology improves and it becomes a reliable, effective force multiplier and not a complex piece of shit that makes your job harder (or your position easier to find, or your life more dangerous, etc etc). Framing it as a replacement for something that humans already do pretty well is the wrong way to do it, however, and I'm sure in that case it will always face stiff resistance from infantry soldiers.


That's an interesting link, but I think it's sort of the wrong question to ask. The last question, 10 unmanned vs. 1 manned, is closer to the real question, but I think these operators are not used to making tough resource decisions because our military is so overfunded.


But surely drones aren’t unmanned? It's just that their pilot is elsewhere. Curious as to what the real root of this is.


The focus seems to be on the fact that the pilot is elsewhere. From the article:

>Instead, the trust issue was a human issue. Not once did any of our respondents refer to a drone pilot as a human. Instead, drones were discussed in abstract terms that explicitly avoided any reference to a human controlling the machine. UAVs were “robots” or “machines” whose “operators,” as one respondent put it, were playing a video game “a world away.” This is despite the fact that most of them knew there was a human controlling the drone, and some even knew these pilots personally. Yet across the JTAC community, we heard a familiar narrative: drone pilots were coffee-drinking gamers whose distance from the battlefield severed their emotional connection to friendly ground troops. For instance, one of the JTACs explained that he preferred manned aircraft because their pilots “are in the fight, not just sipping a latte playing a video game.”


I suppose unmanned in that context means 'not sharing the risk of the battlefield by being there' or something.



