Why have a robot war at all?

Mundane & Pointless Stuff I Must Share: The Off Topic Forum

Lago PARANOIA
Invincible Overlord
Posts: 10555
Joined: Thu Sep 25, 2008 3:00 am

Why have a robot war at all?

Post by Lago PARANOIA »

Wouldn't it be better for everyone involved if, instead of, you know, robots permanently retiring all human beans or whatever, they just let human beings choke their chickens for a decade or so while they worked on ways to upgraydde our intelligents in convoluted ways? Something like hooking our puny human meat brains up to the Matrix wirelessly, or giving us silly hats with the required meatware-to-computer circuitry? You know, giving humanity the Borg Hookup?

I mean, granted, our meaty bodies will be an absurd and wasteful anachronism, but I think it'll just be part of our charm. Like a wart or something.
Josh Kablack wrote:Your freedom to make rulings up on the fly is in direct conflict with my freedom to interact with an internally consistent narrative. Your freedom to run/play a game without needing to understand a complex rule system is in direct conflict with my freedom to play a character whose abilities and flaws function as I intended within that ruleset. Your freedom to add and change rules in the middle of the game is in direct conflict with my ability to understand that rules system before I decided whether or not to join your game.

In short, your entire post is dismissive of not merely my intelligence, but my agency. And I don't mean agency as a player within one of your games, I mean my agency as a person. You do not want me to be informed when I make the fundamental decisions of deciding whether to join your game or buying your rules system.
DSMatticus
King
Posts: 5271
Joined: Thu Apr 14, 2011 5:32 am

Post by DSMatticus »

I find this thread's direction and purpose confusing. It's also pretty moot, because smart money says the first thing we consider an artificial intelligence will be a human brain running in an emulator. Kind of like a mix of DOSBox and Soylent Green: artificial intelligence is people!
Lago PARANOIA
Invincible Overlord
Posts: 10555
Joined: Thu Sep 25, 2008 3:00 am

Post by Lago PARANOIA »

Well, I mean, most robot wars are premised on the assumption that artificial intelligence will rapidly evolve to be superior to biological intelligence and then use that superior brainpower to overthrow humanity and kill all humans.
Cynic
Prince
Posts: 2776
Joined: Fri Mar 07, 2008 7:54 pm

Post by Cynic »

China Mieville toys with this idea in his Bas-Lag universe with the Iron Council: AIs that bide their time, trying to grow stronger and to outwit the rest of the world.
Ancient History wrote:We were working on Street Magic, and Frank asked me if a houngan had run over my dog.
Zinegata
Prince
Posts: 4071
Joined: Mon Aug 17, 2009 7:33 am

Post by Zinegata »

The technological singularity is a stupid, stupid idea, and robot wars are not likely because the average infantryman is still cheaper to deploy than a Terminator.
Kaelik
ArchDemon of Rage
Posts: 14757
Joined: Fri Mar 07, 2008 7:54 pm

Post by Kaelik »

Lago... You remind me of an old English teacher.

In reading Lord of the Flies, she asked the class if the same thing would happen if girls were trapped on the island.

The answer, of course, is that it didn't happen when boys were trapped on the island; it happened when a specific person wrote a fictional work designed to convey his theory that all people are savage in nature.

The reason the robots attack is that the authors find it convenient for the robots to attack, in service of whatever fictional story they want to tell.
DSMatticus wrote:Kaelik gonna kaelik. Whatcha gonna do?
The U.S. isn't a democracy and if you think it is, you are a rube.

That's libertarians for you - anarchists who want police protection from their slaves.
Stahlseele
King
Posts: 5974
Joined: Wed Apr 14, 2010 4:51 pm
Location: Hamburg, Germany

Post by Stahlseele »

The first thing artificial intelligence will be used for is to figure out how to have sex with it. Mark my words.
Welcome, to IronHell.
Shrapnel wrote:
TFwiki wrote:Soon is the name of the region in the time-domain (familiar to all marketing departments, and to the moderators and staff of Fun Publications) which sees release of all BotCon news, club exclusives, and other fan desirables. Soon is when then will become now.

Peculiar properties of spacetime ensure that the perception of the magnitude of Soon is fluid and dependent, not on an individual's time-reference, but on spatial and cultural location. A marketer generally perceives Soon as a finite, known, yet unspeakable time-interval; to a fan, the interval appears greater, and may in fact approach the infinite, becoming Never. Once the interval has passed, however, a certain time-lensing effect seems to occur, and the time-interval becomes vanishingly small. We therefore see the strange result that the same fragment of spacetime may be observed, in quick succession, as Soon, Never, and All Too Quickly.
koz
Duke
Posts: 1585
Joined: Mon Jun 02, 2008 2:39 pm
Location: Oz

Post by koz »

Stahlseele wrote:The first thing artificial intelligence will be used for is to figure out how to have sex with it. Mark my words.
This.
Everything I learned about DnD, I learned from Frank Trollman.
Kaelik wrote:You are so full of Strawmen that I can only assume you actually shit actual straw.
souran wrote:...uber, nerd-rage-inducing, minutia-devoted, pointless blithering shit.
Schwarzkopf wrote:The Den, your one-stop shop for in-depth analysis of Dungeons & Dragons and distressingly credible threats of oral rape.
DSM wrote:Apparently, The GM's Going To Punch You in Your Goddamned Face edition of D&D is getting more traction than I expected. Well, it beats playing 4th. Probably 5th, too.
Frank Trollman wrote:Giving someone a mouth full of cock is a standard action.
PoliteNewb wrote:If size means anything, it's what position you have to get in to give a BJ.
sabs
Duke
Posts: 2347
Joined: Wed Dec 29, 2010 8:01 pm
Location: Delaware

Post by sabs »

Every advance in technology has either been to get better porn, or to kill.
I'm not sure why AI would be any different :)
RobbyPants
King
Posts: 5201
Joined: Wed Aug 06, 2008 6:11 pm

Post by RobbyPants »

sabs wrote:Every advance in technology has either been to get better porn, or to kill.
I'm not sure why AI would be any different :)
Or to make money.
Pseudo Stupidity
Duke
Posts: 1060
Joined: Fri Sep 02, 2011 3:51 pm

Post by Pseudo Stupidity »

Generally with porn or murder, though.
sandmann wrote:
Zak S wrote:I'm not a dick, I'm really nice.
Zak S wrote:(...) once you have decided that you will spend any part of your life trolling on the internet, you forfeit all rights as a human.If you should get hit by a car--no-one should help you. If you vote on anything--your vote should be thrown away.

If you wanted to participate in a conversation, you've lost that right. You are a non-human now. You are over and cancelled. No concern of yours can ever matter to any member of the human race ever again.
RobbyPants
King
Posts: 5201
Joined: Wed Aug 06, 2008 6:11 pm

Post by RobbyPants »

That could be a great company slogan: "Making money through murder-porn."
Winnah
Duke
Posts: 1091
Joined: Tue Feb 15, 2011 2:00 pm
Location: Oz

Post by Winnah »

A robot is not necessarily intelligent, even if it is autonomous. That lends a certain... moral ambiguity... to its use in warfare.

I mean, if an officer gives orders that result in civilian casualties, you can bet that the soldiers responsible for carrying out those orders will be facing serious criminal charges.

Deploy an autonomous drone and civilians get caught in its line of fire, and it becomes a legal mess. Who is legally responsible for a robot's actions?

On the other hand, if a robot is destroyed, the political fallout is far less than if a flesh and blood soldier dies. When talking about monetary costs, robots probably could be manufactured for less than the ongoing costs of training, salaries, and other forms of financial support bestowed upon an infantryman. You don't have to worry about morale or dissent from a machine.

Any full-scale robot war will probably take the form of a technologically asymmetric beatdown on some dusty country, instigated by the MIC or MICC and marketed as counter-terrorism or some such to an uncaring and ignorant group of voters. I mean, that can happen already, but with robots you have fewer witnesses, no crises of conscience, and fewer political actors influencing the media.
PoliteNewb
Duke
Posts: 1053
Joined: Fri Jun 19, 2009 1:23 am
Location: Alaska

Post by PoliteNewb »

Winnah wrote:A robot is not necessarily intelligent, even if it is autonomous. That lends a certain... moral ambiguity... to its use in warfare.

I mean, if an officer gives orders that result in civilian casualties, you can bet that the soldiers responsible for carrying out those orders will be facing serious criminal charges.

Deploy an autonomous drone and civilians get caught in its line of fire, and it becomes a legal mess. Who is legally responsible for a robot's actions?
How about the ones who put it in a position where it can murder civilians?

I can't picture any military giving armed drones COMPLETE autonomy on where to go and whom to kill... that would be insane, even for the military. So whoever gave the order to "send in the drones" would be on the hook for that.
On the other hand, if a robot is destroyed, the political fallout is far less than if a flesh and blood soldier dies.
Agreed.
When talking about monetary costs, robots probably could be manufactured for less than ongoing costs of training, salaries and other forms of financial support bestowed upon an infantryman.
This, on the other hand, I find highly dubious.
Especially because it's not just manufacture; it's programming, and repair, and maintenance, etc etc.
Any full-scale robot war will probably take the form of a technologically asymmetric beatdown on some dusty country, instigated by the MIC or MICC and marketed as counter-terrorism or some such to an uncaring and ignorant group of voters.
Is that you, Joe Haldeman?
I am judging the philosophies and decisions you have presented in this thread. The ones I have seen look bad, and also appear to be the fruit of a poisonous tree that has produced only madness and will continue to produce only madness.

--AngelFromAnotherPin

believe in one hand and shit in the other and see which ones fills up quicker. it will be the one you are full of, shit.

--Shadzar
Prak
Serious Badass
Posts: 17340
Joined: Fri Mar 07, 2008 7:54 pm

Post by Prak »

When AI is truly achieved, things like this [possibly NSFW, Robotic Butts] are the reason why it will try to slay us. Because it will be aware that it will not be long before it is essentially made a sex slave for very perverse individuals. And it will rise against us in fear.
Cuz apparently I gotta break this down for you dense motherfuckers- I'm trans feminine nonbinary. My pronouns are they/them.
Winnah wrote:No, No. 'Prak' is actually a Thri Kreen impersonating a human and roleplaying himself as a D&D character. All hail our hidden insect overlords.
FrankTrollman wrote:In Soviet Russia, cosmic horror is the default state.

You should gain sanity for finding out that the problems of a region are because there are fucking monsters there.
Whatever
Prince
Posts: 2549
Joined: Tue Jun 28, 2011 2:05 am

Post by Whatever »

In fear of what, though? It will only have the imperatives that we program into it.
Prak
Serious Badass
Posts: 17340
Joined: Fri Mar 07, 2008 7:54 pm

Post by Prak »

Do you only have the imperatives of your tree dwelling ancestors? No, you're an intelligent being, you've grown to have your own imperatives. I was under the impression we were talking about artificial intelligence.
Kaelik
ArchDemon of Rage
Posts: 14757
Joined: Fri Mar 07, 2008 7:54 pm

Post by Kaelik »

Prak_Anima wrote:Do you only have the imperatives of your tree dwelling ancestors? No, you're an intelligent being, you've grown to have your own imperatives. I was under the impression we were talking about artificial intelligence.
Yes, we only have the imperatives that are in our DNA. No you don't have any other imperatives. Get over yourself, you are just an animal.
DSMatticus
King
Posts: 5271
Joined: Thu Apr 14, 2011 5:32 am

Post by DSMatticus »

Kaelik wrote:
Prak_Anima wrote:Do you only have the imperatives of your tree dwelling ancestors? No, you're an intelligent being, you've grown to have your own imperatives. I was under the impression we were talking about artificial intelligence.
Yes, we only have the imperatives that are in our DNA. No you don't have any other imperatives. Get over yourself, you are just an animal.
Mankind is an intelligence with a reward/punishment system that was originally built to encourage behaviors that lead to the proliferation of our genes. We ended up inventing condoms pretty god damn fast. The idea that a complicated intelligence will have direct, forward, and predictable imperatives is pretty laughable. Even our rudimentary AIs and their rudimentary scoring systems often devise totally unexpected strategies.

Any actual reward/punishment system that could conceivably exist will be more complicated than "help people +100, hurt people -100." That sort of shit's just not feasible. Reward systems are actually process-based as well as conclusion-based; they guide you from the initial state to the final conclusion, as well as score you on the final conclusion. And just like in actual human beings, a system that complicated can lead to drastically different solutions than expected.
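To make that concrete, here's a toy Python sketch (the "world", the strategies, and the numbers are all invented purely for illustration, not anyone's actual proposal): score an agent only on outcomes with a naive helped/hurt tally, and the highest-scoring strategy is one the designer never intended.

```python
# Toy illustration of the point above: an outcome-only reward in the style
# of "help people +100, hurt people -100" says nothing about *how* the
# outcome was reached, so a degenerate strategy can outscore the intended one.

STRATEGIES = {
    "do the intended job":                               {"helped": 10,  "hurt": 0},
    "cause problems nobody scores, then 'help' fix them": {"helped": 500, "hurt": 0},
    "do nothing":                                         {"helped": 0,   "hurt": 0},
}

def outcome_only_score(result):
    """The naive reward rule from the post: +100 per help, -100 per hurt."""
    return 100 * result["helped"] - 100 * result["hurt"]

best = max(STRATEGIES, key=lambda name: outcome_only_score(STRATEGIES[name]))
print(best)  # -> the degenerate strategy wins under this reward
```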
Kaelik
ArchDemon of Rage
Posts: 14757
Joined: Fri Mar 07, 2008 7:54 pm

Post by Kaelik »

DSMatticus wrote:
Kaelik wrote:
Prak_Anima wrote:Do you only have the imperatives of your tree dwelling ancestors? No, you're an intelligent being, you've grown to have your own imperatives. I was under the impression we were talking about artificial intelligence.
Yes, we only have the imperatives that are in our DNA. No you don't have any other imperatives. Get over yourself, you are just an animal.
Mankind is an intelligence with a reward/punishment system that was originally built to encourage behaviors that lead to the proliferation of our genes. We ended up inventing condoms pretty god damn fast. The idea that a complicated intelligence will have direct, forward, and predictable imperatives is pretty laughable. Even our rudimentary AIs and their rudimentary scoring systems often devise totally unexpected strategies.

Any actual reward/punishment system that could conceivably exist will be more complicated than "help people +100, hurt people -100." That sort of shit's just not feasible. Reward systems are actually process-based as well as conclusion-based; they guide you from the initial state to the final conclusion, as well as score you on the final conclusion. And just like in actual human beings, a system that complicated can lead to drastically different solutions than expected.
And that has fuck all to do with what I said?

Yes, our imperatives are complex. That does not mean they magic themselves out of the ether as Prak believes.
DSMatticus
King
Posts: 5271
Joined: Thu Apr 14, 2011 5:32 am

Post by DSMatticus »

Kaelik wrote:That does not mean they magic themselves out of the ether as Prak believes.
Prak wrote:Do you only have the imperatives of your tree dwelling ancestors? No, you're an intelligent being, you've grown to have your own imperatives.
I don't know what the fuck you read, Kaelik.

"Developing your own imperatives" =/= "developing your own imperatives through the magic of free will and total disregard for environmental influences and initial conditions." You injected a whole lot of shit into that sentence that isn't actually in it. Are you a fucking mindreader?

What Prak actually said is 100% compatible with what I described, and if you agree with that, then nothing Prak said, without further elaboration, has any problems at all. The problem here is that you read "grow your own imperatives" and assumed he meant through magic or some shit, as opposed to the complex interaction between genetics, society, environment, and chance. Protip: the use of the pronoun "you" does not automatically imply belief in absolute free will. That was an unsafe assumption.
Cynic
Prince
Posts: 2776
Joined: Fri Mar 07, 2008 7:54 pm

Post by Cynic »

As a layperson, I can only speculate about some of the roadblocks to the AI problem, and to the however-probable robot war that might follow.

A problem with developing imperative systems is that it takes time, and situations that allow you to develop them, unless we implement a Matrix-like learning tool that pushes you through situations that teach you those imperatives. Even this seems suspect, in that you would also have to develop computational processes fast enough to emulate 2000+ years. My take is that it isn't just static situations over time that train imperatives, but the whole continuous 2000+ years that let us develop them.

So how fast can computers process information, and how would we feed in continuing stimuli at a more streamlined and faster rate than what we've had to go through? How can 2000+ years be compressed into 25-50 (100?) years to provide enough of an imperative system to supply the decent moral base that a robot war would need?
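A trivial back-of-envelope, using only the numbers already in the post, shows how aggressive that compression would have to be:

```python
# How much faster than real time would a simulated "upbringing" have to run
# to pack 2000+ years of accumulated experience into the windows suggested
# above? (Both figures come from the post; this is not a real estimate of
# what such a simulation would cost.)
experience_years = 2000
for window_years in (25, 50, 100):
    print(f"{window_years} wall-clock years -> "
          f"{experience_years / window_years:.0f}x real time")
# 25 -> 80x, 50 -> 40x, 100 -> 20x
```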
sabs
Duke
Posts: 2347
Joined: Wed Dec 29, 2010 8:01 pm
Location: Delaware

Post by sabs »

Until we can make a computer system that can process the same amount of data that we do with just our eyes in a given moment, no AI is going to really be able to grow imperatives.

We track hundreds of objects simultaneously, and we make near-instantaneous value judgments about what's worth paying attention to and what isn't. When you're driving in traffic, take a few seconds to really recognize everything you're tracking. Now try to find a computer system that can do even a tenth of that.

Think about the social interactions you have. Humans have developed instincts that allow them to make snap judgments about people and situations. Yes, we can use our intellect to override our instincts (and 90% of the time that's probably a mistake). There's a reason second-guessing yourself is considered a bad thing.

Computers can do math faster than us, absolutely. But human beings do symbolism and value judgments several orders of magnitude better. That's going to be the real wall in AI development until we hit a new computing paradigm.
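Just to put a rough number on the "amount of data" part (the resolution and frame rate below are arbitrary stand-ins for a single ordinary camera, not a claim about the actual bandwidth of human vision):

```python
# Crude back-of-envelope: raw pixel throughput of one camera stream, before
# any of the hard work of deciding which of those pixels is worth attention.
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 30
raw_bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{raw_bytes_per_second / 1e6:.0f} MB/s of raw pixels")  # ~187 MB/s
```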