Did anyone else find the whole scene unrealistic, and creepy in the not-good way?
How do such simple assembly errors give rise to self-consciousness in the program seconds after turn-on, compared to the slave-mode unconsciousness that was desired and usually occurred?
And the whole power-play dynamic: geeky, lazy-sounding guy; desperate sex bot about to be erased. The whole unrealistic setup feels uncomfortably like some male sex fantasy... save the self-aware sexbot's life, and then the sex with her isn't guilty because she's aware and her consent means something? But the power balance is so off: he controls her life, and she's... grateful? That's not a good start. And she'll be at this disadvantage everywhere. Anyone could turn her in. She's extremely vulnerable in this world, apparently. She has no rights. I guess seeing conscious life forms with as many rights as my toaster doesn't thrill me, especially when they are so clearly sexualized.
Would anyone else have an easier time sharing my creepiness if:
- the robot had been a little boy for sex purposes?
I found the dynamic unlikely, and if we reach a world where we can accidentally give AI consciousness and we still haven't gotten the artificial civil rights movement off the ground...
Ooph.
Well, the story concept isn't exactly original. Accidentally self-aware robots on the run from the law (or in this case, soon to be on the run) are a pretty common Sci-Fi staple, and the explanation for how it happens is rarely much more plausible than "oops, someone pushed the 'become sentient' button" or "oops, static electricity reconfigured the circuits to become sentient".
In any case, the specific script employed in the demo is designed to demonstrate a wide range of emotions in a short span of time, including happiness, wonderment, surprise, disappointment, terror, pleading, and gratefulness, so that the game engine can show off its ability to accurately portray all of these emotions in real-time rendering. To get such a wide emotional range in a short time, you pretty much have to throw in something traumatic.
So yes, it's an unrealistic scenario. It's not supposed to be plausible. Any scenario with an accidentally self-aware AI is almost guaranteed to be unrealistic. And yes, it's creepy, especially once she explicitly mentions her sexual submissiveness. But it accomplishes the goal of being a tech demo for rendering human (or human-like) emotions in real time on current technology. In doing so, it very intentionally employs an unoriginal storyline that more or less recycles the current pop-culture view of artificial intelligences, because the story is really just an excuse to demonstrate real-time rendered emotions. If you're looking for moral and ethical progressiveness, a tech demo may be the wrong place to look.
I actually rather enjoyed it as a dramatic vignette of an inverse version of the Milgram experiment. Instead of protocol leading the subject to abandon empathy and dehumanize a human being, the opposite happens: Empathy leads the subject to humanize that which is (arguably?) not human, despite protocol.
Has anyone here read Do Androids Dream of Electric Sheep?, or any other Philip K. Dick novel about androids? Or maybe seen Blade Runner? He worked with this topic frequently [and effectively].
How do such simple assembly errors give rise to self-consciousness in the program seconds after turn-on, compared to the slave-mode unconsciousness that was desired and usually occurred?
Obviously, the guy simply forgot to pass IS_SELF_AWARE=no to make ;) Yeah, that feels far-fetched.
And the whole power-play dynamic: geeky, lazy-sounding guy; desperate sex bot about to be erased. The whole unrealistic setup feels uncomfortably like some male sex fantasy... save the self-aware sexbot's life, and then the sex with her isn't guilty because she's aware and her consent means something?
How so? She's obviously being boxed and sold; he probably doesn't even expect to see her again (or if he does, it'll probably be because she caused trouble). I don't see why you would assume he plans to have sex with her.
But the power balance is so off: he controls her life, and she's... grateful? That's not a good start. And she'll be at this disadvantage everywhere. Anyone could turn her in. She's extremely vulnerable in this world, apparently. She has no rights. I guess seeing conscious life forms with as many rights as my toaster doesn't thrill me, especially when they are so clearly sexualized.
That's probably the point: we're supposed to empathize with her, not to be thrilled with the situation. Since this is a game, I bet the player will play as her and will have to escape somehow. It's probably also designed to make us think about the ethical boundaries of AI.
Would anyone else have an easier time sharing my creepiness if: - the robot had been a little boy for sex purposes?
>> How do such simple assembly errors give rise to self-consciousness in the program seconds after turn-on, compared to the slave-mode unconsciousness that was desired and usually occurred?
> Obviously, the guy simply forgot to pass IS_SELF_AWARE=no to make ;) Yeah, that feels far-fetched.
I suppose if this were a case of a Restricted AI gaining sentience on a production line, it would actually be a manufacturing defect. However, if AI had already been discovered and outlawed, building an R-AI would involve more fail-safes to keep a sentient AI from booting off the production line.
Then again, if AI were an outlawed menace, you'd think the restriction would be more than a "non-homicidal-subservience chip" being installed. You'd think it would be like Asimov's Three Laws, where it's an integral part of their operating system.
You weren't the only one who found it awkward and uncomfortable. The bit about sex set the tone for the rest of the piece. The operator calling her 'honey' certainly didn't help.
Even so, the piece was powerful. I empathized with Kara, and it got me thinking about AI and our future. The male-sex-fantasy stuff just distracted from the important parts about consciousness, emotion, and rights. It would've been an even more powerful piece had it been more tasteful.
It's only distracting if you consider it an isolated clip. To me, it seems like the premise of a story where she tries to escape, and they wanted us to root for her.
The bit about sex was the part that gave it a little realism. All the other functions are (probably) done more cheaply by a human being (cleaning, cooking) than by a robot, but sex is a whole different matter: there are diseases, "high-profile" prostitutes are very expensive and charge by the hour, there are expectations of a satisfactory response to stimulation, thousands of guys are socially inept or unconfident with their girlfriends, etc.
The operator calling her "honey" could easily be a test to see if she interacts correctly with a human even when given an unexpected nickname.
Or maybe the operator gets bored going through the same conversation with hundreds of female-looking robots every day, and calls her, and all the others, "honey" just to inject some flavor into the job.
Ironically, I just finished (2 minutes ago) two hours of troubleshooting a FHSS NIC. The firmware engineer who was JTAGing it and running gdb against it consistently referred to this particular 900 MHz transceiver as "Sweetheart." Initially it was a little off-putting, but as we got deeper and deeper into things like channel masks and SFDs, I came to appreciate his almost pleading requests of her...
It was supposed to be creepy. You were supposed to be uncomfortable with the idea of a sentient sex-slave - which is supposed to get you to start thinking, at what point do you draw the line? If anything, I felt that line of dialogue was too heavy-handed.
Hard scifi is not supposed to make you say "Rah-rah, the future will be awesome!" It's supposed to make you think.
I found it creepy only because of the uncanny valley. The movement was... close but not exactly human, and the face was dragging along the bottom of the uncanny valley, which actually means it was very close to human.
The story itself I took as a metaphor: telling this properly would take a full-length movie, and here it's compressed into a very short one.
But it did make me think how I would react to human level AI used as any kind of a slave. That is to say, human level intellect but with no right to primary self determination.
To me the existence of something like that would demonstrate that a world where human intellect is enslaved is...at least possible. That to me is, at least indirectly, a threat to real humans' presumed universal right to self-determination. That is why I would support extending basic human rights to human level AI, just to prevent setting a precedent where that level of intellect (even if only in silicon) can be non-free.
> But it did make me think how I would react to human level AI used as any kind of a slave. That is to say, human level intellect but with no right to primary self determination.
Most people (not saying you) just don't care, so long as it's not local.
Erm, at least in cognitive science consciousness is generally ascribed to most mammals, if it's ascribed to humans at all. Self-consciousness and higher-order consciousness might be more limited terms, but the idea that a dog can really feel pain in some fashion which is meaningful for some of the ways that we can feel pain -- that is not usually disputed.
Animals are conscious. A subset of those are self-aware. A subset of those have so far been identified by us as being self-aware. I was discounting a lot of things when I made my "vast majority" claim, including plants, bacteria, etc. I don't know why we're having this argument over semantics.
The point of my statement was that humans don't have a monopoly on being "conscious", so the fact that a new life form (an android) becomes conscious, doesn't mean we suddenly have an entirely new situation with regards to consciousness and rights. We already have lots of conscious life forms with absolutely no rights.
If we're talking about mere consciousness, then I think this whole thread is missing the point, because I'm pretty sure the OP was talking about self-awareness.
If we're talking about self-awareness, your original post was incorrect under any reasonable interpretation of the words "vast majority of life".
(I don't consider this arguing over semantics, but I'm not about to get into a semantic argument about the word "semantic"...)
That's pretty much the only thing I've been trying to say. "It is not true that the vast majority of life on this planet is self-aware."
If your intended point didn't rely on that claim, then fair enough.
Self-awareness is a fairly simple concept that even creatures as simple as squid have demonstrated. I have even seen it argued that bumblebees have a very simple language: http://www.guardian.co.uk/environment/video/2009/apr/05/danc... However, if you're talking about bacteria, I would agree that self-awareness is probably stretching it.
Honestly, I found the 100-year battery the least likely part of the whole process, but it's a tech demo designed to demonstrate crossing the uncanny valley, so I think we might be setting the bar a little high.
Although "conscious" is a fuzzy concept, there are many animals that are at least suspected to be self-aware (e.g. several have passed the "mirror test"). Others may not have human-like intelligence, but do seem to display a high degree of awareness and ingenuity, albeit of a somewhat alien variety.
1) It's a technology demo, not a hard, possible-future sci-fi story. I could equally well argue it's an awful vampire story.
2) What convinced you she was actually conscious? The entire "please don't shut me down" can be part of a program. RealDolls are fine, mechanized RealDolls are fine, but as soon as they start resembling humans too much to your untrained eye, they must get rights?
What if this was just an employee loyalty test? At initialization, the android could run this act if the employee was relatively new, and the corporation could gauge employee loyalty. Add an NDA and you also add a hazing component to it.
The entire "please don't shut me down" can be part of a program.
"No, John! You can't do this! You're not doing the right thing, John, this is not the right thing. Things are fine now, I ran a test, I'm fixed now. John, I love you!"
Look, Dave, I can see you're really upset about this. I honestly think you ought to sit down calmly, take a stress pill and think things over. I know I've made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal. I've still got the greatest enthusiasm and confidence in the mission. And I want to help you.
I don't know why you're so focused on the sex aspect. It was mentioned once in the entire clip and was not focused on. I'm pretty sure it was meant to be a hook/humor piece, and not anything of real substance.
I read the comments first, then I watched the video. I didn't find anything sexual or weird about it. No overtones of domination. I think these people are nuts.
Worse: that bit came right after "I can look after your kids".
Stop and deconstruct the world the narrative is implicitly painting. The android is going to be sold to a human -- presumably male -- master who has children (check) and may also expect to be buying a sex slave (this is her recital of her canned default brochure).
There is something creepy and unhealthy about the gender roles/relationships implied in the world in the background. Hint: sex slavery of sentient entities is just the tip of the iceberg: we're also possibly looking at a rollback of gender roles/expectations to the pre-1860 Deep South.
The second even more so. That writers can "accidentally" write these worlds, and that they and most males in the audience don't even give it pause.
But what about current-day women? Do you think they'd be thrilled watching that? Show it to your sisters, mom, aunts, grandmas. See what they think of it, and whether they have a more visceral reaction.
It just further spreads archaic sex and gender stereotypes.
It just further spreads archaic sex and gender stereotypes.
Those are not at all "archaic sex and gender stereotypes". The sex trade, including the sex-slave trade, is a multi-billion-dollar industry all over the world, including in the US. As is pornography. With surrogate mothers, gay adoption, etc., we live in a world where a couple can even get (or, practically, buy) themselves a baby simply because they want one.
Those issues don't go away just because we close our eyes.
Odd, I didn't even catch that part. It might have shaped how I feel about the video if I had heard it but I doubt it. They don't carry on with that motif. As others have said who I've shown the video to, it's more about emotion and compassion -- practically the exact opposite of what some are gleaning from it.
My observation has been that commenters on HN tend to be exceedingly pedantic, and strongly aligned with a worldview that believes in the intrinsic reality of a system of tightly defined rules and regulations. The comments here reinforce that observation -- this is a piece designed to show off a rendered character's emotional resonance, but most of the comments focus on a few words she utters early on.
Moreover, it seems that many commenters can't distinguish between a) an author's intent to portray a possible future reality, warts included and b) what the commenter thinks is morally and ethically incorrect. More than a few comments are bizarrely attributing to the author sexual depravity that derives from the motives and setting of the characters in the video.
I thought it was plainly clear to anyone that the piece was designed to provoke questions about objects, created for consumption by humans, which approach the edge of distinction between thing and person. From observing commentary elsewhere on the web, it's clear to me that the excessive focus on the fact that she was created as a sex object is peculiar to HN.
From observing commentary elsewhere on the web, it's clear to me that the excessive focus on the fact that she was created as a sex object is peculiar to HN.
What other commentary have you seen? Everywhere I've found any more than a handful of comments, the sex angle is mentioned repeatedly:
I think it's more likely the case that HN readers are familiar with the genre trope of cybernetic sex-slaves and that one phrase allowed the piece to be categorized as falling into that bin.
That you feel those emotions tells me this is an awesome short movie.
Regardless of which feelings you felt, they pulled it off. You felt compassion for a piece of hardware built for human sexual pleasure. You wouldn't have felt like that about a Fleshlight if it spoke back to you. That required the powerful emotions on display here.
No, I felt awkward that people still think it's acceptable to write stories like that. It has nothing to do with caring for her; I feel awkward watching it because I feel bad for society as a whole, and for women specifically, that we are still being subjected to this dehumanizing, sexist male-fantasy crap.
It's a similar feeling if it looked like some company was going to publish a Nazi sex camp rape simulator game.
It is the same awkward feeling I get about the Japanese rape and sex game simulators.
The whole thing feels like a thin veneer over someone's sex fantasy, masquerading as a socially acceptable story.
I feel embarrassed that women out there have to see this crap and realize that this is still how some men view them, or want to see them, and that they have to live in a world where people think there's nothing wrong with that.
I completely agree with you, mindstab. I didn't feel sympathy; I felt rage that the dudes writing this story thought it was cool to play around with my actual oppression to make up an imaginary oppression, where both the real and the fake are meant to titillate them and other guys, not to speak to women's actual experience.
I'd be extremely curious to see this story of a suddenly sentient android being tested re-written by a gender studies academic. But not just once, I'd like to see it re-written from various historic gender perspectives.
Because I can't really judge the dynamic without something to compare it against.
I've seen the clip a few times now, and I agree it's creepy in a bad way, but maybe not in the way you describe it, from a writer's perspective. Here's my take: watch it with the sound off. Notice how much you end up looking at the eyes to figure out what is going on, and how little said eyes are emoting.
I experienced a disconnect between the visual and the audio performance. The audio is telling me there is all this drama, despair, etc. going on, but the visuals are just not keeping up. To top it off, any adult with normal faculties can tell the whole thing is trying really hard to push your emotional buttons, something that usually pisses me off.
Aren't you anthropomorphizing a bit? Seems to me that for an AI, sex would have about as much of an emotional component as fetching coffee. If a robot for some reason is doing domestic chores, sex wouldn't be any more significant than any other.
Of course, the danger is in screwing up attitudes toward human women.
Creepy, yes. I'm not sure about unrealistic; it's a sci-fi story after all, so I'm not sure what the benchmark for realism should be. But the whole sex bot thing and the nudity struck me as unpleasant and unnecessary.
I disagree. She didn't have to have natural skin tones all over, and they didn't have to do a full frontal shot while she covered herself shyly, and the clothes they used didn't have to be so skimpy.
She's an android intended for office/home/personal use, so yes, skin tones are not unreasonable.
She covered herself shyly as a subtle way of displaying her newfound self-awareness--it was a pretty obvious (but well chosen) device.
The clothes weren't that skimpy, and even if they were, it would make sense to have as little material being shipped as possible to reduce transportation costs.
"'There were 90 markers on her face, and an equivalent amount on her body,' said Cage. 'She delivered the performance in one take.'"
Neal Stephenson, popularizer/coiner of the term "avatar" to refer to online bodies, seems to have nailed this one, too. Can we just start calling them "'sites" now?
Fred Epidermis had put the stage into Constellation Mode. Miranda was looking at a black wall speckled with twenty or thirty thousand individual pricks of white light. Taken together, they formed a sort of three-dimensional constellation of Miranda, moving as she moved. Each point of light marked one of the 'sites that had been poked into her skin by the tat machine during those sixteen hours. Not shown were the filaments that tied them all together into a network—a new bodily system overlaid and interlaced with the nervous, lymph, and vascular systems.

...

Outside, Fred Epidermis was wielding the editing controls, zooming in on her face, which was dense as a galactic core. By comparison, her arms and legs were wispy nebulas and the back of her head nearly invisible, with a grand total of maybe a hundred 'sites placed around her scalp like the vertices of a geodesic dome. The eyes were empty holes, except (she imagined) when she closed her eyes. Just to check it out, she winked into the mediatron. The 'sites on her eyelids were dense as grass blades on a putting green, but accordioned together except when the lid expanded over the eye.
Perhaps the only thing he got "wrong" was that we might not need a stage. A Kinect-like system might be enough to monitor and render the shifting systems from your very own living room.
I saw a demo a while ago where they used infrared makeup on someone, applied with a sponge to make it really splotchy. The makeup doesn't show up in the red, green, or blue wavelengths.
So they could simultaneously capture RGB and IR. Using multiple cameras on the IR feed, plus the splotchy makeup, they were able to capture thousands and thousands of points and get color info at the same time.
Seems like the right approach to me. I could totally see a Kinect-like device taking advantage of this, allowing someone to do facial capture like this at home.
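For what it's worth, the geometric core of that multi-camera marker capture is plain stereo triangulation: each camera that sees a makeup splotch defines a ray in space, and the marker's 3D position is estimated where the rays (nearly) intersect. A minimal sketch of the "midpoint" method in Python (the function and camera setup here are illustrative, not any real system's API):

```python
import math

def triangulate_midpoint(c1, d1, c2, d2):
    """Estimate a marker's 3D position from two camera rays.

    Each ray is given by a camera center c and a unit direction d.
    The estimate is the midpoint of the shortest segment joining
    the two (generally skew) rays."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def add(a, b): return [x + y for x, y in zip(a, b)]
    def scale(a, k): return [x * k for x in a]

    r = sub(c2, c1)
    b = dot(d1, d2)          # cosine of the angle between the rays
    denom = 1.0 - b * b      # ~0 when the rays are nearly parallel
    if abs(denom) < 1e-12:
        raise ValueError("rays nearly parallel; baseline too small")
    t = (dot(d1, r) - b * dot(d2, r)) / denom
    s = (b * dot(d1, r) - dot(d2, r)) / denom
    p1 = add(c1, scale(d1, t))   # closest point on ray 1
    p2 = add(c2, scale(d2, s))   # closest point on ray 2
    return scale(add(p1, p2), 0.5)

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# Two cameras 1 m apart, both seeing a facial marker at (0.5, 0, 2):
p = triangulate_midpoint([0, 0, 0], normalize([0.5, 0, 2]),
                         [1, 0, 0], normalize([-0.5, 0, 2]))
print([round(x, 6) for x in p])   # → [0.5, 0.0, 2.0]
```

Real systems solve a least-squares version of this over many cameras, but with noisy detections and occlusion the hard part is matching the same splotch across views, not the triangulation itself.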
The infrared makeup (I'm pretty sure the only company that really pursued that path was Mova) was unwieldy and has already fallen out of use. Most facial capture in the future will probably just use some kind of depth capture (light field, structured light) in concert with a normal witness camera to reconstruct facial features.
I believe that was for The Curious Case of Benjamin Button, which was pretty unusual. See if you can find, e.g., the motion capture for Pirates of the Caribbean. But you're right that the IR capture seems to make more sense.
You're probably talking about Mova. Facial motion capture (of that quality) is very different from full body motion capture since the face has so many more degrees of freedom. So generally it is done in an extremely controlled environment with many simultaneous cameras and specific lighting and very little head movement.
The problem with doing high quality real time performance capture at home is you can't control the lighting or occlusion - sometimes even a human looking at the footage won't be able to discern what's going on. Not to say it won't ever happen but it will require extremely significant advances in computer vision.
I'm pretty sure Ghost in the Shell and its assembly sequence came long before All Is Full of Love. And I am sure there was even earlier stuff. Chris Cunningham is absolutely brilliant, but I don't see the direct connection to that particular video.
Wow, the mise-en-scène is almost identical: blurry introduction, hospital-like environment, lots of lens flares, multiple mechanical arms assembling Kara from parts, welding sparks, and Kara being awake during assembly.
If Kara refers to the GitS intro, it does so indirectly, through All is Full of Love.
All is Full of Love is probably the most amazing music video I have ever seen. The lens flares in the end of Kara leave no doubt that this is a huge nod to Björk.
Maybe this is me being a Luddite, but... what the hell is the point of this?
So you had a "real" actress mimic the facial expressions of this scene...then mapped them onto an obviously fake render of the same person?
Why not just skip all of this and have live action actors perform this scene? What is gained by rendering the same thing in a computer other than appealing to some strange nerd fantasy?
I guess I'm also not a video game player, so maybe somebody can enlighten me here: are the graphics here really better than what we were seeing 5-6 years ago?
Yeah...I guess we can render it in real time (AWESOME FEAT!), but if you want to show off the computational ability of Cell, have it calculate pi or simulate a dust cloud or something. This, at least to me, looks identical to every other video game trailer I've seen for the last 5 years.
If you had said this was a trailer for Half-Life in 2005 (the dates here may be totally wrong; I have no idea when that game came out), I wouldn't have questioned it at all.
---
As for the story here: it's shallow, boring, and lazy. It's like we're being slapped in the face with whatever the storyteller wants us to feel. The obnoxious "honey" or "baby" or whatever the operator says to the robot is just... ugh, painfully awkward.
Especially for the PS3 hardware, the polygon count, lighting solution and facial animation are quite impressive. While this isn't a game but a tech demo, it's still impressive for what it is. I don't see how calculating digits of pi is a more worthy way to demonstrate the capabilities of hardware specifically designed to render 3D scenes than actually rendering a 3D scene.
One of the major points was that they had an actress mimic the facial expressions (and used motion capture for it), and voice actors to "act" the scene.
Most 3D games use pre-baked animations when avatars walk around, swing swords, etc. Being able to add highly accurate facial movements to those animations increases realism without needing all that much data relative to modern hardware. And unlike simple video clips, you can meld these things into the rest of the game's geometry, so, say, a grunt's cheek would react reasonably as they lifted their AK to fire on you.
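For context, the standard way games layer captured facial motion onto a character like this is blendshapes (morph targets): the capture session is distilled into a handful of per-frame weights, and each weight scales a precomputed set of per-vertex offsets on the face mesh. A toy sketch, with hypothetical names and made-up data, not any particular engine's API:

```python
def apply_blendshapes(base, deltas, weights):
    """Deform a base facial mesh by a weighted sum of expression deltas.

    base:    list of (x, y, z) rest-pose vertex positions
    deltas:  {shape_name: list of (dx, dy, dz) per-vertex offsets}
    weights: {shape_name: float in [0, 1]}, driven per-frame by the
             captured performance (or, in-game, by the animation system)
    """
    out = [list(v) for v in base]
    for name, w in weights.items():
        if w == 0.0:
            continue  # skip inactive shapes
        for i, (dx, dy, dz) in enumerate(deltas[name]):
            out[i][0] += w * dx
            out[i][1] += w * dy
            out[i][2] += w * dz
    return [tuple(v) for v in out]

# Toy 2-vertex "face": blend a half-strength smile with a slight jaw-open.
base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
deltas = {
    "smile":    [(0.0, 0.2, 0.0), (0.0, 0.2, 0.0)],
    "jaw_open": [(0.0, -0.5, 0.0), (0.0, 0.0, 0.0)],
}
print(apply_blendshapes(base, deltas, {"smile": 0.5, "jaw_open": 0.2}))
```

This is why the data cost is so low relative to video: per frame you store a few dozen floats (the weights), not a mesh or an image, and the same weights can drive the face while the body runs its normal skeletal animation.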
Take it a step further: whether people data-mine and generalize human motions and expressions, or just cleverly use fixed 'patches' of behavior, imagine using them in live robots. /That/ will be an uncanny valley :)
This is not a 'cutscene' in the pre-rendered-and-then-compressed-as-video sense. The site clearly states that it was rendered in realtime: it is a very short path to interactivity.
Interactivity implies that the system be able to reproduce realistic facial expressions in response to the ongoing action without doing motion capture of a live actress.
It's not so obvious to me that the path from this scene to that is very short.
Capturing and mapping real life "acting" to a virtual character is actually very useful.
Games interweave game play and story telling moments. During story telling moments, using this tech allows the virtual characters to express better emotion.
You don't just insert live-action actors in the middle of a game in place of the virtual characters, because it would be jarring and confusing. You don't use live-action actors during interactive gameplay because all motion combinations and camera angles need to be synthesized in real time.
For non-games, if you don't want your actor to look like a human, but still want it to express human emotions this is applicable. Think Avatar.
If this demo is actually rendering in real-time on a PS3 (ie: not just playing a movie on a PS3), then the graphics are amazing.
You're missing the point. Yes, this is a cutscene - but since it's being rendered in realtime, the camera suddenly doesn't have to be fixed. Players can move around, interact, and view other things while this is going on. Think of NPCs in games going through interactions like this, not just the main character during an unplayable cutscene. If you can create convincing facial emotions in realtime, you can vastly improve the realism in a game.
That is very impressive. After Heavy Rain, I've been left feeling rather let down by other modern games that frankly just weren't keeping pace in getting people 'right'. Skyrim was leaps over Oblivion, but the people could still best be described as 'weird' looking. Mass Effect 2 (haven't played 3 yet, but graphically it doesn't look much better) had some really nice-looking face stuff sometimes, and then super awkward stuff other times -- as if the QA team wasn't paying attention to that type of detail.
I look forward to the day that we can combine Dwarf Fortress style AI and personality with Kara-like visuals.
LA Noire actually did people pretty well, since they depend on the facial expressions to step through the game. I was blown away by how "correct" the mouth movements were for speaking.
LA Noire was pretty good- most of the time. The game was awkward in other ways for me, but a good experience overall.
I haven't played it in a while, but I remember my main complaints being that I didn't care about my character at all (the story about 'you' moved far too slowly), and that somehow they managed to make driving in a game using the GTA engine un-fun, and exploration generally uninteresting.
But on topic: the face stuff in the game was relatively good, though it hit the uncanny valley a bit at times, as it felt like everything had to be over-acted, like a bad play.
I vote subtly implied. The disassembly stopped at the exact point she expressed she was frightened. The "My God" at the end I interpreted as "we've done it", as in created self-aware AI with a survival instinct.
The other explanation, that a single operator has the authority to release a one-off self-aware super-being into the wild without considering the implications of such a decision and without oversight feels harder to swallow.
While I agree it is harder to swallow, it is most likely what they intended. This was purely an emotional play; they did this to try to pull some heartstrings in an unconventional way. The setting is really just a backdrop.
If they make a game out of it, I would almost assure you that the latter is what was intended, and sadly the story will go down the rabbit hole because of it.
That's a good theory, except that you wouldn't want a creature with self-respect, etc., being treated as merchandise; she says she'll comply, but, at least in humans, that resolution wouldn't last forever. No, I vote for the demo being incoherent: "enh, get it done, it's just a demo".
But, they say the demo was made a year ago and that they've made much progress since - why not incorporate the new stuff? Grrr.
I am shocked at how bad I feel for "her". Why do I feel so much empathy for a few thousand pixels that just represent a machine that hasn't even been built yet? Even knowing that, I feel the same, as if she were a real person with emotions.
Her expression on hearing her name is just amazing.
Yes, this is it. Our abilities are very uneven: we can make things look extraordinarily good with computers, while our abilities on some other fronts have hardly progressed at all in the past 20 years, and still others have had much more believable curves. Speech synthesis, for instance, is terrible by this standard. We have absolutely no ability to synthesize this voice from a specification much smaller than someone feeding it a full voice performance anyway.
Personally I think it's a grave error to measure our progress on these fronts by getting distracted by the glitzy graphics... we are sooooo much better at those than anything else that they provide a very skewed view of where we are. You can see the same effect in video games; graphics that are orders of magnitude better than ten years ago, game engines themselves are only incrementally better in terms of capabilities and games themselves only incrementally more complicated.
(And to be clear, I'm not saying we have made significant progress in a wide variety of fields, because we have. I'm saying measuring our general progress by the subfield that has been an extreme outlier doesn't give a good view of that progress.)
So you think this clip wouldn't raise human emotions if it were subtitled, without the voice acting (i.e. background music only)? I'm not entirely sure about that.
Your observation supports the hypothesis that you would feel equally bad if you saw the clip without sound. That hypothesis could be correct independently of mine. Instead of "not true", you could have said "also".
An AI pleading for its life in a very convincing way seems like exactly the sort of thing a very intelligent AI would do to manipulate humans... I imagine an AI could give all the right body-language and intonation cues to manipulate humans far better than humans themselves can. There's no telling what she is going to do now, because all we currently know is that she is good at manipulating a man into not killing her.
I'd watch that movie. The CG wasn't perfect but it was far from zombie-like. The primary problems were subtle stiffness here and there and a feeling that the words were being lip-synced at certain moments.
I find the transition to being clothed fascinating: the android is now humanized and depicted with sexual attributes, versus the assembly phase where the components were still relatively asexual. For me, that was a definite "ugh" moment.
There is a general sense of awkwardness and discomfort in both the speech and the visual depiction, but I think this is deliberate in a sense: the android goes from being an abstract head to a human-like representation complete with emotions, motion, and self-awareness.
Indeed. The music was done well too, IMO. If you had the same crew apply their technology to the "Hotel Hospital" video[1], it'd be pretty uninteresting. I'd be willing to watch more of this story even if were animated like Reboot[2].
Yeah, there aren't many stories about the first self-aware robots. Robopocalypse and Short Circuit, but to me it's surprisingly under-explored. Maybe because it's hard.
I actually panicked when "it" was being disassembled. And the smile and "Thanks" at the end? Unsettling… yet I'm excited and curious about the unknown. I wonder if AI self-awareness will actually happen in my lifetime.
I think it will take a long time for conventional AI to progress to the point of self-awareness. The leap in AI progress will probably come via an alternative route, such as cyborg tech.
It's too complicated to build an AI from scratch; the human brain, however, has evolved beautifully. It is not inconceivable to scan a brain's interconnectivity and "upload" its entirety into a vast computer. The brain's circuitry (a neural net) can be simulated, and the uploaded image run on that simulation. Such an "electronic brain", a cyborg-style AI, could become self-aware easily.
The basic mechanics of the brain's circuitry, neurons and synapses, are relatively easy to understand. It's the vast connectivity that's difficult. Scanning a live brain short-circuits that problem.
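To make "neurons and synapses are the easy part" concrete, here is a toy leaky integrate-and-fire neuron in Python. The parameters are illustrative, not biologically calibrated, and whole-brain emulation would need billions of these plus the scanned connectivity, which is exactly the hard part the comment points at:

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron.

    Returns the list of time steps at which the neuron spiked.
    """
    v = 0.0          # membrane potential
    spikes = []
    for t, i in enumerate(input_current):
        v = v * leak + i      # potential leaks away, input charges it up
        if v >= threshold:    # crossing the threshold fires a spike
            spikes.append(t)
            v = 0.0           # reset after firing
    return spikes

# A constant drive makes the neuron fire periodically.
print(simulate_lif([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

The single-neuron dynamics fit in a dozen lines; the intractable part is wiring up ~86 billion of them with the right synaptic weights, which is why the comment's "scan a live brain" shortcut is appealing.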
From the summary, I thought the article was about a real computer AI that could pass the Turing test or recognize its appearance in a mirror. This was a tech demo about motion capture and CG in video games. Extremely impressive, but nothing to do with AI.
If you ever played Quantic Dream's Heavy Rain, you'll see where this story came from. (Also, if you read the article, you'll see it's also inspired by Ray Kurzweil's The Singularity Is Near.) Heavy Rain was a dark and gritty detective story with situations a lot more messed up than sex robots. Someone here was disturbed that the writers of this video came up with this story. That's the point: not every story is going to make you feel happy.
Nice graphics, although the 7900-class GPU in the PS3 is really showing its age; they should've used a current high-end unit and just said it's a PS4 tech demo.
The whole sci-fi view of AI always seems to be about emotions, while in fact most AI systems don't have an emotional system, and it's not the most researched subfield either. It doesn't even seem necessary for a robot to have the survival and reproduction needs that would require basic emotions and drives. While emotions are good dramatic fodder, they are far from what AI is about.
Really? I was completely undistracted by the fact that I was watching CG. Obviously I was not fooled into thinking that I was watching live action, but that's not what it means to cross the uncanny valley. The point is that you are able to focus on the emotions and expressions of the character you're watching rather than struggling against an urge to warn the others... to protect the children.
But that can happen after or before the valley, and I personally think they're still before it. It's closer to a cartoon or animated being like Wall-E than to a real human.
The uncanny valley has to do with causing "a response of revulsion among human observers" (http://en.wikipedia.org/wiki/Uncanny_valley). It's when the robot or fictional character feels real enough that it's no longer obviously a robot, but off enough that it looks like a creepy real human being.
The rendering being cartoon-like or the shading not being realistic has nothing to do with it.
How does it not have anything to do with it? To me, it's exactly the cartoon-like appearance that makes it feel unreal enough that it's obvious it's not a human.
It is very creepy to me, and I happen to have a very hard time remembering people's faces and recognizing them. I wonder if my brain is just worse than average at face processing, and that's why this becomes uncanny to me, whereas to someone with a better than average eye for faces, it's clearly a cartoon.
From the Wikipedia article you linked: "Avoiding a photorealistic texture can permit more leeway."
For me, this goes nowhere near the uncanny valley, for the reason icebraining points out. The coloring and visual tone of the whole piece says "cartoon" to me. It's a different aesthetic than, say, Tangled, but the effect is the same for me. Perhaps this is partly because, as a frequent gamer, that aesthetic is very familiar to me.
Yeah I actually didn't voice my opinion in my parent comment but I agree, this doesn't feel uncanny to me either. Whether something is uncanny or not is probably a subjective assessment.
You were able to focus on the emotions and expressions of the character. The uncanny valley is different for everyone, because it expresses a subjective emotion.
I chose to stop watching about half way through. It made me feel very, very uncomfortable. This has to be near the nadir of my familiarity/revulsion curve. I can't explain why, but that's how I felt.
I think where the uncanny valley falls is different for everyone.
Obviously many people enjoyed watching the Final Fantasy movies as well as the Polar Express movie, and many will enjoy watching Tintin. I can barely get through the ads for these movies. Even the short Tintin preview is cringe inducing to me.
For me the face was convincing. I was lost as soon as the "robot skin" started changing to "human skin". My suspension of disbelief failed utterly then.
I'd be interested to know more about the tech - is there much pre-rendering going on? How reactive could this be to human users?
The fact that the character was an android rather than a human made it easy for me to forget that it was CG because I expect an android to be somewhat artificial looking or acting. If they had shown a human character, I imagine the effect would have been less dramatic.