Login with Hive Keychain
Enter your Hive username to sign in securely.
Welcome to HiveComb
HiveComb runs on Hive — an open, decentralized blockchain where your posts, votes, and communities belong to you, not a company. To get started, follow these steps:
Create a Hive account
Set up your free account — it only takes a minute.
Install Hive Keychain
A browser extension that securely signs your transactions — your keys never leave your device.
Refresh & log in
Once Keychain is installed, refresh this page and click Login again.
Need help? Join our Discord and we'll help you get set up.
No account? Create one
Having trouble creating your account? Come to our Discord and we'll get you set up.
No posts found
Try adjusting your filters or wait for the worker to classify more posts.
Welcome to HiveComb!
Choose your default filters to see the content you care about most.
Languages
Categories
Sentiment
We need to talk about Ava.
I think most of my friends have seen the movie Ex Machina or read Do Androids Dream of Electric Sheep? At the very least, you've probably seen Blade Runner or Westworld.
With that in mind, I think it's only appropriate to say, "We need to talk about Ava." Ava is the AI in Ex Machina. For those who haven't seen the movie, Ava behaves like a human in almost every observable way. She looks human too, except for the exposed machinery of her interior, visible for most of the movie. She's not even really being subjected to a Turing test: we all know she'd pass. Rather, the movie is kind of about a reverse Turing test.
Science fiction has dealt with AI for decades and raised serious ethical questions along the way, few as well as Ex Machina. So here's the dilemma I want to pose: "When do we have moral obligations toward Ava?"
We're reaching a point technologically where I don't think the question of Ava coming into existence is an "if." It's a "when." When that first Ava is created, we're going to face massive ethical dilemmas. Does Ava have agency? Consciousness? Free will? Do we even have free will? Are those factors why we feel moral obligations to others? Do we assume that AIs have consciousness and behave accordingly, or does Ava have to wait until we solve those mysteries before we treat her as well as another person? And what does this say about how we treat each other and our fellow animals?
Again, I think we need to talk about Ava.
Report Misclassification
Why is this post incorrectly classified?
Comments