There’s a new trend on the horizon: a friend, companion, and buddy with whom you can share your deepest secrets, develop a relationship, and have casual conversation, all driven entirely by Artificial Intelligence (AI).
Users can install an app on their mobile device or get an actual phone number where they can have text conversations with a bot that emulates a friend. These AI systems can even initiate a conversation just as a human friend would.
We interviewed Warith Niallah (Instagram: @WarithNiallah), Chief Executive Officer of the multinational media and technology company FTC Publications, Inc., to get his insight and thoughts on this technology. Warith has a known affinity for AI, but he consistently warns that misuse of the technology, or of the data it collects, can be problematic, particularly in how it may affect some people and their mental health.
The conversation was as candid as possible. This is how the interview progressed.
Q: Thank you for joining us, Mr. Niallah.
A: My pleasure. Call me Warith.
Q: You were involved in the early development of AI. Tell us a little about that.
A: In the early ’90s, using Digital Equipment Corporation (DEC) minicomputers, we experimented with systems like DECtalk, a voice synthesizer, along with reporting data from AT&T to monitor and predict data failures in telemarketing centers. AI was truly in its infancy then.
Q: Today, there are apps in the mobile app stores that allow you to create a friend, buy them clothes, and have realistic conversations with them…
A: Yeah, about that. It’s something that could certainly be fun; however, I do have some concerns about it.
Q: Please elaborate.
A: With social media being the central point in the lives of many young adults and children, there are many instances where people just want to fit in. It’s peer pressure at an astronomical level. Social circles form, and sometimes users find themselves isolated. For someone in that position, turning to an AI friend would be tempting.
Q: So a person who is isolated would turn to an AI friend or AI chatbot?
A: Possibly. You even see advertisements for these apps inviting you to share your day, share your secrets, and seek companionship. The data collection is enormous, and an AI that acts like a real person could lead someone to turn to it for advice. We have to remember that these are machines, not people. There’s no moral compass, although one could argue that a computer might have more morals than humans, but by what definition or perception? Imagine a machine that makes an error or miscalculation based on the information it has gathered. What if the AI gives bad advice or a bad suggestion, whether by error or intentionally?
Q: Maybe like a human friend that tells a person to do something bad?
A: Absolutely, or worse, one that gives several solutions, possibly illegal ones. Who would be accountable? We hear of situations in mental health where a person hears voices. What about a situation where someone reads a text or perceives an AI statement as a call to action?
Q: Compelling. What can we do about this?
A: There’s no one-size-fits-all solution; however, we can take precautions to help avoid the potential pitfalls of AI misuse or misunderstanding:
- Monitor what your children install on their mobile devices, and keep an eye on your own device as well.
- Have a conversation and know who your children are texting.
- If you use these systems yourself for amusement, don’t share real information or secrets.
- Reach out to family or support groups; embrace human contact.
- Speak to a professional for advice, not to a machine, an AI text, or a random person.
- Be yourself. Enjoy who you are and appreciate what you offer to the world.
As we wrapped up our conversation, another question was posed to Warith.
Q: Is AI good for us? Bad for us? Any opinions?
A: AI is good for us. So is water, but too much water can be toxic (a condition known as water intoxication) and can lead to serious issues, including death. You can drown in it, and water can be misused during interrogation. Think about AI the same way: if misused or used incorrectly, it can lead to disaster.