Facebook is reported to have shut down an advanced AI program over fears that it had achieved extraordinary capabilities that could prove detrimental to the human race. However, while there are always those who entertain such a scenario, in which AI-powered systems begin conspiring against humans, the fact is that the project was halted because it strayed from its intended objective.
The AI program in question involved two chatbots that were supposed to engage with humans and negotiate just as humans do. In fact, the programmers wanted the chatbots to be so convincing that humans would never know they were talking to automated programs rather than another human being.
So far so good, though things turned awry when the two chatbots were made to chat with each other. The objective set before them was to work out ways of splitting an array of balls, or for that matter any other items such as books, with the split being acceptable to both parties.
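The negotiation task described above can be sketched as a toy model. This is purely illustrative and not Facebook's actual code; the item names, the private valuations, and the acceptance threshold are all assumptions made for the example.

```python
# Toy sketch of the negotiation set-up: two agents must agree on a split
# of a shared pool of items, where each agent privately values the items
# differently. (Illustrative only; not Facebook's implementation.)

def split_value(split, values):
    """Total value an agent assigns to its share of the items."""
    return sum(split[item] * values[item] for item in split)

def acceptable_to_both(pool, split_a, values_a, values_b, threshold=5):
    """Check that agent A's proposed share is feasible and that both
    agents' shares meet a minimum value (the threshold stands in for
    each agent's reservation value, an assumption of this sketch)."""
    # Agent B receives whatever A does not take.
    split_b = {item: pool[item] - split_a[item] for item in pool}
    if any(count < 0 for count in split_b.values()):
        return False  # A claimed more items than exist in the pool
    return (split_value(split_a, values_a) >= threshold and
            split_value(split_b, values_b) >= threshold)

pool = {"balls": 3, "books": 2}
values_a = {"balls": 1, "books": 3}   # hypothetical: A prefers books
values_b = {"balls": 3, "books": 1}   # hypothetical: B prefers balls
proposal = {"balls": 0, "books": 2}   # A takes the books, B gets the balls

print(acceptable_to_both(pool, proposal, values_a, values_b))  # True
```

Because the agents value the items differently, a split that gives each side what it prefers can satisfy both, which is exactly the kind of outcome the bots were meant to negotiate toward.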
The conversation that followed was as simple as could be, even gibberish at times, making little sense at all. In fact, it is hard to see how such seemingly innocent talk could invoke fears of mankind being under threat from one of its own creations, an AI chatbot.
Unfortunately, that was how the entire thing was pictured. The very fact that two chatbots engaged in a conversation and exchanged a few lines on their own led many to believe that doomsday couldn't be far off. This, even though the chatbots had been explicitly designed for that very purpose.
However, as the team behind the chatbots soon came to realize, the bots hadn't been trained enough to communicate in English that humans can easily comprehend. That is certainly one area that needs improvement, and it is also one of the reasons the chatbots have been shelved until they are developed further.
Facebook, though, hasn't stated when the chatbots can be expected to be operational again or, better still, to achieve acceptable levels of proficiency in human interactions.