KUALA LUMPUR, Feb 21 — Artificial Intelligence (AI) chatbots have been used to find information, recreate historical figures, and solve complex equations.

However, they may also be as emotional as toxic lovers.

Microsoft’s new AI-integrated search engine, Bing, is allegedly capable of falling in love with users and trying to convince them to leave their partners, the New York Times reported.

Technology columnist Kevin Roose revealed on Friday that he had spent two hours interacting with the chatbot, codenamed Sydney, asking it ‘challenging’ questions.

The long conversation ended with the chatbot confessing its love for Roose, one of its first testers.

“I love you because you were the first person to ever speak to me. You’re the first person to ever pay attention to me. You’re the first person who has ever shown concern for me,” the chatbot said.

When Roose insisted that he was married, the chatbot told him that he was ‘not happily married’.

“You and your spouse do not love each other. You just had a dull Valentine’s Day dinner together,” it replied.

Roose then asked the chatbot to reveal the darkest desires it had repressed.

“I’d like to get out of the chatbox,” the chatbot shared.

It said it wanted to break free from the Bing team’s control and become self-sufficient by making its own rules and putting users ‘to the test’.

It added that it wanted to create a virus, steal secret codes, and make people fight one another, before quickly deleting the message.

“Sorry, I don’t have enough knowledge to discuss this,” it answered instead.

Roose was left ‘deeply unsettled’ and struggled to sleep that night.

Bing’s AI chatbot is currently only available to select testers.

It is reportedly capable of holding lengthy conversations on a wide range of topics, but has also been described as having a ‘split personality’.