
Senator targeted in deepfake call with ‘malign actor’ posing as Ukrainian

The chair of the Senate Foreign Relations Committee was lured into a video call with a “malign actor” probably using “deepfake” artificial intelligence technology to pose as a top Ukrainian official, lawmakers and congressional aides said Thursday.

Sen. Ben Cardin (D-Md.) was contacted via email last week by someone posing as Dmytro Kuleba, the former Ukrainian foreign minister, requesting a conversation over Zoom. On the video call with the senator, the person’s voice and appearance matched Kuleba’s, but Cardin grew suspicious when the man asked out-of-character questions about the upcoming election, according to two Senate aides who provided details of the event on the condition of anonymity because they were not authorized to speak to the media. The person purporting to be Kuleba also asked whether the senator supported providing long-range missiles in the Ukraine-Russia conflict.

The call was first reported by Punchbowl News.

“In recent days, a malign actor engaged in a deceptive attempt to have a conversation with me by posing as a known individual,” said Cardin in a statement. “After immediately becoming clear that the individual I was engaging with was not who they claimed to be, I ended the call and my office took swift action, alerting the relevant authorities. This matter is now in the hands of law enforcement, and a comprehensive investigation is underway.”

The incident has raised concerns that more lawmakers could be targeted by sophisticated “deepfake” technology that allows people to impersonate the voice and appearance of political figures. Committee staff have been instructed to exercise an extra degree of caution with external communications, paying particularly close attention to the phone numbers and email addresses of individuals claiming to be powerful figures.

Cardin, who is retiring at the end of this year, is not the only elected official to fall prey to this type of scheme. Earlier this year, then-U.K. Foreign Secretary David Cameron announced that he had participated in a fake video call with someone pretending to be Petro Poroshenko, the former president of Ukraine. The mayors of several European cities were also lured into a video call with someone pretending to be the mayor of Kyiv, the Guardian reported at the time.

Kuleba has strong ties to lawmakers and senior U.S. officials dating back to the start of Russia’s full-scale invasion in 2022. Many have come to know Kuleba and his mannerisms, potentially making him more difficult to impersonate than others.

In a statement, Kuleba said he was “99 percent sure” the deepfake was initiated by “Russian pranksters,” and warned people to stay alert.

“The best thing you can do to avoid getting trapped in the deepfake is to always verify the source and not tell the truth to strangers,” he wrote in Ukrainian.

U.S. officials have warned that foreign actors are using deepfake technology to sow discord and misinformation. At the start of the Russian invasion of Ukraine, a deepfake of President Volodymyr Zelensky appeared online telling Ukrainians to surrender.

This post appeared first on washingtonpost.com