this post was submitted on 06 Aug 2023
483 points (94.1% liked)


New acoustic attack steals data from keystrokes with 95% accuracy::A team of researchers from British universities has trained a deep learning model that can steal data from keyboard keystrokes recorded using a microphone with an accuracy of 95%.

[–] LouNeko 19 points 1 year ago (1 children)

I think you might have misunderstood the article. In one case they used the audio from a Zoom meeting and, as a reference, the chat messages from said Zoom meetings. No keyloggers required.
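As a rough illustration of the kind of preprocessing such an attack needs, here is a toy energy-based keystroke segmenter. This is my own sketch, not the paper's actual method: the frame size and the `k` multiplier are made-up illustrative values, and real work would feed the isolated segments into a spectrogram-plus-deep-learning classifier rather than stop here.

```python
import numpy as np

def detect_keystrokes(audio, sr=44100, frame_ms=10, k=4.0):
    """Find candidate keystroke onsets as short energy bursts.

    Toy segmentation: a frame counts as a keystroke onset when its
    energy first rises above mean + k * std of all frame energies.
    Thresholds are illustrative guesses, not taken from the paper.
    """
    frame = int(sr * frame_ms / 1000)
    n = len(audio) // frame
    energy = np.array(
        [np.sum(audio[i * frame:(i + 1) * frame] ** 2) for i in range(n)]
    )
    thresh = energy.mean() + k * energy.std()
    onsets = []
    prev_hot = False
    for i, e in enumerate(energy):
        hot = e > thresh
        if hot and not prev_hot:
            onsets.append(i * frame / sr)  # onset time in seconds
        prev_hot = hot
    return onsets
```

With chat messages as ground truth, each detected onset could in principle be paired with a typed character to build a labeled training set without any keylogger.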

I haven't read the paper yet, but the article doesn't go into detail about possible flaws. Like, how would the software differentiate between doubly assigned symbols on the numpad and the main rows? Does it use spell check to predict words that are not 100% conclusive? What about external keyboards? What if the distance to the microphone changes? What about backspace? People make a lot of mistakes while typing. How would the program determine that something was deleted if it doesn't show up in the text? Etc.

I have no doubt that under lab conditions a recognition rate of 93% is realistic, but I doubt that this is applicable in the real world. Nobody sits in a video conference quietly typing away at their keyboard. A single uttered word can throw off your whole training data. Most importantly, all video or audio call apps have a microphone activation threshold enabled by default to save on bandwidth. Typing is mostly below that threshold. Any other means of collecting the data would require you to have access to the device to a point where installing a keylogger is easier.
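To make the activation-threshold point concrete, here is a minimal sketch of the kind of RMS noise gate VoIP clients apply before transmitting audio. The -40 dBFS cutoff and frame size are illustrative guesses, not any specific app's settings; the point is just that typing-level audio sits far below speech and would never leave the client.

```python
import numpy as np

def gate_frames(audio, sr=44100, frame_ms=20, threshold_db=-40.0):
    """Crude voice-activity gate: zero out frames whose RMS level
    falls below a dBFS threshold, roughly as VoIP clients do to
    save bandwidth. Gated frames are never transmitted, so any
    keystroke sounds in them are lost to a remote attacker."""
    frame = int(sr * frame_ms / 1000)
    out = audio.copy()
    for i in range(0, len(out) - frame + 1, frame):
        chunk = out[i:i + frame]               # view into out
        rms = np.sqrt(np.mean(chunk ** 2)) + 1e-12
        if 20 * np.log10(rms) < threshold_db:
            chunk[:] = 0.0                     # frame suppressed
    return out
```

A quiet keystroke-level signal (amplitude ~0.001, roughly -63 dBFS) is zeroed entirely, while normal speech-level audio passes through untouched.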

[–] [email protected] 9 points 1 year ago (1 children)

It sounds like it would have to be a very targeted attack. Like if the CIA is after you this might be a concern.

[–] LouNeko 4 points 1 year ago (1 children)
[–] [email protected] 7 points 1 year ago (1 children)

Actually I just saw this: Zoom terms of use updated to allow AI training on user-generated data, no opt-out

Maybe if Zoom is systematically collecting data on all users, they would be able to build a reasonable model. Then it could be leaked or shared.

What do you think?

[–] LouNeko 4 points 1 year ago

Good question. Since Zoom is mainly a business tool and a lot of high-profile companies rely on it, if there's even the suspicion that Zoom uses collected data to steal passwords or company secrets, they will bring the hammer down with the most gruesome class-action lawsuit. Companies pay good money for the business license, and Zoom will certainly not bite the hand that feeds them.
However, this might not apply to private Zoom users. And I'm certain that Zoom does some shady stuff behind the scenes with the data they collect on private individuals beyond simply "improving our services".