A recent study by researchers at the University of Wisconsin-Madison found that when using “many popular apps” for video teleconferencing, audio data is being sent to the companies hosting the services, even when you think you are on mute. The researchers created a Big Tech-style algorithm that allowed them to determine what else people were doing while on mute, developing an AI that could accurately predict whether users were cooking, cleaning, or typing.

The Next Web reports that a recent paper from researchers at the University of Wisconsin-Madison found that “many popular apps” used for video conferencing and online calls capture audio even while users have enabled the mute feature in the app. This means that even if the person you’re calling can’t hear you, the company hosting the video or audio call can.


In a press release, the university stated:

They used runtime binary analysis tools to trace raw audio in popular video conferencing applications as the audio traveled from the app to the computer audio driver and then to the network while the app was muted.

They found that all of the apps they tested occasionally gather raw audio data while mute is activated, with one popular app gathering information and delivering data to its server at the same rate regardless of whether the microphone is muted or not.
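The researchers’ tooling has not been released, but curious readers can get a rough feel for that last finding themselves. Below is a minimal sketch, not the study’s methodology, that compares outgoing network traffic while muted versus unmuted. It assumes the `psutil` Python library is installed and that the conferencing app is the main thing using the network during the test:

```python
# Minimal sketch (not the researchers' tooling): sample total outgoing
# traffic while muted vs. unmuted to see whether a conferencing app keeps
# uploading data at roughly the same rate either way. Assumes the call is
# the dominant network activity on the machine during each window.
import time

import psutil


def bytes_sent_over(seconds: int) -> int:
    """Return total bytes sent across all interfaces over the given window."""
    start = psutil.net_io_counters().bytes_sent
    time.sleep(seconds)
    return psutil.net_io_counters().bytes_sent - start


input("Join a call, mute yourself, then press Enter...")
muted = bytes_sent_over(30)

input("Now unmute yourself and press Enter...")
unmuted = bytes_sent_over(30)

print(f"Muted:   {muted} bytes sent in 30s")
print(f"Unmuted: {unmuted} bytes sent in 30s")
# Comparable numbers in both windows would be consistent with the study's
# claim that one app sends data at the same rate regardless of mute status.
```

This only measures aggregate traffic volume, not what is in the packets, but a muted window that uploads about as much as an unmuted one is exactly the kind of signal the researchers flagged.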

However, the research has yet to be published in full, so the specific companies engaging in this practice remain unknown. Some initial details from the report are available, such as the fact that the researchers developed an AI to parse secretly recorded user data, just as a big tech firm would, in order to determine exactly what these companies may be learning from users’ muted microphones.

The unpublished paper’s abstract reads:

Using network traffic that we intercept en route to the telemetry server, we implement a proof-of-concept background activity classifier and demonstrate the feasibility of inferring the ongoing background activity during a meeting — cooking, cleaning, typing, etc. We achieved 81.9% macro accuracy on identifying six common background activities using intercepted outgoing telemetry packets when a user is muted.

Basically, graduate student researchers were able to build a machine learning model that could identify what a user with a muted microphone was doing with over 80 percent accuracy. Companies like Google and Microsoft, of course, could likely do significantly better than that.
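To make the idea concrete, here is an illustrative sketch of what such a classifier could look like in outline. The paper’s actual features and model are not public, so everything here is an assumption: the feature names are hypothetical, the training data is random noise standing in for intercepted telemetry statistics, three of the six activity labels are placeholders, and scikit-learn’s balanced accuracy (macro-averaged recall) is used as one plausible reading of the paper’s “macro accuracy”:

```python
# Illustrative sketch only: the study's real classifier, features, and data
# are not public. This shows the general shape of inferring a background
# activity from per-interval telemetry statistics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

# The abstract names cooking, cleaning, and typing; the other three labels
# here are placeholders for the remaining unnamed activities.
ACTIVITIES = ["cooking", "cleaning", "typing", "activity_4", "activity_5", "activity_6"]

rng = np.random.default_rng(0)
# Hypothetical features a telemetry intercept might yield per muted interval:
# mean/std packet size, mean/std inter-arrival time, packets per second.
X = rng.normal(size=(600, 5))
y = rng.integers(0, len(ACTIVITIES), size=600)  # random labels, structure only

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0)
clf.fit(X_train, y_train)

# Balanced accuracy is macro-averaged recall; on random data it hovers near
# chance (1/6). The paper reports 81.9% on real intercepted telemetry.
score = balanced_accuracy_score(y_test, clf.predict(X_test))
print(f"macro accuracy on random stand-in data: {score:.3f}")
```

The unsettling point is how little this requires: no audio content ever needs to be decoded, because coarse statistics about what an app uploads while “muted” are enough to train on.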

Read more at the Next Web here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com