Zoom Video Communications, Inc. recently updated its Terms of Service to encompass what some critics are calling a significant invasion of user privacy.
Zoom's updated policy states that all rights to Service Generated Data are retained solely by Zoom. This extends to Zoom's rights to modify, distribute, process, share, maintain, and store such data "for any purpose, to the extent and in the manner permitted under applicable law," including AI and machine learning.
They can't be that stupid. Many companies that use Zoom do it to talk with clients, in conversations that are expected to be private and confidential. Training an AI might mean leaking some of this content, unless it's an AI used exclusively internally at Zoom. They'd better not use recordings of meetings unless they're ready to pay lawyers for years.
Unrelated, you’re my favourite game ever and I love you and miss you and I’d pay so much for a pre-re server without a cash shop at maybe 2-5k total players.
I love you too and each winter I make a habit of hopping on a p-server and enveloping myself in nostalgia. I’m still waiting for an RO Lemmy community.
Let me know if you ever start playing again and I’ll be your buddy.
During the early part of 2020 I moved my classes onto Zoom. Because of this change, I'll be moving my video conferencing elsewhere. What is it with companies destroying themselves while attempting to maximize profits? Just another reddifugee on kbin.social. SMH.
Yeah, WTF? Companies that handle medical data and other PII... they're going to have to cancel their Zoom contracts, right? I guess that's up to their lawyers to interpret/decide.
Considering courts have been using Zoom since COVID… I am dying to see what happens. Will Zoom clarify that it doesn't collect/utilize meeting recordings to train AI? Or is it going to do exactly that and force anyone conducting sensitive business to find a new platform?
This feels like a remarkably bad move, both for privacy and for Zoom's own business concerns, unless I'm missing something.
Won't be an issue for courts. In fact, the original text prediction software was trained on the Enron Corpus, a trove of emails made public by FERC during its investigation of Enron.
Now, companies will absolutely not be friendly about this. The number one reason most companies are hesitant about AI is that they don't want to risk giving sensitive data to an outside company.
Remember: the corporate meetings and university lectures are the tip of the iceberg of the kind of data Zoom has on people.
Zoom is used by teenage couples to call each other and hang out, which can turn into discussing sexual topics, as dating teenagers often do.
Zoom is used by general-care doctors when their patients describe the rash on their anus.
Zoom is used by psychiatrists and therapists talking to their patients during some of the most vulnerable and precarious times of their lives.
Zoom is used by lawyers talking to their clients in all kinds of cases: criminal, civil, divorce/family, inheritance, etc.
Zoom was used by actual fucking courts to hold actual fucking criminal trials. Like bruh the fucking US judiciary department couldn't have self-hosted one of the many open source and E2EE solutions?
The fact that they can do this with no oversight or regulatory bodies intervening is utterly ridiculous. Zoom probably holds some of the most sensitive data about people's lives. It is not a social media platform, where people know they shouldn't post overly sensitive information; it was literally intended and marketed for sensitive communications. They shouldn't even be keeping any amount of data after the call ends, IMO, but using it to train an AI (to presumably sell later) is utterly morally bankrupt, and so are the regulatory agencies and lawmakers who could have intervened.

Fuck you Zoom, and fuck you FCC/FTC/whoever handles data privacy in the US. You want to ban TikTok because of its "national security implications" but don't bat an eye when it's a US company doing something far worse, huh? Not implying I like TikTok, but TikTok doesn't have access to live court trials or doctor-patient discussions.
Yes, we shouldn't have used Zoom in the first place. But that ship has sailed: most people were forced to use it against their will because their company/university/doctor/lawyer/judge decided to use it, and/or they did not realize the terrible data security/privacy implications of using it. It's entirely unhelpful to victim-blame and go "well, you shouldn't have used Zoom then! Sucks for you," as I see so many people in the FLOSS/privacy community doing. Additionally, that also does not address the actual societal/legislative issues of them being allowed to keep that information and use it for profit.
Zoom was used by actual fucking courts to hold actual fucking criminal trials. Like bruh the fucking US judiciary department couldn’t have self-hosted one of the many open source and E2EE solutions?
It should be said that many business customers do use a self-hosted version of Zoom. I couldn't say for sure whether every court did, but major government bodies definitely used something more than the free package, which comes with different T&Cs.
Additionally, that also does not address the actual societal/legislative issues of them being allowed to keep that information and use it for profit.
This is the big issue with online user data. In no other area of life can someone take something without offering anything in return. Yes, websites are usually free, but Microsoft collects your data from software you already pay them for. Just like you can't build and sell a car without paying for the nuts and bolts, there needs to be clear legal infrastructure to prevent data businesses from getting away with taking everyone's data for free and using it to develop products for pure profit.
Do you know if they support disabling things like auto-equalization of audio or changing the bitrate? I use Zoom for music lessons because it's the only app I've ever found that will let me do that, which sucks because Zoom really isn't that great of an app.
It's not that easy, because you are dependent on the person/organization setting up the conference. Privately I would never use it, but I often follow webinars and information sessions hosted by a multinational organization, and these are always held via Zoom.
So it would be a "take it or leave it" approach. I wouldn't even know whom to ask for an alternative option.
So I should just tell our CEO that I refuse to use Zoom, and fail to show up to any and all meetings and tasks that require it? That's it then. Privacy is saved.
This kind of optimistic attitude ignores the reality that such actions are only effective when organized en masse. If I stop using Zoom on my own, it's pointless. It's only effective in the context of organizing with others.
The rich might want us to think that we are empowered to effect change merely by making personal choices (which, on their own, are completely futile). In reality, our power shows when we're united.
Not to mention, saying "just stop using Zoom" is pretty dumb if your company insists on using it. I've lobbied for other apps at my company, but they didn't budge, mainly because we'd have to convince ALL our customers and clients to use whatever service we switch to as well.
Most people are lazy in general and don't mind getting exploited as long as they can't see or feel it.
Times have changed and you'll have to start acting as an individual. Take a stand.
For example: my friends used WhatsApp and I moved to Signal. At first only a few moved over, but eventually others started to follow. Someone has to take action for change, and the change starts with you.
I never started using Zoom because of the original privacy concerns. It would have been great if people had listened, but the truth is that most humans are just not that smart.
I don't even bother anymore. I just try to live my life in a way that lets corporations profit off of me as little as possible.
Went to InfoComm this year, the audio/video trade show, to get the lay of the land on next-generation conferencing systems.
The two main elements at almost every booth were Zoom and Teams. Those two platforms have completely replaced IP-based conferencing, and when the tide turns like this, no matter how bad the idea, we're stuck with it for five to ten years.
On both the hardware and software level, there just isn't an alternative for corporate-scale conferencing.
I may be able to make the argument against Zoom on privacy grounds, but I suspect Teams isn't going to be any better.
I've always liked Google Meet better anyway. Hopefully Google doesn't decide to scrap it too; while it seems like it's going to stick around, they really do like scrapping projects...
edit: Nevermind, I forgot people aren't allowed to have preferences.
@skilledtothegills It would be forbidden for them to train on actual content from calls under EU law, as it would be in breach of the ePrivacy Directive (read alongside the European Electronic Communications Code, which imposes similar obligations on 'over-the-top' providers as on classic telecoms). Not that US tech firms have a great history of adhering to EU law.
That's a very zoomy thing to do, given what we've seen the company do so far. If they can figure out a new way to profit from stabbing you in the back, they absolutely will go for it.
I'd wager enterprise customers won't want this either, and I won't be surprised if they demand that this clause be explicitly removed from their contracts. Imagine all the highly confidential info and trade secrets exchanged in company Zoom meetings that could be harvested for AI use.
I used Zoom for some online hobby meetups.
The majority of them used Zoom.
I'm assuming it's the organizer's choice, and they go with what they're familiar with from work. I'd love to spread the word about FOSS alternatives, but sadly I'm not the person who organizes the events.
The thing is, for most people Zoom is synonymous with video conferencing. Zoom's popularity soared in the WFH era, and Zoom has decided (like every corporation does) to capitalize on it, milking every possible profit from it.
And people will still continue using it. Though maybe you could increase the pushback by telling these "nothing to hide" people that this AI could be used to create deepfake porn of them?