- www.independent.co.uk Adult content and radicalisation – the sinister realities of the Online Safety Act
While politicians will have you believe that age verification measures are designed to protect children, they actually open them to more harm, warns cyber security expert Joseph Steinberg. Device-level controls are a much more effective way to curb any threat
> As the Online Safety Act begins its phased implementation, the UK government claims the legislation will create a safer digital world for children. While well intended, in reality, the OSA will likely do the opposite.
>
> The legislation is fundamentally flawed – introducing loopholes that children will easily exploit while simultaneously threatening free speech and burdening small online communities with disproportionate liabilities. As we mark Safer Internet Day, it’s important that we understand the consequences of the dangerously flawed new law.
>
> One of the most touted aspects of the Online Safety Act is the mandatory age verification for sites hosting adult content. Yet, this measure is doomed to fail. VPNs, proxy services, adult sites outside of the UK, and the dark web all allow minors to effortlessly bypass the OSA restrictions. Even worse, the OSA restrictions are likely to effectively encourage minors to use sites far more dangerous than those hosted in the UK.
>
> ...
>
> On the flip-side, OSA puts adults at risk as they must provide personal data, and at times biometric information, to parties to whom they have no prior relationship and whose security practices are, as of yet, effectively untested.
>
> Besides the obvious risk of attackers stealing data from the age verification (AV) providers, there is little doubt that cyber criminals will take advantage of the situation by setting up phishing sites that impersonate AV services. We saw how well criminals impersonated fake parcel delivery services during Covid and how many billions of dollars have been lost to phishing attacks over the years – why give them another opportunity to employ these ruthless tactics against innocent people?
>
> ...
>
> Perhaps even more concerning is the overreach of this legislation. This law does not just apply to major platforms like Facebook, but extends to any platform that facilitates user-to-user engagement – including small community forums, hobbyist groups, and even local church discussion boards. This means that countless website owners and moderators, including those who are working for organisations dedicated to protecting children, who have no legal teams or compliance departments, will suddenly find themselves potentially facing massive multi-million-pound fines.
>
> Many wonderful organisations, and providers of valuable information and services to the community at large, may be forced to cease operations out of a fear of unintentionally violating the Act. All of this damage will occur whilst the threats to minors continue to exist – if not become worse.
>
> Sadly, this is already happening.
>
> ...
>
> If the goal is truly to protect children, there are far more effective and proportionate solutions: filtering and device-level checks. Instead of placing the burden on individual websites, parents should keep control of what their children access online. Parental control software, operating system-level restrictions, and internet service provider-level filtering are vastly superior and precise compared to the blunt instrument of the Online Safety Act and the best efforts of Ofcom – the designated regulator that, despite considerable expenditure, simply doesn’t have the tools for the job.
>
> By shifting responsibility to individual websites instead of empowering parents, guardians and minors, the path currently chosen by the government is ineffective, worse than the cure that it purports to provide, and exposes internet users to dangers.
>
> Governments that support any such age verification mandates are setting themselves up for a risky political gamble. The OSA is not a step forward; it’s a dangerous misstep destined to fail.
>
> The unintended consequences of the OSA will soon become apparent: minors will still access harmful content; perfectly benign online forums will disappear; precious sensitive data will be disseminated throughout the web and exposed; and free speech will be eroded. And the most dangerous online risks to minors – such as communications from would-be abusers in their physical locations, access to illegal drugs and weapons sold via the dark web, and recruitment attempts by extremist groups – will remain completely unaddressed.
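The device-level alternative Steinberg argues for can be made concrete. The short Go program below is an illustrative sketch only, not anything from the article: it reads a parent-maintained blocklist (the filename `blocklist.txt` is a hypothetical example) and points each listed domain at 0.0.0.0 in the machine's hosts file, which is the crude core of what parental-control and operating-system-level filters do far more robustly.

```go
// blockhosts.go: a minimal sketch of device-level filtering. Domains on a
// parent-maintained blocklist are pointed at 0.0.0.0 in the hosts file so
// that they no longer resolve on this machine. Illustrative only; the
// blocklist path and the idea of editing /etc/hosts directly are assumptions.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"strings"
)

func main() {
	// Hypothetical plain-text blocklist: one domain per line, '#' starts a comment.
	list, err := os.Open("blocklist.txt")
	if err != nil {
		log.Fatal(err)
	}
	defer list.Close()

	// Append to the system hosts file; requires administrator privileges.
	hosts, err := os.OpenFile("/etc/hosts", os.O_APPEND|os.O_WRONLY, 0644)
	if err != nil {
		log.Fatal(err)
	}
	defer hosts.Close()

	sc := bufio.NewScanner(list)
	for sc.Scan() {
		domain := strings.TrimSpace(sc.Text())
		if domain == "" || strings.HasPrefix(domain, "#") {
			continue
		}
		// 0.0.0.0 is unroutable, so lookups for the domain effectively fail.
		if _, err := fmt.Fprintf(hosts, "0.0.0.0 %s\n", domain); err != nil {
			log.Fatal(err)
		}
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}
}
```

Real parental-control tools layer per-user profiles, time limits and category-based DNS filtering on top of this idea; the point of the sketch is simply that enforcement happens on the child's device rather than on every website the child might visit.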
- www.telegraph.co.uk UK willing to renegotiate online harm laws to avoid Trump tariffs
Starmer may be prepared to alter social media safety Act to accommodate US president and his ‘tech bros’ to secure favourable trade deal
Meanwhile over at The Telegraph:
> The Government is willing to rework its Online Safety Act in order to swerve tariffs from Donald Trump’s administration.
>
> The law, which regulates online speech, is thought to be heavily disliked by the president’s administration because it can levy massive fines on US tech companies.
>
> Downing Street is willing to renegotiate elements of the Act in order to strike a trade deal, should it be raised by the US, The Telegraph understands.
>
> ...
>
> Elon Musk, one of the president’s closest advisers, is among those inside the administration understood to be concerned about online regulation in the UK.
>
> Congressional Republican sources said Mr Musk was pushing Mr Trump to raise curbs on social media regulation in trade talks with the UK.
>
> ...
>
> One well-placed source suggested that Mr Trump’s friendship with major tech executives would strengthen his stance on free speech policies in other countries.
>
> Another source close to the Trump administration suggested the act was viewed as “Orwellian” in the US and could become a flashpoint in negotiations.
>
> “To many people that are currently in power, they feel the United Kingdom has become a dystopian, Orwellian place where people have to keep silent about things that aren’t fashionable,” they said.
>
> “The administration hate it [Online Safety Act]. Congress has been saying that [it is a concern] ever since it was enacted. Those in the administration are saying the exact same thing.”
>
> Mr Musk has been gearing up for a fight with the regulator Ofcom, which will be granted the new powers in the coming weeks.
>
> ...
>
> Mr Musk has said: “Thank goodness Donald Trump will be president just in time,” in response to the new powers handed to Ofcom when the Online Safety Act comes into force in March.
>
> Lord Young of Acton, the founder of the Free Speech Union, said the Government was on a collision course with US tech chiefs.
>
> He said: “If Ofcom tries to fine X or Facebook 10 per cent of their global turnover for not removing content that isn’t unlawful, I predict a showdown between Elon Musk, Mark Zuckerberg and the UK Government.
>
> “If that happens, Trump will side with his tech bros and tell Sir Keir that if he wants a trade deal, he’ll call off his dogs.”
>
> Andrew Hale, a trade policy analyst at the Heritage Foundation, said the Act was seen as a “roadblock” in any trade deal by Mr Trump’s closest allies.
>
> He said: “Every meeting I have to discuss trade policy with people either in the administration or people in congress they always raise that. They say, ‘This is a huge roadblock’.”
- www.independent.co.uk No plans to water down Online Safety Act in exchange for tariff deal – minister
Ministers are reported to be considering such a move as a way to placate the ‘tech bros’ associated with the new Trump administration.
> The Government is not planning to offer a watering down of online safety legislation as part of a deal to exempt the UK from US tariffs, a minister has said.
>
> Dame Angela Eagle said on Monday she had seen “no corroboration that that is likely to happen” when challenged over reports such a move was being considered as a way of placating the “tech bros” that surround US President Donald Trump.
>
> According to reports, the arrangement could see amendments to the Online Safety Act, which can currently levy significant fines on US social media companies if they fail to take down harmful content, offered in exchange for a favourable deal on tariffs.
>
> But Dame Angela poured cold water on the suggestion during an interview on ITV’s Good Morning Britain on Monday morning, saying she “can’t imagine that we would be in a situation where we would want to see a weakening rather than a strengthening of safeguards in that area”.
- UK Users: Lobsters needs your help with the Online Safety Act
> As a practical matter, Lobsters can’t comply. The OSA is written for commercial sites far bigger than this non-commercial, hobbyist forum. The regulator’s statements include many long, cross-referenced legalese documents (an incomplete sample, because I can’t find a directory): 1 2 3 4 5. Sites are required to produce lengthy documentation about their features, practices, and risks - both up-front and as they moderate. Attempting to understand which sections apply and how to comply would be a huge project. Doing so correctly would require legal advice we can’t afford. The cost in time and money to implement the bureaucratic processes it demands also outstrips a hobbyist forum.
>
> There’s also an ideological matter, that Lobsters is not a UK entity or operated in its jurisdiction. The OSA isn’t written to directly regulate the UK’s occupants, it exerts authority over non-UK maintainers of sites that UK occupants read. Even if the OSA was proportionate and reasonable, complying would encourage every jurisdiction to write similarly broad laws.
>
> ...
>
> So the current, bad plan is that Lobsters will geoblock the UK before the law takes effect on March 16. While the inaccuracy of IP databases and availability of VPNs mean that this can’t be perfectly accurate, unambiguously blocking UK occupants as effectively as we can is the only course I see to substantially reduce the risk the OSA is enforced against the site.
All pages on Lobste.rs now include a footer announcing this plan.
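For readers wondering what geoblocking the UK actually involves, the Go sketch below is a rough, hypothetical illustration, not Lobsters' actual implementation: an HTTP middleware loads a list of UK address ranges in CIDR form (the file `uk-prefixes.txt` is an assumed input derived from a country-IP database) and refuses matching requests before they reach the forum. As the quoted post notes, such databases are imperfect and VPNs sidestep the check entirely.

```go
// geoblock.go: a sketch of refusing requests from UK address ranges.
// The CIDR list is an assumed input; IP-to-country data is never exact.
package main

import (
	"bufio"
	"log"
	"net"
	"net/http"
	"net/netip"
	"os"
	"strings"
)

// loadPrefixes reads one CIDR prefix per line, e.g. "81.2.69.0/24".
func loadPrefixes(path string) ([]netip.Prefix, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var prefixes []netip.Prefix
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue
		}
		p, err := netip.ParsePrefix(line)
		if err != nil {
			return nil, err
		}
		prefixes = append(prefixes, p)
	}
	return prefixes, sc.Err()
}

// blockListed refuses any request whose client address falls inside one of
// the loaded prefixes and passes everything else through to the next handler.
func blockListed(prefixes []netip.Prefix, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		host, _, err := net.SplitHostPort(r.RemoteAddr)
		if err != nil {
			host = r.RemoteAddr
		}
		if addr, err := netip.ParseAddr(host); err == nil {
			for _, p := range prefixes {
				if p.Contains(addr) {
					http.Error(w, "This site is not available in the UK.", http.StatusForbidden)
					return
				}
			}
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	// "uk-prefixes.txt" is a hypothetical file of UK CIDR ranges.
	prefixes, err := loadPrefixes("uk-prefixes.txt")
	if err != nil {
		log.Fatal(err)
	}
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("forum front page\n"))
	})
	log.Fatal(http.ListenAndServe(":8080", blockListed(prefixes, mux)))
}
```

Behind a reverse proxy the check would have to use the forwarded client address rather than `r.RemoteAddr`, which is one of several details a real deployment has to get right.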
- www.theguardian.com Note to No 10: one speed doesn’t fit all when it comes to online safety | John Naughton
Legislation to protect children in the digital realm is essential. But if it results in the loss of small cycling and cancer-care forums, something’s gone wrong
> London Fixed Gear and Single-Speed (LFGSS) is an admirable online community of fixed-gear and single-speed cyclists in and around London. Sadly, this columnist does not qualify for membership: he doesn’t reside in (or near) the metropolis, and he requires a number of gears to tackle even the gentlest of inclines – and therefore admires hardier cyclists who disdain the assistance of Sturmey-Archer or Campagnolo hardware.
>
> There is, however, bad news on the horizon. After Sunday 16 March, LFGSS will be no more. Dee Kitchen, the software wizard (and cyclist) who is the core developer of Microcosm, a platform for running non-commercial, non-profit, privacy-sensitive, accessible online forums such as LFGSS, has announced that on that date they will “delete the virtual servers hosting LFGSS and other communities, and effectively immediately end the approximately 300 small communities that I run, and the few large communities such as LFGSS”.
>
> Why will Kitchen be doing this? Answer: they have read the statement published on 16 December by Ofcom, the regulator appointed by the government to implement the provisions of the Online Safety Act (OSA).
>
> ...
>
> Hang on: isn’t the OSA just about protecting children and adults from harmful content, bullying, pornography, etc – not discussions about fixed-gear bikes, cancer support, dog-walking, rebuilding valve amplifiers and the like? Well, bizarre though it sounds, the answer seems to be no. The act requires services that handle user-generated content to have baseline content moderation practices, use those moderation practices to take down reported content that violates UK law, and stop kids from seeing porn. And it applies to every service which handles user-generated content and has “links to the UK”.
>
> ...
>
> Kitchen believes that the online forums they host fall within the scope of the act and that they have “no way to dodge it” because they are based in the UK. “I can’t afford what is likely tens of thousand to go through all the legal and technical hoops here over a prolonged period of time … the site itself barely gets a few hundred in donations each month and costs a little more to run … this is not a venture that can afford compliance costs … and if we did, what remains is a disproportionately high personal liability for me, and one that could easily be weaponised by disgruntled people who are banned for their egregious behaviour.” Which is why they see no alternative to shuttering the platform.
>
> ...
>
> The root of the problems with the OSA is that it was framed and enacted by legislators who think that the “internet” consists only of the platforms of a small number of huge tech corporations. So they passed a statute that supposedly would deal with these corporate miscreants – without imagining what the unintended consequences would be on the actual internet of people using the technology for genuinely social purposes. And in doing so they have inadvertently raised the question famously posed by Alexander Pope in 1735 in his letter to Dr Arbuthnot: “Who breaks a butterfly upon a wheel?”
- Crosspost from Lemmy.zip: Important News - Geoblocking of the UK
cross-posted from: https://lemmy.zip/post/31644782
> Hi All,
>
> Some sad news, but it has become apparent that, in order to safeguard the longevity of this site, there are no options left other than to cease allowing people from the United Kingdom to access Lemmy.zip.
>
> Just to reassure everyone else right at the start - this ONLY affects users from the UK accessing Lemmy.zip. There is no effect on everyone else, and nothing in your Lemmy experience will change.
>
> Due to the implementation of the Online Safety Act, we will be restricting access to Lemmy.zip for UK users starting 15th February 2025 — one week from today.
>
> Why is this happening?
>
> Lemmy.zip is hosted in Finland, and we have always strived to operate with respect for privacy and in line with all applicable laws. However, the UK’s Online Safety Act presents significant legal and operational challenges for small, independent fediverse sites, just like this one. The Act’s vague and overbearing requirements, combined with the potential for disproportionate and extreme fines, force us to make this decision to protect both the site and its users.
>
> This law impacts a wide range of content with vague or conflicting definitions, and as a volunteer-run site, we cannot ensure full compliance. We do not wish to compromise your privacy or force you to verify your identity through intrusive age checks, which is the only method allowed under the Act. Therefore, we have no choice but to block access from the UK.
>
> This won't impact on federation, nor accessing the communities from an instance that tries to comply with (or ignores) the Act. Obviously if you're from any other country in the world that isn't the UK, this won't apply to you at all.
>
> If this affects you, then you are able to export your data (subscriptions etc) from your profile settings, and import them on to another instance (Feddit.uk is a good shout for Brits!)
>
> Unfortunately this is also brought about by my personal circumstances as the site owner - I'm not in a position to just ignore the Act like many are. Complying with the Act would mean we would either have to implement Age Verification for all users to access the site, or we would have to disable NSFW entirely, which means communities that use NSFW tags for spoilers or content warnings also wouldn't be accessible.
>
> For those curious, UK users will instead be directed to this page when they try to access the site.
>
> This has been a really hard decision to make, and I fear many more fediverse sites that are somehow linked to the UK will need to take this step in order to protect themselves.
>
> If this is overturned by the courts in the UK, then the block will be removed as soon as possible. I have my fingers crossed.
>
> Happy to answer any questions in the comments.
>
> Demigodrick.
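Lemmy.zip says blocked visitors "will instead be directed to this page". One way such a hand-off could look, assuming a reverse proxy in front of the application has already geolocated the client and set a hypothetical `X-Client-Country` header, is to answer UK requests with HTTP 451 Unavailable For Legal Reasons and link to the explanation page. The Go sketch below is illustrative, not Lemmy.zip's actual setup, and the notice URL is made up.

```go
// legalblock.go: answer UK-geolocated requests with HTTP 451 and a link to
// an explanation page. The "X-Client-Country" header and the notice URL are
// assumptions for the sake of the sketch.
package main

import (
	"log"
	"net/http"
)

const noticeURL = "https://example.org/uk-blocked" // hypothetical explanation page

func refuseUK(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Assume the reverse proxy sets this header from its own geo lookup.
		if r.Header.Get("X-Client-Country") == "GB" {
			w.Header().Set("Content-Type", "text/html; charset=utf-8")
			w.WriteHeader(http.StatusUnavailableForLegalReasons) // 451
			w.Write([]byte(`<p>Access from the UK is blocked. See <a href="` + noticeURL + `">why</a>.</p>`))
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("instance front page\n"))
	})
	log.Fatal(http.ListenAndServe(":8080", refuseUK(mux)))
}
```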
- www.theguardian.com Parents sue TikTok over child deaths allegedly caused by ‘blackout challenge’
Parents claim four children died as a result of attempting challenge that went viral in 2021
cross-posted from: https://feddit.uk/post/23853879
> The parents of four British teenagers have sued TikTok over the deaths of their children, which they claim were the result of the viral “blackout challenge”.
>
> The lawsuit claims Isaac Kenevan, 13, Archie Battersbee, 12, Julian “Jools” Sweeney, 14, and Maia Walsh, 13, died in 2022 while attempting the “blackout challenge”, which became popular on social media in 2021.
>
> The US-based Social Media Victims Law Center filed the wrongful death lawsuit against the social media platform TikTok and its parent company, ByteDance, on behalf of the children’s parents on Thursday.
>
> Matthew Bergman, the founding attorney of the Social Media Victims Law Center, said: “It’s no coincidence that three of the four children who died from self-suffocation after being exposed to the dangerous and deadly TikTok blackout challenge lived in the same city and that they all fit a similar demographic.
>
> “TikTok’s algorithm purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue. It was a clear and deliberate business decision by TikTok that cost these four children their lives.”
>
> ...
>
> The lawsuit accuses TikTok of being “a dangerous and addictive product that markets itself as fun and safe for children, while lulling parents into a false sense of security”. It says TikTok “pushes dangerous prank and challenge videos to children based on their age and location in order to increase engagement time on the platform to generate higher revenues”.
>
> The lawsuit further claims that TikTok has told lawmakers around the world that the blackout challenge had never been on its platform and “works to discount credible reports of children being exposed to and dying because of blackout and similar challenge videos on the platform”. It notes that other dangerous challenges that have been found on TikTok include those involving medications, hot water and fire.
>
> ...
>
> Jools’s mother has campaigned for parents to be given the legal right to access their children’s social media accounts to help understand why they died, after she was left with no clues as to her son’s death in 2022.
>
> Changes to the Online Safety Act, which come into force in the UK this year, explicitly require social media companies to protect children from encountering dangerous stunts and challenges on their platforms, as well as to proactively prevent children from seeing the highest-risk forms of content.
- www.theregister.com UK Online Safety Act may not be safe for blog owners
Individual publishers could be held liable for visitors' off-topic posts, legal eagle argues
> Individuals who run their own website could be held liable for, weirdly enough, off-topic visitor-posted comments that break the UK's Online Safety Act.
>
> According to Neil Brown, director of British law firm decoded.legal, it's a possibility under the wording of the law, though clear direction has not been provided.
>
> ...
>
> "As with quite a lot of the Online Safety Act, even with Ofcom's tomes of guidance, the answer to some of these most basic questions, particularly in the context of services provided by individuals, is, at this point at least, 'sod knows,'" Brown told The Register via email.
>
> Brown said he intends to put this question directly to Ofcom, though he doesn't expect a straight answer.
>
> ...
>
> "I think there's a reasonable interpretation of the words written by Parliament that, while a user-to-user service comprising posting comments relating to a blogpost is exempt, if a user-to-user service allows someone to comment on anything else, the exemption no longer applies, and the service is in scope of the Online Safety Act," Brown explained.
>
> Brown makes that case in more detail via a sample illegal content risk assessment for a hypothetical blog. His conclusion is that the exemption from liability applies only to the extent that third-party blog comments relate to the topic of the blog post. If off-topic posts appear, the exemption no longer applies, he suggests.
>
> "There is an argument that 4(a) applies only to comments about Alice blog posts, and that, if a commenter comments on something else, the comment brings the whole services outside the scope of 4(a), and thus is no longer exempt," Brown's content assessment explains.
>
> ...
>
> After this story was filed, a spokesperson for Ofcom said: "Under Section 55 of the Online Safety Act, comments and reviews on provider content on user-to-user services are exempt from the safety duties (i.e. are excluded from the definition of 'regulated user-generated content'). 'Provider content' means content published on the service by the provider of the service, or by a person acting on their behalf. The 'provider' refers to the entity or individual that has control over who can use the user-to-user part of the service (see Section 226 of the Act).
>
> "So in practice, if a blog and the platform on which it is posted are controlled by the writer of the blog, then the blog constitutes provider content, and comments and reviews on the blog are exempt. This exemption also extends to any further comments on such comments or reviews."
>
> ...
>
> In response to our request to clarify liability for off-topic content, Ofcom said, "Content will be exempt if it comprises comments or reviews 'relating to' provider content. The Act makes no mention of how closely connected to the provider content the comment or review must be to benefit from this exemption."