With a 10:3 ratio in support of migrating the community off of lemmy.world, as suggested by @[email protected] in this post, we will run a trial period of migrating the community over to [email protected].
Trial period
We will run a trial period of 2 months on lemm.ee to see how stable the instance is. Since all of you who discussed potential instances only had good things to say about it, expect the trial period to be just a formality.
The FootballAutoMod has been configured to automatically lock down any new posts and tell the posters to repost in the new community. While I could lock the community completely, I think this solution might help inform those of you who maybe don't check in on the community regularly.
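For anyone curious how a lock-and-redirect bot like that can work, here is a rough sketch against Lemmy's HTTP API. This is not the actual FootballAutoMod code; the endpoint paths and fields (/api/v3/post/list, /api/v3/comment, /api/v3/post/lock, bearer-token auth) are what recent Lemmy versions expose, and the notice text is a placeholder.

import requests

API = 'https://lemmy.world/api/v3'
HEADERS = {'Authorization': 'Bearer ---'}  # JWT from /api/v3/user/login, redacted here
NOTICE = 'This community has moved, please repost in the new community.'

def redirect_new_posts(community_name: str) -> None:
    # Grab the newest posts in the old community.
    posts = requests.get(f'{API}/post/list',
                         params={'community_name': community_name, 'sort': 'New', 'limit': 20},
                         headers=HEADERS).json()['posts']
    for post in posts:
        if post['post']['locked']:
            continue  # already handled on a previous run
        # Tell the poster where to repost, then lock the post.
        requests.post(f'{API}/comment',
                      json={'content': NOTICE, 'post_id': post['post']['id']},
                      headers=HEADERS)
        requests.post(f'{API}/post/lock',
                      json={'post_id': post['post']['id'], 'locked': True},
                      headers=HEADERS)

redirect_new_posts('football')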
It's a moderately large EU-based instance with little to no drama as far as I know. The admin team seems competent and maintains a neutral stance, with few defederations across the fediverse. The instance also received multiple recommendations from our users.
If you are referring to FootballAutoMod, it's a very basic bot I wrote for posting weekly discussion threads. The tagging of people was just a simple script I wrote to get a list of everyone who has interacted with the community in the last 365 days and then split that list up to stay within the 10k character limit for comments.
Maybe you should create a post to share your script with users who don't know about it, and with the rest of the lemmyverse. It may be simple, but for mods/admins it's a huge quality-of-life change: we make a post, then a second one, then close the community and lose people along the way.
On Mastodon, when you change accounts and migrate to another server, it invites users to follow you at your new address and updates it for them. I think I will suggest this for Piefed too; an integrated feature like that would be very helpful. :)
With this script, can you get the list of subscribers?
get_posts() may be missing from the lemmy_session.py file though
def get_posts(self, *, community: int | str | LemmyCommunity, sort: str = 'New', page: int = 1) -> dict[str, any]:
    """
    Gets the posts of a community.

    :param community: The ID of the community to get the posts of, can also be a LemmyCommunity parseable
                      string/object.
    :param sort: The sorting method of the posts, by default 'New'.
    :param page: The page number of the posts, by default 1.
    :return: The response JSON of the request as a dictionary.
    """
    if isinstance(community, (LemmyCommunity, str)):
        response: Final[requests.Response] = self.srv.get_posts(community_name=community, sort=sort, page=page,
                                                                limit=50)
    else:
        response: Final[requests.Response] = self.srv.get_posts(community_id=community, sort=sort, page=page,
                                                                limit=50)
    if response.status_code != 200:
        raise requests.exceptions.HTTPError(response.text)
    return response.json()
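If the rest of lemmy_session.py isn't handy, the wrapper presumably just calls Lemmy's public GET /api/v3/post/list endpoint underneath; a bare-bones equivalent (parameter names from the Lemmy API, everything else an assumption) would look like:

import requests

# Fetch one page of the newest posts in the community, same parameters as get_posts() above.
resp = requests.get('https://lemmy.world/api/v3/post/list',
                    params={'community_name': 'football', 'sort': 'New', 'page': 1, 'limit': 50})
resp.raise_for_status()
print(len(resp.json()['posts']))  # up to 50 posts per page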
Excuse the ugly code, it was written as a one-off
import os
import time
import datetime
from json import load, dump

import requests

from lemmy_references import LemmyCommunity, LemmyUser
from lemmy_session import LemmySession

session: LemmySession = LemmySession(website='https://lemmy.world/',
                                     username='TestUlrikHD',
                                     password='---',
                                     end_script_signal=None)
# Collect every post made in the community during the last 365 days.
posts: list[dict[str, any]] = []
cutoff_date: datetime.datetime = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=365)
page_count: int = 1
loop_break: bool = False
while True:
    post_response: dict[str, any] = session.get_posts(community=LemmyCommunity('football', 'lemmy.world'),
                                                      page=page_count)
    page_count += 1
    for post in post_response['posts']:
        if datetime.datetime.fromisoformat(post['post']['published']) > cutoff_date:
            posts.append(post)
        else:
            # Posts come sorted by 'New', so the first post older than the cutoff ends the scan.
            loop_break = True
            break
    if loop_break:
        break
# Record every poster and commenter, together with a post/comment to reply under for each user.
user_dict: dict[str, dict[str, any]] = {}
for post in posts:
    user_dict[str(LemmyUser(post['creator']['actor_id']))] = {'post': True, 'post_id': post['post']['id']}
    comments = session.get_post_comments(post_id=post['post']['id'])
    for comment in comments['comments']:
        user: str = str(LemmyUser(comment['creator']['actor_id']))
        if user not in user_dict:
            user_dict[user] = {'post': False, 'post_id': comment['post']['id'], 'parent_id': comment['comment']['id']}

del user_dict[str(LemmyUser('[email protected]'))]
del user_dict[str(LemmyUser('[email protected]'))]

with open('user_dict', 'w', encoding='utf-8') as file:
    dump(user_dict, file, ensure_ascii=False, indent=4)
def log_reply(usr: str) -> None:
    user_list: list[str] = []
    if os.path.isfile('reply_list.json'):
        with open('reply_list.json', 'r', encoding='utf-8') as file:
            user_list = load(file)
    user_list.append(str(usr))
    with open('reply_list.json', 'w', encoding='utf-8') as file:
        dump(user_list, file, ensure_ascii=False, indent=4)
for username, user in user_dict.items():
    time.sleep(1)
    try:
        # if user['post']:
        #     session.reply(content='migration message', post_id=user['post_id'], parent_id=None)
        # else:
        #     session.reply(content='migration message', post_id=user['post_id'], parent_id=user['parent_id'])
        log_reply(usr=LemmyUser(username).str_link())
    except requests.HTTPError as e:
        print(f'Failed to send message to {username} - {e}')
And this part creates txt files for easy copy-pasting when tagging.
from json import load

with open('reply_list.json', 'r', encoding='utf-8') as file:
    user_list: list[str] = load(file)

loop_count: int = len(', '.join(user_list)) // 9500 + 1
for i in range(loop_count):
    with open(f'reply_list_{i}.txt', 'w', encoding='utf-8') as file:
        print(len(' '.join(user_list[i * len(user_list) // loop_count:(i + 1) * len(user_list) // loop_count])))
        file.write(', '.join(user_list[i * len(user_list) // loop_count:(i + 1) * len(user_list) // loop_count]))
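Note that the split above divides the user list into roughly equal slices, so a slice can in principle still exceed the limit. If you want a hard guarantee, a greedy variant (hypothetical, not part of the original script) could pack each chunk until adding another name would cross the limit:

def chunk_mentions(users: list[str], limit: int = 9500) -> list[str]:
    # Pack usernames into comma-separated chunks, each guaranteed to stay under `limit` characters.
    chunks: list[str] = []
    current: list[str] = []
    for user in users:
        if current and len(', '.join(current + [user])) > limit:
            chunks.append(', '.join(current))
            current = [user]
        else:
            current.append(user)
    if current:
        chunks.append(', '.join(current))
    return chunks

for i, chunk in enumerate(chunk_mentions(user_list)):
    with open(f'reply_list_{i}.txt', 'w', encoding='utf-8') as out:
        out.write(chunk)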
Thanks a lot, I will share it in our Meta community so we can save it and set up a migration bot. :)
I also shared this idea, so maybe Piefed can integrate it into its core with a migration button: you add the new address and subscribers receive a notification to join the new community.