klin @kbin.social
Posts 7
Comments 23
Introducing Llama 2 - Meta's Next-Generation Commercially Viable Open-Source AI & LLM
  • honestly kinda hyped! just got it working on llama.cpp, and the fact that it's licensed for commercial use is insane
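
    (a minimal sketch of what that local setup can look like through the llama-cpp-python bindings; the model path and generation settings below are placeholders and assumptions, not the exact setup from this comment)

```python
# Minimal sketch: running a local Llama 2 conversion through llama-cpp-python.
# Assumes you've already downloaded a quantized conversion of the weights;
# the path below is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,  # context window to allocate
)

# Simple completion call; the result dict holds the generated text under "choices".
out = llm("Q: Name three uses of a locally hosted LLM. A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```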

  • Full screen picture home?
  • what do you mean full screen picture mode? like have the thumbnails be bigger and below the title?

  • This Artemis App icon hits real hard…
  • lmao it’s so amazing

  • Jayson Tatum Reportedly Recruiting Damian Lillard to Join Celtics
  • what is going on lol

    Posted with Artemis (beta)

  • Artemis private beta rollouts have begun!
  • LETS GOOOOO

    Posted with Artemis (beta)

  • Commenting now works in Artemis!
  • HELLO FROM ARTEMIS!!

    Posted with Artemis (beta)

  • Some updates for BotIt: I have it running indefinitely on DigitalOcean + other things.
  • good stuff!!!! i can’t wait to dive back into this when i have a bit more time :)

  • Since most posts for Artemis have been about iOS, here's a little preview for you Android users!
  • design still WIP/rough — we’re rushing to get the core functionality implemented! though agreed, IMO the button and proportions can def be adjusted

  • Baichuan 7B reaches top of LLM leaderboard for its size?

    baichuan-7B is an open-source large-scale pre-trained model developed by Baichuan Intelligent Technology. Based on the Transformer architecture, it is a model with 7 billion parameters trained on approximately 1.2 trillion tokens. It supports both Chinese and English, with a context window length of 4096. It achieves the best performance of its size on standard Chinese and English authoritative benchmarks (C-EVAL/MMLU).

    GitHub: https://github.com/baichuan-inc/baichuan-7B

    Hugging Face: https://huggingface.co/baichuan-inc/baichuan-7B
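
    For anyone who wants to try it, here is a rough sketch of loading the checkpoint through Hugging Face transformers. The trust_remote_code flag, dtype, and device settings are assumptions about how the repo's custom modeling code is typically loaded, so check the model card before running:

```python
# Rough sketch: loading baichuan-7B with transformers.
# trust_remote_code=True is assumed because the repo ships custom modeling code;
# adjust dtype/device for your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baichuan-inc/baichuan-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,
)

# Plain completion-style generation (the model is a base LM, not chat-tuned).
inputs = tokenizer("The three most spoken languages in the world are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```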

  • Toggling pagination on/off for comments
  • infinite scroll (in settings) works for posts so im sure this can be implemented for comments too

  • feature wishlist
  • yup that’s me! it’s gone through several reviews and i envision it to hopefully go in once ernest has some time to breathe!

    if u want to see gifs of it in action: https://codeberg.org/Kbin/kbin-core/pulls/167

  • FYI sort by "Top Day" if you're tired of seeing the same top posts from 2-3 days ago
  • even better.. if you're a constant refresher like me do top + 3hr

  • *Permanently Deleted*
  • man i see u everywhere otome chan! (not in a bad way)

    love that the community is so active :D

  • Using this magazine as a dev center?
  • totally ok with deleting this article too if it's disruptive

  • open_llama_13b trained on 1T tokens now available!

    Hugging Face: https://huggingface.co/openlm-research/open_llama_13b

    Exciting stuff!

  • Introducing Kbin Link: Navigate between communities with ease
  • would be awesome if we could get this integrated natively — do u wanna open a PR or issue to get this in?

  • Welcome to /m/localllama! Links & FAQ

    Run LLaMa locally

    FAQ

    [LINKS & FAQ WIP]

    SlimPajama: A 627B token cleaned and deduplicated version of RedPajama

    www.cerebras.net: SlimPajama: A 627B token, cleaned and deduplicated version of RedPajama - Cerebras

    Sounds like Cerebras will be training a model based on this dataset soon, and it'll likely rival the OpenLLaMA and RedPajama models. Thoughts??
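
    A quick sketch of sampling the data without downloading the whole corpus, assuming the dataset is published on the Hugging Face Hub as cerebras/SlimPajama-627B with a plain `text` field:

```python
# Sketch: streaming a few SlimPajama records via the `datasets` library.
# The repo id and the "text" field name are assumptions about how the
# dataset is published on the Hugging Face Hub.
from itertools import islice

from datasets import load_dataset

# streaming=True iterates over shards lazily instead of downloading ~900 GB.
ds = load_dataset("cerebras/SlimPajama-627B", split="train", streaming=True)

# Print the start of the first few documents.
for example in islice(ds, 3):
    print(example["text"][:200])
```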

  • Collapse comment thread?
  • TBH that's not too hard to do, but it goes back to the whole mobile-friendliness question :(

    IMO the upside of mobile friendliness outweighs the times people accidentally tap on the touchpad, and i personally think having just the header as a touch target is a little too small for mobile users.. so i prefer the whole comment being a target

    obviously if folks disagree heavily i could probably adjust but i'm curious what @ernest thinks

    /kbin meta @kbin.social klin @kbin.social

    kbin.social is.. fast again?

    been using it on and off and now it's... much faster? on both mobile and desktop :O am i just crazy or did @ernest bless us with a miracle <3

    are you guys feeling this too?

    /kbin meta @kbin.social klin @kbin.social

    Is there a way to view the LIST of magazines i’m subscribed to on kbin?

    I see that you can change your feed to only what you’re subscribed to, but I’m pretty used to checking specific subreddits from the favorites page in Apollo — can’t seem to find a way to do the same with kbin

    i see random magazines and posts on the sidebar; seems like a favorites list or subscriptions list would fit well around that area, just higher up
