Thioether @lemmy.world
Posts 0
Comments 7
"AI is going to eat itself: Experiment shows people training bots are using bots"
  • Interesting thought. Here's a thought experiment based on a hypothetical:

    • If neural networks operate very similarly to the brain, for which we are finding more and more evidence recently (sorry for the lack of citations, google will help)
    • Then is training on data generated by other A.I.s any worse than training on data generated by humans?

    In that case, is intelligence a combination of (a) the quality of the training data (in humans, think of "The Martians" like von Neumann and Erdős, who had world-class tutors) and (b) the quality of the underlying neural architecture, which was obviously high in those scientists? Where does the transformer architecture fall on that scale? Linguistically it seems to have something close to mastery; elsewhere it doesn't seem to be great. Are current iterations linguistic savants? And what happens when multi-modality comes into play: do we get something greater?
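
    The "AI eating itself" premise can be sketched with a toy experiment (my own illustrative stand-in, not from the article): fit a trivially simple "model" (a Gaussian) to data, have it generate synthetic data, then train the next generation only on that output, and so on. Over many generations the fitted distribution drifts and tends to narrow, since each generation can only reproduce what the last one sampled.

    ```python
    import random
    import statistics

    def fit_and_resample(data, n, rng):
        # "Train" a toy model: estimate mean/stddev from the data,
        # then generate n synthetic samples from the fitted Gaussian.
        mu = statistics.fmean(data)
        sigma = statistics.stdev(data)
        return [rng.gauss(mu, sigma) for _ in range(n)]

    rng = random.Random(0)
    # Generation 0: the original "human" data.
    data = [rng.gauss(0.0, 1.0) for _ in range(500)]
    stdevs = [statistics.stdev(data)]
    for generation in range(20):
        # Each generation is trained only on the previous generation's output.
        data = fit_and_resample(data, 500, rng)
        stdevs.append(statistics.stdev(data))

    print(f"stddev at gen 0: {stdevs[0]:.3f}, at gen 20: {stdevs[-1]:.3f}")
    ```

    Each refit adds estimation noise but no new information, so the chain can only lose diversity in the long run; real model-collapse papers make the same argument with language models in place of the Gaussian.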