• 0 Posts
  • 79 Comments
Joined 1 year ago
Cake day: June 29th, 2023


  • People sometimes act like the models can only reproduce their training data, which is what I’m saying is wrong. They do generalise.

    During training the models are trained to predict the next word, but after training the network is effectively interpolating between the training examples it has memorised. This interpolation doesn’t happen in text space, though; it happens in a very high-dimensional abstract semantic representation space, a ‘concept space’.

    Now imagine that you have memorised two paragraphs that occupy two points in concept space. And then you interpolate between them. This gives you a new point, potentially unseen during training, a new concept, that is in some ways analogous to the two paragraphs you memorised, but still fundamentally different, and potentially novel.
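    A toy sketch of the idea above, using made-up 8-dimensional vectors as stand-ins for real paragraph embeddings (an actual model would use hundreds or thousands of dimensions, and the embeddings themselves would come from the network, not a random generator):

    ```python
    import numpy as np

    # Hypothetical embeddings of two memorised paragraphs in "concept space".
    # These random vectors are placeholders, not real model outputs.
    rng = np.random.default_rng(0)
    paragraph_a = rng.normal(size=8)
    paragraph_b = rng.normal(size=8)

    def interpolate(a, b, t):
        """Linear interpolation: t=0 returns a, t=1 returns b, 0<t<1 blends them."""
        return (1 - t) * a + t * b

    # The midpoint is a new point in concept space: related to both
    # originals, but identical to neither.
    midpoint = interpolate(paragraph_a, paragraph_b, 0.5)

    dist_a = np.linalg.norm(midpoint - paragraph_a)
    dist_b = np.linalg.norm(midpoint - paragraph_b)
    dist_ab = np.linalg.norm(paragraph_a - paragraph_b)
    print(dist_a, dist_b, dist_ab)  # midpoint sits halfway along the segment
    ```

    The point is just that a blend of two known points is itself a well-defined point the model can land on, even if nothing exactly like it appeared in training.
    
    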







  • HorseRabbit@lemmy.sdf.org to the_dunk_tank@hexbear.net · Thank god
    2 months ago

    The whole misinformation angle is bullshit. It’s such a lib mindset to think your enemies are just misinformed.

    The majority of people at Jan 6 were business owners or upper management. They came from urban communities that had seen a decline in the white population. https://d3qi0qp55mx5f5.cloudfront.net/cpost/i/docs/Pape_AmericanInsurrectionistMovement_2022-01-02.pdf

    Like every fascist movement, Jan 6 was the petite bourgeoisie attempting to shut down democracy and install a dictator who would stop demographic changes and keep the current hierarchies unchanged, despite the falling rate of profit and the resulting monopolisation and need for the growth of the working class.

    But if you refuse to see it as a systemic, almost inevitable process, then you’re left thinking the Jan 6 people were just CraZzyYy. Maybe it’s that damn internet the kids are always on!