Thioether

joined 1 year ago
[–] Thioether 24 points 1 year ago

Combination of VC money drying up and fear of LLMs sucking up their future revenue streams. I think the former is the logical driver and the latter is the secret fear.

[–] Thioether 4 points 1 year ago

Hope Lemmy works out

[–] Thioether 1 points 1 year ago

Unequivocally yes. What if it smells? What if it’s noisy and shakes?

[–] Thioether 4 points 1 year ago

Who the fuck goes to Southport for an ice cream

[–] Thioether 21 points 1 year ago

Yeah, but didn’t NATO specifically state the other day that any form of radioactive contamination of member states would result in troops on Russian soil?

[–] Thioether 1 points 1 year ago

Interesting. A thought experiment based on a hypothetical:

  • If neural networks operate very similarly to the brain, for which we’re finding more and more evidence recently (sorry for the lack of citations; Google will help),
  • then is training on AI-generated data any worse than training on human-generated data?

In that case, is intelligence a combination of the quality of the training data (in humans, for example, “The Martians” like von Neumann and Erdős, who had world-class tutors) and the quality of the underlying neural architecture? The latter was obviously high in those scientists. Where does the transformer architecture lie? Linguistically it seems to have near mastery; elsewhere it doesn’t seem to be great. Are current iterations linguistic savants? When multi-modality comes into play, do we get something greater?