I know there are other plausible reasons, but I thought I’d use this juicy title.

What does everyone think? As someone who works outside of tech, I’m curious to hear the collective thoughts of the tech minds on Lemmy.

  • agent_flounder@lemmy.world · 8 months ago

    Yes, except human brains can learn things without the manual training and tweaking you typically see in ML. In other words, LLMs can’t just start from an initial “blank” state and train themselves autonomously. A baby starts from an initial state and learns about objects, calibrates its eyes, proprioception, and movement, then learns to roll over, crawl, stand, walk, and grasp, learns to understand language and then speak it, etc. Of course there’s parental involvement and all that, but it’s not like someone training an LLM on a massive dataset.
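
    To make “manual training” concrete, here’s a minimal sketch of a supervised training loop (illustrative only; the toy model, data, and hyperparameters are all made up for the example). The point is that every piece of it is chosen and wired up by a person, not by the model:

    ```python
    # Illustrative sketch: a human supplies the data, the architecture,
    # the objective, and the training schedule. The model never sets
    # any of this up for itself.
    import torch
    import torch.nn as nn

    # Hypothetical hand-curated dataset (a person collected and labeled it).
    inputs = torch.randn(64, 10)
    targets = torch.randn(64, 1)

    model = nn.Linear(10, 1)        # architecture picked by a person
    loss_fn = nn.MSELoss()          # objective picked by a person
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # hyperparameter tuned by a person

    for epoch in range(100):        # training schedule picked by a person
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()
    ```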