Hey everyone, I’m looking for a way to use an open source local large language model (LLM) on Linux, particularly on low-spec hardware like a Raspberry Pi, to generate lengthy, coherent stories of 10k+ words from a single prompt. I recall reading about methods described in papers such as “Re3: Generating Longer Stories With Recursive Reprompting and Revision”, announced in a Twitter thread in October 2022, and “DOC: Improving Long Story Coherence With Detailed Outline Control”, announced in a Twitter thread in December 2022. Those papers used GPT-3, and since it’s been a while, I was hoping something similar might exist built on only open source tools. Does anyone have experience with this, or know of any resources that could help me achieve long, coherent story generation with an open source LLM? Any advice or pointers would be greatly appreciated. Thank you!
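For anyone unfamiliar with the approach those papers take, here is a minimal sketch of the general outline-then-expand idea (generate an outline first, then expand each plot point while re-prompting with a rolling recap of the story so far). This is an illustration of the pattern, not the papers' actual code; `llm` stands in for any prompt-to-text callable, e.g. a wrapper around a local model served by llama.cpp, and all function and parameter names here are made up:

```python
# Sketch of outline-then-expand long-story generation, in the spirit of
# Re3/DOC. `llm` is any callable taking a prompt string and returning
# generated text; swap in a local model wrapper of your choice.

def make_outline(llm, premise, n_sections=5):
    """Ask the model for a numbered outline, one plot point per line."""
    prompt = (f"Premise: {premise}\n"
              f"Write a {n_sections}-point numbered outline for this story:\n")
    lines = llm(prompt).strip().splitlines()
    # Strip the "1. " style numbering to get bare plot points.
    return [ln.lstrip("0123456789. ").strip() for ln in lines if ln.strip()]

def write_story(llm, premise, n_sections=5, recap_chars=1500):
    """Expand each outline point in turn, re-prompting with the premise,
    the full outline, and a recap of recent text to keep the model
    coherent across a story far longer than its context window."""
    outline = make_outline(llm, premise, n_sections)
    sections = []
    for i, point in enumerate(outline, 1):
        recap = " ".join(sections)[-recap_chars:]  # crude rolling context
        prompt = (f"Premise: {premise}\n"
                  f"Outline: {'; '.join(outline)}\n"
                  f"Story so far (recap): {recap}\n"
                  f"Write section {i}, covering: {point}\n")
        sections.append(llm(prompt).strip())
    return "\n\n".join(sections)
```

The papers add much more on top (revision passes, detailed outline control, rerankers), but this loop is the core reason a small local model can stay coherent past its context limit: each call only ever sees the outline plus a bounded recap.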

  • Political Custard@lemmygrad.ml · 8 months ago

    I keep a bookmark to this AI detector on my browser bar https://gptzero.me/

    I just do random checks for AI generated stuff, especially if it’s a news site I’ve not seen before. There are use cases for AI in text creation, but I think they’re very limited - producing summaries (with references), for example. Instead, I’m excited about medical advances, space research, cases where there are huge datasets of good quality, not datasets that are the result of huge dragnets of shite from the internet.

    People are quite right to be cynical about AI-generated stuff; we’re near the zenith of a hype cycle at the moment.

    You’ll be getting no downvote from me.