Hey everyone, I’m looking for a way to use an open source local large language model (LLM) on Linux, particularly on low-spec hardware like a Raspberry Pi, to generate lengthy, coherent stories of 10k+ words from a single prompt. I recall reading about methods described in scientific papers such as “Re3: Generating Longer Stories With Recursive Reprompting and Revision”, announced in this Twitter thread from October 2022, and “DOC: Improving Long Story Coherence With Detailed Outline Control”, announced in this Twitter thread from December 2022. Those papers used GPT-3, and since it’s been a while, I was hoping something similar now exists built entirely on open source tools. Does anyone have experience with this, or know of any resources that could help me achieve long, coherent story generation with an open source LLM? Any advice or pointers would be greatly appreciated. Thank you!
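
For reference, the core loop both papers describe is roughly: draft a plot outline first, then expand each outline point in turn while re-injecting a running summary so the model stays coherent past its context window. Here's a rough sketch of that pattern with a local model. To be clear, llama-cpp-python, the GGUF file name, the prompts, and the generation settings are all my own placeholder assumptions, not what the papers actually used:

```python
from llama_cpp import Llama

# Placeholder model path: any small instruction-tuned GGUF model should work.
llm = Llama(model_path="model.gguf", n_ctx=2048)

def generate(prompt: str, max_tokens: int = 512) -> str:
    # Plain completion call; returns only the generated text.
    out = llm(prompt, max_tokens=max_tokens, temperature=0.8)
    return out["choices"][0]["text"]

premise = "A lighthouse keeper discovers the light is a door."

# Step 1: plan -- ask the model for an outline of the whole story.
outline = generate(
    f"Write a numbered 10-point plot outline for this premise: {premise}\n1."
)

# Step 2: expand -- write each outline point, carrying a rolling summary
# forward so later passages stay consistent with earlier ones.
story, summary = [], ""
for point in outline.strip().splitlines():
    passage = generate(
        f"Summary of the story so far: {summary}\n"
        f"Write the next passage of the story, covering this outline point: {point}\n"
    )
    story.append(passage)
    # Compress everything written so far back into a short summary.
    summary = generate(
        f"Summarize in three sentences:\n{summary}\n{passage}", max_tokens=128
    )

print("\n\n".join(story))
```

That's obviously nowhere near the full revision and reranking machinery from Re3/DOC, just the skeleton of the idea as I understand it.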

  • INeedMana@lemmy.world · 5 months ago

    If you want something local and open source, I think your main problem will be the number of parameters (the “B” in model names like 7B or 13B). GPT-3 is (was?) noticeably big, and open source models are usually smaller. There is, of course, an ongoing debate about how much the size of the model matters versus how much the quality of the training data affects the results. But when I did a non-scientific comparison ~half a year ago, there was a noticeable difference between smaller models and bigger ones.

    Having said all of that, check out https://huggingface.co/. It aims to be like GitHub for AI models. Most of the models there are more or less open source; you'll just need to figure out how to run one and whether you hit any bottlenecks on the Pi.
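
    To get a feel for whether a given model runs on the Pi at all, a quick smoke test like this is usually enough (the TinyLlama model name below is just an example of a small model; swap in whatever fits your RAM):

    ```python
    # Quick CPU smoke test for a small open model from Hugging Face.
    # The model name is only an example; pick one that fits the Pi's RAM.
    from transformers import pipeline

    gen = pipeline(
        "text-generation",
        model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
        device=-1,  # force CPU
    )
    print(gen("Once upon a time on a small island,", max_new_tokens=100)[0]["generated_text"])
    ```

    If transformers turns out to be too heavy for the Pi, llama.cpp-based runtimes tend to be a lighter option for the same models.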