mbin
Slashdot
Matrix
Reddit
Tildes
Mastodon

[Poll] What social media platforms do you know about?

https://lemmy.world/post/33824945

Please answer a single option per comment and upvote the ones you know about.

Is there any social media without memes and US politics?

https://lemmy.world/post/33501954

Creating a coherent 100,000-word story with an LLM that has a limited context window requires strategic planning around narrative flow, continuity, and character development across multiple sessions. Here’s a strategy tailored to this scenario:

  • Detailed Plot Outline:

    Expand the Outline: Break the story down into smaller, manageable arcs or segments (e.g., each act could be split into several chapters). Each segment should have its own mini-outline:
    - Major plot points
    - Character development for that segment
    - Setting changes
    - Key interactions or conflicts

    Micro-Outline for Each Chapter: For each chapter within these arcs:
    - Opening scenario
    - Middle conflict
    - Resolution or cliffhanger
    - Character arcs within the chapter

  • Session Management:

    Context Management: Due to the limited context window, you’ll need to manage how much information is retained from session to session.

    Summarize Previous Content: Before each new prompt, provide a concise summary of the previous narrative sections. This summary should include:
    - Key events
    - Current state of characters
    - Unresolved conflicts or mysteries
    - Setting and time

    Prompt Structure: Begin each prompt with that summary, for example:

    "Previous chapter summary: [insert summary here]. Now, write the next chapter where [describe the key elements from the micro-outline]."

    Specify Tone and Style: If the story has a specific tone or narrative style, remind the LLM of it each time: "Maintain the [tone/style] from previous chapters."

    Length of Each Segment: Estimate how many words you can comfortably fit into one session. If your LLM can handle around 2,000 tokens (which could be around 1,500 words, depending on the model), you might aim for each session to produce a chapter of 1,500 words.

  • Continuity and Cohesion:

    Character Consistency: Keep a running document of character details, relationships, and developments outside the LLM context. Use this to ensure consistency:
    - Character sheets
    - Timeline of events

    Plot Devices: Use recurring elements or plot devices to maintain cohesion:
    - Recurring themes
    - Foreshadowing elements from earlier segments

    Feedback Loop: After each session, review the output for:
    - Continuity errors
    - Character voice consistency
    - Plot holes

    Use this feedback to adjust your next prompts or summaries to address any discrepancies.

  • Incremental Development:

    Iterative Refinement: As you generate content, refine your prompts based on what works…
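The workflow above can be sketched in a few lines of Python. This is only an illustration: `call_llm` and `update_summary` are stand-ins for whatever local backend you use (llama.cpp, Ollama, etc.), and the outline contents are made up. Everything except the model call runs as-is:

```python
# Sketch of the session loop: micro-outlines drive each session, and a
# rolling summary carries state between sessions.

def call_llm(prompt: str) -> str:
    # Stand-in for a real model call (replace with your local backend).
    return f"[chapter generated from a {len(prompt)}-character prompt]"

def build_prompt(summary: str, outline: dict, tone: str) -> str:
    """Assemble one session's prompt from the rolling summary and the
    chapter's micro-outline, restating the tone each time."""
    return (
        f"Previous chapter summary: {summary}\n"
        f"Maintain the {tone} tone from previous chapters.\n"
        "Now write the next chapter covering:\n"
        f"- Opening: {outline['opening']}\n"
        f"- Conflict: {outline['conflict']}\n"
        f"- Ending: {outline['ending']}\n"
    )

def update_summary(prev_summary: str, chapter: str) -> str:
    # Stand-in: in practice, ask the model to compress the chapter into key
    # events, character state, and unresolved conflicts. Keep the summary
    # bounded so it always fits the context window.
    return (prev_summary + " | " + chapter)[-500:]

# Hypothetical micro-outlines; a 100,000-word story at ~1,500 words per
# session needs on the order of 70 of these.
outlines = [
    {"opening": "storm hits the harbor", "conflict": "the ferry is lost",
     "ending": "cliffhanger: a light on the cliffs"},
]

summary = "(story start)"
chapters = []
for chapter_outline in outlines:
    prompt = build_prompt(summary, chapter_outline, tone="gothic")
    chapter = call_llm(prompt)
    chapters.append(chapter)
    summary = update_summary(summary, chapter)  # feedback loop between sessions
```

The key design point is that the model never sees the whole story: only the bounded summary plus one micro-outline enter the prompt, so the loop works the same at chapter 2 and chapter 70.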

How can I use an LLM to generate a 10k word long coherent story?

https://lemmy.world/post/23911063

    I’m interested in automatically generating lengthy, coherent stories of 10,000+ words from a single prompt using an open source local large language model (LLM). I came across the “Awesome-Story-Generation” repository [https://github.com/yingpengma/Awesome-Story-Generation] which lists relevant papers describing promising methods like “Re3: Generating Longer Stories With Recursive Reprompting and Revision” [https://arxiv.org/abs/2210.06774], announced in this Twitter thread from October 2022 [https://twitter.com/kevinyang41/status/1582149319032852480] and “DOC: Improving Long Story Coherence With Detailed Outline Control” [https://arxiv.org/abs/2212.10077], announced in this Twitter thread from December 2022 [https://twitter.com/kevinyang41/status/1605396050986307584]. However, these papers used GPT-3, and I was hoping to find similar techniques implemented with open source tools that I could run locally. If anyone has experience or knows of resources that could help me achieve long, coherent story generation with an open source LLM, I would greatly appreciate any advice or guidance.
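The methods the question mentions (Re3 and DOC) both boil down to an outline-then-expand-then-revise loop. A minimal sketch of that loop, with the model call stubbed out since the actual backend depends on your local setup, might look like:

```python
def call_llm(prompt: str) -> str:
    # Stub standing in for a local model (e.g. served via llama.cpp or Ollama).
    return f"[text for: {prompt[:40]}...]"

def generate_story(premise: str, n_sections: int = 5) -> str:
    # 1. Plan: ask the model for a high-level outline of the story.
    outline = call_llm(f"Write a {n_sections}-point outline for: {premise}")
    story = []
    for i in range(n_sections):
        # 2. Draft: expand one outline point, re-prompting with the outline
        #    plus the tail of the story so far to fit the context window.
        context = " ".join(story)[-2000:]
        draft = call_llm(
            f"Outline: {outline}\nStory so far: {context}\n"
            f"Write section {i + 1} of the story."
        )
        # 3. Revise: ask the model to check the draft against the outline
        #    (roughly Re3's rewrite/edit stage).
        section = call_llm(f"Revise for coherence with the outline: {draft}")
        story.append(section)
    return "\n\n".join(story)
```

This is a simplification of what those papers do (Re3 also ranks candidate continuations, and DOC enforces a much more detailed outline), but the recursive-reprompting skeleton is backend-agnostic, so it should port to any open-source LLM you can call from Python.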