Design and architecture

2024-07-09

llm-assistant

I am currently refactoring the code after having created tests for it. Right now I am working on the sm.py module.

  • Register listeners in the Task: so they can call back when they finish or stop.
  • Get EventSignalers: this is my solution for task.is_stopped(). The idea is to remove the dependency on the task in the prompt method. Instead, we register an event signaler and pass the is_set function of the EventSignaler rather than the whole class.
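A minimal sketch of what this could look like, assuming an EventSignaler that wraps threading.Event; all class and method names here are illustrative, not the actual sm.py API:

```python
import threading

class EventSignaler:
    """Thin wrapper around threading.Event; only is_set is handed out."""
    def __init__(self):
        self._event = threading.Event()

    def set(self):
        self._event.set()

    def is_set(self) -> bool:
        return self._event.is_set()

class Task:
    def __init__(self):
        self._listeners = []                  # callbacks fired when the task finishes or stops
        self._stop_signaler = EventSignaler()

    def register_listener(self, callback):
        self._listeners.append(callback)

    def stop(self):
        self._stop_signaler.set()

    def run(self, prompt_fn):
        # The prompt function receives only is_set, not the whole Task.
        result = prompt_fn(is_stopped=self._stop_signaler.is_set)
        for listener in self._listeners:
            listener(result)
        return result

def prompt(is_stopped):
    # The prompt loop can poll is_stopped() without knowing about Task.
    if is_stopped():
        return "stopped"
    return "response"

task = Task()
task.register_listener(lambda r: print(f"task finished with: {r}"))
task.run(prompt)
```

This keeps the prompt method testable with a plain function (e.g. lambda: False) instead of a full Task instance.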

2024-06-28

word-guru is a dependency of ChatGPT dictionary

word-guru itself uses llm-assistant behind the scenes to make its LLM prompt requests.

word-guru is a CLI application derived from a collection of prompts. We can envision a feature in llm-assistant to automatically generate CLI applications from a collection of prompts.
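Such a generator could map each named prompt to a CLI subcommand. A rough sketch with argparse, where the prompt collection and the run_prompt helper are invented for illustration (the real calls would go through llm-assistant):

```python
import argparse

# A hypothetical collection of prompts, as word-guru might define them.
PROMPTS = {
    "define": "Give a concise definition of the word: {word}",
    "synonyms": "List five synonyms for the word: {word}",
}

def run_prompt(template: str, word: str) -> str:
    # Placeholder: a real implementation would send the filled-in
    # template to the LLM via llm-assistant and return its answer.
    return template.format(word=word)

def build_cli(prompts: dict) -> argparse.ArgumentParser:
    """Generate one subcommand per prompt in the collection."""
    parser = argparse.ArgumentParser(prog="word-guru")
    subparsers = parser.add_subparsers(dest="command", required=True)
    for name, template in prompts.items():
        sub = subparsers.add_parser(name, help=f"Run the '{name}' prompt")
        sub.add_argument("word")
        sub.set_defaults(template=template)
    return parser

if __name__ == "__main__":
    args = build_cli(PROMPTS).parse_args()
    print(run_prompt(args.template, args.word))
```

Because the CLI is derived entirely from the prompt dictionary, adding a prompt automatically adds a subcommand.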

Workflow example

  1. Use llm-assistant session and custom features to test new prompts.
  2. As soon as we are satisfied with those prompts, we encapsulate them within a Python package, for example word-guru.
  3. The encapsulated package is an independent project in the sense that it has an API library and a CLI application.
  4. The encapsulated package embodies a specific version of a collection of prompts.
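The split in step 3 could look like this: the package exposes plain functions as its API library, and the CLI is a thin wrapper over them. The function and module names below are illustrative, not the actual word-guru API:

```python
import sys

# --- library API (e.g. word_guru/api.py) ---
def define(word: str) -> str:
    """Build the definition prompt for `word` (stubbed here)."""
    prompt = f"Give a concise definition of the word: {word}"
    # A real implementation would send the prompt via llm-assistant.
    return prompt

# --- CLI wrapper (e.g. word_guru/cli.py) ---
def main(argv=None) -> int:
    argv = sys.argv[1:] if argv is None else argv
    if len(argv) != 1:
        print("usage: word-guru-define WORD", file=sys.stderr)
        return 1
    print(define(argv[0]))
    return 0
```

Keeping the CLI this thin means other tools (such as a ChatGPT dictionary frontend) can depend on the API library directly, without going through the command line.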