Design and architecture
2024-07-09
llm-assistant
I am currently refactoring the code after having created tests for it.
I am working on the `sm.py` module.
- Register listeners in the Task: so they can call back when they finish or are stopped.
- Get EventSignalers: this is my solution for `task.is_stopped()`. The idea is to remove the dependency on `task` in the `prompt` method. Instead, we register an event signaler and pass the function `is_set` from the EventSignaler instead of a whole class.
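A minimal sketch of the idea, assuming `EventSignaler` wraps a `threading.Event` and that the listener and class names are placeholders (the real `sm.py` interfaces may differ):

```python
import threading
from typing import Callable, Optional


class EventSignaler:
    """Hypothetical signaler: only its `is_set` callable is shared."""

    def __init__(self) -> None:
        self._event = threading.Event()

    def signal(self) -> None:
        self._event.set()

    @property
    def is_set(self) -> Callable[[], bool]:
        return self._event.is_set


class Task:
    """Hypothetical Task with listener registration."""

    def __init__(self) -> None:
        self._listeners: list[Callable[[str], None]] = []
        self.stop_signaler = EventSignaler()

    def register_listener(self, listener: Callable[[str], None]) -> None:
        # Listeners are called back when the task finishes or is stopped.
        self._listeners.append(listener)

    def stop(self) -> None:
        self.stop_signaler.signal()
        for listener in self._listeners:
            listener("stopped")


def prompt(text: str, is_stopped: Callable[[], bool]) -> Optional[str]:
    # `prompt` depends only on a plain callable, not on the whole Task.
    if is_stopped():
        return None
    return f"response to: {text}"


task = Task()
task.register_listener(lambda event: print(f"task {event}"))
print(prompt("hello", task.stop_signaler.is_set))  # task not stopped yet
task.stop()
print(prompt("hello again", task.stop_signaler.is_set))  # None: stopped
```

The point of passing `is_set` rather than the task is that `prompt` only needs one narrow capability (a stop check), which keeps it testable with a trivial lambda.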
2024-06-28
word-guru is a dependency of ChatGPT dictionary.
word-guru itself uses llm-assistant behind the scenes to
make its LLM prompt requests.
word-guru is a CLI application derived from a collection of
prompts. We can envision a feature in llm-assistant to
automatically generate CLI applications from a collection
of prompts.
Workflow example
- Use llm-assistant's `session` and `custom` features to test new prompts.
- As soon as we are satisfied with those prompts, we encapsulate them within a Python package, for example, word-guru.
- The encapsulated package is an independent project in the sense that it has an API library and a CLI application.
- The encapsulated package embodies a version of a collection of prompts.
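The "generate a CLI from a collection of prompts" feature could be sketched like this, assuming prompts are plain templates keyed by name (the prompt names, templates, and `run` helper are illustrative, not word-guru's actual API):

```python
import argparse

# Hypothetical prompt collection, as a package like word-guru might hold one.
PROMPTS = {
    "define": "Give a concise definition of the word: {word}",
    "synonyms": "List synonyms for the word: {word}",
}


def build_parser(prompts: dict) -> argparse.ArgumentParser:
    """Generate one CLI subcommand per prompt in the collection."""
    parser = argparse.ArgumentParser(prog="word-guru")
    subparsers = parser.add_subparsers(dest="command", required=True)
    for name in prompts:
        sub = subparsers.add_parser(name)
        sub.add_argument("word")
    return parser


def run(argv: list) -> str:
    args = build_parser(PROMPTS).parse_args(argv)
    # In the real package the rendered prompt would be sent through
    # llm-assistant; here we just return the rendered template.
    return PROMPTS[args.command].format(word=args.word)


print(run(["define", "serendipity"]))
```

Because the subcommands are derived entirely from the prompt dictionary, versioning the package really does version the prompt collection and its CLI together.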