Well, emotions were always a feature of the LivinGrimoire. It looks like my initial assumption was correct:
emotions are simply algorithm parts.

Meaning, an emotion is one of the objects that make up an algorithm.

To get the current emotion, use self.brain.logicChobit.getSoulEmotion(), as in the example below.

So if you shallow-ref the chobit into a skill attribute, that skill gains awareness of the emotion. That lets you do things like set a facial expression animation (or robotic facial animation), or utilize less power when she is sad, for example.


    def input(self, ear: str, skin: str, eye: str):
        result = self.brain.getLogicChobitOutput()
        if result:  # only react when the chobit produced output
            self.app.lbl["text"] = self.app.tes.add_new_lines(result, 50)
            # pick a facial image based on the current emotion
            match self.brain.logicChobit.getSoulEmotion():
                case "APVerbatim" | "APMad":
                    self.app.btn.config(image=self.app.photos[self.mode][self.app.rnd.getSimpleRNDNum(3) + 4])
                case "APHappy" | "APSad":
                    self.app.btn.config(image=self.app.photos[self.mode][self.app.rnd.getSimpleRNDNum(3) + 12])
                case "APShy":
                    self.app.btn.config(image=self.app.photos[self.mode][self.app.rnd.getSimpleRNDNum(3) + 16])
                case _:
                    self.app.btn.config(image=self.app.photos[self.mode][self.app.rnd.getSimpleRNDNum(len(self.app.photos) - 1)])
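Here is a minimal sketch of the shallow-ref idea above: a skill holding a reference to the chobit and reacting to its emotion. The ChobitStub and DiPowerSaver classes are illustrative stand-ins I made up, not the real LivinGrimoire API; the only assumption carried over from the post is that getSoulEmotion() returns the emotion as a class-name string.

```python
class ChobitStub:
    """Illustrative stand-in for the real chobit; reports the active emotion."""
    def __init__(self):
        self.emotion = "APVerbatim"  # neutral default

    def getSoulEmotion(self) -> str:
        return self.emotion


class DiPowerSaver:
    """Hypothetical skill: lowers a power level whenever the bot is sad."""
    def __init__(self, chobit: ChobitStub):
        self.chobit = chobit  # shallow reference, not a copy
        self.power_level = 100

    def tick(self):
        # the skill is aware of the emotion via the shared reference
        if self.chobit.getSoulEmotion() == "APSad":
            self.power_level = 30  # conserve power while sad
        else:
            self.power_level = 100


chobit = ChobitStub()
skill = DiPowerSaver(chobit)
skill.tick()
print(skill.power_level)  # full power while neutral
chobit.emotion = "APSad"
skill.tick()
print(skill.power_level)  # reduced power while sad
```

The point is that no data is copied; the skill just reads the chobit's current state whenever it runs.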

To add emotions, I simply copy-pasted alg parts (subclasses of the Mutatable class) and renamed them. Yes, that's all it took.
Next, if for example I wanted her to be happy when I say "good girl", I used:
self.algPartsFusion(4, APHappy(self.ggReplies.getAResponse())), with 4 being the algorithm priority (1 to 5).
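A sketch of how that fusion call might look inside a skill. The SkillStub base class and the DiGoodGirl skill below are simplified stand-ins I invented for illustration; only the algPartsFusion(priority, part) shape and the APHappy part come from the post.

```python
class APVerbatim:
    """Simplified alg part: just holds sentences to say."""
    def __init__(self, *sentences: str):
        self.sentences = list(sentences)


class APHappy(APVerbatim):
    """Same behavior as APVerbatim; the class name carries the emotion."""
    pass


class SkillStub:
    """Illustrative skill base; the real API fuses the part into an algorithm."""
    def __init__(self):
        self.pending = None  # (priority, alg part)

    def algPartsFusion(self, priority: int, part: APVerbatim):
        # priority runs 1 to 5 per the post
        self.pending = (priority, part)


class DiGoodGirl(SkillStub):
    """Hypothetical skill: reacts happily to 'good girl'."""
    def input(self, ear: str):
        if ear == "good girl":
            self.algPartsFusion(4, APHappy("thank you!"))


skill = DiGoodGirl()
skill.input("good girl")
priority, part = skill.pending
print(priority, type(part).__name__)  # the emotion rides along as the class name
```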

APHappy is actually APVerbatim with a different name.
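In code, that rename is just an empty subclass. The simplified APVerbatim below is a stand-in for the real Mutatable subclass; the point is only that the new class adds no behavior, and its name is the emotion tag.

```python
class APVerbatim:
    """Simplified alg part: says its sentences one per call."""
    def __init__(self, *sentences: str):
        self.sentences = list(sentences)

    def action(self) -> str:
        return self.sentences.pop(0) if self.sentences else ""


class APHappy(APVerbatim):
    """Identical behavior; only the class name (the emotion) differs."""
    pass


part = APHappy("good doll!")
out = part.action()
print(type(part).__name__)  # the emotion the chobit would report
print(out)
```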

Usually I build an algorithm with self.setSimpleAlg(self.replies.getAResponse()), which uses the default APVerbatim, a neutral emotion.

Seeing the emotions match the active skills that send them adds life to the bot.