
WARNING: this is for advanced users who want to do very custom things with Alive Chat, and it requires editing a file. I plan to do cool customizable stuff from the UI, and this was built mostly as tooling for me to get that done. But in the meantime I know quite a few people want to do their own thing in vam, and some get mad at me that I don't explain stuff, so I'm outlining here some of the ways Alive chat can be used to achieve a lot more, including very custom stuff.

WARNING 2: I only worked on this for a few hours months ago and have changed a ton of stuff since, so there will likely be bugs and things that I'll change very soon when I revisit this side of things and push it forward.




INFO

For a while now it's been possible to make anything in vam happen when you say something particular in Alive chat: play animations, change clothes, change scenes, lights, etc. You can also control the AI narrative in a way that I don't think is possible in any other implementation yet, by injecting custom replies/behavior on specific commands/phrases.

I didn't get too deep into it before and didn't promote it because the plugin isn't meant to be a DIY tool. This approach has a bit of a learning curve and requires editing a file rather than setting things up in vam, so I don't exactly recommend it; I imagine it's not fun for a lot of people.



WORKS WITHOUT AI TOO

I built the Alive Virtual Assistant and the chat functionality in such a way that they work even without Text AI running. So all of this works with vanilla vam too; it's basically like an old-school chatbot. It doesn't require having AI running.



WORKS WITH AI TOO

If AI is running, the Virtual Assistant layer can be used to steer the AI in specific directions, as well as to give it some custom knowledge. If you write something, the Virtual Assistant will first try to handle it, and it will respond if there's a rule for what you typed. In that case, whatever the assistant answered with becomes part of the chat and is incorporated by the AI. For example, you might ask the AI to pick a number. By default the AI will pick something a bit random. But you can add a rule for the assistant to always pick 7. Then, when talking to the AI, it will always pick 7. If you ask the AI why it picked it, the AI will make something up as to why it picked 7 and act as if it picked it, even though it was actually forced by the Virtual Assistant layer.



HOW TO ADD COMMANDS

It requires changing a file in a text editor. It's a bit unconventional, but it's considerably more flexible and more powerful than doing it directly in VAM, imo. When you first run Alive, it will create a local file at Saves/Alive AI/Brain/custom.rive.js. In that file you can add rules for the Virtual Assistant.

The Virtual Assistant already has some similar files I made as examples, which can be found in Saves\PluginData\Aeternum\Ae\Html\brain\rives: examples.rive.js and alive.rive.js. You can copy the examples file into your custom file as a template. The alive file contains some more commands that I added to the Virtual Assistant.

The files are javascript and work with jQuery too, so people with basic knowledge of those can follow the examples and do A LOT with them. You can even connect the AI to the internet, get it to know the weather for example via an AJAX request, talk to local services, do home automation, etc.
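To give a rough idea of the weather case, here's a hypothetical sketch. The function name, URL, and response shape are all made up, and the actual AJAX call is left as a comment (with jQuery you'd use something like $.getJSON); the callback/payload convention follows the examples further down. The fetch is stubbed so the sketch runs standalone:

```javascript
// Hypothetical weather command handler (demo_weather is a made-up name;
// in the real brain file you'd wire it to a rule with {js}...{/js}).
function demo_weather(callback, payload){
  // With jQuery available, the real version could look like:
  // $.getJSON("https://example.com/weather", function(data){
  //   callback("It's " + data.tempC + " degrees and " + data.sky + ".", payload);
  // });
  // Stubbed response so this sketch is self-contained:
  var data = { tempC: 21, sky: "clear" };
  callback("It's " + data.tempC + " degrees and " + data.sky + ".", payload);
}
```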

The code syntax looks weird, but it's simple, and anyone can just copy-paste the examples and adapt them to their liking.



RIVESCRIPT

The first and main part of the virtual assistant brain files is RiveScript rules. These are rules like "when I say this, you say that". They have some extra operators for wildcards, options, etc. (https://www.rivescript.com/docs/tutorial)

For example:

+ pick a number
- I pick 7.

You can put these rules in the custom file, similar to the example file, and reload the plugin. For example:

RIVESCRIPTS.push(`
+ test1
- You said test1.
`);

The virtual assistant will then know that when you say test1 (or "Test1", "Test1.", etc.) it should answer with "You said test1."



TRIGGERING ANYTHING IN VAM

You can trigger vam UIButtons from chat too. For example, you can make a button that toggles the lights in your scene and add this to the custom file:

RIVESCRIPTS.push(`
+ test1
- You said test1.

+ switch the lights
- OK! {js}brain_vam_trigger(RIVE.chat,"MyVamLightsButton");{/js}
`);

Now when you say "Switch the lights", the Virtual Assistant will respond with "OK!", and it will also try to trigger the UIButton named "MyVamLightsButton" in the scene, if it exists.



TRIGGERING ALIVE UI STUFF

Triggering things via vam is pretty cool, but for some things you might run into trouble with it. For example, you might want a command that makes the character dress in a clothing preset you made. You can add a UIButton with a trigger that changes the preset on the atom Person1. That will work, but the command will only work in the scene where the button is, and only for that atom Person1.

To overcome that and make it work for any person atom in any scene (to make it a general command that works anywhere, every time), you can use Alive events to trigger things, basically simulating actions from the UI. If something can be done from the Alive UI, it can be triggered this way too. Steps:

1. Do the action in the UI (select a person and change clothing preset)

2. Go to the Triggers app and copy the event (it might take a bit of figuring out based on the name; it might not always be the first/most recent one)

For the clothing example, the event name is changewardrobe and the value is something like:

{"action":"changewardrobe", "value":"Custom/Atom/Person/Clothing/Preset_MyPreset.vap","preset":"MyPreset"}

3. Now you can add it to the custom file like this:

RIVESCRIPTS.push(`
+ test1
- You said test1.

+ switch the lights
- OK! {js}brain_vam_trigger(RIVE.chat,"MyVamLightsButton");{/js}

+ change to evening dress
- {js}custom_change_to_eveningdress(RIVE.chat);{/js}
`);

function custom_change_to_eveningdress(callback, payload){
  var answer = "Ok.";
  aevent({"action":"changewardrobe", "value":"Custom/Atom/Person/Clothing/Preset_MyPreset.vap", "preset":"MyPreset"});
  callback(answer, payload);
}

For this last example, there's also an extra function, custom_change_to_eveningdress, defined at the bottom. So when you write "change to evening dress", the assistant will call this function and answer with "Ok.". Into aevent() you can copy-paste the event from the Triggers app from earlier. In answer you can put some extra text for the Virtual Assistant's reply. You could also put it directly in the rule, like in the 2nd example where "OK!" is directly in the rule. But if you do it like in the 3rd example, in a javascript function, you can format the text via javascript, doing random answers or more complicated logic.
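For instance, a version of that function picking a random reply could look like the sketch below. The list of answers is made up, and aevent is stubbed here so the sketch runs standalone (in the real file it's provided by Alive):

```javascript
// Stub: in the real brain file, aevent is provided by Alive and fires the
// event in vam. Stubbed here so this sketch is self-contained.
function aevent(event){ }

function custom_change_to_eveningdress(callback, payload){
  // Pick one of several replies at random instead of a fixed "Ok.":
  var answers = ["Ok.", "Sure!", "One moment..."];
  var answer = answers[Math.floor(Math.random() * answers.length)];
  aevent({"action":"changewardrobe", "value":"Custom/Atom/Person/Clothing/Preset_MyPreset.vap", "preset":"MyPreset"});
  callback(answer, payload);
}
```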

It's also possible to use ChatGPT to update the code; it's pretty good at javascript. For example, you can give it the 3rd example and ask it to change it so that it returns a random number instead of "Ok".

And of course you can skip the aevent part (which forces the clothes to change) and have functions that only format and return text. The example above could be edited like:

function pick_random_number(callback, payload){
  var randomNumber = Math.floor(Math.random() * 100) + 1;
  callback(randomNumber, payload);
}

and then have a rule like:

+ pick a number please
- I pick: {js}pick_random_number(RIVE.chat);{/js}




WILDCARDS

In some of my other examples you'll see stuff like:

+ demo calculate * [please]
- {random}It's|It is|That's|Easy.{/random} {js}brain_math_calculate(RIVE.chat,"<star1>");{/js}{random}.|!{/random}

This will match when you say things like
"Demo calculate 1+2" or
"Demo calculate 1+2 please"

"please" is optional because it's between [ ]. Meanwhile, * is a wildcard that will match anything else following "demo calculate".

In the response, the assistant will respond with a random choice from "It's", "It is", "That's", "Easy", and then call the javascript function defined below it, named brain_math_calculate. One thing to note is that this function takes a parameter <star1>. <star1>, <star2>, <star3>, etc. will be replaced with whatever matched the first *, second *, third *, etc. in the rule above.
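As an illustration with two wildcards, here's a hypothetical rule and function (the names are made up, and I'm assuming the same RIVE.chat callback/payload convention as the earlier examples; RIVESCRIPTS is stubbed so the sketch runs standalone):

```javascript
// Stub: RIVESCRIPTS is provided by Alive in the real brain file.
var RIVESCRIPTS = [];

// Hypothetical rule: saying "demo greet Anna from Paris" would call
// demo_greet with "Anna" as <star1> and "Paris" as <star2>.
RIVESCRIPTS.push(`
+ demo greet * from *
- {js}demo_greet(RIVE.chat,"<star1>","<star2>");{/js}
`);

function demo_greet(callback, payload, name, place){
  callback("Hello " + name + " from " + place + "!", payload);
}
```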



FILE

I attached the file with the examples above; you can drop it in Saves/Alive AI/Brain. For the second example you'll need a UIButton named "MyVamLightsButton" in the scene. For the third, a clothing preset named "MyPreset". But you can change these values to some button and clothing preset you already have if you want to test them. For more advanced stuff I recommend checking my example files in Saves\PluginData\Aeternum\Ae\Html\brain\rives: examples.rive.js and alive.rive.js. They're all quick examples, but they outline various ways of doing more. You shouldn't edit those, though, because they get overwritten with each version.

EXISTING COMMANDS THAT I MADE WITH THIS SYSTEM

  • reset pose or reset pose please - makes the person reset the pose
  • move to me or come to me or come here - makes the person move close to the camera
  • move to center or go to center or recenter - makes the person go to the center of the scene
  • move to flag - this one I messed up; I need to fix the rule
  • turn around - makes the model turn around
  • turn left or turn right - turns in that direction
  • smile or frown - these are broken, need to fix them
  • move away/closer/left/right X or move up/down/left/right X meters - moves in that direction by X meters
  • echo X - makes the model repeat X
  • change pose standing/sitting/lying/crouching X - changes to the Xth pose of that category
  • change pose standing/sitting/lying/crouching - changes to a random pose of that category
  • trigger X - triggers the UIButton with the name X
  • demo time - tells the time
  • demo calculate X - tries to calculate X formula (only basic operations)
  • demo async X Y Z - demo for a complex multi-part command that triggers 3 functions, waits for all 3 of them to finish, and makes the Assistant answer when they're done. X, Y and Z are the number of seconds of artificial delay added to each before it responds. There's a bug here with the response, due to some cleanup aimed at AI messages being triggered by the ":" character


If you have any questions, or need help with any of this stuff, just let me know.


Comments

Vitlam

Jeez, that's amazing, this will push the things we can do so much further, I need to find some free time to play with this, thanks for the tutorial.

thomas d

This is all good stuff and I have tried the button trigger and it does work. I like the idea of the time, and I need to set up 5 or 6 good poses. I will try this soon.

thomas d

When I use OB I would like to hide the UI, but when I do, it does not work and I cannot unhide it. Am I missing something? I wish I could set the UI for chat and then close it and have it still work. Don't mind me, I have even sent ideas to Elon Musk for SpaceX... still waiting for his response.

thomas d

I finally tried the Lama2 OB chat, I found TheBloke/Luna-AI-Llama2-Uncensored-GPTQ to be very good, even on my crappy setup. I only tried a few, some did not work of course, and some would repeat stuff over and over. I need to remember to try them with different android cards. Thank you for posting that info.

bashr

Amazing work sir. Thank you.

bashr

Rive Script Tutorial Link: https://www.rivescript.com/docs/tutorial