
I'm writing a new, basic, straightforward quick guide on how to install text-generation-webui (a platform for running text AI locally), because some things have changed since my last post about it and the old post might be more confusing now.

STEP 1 Download

Download and extract https://github.com/oobabooga/text-generation-webui/archive/refs/heads/main.zip

I don't know if it's still a thing but you might want to extract in a path that doesn't have spaces to avoid errors down the road. Instead of "C:\my folder" use something like "C:\my_folder".

STEP 2 Install

Run start_windows.bat and it will start installing; the process takes a minute. At some point you'll see a message asking what GPU you have.

Enter the letter for your option (A if you have an Nvidia card) and press Enter.

You'll also get asked 'Do you want to use CUDA 11.8 instead of 12.1? Only choose this option if your GPU is very old (Kepler or older). For RTX and GTX series GPUs, say "N". If unsure, say "N".' For most Nvidia GPUs you'll need to type "N"; only type "Y" for very old GPUs.

Once it finishes, you'll see a message in the command prompt with the link for the local UI.

You can select it and right click to copy it to the clipboard. If you open that link in a browser, you'll see the UI. To close text-generation-webui, just close the command prompt. To start it again, run start_windows.bat. It's recommended to restart it once after installing!
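For reference, on default settings the address is usually the local one below, but the exact link printed in your command prompt is the one to trust:

    http://127.0.0.1:7860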

STEP 3 Download a model

You'll need an AI model to talk to. In the UI, go to the Model tab and paste this into the Download model field: TheBloke/Llama-2-7B-Chat-GPTQ, then hit Download.

At the time of writing, in some builds the downloader might stay stuck even at 100%. If it takes more than 10-20 minutes, just refresh the page, because the download likely finished and only the UI got stuck. Once a model is downloaded, hit the refresh button next to the models list, pick the model, and in Model Loader pick ExLlama.
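If the in-UI downloader keeps acting up, there's also a command-line route. This is just a sketch assuming your install has the repo's download-model.py script and the cmd_windows.bat launcher (current builds ship both, but check your folder): run cmd_windows.bat to open a prompt with the right environment, then type:

    python download-model.py TheBloke/Llama-2-7B-Chat-GPTQ

The files end up in the models folder, and the model should show up in the UI after you hit refresh.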

USE IT

Every time you start text-generation-webui, you'll have to pick a model before you can use it. In the Model tab, pick the model from the dropdown.

In Model Loader, pick ExLlama, then click the Load button. Once it loads, you can go to the Chat tab and start talking to the model. In the Parameters > Character tab you can customize the character (for the browser chat experience).

USE IT WITH ALIVE

You'll need to enable the API feature for Alive to be able to connect to the AI. You can do that by editing the file CMD_FLAGS.txt and writing --api in it (either just that, or add it as a new line at the end). After you do that, you should close and restart the application (close the command prompt and run start_windows.bat again).
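After the edit, the whole content of CMD_FLAGS.txt can be as simple as:

    --api

If you want to double-check that the API is actually running before pointing Alive at it, here's a minimal sketch in Python. It assumes the OpenAI-compatible endpoint that newer builds expose on port 5000 when --api is set; older builds use a different endpoint and port, so treat the URL as an assumption and go by what your command prompt prints at startup.

    import requests  # pip install requests

    # Assumed default: OpenAI-compatible API on port 5000 (newer builds with --api).
    URL = "http://127.0.0.1:5000/v1/chat/completions"

    payload = {
        "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
        "max_tokens": 64,
    }

    response = requests.post(URL, json=payload, timeout=120)
    response.raise_for_status()
    # Print the reply text from the loaded model.
    print(response.json()["choices"][0]["message"]["content"])

If you get a reply here, Alive should be able to connect too.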

