
Howdy fellow translators,

I've got some super exciting news to share with you all about the OpenAI translation feature that I've been working on. I've been grinding away for the past few days, upgrading the OpenAI Chat and OpenAI Text Completion add-ons. Let me tell you, the initial build of these translators was far from perfect. But fear not, my friends, because I've made some awesome progress!

Here's the deal: Generally, I can just dump a ton of text in one go to the translator backend and then break it back down after receiving the translations. It's usually a killer technique that works like magic on most translation engines because they're so consistent at handling row delimiters (shoutout to Google and DeepL). But guess what? ChatGPT, that smarty-pants AI, decided to get a little too creative. Long story short, some translations came back in an unexpected format, which caused the post-processing parser to go kaput. It's like that one translator we all know (yes, Sugoi Translator, I'm looking at you!).
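For the curious, here's a minimal sketch of that batching idea in TypeScript. This is not Translator++'s actual code; the delimiter and the translate() callback are stand-ins, purely for illustration:

```typescript
// Hypothetical marker that the engine is expected to echo back unchanged.
const ROW_DELIMITER = "\n<<<ROW>>>\n";

async function translateBatch(
  rows: string[],
  translate: (text: string) => Promise<string>
): Promise<string[]> {
  // Join every row into a single request.
  const joined = rows.join(ROW_DELIMITER);
  const response = await translate(joined);

  // Split the response back into individual rows.
  const result = response.split(ROW_DELIMITER);

  // Google and DeepL echo the delimiter faithfully, so the counts always match.
  // ChatGPT sometimes rewrites or drops it, which triggers exactly this failure.
  if (result.length !== rows.length) {
    throw new Error(
      `Invalid number of translations returned. Expected ${rows.length}, ${result.length} generated!`
    );
  }
  return result;
}
```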

Now, I totally understand if you're feeling frustrated with this little hiccup. Trust me, I'm right there with you. I've been working my tail off these past few weeks, scouring every nook and cranny for a fix. And guess what, my friends? I've finally found a viable solution!

Introducing two new features: Max concurrent request and Max row per concurrent request. By turning down Max row per concurrent request, we can send translations in smaller batches, even one by one, ditching those pesky parsing errors that creep up during translation. But hold on, amigo! Sending translations one by one ain't the smartest move in most cases. It can take forever compared to batching translations. So fear not, because I've got your back with the Max concurrent request option. Now you can decide how many translations get sent off simultaneously to the OpenAI backend server.
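Here's a rough sketch of how the two options could fit together, reusing the translateBatch() helper from the sketch above. The parameter names mirror the settings but are illustrative, not the add-on's real identifiers:

```typescript
async function translateWithLimits(
  rows: string[],
  translate: (text: string) => Promise<string>,
  rowsPerRequest: number,  // "Max row per concurrent request"
  maxConcurrent: number    // "Max concurrent request"
): Promise<string[]> {
  // Cut the rows into small chunks; rowsPerRequest = 1 means one row per
  // request, which sidesteps the parsing problem entirely.
  const chunks: string[][] = [];
  for (let i = 0; i < rows.length; i += rowsPerRequest) {
    chunks.push(rows.slice(i, i + rowsPerRequest));
  }

  // Fire the chunks in waves of at most maxConcurrent simultaneous requests,
  // so tiny chunks don't mean waiting on one request at a time.
  const results: string[][] = [];
  for (let i = 0; i < chunks.length; i += maxConcurrent) {
    const wave = chunks.slice(i, i + maxConcurrent);
    const translatedWave = await Promise.all(
      wave.map((chunk) => translateBatch(chunk, translate))
    );
    results.push(...translatedWave);
  }
  return results.flat();
}
```

With Max row per concurrent request at 1 and Max concurrent request at, say, 5, you'd still be sending five rows at a time instead of crawling through them one by one.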


I've put this feature through the wringer, and I'm happy to report that it's delivering some seriously satisfying results. Every line I sent for translation came back like a champ (well, except for a couple that OpenAI couldn't handle because it was throwing overload errors). But hey, no worries there! If something goes haywire on the server side, it doesn't cost us a dime, and we can just give it another shot. With a solid 95% success rate for batch translation, I'm pretty stoked about what we've achieved.
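Since failed server-side requests don't bill any tokens, retrying them is basically free. A tiny illustrative wrapper shows the idea (withRetry is a made-up name, not part of the add-on):

```typescript
async function withRetry<T>(
  attempt: () => Promise<T>,
  maxRetries = 2
): Promise<T> {
  let lastError: unknown;
  for (let tries = 0; tries <= maxRetries; tries++) {
    try {
      return await attempt();
    } catch (err) {
      lastError = err; // e.g. an overload error from the OpenAI backend
    }
  }
  throw lastError;
}

// Example: retry one chunk up to twice before giving up.
// const translatedChunk = await withRetry(() => translateBatch(chunk, translate));
```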

To unlock this rad new feature in the OpenAI add-ons, I had to do some major tinkering with the TranslatorEngine and CodeEscape classes. So make sure your Translator++ is up to date to get in on the action with OpenAI ChatGPT translation and Text Completion.

This new update for the ChatGPT add-on will be available soon, so keep an eye out and make sure to download it through the Add-on installer.

Thanks a million for your patience and support as we keep pushing the boundaries of OpenAI translation. I can't wait to share more exciting updates with you all real soon.


Happy Translating,

Dreamsavior

Comments

BlackGooLover

Nice. Especially with how (relatively) cheap GPT turbo is, it should be possible to create a ton of quality translations.

Anonymous

Invalid number of translation returned. Expected 1, 0 is generated! Can not fetch translation with error

Anonymous

I have been experiencing this error and cannot use ChatGPT properly

ErichVManstein

How do I join your Discord?

lsx0327

Invalid number of translation returned. Expected 1, 0 is generated!