
Hello everyone.


I started this post to look back at the works I've made and uploaded to Twitter over the years. I'll try to put it all together with some thoughts and brief commentary!

I was originally going to cover everything I've made so far in one go, but it turned out to be a huge amount, so I'll split it into separate articles.

- 2014 -

I started uploading work made with Unity and other tools around 2014.

- I guess this is the first one. I didn't really know how to use Unity itself yet, so I was feeling my way along. (Looking at it now, the color adjustment isn't great.)


- I had just started VR development as well. I think I was using the Oculus DK2 at this point.


- I was doing a lot of experimenting with visual looks in Unity. I think I started with whatever I could manage, just to get familiar with the engine.


- 2015 -

- Looks like I was trying to make a VR game here! It's one where you move around by bobsledding, but I stopped halfway through because it made me really VR-sick.


- I was experimenting with materials just as Unity 5 came out. PBR had been introduced as a standard feature, so I was trying out looks that combined it with Emission and other light-emitting features.
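As a rough illustration of that kind of experiment (not the original project, just a minimal sketch assuming the Standard shader), this animates a PBR material's emission color; the class name and pulse parameters are my own placeholders:

```csharp
using UnityEngine;

// Minimal sketch: pulsing the Standard shader's emission on a PBR material.
public class EmissivePulse : MonoBehaviour
{
    public Color emissionColor = Color.cyan;
    public float pulseSpeed = 2f;

    Material mat;

    void Start()
    {
        mat = GetComponent<Renderer>().material;
        // The Standard shader ignores _EmissionColor unless this keyword is enabled.
        mat.EnableKeyword("_EMISSION");
    }

    void Update()
    {
        // Fade the emission up and down over time.
        float intensity = (Mathf.Sin(Time.time * pulseSpeed) + 1f) * 0.5f;
        mat.SetColor("_EmissionColor", emissionColor * intensity);
    }
}
```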


- This was around the time I started seriously experimenting with how far I could push animation-style looks in real time. On top of animated textures, I was adding animated screen effects. (I'm still doing the same thing now, aren't I?) There's a rough sketch of the animated-texture idea after the link below.

I wrote an article about this on Qiita. (Japanese only)

https://qiita.com/MuRo_CG/items/c417ef6d6cbeed3dd42b
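As a minimal sketch of the animated-texture idea (my own illustration, not taken from the Qiita article), this steps through a sprite-sheet-style texture by shifting UV offsets at a fixed frame rate; the frame counts and class name are placeholders:

```csharp
using UnityEngine;

// Minimal sketch: flipbook-style texture animation by moving UV offsets.
public class FlipbookTexture : MonoBehaviour
{
    public int columns = 4;
    public int rows = 4;
    public float framesPerSecond = 12f;

    Material mat;

    void Start()
    {
        mat = GetComponent<Renderer>().material;
        // Scale the UVs so only one cell of the sheet is visible at a time.
        mat.mainTextureScale = new Vector2(1f / columns, 1f / rows);
    }

    void Update()
    {
        int frame = (int)(Time.time * framesPerSecond) % (columns * rows);
        int x = frame % columns;
        int y = frame / columns;
        // Flip Y so frame 0 starts at the top-left of the sheet.
        mat.mainTextureOffset = new Vector2((float)x / columns, 1f - (float)(y + 1) / rows);
    }
}
```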

- I was also trying to get it running on Android.

- I made a VR live show using Leap Motion, since I didn't have Oculus hand controllers yet. I built and tested it thinking the day would come when we'd watch live performances in VR.


- I made an invaders-style VR shooting game when the Gear VR came out. It was surprisingly fun.


- I was exploring various looks in Unity. Here I was testing whether I could use a Projector to layer another texture on top of a toon-shaded model and get a thicker, more painterly finish.
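As a minimal sketch of that idea (assuming Unity's built-in legacy Projector component and a projector material that blends an extra "paint" texture over whatever it hits; the material and names are placeholders, not my original assets):

```csharp
using UnityEngine;

// Minimal sketch: projecting an extra paint-like texture onto a toon-shaded model.
public class PaintOverlayProjector : MonoBehaviour
{
    public Material paintProjectorMaterial; // e.g. a Projector/Multiply-style material
    public float size = 1.5f;

    void Start()
    {
        var projector = gameObject.AddComponent<Projector>();
        projector.material = paintProjectorMaterial;
        projector.orthographic = true;
        projector.orthographicSize = size;
        projector.nearClipPlane = 0.01f;
        projector.farClipPlane = 5f;
    }
}
```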


- 2016 -

- I was experimenting with controlling a character using the Oculus headset and Leap Motion. I made the cameras switch in real time too, giving it the feel of a TV show. (I suppose today you'd call that being a Vtuber.)
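The camera switching itself can be as simple as toggling between a few Camera objects placed around the scene. This is only an illustrative sketch under that assumption; the key binding and setup are my own placeholders:

```csharp
using UnityEngine;

// Minimal sketch: cutting between scene cameras in real time, like a TV broadcast.
public class CameraSwitcher : MonoBehaviour
{
    public Camera[] cameras;   // studio-style cameras placed around the character
    int current;

    void Start()
    {
        Activate(0);
    }

    void Update()
    {
        // Cycle to the next camera when the space key is pressed.
        if (Input.GetKeyDown(KeyCode.Space))
            Activate((current + 1) % cameras.Length);
    }

    void Activate(int index)
    {
        for (int i = 0; i < cameras.Length; i++)
            cameras[i].enabled = (i == index);
        current = index;
    }
}
```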


- This is one I made because I thought it would be fun to view in VR, with a focus on the materials. It was an experiment in heightening the sense of realism.


- I tried to build in the way people unconsciously open their mouths and eyes when they look at themselves in a mirror: when you become a character in VR and move your head, the character's facial expression changes automatically.
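A minimal sketch of that idea, assuming the tracked HMD drives a head transform and the face mesh exposes "mouth open" / "eyes wide" blend shapes; the indices and sensitivity are illustrative guesses, not my original setup:

```csharp
using UnityEngine;

// Minimal sketch: mapping head motion in VR to automatic facial expression changes.
public class HeadDrivenExpression : MonoBehaviour
{
    public Transform head;            // tracked HMD transform
    public SkinnedMeshRenderer face;  // face mesh with blend shapes
    public int mouthOpenIndex = 0;
    public int eyesWideIndex = 1;
    public float sensitivity = 2f;

    Quaternion lastRotation;

    void Start()
    {
        lastRotation = head.rotation;
    }

    void Update()
    {
        // How fast is the head turning this frame?
        float angularSpeed = Quaternion.Angle(lastRotation, head.rotation) / Time.deltaTime;
        lastRotation = head.rotation;

        // Map head motion to expression weights (blend shape weights run 0-100).
        float weight = Mathf.Clamp(angularSpeed * sensitivity, 0f, 100f);
        face.SetBlendShapeWeight(mouthOpenIndex, weight);
        face.SetBlendShapeWeight(eyesWideIndex, weight * 0.5f);
    }
}
```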


- This is one I made because I figured the facial-expression changes I mentioned earlier could be driven purely by character control, rather than by me moving the character myself.

I thought it would be interesting to combine this with AI control; it's an experiment in giving an AI-driven character emotions.


- 2017 -

- In today's terms, I'd call this a Vtuber. It combines the various facial expression controls I had made up to that point.


- This is the prototype of PlayAniMaker. I made it with the idea that it would be fun to move characters around in VR, like playing with dolls.


- I made this one because I thought it would be interesting to film a game as a cameraman in VR. The character in the space suit can be moved with the controller.


- This is a reworked version of the previous one, keeping only the part where you shoot as a cameraman in VR. I added camera functions and adjusted the screen effects.

I think I made it in one day. It is the prototype of the later MakeItFilm.


- "MakeItFilm" was created based on the idea of a cameraman in VR to make it possible to create movies in VR. I came up with the idea of a movie creation tool by adding a function that allows you to be a performer in addition to a cameraman.

There is a video of the actual creation on YouTube.

https://www.youtube.com/watch?v=Hv80W1cfOnI


- This is where PlayAniMaker begins.


- I made this to see whether emotions could come across while the facial expressions move automatically and still feel natural. The expressions are finely controlled by the position of the sphere. It was inspired by the earlier experiment where facial expressions changed in response to a mirror.
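A minimal sketch of that kind of control, assuming a target sphere and a face mesh with blend shapes; the expression names, indices, and mapping are placeholders, not the actual rig:

```csharp
using UnityEngine;

// Minimal sketch: blending facial expressions from the sphere's position
// relative to the character.
public class SphereDrivenExpression : MonoBehaviour
{
    public Transform sphere;
    public SkinnedMeshRenderer face;
    public int smileIndex = 0;
    public int surpriseIndex = 1;
    public float maxDistance = 2f;

    void Update()
    {
        Vector3 toSphere = sphere.position - transform.position;

        // Closer sphere -> stronger "surprise"; sphere higher up -> stronger "smile".
        float closeness = 1f - Mathf.Clamp01(toSphere.magnitude / maxDistance);
        float height = Mathf.Clamp01(toSphere.y);

        face.SetBlendShapeWeight(surpriseIndex, closeness * 100f);
        face.SetBlendShapeWeight(smileIndex, height * 100f);
    }
}
```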


- This is an experiment in expressing more emotion by adding body movements that match the automatic facial expression changes. I thought it could be interesting for making a music video or something similar.


- I tried to create the impression of eating ice cream. The character's movements are driven by my own movements in VR, and I tried to bring the character to life by switching facial expressions at just the right moment. If the expressions switch with the movements in real time, I figured it could work for all kinds of eating scenes.


- I created a demo reel to summarize what I have made so far.

You can see it on YouTube.

https://www.youtube.com/watch?v=xU8RHeJTQdI


That's it for Vol. 1!

Vol. 1 covers 2014 - 2017. To be continued in Vol. 2!
