Showing Museum Multiverse to the World!

screen_1080x650_proj_vision

This past weekend I was able to show Museum Multiverse and my other projects to New York City at Microsoft HQ. The event was part of Playcrafting‘s Halloween Expo, an event with over 150 indie games and over 1,200 attendees. Now that’s a lot of play-testers.

halloweenexpo2016header

I was with my team for the expo, so we were able to get a big room for our projects: our released games, Don’t Look Away and Witchualistic, and our projects in development, Museum Multiverse and The Take.

I got over a hundred play-testers to try out Museum Multiverse and got valuable feedback from the experience.

Here is what I learned.

1. My first major puzzle is still too hard for a lot of players.

Screen Shot 2017-09-11 at 11.47.46 PM

I had to give a lot of hints to people for my light reflection puzzle. This means I should have another puzzle leading up to this more complex challenge. I have created an easier puzzle before this challenge to help the player understand what to do in the room players had trouble in. I got this advice from my friend Brett Taylor, who made a hit game called LineLight. He recommended that I strip away the noise from the puzzle, which will help the player understand the mechanic, so I came up with this:
Screen Shot 2017-11-01 at 10.44.46 PM

This is a cleaner, noise-free puzzle that gives the player an understanding of the light mechanic.

2. Players loved the 2D section but the controls need improvement.

giphy (4)

Jumping and landing are slippery in 2D. I think I can fine-tune some Unity parameters to fix this problem.
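One way that slipperiness is often tackled is to drive the horizontal velocity directly instead of relying on friction. Here is a minimal sketch, assuming the 2D character uses a Rigidbody2D; the parameter names and values are illustrative, not taken from the project:

```csharp
// Hypothetical sketch of tighter 2D platformer movement in Unity.
using UnityEngine;

public class TightPlatformerMovement : MonoBehaviour {
    public float moveSpeed = 6f;
    public float airControl = 0.4f;   // reduced steering while airborne
    public float stopDamping = 0.8f;  // how fast residual sliding is killed

    private Rigidbody2D body;
    private bool grounded;

    void Awake() {
        body = GetComponent<Rigidbody2D>();
    }

    void FixedUpdate() {
        float input = Input.GetAxisRaw("Horizontal");
        float control = grounded ? 1f : airControl;
        float targetX = input * moveSpeed * control;

        // Snap toward the target velocity instead of adding forces,
        // so releasing the input stops the character quickly.
        float newX = Mathf.Lerp(body.velocity.x, targetX, stopDamping);
        body.velocity = new Vector2(newX, body.velocity.y);
    }

    // Crude grounded check, good enough for this sketch.
    void OnCollisionStay2D(Collision2D c) { grounded = true; }
    void OnCollisionExit2D(Collision2D c) { grounded = false; }
}
```

Raising `stopDamping` makes landings feel more planted, while lowering `airControl` keeps jumps from feeling floaty.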

3. I still have a lot of work to do!

There is so much to do, but this is exciting! Everyone enjoyed the project and some people even came back to play the experience again!

I will continue to update you all on my progress on Museum Multiverse. Stay tuned on twitter, facebook, and on the site.

See you in VR!
GearVRScreen

Revisiting and Revising

This week entailed finishing off the final puzzles of Museum Multiverse and playtesting the game with people both new and old to VR. I finished my hiding-locker scene within the context of this project. I made this scene earlier in the Launch Pad program, but the project the scene belonged to broke, which meant the scene was lost until I recreated it this week. I think this scene is important to convey the switches between 1st and 3rd person view within some parts of the game. I also got feedback on one of the puzzles I have been working on, and of course some people had problems with it. I took the feedback as a fault in my puzzle design. It is my job as the designer to create a fun, understandable experience for most players, so I did a couple of things to improve the puzzle.

1. I gave the player more feedback when they are on the right track.

Screen Shot 2017-09-04 at 1.35.27 AM.png

2. I provided clues in the environment.

Screen Shot 2017-09-04 at 1.25.15 AM.png

3. I gave an awesome reward once the player finishes the puzzle.

Screen Shot 2017-09-04 at 1.31.55 AM.png

I think that after these changes players will have a better experience in one of the first puzzles of the game. I will be working the rest of the week on deploying the app and testing it through the store.

Entering The Multiverse

museum_intro.gif

Let me start by saying there is still much work to do on Museum Multiverse, but it is coming along. This week the team has been working on post-processing effects in order to create a cinematic warping effect when entering paintings in the museum. Unity has an awesome new system for post-processing effects, but it is not compatible with Android. Our solution is to work with the legacy image effects in order to make the scene look great on the Gear VR.

I have also been working on the notion of incorporating 2D gameplay into VR. I have created a pretty good proof of concept and have now added that portion into the game. I want Museum Multiverse to be a departure from the normal VR experience on the Gear VR store and I think this section will be a refreshingly fun experience for players.

    giphy (4).gif

We also added a new member to our ranks of Museum Multiverse, Mikei Huang, a talented VR and Visual Designer. His work portfolio includes cool VR projects like Kuru Kuru Sushi VR and Back Seat Baby. He has been working with me on the cover art and creating visual consistency in Museum Multiverse. I am very happy to have such a talented member of the New York City gaming community on my team.

We also completed the models of the main character(s) for the game. Up to this point we have been using a simple cubed character as a placeholder for most of development, but it will be good to finally switch him out for the main character. We will miss Mr. Cubes, but we are happy to have our character so close to being finalized. The character modeler and animator, Ethan, is a talented artist with work in many visually stunning titles. Check out his twitch channel, where he works on projects live, and his amazing GDC talk on low poly development. We’re excited to have his work in Museum Multiverse.

CharactersMM.jpg CharacterWSecurity.jpg

Our next steps on the roadmap are to connect all scenes together and playtest, playtest, playtest – and then more playtesting. The more we learn about how players organically behave in our game, the better Museum Multiverse will be. One of our goals in playtesting is discovering what players enjoy as well as what they don’t understand. We hope to incorporate these findings before the September 9th due date.

Until Next time…

giphy (5).gif

PlayNYC and the Awesome Feedback of 100’s

This weekend the team went to PlayNYC. Play was NYC’s first dedicated games convention, and according to game veterans it felt a lot like PAX in its early days.

PLAYNYCStage.jpg

We got to show off an interactive trailer of The Take. This mostly had the mission briefing and the traps you can set in the room. Of course, the players did not listen to anything in the mission briefing; instead they mostly had fun throwing things around and stacking books on the desk.

TopFloor.jpg

We had a great time, got a ton of feedback, and we are now ready to add this to the game.
20170819_093700.jpg

A New Way to Hear? FMOD in Prod

I knew that to make Museum Multiverse truly come together, I would need an amazing soundtrack to captivate the player. Thanks to Niko Korolog and his work with adaptive music in my game, I now have music that will suck the player in from the start to the end of the demo. Niko used a program called FMOD to create an adaptive soundtrack. FMOD is a sound effects engine for video games and applications, developed by Firelight Technologies, that plays and mixes sound files of diverse formats on many operating systems. To learn more about this awesome application, visit their site.

Screen Shot 2017-08-10 at 1.18.09 PM.png

This program gives me the control to shut off layers of music at my choosing and turn others on through code. To get started on learning this magic, I’d recommend this awesome tutorial from FMOD on integrating this middleware into Unity.
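As a rough illustration of what driving the music through code can look like with the FMOD Unity integration (the event path and parameter name below are placeholders, and the parameter call differs between FMOD versions):

```csharp
// Sketch of controlling an adaptive-music layer from gameplay code.
using UnityEngine;

public class AdaptiveMusic : MonoBehaviour {
    FMOD.Studio.EventInstance music;

    void Start() {
        // "event:/Music/MuseumTheme" is a placeholder event path.
        music = FMODUnity.RuntimeManager.CreateInstance("event:/Music/MuseumTheme");
        music.start();
    }

    // Call when the player enters a new area, solves a puzzle, etc.
    // "Intensity" would be a parameter authored in FMOD Studio that
    // fades layers in and out.
    public void SetIntensity(float value) {
        // Newer FMOD versions use setParameterByName; older
        // integrations use setParameterValue instead.
        music.setParameterByName("Intensity", value);
    }

    void OnDestroy() {
        music.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        music.release();
    }
}
```

The nice part of this split is that the musician authors the layer transitions in FMOD Studio, and the game code only ever touches one named parameter.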

I cannot wait to continue incorporating this adaptive soundtrack into Museum Multiverse.

A Mini Game Becomes a Game

Well it’s been a week, and we’ve been able to implement decent rotation of the game objects on the X and Y axes using the Gear VR’s touchpad. Now when we hold onto objects using the trigger, we can swipe left and right to rotate them on the Y axis, and swipe up and down to rotate them on the X axis. We’ve even been able to pick up the rotation of the controller itself to rotate the objects on the Z axis whenever we roll the controller with our wrists. This motion gives instant feedback and a real sense of connection to the objects.
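For anyone curious, the swipe-to-rotate idea can be sketched roughly like this with the Oculus Utilities OVRInput API; the rotation speed and axis mapping here are assumptions, not our exact tuning:

```csharp
// Illustrative sketch: rotating a held object with Gear VR touchpad swipes.
using UnityEngine;

public class HeldObjectRotator : MonoBehaviour {
    public Transform heldObject;        // the object currently grabbed
    public float swipeRotateSpeed = 90f;

    private Vector2 lastTouch;
    private bool touching;

    void Update() {
        if (heldObject == null) { touching = false; return; }

        if (OVRInput.Get(OVRInput.Touch.PrimaryTouchpad)) {
            Vector2 touch = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
            if (touching) {
                Vector2 delta = touch - lastTouch;
                // Horizontal swipe -> yaw (Y axis); vertical swipe -> pitch (X axis).
                heldObject.Rotate(Vector3.up, -delta.x * swipeRotateSpeed, Space.World);
                heldObject.Rotate(Vector3.right, delta.y * swipeRotateSpeed, Space.World);
            }
            lastTouch = touch;
            touching = true;
        } else {
            touching = false;
        }
    }
}
```

Tracking the previous touch position and only rotating on the delta avoids the object snapping when a new swipe begins.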

Screen Shot 2017-08-07 at 12.00.50 AM.png

After we implemented these basic controls, we decided to finally put some of our friends in a room with simple rigidbody objects, and told them to experiment and explore as much as they wanted. One of my 3D modeling friends, Jose, was excited to finally see some of the models he made inside of a game.

Screen Shot 2017-08-07 at 12.01.00 AM.png

After switching the headsets and controllers back and forth between our other game developer friends Andy and Rob, Jose noticed that some of his models were missing. When he thought that they might have been glitched outside the room, Rob said that they weren’t glitched at all, and that he hid them somewhere in the room. He then challenged Jose to find them in two minutes. This led to all of us hiding and finding objects for the next half hour or so. We ended up getting pretty sidetracked. It was simple, but in a refreshing sort of way. Rob commented how this should just be a game in its own right and we all sort of agreed.

Screen Shot 2017-08-07 at 12.01.08 AM.png

Screen Shot 2017-08-07 at 12.01.18 AM.png

I’ve decided that given the scope and timescale of Museum Multiverse in its current state, I’m going to instead focus most of my time on this new concept. I’ll still work on Museum Multiverse with Ernest, but for the Oculus Launch Pad program I’m going to be diverting my efforts towards this now. Rob and Jose came up with a name for it already – “The Take”, and are currently working on fleshing out a spy theme and some design documents for it.

Screen Shot 2017-08-07 at 12.01.27 AM.png

Screen Shot 2017-08-07 at 12.01.38 AM.png

 

Week 7: Sketching out the Main Character

20170730_230451.jpg

This week I focused primarily on creating the main character for the game. I knew the general character I wanted for the project, but I did not know how to go about starting his design. In my art past, which was very long ago, I focused on making Caucasian main characters because that’s what the resources from my lessons taught.

20170730_230514.jpg 20170730_230527.jpg

However, I knew from the start of planning this project that I wanted an African American boy as the main character, so I had to dig for inspiration on this front. This was very challenging due to the limited number of good main characters who happen to be black, let alone video game characters. I looked at characters like Huey Freeman from The Boondocks and the children from Playdead games like Inside and Limbo to get a general idea of what I wanted.

20472315_10213828174328665_1543716794_o.jpg

I wanted to show the character as scared and vulnerable, so I went with this hunched-over, scared look. Then I continued with more angles and ideas about the character.

20495731_10213828176408717_2001912788_o.jpg

I want the character to have an Afro, but after a fruitful conversation with Ernest we came to the conclusion that his fro should not be the biggest thing about him, so I shortened his hair to make him and his actions, not the fro, the most memorable part of the character.

20495539_10213828176048708_1909181074_o.jpg

20495586_10213828166808477_1201395462_o.jpg

I still have a lot of work ahead of me to make this character work, but I believe we are on the right track, and we are weeks away from having this character 3D modeled and animated. So more to come soon!!!

Week 6: Controls are Working and The Take is taking form!

This week I have been mostly fighting with the controls and have finally gotten something working!

3.jpg

The problem seemed to be that, within my player move script, I needed to register the movement from the player’s input on the Gear VR Controller and add it to the movement of the player. After the player receives this movement, I then have to set that same movement back to 0, all within one update function.

 

// Touch handler registered in the OnEnable function:
// it caches the input from the Gear VR trackpad.
void localTouch(InputTouch touch) {
    touchHorizontal = touch.currentTouchPosition.x;
    touchVertical = touch.currentTouchPosition.y;
}

// Inside Update(): read the cached VR controller input...
float h = touchHorizontal;
float v = touchVertical;

// ...then set it back to 0f so it doesn't move the character anymore.
touchHorizontal = 0f;
touchVertical = 0f;

// Do all my movement logic after...

This is looking good for now. I will take more time to refine the controls in the future, but progress is being made!!!
Speaking of progress being made… I have still been playing around with the idea I had last week as well.

Over the last week we’ve been working on getting the Gear VR controller to work within our scene and interact with objects. It’s been slow going, but I think we’re finally making some progress. So far, I’ve been able to make a raycasting object that is attached to the Gear VR controller in the scene. This raycaster can detect designated objects, temporarily change their material as it hovers over them, and teleport/lock them to the impact point of the laser with a press of the trigger. Because we’ve given these objects rigidbodies, they will move and bump into other objects in the scene as we move the controller around. Then, when we let go of the trigger, the object will fall to the ground with physics. I’m a little hesitant to rely so heavily on physics in a mobile VR game like this, but anyone who has gotten a chance to try this out has responded positively to these simple but intuitive forms of interactions with objects in 3D space.
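A stripped-down sketch of how such a raycaster might look is below; the tag name, button mapping, and hold distance are all placeholders rather than our actual implementation:

```csharp
// Hypothetical sketch: hover-highlight and grab objects with a controller ray.
using UnityEngine;

public class ControllerRaycaster : MonoBehaviour {
    public Material highlightMaterial;
    public float maxDistance = 10f;
    public float holdDistance = 2f;

    private Renderer hovered;
    private Material originalMaterial;
    private Rigidbody held;

    void Update() {
        // The controller's transform supplies the ray origin and direction.
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;

        ClearHover();
        if (Physics.Raycast(ray, out hit, maxDistance) &&
            hit.collider.CompareTag("Grabbable")) {          // placeholder tag
            hovered = hit.collider.GetComponent<Renderer>();
            if (hovered != null) {
                originalMaterial = hovered.material;
                hovered.material = highlightMaterial;        // temporary hover highlight
            }
            if (Input.GetButtonDown("Fire1")) {              // trigger mapping is an assumption
                held = hit.rigidbody;
            }
        }

        if (held != null) {
            if (Input.GetButton("Fire1")) {
                // Pin the object to a point along the laser while the trigger is held.
                held.MovePosition(transform.position + transform.forward * holdDistance);
            } else {
                held = null;                                 // release: physics takes over
            }
        }
    }

    void ClearHover() {
        if (hovered != null) { hovered.material = originalMaterial; hovered = null; }
    }
}
```

Because the grabbed object keeps its Rigidbody, it still bumps into other objects while held and falls naturally on release, which matches the behavior described above.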

My next goal is to figure out how to make the rotation and precise manipulation of objects feel tight and refined. This is the most important part because I want the players to appreciate the detail of the art they’re picking up, and they can’t do that unless the controls feel smooth and intuitive.

Week 5: Putting it All Together

cometogether.jpg

This week I finally got to put the player into the first level. My idea for this level is to make the player wake up in a room and find a way out. This will be an introduction to the controls. There will also be a puzzle to get out of the room, which will confirm a general understanding of the controls by the player. The player needs to master some basic commands in order to continue within the game.

Screen Shot 2017-07-16 at 5.52.52 PM.png

This warehouse section is the part of the game where the player wakes up and starts the experience. The player moves around pretty comfortably and the scene looks great, but most of the materials within this scene use Unity’s standard shader, and that is not good for mobile VR. Currently our draw calls are around 40 for this scene, but in some areas they are nearly 70. This needs to be fixed before we move on to the next level. However, I have hope we can fix this soon. There are some prototypes I’ve been working on that only have 9 draw calls; if I can figure out how that is being done, my hope is that Ernest and I can use that knowledge in the next scenes.
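One common first step for cutting draw calls on Gear VR is batch-swapping Standard materials to Unity’s built-in Mobile/Diffuse shader. The sketch below is a hypothetical editor utility, not how the 9-draw-call prototype actually works; the menu path and swap criteria are assumptions:

```csharp
// Editor utility sketch: switch Standard materials in the open scene
// to the cheaper built-in Mobile/Diffuse shader.
using UnityEngine;
using UnityEditor;

public static class MobileShaderSwap {
    [MenuItem("Tools/Swap To Mobile Diffuse")]   // arbitrary menu path
    static void Swap() {
        Shader mobile = Shader.Find("Mobile/Diffuse");
        foreach (Renderer r in Object.FindObjectsOfType<Renderer>()) {
            foreach (Material m in r.sharedMaterials) {
                if (m != null && m.shader.name == "Standard") {
                    m.shader = mobile;   // much cheaper per-pixel cost on mobile
                }
            }
        }
    }
}
```

Combined with marking non-moving props as static so Unity can batch them, this kind of pass is usually where the big draw-call wins come from on mobile VR.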

Screen Shot 2017-07-16 at 5.53.04 PM.png

However, for now this is excellent progress and I cannot wait to continue on Museum Multiverse. What I have to do next is get my controller scripts working with the character. This has been harder than I thought, but I will get it working, and it will be great when I do!

 

 

CRAZY THOUGHT

Earlier today, I started to wonder how Museum Multiverse would play if experienced from a first person camera. While I know that first person platformers are not the most praised of game genres, I thought about the focus on art and how players might be able to better appreciate the art if viewed from a first person perspective.

Screen Shot 2017-07-14 at 12.18.55 AM.png

We decided that over the next couple of weeks we’ll create and experiment inside a small mock scene in Unity, focusing more on utilizing the Gear VR controller and manipulating objects by picking them up and turning them around. What if we could pick up a piece of art, pull it in and out, turn it around, and fully appreciate the detail in each piece? Then we can intersperse sections of fast-paced third person platforming action with quieter times of first person appreciation and exploration of art. We don’t have any of the art assets in this room just yet, so we’ll be using simple geometric shapes and common room items to get the feel and controls right first.

Screen Shot 2017-07-16 at 7.02.05 PM.png

Screen Shot 2017-07-16 at 7.02.21 PM.png

I’ll continue to work on my third person platforming section, but I can’t rest until I thoroughly test this first person idea.

Week 4: Handling Camera Movement in VR

FOV.jpg

I am facing a problem in Museum Multiverse: the third person camera still feels weird following the player. I want to make sure the movement is smooth and comfortable for the camera and player, so I have been looking around for solutions, and I think I found one.

Limiting the player’s peripheral view reduces the motion sickness of movement. I learned this from a paper on the subject by Ajoy S Fernandes and Steven K. Feiner at Columbia University. Basically, their experiments showed a real reduction in motion sickness when the field of view is limited based on player speed.

So how do we do this in Unity?

First we need to import the older Image Effects into Unity from the asset store. We are really just looking for the Vignette And Chromatic Aberration script. After you import it, add it to your main camera. Once the script is added, set everything on it to 0; you will only be playing around with the Vignetting option.

Screen Shot 2017-07-09 at 3.11.17 PM.png

Try playing around with the Vignetting values to see how it affects your camera!

Screen Shot 2017-07-09 at 3.14.53 PM.png

Next we are going to write a script to adjust the Vignetting based on the camera’s speed.

using System.Collections;
using System.Collections.Generic;
using UnityStandardAssets.ImageEffects;
using UnityEngine;

public class FOVLimiter : MonoBehaviour {
    private Vector3 oldPosition;
    public float MaxSpeed = 6f;
    public float MaxFOV = .7f;

    public static float CRate = .01f;
    public static float RateCutOff = .25f;

    // max .7 Vignetting

    private VignetteAndChromaticAberration fovLimiter;
    // Use this for initialization
    void Start () {
        oldPosition = transform.position;
        fovLimiter = GetComponent<VignetteAndChromaticAberration> ();
    }
    
    // Update is called once per frame
    void Update () {
        Vector3 velocity = (transform.position - oldPosition) / Time.deltaTime;
        oldPosition = transform.position;

        float expectedLimit = MaxFOV;
        if (velocity.magnitude < MaxSpeed) {
            expectedLimit = (velocity.magnitude / MaxSpeed) * MaxFOV;
        }

        float currLimit = fovLimiter.intensity;
        float rate = CRate;

        if (currLimit < RateCutOff) {
            rate *= 3; //fast rate since the field of view is large and fast changes are less noticeable
        } else {
            rate *= .5f; //slower rate since the field of view changes are more noticeable for larger values.
        }

        fovLimiter.intensity = Mathf.Lerp (fovLimiter.intensity, expectedLimit, rate);
    }
}

So what the heck is the Field of Vision (FOV) Limiter script doing? We grab the distance the player has traveled each frame to find the speed of the player, and from that speed we calculate how much of the field of vision should be limited. Remembering some key points from the paper: the rate of the FOV transition can be faster when the field of view is large, because fast changes are less noticeable there, while FOV changes are more noticeable at larger vignette values.

Right now this is working pretty well but I know this is only step one to making a great 3rd person VR camera. Next week I will be focusing on making a smarter camera that can follow the player without getting stuck on walls.

If you would like to learn more about limiting the camera view to prevent motion sickness and other VR tips, I would recommend checking out FusedVR; these guys are great!