The Schick Hydro Indie Game Jam

Last week Schick Hydro partnered with Playcrafting to put on an indie game jam at Simple Machine in New York City. The planning and announcement for this jam were confidential until a little while ago, so I could not talk about it until recently.

We came up with a cool game called Calkarious.

Calkarious is a cooperative top-down shooter where two players must defend a powerful brain coral under constant siege by bioluminescent creatures.


Set inside a massive sphere of water in deep space, Calkarious involves rapid decision making and quick maneuvers as players struggle against six differently colored enemies.


Each enemy can only be killed by a shot matching its own color, and players
will need to constantly switch up their attacks to survive.


A third player can even take control of the brain coral itself (with the mouse), and
change its color to absorb similarly colored enemies. Absorb enough
enemies, and you’ll be able to release a devastating pulse attack.


The brain coral can only take three hits before it is destroyed. Players need to work
together to defend this magnificent coral for as long as possible and
achieve the highest score.


The team for this game jam was a group of rockstar game designers and developers! It included three of Eos Interactive's team members: Jose, Bobby, and John. We also had Kevin Harper, the VR game design rockstar behind Paulo's Wing. Last but not least, the final member of the team was me.

One of the most interesting constraints we had for the game jam was getting our content to work inside an arcade cabinet. The cabinet, called the Polycade, was made for running custom games on custom hardware.


We wanted to use every button in the arcade cabinet for our game.

Polycontrols

Our goal of utilizing every button led us to the mechanics of shooting each enemy with the correspondingly colored shot, and of moving the brain coral out of danger with the trackball on the arcade cabinet.


Calkarious is free to play today on itch.io, and will be available in an arcade cabinet (the best way to play it) on December 15th at Playcrafting's 2017 Bit Awards.

Screen Shot 2017-11-26 at 3.13.20 PM

Come root for my first VR project, Don't Look Away, at the Bit Awards.

Anyhow, this past weekend was fun, but I cannot wait to return to Museum Multiverse, my VR puzzle-platforming epic! I have so many cool things to show everyone about the game soon!

Showing Museum Multiverse to the World!


This past weekend I was able to show Museum Multiverse and my other projects to New York City at Microsoft HQ. The event was part of Playcrafting's Halloween Expo, an event with over 150 indie games and over 1,200 attendees. Now that's a lot of play-testers.


I was with my team for the expo, so we were able to get a big room for our projects: the projects that are out, Don't Look Away and Witchualistic, and our projects in development, Museum Multiverse and The Take.

I got over a hundred play-testers to try out Museum Multiverse and got valuable feedback from the experience.

Here is what I learned.

  1. My first major puzzle is still too hard for a lot of players.

Screen Shot 2017-09-11 at 11.47.46 PM

I had to give a lot of hints to people for my light reflection puzzle. This means I should have another puzzle leading up to this more complex challenge. I have created an easier puzzle before this challenge in order to help players understand what to do in the room they had trouble in. I got this advice from my friend Brett Taylor, who made a hit game called LineLight. He recommended that I strip away the noise from the puzzle, which will help the player understand the mechanic, so I came up with this:
Screen Shot 2017-11-01 at 10.44.46 PM

This is a cleaner, noise-free puzzle that gives the player an understanding of the light mechanic.

  2. Players loved the 2D section, but the controls need improvement.

giphy (4)

Jumping and landing are slippery in 2D. I think I can fine-tune some Unity parameters to fix this problem.
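One approach I am considering, sketched out below: drive the Rigidbody2D's horizontal velocity directly and pull it toward zero quickly when there is no input, instead of relying on physics friction. The parameter values here are guesses to tune, not final numbers.

using UnityEngine;

// Sketch of tighter 2D movement: snap horizontal velocity toward the
// target speed so releasing the stick stops the character quickly.
[RequireComponent(typeof(Rigidbody2D))]
public class TightMove2D : MonoBehaviour {
    public float moveSpeed = 6f;       // top horizontal speed
    public float stopSharpness = 20f;  // higher = snappier stops

    Rigidbody2D body;

    void Awake() {
        body = GetComponent<Rigidbody2D>();
    }

    void FixedUpdate() {
        float input = Input.GetAxisRaw("Horizontal");
        float targetX = input * moveSpeed;
        // Lerp toward the target speed each physics step; with no input
        // the target is 0, so the character stops instead of sliding.
        float newX = Mathf.Lerp(body.velocity.x, targetX, stopSharpness * Time.fixedDeltaTime);
        body.velocity = new Vector2(newX, body.velocity.y);
    }
}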

  3. I still have a lot of work to do!

There is so much to do, but this is exciting! Everyone enjoyed the project and some people even came back to play the experience again!

I will continue to update you all on my progress on Museum Multiverse, so stay tuned on Twitter, Facebook, and on the site.

See you in VR!

A New Way to Hear? fmod in Prod

In Museum Multiverse, I knew that to make the project truly come together I would need an amazing soundtrack to captivate the player. Thanks to Niko Korolog and his work with adaptive music in my game, I now have music that will suck the player in from the start to the end of the demo. Niko used a program called FMOD to create an adaptive soundtrack. FMOD is a sound effects engine for video games and applications, developed by Firelight Technologies, that plays and mixes sound files of diverse formats on many operating systems. To learn more about this awesome application, visit their site.

Screen Shot 2017-08-10 at 1.18.09 PM.png

This program gives me the control to shut off layers of music at my choosing and turn on other ones through code. To get started on learning this magic, I'd recommend this awesome tutorial from FMOD on integrating the middleware into Unity.
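As a rough sketch of what that looks like in Unity: the event path "event:/Music" and parameter name "Intensity" below are placeholders, and would have to match whatever is actually authored in the FMOD Studio project.

using UnityEngine;

// Sketch of driving an adaptive FMOD music event from code.
// Layers fade in/out because the FMOD Studio project automates
// their volumes on a game parameter; we only set that parameter.
public class AdaptiveMusic : MonoBehaviour {
    FMOD.Studio.EventInstance music;

    void Start() {
        // "event:/Music" is a placeholder event path.
        music = FMODUnity.RuntimeManager.CreateInstance("event:/Music");
        music.start();
    }

    // Call from gameplay code to bring layers in or out.
    public void SetIntensity(float value) {
        // "Intensity" is a placeholder parameter name. Note: older
        // FMOD integrations use setParameterValue instead.
        music.setParameterByName("Intensity", value);
    }

    void OnDestroy() {
        music.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        music.release();
    }
}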

I cannot wait to continue incorporating this adaptive soundtrack into Museum Multiverse.

A Mini Game Becomes a Game

Well it’s been a week, and we’ve been able to implement decent rotation of the game objects on the X and Y axis using the Gear VR’s touchpad. Now when we hold onto objects using the trigger, we can swipe left and right to rotate them on the Y axis, and swipe up and down to rotate them on the X axis. We’ve even been able to pick up the rotation of the controller itself to rotate the objects on the Z axis whenever we roll the controller with our wrists. This motion gives instant feedback and a real sense of connection to the objects.
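That swipe-plus-wrist-roll scheme looks roughly like this in Unity. The input-reading helpers at the bottom are placeholders standing in for the real Gear VR controller plugin calls.

using UnityEngine;

// Sketch of rotating a held object from touchpad swipes and controller roll.
public class HeldObjectRotator : MonoBehaviour {
    public Transform heldObject;         // object currently gripped with the trigger
    public float swipeSensitivity = 90f; // degrees per unit of touchpad swipe

    Vector2 lastTouch;
    float lastRoll;

    void Update() {
        if (heldObject == null) return;

        // Swipe left/right -> rotate on the Y axis; swipe up/down -> X axis.
        Vector2 touch = ReadTouchpad(); // placeholder
        Vector2 delta = touch - lastTouch;
        lastTouch = touch;
        heldObject.Rotate(Vector3.up, -delta.x * swipeSensitivity, Space.World);
        heldObject.Rotate(Vector3.right, delta.y * swipeSensitivity, Space.World);

        // Rolling the controller with the wrist -> rotate on the Z axis.
        float roll = ReadControllerRoll(); // placeholder (degrees)
        heldObject.Rotate(Vector3.forward, roll - lastRoll, Space.World);
        lastRoll = roll;
    }

    // Placeholder input reads -- replace with the actual controller plugin.
    Vector2 ReadTouchpad() { return Vector2.zero; }
    float ReadControllerRoll() { return transform.eulerAngles.z; }
}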

Screen Shot 2017-08-07 at 12.00.50 AM.png

After we implemented these basic controls, we decided to finally put some of our friends in a room with simple rigidbody objects, and told them to experiment and explore as much as they wanted. One of my 3D modeling friends, Jose, was excited to finally see some of the models he made inside of a game.

Screen Shot 2017-08-07 at 12.01.00 AM.png

After switching the headsets and controllers back and forth between our other game developer friends Andy and Rob, Jose noticed that some of his models were missing. When he thought that they might have been glitched outside the room, Rob said that they weren’t glitched at all, and that he hid them somewhere in the room. He then challenged Jose to find them in two minutes. This led to all of us hiding and finding objects for the next half hour or so. We ended up getting pretty sidetracked. It was simple, but in a refreshing sort of way. Rob commented how this should just be a game in its own right and we all sort of agreed.

Screen Shot 2017-08-07 at 12.01.08 AM.png

Screen Shot 2017-08-07 at 12.01.18 AM.png

I’ve decided that given the scope and timescale of Museum Multiverse in its current state, I’m going to instead focus most of my time on this new concept. I’ll still work on Museum Multiverse with Ernest, but for the Oculus Launch Pad program I’m going to be diverting my efforts towards this now. Rob and Jose came up with a name for it already – “The Take”, and are currently working on fleshing out a spy theme and some design documents for it.

Screen Shot 2017-08-07 at 12.01.27 AM.png

Screen Shot 2017-08-07 at 12.01.38 AM.png


Week 7: Sketching out the Main Character


This week I focused primarily on creating the main character for the game. I knew the general character I wanted for the project, but I did not know how to go about starting his design. In my art past, which was very long ago, I focused on drawing Caucasian main characters, because that's what the resources from my lessons taught.

20170730_230514.jpg 20170730_230527.jpg

However, I knew from the start of planning this project that I wanted an African American boy as the main character, so I had to dig for inspiration on this front. This was very challenging due to the limited number of good main characters who happen to be black, let alone video game characters. I looked at characters like Huey Freeman from The Boondocks and the children from Playdead games like Inside and Limbo to get a general idea of what I wanted.


I wanted to show the character as scared and vulnerable, so I went with this hunched-over, scared look, then continued with more angles and ideas for the character.


I want the character to have an Afro, but after a fruitful conversation with Ernest we came to the conclusion that his fro should not be the biggest thing on him. So I shortened his hair to make him and his actions, not the fro, the most memorable part of the character.



I still have a lot of work ahead of me to make this character work, but I believe we are on the right track, and we are weeks away from having this character 3D modeled and animated. So more to come soon!!!

Week 6: Controls are Working and The Take is taking form!

This week I have been mostly fighting with the controls, and I have finally gotten something working!


The problem was that within my player move script I needed to register the player's input from the Gear VR Controller and add it to the movement of the player. After the player received this movement, I then had to set that same input back to 0, all within one update cycle.


// localTouch is registered as a touch callback in the OnEnable function;
// it records the latest touch position from the Gear VR trackpad.
void localTouch(InputTouch touch) {
    touchHorizontal = touch.currentTouchPosition.x;
    touchVertical = touch.currentTouchPosition.y;
}

void Update() {
    // Read the VR controller input recorded by the callback...
    float h = touchHorizontal;
    float v = touchVertical;

    // ...then set it back to 0f so it doesn't keep moving the character.
    touchHorizontal = 0f;
    touchVertical = 0f;

    // Do all my movement logic with h and v after this point....
}

This is looking good for now. I will take more time to refine the controls in the future, but progress is being made!!!
Speaking of progress being made… I have still been playing around with the idea I had from last week as well.

Over the last week we’ve been working on getting the Gear VR controller to work within our scene and interact with objects. It’s been slow going, but I think we’re finally making some progress. So far, I’ve been able to make a raycasting object that is attached to the Gear VR controller in the scene. This raycaster can detect designated objects, temporarily change their material as it hovers over them, and teleport/lock them to the impact point of the laser with a press of the trigger. Because we’ve given these objects rigidbodies, they will move and bump into other objects in the scene as we move the controller around. Then, when we let go of the trigger, the object will fall to the ground with physics. I’m a little hesitant to rely so heavily on physics in a mobile VR game like this, but anyone who has gotten a chance to try this out has responded positively to these simple but intuitive forms of interactions with objects in 3D space.
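A simplified sketch of that interaction, attached to the tracked controller object. The "Grabbable" tag and the trigger-reading helpers below are placeholders, not the project's actual names.

using UnityEngine;

// Sketch: laser from the controller highlights designated objects and,
// while the trigger is held, drags the grabbed rigidbody to the impact
// point; releasing the trigger lets physics take over.
public class LaserGrabber : MonoBehaviour {
    public Material highlightMaterial;
    public float maxDistance = 10f;

    Renderer hovered;
    Material originalMaterial;
    Rigidbody held;

    void Update() {
        RaycastHit hit;
        bool didHit = Physics.Raycast(transform.position, transform.forward,
                                      out hit, maxDistance);

        // Hover highlight: temporarily swap the material under the laser.
        ClearHover();
        if (didHit && held == null && hit.collider.CompareTag("Grabbable")) {
            hovered = hit.collider.GetComponent<Renderer>();
            originalMaterial = hovered.material;
            hovered.material = highlightMaterial;

            if (GetTriggerDown())
                held = hit.rigidbody; // lock the object to the laser
        }

        if (held != null) {
            if (GetTriggerHeld() && didHit)
                held.MovePosition(hit.point); // drag along the laser's impact point
            else
                held = null; // released: object falls with physics
        }
    }

    void ClearHover() {
        if (hovered != null) {
            hovered.material = originalMaterial;
            hovered = null;
        }
    }

    // Placeholder input reads -- swap in the real Gear VR controller API.
    bool GetTriggerDown() { return Input.GetButtonDown("Fire1"); }
    bool GetTriggerHeld() { return Input.GetButton("Fire1"); }
}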

My next goal is to figure out how to make the rotation and precise manipulation of objects feel tight and refined. This is the most important part because I want the players to appreciate the detail of the art they’re picking up, and they can’t do that unless the controls feel smooth and intuitive.

Week 4: Handling Camera Movement in VR


I am facing a problem in Museum Multiverse: the third-person camera still feels weird following the player. I want the camera movement to be smooth and comfortable for the player, so I have been looking around for solutions, and I think I found one.

Limiting the player's peripheral view reduces the motion sickness of movement. I learned this from a paper on the subject by Ajoy S Fernandes and Steven K. Feiner at Columbia University. Basically, their experiments showed a real reduction in motion sickness when the field of view was limited based on player speed.

So how do we do this in Unity?

First, we need to import the legacy Image Effects package into Unity from the Asset Store. We are really just looking for the Vignette and Chromatic Aberration script. After you import it, add it to your main camera. Once the script is added, set everything on it to 0; you will only be playing around with the Vignetting option.

Screen Shot 2017-07-09 at 3.11.17 PM.png

Try playing around with the Vignetting values to see how it affects your camera!

Screen Shot 2017-07-09 at 3.14.53 PM.png

Next, we are going to write a script to adjust the Vignetting based on the camera's speed.

using System.Collections;
using System.Collections.Generic;
using UnityStandardAssets.ImageEffects;
using UnityEngine;

public class FOVLimiter : MonoBehaviour {
    private Vector3 oldPosition;
    public float MaxSpeed = 6f;
    public float MaxFOV = .7f; // max .7 Vignetting

    public static float CRate = .01f;
    public static float RateCutOff = .25f;

    private VignetteAndChromaticAberration fovLimiter;

    // Use this for initialization
    void Start () {
        oldPosition = transform.position;
        fovLimiter = GetComponent<VignetteAndChromaticAberration> ();
    }

    // Update is called once per frame
    void Update () {
        Vector3 velocity = (transform.position - oldPosition) / Time.deltaTime;
        oldPosition = transform.position;

        float expectedLimit = MaxFOV;
        if (velocity.magnitude < MaxSpeed) {
            expectedLimit = (velocity.magnitude / MaxSpeed) * MaxFOV;
        }

        float currLimit = fovLimiter.intensity;
        float rate = CRate;

        if (currLimit < RateCutOff) {
            rate *= 3; // faster rate, since the field of view is large and fast changes are less noticeable
        } else {
            rate *= .5f; // slower rate, since field of view changes are more noticeable at larger values
        }

        fovLimiter.intensity = Mathf.Lerp (fovLimiter.intensity, expectedLimit, rate);
    }
}
So what the heck is the Field of View (FOV) Limiter script doing? We grab the distance the player has traveled each frame to find the player's speed, then calculate how much of the field of view should be limited based on that speed. Remembering some key points from the paper: the rate of the FOV transition can be faster while the field of view is still large, because fast changes are less noticeable there, but FOV changes are more noticeable at larger limit values, so the transition should slow down.

Right now this is working pretty well, but I know this is only step one to making a great third-person VR camera. Next week I will be focusing on making a smarter camera that can follow the player without getting stuck on walls.

If you would like to learn more about limiting the camera's view to prevent motion sickness, and other VR tips, I would recommend checking out FusedVR; these guys are great!

Week 3: Getting the Oculus Controller Working and 360 Videos are Awesome

Let me start this post by saying 360 photography is awesome!


I received my new Gear 360 camera this week and I have been taking photos every day since. It is so much fun to capture the entirety of a moment in a picture, and my life has some really silly moments!


Also this week, I have incorporated the Gear VR controller into Museum Multiverse. I wanted to find a comfortable way of moving a player in 3D space, and I think I found it with Easy Input for Gear VR from the Unity Asset Store. This made moving a GameObject in Unity with the Gear VR controller very easy. I have decided to use the trackpad on the Gear VR controller to move the player around because it feels more like a joystick on a normal controller.

giphy (3).gif

My next challenge is linking up my character’s movement animations with the controller’s movement.

giphy (2).gif

I also worked with Ernest to get the layout of the museum world. If you’d like to learn more about this he has written an excellent blog post on the subject here!



Oculus Launch Pad Week 2: Down in VR Trenches

This has been a truly eventful week in the world of VR for me. Along with creating my first scene for Museum Multiverse, I gave my first public talk about VR at Game Devs of Color in New York City. I have been working with Ernest Walker, another of the Oculus Launch Padders, to create a very polished rendition of the project. Here is what we have learned.

There are about nine clear scenes I see in this Launch Pad demo, so I started by creating the most straightforward scene from my storyboards. Within this scene the player has switched perspectives from third person to first person, so the first task was getting the Oculus plugin from the Oculus website in order to have the first-person perspective.

My next task was to create a convincing scene for the player. To paint the picture of what is going on: the player is being chased by a monster through the museum, finds a couple of lockers in the employees' section, and climbs into one of the open lockers to hide from his pursuer.

Screen Shot 2017-06-20 at 2.02.51 AM.png

To make this convincing, but also to cut back on rendering for the Gear VR, I built the scene around only what the player will see. Luckily, this worked out great because the player's sight is limited by hiding within a locker. Afterwards, Ernest created really high-quality models for the interior of the locker. This added a level of immersion I could not have achieved without him. His knowledge of 3D environmental modeling is very impressive, and I am very excited to be working with him.

This is the working prototype of the scene with everything together.

test with cubes:

test with a monster:

final product:

There is still a lot of work to do but we are making progress and it feels good.

On another note, I gave a talk about VR in games and beyond for Game Devs of Color. This was a micro talk designed to educate people on what VR is, with tips on VR development, lessons from the trenches, and advice for starting a project in VR. If you're interested in seeing my talk, I am at 2 hours and 36 minutes into the livestream.

Releasing My First Video Game Don’t Look Away VR

On January 25, 2017, my friends and I released a VR experience called Don't Look Away for the Gear VR.


We worked on this project for a cumulative 7 months, and for many of us on the team it was the first video game we had ever released. It was a long and difficult journey. We learned so much and had so much fun, but it was time to set our little VR baby out in the wild for others to enjoy the fruits of our labor. We released our project expecting maybe 500 downloads. However, we were pleasantly freaked out to see that downloads for Don't Look Away were over 67,000 after the first two weeks. Soon after showing Don't Look Away at PAX East, our downloads reached 100,000+.

Our pet project was now an unprecedented success, and we had no idea why we were more successful than our mentor's project, Swing Star VR. In hindsight, I think we did a couple of things right that set us up for success. I'd like to share these tips in the hope that they can help future indie game devs looking to make their dreams come true as well.


  • Tip 1: Reaching out to YouTubers before the game's release.

Before our game's release we contacted a ton of YouTubers who specialized in reviewing or doing commentary on VR games. We offered them a free copy of the game before release to review, and told them they could not publish their video until a day or two before the game came out on the Oculus store. It is important to give YouTubers early access in order to entice them into creating a video for your game. The YouTuber who releases a video first on a new game gets the most views, so it is a race for them to finish their work and get the video ready for the embargo date. This helps them get viewers excited to see a new game before its release, which bumps up their subscribers, and it helps you as the creator get your game in front of as many people as you can within your release window. This way potential buyers can see your game through a YouTube let's play and pick the game up when it's released later that week. How did we get the contacts of all those YouTubers, you might ask? It was just a process of going on YouTube, finding reviews we liked, then contacting the creators through their contact page on YouTube.


  • Tip 2: Going to PAX East

Going to events like PAX, E3, or Play NYC is an awesome way to get hundreds of people playing your game in a short amount of time. These events are great because you get to see real people trying out your game. You can see the low and high points of your game, and it is also just rewarding to see people react to your work. Seeing the excitement and fear on a player's face for a feature or level you've worked on for months is a necessary cathartic exercise in making interactive art. On top of that, if you are unable to regularly playtest your game at smaller venues like Playcrafting's expos or NYU's Playtest Thursdays on the East Coast, going to these events is necessary for finding and fixing level and design problems within a game. Usually we as developers are so focused on making our projects that we create things in a vacuum. Showing your project gives you the real-world player feedback you need to improve the experience, and the gratification that your work matters.


  • Tip 3: Lean on your support groups in your area

We in NYC have an organization called Playcrafting NYC, and the community in this group was very important not only in getting Don't Look Away ready for release, but in actually forming the group to start the project! I met or heard about everyone on the Don't Look Away team through Playcrafting. The Playcrafting community was made to empower game developers to make and finish games. Organizations or meetup groups like this are rooting for your success because it's a win-win scenario for all parties involved. If a hit game is released within the community, that drives more buzz around the group, and if you're making a game, you can lean on the community to get the word out, get help on hard problems, or just find talented, committed people to join your team. If you do not have a community group like this around you, I'd say make one. I guarantee someone in your area either wants to make a game or app and does not know where to start, or is making a project siloed and could use a community of even just one for support. Groups like this start with just a few people and grow, so if you're making a game, find or make a group to support your process of finishing your experience.

Currently, Don't Look Away is a couple thousand shy of 200,000 downloads, and players of the experience have collectively spent over 930 days within our work. Honestly, it is easy to put my face on this project as its lead creator, but projects like Don't Look Away almost always have a team that makes them happen, and that is why I'd like to take this time to thank everyone who worked on this project and stuck with it to the end: Andy Lohmann, Bobby (Robert Canciello), Jose Zambrano, Andrew Struck-Marcell, Sean Hyland, and everyone who helped in their own small way. Don't Look Away would not have been possible without you all.

We are getting ready for our next project, so stay tuned to our progress on Twitter and Facebook.

Try out Don’t Look Away (here).