The Schick Hydro Indie Game Jam

Last week Schick Hydro partnered with Playcrafting to put on an indie game jam at Simple Machine in New York City. The planning and announcement for this jam were confidential until recently, so I could not talk about it until now.

We came up with a cool game called Calkarious.

Calkarious is a cooperative top-down shooter where two players must defend a powerful brain coral under constant siege by bioluminescent creatures.


Set inside a massive sphere of water in deep space, Calkarious involves rapid decision making and quick maneuvers as players struggle against six different colored enemies.


Each enemy can only be killed by a shot matching its own color, and players
will need to constantly switch up their attacks to survive.


A third player can even take control of the brain coral itself (with the mouse), and
change its color to absorb similarly colored enemies. Absorb enough
enemies, and you’ll be able to release a devastating pulse attack.


The brain coral can only take three hits before it is destroyed. Players need to work
together to defend this magnificent coral for as long as possible and
achieve the highest score.


The team for this game jam was a group of rockstar game designers and developers! It included three of Eos Interactive’s team members: Jose, Bobby, and John. We also had the VR game design rockstar behind Paulo’s Wing, Kevin Harper. Last but not least, the final member of the team was me.

One of the most interesting constraints we had for the game jam was getting our content to work inside an arcade cabinet. The cabinet, called the Polycade, is made for running custom games on custom hardware.


We wanted to use every button in the arcade cabinet for our game.

Our goal of utilizing every button led us to the game mechanic of shooting enemies with the button of the corresponding color, and moving the brain coral out of danger with the ball on the arcade cabinet.


Calkarious is free to play today on itch, and will be available to play in an arcade cabinet (the best way to play it) on December 15th at Playcrafting’s 2017 Bit Awards.


Come root for my first VR project, Don’t Look Away, at the Bit Awards.

Anyway, this past weekend was fun, but I cannot wait to return to Museum Multiverse, my VR puzzle-platforming epic! I have so many cool things to show everyone about the game soon!

Showing Museum Multiverse to the World!


This past weekend I was able to show Museum Multiverse and my other projects to New York City at Microsoft HQ. The event was part of Playcrafting’s Halloween Expo, an event with over 150 indie games and over 1,200 attendees. Now that’s a lot of play-testers.


I was with my team for the expo, so we were able to get a big room for our projects: the ones that are out, Don’t Look Away and Witchualistic, and our projects in development, Museum Multiverse and The Take.

I got over a hundred play-testers to try out Museum Multiverse and got valuable feedback from the experience.

Here is what I learned.

  1. My first major puzzle is still too hard for a lot of players.

I had to give people a lot of hints for my light reflection puzzle. This means I should have another puzzle leading up to this more complex challenge, so I have created an easier puzzle before it to help the player understand what to do in the room players had trouble in. I got this advice from a friend of mine, Brett Taylor, who made a hit game called LineLight. He recommended that I strip away the noise from the puzzle, which helps the player understand the mechanic, so I came up with this:
This is a cleaner, noise-free puzzle that gives the player an understanding of the light mechanic.

  2. Players loved the 2D section, but the controls need improvement.

Jumping and landing are slippery in 2D. I think I can fine-tune some Unity parameters to fix this problem.

  3. I still have a lot of work to do!

There is so much to do, but this is exciting! Everyone enjoyed the project and some people even came back to play the experience again!

I will continue to update you all on my progress on Museum Multiverse. Stay tuned on Twitter, Facebook, and the site.

See you in VR!

A New Way to Hear? FMOD in Prod

In Museum Multiverse, I knew that to make the project truly come together I would need an amazing soundtrack to captivate the player. Thanks to Niko Korolog and his work with adaptive music in my game, I now have music that will pull the player in from the start of the demo to the end. Niko used a program called FMOD to create the adaptive soundtrack. FMOD is a sound effects engine for video games and applications, developed by Firelight Technologies, that plays and mixes sound files of diverse formats on many operating systems. To learn more about this awesome application, visit their site.


This program gives me the control to shut off layers of music at my choosing and turn on others through code. To get started learning this magic, I’d recommend this awesome tutorial from FMOD on integrating the middleware into Unity.

I cannot wait to continue incorporating this adaptive soundtrack into Museum Multiverse.

Week 7: Sketching out the Main Character


This week I focused primarily on creating the main character for the game. I knew the general character I wanted for the project, but I did not know how to go about starting his design. In my art past, which was very long ago, I focused on making Caucasian main characters, because that’s what the resources from my lessons taught.


However, I knew from the start of planning this project that I wanted an African American boy as the main character, so I had to dig for inspiration on this front. This was very challenging due to the limited number of good main characters who happen to be black, let alone video game characters. I looked at characters like Huey Freeman from The Boondocks and the children from Playdead games like Inside and Limbo to get a general idea of what I wanted.


I wanted to show the character as scared and vulnerable, so I went with this hunched-over, frightened look. Then I continued with more angles and ideas for the character.


I wanted the character to have an Afro, but after a fruitful conversation with Ernest we came to the conclusion that his ’fro should not be the biggest thing about him, so I shortened his hair to make him and his actions the most memorable part of the character, not the ’fro.



I still have a lot of work ahead of me to make this character work, but I believe we are on the right track, and we are weeks away from having him 3D modeled and animated. So more to come soon!

Week 4: Handling Camera Movement in VR


I am facing a problem in Museum Multiverse: the third-person camera still feels weird following the player. I want the movement to be smooth and comfortable for both the camera and the player, so I have been looking around for solutions, and I think I found one.

Limiting the player’s peripheral view reduces the motion sickness of movement. I learned this from a paper on the subject by Ajoy S. Fernandes and Steven K. Feiner at Columbia University. Basically, their experiments showed a real reduction in motion sickness when the field of view is limited based on player speed.

So how do we do this in Unity?

First, we need to import the older Image Effects package into Unity from the Asset Store. We are really just looking for the Vignette and Chromatic Aberration script. After you import it, add it to your main camera. Once this script is added, set everything on it to 0; you will only be playing around with the vignetting option.


Try playing around with the Vignetting values to see how they affect your camera!


Next, we are going to write a script to adjust the Vignetting based on the camera’s speed.

using System.Collections;
using System.Collections.Generic;
using UnityStandardAssets.ImageEffects;
using UnityEngine;

public class FOVLimiter : MonoBehaviour {
    private Vector3 oldPosition;
    public float MaxSpeed = 6f;
    public float MaxFOV = .7f; // max Vignetting intensity

    public static float CRate = .01f;
    public static float RateCutOff = .25f;

    private VignetteAndChromaticAberration fovLimiter;

    // Use this for initialization
    void Start () {
        oldPosition = transform.position;
        fovLimiter = GetComponent<VignetteAndChromaticAberration> ();
    }

    // Update is called once per frame
    void Update () {
        // Distance moved since the last frame, divided by frame time, gives the camera's velocity
        Vector3 velocity = (transform.position - oldPosition) / Time.deltaTime;
        oldPosition = transform.position;

        // At or above MaxSpeed, clamp to the maximum vignette; below it, scale proportionally
        float expectedLimit = MaxFOV;
        if (velocity.magnitude < MaxSpeed) {
            expectedLimit = (velocity.magnitude / MaxSpeed) * MaxFOV;
        }

        float currLimit = fovLimiter.intensity;
        float rate = CRate;

        if (currLimit < RateCutOff) {
            rate *= 3f;  // faster rate, since the field of view is still large and fast changes are less noticeable
        } else {
            rate *= .5f; // slower rate, since FOV changes are more noticeable at larger vignette values
        }

        fovLimiter.intensity = Mathf.Lerp (fovLimiter.intensity, expectedLimit, rate);
    }
}
So what the heck is the Field of Vision (FOV) Limiter script doing? We grab the distance the player has traveled each frame to find the player’s speed, and calculate how much of the field of vision should be limited based on that speed. Remembering some key points from the paper: the rate of the FOV transition can be faster when the field of view is large, because fast changes are less noticeable there, but FOV changes are more noticeable at larger values.

Right now this is working pretty well, but I know this is only step one toward a great third-person VR camera. Next week I will focus on making a smarter camera that can follow the player without getting stuck on walls.

If you would like to learn more about limiting camera view to prevent motion sickness, along with other VR tips, I recommend checking out FusedVR; these guys are great!

Week 3: Getting the Oculus Controller Working and 360 Videos are Awesome

Let me start this post by saying 360 photography is awesome!


I received my new Gear 360 camera this week, and I have been taking photos every day since. It is so much fun to capture the entirety of a moment in a picture, and my life has some really silly moments!


Also this week, I incorporated the Gear VR controller into Museum Multiverse. I wanted to find a comfortable way of moving a player in 3D space, and I think I found it with Easy Input for Gear VR from the Unity Asset Store. It made moving a GameObject with the Gear VR controller very easy. I decided to use the trackpad on the Gear VR controller to move the player around, because it feels more like the joystick on a normal controller.


My next challenge is linking up my character’s movement animations with the controller’s movement.


I also worked with Ernest on the layout of the museum world. If you’d like to learn more about this, he has written an excellent blog post on the subject here!



Releasing My First Video Game Don’t Look Away VR

On January 25, 2017, my friends and I released a VR experience called Don’t Look Away for the Gear VR.


We had been working on this project for an accumulated seven months, and for many of us on the team it was the first video game we had ever released. It was a long and difficult journey. We learned so much and had so much fun, but it was time to set our little VR baby out into the wild for others to enjoy the fruits of our labor. We released our project expecting maybe 500 downloads. However, we were pleasantly freaked out to see that downloads for Don’t Look Away exceeded 67,000 after the first two weeks. Soon after showing Don’t Look Away at PAX East, our downloads reached 100,000+.

Our pet project was now an unprecedented success, and we had no idea why we were more successful than our mentor’s project, Swing Star VR. In hindsight, I think we did a couple of things right that set us up for success. I’d like to share these tips in the hope that they can help future indie game devs looking to make their dreams come true as well.


  • Tip 1: Reach out to YouTubers before the game’s release.

Before our game’s release, we contacted a ton of YouTubers who specialized in reviewing or doing commentary on VR games. We offered them a free copy of the game before release to review, and told them they could not release their videos until a day or two before the game came out on the Oculus Store. It is important to give YouTubers early access to entice them into creating a video for your game. The YouTuber who releases a video first on a new game gets the most views, so it is a race to finish their work and get the video ready for the embargo date. This helps them get viewers excited about a new game before its release, which bumps up their subscribers, and it helps you as the creator get your game in front of as many people as you can within your release window. This way, potential buyers can see your game through a YouTube let’s play and pick the game up when it’s released later that week. How did we get the contacts of all those YouTubers, you might ask? It was just a process of going on YouTube, finding reviews we liked, then contacting the creators through their contact pages on YouTube.


  • Tip 2: Going to PAX East

Going to events like PAX, E3, or Play NYC is an awesome way to get hundreds of people playing your game within a small amount of time. These events are great because you get to see real people trying out your game: you can see its low and high points. It is also just rewarding to see people react to your work. Seeing the excitement and fear on a player’s face over a feature or level you’ve worked on for months is a necessary cathartic exercise in making interactive art. Along with this benefit, if you are unable to regularly play-test your game at smaller venues like Playcrafting’s expos or NYU’s Playtest Thursdays on the East Coast, going to these events is necessary for finding and fixing level and design problems within a game. Usually we as developers are so focused on making our project that we create things in a vacuum. Showing your project gives you the real-world player feedback you need to improve the experience, and the gratification that your work matters.


  • Tip 3: Lean on the support groups in your area

In NYC we have an organization called Playcrafting NYC, and this community was very important not only in getting Don’t Look Away ready for release, but in actually forming the group that started the project! Everyone on the Don’t Look Away team I either met or heard about through Playcrafting. The Playcrafting community was made to empower game developers to make and finish games. Organizations or meetup groups like this are rooting for your success because it’s a win-win scenario for all parties involved: a hit game released within the community drives more buzz around that group, and if you’re making a game you can lean on the community to get the word out, to get help on hard problems, or just to find talented, committed people to join your team. If you do not have a community group like this around you, I’d say make one. I guarantee someone in your area either wants to make a game or app and does not know where to start, or is working on a project siloed and could use a community of even just one for support. Groups like this start with just a few people and grow, so if you’re making a game, find or make a group to support your process of finishing your experience.

Currently, Don’t Look Away is a couple thousand shy of 200,000 downloads, and players of the experience have collectively spent over 930 days within our work. Honestly, it is easy to put my face on this project as its lead creator, but projects like Don’t Look Away almost always have a team that makes them happen, and that is why I’d like to take this time to thank everyone who worked on this project and stuck with it to the end: Andy Lohmann, Bobby (Robert Canciello), Jose Zambrano, Andrew Struck-Marcell, Sean Hyland, and everyone who helped in their own small way. Don’t Look Away would not have been without you all.

We are getting ready for our next project, so stay tuned to our progress on Twitter and Facebook.

Try out Don’t Look Away (here).

Time Management: Frontend vs. Backend


I was having the biggest problem dealing with time in my small weather app. I created logic on my Rails backend to take the current time, format that time object into a 12-hour format, and return the formatted date string to the frontend of my app. This worked perfectly… until I deployed the application. I used Heroku to deploy and host the app, but after deployment my app gained four hours, giving my shiny new app the wrong time.


As a junior dev at the time, I knew this must have been a problem based on where the Heroku servers were located; however, I had no idea how to handle this in Rails, and I was too new to JavaScript to handle it on the frontend. So, to solve this, I monkey-patched a solution: I subtracted four hours from the date and called it a day. The problem was solved until daylight saving time reared its ugly head. Once daylight saving happened, I had the wrong hour no matter what time zone the app was opened in.
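Looking back, the failure is easy to reproduce. Here is a minimal Ruby sketch of the problem (assuming a Unix-like system where ENV['TZ'] controls how Time#localtime resolves a zone; the dates are arbitrary examples, not from the actual app):

```ruby
# Why the fixed "subtract 4 hours" patch breaks under daylight saving time.
ENV['TZ'] = 'America/New_York' # pretend the user is in New York

summer = Time.utc(2016, 7, 25, 18, 0, 0) # server (UTC) time in July
winter = Time.utc(2016, 1, 25, 18, 0, 0) # server (UTC) time in January

# Naive patch: always subtract 4 hours from the server's UTC clock.
patched_summer = (summer - 4 * 3600).strftime('%l:%M %p').strip
patched_winter = (winter - 4 * 3600).strftime('%l:%M %p').strip

# What the user's clock in New York actually reads.
actual_summer = summer.localtime.strftime('%l:%M %p').strip
actual_winter = winter.localtime.strftime('%l:%M %p').strip

puts patched_summer == actual_summer # true  (EDT is UTC-4, so the patch matches)
puts patched_winter == actual_winter # false (EST is UTC-5, so the patch is an hour off)
```

A fixed offset only matches the user's clock for part of the year; the real fix is to format the time in the user's actual time zone, which is exactly what the browser does for you.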

I really did not understand how to solve this problem until I learned more about the magic of JavaScript on the frontend of an application. The time in the user’s browser is always right for the user’s location, so I just asked the browser for the time.


By doing this, I bypass the backend, the server, and Heroku, which were messing up the time on the app.

Now Forcast shows the right time, and all is right in the world… for now.

Running a Simple Script in Ruby

Hi Everyone!

So I recently had to write a script in Ruby at my job to scrape a webpage for information, when I realized… wait, how do you run a Ruby script? I was so used to using the `rails server` command to run my programs in Rails that I had no idea how to do this in a non-Rails/Sinatra setting. Of course, first you make your script with a “touch ‘your script name’”.

When you want to run your Ruby script, you’d enter the command:

ruby -r "./your_script_name.rb" -e "YourClassName.your_method_name 'any parameters in your method'"

So what is going on here? When you run the command, you are telling the terminal to use ruby; the “-r” flag tells Ruby to require the file that follows, which needs the correct path on your computer. In this case, the file and location would be:

"./your_script_name.rb"

The “-e” flag tells Ruby to evaluate the code that follows, which here calls a class method. In this case, the class would be:

YourClassName

And the method call would be (with any additional params; if the method has no params, just don’t include anything after the method name):

your_method_name 'any parameters in your method'
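To make that concrete, here is a tiny hypothetical script (the file name page_scraper.rb, the class PageScraper, and its method are made-up placeholders for illustration, not my actual work script):

```ruby
# page_scraper.rb -- a hypothetical stand-in for a real scraping script
class PageScraper
  # Pulls the <title> text out of a chunk of HTML (a stand-in for
  # real scraping logic), prints it, and returns it.
  def self.scrape(html)
    title = html[%r{<title>(.*?)</title>}m, 1]
    puts title
    title
  end
end
```

You would then run it from the terminal with:

ruby -r "./page_scraper.rb" -e "PageScraper.scrape '<title>Hello</title>'"

which prints Hello.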

I hope this clears up the mystery of running your own simple scripts.

I made an App… again!

So I have been working on this really cool idea for an app ever since Tim Holman came to speak to my class, and I have created a working prototype of it. What the app does is get the user’s latitude and longitude using the geolocator gem.


After the app has your latitude and longitude, it can use the Forecast.io API to get the weather for where the user is, based on the lat and long.


This information comes back as JSON that the application can parse for all the goodies it needs. With those goodies, the app can find a relevant picture from a hash I created mapping weather conditions to image URLs, and make a call to the Giphy API to find a nice gif to boot.
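As a rough sketch of that parsing step (the JSON below is a trimmed, made-up stand-in for a real Forecast.io response, and the image hash is a hypothetical miniature of mine):

```ruby
require 'json'

# A trimmed, made-up stand-in for a Forecast.io response body.
raw = '{"currently": {"summary": "Rain", "temperature": 62.4, "icon": "rain"}}'

# Hypothetical hash mapping a weather icon name to a background image URL.
WEATHER_IMAGES = {
  'rain'      => 'https://example.com/rain.jpg',
  'clear-day' => 'https://example.com/sunny.jpg'
}

forecast  = JSON.parse(raw)['currently']
summary   = forecast['summary']           # the human-readable condition
temp      = forecast['temperature'].round # round 62.4 to a display-friendly 62
image_url = WEATHER_IMAGES[forecast['icon']]

puts "#{summary}, #{temp}F" # prints "Rain, 62F"
```

The Giphy call follows the same pattern: parse the JSON response, then pull out the URL you need.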


Then, on the rendered HTML, the background image is sent to the back and the gif is embedded in the day’s description using Tim’s Giflinks.
