VAIL VR is a virtual reality multiplayer competitive shooter emphasizing tactical gunplay, high-caliber combat, and collaborative teamwork. During my time at AEXLAB, I supported many different programmers, which I loved because it really brought us close as a team. We all wanted feedback on our code, so you could say every change was reviewed at least twice before reaching the Lead Programmer or CTO. That environment may not have been as efficient as it could be, but it embodied the word indie at heart. Everyone loved their job, and you could feel the passion when talking to anyone. Now let's get to my specific contributions:
As a Gameplay Programmer at AEXLAB, I was responsible for a variety of tasks that contributed to the development and improvement of the game. I started with little experience in network programming, but gained extensive knowledge while testing my gameplay features using "Clumsy" to simulate packet loss and latency. In addition, I worked on bringing Unreal Blueprint classes to C++, implementing the functionality of the grenade system, adjusting hand poses for various objects, fixing gameplay bugs identified by playtesters, and creating platform-compatible controller models for spectator reference. I also expanded the functionality of compatible headsets using the VRExpansionPlugin, communicated with third-party software companies to address bugs, and identified the sources of EOS Subsystem, Plutosphere, and Wwise crashes and bugs. To optimize the game, I worked on the codebase, VFX, audio, and textures. I also extended platform support to standalone devices using Jenkins and set up app upload and distribution on the Meta Quest Store.
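To give a sense of what that Blueprint-to-C++ work looks like, here is a minimal sketch of a Blueprint-style actor moved into C++. The class name, fields, and values are illustrative, not AEXLAB's actual code:

```cpp
// Hypothetical example of a Blueprint class ported to C++; names and values
// are illustrative, not VAIL's actual grenade code.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GrenadeBase.generated.h"

UCLASS(Blueprintable)
class AGrenadeBase : public AActor
{
    GENERATED_BODY()

public:
    // Still editable in Blueprint subclasses so designers can tune values.
    UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = "Grenade")
    float FuseTime = 3.0f;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // A single timer replaces per-frame countdown logic in the event graph.
        GetWorldTimerManager().SetTimer(FuseHandle, this, &AGrenadeBase::Detonate, FuseTime);
    }

    // BlueprintNativeEvent keeps a Blueprint hook for VFX/audio while the
    // core behavior lives in C++.
    UFUNCTION(BlueprintNativeEvent, Category = "Grenade")
    void Detonate();
    virtual void Detonate_Implementation() { /* apply radial damage, despawn */ }

private:
    FTimerHandle FuseHandle;
};
```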
The McDonough-Atkinson Ammonia Unloading Training Simulation is a VR training application created for the plant workers of Southern Company. The purpose of the application is to guide new employees through the ammonia unloading procedure from beginning to end in a virtual environment where there is room to make mistakes.
As a Software Engineer contracted by Southern Company, I was responsible for a wide range of tasks related to the development of a VR training prototype. Specifically, I created the VR mechanics for equipment preparation and object focusing, designed and implemented a virtual pad for users to complete task checklists, and built all the interactable actors necessary for users to complete the training procedures. I also extended the UI menus and introduced MoCap animations on NPC characters. I optimized the code and environments to ensure smooth gameplay and a good user experience. Additionally, I created level transitions for a seamless experience and fixed all bugs identified in the tutorial, preparation, and unloading procedures. I created custom audio effects as needed and added a custom plugin for dynamic spline hoses to improve the prototype's realism. To ensure ease of use, I introduced ladder climbing mechanics that are comfortable for users new to gaming or VR, along with help cues that lead the user toward the intended task, and I incorporated further ease-of-use features based on user feedback. Finally, I created a main menu to provide users with easy access to the different training procedures.
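The hose plugin's internals aren't something I can share verbatim, but the core idea can be sketched with Unreal's spline APIs: chain spline mesh segments along a USplineComponent so the hose bends between its attach points. Everything below is illustrative:

```cpp
// Illustrative only -- the shipped hose plugin's internals are not public.
// Chains spline mesh segments along a USplineComponent so a hose can bend
// between its attach points.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SplineComponent.h"
#include "Components/SplineMeshComponent.h"
#include "Engine/StaticMesh.h"
#include "SplineHoseActor.generated.h"

UCLASS()
class ASplineHoseActor : public AActor
{
    GENERATED_BODY()

public:
    ASplineHoseActor()
    {
        Spline = CreateDefaultSubobject<USplineComponent>(TEXT("Spline"));
        RootComponent = Spline;
    }

    UPROPERTY(EditAnywhere, Category = "Hose")
    UStaticMesh* HoseSegmentMesh = nullptr;

    // Call whenever an endpoint moves (e.g., the user drags the hose coupling).
    // A production version would recycle segments instead of recreating them.
    void RebuildHose()
    {
        const int32 NumPoints = Spline->GetNumberOfSplinePoints();
        for (int32 i = 0; i < NumPoints - 1; ++i)
        {
            USplineMeshComponent* Segment = NewObject<USplineMeshComponent>(this);
            Segment->SetStaticMesh(HoseSegmentMesh);
            Segment->SetMobility(EComponentMobility::Movable);
            Segment->AttachToComponent(Spline, FAttachmentTransformRules::KeepRelativeTransform);
            Segment->RegisterComponent();

            // Each segment deforms its mesh between two consecutive spline points.
            Segment->SetStartAndEnd(
                Spline->GetLocationAtSplinePoint(i, ESplineCoordinateSpace::Local),
                Spline->GetTangentAtSplinePoint(i, ESplineCoordinateSpace::Local),
                Spline->GetLocationAtSplinePoint(i + 1, ESplineCoordinateSpace::Local),
                Spline->GetTangentAtSplinePoint(i + 1, ESplineCoordinateSpace::Local));
        }
    }

private:
    UPROPERTY()
    USplineComponent* Spline = nullptr;
};
```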
Learning how to handle animals before students are prepared can be dangerous for both the animal and the handler. The Unity College XR Innovation Lab has developed a simulation where students can interact safely with an animal and study how it reacts to their handling tactics. In this assignment, students learn how to recognize animal body language and behavior, assess how human behavior impacts animal behavior, respond appropriately to animal body language and behavior, conduct an animal behavior assessment, and identify low-stress handling techniques for human-animal interaction.
As a Software Engineer contracted by Unity College, I was in charge of preparing a prototype of this application as close to production-ready as possible given a short time constraint. In a team of two, I was able to complete the following tasks:
Level design; all UI (main, pause, settings, and control menus); the interaction mechanics (UX) with the animal; and the reactions the dog would have to those interactions. On top of that, I was able to prototype a VR version building on the existing codebase.
The year is 20xx. Milk has become one of the most efficient sources of energy in the galaxy. Enter the Robo Legion, robotic overlords of Zenmos Prime, hellbent on stealing the Milk for themselves. The Robo Legion is set to take all of Earth's cattle, in turn acquiring a limitless supply of milk. You are 'The Sheriff', a watchful protector of these bovine creatures. Armed with your time-limited psychic abilities and trusty six shooter, you must protect the cattle from the steely claws of this robotic menace.
As a Gameplay Programmer contracted by Good War Games, this was a different experience because the game targeted a mobile platform. That changes how you use Unreal Engine in more than a couple of ways. First, I needed to ensure that the game runs well on devices with lower hardware specifications, which meant reducing CPU/GPU memory usage while maintaining good performance. This required me to bring this Blueprint-designed game to C++ in its entirety. Second, the UI/UX must be intuitive and easy to navigate on smaller screens and touch-based interfaces, because the input is completely different. Lastly, the testing aspect of the development process is very different: compatibility testing on different devices, performance testing, and battery life testing are all essential for a production-ready product. My main focus on this game was updating the codebase and optimizing.
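A typical example of the kind of win that porting to C++ unlocks on mobile is replacing a Blueprint Event Tick with a low-frequency timer. The class name and rates below are hypothetical, just to illustrate the pattern:

```cpp
// Hypothetical pattern from a Blueprint-to-C++ port for mobile: the Blueprint
// version polled in Event Tick every frame; the C++ version disables tick and
// polls on a cheap timer instead. "ACattleSpawner" is an illustrative name.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "CattleSpawner.generated.h"

UCLASS()
class ACattleSpawner : public AActor
{
    GENERATED_BODY()

public:
    ACattleSpawner()
    {
        // On mobile, per-frame tick overhead adds up fast.
        PrimaryActorTick.bCanEverTick = false;
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Check 4 times per second instead of 60+ times per second.
        GetWorldTimerManager().SetTimer(
            SpawnCheckHandle, this, &ACattleSpawner::CheckSpawn, 0.25f, /*bLoop=*/true);
    }

    void CheckSpawn()
    {
        // ... spawn/cleanup logic that previously ran in Blueprint Event Tick ...
    }

private:
    FTimerHandle SpawnCheckHandle;
};
```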
Fire360 is a VR Firefighter Command Simulation. The goal of the application is to help firefighters-in-training reinforce procedural behaviors they can then use on the job. Throughout the simulation, the firefighters are guided by a trainer through various kinds of obstacles. This version of Fire360 takes place in a virtual environment representing a single-story residence and a multi-story apartment complex in Miami. Below are the capabilities of the simulation:
The simulation contains 12 crews of firefighters, each assigned to a vehicle. The vehicles consist of 1 command, 4 engine, 3 quint, and 4 rescue vehicles. The trainer can assign tasks to these NPC crews according to the commands sent from the “Tactical”, the first firefighter to show up to the scene of a fire. The tasks they are able to do are as follows:
The simulation is controlled by a trainer using an iPad app that was also developed by the I-CAVE team. It serves as a control center for all actions that take place within the simulation. The capabilities of this iPad app are as follows:
I worked on this project for three years, going from fixing the single-story residence simulation to implementing new mechanics in the multi-story complex simulation. I worked on almost every aspect, from creating models using photogrammetry to implementing the fire and smoke system, which was a combination of programming, VFX, audio, and animation. It was a really great experience, and I will always be grateful to Steven Luis from Florida International University, who gave me the opportunity to start developing my skills at the I-CAVE, where my passion for developing virtual reality applications was able to grow.
What is “Community”? This is the central question of this VR experience proposed by FIU CARTA. In fact, it’s many things, including an opportunity for student growth, faculty research, and an exploration of the future of higher education, looking at ways in which our species approaches the environment and interpersonal connections, and develops and assesses human agency and behaviors by reflecting on the 4C’s: Creativity, Collaboration, Critical Thinking, and Communication.
FYE (First Year Experience) Community VR is an application developed with Unreal Engine. The team created a virtual environment representing the Everglades. In this environment, students have 15 minutes to build a structure using a limited number of snapping blocks before the timer runs out. The goal is to practice the 4C’s and to analyze how they helped the process of creating a custom structure.
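As a rough illustration of the snapping mechanic, here is a hypothetical helper in the spirit of what such a system needs: when a player releases a block, pick the closest free socket on nearby placed blocks. This is not the project's actual code:

```cpp
// Hypothetical snapping helper, not the project's actual code: when a player
// releases a block, choose the closest free socket within snapping range.
#include "CoreMinimal.h"

int32 FindBestSocket(const FVector& DropLocation,
                     const TArray<FTransform>& FreeSockets,
                     float SnapRadius)
{
    int32 BestIndex = INDEX_NONE;
    float BestDistSq = SnapRadius * SnapRadius;
    for (int32 i = 0; i < FreeSockets.Num(); ++i)
    {
        const float DistSq = FVector::DistSquared(DropLocation, FreeSockets[i].GetLocation());
        if (DistSq < BestDistSq)
        {
            BestIndex = i;
            BestDistSq = DistSq;
        }
    }
    // INDEX_NONE means nothing in range: the block stays where it was dropped.
    // Otherwise the released block is aligned to the winning socket's transform.
    return BestIndex;
}
```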
For this project, I worked on many different aspects of the production pipeline. This project was a bit different from the others here, since I was Project Lead: I met with the product owners and planned with the development team in order to deliver meaningful goals every sprint. I worked with Maria Constantini to plan the logistics of user testing during the COVID-19 pandemic, in which we had to take every precaution to make sure it was a sterile experience throughout. I also helped develop the program with Edward Gonzalez, working on mechanics, UI, and networking.
This project was contracted to us by Unity College. We noticed that more universities are starting to enhance their curricula with virtual courses, and sometimes virtual activities or assignments. Virtual Forest was made for students in the field of conservation. The application lets these students step into a day in the life of a conservation scientist: they enter a virtual deciduous forest equipped with a laser rangefinder, then locate and count a specific species and use that information to estimate the species' population. Our goal in this project was to do everything possible to make it run on the lowest-spec hardware available and everything in between.
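For context, the underlying estimate is simple plot-sampling arithmetic. This standalone snippet shows one common approach, not necessarily the exact method the coursework teaches:

```cpp
// Illustrative plot-sampling arithmetic, not necessarily the exact method
// taught in the coursework: extrapolate a counted sample to the whole forest.
#include <cstdio>

double EstimatePopulation(int Counted, double SurveyedAreaHa, double TotalAreaHa)
{
    const double DensityPerHa = Counted / SurveyedAreaHa; // individuals per hectare
    return DensityPerHa * TotalAreaHa;
}

int main()
{
    // e.g., 18 individuals found across 2.5 ha of a 40 ha forest -> ~288 total.
    std::printf("%.0f\n", EstimatePopulation(18, 2.5, 40.0));
    return 0;
}
```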
The project was already made, but it was up to a team of two (Edward and me) to analyze the project, research the best tools to optimize every aspect, and then deploy it in a way where Mac and PC users of all kinds could enjoy a consistently smooth experience. Using tools like MeshCombine, NatureRenderer, and our own custom scripts, we optimized to the point where we had a Surface Pro 4 running at 120 fps on Vulkan. We debugged the entire simulation, from analyzing frametimes and modifying the rendering pipeline accordingly, to taking all the scripts running on every frame and calling them only when needed. It was definitely one of the most fun projects I've worked on.
A business investor funded this project. This one allowed me to test the limits of the Oculus Quest's native tracking hardware. Using the VR Expansion Plugin and Antilatency hardware, the team developed a prototype application in Unreal Engine. We provided extensive reports on the margin of error, down to the millimeter, and provided realtime solutions to mitigate drift in-game. Although we cannot discuss much further since it's a private IP, I can say that it was really interesting finding out how far we can push the Quest's hardware. Although it's just a Qualcomm Snapdragon 835 processor with 4GB of RAM, it is capable of awesomeness!!
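Without getting into the project's specifics, drift mitigation generally comes down to blending the locally tracked pose toward an external reference a little each frame, so corrections never visibly snap. A generic sketch of that technique, not the project's code:

```cpp
// Generic drift-correction sketch (the actual project is private IP): blend
// the locally tracked pose toward an externally tracked reference a little
// each frame, so corrections are gradual instead of visible snaps.
#include "CoreMinimal.h"

FTransform CorrectDrift(const FTransform& LocalPose,      // headset's own tracking
                        const FTransform& ReferencePose,  // external ground truth
                        float DeltaSeconds,
                        float CorrectionRate)             // fraction corrected per second
{
    const float Alpha = FMath::Clamp(CorrectionRate * DeltaSeconds, 0.0f, 1.0f);
    FTransform Corrected;
    Corrected.Blend(LocalPose, ReferencePose, Alpha); // interpolates translation and rotation
    return Corrected;
}
```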
This project was contracted to us by Helms Systems. Built with Unreal Engine, Soulkeeper VR set out to be an ambitious VR RPG. To this day, it lives on as one of my favorite projects, simply because everything we did was driven by a single goal: for the user to have the best experience within each action. There are very few projects where developers can take their time and develop mechanics through trial and error. I was brought onboard after the first early access release, when the team was looking to overhaul every aspect of the game. I had the honor of animating the player's hands and the hand poses for the objects the player picked up (before we had automatic hand IK posing, lol). I also worked on the AI of wildlife and farm animals, and on optimizing the visual effects. I have always been a generalist on these projects because I usually find myself on short-handed teams. It is within these teams that the original vision stays anchored, and I had the pleasure of working with very talented people from whom I learned a lot.
If you're looking at my portfolio, there's a chance you're doing some cool XR work I would love to be a part of. To give you some perspective, I am looking for a company/team that will help me grow as an engineer, expand as an artist, and thrive as a creator. Check out my work and don't hesitate to CONTACT ME if I can support your team.