This project was the end result of a portion of my study abroad trip to Poland. Alongside
13 other students from Northeastern, I worked with around 28 students from the
Polish-Japanese Academy of Information Technology to develop games from game jam concepts
centered on topics of Polish historical or cultural significance. My team was a group of 10
students split across the programming, art, and music disciplines. Together, we
overhauled our game jam concept and developed Oil City in Unity, a city-builder factory game in
the vein of Factorio where you must maintain the Polish oil industry through its historical
hardships and global triumphs.
On this project, I took on the role of programmer and game designer. My main contributions
were the implementation of the pipe system and the time manager.
In order for the game to function, there needed to be a system in charge of delegating game ticks to the objects that needed them. This included various time-related managers such as the TimelineManager, in addition to the main mechanic of the game: oil flow. To create a generalized tool, I wrote a TimeManager script that stored its registered objects as a forest, with only the roots of the trees receiving ticks directly. An object like the Woodcutter's house would remain in the list as a singleton tree, while something like the Train Station (often the root of an oil system) would receive the tick and then recurse down its members, passing the invocation along to each of them. This kept every system deterministic in its execution order (i.e. in an oil system, the consumer could never somehow run before the producer), which made the game easy to debug and play.
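The forest-of-trees idea can be sketched roughly as follows; this is an illustrative Python reduction, not the actual Unity code, and names like `TimeManager` and `Tickable` stand in for the real scripts:

```python
class Tickable:
    """A node in the tick forest; children are ticked after their parent."""
    def __init__(self, name):
        self.name = name
        self.children = []

    def tick(self, order):
        order.append(self.name)      # deterministic: parent first...
        for child in self.children:  # ...then children, in list order
            child.tick(order)

class TimeManager:
    """Holds only the roots; a lone building is a one-node tree."""
    def __init__(self):
        self.roots = []

    def register(self, root):
        self.roots.append(root)

    def tick_all(self):
        order = []
        for root in self.roots:
            root.tick(order)
        return order

# Example: a train station (producer) feeding two pipes, plus a lone woodcutter.
station = Tickable("station")
station.children = [Tickable("pipe_a"), Tickable("pipe_b")]
manager = TimeManager()
manager.register(station)
manager.register(Tickable("woodcutter"))
print(manager.tick_all())  # ['station', 'pipe_a', 'pipe_b', 'woodcutter']
```

Because the producer is always the root, its consumers can never be ticked ahead of it, which is exactly the ordering guarantee described above.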
I was in charge of making the pipes, a crucial part of the game. To best fit with the requests of players and QA testers, I had
to rework the system's implementation to allow for different types of connections alongside ambiguous connections and temporarily invalid
connections. Pipes had to flow between two Flowable structures (a building or another pipe), maintain a valid connection between
said structures (i.e. flowing from a valid direction into a valid direction and containing the expected fluid type of either Kerosene or Oil
for that kind of connection), and possess the ability to be "flipped" on demand in case a player made a mistake in placing them down.
This was a complicated mechanic that gave me a bit of trouble. The problems mainly revolved around pipes of zero or one
length (created by connecting to adjacent pipes), and creating understandable syntax for documenting the obscure problems
that popped up. The former created a lot of issues due to my definition of pipes as having left- and right-hand endpoints.
A singleton pipe had those two positions be exactly the same, meaning a further distinction of pipe flow direction AT those
endpoints was necessary, from which it could be determined what tile the singleton pipe was flowing into or out from.
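The endpoint-plus-direction fix can be illustrated with a small sketch. This is a hypothetical Python reduction of the idea, not the shipped code; the names and the four-direction encoding are assumptions:

```python
# Endpoints are (tile, direction) pairs, so a zero-length "singleton" pipe
# whose endpoints share a tile can still tell which neighbouring tile it
# flows in from and out to.
def neighbour(tile, direction):
    dx, dy = {"N": (0, 1), "S": (0, -1), "E": (1, 0), "W": (-1, 0)}[direction]
    return (tile[0] + dx, tile[1] + dy)

class Pipe:
    def __init__(self, left_tile, left_dir, right_tile, right_dir):
        # Flow enters at the left endpoint and exits at the right one.
        self.left = (left_tile, left_dir)
        self.right = (right_tile, right_dir)

    def source_tile(self):  # tile the pipe is fed from
        tile, direction = self.left
        return neighbour(tile, direction)

    def sink_tile(self):    # tile the pipe empties into
        tile, direction = self.right
        return neighbour(tile, direction)

    def flip(self):         # the on-demand "flip" for misplaced pipes
        self.left, self.right = self.right, self.left

# A singleton pipe: both endpoints sit on tile (0, 0), but the directions
# still make its source (west neighbour) and sink (east neighbour) unambiguous.
p = Pipe((0, 0), "W", (0, 0), "E")
print(p.source_tile(), p.sink_tile())  # (-1, 0) (1, 0)
p.flip()
print(p.source_tile(), p.sink_tile())  # (1, 0) (-1, 0)
```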
In retrospect, a lookup table mapping tile positions to Flowable objects would've been a better approach for this
system than a hard-coded tree, as it would've made the coupling/decoupling issues trivial to solve, if only
a bit more tedious to access.
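A minimal sketch of that alternative, with hypothetical names, might look like this:

```python
# A flat lookup table from tile to structure. Connecting or disconnecting
# is just a dict insertion or removal, and each structure finds its
# neighbours by querying adjacent tiles on demand.
grid = {}

def place(tile, structure):
    grid[tile] = structure

def remove(tile):
    grid.pop(tile, None)

def neighbours(tile):
    x, y = tile
    candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [grid[t] for t in candidates if t in grid]

place((0, 0), "station")
place((1, 0), "pipe")
print(neighbours((1, 0)))  # ['station']
remove((0, 0))             # decoupling is a single deletion
print(neighbours((1, 0)))  # []
```

The trade-off mentioned above is visible here: every adjacency question costs a lookup, but there is no tree to restructure when a pipe is added or removed.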
A solo personal project (WIP) in which I challenged myself to create a modular
fighting-game framework. The main focus was the structural challenge of
designing asset code structures that are easy for designers to create content with, yet
powerful in their modularity. I evaluated several approaches to implementing move data, but
settled on ScriptableObjects and repurposing Unity's Mecanim state machine animation states.
I used ripped sprites from Scott Pilgrim VS The World: The Game so I could focus on the
programming. Every script is custom-written in Unity, with the only package currently in use being InputSystem.
I chose this package because I wanted a non-programmatic coupling between input and behaviors, and with InputSystem's serializable UnityEvents for input, it was a clear choice.
In order to achieve a smooth gameplay feel, I wrote my own input-buffering system to prevent button presses from being lost and to allow players to input actions during the downtime of another. This buffer used two lists in parallel to keep track of input history and the decay period before expiration, using binary insertion to add inputs quickly. I didn't use a priority queue keyed on remaining decay time because the queue would need to reorganize itself often, and it would also need to remove items that may not be at the front whenever an input expired. That would've gotten messy, so I stuck with two paired lists and the invariant that they were related by index.
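The paired-list idea can be sketched in a few lines. This is an illustrative Python reduction under assumed names (`InputBuffer`, a fixed decay window), not the actual C# implementation:

```python
import bisect

class InputBuffer:
    """Two parallel lists: expiry timestamps (kept sorted) and the inputs
    they belong to, related by index."""
    def __init__(self, decay=0.2):
        self.decay = decay
        self.expiries = []  # sorted ascending by expiry time
        self.inputs = []    # kept in lockstep with self.expiries

    def push(self, action, now):
        expiry = now + self.decay
        i = bisect.bisect_right(self.expiries, expiry)  # binary insertion
        self.expiries.insert(i, expiry)
        self.inputs.insert(i, action)

    def prune(self, now):
        # Expired entries are always at the front, so removal is cheap.
        while self.expiries and self.expiries[0] <= now:
            self.expiries.pop(0)
            self.inputs.pop(0)

    def consume(self, action, now):
        """True if `action` is buffered and still alive; removes it if so."""
        self.prune(now)
        if action in self.inputs:
            i = self.inputs.index(action)
            del self.expiries[i], self.inputs[i]
            return True
        return False

buf = InputBuffer(decay=0.2)
buf.push("jump", now=0.00)
print(buf.consume("jump", now=0.15))  # True: still inside the 0.2s window
print(buf.consume("jump", now=0.15))  # False: already consumed
```

Keeping the expiry list sorted means expiration only ever touches the front of both lists, which is the reorganization headache a priority queue would have reintroduced.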
As mentioned in the project description, I used sprites ripped from a game in order to develop the framework. These sprites were organized on a sprite sheet in uneven columns and rows, making the slicing very difficult. To avoid manually screenshotting each frame, I wrote a Python script to detect and flood-fill individual sprites, writing each one to a separate image file. This drastically cut down on the time required to set up the art.
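The core of that script can be reconstructed as a flood-fill over connected non-background pixels. This is a hypothetical sketch operating on a pixel grid rather than real image files (the original presumably used an imaging library for I/O):

```python
from collections import deque

def find_sprites(sheet, background=0):
    """Return bounding boxes (min_x, min_y, max_x, max_y) of each connected
    non-background region, regardless of row/column alignment."""
    h, w = len(sheet), len(sheet[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if sheet[y][x] == background or seen[y][x]:
                continue
            # Flood-fill this sprite, tracking its bounding box.
            min_x = max_x = x
            min_y = max_y = y
            queue = deque([(x, y)])
            seen[y][x] = True
            while queue:
                cx, cy = queue.popleft()
                min_x, max_x = min(min_x, cx), max(max_x, cx)
                min_y, max_y = min(min_y, cy), max(max_y, cy)
                for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                    if 0 <= nx < w and 0 <= ny < h and not seen[ny][nx] \
                            and sheet[ny][nx] != background:
                        seen[ny][nx] = True
                        queue.append((nx, ny))
            boxes.append((min_x, min_y, max_x, max_y))
    return boxes

# Two sprites on a tiny "sheet" (1 = opaque pixel, 0 = background).
sheet = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
]
print(find_sprites(sheet))  # [(0, 0, 1, 1), (4, 0, 4, 1)]
```

Each bounding box can then be cropped and written out as its own image file.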
When designing this kind of game, I had considered using AnimationEvents as a means of
storing information for callbacks and triggering functions. However, this proved to be a bad design choice,
as the information was hard-coded into animation clips, making them impossible to reuse without copying the
stored information. Additionally, the animation event system uses reflection to activate callbacks, a practice
which scales poorly when there are many components or children in the object's hierarchy.
To fix this, I used a ScriptableObject design of "Tracks" within "Sequences". Sequences contained Tracks, which contained
key-value pairs of frame timestamps paired with a generic value. A hitbox Track, for example, would have hitbox data at certain
frames, indicating that at the given frame timestamp, that hitbox should appear. To run a Sequence, I used Unity's Mecanim
state machine behaviour base script as a sort of launching point. Animation states would have their respective clips for certain
attacks, but would all use the same state script. This state script was in charge of executing the attack's respective Sequence SO,
running through every Track on every Animator update.
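The Track/Sequence data model can be sketched as follows; this is an illustrative Python reduction (the real version is ScriptableObject-based C#), with `fire` standing in for whatever component consumes the payloads:

```python
import bisect

class Track:
    """A sorted list of (frame, payload) keys."""
    def __init__(self, keys):
        keys = sorted(keys)
        self.frames = [f for f, _ in keys]
        self.values = [v for _, v in keys]

    def events_between(self, last_frame, frame):
        """Payloads whose key frame satisfies last_frame < key <= frame."""
        lo = bisect.bisect_right(self.frames, last_frame)
        hi = bisect.bisect_right(self.frames, frame)
        return self.values[lo:hi]

class Sequence:
    """Named tracks advanced together as the animation frame moves forward."""
    def __init__(self, tracks):
        self.tracks = tracks   # e.g. {"hitbox": Track(...), "sfx": Track(...)}
        self.last_frame = -1

    def advance(self, frame, fire):
        # Fire every payload crossed since the last update, per track.
        for name, track in self.tracks.items():
            for payload in track.events_between(self.last_frame, frame):
                fire(name, payload)
        self.last_frame = frame

# A punch: the hitbox appears on frame 3 and disappears on frame 6.
seq = Sequence({"hitbox": Track([(3, "on"), (6, "off")])})
fired = []
seq.advance(4, lambda name, payload: fired.append((name, payload)))
seq.advance(7, lambda name, payload: fired.append((name, payload)))
print(fired)  # [('hitbox', 'on'), ('hitbox', 'off')]
```

Because events are keyed by the half-open frame interval since the last update, a payload fires exactly once even when the animation update skips over its frame.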
To help with the development of Tracks and Sequences, I created a dev scene in Unity to "scrub" through an animation and its Sequence.
This would play back the animation and Sequence in parallel, sending the event invocations from the Sequence to debug mocks of the same
components, which would report the information to the console or draw things in the scene to represent the gathered data.
This was very helpful in cutting down the amount of time I spent wading through code and SOs for bugfixing, and I think
it looks pretty neat.
A solo personal project: a puzzle game about matching Dall-E-generated pictures to their original prompts. The main goal of the project was to finish and polish a project completely. Some sub-goals were an easily-approachable game, an extensible dialogue system, dynamic and interesting NPCs, and variable levels of win conditions rather than a cut-and-dry right-or-wrong puzzle system.
This project was made in Unity 2019, using Blender for modeling and animation with Maya as an animation touch-up and ASCII fbx exporter. All assets aside from the AI art were self-made. The art itself was generated with Dall-E.
In order to bring the (simple) characters to life, I programmed collision-avoidant roaming behaviors for the NPCs, using the NavMesh component only to pathfind rather than to pathfind AND traverse. Additionally, I gave the models shape keys for various emotions to incorporate into their dialogue boxes. The models also had multi-aim constraints from the Animation Rigging package so that the NPCs could look at objects in the scene as they passed by.
It wasn't really necessary for the game's mechanics, but I wanted cheeky NPCs, so I made a dialogue system. As one does. It was a fully custom system that allowed for variable text speeds, in-editor dialogue editing, and easily-extensible world events during dialogue, such as animations, flag triggers, GameObject toggles, etc.
A multi-semester project for a class I took called Game Studio. In a mock studio environment with 40 other students, I continued work on a student-led project. I primarily contributed to the codebase with custom utility tools (e.g. facilitating puzzle design with line-rendering utilities) and refactoring of old student code, but I also worked on writing and implementing dialogue through the external Inky application. I worked in a small team alongside 6 other interdisciplinary students, with whom I created weekly tasks following the Agile workflow in Jira.
This game was developed in the Unity engine. The main software I used aside from that is Inky, a narrative tool that has a Unity plugin. Some of my work was adapting dialogue to this plugin, as well as bugfixing implementation details and creating how-to guides for the dialogue writers.
For this project, we used git and GitHub Desktop for source control, operating on sprint branches to maintain a clean working tree with a stable build at all times. Additionally, to support the mock studio environment, we used Jira to delegate and organize tasks with the story-point system. We would also have scrums with our designated groups and disciplines.
The original code for moving elements such as platforms was hard-coded to only allow two points of movement, and ONLY platforms could move. This was pretty limiting, so I redesigned the moving element system to be a generic component that allowed for multiple anchor points to traverse between, accessible in the editor. Additionally, I added different wrapping types so designers could easily change the movement, behavior, and start position of platforms.
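The heart of that generalization is the index arithmetic for traversing an arbitrary list of anchors under different wrap modes. A hypothetical sketch (mode names are illustrative, not the project's actual enum):

```python
def next_index(i, n, mode, direction=1):
    """Given current anchor index i out of n anchors, return the next
    (index, direction) pair for the chosen wrap mode:
    'loop' wraps 0 -> 1 -> ... -> n-1 -> 0; 'pingpong' bounces at the
    ends; 'once' clamps at the final anchor."""
    if mode == "loop":
        return (i + 1) % n, direction
    if mode == "pingpong":
        j = i + direction
        if j < 0 or j >= n:        # bounce off either end
            direction = -direction
            j = i + direction
        return j, direction
    if mode == "once":
        return min(i + 1, n - 1), direction
    raise ValueError(mode)

# A platform visiting 3 anchors in ping-pong order: 0 1 2 1 0 1 ...
i, d, path = 0, 1, [0]
for _ in range(5):
    i, d = next_index(i, 3, "pingpong", d)
    path.append(i)
print(path)  # [0, 1, 2, 1, 0, 1]
```

The component then just interpolates the element's transform between anchor `i` and the anchor this function returns; designers only touch the anchor list and the mode.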
In order to help level designers create "power lines" representing energy flow in our puzzle system, I created an editor script tool that used the line renderer component to generate a prefab with a baked line mesh between a set of points. The line generation used a custom-written pathfinding algorithm and supported several creative parameters. I also created how-to guides for this tool, as well as for the dialogue implementation pipeline, for artists and dialogue writers.
A game jam submission for Ludum Dare 54. Working in a team of 5, I led development on a movement-shooter with a limited-space twist. I mainly contributed programming and game design, but also mentored others in learning Blender for animation and modeling, and guided the programming division with tasks and code-design structures and interfaces.
The project used Unity 2019 as its main engine. I helped others by providing answers to their questions on implementation details and Unity systems, as well as operating as the lead programmer. For Blender, I instructed the team's animator in how to use the software, in addition to giving them some tips for satisfying and clean animations.
Most of the project was spent on development, so near the end of the jam, I designed and implemented 7 levels to highlight key aspects of our game: the movement, shooting, and our wall-jump mechanics. It was a fun challenge to design levels that force the player into discovering the mechanics in a natural way, in addition to being interesting to explore.
The design paradigm of our game. All of the developers on our team wanted this game to be speedrunnable, so we aimed for a fast-paced experience where players had to rely on their reaction times (and failing that, their memory of the map) in addition to their creativity to find routes through our levels.
A short summer project where I challenged myself to create a non-kinematic state-machine-based character controller that was different from the standard implementation in two ways:
This was a Unity project in which I used a player model from the Unity Asset Store and a bunch of pre-packed Mixamo animations. This was to cut down on the time I spent working on the animations so that I could focus on the technical side of things.
As was the main goal of the project, I created a non-kinematic character controller that used the floating-collider implementation to avoid the quirks of stairs and slopes. Additionally, I made the controller a MonoBehaviour composition-based state machine so that states could be added and removed at runtime. I was met with a lot of difficulties around execution order and "assessing" information about the world, but it was a fun challenge. There are a lot of parameters to tweak for this implementation, too.
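The floating-collider trick boils down to a spring-damper force that holds the body at a fixed ride height above the ground, measured by a downward raycast. A minimal sketch of just that force calculation (the constants here are illustrative, not tuned values from the project):

```python
def float_force(ride_height, gap, vertical_velocity,
                spring=500.0, damper=50.0):
    """Upward force (per unit mass) keeping the body hovering at ride_height.
    gap: raycast distance from body to ground; positive error = too low.
    The damper term bleeds off vertical velocity so the body doesn't bounce."""
    error = ride_height - gap
    return spring * error - damper * vertical_velocity

# Too low and falling: strong upward push.
print(float_force(0.5, 0.3, -1.0))  # 150.0
# At ride height and at rest: no correction needed.
print(float_force(0.5, 0.5, 0.0))   # 0.0
```

Because the capsule itself never touches the ground, steps and slope edges only change the raycast distance, which the spring smooths out, and the spring and damper constants become two of the many tweakable parameters mentioned above.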
Below is the prototype of the game's main mechanic alongside a second mechanic demonstration in the finished product.
The end result of my senior project. My goal was to create a small, polished proof-of-concept for a puzzle game all about attraction and repulsion. To do this, I learned new tools like ShaderGraph and the HDRP rendering pipeline. I also used practices like inheritance and modular object-oriented design in order to create game mechanics like my reusable puzzle system and the gravitational-pull system. The latter allowed me to extend the game's original mechanic to affect the player as well, which I was pleasantly surprised by.
This was the first project in which I tried my hand at creating some shaders. Rather than use the HLSL shader
language, I started with ShaderGraph (from the HDRP render pipeline) in order to get an understanding of how
shaders work. I created some basic shaders for illuminated tiles, water, and "dissolution grids" (dissolving energy fields).
Much like the rest of my projects, I used Unity and Blender for the game engine, asset creation, and animation creation. This was
the first time I used the InputSystem package as well.
This game was made with puzzles a la Portal 2; objects would give or receive power. This led me to create an extensible interface and class structure to easily create new kinds of puzzle objects (a button, a dissolving gate, a cube-former, etc.) and connect them together in the scene.
This was the first time I wrote a character controller for Unity from scratch. I made a non-kinematic rigidbody player
controller that I could easily impart external forces to through my main game mechanic in order to make the movement
feel fluid and natural while also having easy control over parameters. I also made it use a state machine to avoid
disorganized code, which worked out pretty well for my first time.
Additionally, I utilized camera overlays to produce a User HUD. It was a cool technique that taught me a lot about how
cameras render and cull objects, including how view frustums sometimes cull things within view and how to avoid that.
The final project for our Object-Oriented Design class. With two other students, we made a JavaFX-based bullet journal using design patterns (MVC). The goal of the project was to create an easily extensible codebase with modular implementations that would be compounded upon with further iterations of the assignment. One of the lightbox images includes the UML diagram we made for the code layout of our project.
Working in a group of three allowed me to gain some experience with git source control. We used separate branches per person, merging when relevant tasks overlapped. Java and JavaFX were used to create the application, with Gradle as the main library/asset manager and build tool. It also provided build tasks for unit testing, which our team used to validate the functionality of the backend.
In order to implement reading and writing schedules to files, I worked with JavaFX's file chooser
dialog to select file paths, using Java's file I/O library to read from and write to the selected paths. Additionally, we implemented
password locks for these files, preventing them from being read until a valid password was given.
For the user interface design, we used JavaFX for its ease of implementation and coupling with back-end functionality. We ended up
forgoing the traditional MVC design in favor of custom root UI controllers that extended layout elements. This allowed us to easily
create UI at runtime without worrying about setting up connections and the like.
Additionally, we wanted to avoid having a complicated user interface for our program. To achieve this, we made use of many custom dialog
popups so that information would be displayed on throw-away windows that wouldn't clutter the main view.