Thursday 11 August 2016

Grow Proof of Concept



Grow is a Vive game about plants that think they are machines.




When working with VR, most of the old design conventions for screen-based applications go out of the window. Normal interaction and feedback methods simply don't work, so you have to design less for someone using a computer and more for a person standing in a room, interacting directly with the things in front of them. In some ways this makes things easier, in others harder, but the core requirement remains the same: to create clear and intuitive methods of interaction with your application. Grow was an experiment in that.

I wanted to create a game about synergies, about an ecosystem, or about machines. Something that sat halfway between puzzle and exploration, and rewarded the player for figuring things out. No manuals, tutorials, or text: the player would learn everything through play. At first, the player has a single plant and a watering can. Most players will do the obvious thing and water it, and see that doing so causes it to bear fruit.

If you want to play it "as intended", without spoilers, skip to the bottom and download the game from there. I'll wait.

Every plant in Grow has a function. It requires something, and produces something. The first plant, the Goodnut plant, requires water and produces nuts. The second plant, the Spigot, eats nuts and produces water. The two lean on each other, and discovering this relationship is key to understanding how the plants work. Of course, a player can just plant a pretty garden if they like, but they are likely to stumble across the interactions eventually.
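
If you're curious what that relationship looks like under the hood, here is a minimal, engine-agnostic C++ sketch of the requires/produces idea. It is not the actual game code; the names and types are invented purely for illustration.

```cpp
#include <iostream>
#include <optional>
#include <string>

// Illustrative sketch only: each plant consumes one resource type and
// produces another when fed. Names here are made up, not the game's code.
enum class Resource { Water, Nut };

struct Plant {
    std::string name;
    Resource eats;
    Resource makes;

    // Returns the produced resource if fed the right input, nothing otherwise.
    std::optional<Resource> feed(Resource input) const {
        if (input == eats) return makes;
        return std::nullopt;
    }
};

int main() {
    Plant goodnut{"Goodnut", Resource::Water, Resource::Nut};
    Plant spigot {"Spigot",  Resource::Nut,   Resource::Water};

    // Water the Goodnut once, then let the two plants feed each other:
    // the closed loop keeps itself going, which is the core synergy.
    Resource carried = Resource::Water;
    for (int tick = 0; tick < 6; ++tick) {
        Plant& plant = (carried == goodnut.eats) ? goodnut : spigot;
        carried = *plant.feed(carried);
        std::cout << plant.name << " produced "
                  << (carried == Resource::Water ? "water" : "a nut") << "\n";
    }
}
```

In the game itself, the loop only closes if the player physically plants the two close enough together; that spatial requirement is what makes discovering it feel earned.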


Feedback on how these work needs to be very clear. Plants that need nuts have mouths, and plants that need water have flowers that bloom when they are watered. You can see both of these effects in the example above, and savvy players might notice that placing them close enough together will keep the area watered indefinitely. What to do with all this water?

Make a factory.



In Grow, players are only limited by the size of the garden. As long as they can fit the plants in, they can have as many as they want. Plants can only be planted on the soil, but can be moved about and repositioned as needed. Here we see a line of Goodnuts being planted on either side of a Shufflebush. The Shufflebush works like a conveyor belt, requiring water and pushing items along. It is shaped like a half-pipe, and can be placed anywhere, even in midair, which opens up some interesting opportunities.


Later plants introduce different resources, such as syrup. Syrup cannot be moved by hand, so players will need to figure out a suitable way to get it where they need it.


Players can create simple, efficient gardens, crazy Rube Goldberg machines, or anything in between. I tried to create a sense of wonder and discovery in finding out how things work, and while there is a "goal", it is left deliberately vague; if a player is having fun, I don't want to get in the way with my silly rules.

Grow could potentially be expanded with more plants, more plots to work with, and more areas to explore. Getting the player and resources from one plot to another is a particularly interesting challenge. There are a lot of tweaks that could make it better, such as a better, physics-enabled pickup system, so I am likely to keep updating it.

EDIT: Here is a newer, better demo, shown at vrLAB in Brighton. https://drive.google.com/open?id=0BzvAo5Z23YJSZHl6YWp6Y0FxVWc

You can download the old prototype here. The menu button opens and closes your plantsack, and the trigger grabs. You can quit the game by pressing Esc.


Wednesday 25 May 2016

Touching a virtual world: thoughts on Leapmotion

I'm working on something odd but fun...



I've been learning my way around the Leapmotion integration in Unreal, and have started putting together a little game to test it out. With the new drivers, the hand tracking is very good, and I'm pretty excited about potential applications for it in future.

I've seen the Vive and Oculus motion controllers, and while they seem very cool, there's something very tactile about being able to use your own hands. Leapmotion tracks them with an infrared camera, and as such is the most intuitive VR interaction system I've seen to date.

There are a couple of downsides, of course. The lack of tactile feedback can make some actions, including fine manipulation, quite difficult or unsatisfying. You can get around this with some really solid audiovisual feedback, but it's still only meeting the problem halfway.

Because of the nature of the device, it can only see what you see. If you move your hand out of the visible area, your hand no longer exists. This has obvious issues, and it also happens if your hand becomes occluded by anything, including your other hand. This makes complex gesture recognition really difficult; I sketch one small mitigation below.

Finally, you have the double-edged sword of accessibility. A system that fully utilises complete hand control will not be usable by people who do not have full use of both hands, effectively locking them out of that software. With other input devices you could remap controls, but that's simply not viable here; your controller is your hand.
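
That mitigation, sketched below in plain, engine-agnostic C++ rather than Leapmotion's actual API: when tracking drops, keep the last known hand pose alive for a short grace period before treating the hand as gone, so a brief occlusion doesn't instantly release whatever you were holding. The types, names, and timings here are all invented for illustration.

```cpp
#include <optional>

// Illustrative only: a minimal "last known pose" holder that survives
// short tracking dropouts, so a briefly occluded hand doesn't instantly
// drop whatever it was gripping. Types and timings are made up.
struct HandPose { float x, y, z; bool gripping; };

class TrackedHand {
public:
    // Called every frame with the sensor's result (empty if the hand wasn't seen).
    void update(std::optional<HandPose> sensed, float deltaSeconds) {
        if (sensed) {
            lastPose = *sensed;
            secondsSinceSeen = 0.0f;
        } else {
            secondsSinceSeen += deltaSeconds;
        }
    }

    // The pose the game should act on: the last real pose while within
    // the grace window, otherwise nothing (hand treated as gone).
    std::optional<HandPose> effectivePose() const {
        if (lastPose && secondsSinceSeen <= graceSeconds) return lastPose;
        return std::nullopt;
    }

private:
    std::optional<HandPose> lastPose;
    float secondsSinceSeen = 0.0f;
    float graceSeconds = 0.25f;  // tune to taste; too long starts to feel laggy
};

int main() {
    TrackedHand hand;
    hand.update(HandPose{0.0f, 1.0f, 0.5f, true}, 1.0f / 90.0f);  // seen this frame
    hand.update(std::nullopt, 1.0f / 90.0f);                      // briefly occluded
    return hand.effectivePose().has_value() ? 0 : 1;              // still held: within grace
}
```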

That said, I still think it's a neat little device, and pretty good value for what it is. Now it's time to see what I can do with it.

Monday 2 May 2016

Siege

Siege was a self-imposed challenge.

With most of my work going into my lecturing job, I end up not having a lot of finished pieces that I can show off, beyond the odd experiment or two. I decided to give myself something to do that would both fix that problem and push my skills a bit. The challenge was this: to build a VR experience in Unreal in a very short period of time. In this case, it was over a weekend, in whatever spare hours I could get.


The task had a number of challenges to overcome.

First was assets. With such a limited amount of time available, creating custom models and textures did not seem viable. My focus was on creating an interesting VR experience, so the majority of the time needed to be spent in the Unreal Editor actually building the scene. The solution was to use ready-made assets. I decided to go with the excellent Infinity Blade assets by Epic Games. The modular props included are both high quality and flexible, and so were a solid choice to work with. I would still need to create some custom assets, mainly materials and particle systems, but the asset pack was an excellent foundation. It also helped direct the theme of my scene: a castle under siege.

Notably absent from the asset packs are sound assets. There are a few ambience and generic combat sounds, but not enough to create the type of soundscape I was going for. While I love sound design for environments, the time needed to source and implement appropriate sounds made it a low priority, outside the scope of the project. I chose to leave sound out, rather than do a half-job on it, and focus on other aspects.

The player themselves was the next major concern. Working with VR means that most design rules go out of the window, particularly dealing with UI. Scale feels very different when viewing a scene through a HMD, and the way a human looks around an environment is very different to the way someone explores with mouse or thumbstick control on a screen. The eye is drawn to colour and movement, so I planned to use this to help direct the player towards areas of interest. This became more important due to the lack of audio feedback; I was relying almost entirely on visual cues. I made the choice very early on not to allow the player to move. Not only was it not necessary for the project, but it allowed me to focus on just what the player would see from that one location. This helped with optimization and the planning of the more dynamic elements.

Another challenge was getting the lighting right. Unreal's default HDR options look great on a monitor, but don't make any sense when using a HMD. Your own eyes are doing their own exposure compensation, so having the in-engine exposure layered on top of that can be a little disorientating, especially in a night scene like this one. With the automatic exposure disabled in engine, I had enough visual consistency to play with some very dim lights that add a lot to the atmosphere. The screenshots end up looking quite dark, but it looks great in VR. I could also have got around this by setting the scene in daylight, but a night scene allowed flaming projectiles to be spotted far more easily, which helped with player direction.



The dynamic elements were put together with Blueprint. Nothing particularly complex here, just some Matinee sequences and triggers. The trick was to get them to draw the player's attention. The player starts facing in the right direction to see the volley of fire arrows, which set fire to the hay bales. Shortly afterwards, they should see the trebuchet shot fly overhead and destroy the tower, with a fragment of wall landing near the player. Each element of the sequence is designed to draw the player's vision smoothly from one event to the next.
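
In the engine this really was just Matinee and triggers, but the underlying idea is a simple timed cue list, with each cue positioned so the previous one leads the eye towards it. Here's a rough standalone C++ sketch of that sequencing logic; the cue times and descriptions are placeholders, not the real scene's values.

```cpp
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// Sketch of a timed cue list: each entry fires once when the scene clock
// passes its time. Times and cue names are placeholders, not the scene's.
struct Cue {
    float time;              // seconds after the sequence starts
    std::string description;
    bool fired = false;
};

class Sequence {
public:
    void add(float time, std::string description) {
        cues.push_back({time, std::move(description)});
    }

    // Call every frame with the time elapsed since the last frame.
    void tick(float deltaSeconds) {
        clock += deltaSeconds;
        for (Cue& cue : cues) {
            if (!cue.fired && clock >= cue.time) {
                cue.fired = true;
                std::cout << "[" << cue.time << "s] " << cue.description << "\n";
            }
        }
    }

private:
    std::vector<Cue> cues;
    float clock = 0.0f;
};

int main() {
    Sequence siege;
    siege.add(2.0f,  "Arrow volley arcs in front of the player");
    siege.add(6.0f,  "Hay bales catch fire, pulling the eye onwards");
    siege.add(10.0f, "Trebuchet shot flies overhead and hits the tower");
    siege.add(11.0f, "Wall fragment lands near the player");

    for (int frame = 0; frame < 720; ++frame)
        siege.tick(1.0f / 60.0f);  // simulate 12 seconds at 60 fps
}
```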



I'm pretty happy with how things turned out, though obviously I had to make a lot of compromises due to the time constraints. Some of the particle systems react oddly in dim lighting, and the collision on the arrow volley does not occur unless the player is looking in the right direction when it activates, something I only noticed while demoing the scene to others. I could have played with scale a little more to make the VR experience a bit better. The plaza looked fine in the editor, but looks a bit empty in VR, as the increased FOV lets you have more things closer to the player without it feeling crowded. A little more polish, a longer dynamic segment, and of course, sound, are all elements I would have liked to include.

The final scene had a total of 15 lights, 444 meshes (not counting those painted on terrain), 11 emitters, and 10 brushes.

If you want to see the scene for yourself, it's available here:

https://drive.google.com/file/d/0BzvAo5Z23YJSSUJPY1VTM1d1Yzg/view?usp=sharing



I've tested it with an Oculus DK2, but it should work with any HMD. Upon launch, if it does not run on the HMD, you should be able to enable it with Alt+Enter. If you do not have a HMD available, it will still work with mouse controls.
During the siege, visual cues should help direct you where to look, though the initial arrow volley can be a bit tricky to notice on a HMD. Looking between the two statues (directly forwards) when you press space will help.

Wednesday 26 September 2012

Masters - Fairbourne

Fairbourne was my old master's project.

It was a slow, atmospheric first-person horror that tried to manipulate the player using fear as a design tool.

Kinda worked too.