2013-08-06

Using Emerging Hardware & Software for Inspiration (2013)

At the Assembly 2013 computer festival I presented a seminar session called Using Emerging Hardware & Software for Inspiration.

In the presentation I explained the idea that, when working with emerging stuff, you should look for novelty in an interesting data transformation and get some leverage from the use of existing standards. I also touched on some motivational aspects of using emerging hardware and software, and gave tips for approaching it. Several small case studies from personal experience were included as well.

The slides are available both as PDF and in executable form (Win32, built with Unity). Note that the slides were only built to support the speech.

PDF slides
Executable Slides (Win32)

I will add a YouTube link here once the Assembly event organizers upload the seminars there.

2013-07-25

Update on 1GaM

I haven’t posted about my entries to “One Game A Month” for a while, but I’ve still added some prototype things there each month. Here’s an update with links to the G+ posts where I originally mentioned them.

Update early 2019: Since Google+ is shutting down, I edited this post to contain content from the original G+ posts rather than just link to them.


1gam_may_findthehemisphere
May: My first try playing around with Unity and the Oculus Rift SDK. Not much gameplay there, but with a bit of goodwill you can think of this test as a game, where you have to “Find the hemisphere”.

Continuing with observations about Oculus Rift HMD.

It’s good to pay some extra attention to aligning your eyes with the lenses. It’s quite easy to be a bit off vertically, at which point things appear blurrier than they should. At some point I kept shortening the over-the-head strap, trying to fix that problem, until I realized I should actually lengthen it a bit and tighten the horizontal strap instead.

A bit more testing of the OVR SDK with Unity… I noticed I get a considerable amount of extra tracker lag if I use the Direct3D 11 renderer. I don’t know much about the innards, but my wild guess is that maybe there’s a longer swap chain or something. So I guess it’s better to disable D3D11 for now.

Here’s my first test to play around with OVRPlayerController from the SDK. Better to start with the basics. :) If you like, you can think of this as a game where you have to find the hemisphere.

Windows executable: (link deleted, the build is obsolete)


1gam_jun_chessman
June: Proto_Chessman – A minimal gameplay prototype asking the question: what if you take chess pieces off the traditional board and put them somewhere else? (There’s a pile of bugs, but you can get the idea.)

Here’s one minimal take on that idea. You’re playing a single random white piece, and your task is to eat 5 opponent pieces before they get you. There’s an addition of landscape types — water, walkable stuff and impassable rock. Dandy types of pieces don’t want to go into water, and none of them can pass deep water or rock.
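That terrain rule could be sketched roughly like this (illustrative Python with made-up names and terrain types; the actual prototype was made in Unity, so none of this is the real code):

```python
# A sketch of the terrain rule described above. All names and terrain
# types here are illustrative guesses, not the actual prototype's code.

SHALLOW_WATER, DEEP_WATER, ROCK, GRASS = range(4)

# Hypothetical set of "dandy" pieces that refuse to get wet at all.
DANDY_PIECES = {"bishop", "queen", "king"}

def can_enter(piece: str, terrain: int) -> bool:
    """No piece passes deep water or rock; dandy pieces also avoid shallow water."""
    if terrain in (DEEP_WATER, ROCK):
        return False
    if terrain == SHALLOW_WATER and piece in DANDY_PIECES:
        return False
    return True
```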

There’s a pile of subtle bugs, so just hit restart if it goes sour somehow. I didn’t have time to sort through them since the one-game-a-month deadline is looming. :)

(link deleted) build is obsolete, as it required Unity web player


1gam_july_betripled
July: Betripled – A game mechanic prototype about what you could get if you combine swap-match-3 game logic with matching-cluster-removal (e.g. Bejeweled & Sega Swirl). The swap-match-3 generates single new tiles for the removal part at the bottom. (Note: there’s no score count or ending.)


I made the prototype using Processing, but for some reason it didn’t work as a web .js version, so what you get is a Win32 download of an executable version (you need to have Java installed).
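The matching-cluster-removal half of the combination is essentially a flood fill. Here’s a minimal sketch of that part (in illustrative Python, not the actual Processing code; the grid layout and names are mine):

```python
# Sketch of Sega Swirl-style cluster removal on a grid of color ints.
# Illustrative only; the real prototype was written in Processing.

def find_cluster(grid, row, col):
    """Flood-fill the 4-connected cluster of same-colored tiles at (row, col)."""
    color = grid[row][col]
    stack, cluster = [(row, col)], set()
    while stack:
        r, c = stack.pop()
        if (r, c) in cluster:
            continue
        if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == color:
            cluster.add((r, c))
            stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return cluster

def remove_cluster(grid, row, col, min_size=2):
    """Clear the cluster (mark tiles as None) if it is big enough; return its size."""
    cluster = find_cluster(grid, row, col)
    if len(cluster) >= min_size:
        for r, c in cluster:
            grid[r][c] = None
    return len(cluster)
```

Clicking a tile would find its same-colored cluster and clear it; the swap-match-3 half on top would then feed new single tiles into this grid.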

Scroll down the game list to find the “Betripled” one.
(link deleted; something was changed in the onegameamonth website and my profile page stopped working)

2013-05-31

Initial and random observations about Oculus Rift

My Oculus Rift arrived two days ago. I’ve since toyed around a little bit with various demos made by others, as well as looked at the SDK a bit (mostly the Unity integration).

Here are links to posts I’ve been sharing in Google+:

Update early 2019: Since Google+ is shutting down, I edited this post to contain the actual content of each of my posts in G+.


May 30, 2013

Some initial & random observations about the #Oculus Rift.

I got nauseous quite quickly on the first try. No wonder, since that was without any calibration. The very first observation was that moving too fast will surely trigger nausea.

Now after some experimentation, I have found out that I should move the lenses further away from my eyes… the furthest setting is sort of more comfortable but seems to reduce the FOV, so I ended up using 3 notches for now (so, almost the middle actually).

If the lenses are closer, they seem to get a bit foggy very easily… at least my eyes seem to evaporate moisture rapidly enough to have that problem (I have the same thing with some models of sunglasses). But that doesn’t happen as much if the lenses are adjusted to be further away.

TF2’s vr_calibration can be used to figure out your IPD. After doing that, it showed my IPD might be 62.9mm, while the default setting is 64. I can’t notice much difference with that, but I’ve heard that’s one thing which can affect motion sickness. BTW, you can copy the TF2 settings to HL2. I tried the start of HL2 and found it pretty nice (a bit nicer than the Tuscany demo, I think).

Note that the Unity demo version of the Tuscany scene is nicer than the OculusWorldDemo in SDK.

I also tried disabling Aero composition in Windows. In addition, I opened the Tuscany demo in Unity myself and tried changing the location of the tracker query to OnPreRender, which is supposed to reduce latency a bit (and then built a new standalone build).

After all this, I feel I get a little bit less motion sickness than with the first tries without the calibration stuff. Those last steps seem to help with tracking smoothness as well, I think.

FPS-style turning around & moving feels awkward. Just standing still and looking around is pretty immersive though. Of course the resolution is still low and you can see the screen door effect. But after a while it seems I “forget” that. Except that anything small keeps flickering just too much. Max anti-aliasing setting helps with this in some cases.

The head-mounted display blocks external light, so the light contrast feels pretty nice.

It’s sort of weird fun to lower your virtual height in the Tuscany demo, to simulate being a small child or even smaller animal.


May 31, 2013

Continuing with observations about Oculus Rift HMD.

It’s good to pay some extra attention to aligning your eyes with the lenses. It’s quite easy to be a bit off vertically, at which point things appear blurrier than they should. At some point I kept shortening the over-the-head strap, trying to fix that problem, until I realized I should actually lengthen it a bit and tighten the horizontal strap instead.

A bit more testing of the OVR SDK with Unity… I noticed I get a considerable amount of extra tracker lag if I use the Direct3D 11 renderer. I don’t know much about the innards, but my wild guess is that maybe there’s a longer swap chain or something. So I guess it’s better to disable D3D11 for now.

Here’s my first test to play around with OVRPlayerController from the SDK. Better to start with the basics. :) If you like, you can think of this as a game where you have to find the hemisphere.
Windows executable: (link deleted, the build is obsolete)


Jun 1, 2013

Some random notes picked from two Oculus VR YouTube videos and some Wikipedia articles.

Michael Antonov – Running the VR Gauntlet
http://youtu.be/3CrjjN7W2qY

Head tilt should simulate an actual tilting head, not just tilting the camera in place. So, add some artificial positional movement. (I guess this actually partially hides the need for positional tracking.)
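A minimal sketch of that idea, modeling the head as rotating around a neck pivot below the eyes (the pivot length, axes and rotation order here are my own assumptions, not values from the talk or the SDK):

```python
import math

# Illustrative neck model; the length is an assumed value, not from the SDK.
NECK_TO_EYE = 0.12  # meters from the neck pivot up to eye level

def eye_offset(pitch_rad: float, roll_rad: float):
    """Positional offset of the eyes caused by rotating the head around the neck.

    The neck->eye vector (0, L, 0) is rotated by pitch (about x), then roll
    (about z); subtracting the neutral pose (0, L, 0) gives the translation to
    add to the camera position on top of the tracker's pure rotation.
    """
    L = NECK_TO_EYE
    x = -math.sin(roll_rad) * math.cos(pitch_rad) * L
    y = math.cos(roll_rad) * math.cos(pitch_rad) * L - L
    z = math.sin(pitch_rad) * L
    return (x, y, z)
```

With this, a pure head rotation also produces the small translation a real head makes, instead of the camera just spinning in place.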

The recommended rendering resolution for the 7″ Rift is 2170×1360. The final image for each eye is fisheye distorted, and if you resample down from that size, you will still retain maximum detail at the center.

Nate Mitchell – Running the VR Gauntlet
http://youtu.be/oMmi-jS4OYA

Create a body model to give the player a visual identity. People like to see hands and a body. Or maybe even have mirrors to see yourself.

The UI needs rethinking. Where to put it so that it’s visible but not in the way? Integrate it into the game world if you can. But if you have a non-integrated UI, then the best option is still to render the UI in 3D.

You need to decide how to handle Z depth. If the 3D UI is too close, the eyes can take half a second to adjust to look at it. And most of the time you see the UI doubled when you look at something far away.

One solution is to use different depths. E.g. paint a reticle on the target you’re looking at. But things like scoreboards still need to be handled differently.

Match vestibular senses with the in-game camera view. I think this links with the head tilt point mentioned above. It can even reduce simulator sickness. (Well, still, there’s only so much you can do; for example, you would need a rotating simulator chair to properly simulate the feeling of acceleration.)

Cutscenes & movies need headlook camera control (sort of obvious). Adding headlook to everything, even loading screens, does a lot for immersion.

Simple things are now new experiences, for example flying or falling.

I ended up interleaving my viewing of the videos above with checking out some stuff from Wikipedia.

http://en.wikipedia.org/wiki/Motion_simulator
As a sidetrack about the vestibular senses, I read the Motion simulator article. It suggests that if you had an actual motion simulator (a rotating chair or so), it should move about 130ms before the visual motion to maximize fidelity. It also says that the brain can’t perceive slow and gradual enough motion in the vestibular system (~2 degrees/sec), which motion simulators actually use to return to the neutral position without you noticing. I guess this also gives a reason why moving slowly causes less simulator sickness.
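The washout trick boils down to simple arithmetic (the ~2 deg/s figure is the one cited from the article; the function itself is just my illustration):

```python
# Washout sketch: a motion platform can creep back to neutral without being
# noticed as long as it rotates below the vestibular threshold (~2 deg/s,
# per the Wikipedia article cited above).
WASHOUT_RATE_DEG_S = 2.0

def washout_time(tilt_deg: float, rate: float = WASHOUT_RATE_DEG_S) -> float:
    """Seconds needed to sneak back to neutral from a given tilt angle."""
    return abs(tilt_deg) / rate
```

So, for example, washing out a 10-degree tilt unnoticed takes on the order of 5 seconds.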

And, since it seems so easy to feel sick from VR stuff, the Simulator sickness article in Wikipedia is also interesting reading!
http://en.wikipedia.org/wiki/Simulator_sickness


Jun 6, 2013

Again a few random notes about VR and Oculus Rift…

The OculusWorldDemo in the Oculus SDK supersamples from a larger rendered image down to the Rift resolution, while the Unity integration demo doesn’t do that (yet). That’ll be fixed once Unity adds MSAA support for render textures. More info:
https://developer.oculusvr.com/forums/viewtopic.php?f=37&t=266 (stale link)

Oculus VR @ GDC 2013: Virtual Reality – The Holy Grail of Gaming
http://youtu.be/29QdErw-7c4
Oculus VR @ GDC 2013: Q&A with Palmer Luckey
http://youtu.be/8CRdRc8CcGY

I watched those two videos. There were a lot of good points in there, but I think some are worth reiterating:

The sense of speed on a regular screen needs exaggeration – a character running 50-60 km/h or a car driving 300-400 km/h. With VR you don’t have to exaggerate as much.

Sense of scale with VR is actually realistic – you’re able to just look at things and know the scale, e.g. get an accurate idea of how high something is just by looking at it.

For now it’s not a good idea to let kids use the Rift, as their vision is still developing, and their IPD is small enough that the current lens separation will still cause extra distortion, even if you configured for it. The same problem goes for an adult with eyes very close to each other, except that an adult doesn’t need to worry about negatively affecting developing vision.

VR Player
http://vrplayer.codeplex.com/

I tried out the VR Player 0.2 alpha, which has added Oculus Rift tracking. Some videos feel quite weird when mapped to a full sphere, as the video itself contains “random” movement and your own head movement adds to that.

Projecting to a plane is quite close to what you can imagine getting if you sit close to the front of a TV or in the 1st row of a cinema. That is, your view is filled with the content and you have to rotate your head a little bit to focus on the image edges.

I think the player had a bit of unnecessary extra latency in the tracking. Perhaps it is more or less tied to the framerate of the video or something (just guessing). Because of that, it got uncomfortable after a few minutes.

Looking at videos of some very oldschool demoscene stuff with the VR Player was a sort of weird experience. Some of the early 3D looks really blocky this way. :) On the other hand, looking at some written text in a video using a classic DOS font, I was surprised that it was still quite readable. I wonder what it’d be like to see some of that oldschool stuff as an actual VR version. What would a 3D VR textmode demo be like?


Jun 18, 2013

I just happened to notice this news item that display makers are already working on 6 inch 2560×1600 displays.
http://www.androidauthority.com/6-inch-wqxga-229614/

When I saw that, I didn’t think that I’d want that for a cellphone.

Instead, I thought about how that resolution is 4 times the pixels compared to the 1280×800 resolution of the first Oculus Rift devkit.

Since the current resolution is already “good enough”, doubling it in both directions will surely move the visual fidelity into the “very nice” category.

But surely there are some nice performance implications… :)

Remember that even with the current 1280×800 Rift devkit resolution it’s recommended to draw a 2170×1360 image for proper image quality (1.7x the resolution in both directions). The point is to sample that down to 1280×800 while applying the lens distortion shader.

If we take that 1.7x multiplier and apply it to 2560×1600, we get up to 4352×2720. That’s a lot of pixels to fill, so it won’t hurt to have the fastest desktop GPU you can get.

Just for comparison, think of the last/current gen console games, which were often set to run at 1280×720 resolution at 30 fps. Since you have to run at 60 fps minimum for a VR HMD, 2560×1600 at 60 fps will require 9 times the fill rate at minimum… but that’s not enough; basically you should render at the larger resolution for the supersampling. 4352×2720 at 60 fps brings us up to almost 26 times the pixels to fill.
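That fill-rate arithmetic can be double-checked in a few lines (the resolutions and the 1280×720 @ 30 fps console baseline are the ones from the text above):

```python
# Fill-rate arithmetic, relative to a 1280x720 @ 30 fps console baseline.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

baseline = pixels_per_second(1280, 720, 30)

# 2560x1600 at the 60 fps VR minimum: about 9x the baseline fill rate.
native_ratio = pixels_per_second(2560, 1600, 60) / baseline

# With ~1.7x supersampling in both directions (4352x2720): almost 26x.
supersampled_ratio = pixels_per_second(4352, 2720, 60) / baseline

print(round(native_ratio, 1), round(supersampled_ratio, 1))  # prints: 8.9 25.7
```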

Naturally there are a bunch of ways to mitigate the impact, e.g. finding ways to skip rendering some of the “lost” pixels at the boundaries which won’t be visible in any case.

All that being said, I think it’s more likely that we’ll get the 1080p version first for the consumer targeted device. Oculus already showed that at E3. It should already be a considerable improvement over the devkit resolution. But now we know that it’s possible to push the resolution even further.


Michael Abrash wrote a nice article about judder:
http://blogs.valvesoftware.com/abrash/why-virtual-isnt-real-to-your-brain-judder/

Note that he makes an additional interesting point about the prospect of increasing resolution, if you read the comments. Abrash says: “I’d personally take no-judder over more resolution.”


And for a 4K display (3840×2160), the equivalent for-supersampling 1.7x-multiplied resolution is 6528×3672.

2013-04-29

llab, my game for 26th Ludum Dare

201304282344
The 26th Ludum Dare game making contest was held last weekend. The point of the contest is that you have to make a game in less than 48 hours, all by yourself. This time the theme was “Minimalism”, which was actually sort of weird, since the same theme was already used back in 2008 for the 11th LD! Back then I made the game warrior, which did very well.

However, for the latest compo I made a game called llab. It is a little bit inspired by the beginning of Wizball where you could only bounce. I also used Unity3D for the engine this time, which was a nice choice since it meant a lot less coding and more time for other kinds of pondering.

Fun fact: The sort-of-music you hear in the menu was made by making sounds into a voice-changing toy megaphone and then editing them slightly afterwards in Audacity.

(Note: Removed link to play in web browser, as Unity player plugin is now deprecated.)
Read the rest of this entry »

2013-04-01

Mini game controls prototype for 1GaM

Another “not really much of a game” entry for One Game a Month. I toyed around with Processing; here’s a minimal game controls prototype. Sorry, you should use a mouse with this (and the left mouse button), a touchpad is probably not going to be handy at all.
Read the rest of this entry »

2013-02-28

One Game A Month – Shmupless

Shmupless
Shooter without enemies.

This is not much of a game, just a minimal test of a shooter where only your own bullets are the dangerous thing. Control the ship with the arrow keys, shoot bullets with the Left Ctrl key.

(link deleted) build is obsolete, as it required Unity web player

2013-02-02

Old stuff to start with One Game A Month – IOCCC19 Billiards


IOCCC19 Billiards
A failed attempt at taking part in the 19th IOCCC.

There’s a new “One Game A Month” project which just started in January, with the aim for developers to make a game each month for a year (or more). Although it’s February already, there’s still time to start and submit for January as well, the first month being a special case where you can submit late entries.

I didn’t do anything new for January and thought about skipping that month, but then I decided I could just add some older thing as the January entry, and here it is…
Read the rest of this entry »

2013-01-18

Strobotnik

strobotnik_logo_onlyimage_512x512_padded_whitebkg
Strobotnik is the name of my new company, founded in January 2013.

There are really no projects to speak about yet, so the only thing you can check out for now is the WWW page, complete with a logo effect, and perhaps visit or follow the related social media pages.

WWW: Strobotnik.com

Twitter: @Strobotnik

Facebook: Strobotnik page

Google+: +Strobotnik (Update 2019: Google+ is being closed, deleted link)

2012-09-18

Raspberry Pi

Got a Raspberry Pi (sponsored by Nokia through the QtonPi project, thanks!). Was pretty easy to get running. For starters I ported my 4kb demo “San Angeles Observation” to run on it. I used the VBO-enhanced code from Chromium source tree, so it runs much better than the original OpenGL ES port. :)

Page about the original 4k demo: http://iki.fi/jetro/visuals/4k-intros/san-angeles-observation/

Here’s direct download link to runnable binary: Download .tar.gz – Tested to be runnable with the Raspbian wheezy distribution (2012-08-16 build).

Read the rest of this entry »

2012-08-07

Stuff for Assembly 2012

Here’s a list of some stuff from Assembly 2012 summer, the annual computer festival!

Together with stRana, we made a little demo for the Assembly 2012 event’s demo competition. Like last year’s Grandma demo, this is also a bit of an experiment, this time with real-time rendering of drawn line art/sketch data. The music was created by our friend Crud, also embedding a bit of beat-boxing sounds by Ion. You can read a bit more about the demo on this page. Click to download (Windows), or watch the YouTube video.

Read the rest of this entry »