VRLA Summer Expo 2016

This year’s VRLA Summer Expo was bigger than ever. It’s exciting to see an industry grow before my eyes. Some notable sights from this year:

Envelop – A VR desktop program. Focused more on productivity and workplace applications.

MindShow – It’s kind of hard to describe this one. Create your own show, then play and act out several characters (one at a time). Looks very promising!

WaveVR – An amazing-looking DJ-style application for real-time music creation.

The future killer apps of VR will be of the creative sort. If I want to be passive I’ll watch TV/Netflix/HBO… if I want to create something I’ll put on a headset. Sitting passively in a headset feels like such a waste.

Set Up a DHCP Server for an Internal Network in VirtualBox

In your VirtualBox installation directory (on Windows this might be C:\Program Files\Oracle\VirtualBox), there is a VBoxManage.exe program that can be used to set up a DHCP server for an internal network. Without this setup, the VMs cannot find each other on the network.

First, create the DHCP Server.
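For example (the addresses here are just a sensible default of my choosing; note the network name “testlab”, which is reused below):

    VBoxManage dhcpserver add --netname testlab --ip 10.10.10.1 --netmask 255.255.255.0 --lowerip 10.10.10.2 --upperip 10.10.10.254 --enable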

Next, in VirtualBox set each VM’s network adapter to Internal Network mode and change the network name to “testlab”. You may need to restart the VMs if they are running. They will be assigned IP addresses in the 10.10.10.x range.
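The same setting can be applied from the command line, where “Kali VM” is a placeholder for your VM’s actual name:

    VBoxManage modifyvm "Kali VM" --nic1 intnet --intnet1 testlab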

Setting up Kali on VirtualBox

Download kali-linux-2016.1-amd64.
Set up and install it on VirtualBox (Reference <- ignore the part about installing VirtualBox Guest Additions; that step is covered below).

Update /etc/apt/sources.list
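For kali-linux-2016.1, the rolling repository entry should look something like this (worth double-checking against the official Kali docs):

    deb http://http.kali.org/kali kali-rolling main non-free contrib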

Then…
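Presumably the standard full upgrade, run as root:

    apt-get update
    apt-get dist-upgrade -y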

Insert Guest Additions CD-ROM
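Then, as root (the CD mount point may differ on your system):

    apt-get update
    apt-get install -y linux-headers-$(uname -r)
    cp /media/cdrom/VBoxLinuxAdditions.run /root/
    chmod 755 /root/VBoxLinuxAdditions.run
    /root/VBoxLinuxAdditions.run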

Why it’s great to be a bicyclist

  • You can go through stop signs
  • You can go through red lights
  • You can whistle loudly at people in order to alert them to your presence
  • You can slow people down on their way to work
  • You can create confusion at four-way stop signs
  • You can walk into the office with one side of your pants rolled up
  • Showers at work
  • Awkward conversations while you remove your U-lock from the bike rack
  • Openly discuss and compare the biking infrastructure between cities for many minutes
  • That time you got hurt
  • That time you almost died
  • A deep knowledge of DOT projects in progress and in planning
  • Owning that cool bag
  • You now own many different white and red LED lights
  • An intimate knowledge of the state of disrepair that exists on American roads and infrastructure

The Origin of the Metaverse

I’ve been somewhat obsessed with and excited about virtual reality since acquiring the Oculus Rift Development Kit One (or DK1) and experiencing just a little bit of what is being dubbed “presence” (see Mike Abrash’s presentation at Dev Days on what this is). My excitement only increased after Sony announced their own VR headset, called Project Morpheus, lending credence to the idea that VR has finally arrived. This has all put me in a wildly speculative mood. I believe this technology will catch on very quickly because it will provide the most immersive and engaging experiences yet conceived. It will finally allow us to escape.

My consumption hasn’t just been DK1 demos and projects: I’ve been loyally following the latest updates on reddit.com/r/oculus and other blogs, as well as consuming any novel related to VR. I quickly devoured Ready Player One and I’m halfway through Neal Stephenson’s Snow Crash. William Gibson’s Neuromancer will be next. Common to each of these novels is the idea of a virtual world separate from our own, in which people from all over the world connect to explore fantastical environments where the physical laws of the real universe do not apply. This often involves strapping a device to our face, something akin to goggles, along with sporting gear on our hands and body that enhances the experience by simulating touch (often called haptics). The virtual worlds are named something different in each novel (OASIS, Metaverse, Matrix). I like Snow Crash’s “metaverse” the most.

It seems a metaverse is inevitable as these headsets get better. Given the social nature of our species, it seems obvious that we’d also want to experience the metaverse together. We want to, or rather need to, share experiences with each other. Experiences in the metaverse will be novel, plentiful, and unconstrained by the physical properties of reality — we will want to share them.

React.js Diffs

Lately I’ve been looking into Facebook’s open source React.js library. It’s a front-end JavaScript user interface library that has a few interesting features. The most interesting one to me was something referred to in a React.js blog post as reconciliation:

When your component is first initialized, the render method is called, generating a lightweight representation of your view. From that representation, a string of markup is produced, and injected into the document. When your data changes, the render method is called again. In order to perform updates as efficiently as possible, we diff the return value from the previous call to render with the new one, and generate a minimal set of changes to be applied to the DOM.

Finally, the post continues:

We call this process reconciliation.

In my research, or rather “googlesearch”, I stumbled upon this article, which goes into more detail about the diffing process. I highly recommend giving it a read, as it details the heuristics React uses to efficiently add and remove elements from the DOM, since computing a true minimal tree diff would be far too expensive (the best-known general algorithms are O(n³) in the number of nodes).

I liked the idea of seeing this diffing process, so I wrote a simple example that demonstrates reconciliation. In order to “see” the process, however, you will need to turn on “Show paint rectangles” in Chrome to see the result (open Chrome DevTools, press ESC, and go to the Rendering tab).

See Example
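Separately from the linked example, here is a minimal sketch of the kind of code that makes those repaints visible. It assumes a page with a #root element and reasonably recent React and ReactDOM builds loaded (older versions used React.render instead):

    // Re-render the whole tree every second. React diffs each render
    // against the previous one, so only the changed text node is
    // actually updated in the DOM, and only its rectangle repaints.
    var count = 0;
    setInterval(function () {
      ReactDOM.render(
        React.createElement('div', null,
          React.createElement('h1', null, 'Static header'),
          React.createElement('p', null, 'Count: ' + count++)
        ),
        document.getElementById('root')
      );
    }, 1000);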

YouTube for the Oculus Rift


[Image: vr-extension]

Source | Demo

I’ve done it. Finally finished. It took me a whole day, most of the night, and the better part of the next morning, but I can finally say I’ve finished one of the things I’ve been longing for — the ability to watch YouTube videos in the Rift!

I’ve implemented it as a Chrome extension, and there are many things that can be improved:

  • More settings (e.g. custom width, height)
  • More sources (Vimeo? Netflix?)
  • More bridges (only vr.js is supported)

I will follow up with a more detailed technical breakdown of the hacks required to get this working.

iBeacon Background Advertising

For the last week or so I’ve been playing around with a Bluetooth technology from Apple called iBeacon. Basically, it allows an iOS device (e.g. an iPhone, iPad, or Estimote beacon) to advertise a signal to other iOS devices within a certain proximity. Most examples demonstrated are museum- or merchant-focused. As a merchant, for example, a store might place an iBeacon in a static location, such as the entrance, and it would begin transmitting to devices that are organically carried through its doors. Once a receiver detects the iBeacon signal, it might notify the user of special discounts or coupons or a myriad of other marketing ploys. A museum might place iBeacons discreetly at each painting, and after downloading the museum’s app a user might get historical information and the painter’s biography delivered to their phone as they navigate the museum.

While I like the museum examples presented, frankly I’m not excited about an increased bombardment of advertising as I walk through the world. However, beyond the merchant and museum applications, I think there is a lot of potential in the peer-to-peer application space, in particular ideas that become possible when a device both advertises and receives iBeacon signals. Imagine walking into the subway and being challenged by another user on the same train to a game of Rock, Paper, Scissors. Or a simple dating app that advertises availability while out at the bar. Or a multitude of other gaming applications (see Nintendo’s StreetPass for examples). Not to mention ideas related to daisy-chaining iBeacon devices into areas where cell reception and Wi-Fi aren’t available.

There is one very big problem with the technology as it now stands that prevents these ideas from taking off: an iOS device cannot advertise while the app is running in the background!

It is possible to receive iBeacon signals while in the background, or even if the application is closed (after the latest update – see this). However, advertising those signals can only be done while the app is running in the foreground.

This is definitely a bummer. Fortunately, there are ways to build these applications using traditional Core Bluetooth, but it looks rather difficult. See this GitHub project as an example. I’m also unsure what the battery-drain implications are, or whether the application has to be kept running in the background. I will be exploring this option in the upcoming weeks to get at least one idea off the ground.

I believe the lack of background broadcasting really limits the peer-to-peer applications that would otherwise be possible. Apple has got to know this, right? Or is there a reason why background advertising is not supported?

Hacker School Checkpoint

It’s hard to believe, but I’m two months into Hacker School, which means there is not much time left (one month!). It has passed quickly, so I wanted to write a post that serves as a kind of “checkpoint”: a brief look at what I had hoped to accomplish in this batch and a way to orient myself for the final stretch. First, I had a gander at my application (yes, I saved it) and took note of the projects I had listed there:

  • Browser-based Oculus Rift video player. The final project might be a browser-based 360 video player with Oculus Rift support.
  • Complete the “Elements of Computing Systems” textbook – build a computer (virtually), starting from logic gates, then CPU and memory, then moving up through the software hierarchy into a complete integrated system.
  • Develop an iPhone game called “Quick Draw”. Much like a showdown in a Western, as you approach other players of the game in the real world, the first to draw their phone from their pocket and point it at the other wins the “match”.

I only listed these three projects, which wasn’t very ambitious really. I have roughly completed the first two (I have a couple chapters left of nand2tetris). I haven’t done any mobile development so far.

Additionally I have spent time working on other projects that I did not plan at the outset:

  • Learned a bit of Elm and exposed myself to FRP (Functional Reactive Programming). Created a simple game called Vessel.
  • Spent about a week learning Haskell. Elm is very similar, so it was a natural segue.

I guess that’s it for the major projects. It appears kind of measly when written out like this, but what isn’t described here is the multitude of small “learning events” I participated in. Examples include short talks by the facilitators, the weekly Monday lectures by residents, presentations by students on Thursdays, and most importantly the informal pairing and discussions that occur every day in person and over chat. Other things I’ve learned include functional programming in Python, what a Smalltalk programming session looks like, how Git internals work, x64 assembly programming, writing a very simple kernel, and some weird Linux distros, just to name a few.

Looking forward I hope to finish strong and plan to complete a few more things:

  • Complete an iPhone application and distribute it in the App Store
  • Learn some more Haskell. One of my goals was to learn a functional language.
  • A script that can easily be added to a page to take a canvas element and place it into a 3D scene so it can be played in the Rift. I’m imagining being able to play Vessel or any other HTML5-canvas-based game in the Rift (a rough sketch of the idea follows this list).
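Here is that rough sketch. It assumes three.js with an existing scene, camera, and renderer, plus a hypothetical canvas element with id “vessel”:

    // Use a live 2D game canvas as a texture on a plane in the 3D scene.
    var gameCanvas = document.getElementById('vessel'); // hypothetical id
    var texture = new THREE.Texture(gameCanvas);
    var panel = new THREE.Mesh(
      new THREE.PlaneGeometry(16, 9),
      new THREE.MeshBasicMaterial({ map: texture })
    );
    scene.add(panel);

    (function animate() {
      requestAnimationFrame(animate);
      texture.needsUpdate = true; // re-upload the canvas contents each frame
      renderer.render(scene, camera);
    })();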

HTML5 Panoramic 360 Video


[Image: vid360]

Source | Demo

I’ve finally completed a project that I’ve had in mind for quite some time: a 360 HTML5 video player. The project was initially conceived during a hackathon a few months ago (the source for those experiments is located here).

Most excitingly, the player has optional Oculus Rift support. It seeks to replicate what existing projects can already do (e.g. Total Cinema 360 and VR Player), but in the web browser. However, the player is limited to browsers that support WebGL, and users must have the vr.js project installed for Oculus Rift support. It’s only been tested in Chrome (and not very thoroughly, I might add). The videos on the demo page are MP4, which only works in Chrome (except for Sintel, which should also work in Firefox).
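As an aside, the core technique is to map each equirectangular video frame onto the inside of a sphere with the camera at its center. This is an illustrative sketch, not the player’s actual source, and it assumes a three.js build recent enough to have THREE.VideoTexture plus an existing scene:

    // Project an equirectangular video onto the inside of a sphere.
    var video = document.createElement('video');
    video.src = 'pano.mp4'; // placeholder path to an equirectangular video
    video.loop = true;
    video.play(); // newer browsers may require a user gesture first

    var texture = new THREE.VideoTexture(video);
    var sphere = new THREE.Mesh(
      new THREE.SphereGeometry(500, 60, 40),
      // BackSide renders the inward-facing side, since the camera sits inside
      new THREE.MeshBasicMaterial({ map: texture, side: THREE.BackSide })
    );
    scene.add(sphere); // a camera at the origin now looks out at the video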

I’ve used a lot of web-based players in the midst of building this thing, and most are Flash-based. I did find one by Kolor that looked pretty impressive.

I still don’t know much about the actual stitching process and the algorithms behind it. I’m also not sure about the projections I’ve chosen. I’m still very new to “360” video, but I think as VR takes off it will become much more prevalent.

Thanks to the AirPano people for their amazing videos, which I’m using to demonstrate the player.