New York Tech Journal
Tech news from the Big Apple

Bringing Droidcon Montreal to NYC: #VR, #noSQL, Surface #Textures

Posted on April 17th, 2015

New York Android Developers

04.16.2015 @Spotify, 45 W 18th St, NY

Three speakers presented excerpts from their presentations in Montreal


Dario Laverde spoke about VR on Android. He first outlined the hardware capabilities of the HTC Vive that create an immersive VR experience. These requirements are designed to ensure “presence” (viewing without getting sick): a 90Hz refresh rate, two 1080p screens (one per eye), and the ability to track your position.

Last summer, Google introduced Google Cardboard, which allows you to place a phone in a holder to experience VR. Today Google announced that it will officially certify some commercially available viewers, which will allow the software to be calibrated to the lenses on each viewer.

Dario also talked about the need for a good pair of headphones for positional audio; Android developers can create surround sound using the OpenSL ES spec. He mentioned that a magnet on the viewer, detected by the phone’s magnetometer, provides a single-click input to the software. He noted that some phones lack the needed sensor, in which case the software can be controlled using a Bluetooth clicker.

Dario walked through some code snippets:

  1. Intent filter – identifies the app as a Cardboard app within Google Play
  2. onNewFrame – reads the current position of your head
  3. onCardboardTrigger – fires when the viewer’s trigger is used
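The first item is a manifest entry. A minimal sketch might look like the following, using the Cardboard SDK’s documented intent category; the activity name is hypothetical:

```xml
<!-- AndroidManifest.xml fragment: the CARDBOARD category lets Google Play
     and the Cardboard app recognize this activity as a Cardboard experience.
     The activity name is hypothetical. -->
<activity android:name=".MainVrActivity">
  <intent-filter>
    <action android:name="android.intent.action.MAIN" />
    <category android:name="android.intent.category.LAUNCHER" />
    <category android:name="com.google.intent.category.CARDBOARD" />
  </intent-filter>
</activity>
```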

He noted that when adapting tilt-sensitive games, the assumptions about viewer movement are different. For instance, tilting the device in a racing game steers the car left or right, while in VR a left or right head movement should not alter the direction of the car, only your view within the car.
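The input-mapping difference can be sketched in plain Java (class and method names are hypothetical, not from the talk): device tilt feeds the steering, while head yaw should feed only the camera.

```java
public class SteeringExample {

    // Tilt-based control: device roll directly changes the car's heading.
    static double steerFromTilt(double carHeading, double deviceRollDeg) {
        return carHeading + deviceRollDeg / 10.0; // tilt steers the car
    }

    // VR control: head yaw rotates only the camera; the car's heading is untouched.
    // Returns { carHeading, newCameraYaw }.
    static double[] lookFromHead(double carHeading, double cameraYaw, double headYawDeg) {
        return new double[] { carHeading, cameraYaw + headYawDeg };
    }

    public static void main(String[] args) {
        System.out.println(steerFromTilt(0.0, 30.0));  // the car turns
        double[] vr = lookFromHead(0.0, 0.0, 30.0);
        System.out.println(vr[0] + " " + vr[1]);       // heading unchanged, view rotates
    }
}
```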

He concluded by talking about how limited computational power is the main barrier to adding other inputs (beyond head motion and the trigger).

Next, Will Hoang @Couchbase talked about Couchbase’s NoSQL database for local storage on Android. Their database gives apps offline capabilities, with automatic, real-time syncing to the cloud and to other connected mobile devices whenever a connection is available.


The third speaker was Lisa Neigut @Electric Objects. Electric Objects has created an Android-powered display for works of art, funded through a Kickstarter campaign in April 2014: a large Android display panel similar to a tablet, but without touch sensitivity.

The display was originally powered by a Raspberry Pi, but they have moved to an Android board running KitKat 4.4 (API 19). They expect to upgrade the OS once Freescale upgrades the i.MX6D board.

Artists create materials in one of three formats:

  1. Static images: JPG, PNG
  2. HTML – rendered in a Chromium WebView via Crosswalk
  3. Animated GIF

Lisa then talked extensively about the challenges of displaying animated GIF files.

Since Android does not have a native GIF decoder, they first needed to understand the structure of GIF files: each GIF contains a main frame and a series of subframes. Each frame is specified by a size and the coordinates of that frame within the main picture. Frames are presented in sequence to give the impression of an image in motion.
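That layout can be illustrated by walking the bytes of a hand-built minimal GIF (the parser below is a sketch for illustration, not Electric Objects’ code): a logical screen descriptor gives the main picture’s size, and each frame’s image descriptor carries its own position and size within that screen.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class GifLayout {

    static int u16(ByteBuffer b) { return b.getShort() & 0xFFFF; }

    // Returns { screenW, screenH, frameLeft, frameTop, frameW, frameH }
    static int[] parse(byte[] gif) {
        ByteBuffer b = ByteBuffer.wrap(gif).order(ByteOrder.LITTLE_ENDIAN);
        b.position(6);                       // skip the "GIF89a" signature
        int screenW = u16(b), screenH = u16(b);
        b.position(b.position() + 3);        // skip packed fields, bg color, aspect
        if ((b.get() & 0xFF) != 0x2C)        // 0x2C marks an image descriptor
            throw new IllegalArgumentException("expected image descriptor");
        return new int[] { screenW, screenH, u16(b), u16(b), u16(b), u16(b) };
    }

    public static void main(String[] args) {
        // Hand-built header: a 100x80 screen with one 10x8 frame at (20, 30)
        byte[] gif = {
            'G','I','F','8','9','a',
            100, 0, 80, 0,            // logical screen width, height
            0, 0, 0,                  // packed fields, bg color, aspect ratio
            0x2C,                     // image separator
            20, 0, 30, 0,             // frame left, top
            10, 0, 8, 0               // frame width, height
        };
        int[] r = parse(gif);
        System.out.println(r[0] + "x" + r[1] + " frame " + r[4] + "x" + r[5]
                + " at (" + r[2] + "," + r[3] + ")");
    }
}
```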

GIFs in which only a small part of the image is updated by each frame make fewer demands on the processor and therefore appear as a smooth set of motions. Frames with more changes can cause frames to be dropped or mangled on output.
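The relation between per-frame decode cost and dropped frames is simple arithmetic (the numbers here are illustrative, not from the talk): at a 30 fps target each frame has a ~33 ms budget, and any frame whose decode exceeds that budget forces the display rate down.

```java
public class FrameBudget {
    // Frames actually shown per second when each frame takes decodeMs to
    // produce, against a target rate of targetFps. If decoding fits in the
    // per-frame budget the full rate is reached; otherwise frames drop.
    static double effectiveFps(double targetFps, double decodeMs) {
        double budgetMs = 1000.0 / targetFps;
        return decodeMs <= budgetMs ? targetFps : 1000.0 / decodeMs;
    }

    public static void main(String[] args) {
        System.out.println(effectiveFps(30, 20));  // fits the ~33 ms budget -> 30.0
        System.out.println(effectiveFps(30, 250)); // heavy frames -> 4.0
    }
}
```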

They explored several methods for displaying Gif files:

  1. MediaPlayer – inefficient: images did not look good
  2. Chrome (GIFs packaged in an HTML file) – had a long initial delay (several minutes) when first loading the display, but ran well afterwards. This was a problem since users would want to browse the artworks before deciding to display a piece.
  3. Glide – does not drop frames, but displays only 3 to 4 frames/sec
  4. Native – GifDrawable, but they still needed better performance

To improve performance, they used the Systrace utility to visualize the bottlenecks in the refresh cycle when processing each frame. Specifically, they used this to better understand the Android graphics pipeline: SurfaceFlinger & BufferQueue.
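A typical Systrace invocation from that era (the script path and trace categories may differ by SDK version) captures the graphics and scheduler activity they were inspecting:

```shell
# Capture 10 seconds of graphics/view/scheduler activity from a connected
# device. The path assumes the SDK's platform-tools layout of the time;
# adjust for your install.
python $ANDROID_HOME/platform-tools/systrace/systrace.py \
    --time=10 -o trace.html gfx view sched
```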

Eventually they were able to attain 30 frames-per-second performance on a 720p display by directly accessing the surface textures via the TextureView pipeline. This is consistent with recommendations made by some game-engine vendors: modify textures where possible to keep gameplay fluid.

posted in:  Android, databases, Virtual Reality