syGlass Spring Release: v2.3.0

Our spring update is here in the form of syGlass v2.3.0, with new features that apply to a broad range of use cases, from image analysis and visualization to immersive content creation. Whether you’re an experienced syGlass user or only just becoming interested in the benefits of virtual reality (VR) for 3D imaging, we would love for you to give it a try: just send us a message to get started.

The complete changelog is available here, but below we’ll take a closer look at some of the biggest changes in this version and the use cases they serve.

Time Series Segmentation & Object Tracking

Time series image volumes from imaging modalities like lattice light sheet (LLS) fluorescence microscopy contain huge amounts of both spatial and temporal information about their subjects, and extracting this information through annotation is an increasingly common goal. syGlass v2.3.0 introduces the ability to view, create, and edit segmentations of objects at each timepoint of such images, as well as an improved control scheme for the object tracking tool.

The VR environment presents unique advantages for this sort of annotation task. The stereoscopic view allows the annotator to naturally perceive depth in the scene and more easily understand the complex 3D relationships between the subjects of the imaging. The tracked 3D controllers provide an advantage as well: unlike a mouse, which inputs only two dimensions, they let the annotator navigate all three spatial dimensions of the scene quickly and easily.

There are three primary use cases where segmentation and tracking in VR are particularly advantageous:

  • Manual annotation of challenging images. In these cases, where the objects are poorly or inconsistently resolved in the images, or where the scene is particularly complex, the combination of stereoscopic viewing and 3D control provides a marked advantage in the annotator’s ability to perceive and identify the objects of interest. Many of these cases are beyond the current capabilities of auto-annotation tools.

  • Creation of ground truth or training data for auto-annotation procedures. Machine learning approaches for auto-annotation are increasingly in vogue, but often rely on precise training data and ground truth for the best results. The creation of ground truth and training data in VR offers many of the same advantages as for other manual annotation use cases, and has been used to great effect in recent publications.

  • Proofreading and correction of auto-annotation outputs. If you’ve used automated annotation procedures before, you know that the results are rarely perfect and are unlikely to be perfect in the near future. Evaluation and correction of the results benefits strongly from advanced visualization tools like syGlass, which make errors easy to both detect and correct.

Anaglyph Rendering

You're probably already familiar with anaglyph-rendered content: a stereo pair is combined into a single image, with one eye's view encoded in the red channel and the other's in the cyan (green and blue) channels. Viewed through red-cyan 3D glasses, each eye receives only its intended view, creating the impression that some parts of the image leap out of the plane of the screen while others sink behind it. While far from high-tech, this technique provides a highly accessible and affordable way to view 3D images stereoscopically.
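
To make the principle concrete, here is a minimal sketch of red-cyan anaglyph composition in Python with NumPy. It illustrates the general technique only, and is not syGlass's actual rendering code:

    import numpy as np

    def make_anaglyph(left, right):
        # left and right: H x W x 3 uint8 RGB renders of the same scene
        # from the left-eye and right-eye viewpoints.
        anaglyph = np.empty_like(left)
        anaglyph[..., 0] = left[..., 0]   # red channel carries the left eye's view
        anaglyph[..., 1] = right[..., 1]  # green carries the right eye's view
        anaglyph[..., 2] = right[..., 2]  # blue carries the right eye's view
        return anaglyph

Through the glasses, the red filter passes only the left-eye view and the cyan filter only the right-eye view, so each eye receives the correct half of the stereo pair.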

syGlass v2.3.0 offers numerous ways to take advantage of the anaglyph viewing paradigm. A real-time render mode is offered as an alternative to VR, giving viewers a headset-free 3D experience as they navigate syGlass with a mouse and keyboard. Anaglyph export options are also provided for the creation of publication-ready cinematic key frame videos, narration experiences, and static images.

Image Histogram Visualization

The image histogram, a record of how frequently each intensity value occurs in an image, is an important factor for visualization. Thresholding an image, for example, hides all portions of the image with values above or below the specified cut-offs. syGlass v2.3.0 displays the image histogram above the window and threshold sliders as they are adjusted, providing important context for their use. For images with multiple channels, each channel's histogram is computed and visualized independently.
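
For intuition, here is a minimal sketch, in NumPy rather than actual syGlass code, of how the histogram relates to a pair of threshold cut-offs:

    import numpy as np

    def threshold_with_histogram(volume, low, high, bins=256):
        # Count how often each intensity value occurs in the volume.
        counts, bin_edges = np.histogram(volume, bins=bins)
        # Thresholding hides every voxel whose value falls outside [low, high].
        visible = (volume >= low) & (volume <= high)
        return counts, bin_edges, np.where(visible, volume, 0)

Displaying the counts alongside the sliders shows at a glance which peaks of the intensity distribution, such as background versus signal, a given cut-off will hide or reveal.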

Narration Scene Editing

The syGlass narration system allows you to record interactive VR experiences in which your avatar walks a viewer through your volumetric image data from a first- or third-person perspective. These narrations can become lab training materials, supplemental content for publication, or even immersive lectures. They can be viewed in full interactive VR, exported to stereoscopic video for playback on standalone headsets, or rendered as 2D or anaglyph video for headset-free viewing.

syGlass v2.3.0 introduces many new features that make creating this content easier and more engaging than ever. Active learning exercises, including label matching, multiple choice, and voice response prompts, can be embedded directly within the experience. And, for the first time, an editing system allows scenes to be trimmed, their audio recordings replaced, and outside audio overlaid.

… And Much More

The full set of changes in this spring release includes improvements relevant to nearly every use case of the software, ranging from optimizations for file loading to new options for exporting the content created from those files. The full changelog is available here.

Ready to get started? Just send us a message. In the meantime, we’re already hard at work to deliver another ambitious update this summer.

The macrophage images shown in this post and included as the splash screen for syGlass v2.3.0 are courtesy of Dr. Nick Condon et al.:

Nicholas D. Condon, John M. Heddleston, Teng-Leong Chew, Lin Luo, Peter S. McPherson, Maria S. Ioannou, Louis Hodgson, Jennifer L. Stow, Adam A. Wall; Macropinosome formation by tent pole ruffling in macrophages. J Cell Biol 5 November 2018; 217 (11): 3873–3885. doi: https://doi.org/10.1083/jcb.201804137

The chameleon image shown in this post is courtesy of Dr. Raul E. Diaz, Jr.:

Dr. Raul E. Diaz, Jr., Department of Biological Sciences, Southeastern Louisiana University