Researchers at Howard Hughes Medical Institute developed a way to produce color-tagged, 3-D, microscopic videos of organelles in a live cell. They came to Drexel's Andrew Cohen, PhD, to develop an algorithm that could process massive amounts of visual data to better understand the behavior of organelles, both individually and as a group. This technology will help them decode how cells behave and respond to drug treatment.
Researchers at Howard Hughes Medical Institute and the Eunice Kennedy Shriver National Institute of Child Health and Human Development are getting a first glimpse at the inner workings of live cells, thanks to a new microscopy technique pioneered by Nobel laureate Eric Betzig with help from engineers at Drexel University. Their method uses grids of light that activate fluorescent color tags on each type of organelle — the result is a 3-D video that gives researchers their best look yet at how cells function. It will allow scientists to better understand how cells react to environmental stressors and respond to drug treatment.
In a paper published today in Nature, the team lays out its methodology for using Betzig’s lattice light sheet microscope in combination with image-tracking technology developed in Drexel’s Computational Image Sequence Analysis Lab, led by Andrew Cohen, PhD, to produce 3-D time lapse videos of organelle movement and generate quantitative data on their interactions.
“The cell biology community has recognized for many years that the cytoplasm is full of many different types of organelles, and the field is recognizing more and more how significant the cross-talk between these organelles is, in the form of close contacts,” said Jennifer Lippincott-Schwartz, PhD, of HHMI’s Janelia Research Campus, senior author of the study. “When two organelles come close to each other they can transfer small molecules like lipids and calcium and communicate with each other through that transfer. But no one has been able to look at the whole set of these interactions at any particular time. This technology is providing a way to do that. But this paper is about a whole new technology: being able to tag six different objects with six different fluorophores, and unmixing the fluorophores so that you can observe the six different objects discretely.”
Betzig’s microscopy technique uses layers of light grids that interact with fluorescent protein tags in cells to build a 3-D microscopic image. At Janelia Research Campus, Betzig and Lippincott-Schwartz have refined that technology to produce a detailed look inside the cell by tagging each organelle type with its own color.
“The challenge is analyzing this data,” Lippincott-Schwartz said. “It requires being able to simultaneously track these six different objects in 3-D. What Andy Cohen and his group have done with the software system they have developed is enable us to really look at this in more quantitative ways than would be possible with conventional tools.”
Cohen’s lab developed a tool called LEVER 3-D in 2015 to help researchers study 3-D images of neural stem cells. It applies an advanced image segmentation algorithm, developed in his lab, that identifies the boundaries of cells and tracks their movements. Before this technology was available, processing microscopy images and time-lapse footage took enormous amounts of time: biologists had to build lineage trees by hand and follow changes in cells by comparing images frame by frame.
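The article doesn’t detail LEVER 3-D’s algorithms, but the overall shape of a segment-then-track pipeline can be sketched: find the objects in each 3-D frame, then link them across frames. Below is a minimal illustration in Python (NumPy/SciPy) using simple thresholding, connected-component labeling and nearest-centroid linking; the thresholds, distances and toy data are invented, and this is not the LEVER implementation.

```python
# Minimal sketch of a segment-then-track pipeline; NOT the LEVER 3-D
# algorithm. Threshold each 3-D frame, label connected components,
# then link objects across frames by nearest centroid.
import numpy as np
from scipy import ndimage

def segment_frame(volume, threshold):
    """Label bright connected regions in a 3-D frame; return their centroids."""
    labels, count = ndimage.label(volume > threshold)
    centroids = ndimage.center_of_mass(volume, labels, range(1, count + 1))
    return np.array(centroids).reshape(-1, 3)

def link_frames(prev_centroids, next_centroids, max_dist=5.0):
    """Greedily match each object to its nearest centroid in the next frame."""
    links = {}
    for i, c in enumerate(prev_centroids):
        dists = np.linalg.norm(next_centroids - c, axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            links[i] = j  # object i in frame t -> object j in frame t+1
    return links

# Toy data: one cubic "organelle" that drifts by one voxel between frames.
frame0 = np.zeros((20, 20, 20)); frame0[5:8, 5:8, 5:8] = 1.0
frame1 = np.zeros((20, 20, 20)); frame1[6:9, 5:8, 5:8] = 1.0
c0 = segment_frame(frame0, 0.5)
c1 = segment_frame(frame1, 0.5)
print(link_frames(c0, c1))  # {0: 0}: the object was followed across frames
```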
This process is even more involved when multiple objects are being tracked in three dimensions. Lippincott-Schwartz’s group used a battery of computer programs to separate the overlapping light spectra emitted by the tagged organelles and begin to bring the 3-D images and video into focus. The process, called “linear unmixing,” required more than 32 cores of a computer workstation to sift, pixel by pixel, through 7 billion sets of six-color measurements.
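To make “linear unmixing” concrete: each pixel’s reading across the detection channels is modeled as a linear combination of the fluorophores’ known emission spectra, so recovering each fluorophore’s contribution amounts to a per-pixel least-squares solve. A minimal sketch with made-up spectra and noise (the paper’s actual pipeline is far more involved):

```python
# Sketch of per-pixel linear unmixing; illustrative, with made-up spectra.
# Each detector channel records a linear mix of the six fluorophores'
# emission spectra, so a least-squares solve per pixel recovers each
# fluorophore's abundance.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_fluors, n_pixels = 6, 6, 100_000

# Columns = reference emission spectrum of one fluorophore (assumed known).
spectra = np.abs(rng.normal(size=(n_channels, n_fluors)))

true_abundance = np.abs(rng.normal(size=(n_fluors, n_pixels)))
measured = spectra @ true_abundance + 0.01 * rng.normal(size=(n_channels, n_pixels))

# Solve for all pixels at once; clip negatives, since abundances are physical.
unmixed, *_ = np.linalg.lstsq(spectra, measured, rcond=None)
unmixed = np.clip(unmixed, 0.0, None)

print("max unmixing error:", float(np.abs(unmixed - true_abundance).max()))
```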
Typically, researchers would use commercial software to stitch these images into a 3-D volume before studying them. But those programs are expensive and time-consuming to use, and they were not capable of the sophisticated analysis needed to track moving objects and make quantitative measurements of their behaviors, particularly how they interact.
[Video: https://www.youtube.com/embed/9nVIpxcOaf0]
Cohen’s algorithm automates the entire process, which saves researchers a great deal of time and lets them ask, and answer, more questions about what the cells are doing. Cohen further verified the data by working with Drexel colleague Uri Hershberg, PhD, an associate professor in the School of Biomedical Engineering, Science and Health Systems and the College of Medicine, to check it against 2-D images of the cells.
“It’s some really impressive footage that gives biologists this ability to look deeper and deeper into live cells and see things they’ve never seen before — like six different organelles in a living cell in true 3-D,” said Cohen, a professor in Drexel’s College of Engineering. “But it’s also a lot of work to begin quantifying what they’re seeing — and that’s where we can help, by using our program to automate big portions of that process and glean valuable data from it.”
Using the new technology to simultaneously look at six sets of organelles, Lippincott-Schwartz’s teams at Janelia and at the National Institutes of Health are making exciting new observations. They are looking at how the organelles distribute themselves inside the cell, how often they interact with each other and where, when and how fast they move during various times in the cell’s lifecycle.
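Measurements like contact frequency and speed fall out of the tracked positions fairly directly. As a rough sketch, assuming each organelle has been reduced to one 3-D centroid per frame (the random-walk tracks and the contact cutoff below are invented for demonstration):

```python
# Illustrative only: count "contact" frames between two tracked organelles
# and compute per-frame speed from 3-D centroid tracks. The random-walk
# tracks and the 0.5-unit contact cutoff are invented for demonstration.
import numpy as np

rng = np.random.default_rng(2)
n_frames = 100
track_a = np.cumsum(rng.normal(scale=0.2, size=(n_frames, 3)), axis=0)
track_b = np.cumsum(rng.normal(scale=0.2, size=(n_frames, 3)), axis=0) + 1.0

distances = np.linalg.norm(track_a - track_b, axis=1)   # gap at each frame
contacts = distances < 0.5                              # frames "in contact"
speeds_a = np.linalg.norm(np.diff(track_a, axis=0), axis=1)

print(f"in contact for {contacts.sum()} of {n_frames} frames")
print(f"mean speed of organelle A: {speeds_a.mean():.3f} units/frame")
```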
“One very interesting outcome is that we found the largest organelle in the cell, which is the ER [endoplasmic reticulum], at any particular time point will be occupying about 25 percent of the volume of the cytoplasm, excluding the nucleus. But if you track the way it disperses through the cytoplasm over a short period of time, like 15 minutes, you see that it explores 95 percent of the whole cytoplasm during that time period,” Lippincott-Schwartz said. “We can do this for all of the other organelles at the same time to see how the cytoplasm is being sensed through the dynamic motions of dispersive activities of these organelles.”
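The two numbers quoted above correspond to a simple computation on segmented masks: the fraction of the cytoplasm an organelle occupies in a single frame, versus the fraction covered by the union of its masks over a time window. A toy version, with a synthetic drifting blob standing in for real ER segmentations:

```python
# Toy version of the "explored volume" statistic: instantaneous occupancy
# of the cytoplasm vs. the union of an organelle's masks over time.
# Synthetic drifting-blob masks stand in for real ER segmentations.
import numpy as np

rng = np.random.default_rng(3)
n_frames, shape = 60, (30, 30, 30)
cytoplasm_voxels = np.prod(shape)  # pretend the whole grid is cytoplasm

masks = np.zeros((n_frames,) + shape, dtype=bool)
pos = np.array([5, 5, 5])
for t in range(n_frames):
    pos = np.clip(pos + rng.integers(-2, 3, size=3), 0, 20)  # random drift
    masks[t, pos[0]:pos[0]+10, pos[1]:pos[1]+10, pos[2]:pos[2]+10] = True

instantaneous = masks[0].sum() / cytoplasm_voxels        # one time point
explored = masks.any(axis=0).sum() / cytoplasm_voxels    # union over time

print(f"occupies {instantaneous:.0%} at t=0, explores {explored:.0%} in total")
```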
Observing sub-cellular behavior is just the first application of this technology. Now that it has proven to generate usable data, the team will forge ahead to study what happens inside a cell when it is exposed to drug treatments and other common stresses on the system. The researchers suggest that it could be used to study many more than six types of microscopic objects. And it could help dig even deeper into the building blocks of life — into interactions of RNA particles and other proteins that play a role in a cell’s function and the behavior of diseased cells.
“As these tools continue to improve they will give researchers both a better look at cell behavior and many options for gathering and analyzing that data,” Cohen said. “They will be able to ask and answer increasingly complicated questions and that’s going to lead to some very exciting and important discoveries.”
Read the full paper here: http://www.nature.com/nature/journal/vaop/ncurrent/full/nature22369.html