Very high magnification view of cilia beating in real time. The video was processed with noise reduction and contrast enhancement to make this motion visible. Differential interference contrast light microscopy formed the primary image, which was relayed at 400x magnification to a Newvicon tube camera. The cells are human airway epithelial cell cultures obtained from the UNC Cystic Fibrosis Center and grown in our laboratory. Each cilium is about 7 microns long, and cilia can beat at up to 30 times per second, although those shown here beat much more slowly.
Category: Media Gallery
Very low magnification video of cultured ciliated airway cells grown in the CISMM cell culture facility. The cilia beat in a coordinated manner that moves the media, suspended particles, and mucus secreted by the cells in a circle around the culture well. These rotating flows, first demonstrated by Hirotoshi Matsui and colleagues at the UNC Cystic Fibrosis Center, are the “hurricanes” we use to find cultures with good ciliary function.
A 2.8-micron diameter bead attached to slowly beating cilia is tracked by the laser tracking and position feedback systems of the 3DFM. For the first two cycles, the bead is not tracked; it and the other two beads near the bottom of the image move as the cilia beat. Once the tracking feedback engages, the background moves while the bead stays fixed in the center of the “+” sign. Note that the point on the bead that is tracked is slightly out of the focal plane of the optical microscope, so the three beads become slightly fuzzy while tracking is operating.
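The idea of holding a bead fixed while the background moves can be sketched as a simple feedback loop: measure the bead's offset from the tracker center, then move the stage to cancel it. This is a minimal illustration assuming a plain proportional controller and a sinusoidal ciliary beat; the gains, rates, and motion model are illustrative, not the actual 3DFM control law.

```python
import math

def track(n_steps=200, dt=0.001, gain=0.5):
    """Simulate holding a bead at the tracker center.

    The bead rides on beating cilia (modeled as a 10 Hz sine wave,
    amplitude 0.5 um). Each step the stage moves a fraction of the
    measured error, so the bead stays near the center while the
    stage (the "background") does the moving instead.
    """
    stage = 0.0          # stage offset (um)
    errors = []
    for i in range(n_steps):
        t = i * dt
        bead = 0.5 * math.sin(2 * math.pi * 10 * t)  # ciliary motion
        error = bead - stage                          # bead minus tracker center
        stage += gain * error                         # move stage toward bead
        errors.append(abs(error))
    return max(errors[n_steps // 2:])                 # residual after settling

residual = track()
```

After the loop settles, the residual error is a small fraction of the 0.5-micron beat amplitude, which is the behavior visible in the video when the feedback kicks in.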
Jeremy Cummings using a force-feedback device to control the magnetic drive on the first prototype of the 3D Force Microscope. This drives a magnetic bead in a square.
The nanoManipulator system provides a virtual-reality interface to scanned-probe microscopes, including interactive 3D graphics and force-feedback (haptic) display and control. robinett-nano-movie-small shows a movie of the system in operation.
A technical demonstration of the nanoManipulator using the PHANTOM user interface with the nanoWorkbench.
Ming Ouh-young at UNC designed and built a haptic feedback system to simulate the interaction of a drug molecule with its receptor site in a protein (Brooks, Ouh-Young et al. 1990; Ouh-young 1990). This system, called the Docker, computed the force and torque between the drug and the protein due to electrostatic charges and inter-atomic collisions. These forces were presented to a chemist, pulling the drug toward local energy minima. The task is very similar to that in other “lock and key” applications, where a scientist moves one object and senses collisions with other objects in the environment. The system presented the force and torque vectors both visually and through haptic feedback. Experiments showed that chemists could perform the rigid-body positioning task required to determine the lowest-energy configuration of the drug up to twice as quickly with haptic feedback turned on as with the visual-only representations (Ouh-young 1990). Scientists also reported that they felt they had a better understanding of how the drug fit into the receptor site when they were able to feel the forces.
The Docker application, like other path-planning applications, required the presentation of both force and torque to the user. Because the drug molecule was not a point probe, different portions of it could collide with the protein at the same time. Extricating the drug from a collision sometimes required both translation and twisting. If a chemist were provided with only force (translation) information and no torque (twist) information, they could be led to move the drug in an improper direction.
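The force-and-torque reduction described above can be sketched as follows: sum the pairwise Coulomb forces on each atom of the rigid drug, then reduce them to one net force and one net torque about the drug's center. The charges, positions, and unit constant below are illustrative placeholders, not the Docker's actual force model (which also included inter-atomic collision terms).

```python
import numpy as np

def net_force_and_torque(drug_pos, drug_q, prot_pos, prot_q, k=1.0):
    """Return (force, torque) on a rigid drug from a protein's charges.

    force  = sum of Coulomb forces on every drug atom
    torque = sum of r_i x F_i about the drug's center of geometry,
             which tells the user how the drug wants to twist.
    """
    center = drug_pos.mean(axis=0)
    total_f = np.zeros(3)
    total_t = np.zeros(3)
    for p_i, q_i in zip(drug_pos, drug_q):
        f_i = np.zeros(3)
        for p_j, q_j in zip(prot_pos, prot_q):
            r = p_i - p_j
            d = np.linalg.norm(r)
            f_i += k * q_i * q_j * r / d**3       # Coulomb force on atom i
        total_f += f_i
        total_t += np.cross(p_i - center, f_i)    # lever arm about the center
    return total_f, total_t

# A +/- charge pair on the drug near a single positive protein charge
# produces a nonzero net torque: the molecule wants to rotate, which a
# force-only (translation-only) display could never convey.
drug_pos = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
drug_q = np.array([1.0, -1.0])
prot_pos = np.array([[0.0, 1.0, 0.0]])
prot_q = np.array([1.0])
f, t = net_force_and_torque(drug_pos, drug_q, prot_pos, prot_q)
```

Because the two torque contributions add rather than cancel in this configuration, the twist component is as large as the translation component, illustrating why the Docker had to present both.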