Congratulations to our Image of Research winners! The Image of Research exhibition is the annual interdisciplinary exhibit competition organized by the Graduate College and University Library to showcase the breadth and diversity of research at UIC.
“Intelligent Welding” I am not your stereotypical welder. I am an M.S. student in the Department of Civil and Materials Engineering conducting research in UIC’s Welding Laboratory under the direction of Professors Ernesto Indacochea and Didem Ozevin. My research combines traditional welding techniques with autonomous welding that avoids defects in real time. It addresses weld quality assurance by developing a combined real-time diagnosis, decision, and control system based on multi-sensor fusion and machine learning methodology. The major technological innovation of the research is a welding machine that can make intelligent decisions in response to process variables, disturbances, and tool deterioration. The picture was taken on February 28, 2017, in the lab. It represents the professionalism of understanding fundamentals and the power of simplicity. Having some expertise in welding has allowed me to excel in my research. It also shows the stark contrast between perception and reality. In the picture, I am focused on every detail while commanding a metal-melting welder held inches from my face as sparks fly. What you can’t see in the picture is that I am a 5’1” tall woman, like the bright arc in the picture.
“Under the Virtual Ice” The search for life on other worlds starts here. The NASA-funded SIMPLE (Sub-ice Investigation of Marine and Planetary-analog Ecosystems) project takes one of the first steps in preparing to search for life under the icy surface of Europa by exploring the waters under the ice-covered lakes of Antarctica. Using multiple virtual reality devices at the Electronic Visualization Laboratory, my research allows a multi-disciplinary team from UIC’s Earth and Environmental Sciences Department, NASA Ames, Stone Aerospace, and Montana State University to virtually recreate an autonomous underwater vehicle’s (AUV) mission based on data collected from numerous sensors during an expedition to the McMurdo ice shelf. The visualization application in the image depicts the underwater ice sheet as cubes derived from sonar scans collected by the AUV. Using the wand controller, researchers can swim through the virtual lakebed at real-life scale, follow the yellow path of the AUV, and view salinity, pressure, conductivity, and oxygen concentrations. The image was created by Lance Long using multiple in-camera exposures: the subject wearing the head-mounted display in front of the visualization on a tiled display wall, and the display wall alone.
“Peripheral visual function: Accessing the unexplored” Our peripheral and central visual systems work synchronously to capture the world we see: while central vision tells us “what” we are looking at, peripheral vision helps us orient ourselves. Loss of peripheral vision would mean loss of edge detection, movement recognition, and ultimately orientation. Unfortunately, the unavailability of tests targeting the periphery makes it impossible to detect and monitor neurodegenerative diseases that first manifest in these regions. My research focuses on the design and development of a diagnostic testing system that targets the peripheral visual field. PeriStim™ is a novel three-dimensional hemispherical pattern stimulus source capable of detecting functional changes in the eye before any significant peripheral cell loss is seen. This can help in early disease detection, assessment, prognosis, and management. Image courtesy: Neural Engineering Vision Laboratory (NEVL). The image shows a subject undergoing the PeriStim™ test: the subject sits at a fixed distance, with a recording electrode placed in the eye. A checkered pattern stimulus is presented, and the electrode measures the eye’s response to this stimulus. The recorded response is called the peripheral pattern electroretinogram (ppERG).