Computer Graphics and Virtual Reality Laboratory

The Computer Graphics and Virtual Reality Laboratory (CGVRL) at Connecticut College (CC) was founded on August 1, 2014 by Professor Lee and five student researchers: Danya Alrawi ’16, Raymond Coti ’16, Jamie Drayton ’17, Matt Rothendler ’15, and Will Stoddard ’17. CGVRL is an interdisciplinary research laboratory whose mission is to conduct research and development in visualization techniques and to apply these techniques to enhance our computing experience. Our research focuses on real-time interactive graphics, information visualization, immersive 3D virtual environments, user interaction/experience design, affective computing, physical computing, and other collaborative research. The sections below list our research students and some of our recent research projects. If you are interested in any of our current or past research, or have a new idea that fits our research interests, please feel free to contact me via email anytime.

Interests

  • Computer Graphics
  • Virtual Reality
  • Virtual Human
  • Physical Computing
  • Video Games

Research Students

Research Projects

  • A mobile phone intervention using a relational human talking avatar to promote multiple stages of the HIV Care Continuum in African American MSM

    Mark Dworkin (University of Illinois at Chicago) and James Lee

    NIH funded project (1R01MH116721-01A1, 2019-2023)

    Student researchers: Josh Gorin ’20, Julia Rossiter ’21, Lauren Cerino ’21, and Szymon Wozniak ’21

    HIV-positive African American men who have sex with men (AAMSM) have the lowest percentage of retention in care and are less likely to have viral suppression, an outcome that relies on antiretroviral therapy (ART) adherence. This proposal focuses on an innovative theory-driven intervention aimed at helping to improve outcomes for AAMSM, targeting three stages of the HIV Care Continuum: (1) retention, (2) adherence to antiretroviral medication, and (3) viral suppression. My Personal Health Guide is an innovative talking relational human Avatar mobile phone application to engage HIV-positive AAMSM in adherence and retention in care. Development of this app was informed by the Information Motivation Behavioral Skills Model, which focuses on feedback between information and motivation that affects one’s behavioral skills, behaviors, and desired health outcomes. In the privacy of the user’s home, or anywhere they have their phone, the Avatar can encourage healthy behavior, acknowledge stigma and speak with empathy, audibly teach persons with low literacy, employ credible culturally appropriate phrasing, and invite the user to hear advice and motivational stories of other HIV-positive people and their caregivers. A pilot study in HIV-positive AAMSM ages 18-34 years demonstrated acceptability, enthusiasm for the app, and preliminary efficacy. As part of a collaboration between UIC, Emory University, and the University of Mississippi Medical Center, we propose to test the efficacy of the My Personal Health Guide Avatar application for young HIV-positive AAMSM. In this 5-year study, the application will be refined based on pilot data (Aim 1), and then 250 HIV-positive AAMSM between the ages of 18 and 34 years with a detectable viral load at baseline will be randomized to either the My Personal Health Guide Avatar application or a food safety Avatar application control intervention for a 6-month period (Aim 2).
Wirelessly monitored ART adherence will be collected for 1 month at baseline, and then wirelessly monitored ART adherence, viral load, and clinic appointment data will be collected throughout the 6-month follow-up period. We hypothesize that participants in the My Personal Health Guide intervention will demonstrate significant improvements in ART adherence, viral load, and retention in care during the follow-up period compared to control participants. We will also identify mobile phone application functions that are associated with improvement in adherence in order to inform refinement of the application (Aim 3). We hypothesize that more frequent use of Avatar information functions that include motivational messages will be associated with improved ART adherence. This proposal is innovative in that it launches a new direction in prevention and treatment research for young AAMSM, a population at increased risk for therapeutic failure and poor retention in care, by using a relational talking human Avatar in a mobile phone to overcome the impact of stigma and literacy on engagement with clinical care. If the My Personal Health Guide intervention is demonstrated to be effective, it may be quickly scaled up for wide-scale dissemination.

  • Posture Portrait Project 2

    James Lee, Tyler Silbey '21, Bazeed Shahzad '24

    Exploring the practice of “posture portraits”, a series of semi-nude photographs taken of college students from the 1920s to the 1960s, this project aims to educate audiences on this problematic practice and tell the story of the posture portraits through a new, modern lens. Through a combination of digital visuals, personal monologues, and physical choreography, an empowering performance took place at Virginia Tech’s Moss Arts Center in October 2021. This performance is now being adapted into a virtual reality experience, which can continue telling the story of the posture portraits while allowing us to study the impact virtual reality has on a user’s experience compared to a standard 3D video game or live performance.

  • Ray-traced Caustics Application

    James Lee, Robert Jensen '22

    The goal of this project is to develop an intuitive ray-tracing tech demo, highlighting previously unattainable caustic effects, that is easily accessible to anyone interested in ray-tracing (RT) technology regardless of their experience with game development. Ultimately, this project strives to stimulate interest in exploring cutting-edge video game hardware and software by putting the user in control of making ray-traced scenes with unlimited creativity.

  • Tuning Avoidance Behavior in Flocking Algorithms

    James Lee, Logan Waien '22

    This research focuses on implementing a flocking algorithm within an environment and tuning its parameters to produce an organic effect. In doing so, an issue became obvious: there is no calculated way to measure the influence of a behavior on the flock. Although we can see the strength of each behavior within the algorithm settings, the way in which it alters the flock is not clear. Does it make the flock more erratic? Less smooth? Completely random? By measuring the average amount each agent in the flock steers to correct itself, we were able to answer questions like these. Taking this a step further, by removing each behavior entirely, one by one, we were able to isolate its effect on the flock. Using this information, it is now possible to take a measured approach when tuning the weights of flock behaviors.
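    The steering metric described above can be sketched in a few lines. The following is an illustrative Python reference, not the project's actual game-engine code; the three classic behaviors, their weights, and all helper names are assumptions.

```python
import math

def steering_forces(agents, w_cohesion=1.0, w_alignment=1.0, w_separation=1.5):
    """Return each agent's steering vector as a weighted sum of the three
    classic flocking behaviors (Reynolds-style boids). Agents are
    (x, y, vx, vy) tuples; weights are illustrative."""
    forces = []
    for i, (px, py, vx, vy) in enumerate(agents):
        others = [a for j, a in enumerate(agents) if j != i]
        n = len(others)
        # Cohesion: steer toward the neighbors' center of mass.
        cx = sum(a[0] for a in others) / n - px
        cy = sum(a[1] for a in others) / n - py
        # Alignment: steer toward the neighbors' average velocity.
        ax = sum(a[2] for a in others) / n - vx
        ay = sum(a[3] for a in others) / n - vy
        # Separation: steer away from neighbors, weighted by inverse distance.
        sx = sy = 0.0
        for ox, oy, _, _ in others:
            dx, dy = px - ox, py - oy
            d2 = dx * dx + dy * dy or 1e-9
            sx += dx / d2
            sy += dy / d2
        forces.append((w_cohesion * cx + w_alignment * ax + w_separation * sx,
                       w_cohesion * cy + w_alignment * ay + w_separation * sy))
    return forces

def mean_correction(agents, **weights):
    """Average steering magnitude across the flock: the 'influence' metric."""
    forces = steering_forces(agents, **weights)
    return sum(math.hypot(fx, fy) for fx, fy in forces) / len(forces)

# Example flock of three agents: (x, y, vx, vy) each.
flock = [(0.0, 0.0, 1.0, 0.0), (2.0, 1.0, 0.5, 0.5), (1.0, 3.0, -0.5, 1.0)]
full = mean_correction(flock)
# Zeroing one behavior's weight isolates its contribution to the metric.
no_separation = mean_correction(flock, w_separation=0.0)
```

Comparing `full` against the metric with each weight zeroed in turn gives a per-behavior influence measurement of the kind the project describes.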

  • Optimizing Mobile Interfaces for Older Adults

    James Lee, Lauren Cerino '21

    Technology has the potential to greatly improve the quality of life for older adults by solving some of their common problems, such as isolation and health decline. However, older adults are often excluded from UI/UX research, resulting in interfaces that fail to accommodate their unique needs. To work towards a solution to this problem, I reviewed existing literature to identify older adults’ unique needs and determine what should go into the UI/UX design process in order to cater to those needs. Then, I demonstrated my findings through the analysis (and, in Spring 2021, eventual prototyping) of existing mobile electronic health record (EHR), or patient portal, applications.

  • The Adventures of Captain Clean

    James Lee, Katy Pelletier '21

    The Adventures of Captain Clean is an educational video game focusing on COVID-19 safety precautions. The target audience is tentatively 4-6 year olds, though this is still being determined. The game was created in Unity 2D, using Adobe Sketch and Visual Studio to bring it all together. Within the game, Captain Clean follows the user in an effort to turn public health safety precautions into healthy habits for kids. These habits include washing hands, using hand sanitizer, and wearing a mask. Later on, I hope to test with different age groups to get feedback from all ages and also add lip animations to make the game more engaging and interactive.

  • Virtual Reality Tool Simulates Elevator Experience

    James Lee, Kylie Wilkes '20

    Elevators are an extremely useful tool for navigating tall skyscrapers and for ensuring that people with disabilities are able to get around independently. Although people benefit from the use of elevators, claustrophobia is a major issue in some cases. This fear alone can lead people to cancel job interviews held on high floors, or prevent people with disabilities from getting where they need to go. To alleviate these problems, I have been developing a virtual reality (VR) tool to educate people about elevators and simulate the experience of being in an elevator. The tool is immersive and incorporates both the visual and auditory sensations that users encounter while in an elevator. This VR tool not only educates users about elevators but also allows them to virtually experience what it is like to be in one. More details in the report.

  • Study Abroad Web Application for the Walter Commons

    James Lee, Julia Dearden '19, and Jess Quint '19

    In this research, we are implementing an interactive web application for the Walter Commons. At Connecticut College, the majority of students study abroad; in partnership with Professor Lee and the Office of Study Away, we are creating a new application that allows students to easily search through all study abroad programs that are supported by the college. The application will be implemented for the touch screen in the Walter Commons. We are centering this application around user experience/user interaction pipeline methods as a means of creating the most efficient and effective application.
    This research is a continuation of the Global Commons Interactive Map project by Amanda Yacos ’18 and Walter Florio ’18.

  • May I help you? An avatar health concierge for HIV-infected African American MSM

    Mark Dworkin (University of Illinois at Chicago) and James Lee

    NIH funded project (1R21NR016420-01)

    This research project aims to improve the proportion of HIV-infected persons engaged in care by developing a theory-based Avatar mobile phone intervention that engages young HIV-positive African American men who have sex with men (AAMSM) in three of the five stages of care: (1) retention, (2) HAART adherence, and (3) viral suppression. The intervention will maximize the likelihood of compliance with healthy behavior leading to both patient benefits (decreased morbidity, mortality and resistant virus) and population benefits (decreased HIV transmission). The intervention draws on the Information Motivation Behavioral Skills Model that focuses on feedback between information and motivation that affect one’s behavioral skills, behaviors, and desired health outcomes. The Avatar will encourage interaction with information and functions that promote engagement with the HIV Care Continuum, provide fundamental HIV information, provide motivating statements, facilitate interaction with healthcare, visualize laboratory results, and encourage, explain, and even illustrate relevant behavioral skills. (Video Documentation)

  • Recreating The Posture Portraits: Artistic and technological (re)productions of the gendered (re)presentations of bodies at Connecticut College: Past, present and future

    Andrea Baldwin, Heidi Henderson, and James Lee

    The project aims to explore a thought-provoking practice that took place at the College from the late 1920s to the 1960s. Archival evidence shows that students enrolling at Connecticut College as freshmen were required to take posture portraits. Preliminary research in the Connecticut College News student newspaper from 1927 to 1956 indicates that these portraits may have been required for several reasons, including admission to the college and students’ health records. We created an interactive application and performance and presented our work at the CAT biennial symposium 2018. More details.

  • Interactive Visualization Wall

    James Lee, Danya Alrawi '16, Raymond Coti '16, Jamie Drayton '17, Matt Rothendler '15, and Will Stoddard '17

    The Visualization Wall is a large display system created to support high-resolution visualization. The wall’s size and functionality allow users to display and interact with complex information within an immersive 2D and 3D visual environment. The Visualization Wall is the first project carried out by the newly established CGVRL during Summer 2014. More details.

  • Large-Scale Particle Simulation on the GPU

    James Lee, Rishma Mendhekar '18, and Isaih Porter '18

    This research examines the increased performance of large-scale particle simulation on the Graphics Processing Unit (GPU) against a conventional implementation on the CPU. In the first semester, we developed a particle simulation program using a Compute Shader on the GPU to calculate particle motion with a 3D Simplex noise algorithm. The current implementation shows around 60 frames per second (FPS) in 4K resolution for about 8 million particles of a point primitive type as well as a quad sprite model. The performance gain over the equivalent version on the CPU is about a 200x speedup in frame rate. Both the CPU and GPU versions of the program were created using C# and HLSL with Unity and DirectX. We deployed this program for the art installation of The Posture Portrait Project at Connecticut College to achieve an image-dissolving visual effect where each particle is generated from image pixels. We have also implemented a boids flocking algorithm on the GPU, also using C# and HLSL. This particle motion requires significantly more computation than noise-based motion, as each particle needs to be aware of every other particle’s location. We ended up with three versions of boids flocking with differing levels of CPU/GPU involvement: one version where boids were rendered as GameObjects, one as sprites, and one as polygons. The GPU used for testing both particle systems was an Nvidia GeForce GTX 1080 and the CPU an Intel Core i7-6700k @ 4GHz. This research work was presented at Consortium for Computing Sciences in Colleges – Northeastern Region (CCSCNE), April 2018 (Undergraduate Poster Exhibit and Research session). More details in the report. Boids Simulation Video
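    As a rough CPU-side reference for the compute-shader approach, the sketch below advects particles through a smooth 3D noise field. This is illustrative Python using a simple hash-based value noise as a stand-in for the project's actual 3D Simplex noise in HLSL; all names and constants are assumptions.

```python
import math

def _hash(ix, iy, iz):
    # Deterministic integer hash mapped to [0, 1); a stand-in for a
    # proper gradient/permutation table.
    h = (ix * 374761393 + iy * 668265263 + iz * 2147483647) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h & 0xFFFF) / 65536.0

def value_noise3(x, y, z):
    """Smooth 3D value noise via trilinear interpolation of lattice hashes."""
    ix, iy, iz = math.floor(x), math.floor(y), math.floor(z)
    fx, fy, fz = x - ix, y - iy, z - iz
    # Smoothstep fade so the field is continuous across cell boundaries.
    fx, fy, fz = (f * f * (3 - 2 * f) for f in (fx, fy, fz))
    def lerp(a, b, t):
        return a + (b - a) * t
    c = [[[_hash(ix + dx, iy + dy, iz + dz) for dz in (0, 1)]
          for dy in (0, 1)] for dx in (0, 1)]
    return lerp(lerp(lerp(c[0][0][0], c[0][0][1], fz),
                     lerp(c[0][1][0], c[0][1][1], fz), fy),
                lerp(lerp(c[1][0][0], c[1][0][1], fz),
                     lerp(c[1][1][0], c[1][1][1], fz), fy), fx)

def step_particles(particles, t, speed=0.5, dt=0.016):
    """Advect each particle by a velocity sampled from the noise field.
    On the GPU, this loop body becomes one compute-shader thread per
    particle, which is where the ~200x speedup comes from."""
    out = []
    for x, y, z in particles:
        vx = value_noise3(x, y + t, z) - 0.5
        vy = value_noise3(x + 31.7, y, z + t) - 0.5
        vz = value_noise3(x + t, y + 17.3, z) - 0.5
        out.append((x + speed * vx * dt,
                    y + speed * vy * dt,
                    z + speed * vz * dt))
    return out

pts = step_particles([(0.1, 0.2, 0.3), (1.5, 2.5, 3.5)], t=1.0)
```

Because every particle's update is independent, the same loop maps directly onto one GPU thread per particle.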

  • Weather Sensitive Smart Stylist

    James Lee, Gary Parker and Alex Bukovac '18

    This research develops a user-focused application for clothing choices. The goal is to create a custom system that will recommend clothing items from your wardrobe. Initially, a rule-based relationship between weather and clothing was made for recommending a complete outfit. The baseline rules for the general user were deduced by surveying a group of 100 people. Once the general rules were established, we wanted to explore the possibility of adapting to a specific user. To create a “smart stylist” and enhance the user experience, we asked the user to give feedback about the recommended outfit. This way the system could learn and adjust with machine learning. We used case-based reasoning to compare a given recommendation to the user’s previous responses to similar recommendations in order to better recommend an outfit. This adaptive rule-based system was inspired by Haosha Wang’s paper titled “Machine Fashion: An Artificial Intelligence Based Clothing Fashion Stylist”. In that study, the user input their style preferences and the program recommended an outfit for them. We wanted to take a different approach by asking for the user’s thoughts post-recommendation. This research work was presented at Consortium for Computing Sciences in Colleges – Northeastern Region (CCSCNE), April 2018 (Undergraduate Poster Exhibit and Research session). More details in the report.
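    The two layers described above, baseline weather rules plus case-based adaptation to user feedback, can be sketched as follows. This is an illustrative Python sketch; the thresholds, outfit lists, and similarity measure are assumptions, not the system's actual survey-derived rules.

```python
def baseline_rule(temp_f, raining):
    """Baseline weather-to-outfit rules (illustrative thresholds)."""
    if temp_f < 40:
        outfit = ["winter coat", "pants", "boots"]
    elif temp_f < 65:
        outfit = ["light jacket", "pants", "sneakers"]
    else:
        outfit = ["t-shirt", "shorts", "sneakers"]
    if raining:
        outfit.append("umbrella")
    return outfit

class SmartStylist:
    """Case-based reasoning layer: compare today's conditions to past
    cases and reuse the outfit the user liked for the most similar
    conditions, falling back to the baseline rules otherwise."""
    def __init__(self):
        self.cases = []  # (temp_f, raining, outfit, liked)

    def recommend(self, temp_f, raining):
        liked = [c for c in self.cases if c[3]]
        if liked:
            # Nearest past case by temperature, with a large penalty
            # for a rain mismatch (an assumed similarity measure).
            best = min(liked,
                       key=lambda c: abs(c[0] - temp_f) + 50 * (c[1] != raining))
            if abs(best[0] - temp_f) < 10 and best[1] == raining:
                return best[2]
        return baseline_rule(temp_f, raining)

    def feedback(self, temp_f, raining, outfit, liked):
        """Post-recommendation feedback stored as a new case."""
        self.cases.append((temp_f, raining, outfit, liked))

stylist = SmartStylist()
first = stylist.recommend(50, raining=False)   # no cases yet: rules apply
stylist.feedback(50, False, ["sweater", "jeans", "boots"], liked=True)
second = stylist.recommend(48, raining=False)  # reuses the liked case
```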

  • What are we looking at? An interactive 360° Video Viewing Experience

    James Lee and Yi Xie '18

    While virtual reality and 360-degree video are getting more attention from individual users and the entertainment industry, how these immersive viewing environments can enhance narration has not yet been fully explored. This research aims to develop a 360° video streaming environment that tracks and analyzes viewers’ navigation patterns and also gives future viewers access to these data as a viewing guide. The streaming environment was developed in Unity to record user behavior data, and we conducted a user study with 34 participants using two different 360° testing videos. Through data analysis, we identified meaningful viewing patterns in our sample data in relation to the video genre and elements in the video. The second part of the research is to use effective visual representations to display past viewing data. Past viewers’ viewing positions are visualized on an x-axis bar at the bottom. While the video is playing, the user can now see the most popular viewing spots in the video at the given moment. By making the data accessible to a future audience, this 360° streaming environment aims to foster a more engaging social watching experience. This research work was presented at Consortium for Computing Sciences in Colleges – Northeastern Region (CCSCNE), April 2018 (Undergraduate Poster Exhibit and Research session). More details in the report.
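    The x-axis viewing bar can be sketched as a simple binning of past viewers' horizontal viewing angles at the current timestamp. This is an illustrative Python sketch (the actual environment was built in Unity); the bin count and normalization are assumptions.

```python
def popularity_bar(yaw_samples, n_bins=12):
    """Bin past viewers' horizontal viewing angles (yaw, in degrees) at
    one video timestamp into n_bins segments of a 360-degree x-axis bar,
    normalized so the most popular segment equals 1.0."""
    counts = [0] * n_bins
    width = 360.0 / n_bins
    for yaw in yaw_samples:
        counts[int((yaw % 360.0) // width)] += 1
    peak = max(counts) or 1  # avoid division by zero with no samples
    return [c / peak for c in counts]

# Yaw angles recorded from past viewers at one moment of the video.
bar = popularity_bar([10, 15, 20, 95, 100, 350])
hot_bin = bar.index(1.0)   # the most-watched direction at this timestamp
```

Recomputing the bar as playback advances gives future viewers the "most popular viewing spots at the given moment" overlay the project describes.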

  • Global Commons Interactive Map

    James Lee, Amanda Yacos '18, and Walter Florio '18

    Our research involved the creation of an interactive map application to be used in the kiosk of the Walter Commons. This map would be used by current and prospective students to gain a global perspective and a retrospective look at the experiences of fellow Camels. To do so, we created a heatmap and a detailed map that combine empirical and qualitative data to describe the spread of students who studied globally. We studied the density of students and were able to create a working database describing where students are primarily concentrated across the world. In addition, we collected information about students’ study abroad experiences to be used to make “Student Profiles”, using pictures and quotes to describe their academic and recreational experiences abroad. As a team, we worked together to provide a more holistic perspective and allow easy access to this information in an aesthetically pleasing presentation.

  • Effects of HUD Presence on Cybersickness

    James Lee and Nikolas Burks '17

    Cybersickness is a problem that severely limits virtual reality. One method of understanding why virtual environments generate cybersickness is sensory conflict theory, which states that symptoms are generated from a conflict between the visual and vestibular motion frames. To reduce the symptoms of cybersickness under the assumption of sensory conflict theory, we propose to overlay a HUD onto a virtual environment. The thought is that the HUD will provide a stationary element to the visual frame that agrees with the vestibular frame, thus reducing symptoms. Also proposed is a way to overlay a HUD without detracting from the user’s enjoyment of the virtual environment: varying the presence of the HUD in proportion to perceived motion. To test this, a study with 9 participants was conducted, divided into three groups: no HUD, a minimum-presence HUD, and a dynamic HUD. Results indicate that the HUD does reduce the symptoms of cybersickness that elicit nausea, but does not affect oculomotor symptoms, with the dynamic HUD reducing symptoms more than the minimum HUD. More details in the report.
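    The dynamic HUD idea, varying HUD presence in proportion to perceived motion, can be sketched as a simple opacity ramp. This is an illustrative Python sketch; the threshold, ramp, and floor values are assumptions, not the study's actual parameters.

```python
def hud_opacity(angular_speed, threshold=5.0, full_speed=60.0, floor=0.1):
    """Scale HUD presence with perceived motion (view rotation speed in
    degrees per second): nearly invisible when the view is still, fully
    opaque during fast rotation. All constants are illustrative."""
    if angular_speed <= threshold:
        return floor                       # minimum-presence HUD when static
    t = (angular_speed - threshold) / (full_speed - threshold)
    return floor + (1.0 - floor) * min(t, 1.0)

still = hud_opacity(0.0)     # barely visible: preserves enjoyment
turning = hud_opacity(60.0)  # fully opaque: strong stationary reference
```

When the visual frame moves fastest (and the sensory conflict is largest), the stationary HUD element is strongest; when the scene is still, the HUD fades so it does not intrude.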

  • Virtual Reality for Existing Structures

    James Lee and Jessica Napolitano '17

    Through time, war, and other phenomena, pieces from the history of our evolution have been completely erased. Archaeologists are working to combat this by excavating sites and designing 3D models to ensure their preservation. However, all of these models are isolated and non-uniform, making it difficult to extrapolate a bigger picture of how cities interacted as well as hindering collaboration between research teams. Thus there is a great need for a central repository that encompasses all archaeological sites. Koller, Frischer, and Humphrey confirmed this evaluation with a survey concluding that 90% of surveyed researchers in the field felt that there was a need for one. Thus, this research is centered on how to build such a central repository. Throughout the year, a repository was developed using Unity, incorporating security, speed, the ability to change time and location, and the seamless infusion of metadata with the 3D models. Furthermore, the application is compatible with both desktops and virtual reality. Virtual reality can bring the repository to life with intuitive navigation and a better perception of scale. More details in the report. (Video Documentation)

  • Computer Science Department Interactive Display Board

    James Lee and Jamie Drayton '17

    The goal of our research is to present the accomplishments of Computer Science majors at Connecticut College to prospective students and prospective majors. The system is designed for a 55″ interactive display board that will be installed on the 2nd floor of New London Hall (in a hallway that all tours pass by). The system for the display is created with a focus on usability and the featured projects that students have pursued in recent years. After creating a proof of concept, two Art majors (Greg Montenegro ’18 and Alana Wimer ’18) joined the project to assist with usability and graphic design. The cross-disciplinary nature of this project across the Art and Computer Science departments drastically improved the result of the Interactive Display Board. This project exemplifies the value of a liberal arts approach. The system was designed in the Unity Game Design Engine with the NGUI and Universal Media Player add-on packages. More details in the report. (Video Documentation)

  • Attentional Costs of Different Notification Types

    James Lee and Alyssa Klein '16

    Users can set privacy barriers so that they see only that someone sent a message, not the actual message, or so that they see a partial message; users can also set various sounds and combinations of vibration and sound. It can be overwhelming to have so many choices. This study aims to better understand how the way a notification is presented to the user impacts their desire to switch tasks, and more specifically whether seeing a preview of the message is more compelling than not seeing the message at all. By learning the how behind text-based notifications through observational methods, coupled with previous research on our mental and emotional needs for connection, a greater understanding and framework for engaging users with disruptions can be developed, moving beyond the basic ways of catching the user’s attention toward thoughtful and intentional decisions that better shape future interactions with distracting media. More details in the report.

  • Engaging Students with Adaptive Tutoring (ESAT)

    James Lee and Phil Winchester '16

    A growing issue in computer science is introducing children to the necessary skills so they can discover whether they are interested in studying the topic. There are many tools out there that let kids play games built around computer science puzzles. ESAT is one of those tools, but with a twist. The puzzles are similar to many other systems, but behind the games there is an adaptive system to keep students learning as well as they can. An adaptive system is designed to learn how the student learns and target tips, puzzles, and tutorials to be more personal and efficient. My system focuses on understanding the student’s zone of proximal development (ZPD) and using this to gauge whether they are learning above or below average. More details in the report.
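    The ZPD-based adaptation can be sketched as a simple difficulty controller driven by a student's recent success rate. This is an illustrative Python sketch; the thresholds and step sizes are assumptions, not ESAT's actual model.

```python
def next_difficulty(current, recent_results, zpd_low=0.6, zpd_high=0.85):
    """Adjust puzzle difficulty to keep the student inside an assumed
    zone of proximal development (ZPD), given recent results
    (1 = solved, 0 = failed). Threshold band values are illustrative."""
    if not recent_results:
        return current
    success = sum(recent_results) / len(recent_results)
    if success > zpd_high:
        return current + 1           # learning above average: raise the bar
    if success < zpd_low:
        return max(1, current - 1)   # struggling: scaffold with easier puzzles
    return current                   # inside the ZPD band: hold steady

harder = next_difficulty(3, [1, 1, 1, 1, 1])  # consistent success
easier = next_difficulty(3, [0, 0, 1, 0, 0])  # mostly failing
steady = next_difficulty(3, [1, 1, 0, 1])     # inside the band
```

Keeping success inside the band means puzzles stay just beyond what the student can do unaided, which is the intuition behind targeting the ZPD.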

  • Immersive Sports Biomechanics Tool

    James Lee and Raymond Coti '16

    The best professional athletes are able to process countless pieces of information each second in order to maximize their performance. There have been many studies evaluating the extent to which our brains are able to detect information embedded in biological motion. The field of sports analytics utilizes video analysis and motion capture systems in order to determine the optimal techniques for sports. The main issue with these systems is that they are expensive and require an expert to help turn the data into tangible information. For this research, we will be utilizing the Microsoft Kinect and Unity 3D to create a tool for learning Tae Kwon Do techniques that addresses the issues of expense and expertise. We propose doing so by having an immersive training program with gesture recognition working in conjunction with real-time joint analysis. More details in the report.

  • Interactive Campus Activity Map (ICAM)

    James Lee, Brion Morrissey-Bickerton '17, and Yi Xie '18

    College students are always pressed for time. We are trying to get to our classes, organize club events, study, and feed ourselves, all while managing a personal life. What if we knew beforehand when the library was going to be crowded, so that we could plan around that and study for longer or print out a paper without being late to class? Our idea is to create a real-time, interactive map of Connecticut College. This will allow students and faculty to see what areas are most likely to be busy depending on the time of day, the month, the weather, and numerous other factors. Faculty and clubs will be able to better target students with messages and announcements; coordinators will be able to organize events without overcrowding a building; students will be able to pick up their mail or go to the gym without an excessive wait time. The project ultimately aims to accurately predict students’ actions by incorporating students’ fixed class schedules and an algorithm that determines what students will do with their free time based on a survey. The real-time map application utilizes a JavaScript library, D3.js, to visually represent the whole campus population (about 2,000 people) and to create versatile designs that accommodate a wide range of users with different preferences.

  • Real-time Virtual Window Simulation

    James Lee, Chris Giri '15, and Alexandra Howell '18

    The Virtual Window project uses real-time manipulation of a live video feed, driven by user head-tracking data, to provide an interactive experience similar to looking out a window. OpenGL shaders are used to correct for barrel distortion in a wide-angle camera lens. We were able to achieve more than 100 times faster calculation than a CPU-based implementation. User positioning is done using the Kinect v2. Innovative approaches like multithreading and dynamic thresholding are used to reliably track users. Ultimately, the project seeks to consider the human effects of interactive display technology in ambient settings. This research project received a Research Matters Award in January 2015 ($2,000). More details in the report.
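    Barrel-distortion correction of this kind can be sketched as a radial remapping of texture coordinates. The following is an illustrative Python version of a one-term Brown radial model, performed per pixel; a fragment shader would apply the same remap on the GPU. The coefficient k1 is an assumed example value, not the project's calibrated lens parameter.

```python
def undistort(u, v, k1=-0.25, cx=0.5, cy=0.5):
    """Map an output pixel's normalized coordinates (u, v) back to the
    distorted source image using a one-term Brown radial model:
        r_src = r * (1 + k1 * r^2)
    (cx, cy) is the assumed optical center; k1 is an illustrative
    lens coefficient."""
    dx, dy = u - cx, v - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2
    return cx + dx * scale, cy + dy * scale

center = undistort(0.5, 0.5)   # the optical center is a fixed point
edge = undistort(1.0, 0.5)     # edge pixels sample inward for negative k1
```

Doing this remap once per pixel in a shader, instead of in a CPU loop over the whole frame, is where a GPU implementation gains its large speedup.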

  • Accessibility of Application Windows on Large Displays

    James Lee and Danya Alrawi '16

    This research addresses the accessibility of open application windows on a large display. The taskbar, being located in a peripheral area, does not work well when using a large display. We propose a design with a user-activated function that shrinks the content of the display down to an area comfortable for the user, so they can see what applications are available on the display. The user can then access the windows with ease: select a window to transfer it from its original location to a location that is easy for the user to see, or remove a window without having to move around the room to find it. This research work was presented at Consortium for Computing Sciences in Colleges – Northeastern Region (CCSCNE), April 2015 (Undergraduate Poster Exhibit and Research session). More details in the report.

  • Expanding the Knowledge Base of an Interactive 3D Avatar

    James Lee and Benjamin Weinstein '15

    Benjamin worked with a 3D virtual avatar and developed its ability to communicate and converse with humans. The research aims to expand the previously developed interactive avatar framework’s capabilities to support more flexible dialog management. Benjamin studied the intersection of Human-Computer Interaction (with human-like features in a computer system) and traditional Chatbot models and systems. More details in the report.

  • Swift: Apple's new Application Development Platform

    James Lee and Philip Winchester '16

    The main goal of Swift is to be a highly readable and writable language. Its syntax draws on the C family of languages, and it is positioned as the spiritual successor to Objective-C. Philip researched Apple’s new Swift language and tested it against other popular programming languages: Python and Java. He also hosted a workshop, Swift Bootcamp, for CS major/minor students. More details in the report.

  • A Virtual Milky Way Galaxy Visualization

    James Lee, Danya Alrawi '16, and Raymond Coti '16

    This project aims to visualize our galaxy, the Milky Way, in both a scientifically precise and visually appealing way. The application is designed to support text, images, 3D models and videos to help users learn about the galaxy. The current version of the application shows an outer model of the Milky Way as well as general descriptions and images that activate based on the user’s point of view.

  • An Empirical User Study for Emerging Consumer-Grade Virtual Reality Display Hardware

    James Lee and Justin Anderson '14

    We modeled a virtual environment that the user can explore using two different virtual reality display devices: an ultra-high-definition 3D TV and a head-mounted display (HMD), the Oculus Rift. An empirical user study was designed to evaluate the two devices by measuring subjects’ performance on a series of tasks. We hypothesize that the HMD-type display will have increased levels of user immersion over the 3D TV and result in an increased ability to carry out simple tasks, as it supports wider peripheral vision than TVs without a great deal of distraction. We also expect some degree of a qualitative drawback from the consumer-level low-resolution screen used in the Oculus Rift. This research work was presented at Consortium for Computing Sciences in Colleges – Northeastern Region (CCSCNE), April 2014 (Undergraduate Poster Exhibit and Research session). More details in the report.