
Hybrid Colloquium: Developing Disability-First Datasets for Non-Visual Information Access

Abstract – Image descriptions, audio descriptions, and tactile media provide non-visual access to the information contained in visual media. As intelligent systems are increasingly developed to provide non-visual access, questions about the accuracy of these systems arise. In this talk, I will present my efforts to involve people who are blind in the development of information taxonomies and annotated datasets towards more accurate and context-aware visual assistance technologies and tactile media interfaces. https://abigalestangl.com/

Hybrid Colloquium: Measuring featural attention using fMRI and MEG

Abstract – Attention to low-level visual features can alter the activity of neurons in visual cortex. In principle, we expect attention to select the neurons most sensitive to changes in the attended feature. This means that the precise nature of the neuronal responses may depend on both task and stimulus. Here, we used magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) to examine modulations of neural activity in visual cortex driven by both stimulus and task. We presented sequences of achromatic radial frequency pattern targets (200 ms; ISI randomized from 1800-2000 ms) with occasional small ‘probe’ changes in their contrast, shape, and orientation. Probe types were randomized and independent, and subjects were cued to attend to specific probe types in blocks of 24 s. Responses from 15 subjects (9 F) were recorded in separate fMRI and MEG experiments. Support vector machines were used to decode MEG sensor-space data at 5 ms intervals and fMRI voxel-wise responses from retinotopically defined regions of interest. We show that both attentional state and target events cause changes in ongoing neuronal activity, and that we are able to distinguish between different types of low-level featural attention with good temporal and spatial resolution.

Improving Zoom accessibility for people with hearing impairments: People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful but only works if the speaker’s face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device’s camera whenever you are speaking on Zoom, and face the camera while you speak. (Feel free to disable your camera when you aren’t speaking.)
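The time-resolved decoding approach described in this abstract can be sketched as follows. This is an illustrative example only, not the authors' analysis code: the synthetic data, dimensions, and the injected class signal are all assumptions made for the demo; in a real analysis the trials-by-sensors-by-time array would come from epoched MEG recordings.

```python
# Illustrative sketch: train one linear SVM per time point to decode
# attentional state (e.g., attend-contrast vs. attend-shape) from
# MEG sensor-space data, yielding a decoding-accuracy time course.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for epoched MEG data: trials x sensors x time points
# (time points would correspond to the 5 ms sampling intervals).
n_trials, n_sensors, n_times = 120, 64, 40
X = rng.normal(size=(n_trials, n_sensors, n_times))
y = rng.integers(0, 2, size=n_trials)  # attended-feature label per trial

# Inject a weak class-dependent signal in the later time points so the
# decoder has something to find in this toy example.
X[y == 1, :, 20:] += 0.4

def decode_over_time(X, y, cv=5):
    """Cross-validated decoding accuracy at each time point."""
    scores = np.empty(X.shape[2])
    for t in range(X.shape[2]):
        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        scores[t] = cross_val_score(clf, X[:, :, t], y, cv=cv).mean()
    return scores

scores = decode_over_time(X, y)
# Accuracy stays near chance (0.5) early and rises once the
# class-dependent signal appears in the sensors.
```

Fitting an independent classifier per time point is what gives the method its temporal resolution: the accuracy time course shows when, relative to stimulus onset, the attentional state becomes decodable.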

Hybrid Colloquium: Gaze and Gait: Changes in gaze behavior during locomotor learning

Abstract – During walking, people use vision both to create movement plans about future steps and to correct the execution of the current step. However, the importance of these types of visual information changes with the movement ability of the person and the difficulty of the terrain. In this talk, I will present the results from two experiments that explore how visual sampling strategies and visual reliance change with locomotor learning. The first characterizes how visual sampling strategies change as people practice a treadmill-based target stepping task. The second examines how visual reliance changes during the same target stepping task by altering what visual information is available at different points of the locomotor learning process. I will conclude by presenting some preliminary results of these techniques applied to a clinical population, specifically individuals with a concussion. People with a concussion typically exhibit both oculomotor deficits (which would impact what visual information is available) and gait deficits (which often persist beyond the point of recovery when symptoms have returned to baseline). My current work therefore proposes that lingering changes to an individual’s gaze behavior may be causing these persistent gait deficits. https://www.alexcates.com/

Affiliate Senior Scientist

Hybrid Brown Bag: My perspective on Vision and Vision Rehabilitation

Ophthalmology, visual science, and vision rehabilitation have significant areas of overlap. In this presentation, I want to discuss insights into their interaction and stress aspects that are often overlooked. https://yua.jul.mybluehost.me/users/august-colenbrander

Zoom Colloquium: Improving Comics Accessibility for People with Visual Impairments

Abstract: A number of studies have been conducted to improve the accessibility of various types of images on the web (e.g., photos and artworks) for people with visual impairments. However, little work has addressed making comics accessible. As a formative study, we first conducted an online survey with 68 participants who are blind or have low vision. Based on their prior experiences with audiobooks and eBooks, we propose an accessible digital comic book reader for people with visual impairments. An interview study and prototype evaluation with eight participants with visual impairments revealed implications that can further improve the accessibility of comic books for people with visual impairments. We then focused on a specific type of digital comics called webtoons, which are read online and allow readers to leave comments sharing their thoughts on the story. To improve the webtoon reading experience for blind and low-vision (BLV) users, we propose another interactive webtoon reader, Cocomix, that leverages comments in the design of novel webtoon interactions. Since comments can identify story highlights and provide additional context, we designed the system to provide 1) comment-based adaptive descriptions with selective access to details and 2) panel-anchored comments for easy access to relevant descriptive comments. Our evaluation showed that Cocomix users could adapt the descriptions to various needs and make better use of comments. https://hcil-ewha.github.io/homepage/index.html

Hybrid Colloquium: Making Calculus Accessible

Abstract: When Isaac Newton developed calculus in the 1600s, he was trying to tie together math and physics in an intuitive, geometrical way. But over time, math and physics teaching became heavily weighted toward algebra and away from geometrical problem-solving. However, many practicing mathematicians and physicists get their intuition geometrically first and do the algebra later.

Joan Horvath and Rich Cameron’s new book, Make: Calculus, imagines how Newton might have used 3D printed models, LEGO bricks, programming, craft materials, and a dash of electronics to teach calculus concepts intuitively with hands-on models. The book relies on algebra as little as possible while still retaining enough to allow comparison with a traditional curriculum.

The 3D printable models are written in OpenSCAD, the text-based, open-source CAD program. The models are in an open-source repository and are designed to be edited, explored, and customized by teachers and learners. Joan and Rich will also address how they think about the tactile storytelling of their models.

They hope their work will make calculus more accessible, in the broadest sense of the word, to enable more people to start on the road to STEM careers. Make: Calculus is available in a softcover print version, in a PDF/epub3 bundle in which the epub3 with MathML equations has been optimized for screenreaders (Thorium epub3 reader recommended), and in Kindle format. Joan and Rich will talk about some of the technology gaps they encountered trying to keep a book with calculus equations accessible.

Joan Horvath and Rich Cameron are the co-founders of Pasadena-based Nonscriptum LLC, which provides 3D printing and maker tech consulting and training. Their eight previous books include Make: Geometry, which developed a similar repository of models for middle and high-school math in collaboration with the SKI “3Ps” project.
They have also authored popular LinkedIn Learning courses on additive manufacturing, and run several related (currently virtual) Meetup groups.

Special Time: 11:00 AM, Zoom Colloquium: OKO - app that uses computer vision to assist blind and visually impaired people

Abstract – “AYES is a Belgium-based company co-founded by three computer scientists. Our journey started when our visually impaired family friend, Bram, told us about the challenges he faced while navigating outdoors. Quite quickly we realized that current assistive technologies are outdated and could benefit from artificial intelligence. For that reason, we started developing a mobile application, called OKO, that uses the smartphone camera and computer vision to assist people in their daily lives. Crossing the street is a very stressful task if there is no accessible pedestrian signal installed. For that reason, we’ve developed a feature capable of detecting the pedestrian traffic light. So far, we have identified 70,000+ safe crossings. Our goal now is to bring this technology to the USA, since many cities are experiencing difficulties with installing accessible pedestrian signals. In recent months we’ve also added public transport recognition, which identifies the bus number and destination to get people on the right bus. In the future, we’ll add more computer vision-related features to deliver an even better navigation experience.” https://www.f6s.com/michieljanssen

Zoom Brown Bag: Localization for Indoor Navigation using Computer Vision

Abstract – For blind and low-vision individuals, traveling independently can be a challenging endeavor. Current technology and infrastructure make this task far more feasible than ever before, especially with tools such as voice-guided, GPS-based wayfinding apps on a person’s own smartphone. Still, there are gaps in GPS-deprived environments such as indoor locations. We propose a lightweight computer vision and inertial sensor-based wayfinding system, which requires only a 2D floor plan and a few snapshots of signs or other visual landmarks around the area. I will present a method of indoor localization to determine a person’s location in the environment, to be used as part of a smartphone navigation app that can provide turn-by-turn directions. https://yua.jul.mybluehost.me/directory/ryan-crabb

Zoom Colloquium: On the interaction between body movements and cognition

Abstract: Cognitive processes are almost exclusively investigated in settings in which voluntary body movements are largely suppressed. However, even basic sensory processes can differ drastically between movement states. My special interest therefore lies in the naturally behaving system. We investigate the interaction between cognition, oscillatory brain activity, and body movements in freely moving humans through the application of various mobile approaches. Within this scope, we ask how walking influences attentional visual processes, auditory perception, and creativity. Concerning smaller movements, as well as the interaction between different types of movements, we focus on eye-related movements such as spontaneous eyeblinks, saccades, and pupil size. Our work shows complementary neurophysiological and behavioral evidence of the importance of movement and movement state when considering simple as well as complex cognitive processes. https://www.researchgate.net/profile/Barbara-Haendel