Algorithmic Implementation of Visually Guided Interceptive Actions: Harmonic Ratios and Stimulation Invariants

Wangdo Kim, Duarte Araujo, Moo Young Choi, Albert Vette, Eunice Ortiz

Research output: Contribution to journal › Article › peer-review

Abstract

This research presents a novel algorithmic implementation that improves the analysis of visually controlled interception and the accompanying motor action through the computational application of harmonic ratios and stimulation invariants. Unlike traditional models that focus mainly on psychological aspects, our approach integrates the relevant constructs into a practical mathematical framework. This allows for the dynamic prediction of interception points with improved accuracy and real-time perception–action capabilities, essential for applications in neurorehabilitation and virtual reality. Our methodology uses stimulation invariants as key parameters within a mathematical model to quantitatively predict and improve interception outcomes. The results demonstrate the superior performance of our algorithms over conventional methods, confirming their potential for advancing robotic vision systems and adaptive virtual environments. By translating complex theories of visual perception into algorithmic solutions, this study provides innovative ways to improve motion perception and interactive systems. It also aims to articulate the interplay of geometry, perception, and technology in understanding and utilizing cross ratios at infinity, emphasizing their practical applications in virtual and augmented reality settings.
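As a concrete illustration of the projective quantities the abstract invokes, the sketch below (a minimal Python example, not the authors' implementation) computes the cross ratio of four collinear points and tests for the harmonic case, where the cross ratio equals -1. The scalar parametrisation of the points along their common line and the numerical tolerance are illustrative assumptions.

    # Minimal sketch: cross ratio of four collinear points and a harmonic-range test.
    # Scalar coordinates along the line and the tolerance are illustrative choices.

    def cross_ratio(a: float, b: float, c: float, d: float) -> float:
        """Cross ratio (a, b; c, d) of four collinear points given by scalar
        coordinates along their common line."""
        return ((c - a) * (d - b)) / ((c - b) * (d - a))

    def is_harmonic(a: float, b: float, c: float, d: float, tol: float = 1e-6) -> bool:
        """True when (a, b; c, d) forms a harmonic range, i.e. the cross ratio is -1."""
        return abs(cross_ratio(a, b, c, d) + 1.0) < tol

    # Example: the points 0 and 2, their midpoint 1, and the point at infinity
    # form a harmonic range; a very large coordinate stands in for infinity here.
    print(cross_ratio(0.0, 2.0, 1.0, 1e9))   # approximately -1.0
    print(is_harmonic(0.0, 2.0, 1.0, 1e9))   # True

Approximating the point at infinity with a very large coordinate shows how the harmonic relation between a segment's endpoints, its midpoint, and the direction at infinity emerges, echoing the "cross ratios at infinity" mentioned in the abstract.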

Original language: English
Article number: 277
Journal: Algorithms
Volume: 17
Issue number: 7
DOIs
State: Published - Jul 2024

Keywords

  • algorithmic implementation of perception
  • dynamic interception
  • harmonic ratios
  • motion perception
  • stimulation invariants
