Perception-Driven Soft-Edge Occlusion for Optical See-Through Head-Mounted Displays (2025)

Xiaodan Hu, Yan Zhang, Alexander Plopski, Yuta Itoh, Monica Perusquia-Hernandez, Naoya Isoyama, Hideaki Uchiyama, and Kiyoshi Kiyokawa

Systems with occlusion capabilities, such as those used in vision augmentation, image processing, and optical see-through head-mounted displays (OST-HMDs), have gained popularity. Achieving precise (hard-edge) occlusion in these systems is challenging and often requires complex optical designs and bulky form factors. A single transparent liquid crystal display (LCD), by contrast, is a simple way to create occlusion masks, but the generated mask appears defocused (soft-edge), resulting in insufficient blocking or occlusion leakage. In this work, we investigate how the human visual system perceives soft-edge occlusion and present a preference-based optimal expansion method that minimizes perceived occlusion leakage. In a user study with 20 participants, we observed that the human eye perceives the edge blur of the occlusion mask as sharper than a camera system does when looking through the mask and fixating at a far distance. Our study also revealed significant individual differences in how soft-edge masks are perceived when focusing, which translate into different mask-size requirements across individuals. Our evaluation demonstrates that our method accounts for these individual differences and achieves optimal masking at arbitrary distances and pupil sizes.
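
To make the geometry of this leakage concrete, the Python sketch below estimates how much a mask on a single near-eye LCD must be expanded so that its fully dark core (umbra) still covers a target at the focal distance. This is only the standard geometric (camera-style) baseline that the paper's preference-based, per-user method refines; the function name, parameter names, and example values are illustrative assumptions, not the authors' implementation.

# Hedged sketch: geometric (camera-model) expansion of a soft-edge
# occlusion mask on a single transparent LCD. All names are hypothetical.

def required_mask_width(target_width_m: float,
                        target_dist_m: float,
                        mask_dist_m: float,
                        pupil_diam_m: float) -> float:
    """Mask width (on the LCD plane) needed to fully block a target.

    Uses a simple finite-aperture geometric model: with a non-zero pupil,
    the fully blocked region (umbra) behind the mask shrinks, so the mask
    must be expanded by a penumbra term to avoid occlusion leakage.
    """
    if not (0 < mask_dist_m < target_dist_m):
        raise ValueError("mask must lie between the eye and the target")
    # Pinhole projection of the target onto the mask plane.
    projected = target_width_m * mask_dist_m / target_dist_m
    # Extra width so the umbra still covers the whole target.
    expansion = pupil_diam_m * (target_dist_m - mask_dist_m) / target_dist_m
    return projected + expansion


if __name__ == "__main__":
    # Example: block a 10 cm-wide object 2 m away with an LCD 3 cm from
    # the eye, for a 4 mm pupil.
    w = required_mask_width(0.10, 2.0, 0.03, 0.004)
    print(f"required mask width on LCD: {w * 1000:.2f} mm")

For these example values the expansion term (about 3.9 mm) dominates the pinhole projection of the target (about 1.5 mm), which illustrates why defocus leakage is a central concern for single-LCD occlusion and why the appropriate amount of expansion, as the abstract notes, varies with viewing distance, pupil size, and individual perception.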
