Research Article | OPEN ACCESS
A Computational Model of Visual Attention Based on Space and Object
Shuhong Li and Qiaorong Zhang
College of Computer and Information Engineering, Henan University of Economics and Law, Zhengzhou, China
Research Journal of Applied Sciences, Engineering and Technology, 2014, 1: 42-48
Received: January 25, 2013 | Accepted: March 02, 2013 | Published: January 01, 2014
Abstract
Object-based visual attention has received increasing attention in image processing. This study proposes a computational model of visual attention based on both space and objects. First, the spatial visual saliency of each pixel is computed and the edges of the input image are extracted; salient edges are then selected according to the visual saliency of each edge. Second, a graph-based clustering step partitions the image into homogeneous regions, and the most salient of these regions are extracted based on their spatial visual saliency. Perceptual objects are obtained by combining the salient edges with the salient regions. An attention value is computed for each perceptual object from its area and saliency, and the focus of attention is shifted among the perceptual objects in order of attention value. The proposed model was tested on a large set of natural images, and the experimental results indicate that it is valid and effective.
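The pipeline described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: a difference-of-Gaussians center-surround contrast stands in for the paper's per-pixel spatial saliency measure, Felzenszwalb's graph-based segmentation (skimage.segmentation.felzenszwalb) stands in for the graph-based clustering step, and the area/saliency weighting used for the attention value is assumed.

import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import io, color
from skimage.feature import canny
from skimage.segmentation import felzenszwalb

def pixel_saliency(gray):
    # Center-surround (difference-of-Gaussians) contrast as a stand-in
    # for the paper's spatial saliency map, normalized to [0, 1].
    center = gaussian_filter(gray, sigma=1.0)
    surround = gaussian_filter(gray, sigma=8.0)
    sal = np.abs(center - surround)
    return sal / (sal.max() + 1e-8)

def attend(image_rgb, top_k=3):
    gray = color.rgb2gray(image_rgb)
    sal = pixel_saliency(gray)

    # Salient edges: keep edge pixels whose saliency is above average.
    edges = canny(gray, sigma=2.0)
    salient_edges = edges & (sal > sal.mean())

    # Graph-based clustering of the image into homogeneous regions
    # (Felzenszwalb-Huttenlocher segmentation as one such method).
    labels = felzenszwalb(image_rgb, scale=100, sigma=0.8, min_size=50)

    # Attention value per region from area and mean saliency;
    # the sqrt(area) weighting here is an assumption for illustration.
    objects = []
    for lab in np.unique(labels):
        mask = labels == lab
        area = mask.sum() / mask.size
        attention = sal[mask].mean() * np.sqrt(area)
        objects.append((attention, lab))

    # Shift the focus of attention in decreasing order of attention value.
    objects.sort(reverse=True)
    return [lab for _, lab in objects[:top_k]], salient_edges

# Usage: foci, edges = attend(io.imread("scene.png"))

In this sketch the combination of salient edges and salient regions into perceptual objects is left implicit in the region labels; the ordering of regions by attention value plays the role of the focus-of-attention shifts described in the abstract.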
Keywords:
Perceptual object, saliency map, salient region, visual attention, visual saliency
Competing interests
The authors have no competing interests.
Open Access Policy
This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.