With the increasing power of mobile handsets and mobile networks, mobile visual search applications have become both popular and tractable. One of the key enabling technologies for visual search is robust and compact feature description: features extracted from an image should be invariant to recapturing variations. A key factor for compact visual descriptors is the selection of local features, since the size of the compact descriptors and the computational complexity of a visual search system both grow with the number of features selected. Ranking the descriptors extracted from a single image by their importance under recapturing is therefore essential. In this paper, we address this problem with a novel self-matching selection method. We apply a random out-of-plane rotation to the target image and match the original features against the features extracted from the rotated image; each feature is then ranked by its self-matching score. Experiments on a large database show that this method outperforms peak-strength-based and edge-strength-based selection by 30%.
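The self-matching idea above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exact scoring function is not given in the abstract, so we assume a Lowe-style ratio test with a score inversely proportional to the nearest-neighbor distance, and we stand in for real descriptors (e.g. SIFT) with synthetic vectors rather than warping an actual image.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_matching_scores(desc_orig, desc_warped, ratio=0.8):
    """Score each original descriptor by how reliably it re-matches
    against descriptors extracted from the synthetically rotated image.
    Hypothetical scoring (assumption, not from the paper): 1/(1 + d1)
    when the Lowe ratio test d1 < ratio * d2 passes, else 0."""
    scores = np.zeros(len(desc_orig))
    for i, d in enumerate(desc_orig):
        dists = np.linalg.norm(desc_warped - d, axis=1)
        d1, d2 = np.sort(dists)[:2]
        if d1 < ratio * d2:  # unambiguous match: feature survives recapture
            scores[i] = 1.0 / (1.0 + d1)
    return scores

# Toy data standing in for descriptors of the original image.
n, dim = 50, 32
desc_orig = rng.normal(size=(n, dim))

# Descriptors "extracted" from the out-of-plane-rotated image: the first
# 30 features survive recapture (small perturbation of the originals);
# the remaining 20 are unrelated clutter introduced by the new view.
desc_warped = np.vstack([
    desc_orig[:30] + 0.05 * rng.normal(size=(30, dim)),
    rng.normal(size=(20, dim)),
])

scores = self_matching_scores(desc_orig, desc_warped)
ranking = np.argsort(-scores)  # most repeatable features first
```

In a real pipeline, `desc_warped` would come from re-running the feature detector on the homography-warped image; the ranking then lets the encoder keep only the top-scoring features within its budget.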