Sample weighting, alternative neighbourhood definitions, and data-dependent distance metrics are three principal directions considered for improving the k-NN classification technique. Recently, manifold-based distance metrics have attracted considerable interest, and computationally less demanding approximations of them have been developed. However, a careful comparison of these alternative approaches is missing. In this study, an extensive comparison is first performed for three alternative neighbourhood definitions and four manifold-based distance measures. Then, a novel, computationally less demanding feature line-based method is proposed, which exploits the geometrical neighbourhoods of test samples for feature line construction. Experimental results show that the improvements achieved by the majority of the existing schemes are marginal. It is also verified that the proposed scheme surpasses the other computationally less demanding manifold-based schemes.
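
The feature line distance underlying such methods can be illustrated with a minimal sketch. The snippet below is not the proposed method itself, only the standard point-to-feature-line computation on which feature line classifiers are built: a query point is projected onto the line passing through a pair of prototype samples, and the residual norm is the distance. The function name is illustrative.

```python
import numpy as np

def feature_line_distance(q, x1, x2):
    """Distance from query q to the feature line through prototypes x1 and x2.

    The feature line is the infinite line x1 + t * (x2 - x1), t in R.
    q is orthogonally projected onto it; the distance is the residual norm.
    """
    d = x2 - x1
    # Projection parameter of q onto the line direction.
    t = np.dot(q - x1, d) / np.dot(d, d)
    p = x1 + t * d  # projection point on the feature line
    return np.linalg.norm(q - p)
```

In a nearest feature line classifier, this distance is evaluated over prototype pairs of each class (or, in cheaper variants, only over pairs drawn from a neighbourhood of the query), and the query is assigned to the class attaining the smallest distance.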