Existing hand gesture recognition methods predominantly rely on a closed-set assumption, which in essence requires the viewpoints, gesture categories, and hand shapes at test time to closely resemble those seen during training. This requirement, however, is rarely met in practice, as images are often captured from unconstrained viewpoints, with novel gestures and unseen hand shapes that can differ significantly from the training data. This motivates us to investigate an open-set hand gesture recognition problem, in which hand gestures remain recognizable from unconstrained viewpoints, and novel gesture classes and hand shapes can be incrementally learned from just a few examples. To address this, we propose a viewpoint influence elimination network that extracts view-independent features, significantly improving performance in scenarios with unconstrained viewpoints. Moreover, a joint-weighted classification scheme is introduced to augment the cosine similarity metric for few-shot incremental learning of novel gestures and hand shapes. Finally, as existing hand gesture recognition datasets primarily adhere to the closed-set assumption, we introduce a new hand gesture recognition dataset, OHG, which covers a wide range of viewpoints, diverse gesture classes, and distinct hand shapes. Experimental results demonstrate the superior performance of our approach in both unconstrained-viewpoint and few-shot incremental learning scenarios.
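To make the joint-weighted classification idea concrete, the following is a minimal sketch of how per-joint cosine similarities to class prototypes could be aggregated with joint weights; the function name, tensor shapes, and uniform placeholder weights are illustrative assumptions, not the exact formulation proposed in the paper.

```python
import numpy as np

def joint_weighted_cosine_scores(joint_feats, prototypes, joint_weights, eps=1e-8):
    """Illustrative joint-weighted cosine-similarity classification (assumed sketch).

    joint_feats:   (J, D) per-joint features of a query hand (J joints, D-dim each)
    prototypes:    (C, J, D) per-joint class prototypes, e.g. means of few-shot support features
    joint_weights: (J,) non-negative importance weights per joint, summing to 1
    Returns:       (C,) weighted cosine-similarity score for each class
    """
    # Normalize query and prototype features so dot products become cosine similarities.
    q = joint_feats / (np.linalg.norm(joint_feats, axis=-1, keepdims=True) + eps)   # (J, D)
    p = prototypes / (np.linalg.norm(prototypes, axis=-1, keepdims=True) + eps)     # (C, J, D)

    # Per-joint cosine similarity between the query and every class prototype.
    per_joint_sim = np.einsum('jd,cjd->cj', q, p)                                   # (C, J)

    # Aggregate joints with the given weights so informative joints dominate the score.
    return per_joint_sim @ joint_weights                                            # (C,)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    J, D, C = 21, 64, 5                       # e.g. 21 hand joints, 64-dim features, 5 gesture classes
    query = rng.normal(size=(J, D))
    protos = rng.normal(size=(C, J, D))
    weights = np.full(J, 1.0 / J)             # uniform weights as a placeholder
    scores = joint_weighted_cosine_scores(query, protos, weights)
    print("predicted class:", int(np.argmax(scores)))
```

In such a prototype-based setup, new gesture classes can be added incrementally by computing prototypes from a few support examples, without retraining the feature extractor.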