Face plays an essential role in interpersonal communication. Most current work on facial expression recognition attempts to recognize a small set of prototypic expressions such as happiness, surprise, anger, sadness, disgust, and fear. However, most human emotion is communicated through changes in only one or two discrete facial features. In this paper, we develop a facial Action Unit (AU) classification system based on facial features extracted from characteristic points in frontal image sequences. Selected facial feature points were automatically tracked using cross-correlation based optical flow, and the extracted feature vectors were used to classify AUs with RBF neural networks. The proposed classifier showed good results for classifying both single and composite AUs.
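The point-tracking step described above can be sketched as follows. This is a minimal illustration of cross-correlation based template matching between consecutive frames, not the authors' exact implementation; it assumes grayscale frames stored as NumPy arrays, a fixed square template, and an exhaustive integer-offset search window (the function name and parameters are illustrative):

```python
import numpy as np

def track_point_ncc(prev_frame, next_frame, point, tmpl=5, search=10):
    """Track one feature point by normalized cross-correlation.

    A (2*tmpl+1)-square template around `point` in the previous frame
    is matched against candidate windows in the next frame; the offset
    with the highest normalized correlation is taken as the point's
    displacement (a simple block-matching form of optical flow).
    """
    y, x = point
    t = prev_frame[y - tmpl:y + tmpl + 1, x - tmpl:x + tmpl + 1].astype(float)
    t = t - t.mean()  # zero-mean template for normalized correlation
    best, best_dy, best_dx = -np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            w = next_frame[yy - tmpl:yy + tmpl + 1,
                           xx - tmpl:xx + tmpl + 1].astype(float)
            if w.shape != t.shape:
                continue  # candidate window falls outside the image
            w = w - w.mean()
            denom = np.sqrt((t * t).sum() * (w * w).sum())
            if denom == 0:
                continue  # flat (textureless) window, no match score
            score = (t * w).sum() / denom
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return (y + best_dy, x + best_dx)
```

In a full pipeline, the per-frame displacements of all tracked feature points would be collected into the feature vectors fed to the RBF classifier.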