Interest in human-robot coexistence, in which humans and robots share a common work volume, is increasing in manufacturing environments. Efficient work coordination requires both awareness of the human pose and a plan of action for both human and robot agents in order to compute robot motion trajectories that synchronize naturally with human motion. In this paper, we present a data-driven approach that synthesizes anticipatory knowledge of both human motions and subsequent action steps in order to predict, in real time, the intended target of a human performing a reaching motion. Motion-level anticipatory models are constructed from multiple demonstrations of human reaching motions. From these demonstrations we produce a library of motions based on a statistical representation of the degrees of freedom of the human arm, using time series analysis in which each time step is encoded as a multivariate Gaussian distribution. We demonstrate the benefits of this approach through offline statistical analysis of human motion data. The results indicate a considerable improvement over prior techniques in early prediction, achieving 70% or higher correct classification on average for the first third of the trajectory. We also provide a proof of concept through the demonstration of a human-robot cooperative manipulation task performed with a PR2 robot. Finally, we analyze the quality of task-level anticipatory knowledge required to improve prediction performance early in the motion trajectory.
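To make the per-time-step Gaussian encoding concrete, the sketch below shows one plausible realization of the idea described above: time-aligned demonstrations of reaching motions toward a target are summarized by a mean vector and covariance matrix over the arm's degrees of freedom at each time step, and a partially observed trajectory is then classified by accumulated log-likelihood under each target's model. This is a minimal illustrative sketch, not the paper's implementation; all function names, the NumPy/SciPy dependencies, the regularization term, and the assumption that demonstrations are already time-aligned and equal-length are ours.

```python
import numpy as np
from scipy.stats import multivariate_normal


def build_motion_model(demos):
    """Encode time-aligned demonstrations as one multivariate
    Gaussian per time step.

    demos: array of shape (n_demos, n_steps, n_dof), recording the
    arm's degrees of freedom over time for each demonstration.
    Returns per-step means (n_steps, n_dof) and covariances
    (n_steps, n_dof, n_dof).
    """
    n_demos, n_steps, n_dof = demos.shape
    means = demos.mean(axis=0)
    covs = np.empty((n_steps, n_dof, n_dof))
    for t in range(n_steps):
        # Small diagonal regularizer (an assumption) keeps the
        # covariance invertible when few demonstrations are available.
        covs[t] = np.cov(demos[:, t, :], rowvar=False) + 1e-6 * np.eye(n_dof)
    return means, covs


def log_likelihood(partial, means, covs):
    """Score an observed partial trajectory (n_obs, n_dof) by summing
    per-step Gaussian log-densities under one target's model."""
    return sum(
        multivariate_normal.logpdf(partial[t], means[t], covs[t])
        for t in range(len(partial))
    )


def predict_target(partial, models):
    """Return the candidate target whose motion model best explains
    the partial trajectory; models maps target name -> (means, covs)."""
    return max(models, key=lambda tgt: log_likelihood(partial, *models[tgt]))
```

Under these assumptions, early prediction amounts to calling predict_target on only the first portion of an observed trajectory (e.g., its first third) with one model per candidate reach target, choosing the target with the highest accumulated log-likelihood.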