The videos above demonstrate a user painting in my Living Painting Application, described in this post. Hand tracking is done with a Kinect, and objects are seeded from the hand with a velocity in the direction the hand is facing. The spread of the emitted objects can be controlled as well.
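To make the emission step concrete, here is a minimal sketch of the idea in plain C++. It is not the application's actual code: the names (`emitFromHand`, `spreadRadians`, `speed`) and the assumption that the tracker supplies a hand position and facing angle are mine, but it shows how a particle can be seeded at the hand with a velocity along the facing direction, jittered within a controllable spread.

```cpp
#include <cmath>
#include <cstdlib>

// Hypothetical sketch: seed one object at the tracked hand position,
// giving it a velocity along the hand's facing direction plus a random
// angular offset bounded by a user-controlled spread.
struct Vec2 { float x, y; };

struct Particle {
    Vec2 pos;  // spawn position (hand position from the tracker)
    Vec2 vel;  // initial velocity
};

// handPos and handAngle are assumed to come from the Kinect hand tracker;
// spreadRadians and speed are the user-controllable emission parameters.
Particle emitFromHand(Vec2 handPos, float handAngle,
                      float spreadRadians, float speed) {
    // Random offset in [-spread/2, +spread/2] around the facing direction.
    float jitter = (static_cast<float>(std::rand()) / RAND_MAX - 0.5f)
                   * spreadRadians;
    float angle = handAngle + jitter;
    return { handPos, { std::cos(angle) * speed, std::sin(angle) * speed } };
}
```

Widening `spreadRadians` gives a broad fan of objects, while a value near zero produces a tight stream in the direction the hand points.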
The Living Painting Application can work on a number of layers at once: the top example uses two layers, while the bottom uses only one. The layers can be video sources that are painted on in real time.
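Below is a rough sketch of how such a layer stack could be organised, again under assumed names rather than the application's real structure. Each layer keeps its own pixel buffer (which could be refreshed from a video frame every update), paint strokes land only on the active layer, and the layers are flattened bottom-to-top for display.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical layer stack: one RGBA buffer per layer, painted in place
// and composited back-to-front.
struct Layer {
    std::vector<std::uint32_t> pixels;  // RGBA pixels, row-major
    bool fromVideo = false;             // refreshed from a video source each frame
};

struct Canvas {
    int width = 0, height = 0;
    std::vector<Layer> layers;  // index 0 is the bottom layer
    std::size_t active = 0;     // layer currently receiving paint

    // Write a painted pixel into the active layer only.
    void paint(int x, int y, std::uint32_t rgba) {
        layers[active].pixels[y * width + x] = rgba;
    }

    // Flatten bottom-to-top; any non-transparent pixel covers what is below.
    std::vector<std::uint32_t> composite() const {
        std::vector<std::uint32_t> out(static_cast<std::size_t>(width) * height, 0);
        for (const Layer& l : layers)
            for (std::size_t i = 0; i < out.size(); ++i)
                if (l.pixels[i] >> 24)   // alpha stored in the top byte
                    out[i] = l.pixels[i];
        return out;
    }
};
```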