This work presents an approach to data-driven motion generation for humanoid robots based on the observation and analysis of human whole-body motion. To this end, we investigate how captured human motions can be represented, classified, and organized in a large-scale motion database. Statistically modeling the transitions between characteristic whole-body poses then enables the generation of multi-contact motions.
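The transition modeling mentioned above can be illustrated, in spirit, by a first-order Markov chain over symbolic pose labels. The pose names and transition probabilities below are purely hypothetical placeholders, not values from this work:

```python
import random

# Hypothetical transition probabilities between characteristic
# whole-body pose labels (illustrative only, not from the paper).
transitions = {
    "stand":   {"stand": 0.6, "reach": 0.3, "lean": 0.1},
    "reach":   {"stand": 0.2, "reach": 0.5, "support": 0.3},
    "lean":    {"stand": 0.4, "lean": 0.4, "support": 0.2},
    "support": {"stand": 0.5, "support": 0.5},
}

def sample_pose_sequence(start, length, rng):
    """Sample a pose sequence by walking the transition model."""
    seq = [start]
    for _ in range(length - 1):
        probs = transitions[seq[-1]]
        seq.append(rng.choices(list(probs), weights=list(probs.values()))[0])
    return seq

rng = random.Random(0)
print(sample_pose_sequence("stand", 6, rng))
```

In a real system the pose labels would come from classifying captured motion segments, and the transition probabilities would be estimated from their observed frequencies in the motion database; sampling such a chain yields a sequence of contact poses that a motion generator can then connect.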