“recursive reflection” is a new work created for this exhibition. The motions of visitors moving to music at the venue are first sampled in real time using Kinect. The sampled data are then combined with machine-learned data from the dance group ELEVENPLAY’s performances to tune a model that generates motion (dance), shown as visuals in the exhibition.
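As a rough illustration of such a pipeline, the sketch below blends a visitor's recent motion with a learned prior to produce the next generated pose. Everything here is a hypothetical stand-in: the joint count follows Kinect's skeleton format, but the blending scheme, function names, and the constant prior are illustrative assumptions, not the actual system.

```python
import numpy as np

# Hypothetical sketch: sample visitor poses over a short window, then
# produce the next pose by blending a naive extrapolation of the
# visitor's motion with a learned prior (a stand-in for the model
# trained on ELEVENPLAY dance data). Not the actual implementation.

NUM_JOINTS = 25          # Kinect v2 tracks 25 skeleton joints
WINDOW = 8               # frames of motion history kept

def predict_next_pose(history, learned_prior, alpha=0.7):
    """Blend a linear extrapolation of recent visitor motion with a
    learned mean pose; alpha weights visitor motion vs. the prior."""
    velocity = history[-1] - history[-2]      # per-joint frame-to-frame velocity
    extrapolated = history[-1] + velocity     # naive next-frame guess
    return alpha * extrapolated + (1 - alpha) * learned_prior

# Toy data standing in for real-time Kinect samples: frames x joints x xyz
rng = np.random.default_rng(0)
history = rng.standard_normal((WINDOW, NUM_JOINTS, 3))
prior = np.zeros((NUM_JOINTS, 3))             # stand-in for the trained model's output

next_pose = predict_next_pose(history, prior)
print(next_pose.shape)
```

In a real installation the prior would come from a trained generative model and the output would drive the rendered visuals each frame.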
In Western painting and sculpture, iconological considerations have played a central role in the depiction of the human body, and history shows artists pursuing beauty through canons (numerical ratios) and deformations of the body’s image. Advances in photography, film, and other media technologies have enabled us to visually analyze the continuous movements of objects. Such technologies, with their high capacity for reproducing reality, have given us a new objective stance, deepened our understanding of movement and behavior, and ultimately influenced many forms of artistic expression.
Cybernetics and data visualization have developed as well. Through the progressive integration of bodily expression with rapidly evolving motion capture and other real-time sensing and analysis technologies, it has become possible to precisely recognize the spatial volume of three-dimensional bodies, convert their motions into fine-grained data, and, in combination with machine learning techniques, create forms of bodily expression that reference the past while instantly predicting the future from the viewpoint of the present. Constantly improving learning speeds suggest even greater expansion in the field of media technology.
* Algorithms and output methods will be continuously updated throughout the exhibition period.
Direction, Development: ASAI Yuta
Visual Programming: HORII Satoshi, KERA Futa
Sound Design: KUROTAKI Setsuya
System Programming: ISHII 2bit, HANAI Yuya
Craft: MOURI Kyouhei
Produce: INOUE Takao
Dance Training Data: ELEVENPLAY
Supervisor: MANABE Daito
Special Cooperation: Kyle McDONALD