Digital Media Art Project: Prototype
Continuing the definition of my digital media art project, and aiming to complete the a/r/cographic methodology, I will now develop the prototype. For those who are not familiar with this project, the recap below provides the context.
A/r/cographic methodology for the definition of a Digital Media Art Project:
For writing this entry, I switch back in time to October 2021, when I presented the artefact I’m Watching You/Me, or IMWYM for short (Olivero & Araújo, 2021). Understanding and documenting this 1st prototype with my current knowledge and methodology will be the basis for the development of a 2nd edition. Hence, I intend to present IMWYM 2 at the Retiro DMAD 2022 as the corollary of this PMAD.
IMWYM is an artefact for enhancing the use of Hybrid Immersive Models within the field of digital arts. The aim of IMWYM is to stimulate the under-explored form of expression that spherical perspectives represent by facilitating, demonstrating, opening, and extending their use both with general applications and in particular with digital art live presentations.
The components of IMWYM 1 were:
- IN-1: a physical drawing (a paper on the wall)
- IN-2: a digital drawing (either from a drawing pad or a tablet*, or from drawing software such as Eq A Sketch 360)
- IN-3: a pre-existing equirectangular media (drawing, picture, or video)
- IN-4: a mobile phone with movement and location sensors
- IN-5: a high-resolution camera
- OU-1: a projector
- OU-2: an external monitor*
The basic functioning of this first prototype is divided into three parts:
The artist (AR-1) chooses one medium between IN-1, 2 or 3 based on a personal preference.
The software captures (through IN-5) and converts the equirectangular media into a classical perspective, which is shown on the screen (OU-1).
A visitor interacts with the visual sphere by sending position coordinates from the mobile phone (IN-4).
The following video shows the functioning and characteristics of the artefact, including how to interact with the visual sphere using the phone. Finally, there are some drawing examples, both my own and from students.
The inputs are media generated either with traditional (IN-1) or digital techniques (IN-2, IN-3). IMWYM 1 focuses on the use of drawings, although we can also use photos and videos as long as they follow the equirectangular projection (Figure 3, left).
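Since every input converges on the equirectangular projection, a minimal sketch of that mapping may help. The code below is purely illustrative (the artefact itself is built in TouchDesigner); it shows how a direction on the visual sphere corresponds to a pixel in an equirectangular image, with longitude mapped linearly to x and latitude linearly to y:

```python
import math

def sphere_to_equirect(yaw, pitch, width, height):
    """Map a viewing direction (yaw in [-pi, pi], pitch in [-pi/2, pi/2])
    to pixel coordinates in an equirectangular image of the given size.
    Longitude maps linearly to x; latitude maps linearly to y."""
    x = (yaw + math.pi) / (2 * math.pi) * width
    y = (math.pi / 2 - pitch) / math.pi * height
    return x, y

# Looking straight ahead (yaw = 0, pitch = 0) samples the image centre:
cx, cy = sphere_to_equirect(0.0, 0.0, 4096, 2048)  # -> (2048.0, 1024.0)
```

This linearity is what makes the projection so convenient for drawing: the whole sphere unwraps onto a 2:1 rectangle without seams.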
A very important feature that IMWYM 1 introduces is the possibility of mixing inputs. Indeed, through TouchDesigner’s interface we can select up to three parallel inputs, meaning that I can compose an equirectangular drawing interactively and on-the-fly together with another artist or visitor. Finally, the VR environment results in a superposition of the different inputs, which can be seen through the output screen (OU-1).
Some examples of mixed interactive compositions could be:
- 360 video (IN-3) + Physical drawing (IN-1)
- Physical drawing (IN-1) + Digital drawing using Eq A Sketch 360 (IN-2)
- Equirectangular picture (IN-3) + Digital drawing using a drawing pad (IN-2) + Physical drawing (IN-1)
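TouchDesigner performs the actual compositing inside the artefact, but the idea of superposing parallel inputs can be sketched as a simple weighted blend. The frames and weights below are hypothetical, just to make the operation concrete:

```python
import numpy as np

def superpose(layers, weights):
    """Blend equirectangular frames (H x W x 3 float arrays in [0, 1])
    into one composite by weighted superposition."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalise so the result stays in [0, 1]
    out = np.zeros_like(layers[0], dtype=float)
    for layer, w in zip(layers, weights):
        out += w * layer
    return out

# e.g. a 360 video frame (IN-3) mixed with a captured physical drawing (IN-1):
video = np.ones((2, 4, 3)) * 0.8     # toy stand-in for a video frame
drawing = np.ones((2, 4, 3)) * 0.2   # toy stand-in for the camera capture
mix = superpose([video, drawing], [0.5, 0.5])  # every pixel blends to 0.5
```

Because all inputs share the same equirectangular layout, the blend is a per-pixel operation: no re-projection is needed before mixing.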
Figure 3 – Selection of inputs
If the artist chooses IN-1, then camera IN-5 captures the drawing on-the-fly. The software then converts and mixes the inputs and streams the result in VR modality through OU-1 (Figure 4, left). If the artist chooses IN-2 or IN-3, they have visual feedback both in VR modality through OU-1 and in equirectangular mode through OU-2. OU-2 improves the interaction by separating the interface from the software drawing board of IN-2.
A visitor VI-1 interacts with the visual sphere through a mobile phone IN-4. This can happen either while I perform a live drawing session or while the artefact shows an already existing drawing (Figure 4, right). IN-4 sends data from its compass and gravity sensors to the software via OSC. Every new position of the phone updates the camera within the virtual sphere, thus revealing a new part of the drawing.
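As an illustration of this sensor-to-camera step, the sketch below derives a yaw and pitch from hypothetical compass and gravity readings. In the artefact these values arrive as OSC messages and the mapping lives in TouchDesigner; the axis conventions here are an assumption, since real devices differ per platform:

```python
import math

def phone_to_camera(gravity, heading_deg):
    """Convert phone sensor readings into camera angles for the sphere.
    gravity: (gx, gy, gz) accelerometer vector in the phone's frame;
    heading_deg: compass heading in degrees from north.
    Axis conventions are assumed here; real devices differ per platform."""
    gx, gy, gz = gravity
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    # Tilt above/below the horizon, clamped against rounding errors:
    pitch = math.asin(max(-1.0, min(1.0, -gz / norm)))
    yaw = math.radians(heading_deg % 360)  # rotation around the vertical axis
    return yaw, pitch

# Phone held upright facing east (heading 90 degrees), screen vertical:
yaw, pitch = phone_to_camera((0.0, -9.8, 0.0), 90.0)  # pitch = 0, yaw = pi/2
```

Each incoming reading simply overwrites the camera's orientation, so the virtual view tracks the phone with no accumulated drift.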
Finally, OU-1 shows the result in VR modality with a previously set Field of View (e.g., 60º); i.e., the spherical perspective is converted into a moving linear perspective according to the camera’s position.
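This conversion from spherical to linear perspective can be sketched as a gnomonic extraction. The function below is an illustrative sketch, not the artefact's code: for one view-plane pixel it finds which equirectangular pixel the camera samples, given the FOV and the camera's yaw and pitch (rotation sign conventions are assumed):

```python
import math

def perspective_to_equirect(u, v, fov_deg, cam_yaw, cam_pitch, eq_w, eq_h):
    """For a pixel (u, v) in [-1, 1]^2 on the virtual view plane, return the
    equirectangular pixel it samples, given the camera yaw/pitch and FOV.
    This is the gnomonic (linear perspective) extraction from the sphere."""
    f = 1.0 / math.tan(math.radians(fov_deg) / 2)  # focal length for this FOV
    # Ray in camera space, rotated by pitch (around x) then yaw (around y):
    x, y, z = u, v, f
    y, z = (y * math.cos(cam_pitch) - z * math.sin(cam_pitch),
            y * math.sin(cam_pitch) + z * math.cos(cam_pitch))
    x, z = (x * math.cos(cam_yaw) + z * math.sin(cam_yaw),
            -x * math.sin(cam_yaw) + z * math.cos(cam_yaw))
    # Direction -> spherical angles -> equirectangular pixel:
    lon = math.atan2(x, z)
    lat = math.atan2(y, math.hypot(x, z))
    px = (lon + math.pi) / (2 * math.pi) * eq_w
    py = (math.pi / 2 - lat) / math.pi * eq_h
    return px, py

# Centre of the view, camera at rest, samples the image centre:
px, py = perspective_to_equirect(0, 0, 60, 0.0, 0.0, 4096, 2048)
```

Running this for every output pixel (which the GPU does in parallel inside TouchDesigner) produces the moving classical-perspective window that OU-1 displays.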
Figure 4 – Interaction
For producing this prototype, I considered the state of the art for different spherical perspectives (equirectangular, azimuthal equidistant, cubical) and the software available for their practice. Within that state of the art, IMWYM demonstrates its utility by introducing the innovation of live spherical drawing with parallel VR visualisation, a task for which there were almost no software options in October 2021 (Olivero & Araújo, 2021).
I composed IMWYM 1 using the free non-commercial version of TouchDesigner. I chose this node-based programming environment before entering a pure coding stage. This way, I managed to do a general evaluation of the workflow with the great versatility of TD. The exhibition during ARTECH 2021 allowed me to see the reaction of the public, to gather some very important opinions about the artefact’s usability and perception, and, certainly, to detect issues and problems.
For example, it was a big challenge to migrate the software at the very last minute from Mac to Windows. I would have expected better integration between the two operating systems, since it was the same software. Yet, no… I needed to re-work some things the night before in a big rush. All these problems and issues are discussed in the next entry, Testing.
I have developed the prototype stage for my digital media art project. In short, the artist draws in equirectangular projection using either traditional (IN-1) or digital techniques (IN-2, IN-3) while visitors interact with the camera’s position through IN-4. Thus, both artists and visitors watch the VR results through OU-1 and the equirectangular source through OU-2.
The artefact encourages the artist to concentrate on the drawing, while the visitor is free to choose where to look. That way, the artist keeps a more complex view of both the whole and the detail, whereas the observer watches in classical perspective, seeing both at once without necessarily needing to deal with the unfamiliar “distortions” of spherical perspective.
IMWYM 1 explores and explodes Hybrid Immersive Models within digital art, using the artefact as a way of expanding the applications of spherical perspectives. In the next entry (Testing), I will list the improvements and modifications for IMWYM 2. That list will consider what I have studied since October 2021, the current goals of this project, the articles published so far, and what I have learned from the live testing. Finally, in the last entry of the a/r/cographic path (Intervention), I will give the definition of IMWYM 2, with the final requirements and schemes for the installation.