Digital Media Art Project: Prototype


To continue the definition of my digital media art project and complete the a/r/cographic methodology, I will now develop the prototype. For those who are not familiar with what I am doing here, the recap below provides the context.

A/r/cographic methodology for the definition of a Digital Media Art Project:

To write this entry, I go back to October 2021, when I presented the artefact I’m Watching You/Me, or IMWYM for short (Olivero & Araújo, 2021). Understanding and documenting this first prototype with the current knowledge and methodology will be the basis for the development of a second edition. Hence, I intend to present IMWYM 2 at the Retiro DMAD 2022 as the corollary of this PMAD.

Figure 1 - First prototype of the digital media art project

IMWYM 1

IMWYM is an artefact for enhancing the use of Hybrid Immersive Models within the field of digital arts. The aim of IMWYM is to stimulate the under-explored form of expression that spherical perspectives represent by facilitating, demonstrating, opening, and extending their use, both in general applications and, in particular, in live digital art presentations.

The first version that I present today focused on the live execution of spherical drawings for an audience. Within this first prototype, several parts remained somewhat rudimentary because they were not fully developed, for various reasons. However, this first edition also achieved important goals, such as effectively demonstrating one possible way to produce spherical drawings for an audience in a performative setting, with a concurrent interactive live feed of the VR visualization.

Components

The components of IMWYM 1 were:

  • IN-1: a physical drawing (a sheet of paper on the wall)
  • IN-2: a digital drawing (either from a drawing pad or a tablet*, or from drawing software such as Eq A Sketch 360)
  • IN-3: pre-existing equirectangular media (a drawing, picture, or video)
  • IN-4: a mobile phone with movement and location sensors
  • IN-5: a high-resolution camera
  • OU-1: a projector
  • OU-2: an external monitor*
* These components were not used during ARTECH 2021 due to logistical issues. However, they were both available and functioning, as can be seen in the video below.
Figure 2 - I'm Watching You/Me, 1st edition. ARTECH Congress, Aveiro, Portugal, 2021

Functioning

The basic functioning of this first prototype is divided into three parts:

  • The artist (AR-1) chooses one medium among IN-1, IN-2, or IN-3, based on personal preference.
  • The software captures the media (through IN-5) and converts the equirectangular image into a classical perspective, which is shown on the screen (OU-1); see the sketch after this list.
  • A visitor interacts with the visual sphere by sending position coordinates from the mobile phone (IN-4).
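In the artefact itself this conversion happens inside TouchDesigner's node network, but the underlying mapping from the equirectangular image to a classical (rectilinear) view can be sketched in a few lines of Python. The sketch below is only illustrative: the function name, the output resolution, and the nearest-neighbour sampling are my assumptions, not the actual implementation.

    # Minimal sketch: reproject an equirectangular image into a classical
    # perspective view for a given camera yaw, pitch and field of view.
    # Assumes the source is loaded as an H x W x 3 numpy array.
    import numpy as np

    def equirect_to_perspective(equi, yaw_deg, pitch_deg, fov_deg=60, out_w=960, out_h=540):
        src_h, src_w = equi.shape[:2]
        fov = np.radians(fov_deg)

        # Ray directions through each pixel of the virtual pinhole camera.
        x = np.linspace(-np.tan(fov / 2), np.tan(fov / 2), out_w)
        y = np.linspace(-np.tan(fov / 2) * out_h / out_w,
                        np.tan(fov / 2) * out_h / out_w, out_h)
        xv, yv = np.meshgrid(x, y)
        dirs = np.stack([xv, yv, np.ones_like(xv)], axis=-1)
        dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

        # Rotate the rays by the camera orientation (pitch around X, yaw around Y).
        yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
        rot_x = np.array([[1, 0, 0],
                          [0, np.cos(pitch), -np.sin(pitch)],
                          [0, np.sin(pitch), np.cos(pitch)]])
        rot_y = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                          [0, 1, 0],
                          [-np.sin(yaw), 0, np.cos(yaw)]])
        dirs = dirs @ rot_x.T @ rot_y.T

        # Convert each ray to longitude/latitude and sample the source image.
        lon = np.arctan2(dirs[..., 0], dirs[..., 2])      # -pi .. pi
        lat = np.arcsin(np.clip(dirs[..., 1], -1, 1))     # -pi/2 .. pi/2
        u = ((lon / np.pi + 1) / 2 * (src_w - 1)).astype(int)
        v = ((lat / (np.pi / 2) + 1) / 2 * (src_h - 1)).astype(int)
        return equi[v, u]

With a 60º field of view, each new orientation of the camera simply re-samples a different window of the same sphere.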

The following video shows the functioning and characteristics of the artefact. It also demonstrates how to interact with the visual sphere using the phone. Finally, it includes some drawing examples, both my own and from students.

Mixing inputs

The inputs are media generated either with traditional (IN-1) or digital techniques (IN-2, IN-3). IMWYM 1 focuses on the use of drawings, although we can also use photos and videos as long as they follow the equirectangular projection (Figure 3, left). 

A very important feature that IMWYM 1 introduces is the possibility of mixing inputs. Through TouchDesigner’s interface we can select up to three parallel inputs, meaning I can compose an equirectangular drawing interactively and on the fly together with another artist or visitor. The VR environment then results from a superposition of the different inputs, and it can be seen on the output screen (OU-1); a minimal blending sketch follows the examples below.

Some examples of mixed interactive compositions could be: 

  • 360 video (IN-3) + Physical drawing (IN-1)
  • Physical drawing (IN-1) + Digital drawing using Eq A Sketch 360 (IN-2)
  • Equirectangular picture (IN-3) + Digital drawing using a drawing pad (IN-2) + Physical drawing (IN-1)
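The blending itself is done with TouchDesigner operators, but the principle can be illustrated with a short, hypothetical sketch: each selected input arrives as an equirectangular frame of the same size, and the layers are weighted and summed before the VR conversion. The layer names and the equal-weight blend below are assumptions for illustration only.

    # Minimal sketch of the mixing idea: blend up to three equirectangular
    # layers (e.g. camera capture, existing media, digital drawing) into a
    # single equirectangular frame. Arrays are H x W x 3 floats in 0..1.
    import numpy as np

    def mix_inputs(layers, weights=None):
        if weights is None:
            weights = [1.0 / len(layers)] * len(layers)   # equal blend by default
        out = np.zeros_like(layers[0], dtype=float)
        for layer, w in zip(layers, weights):
            out += w * layer
        return np.clip(out, 0.0, 1.0)

    # Example: physical drawing captured by IN-5, a 360 picture (IN-3),
    # and an Eq A Sketch 360 drawing (IN-2), each as an equirectangular array.
    # mixed = mix_inputs([camera_frame, picture_360, sketch_360], [0.5, 0.3, 0.2])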

Figure 3 – Selection of inputs. Three different input types: a handmade physical artwork created on the fly, existing equirectangular media, and a drawing made with software (Eq A Sketch 360). One of the features of IMWYM is the possibility of mixing inputs on the fly.

Interaction

If the artist chooses IN-1, camera IN-5 captures the drawing on the fly; the software then converts and mixes the inputs and streams the result in VR modality through OU-1 (Figure 4, left). If the artist chooses IN-2 or IN-3, they have visual feedback both in VR modality through OU-1 and in equirectangular mode through OU-2. OU-2 improves the interaction by separating the interface from the software drawing board of IN-2.

A visitor (VI-1) interacts with the visual sphere through a mobile phone (IN-4). This can be done either while I perform a live drawing session or while the artefact shows an already existing drawing (Figure 4, right). IN-4 sends OSC data to the software from its compass and gravity sensors, and every new position of the phone updates the camera within the virtual sphere, thus revealing a new part of the drawing.

Finally, OU-1 shows the result in VR modality with a previously set Field of View (e.g., 60º), i.e., the spherical perspective is converted into a moving linear perspective according to the camera’s position.
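As a rough standalone illustration of this exchange (in the artefact itself TouchDesigner's OSC In operator receives the messages), the sketch below listens for orientation values and updates a virtual camera. The OSC addresses, the port, and the use of the python-osc package are assumptions for illustration, not the actual setup.

    # Hypothetical sketch: receive phone orientation over OSC and update the
    # virtual camera that looks around inside the visual sphere.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    camera = {"yaw": 0.0, "pitch": 0.0}   # orientation of the virtual camera

    def update_camera(address, value):
        # Every incoming message moves the camera, revealing another
        # part of the equirectangular drawing.
        axis = address.strip("/")         # "yaw" or "pitch" (assumed addresses)
        camera[axis] = float(value)
        print(f"camera -> yaw {camera['yaw']:.1f}, pitch {camera['pitch']:.1f}")

    dispatcher = Dispatcher()
    dispatcher.map("/yaw", update_camera)
    dispatcher.map("/pitch", update_camera)

    # Listen on the local network port the phone sends OSC to (port is illustrative).
    BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()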

Figure 4 – Interaction. IMWYM allowed live spherical drawing and a parallel VR visualisation: the artist draws in equirectangular format while another user interacts through the mobile phone, sending OSC from the phone to the software.

Innovation

To produce this prototype, I considered the state of the art for different spherical perspectives (equirectangular, azimuthal equidistant, cubical) and the software available for their practice. Within that state of the art, IMWYM shows its utility by introducing the innovation of live spherical drawing with parallel VR visualisation, a task for which there were almost no software options in October 2021 (Olivero & Araújo, 2021).

I composed IMWYM 1 using the free non-commercial version of TouchDesigner. I chose this node-based programming environment before entering a pure coding stage; this way, I could make a general evaluation of the workflow while benefiting from the great versatility of TD. The exhibition during ARTECH 2021 allowed me to see the reaction of the public, to gather some very important opinions about the artefact’s usability and perception, and, certainly, to detect issues and problems.

For example, it was a big challenge to migrate the software from Mac to Windows at the very last minute. I would have expected better compatibility between the two operating systems since it was the same software. Yet, no… I needed to rework some things the night before in a big rush. All these problems and issues are discussed in the next entry, Testing.

Conclusion

I have developed the prototype stage of my digital media art project. In short, the artist draws in equirectangular projection using either traditional (IN-1) or digital techniques (IN-2, IN-3), while visitors interact with the camera’s position through IN-4. Both artist and visitors watch the VR result through OU-1 and the equirectangular source through OU-2.

The artefact encourages the artist to concentrate on the drawing, while the visitor is free to choose where to look. That way, the artist has a more complex view of both the whole and the detail, whereas the observer watches in classical perspective, without necessarily needing to deal with the unfamiliar “distortions” of spherical perspective, yet seeing both at once.

IMWYM 1 explores and explodes Hybrid Immersive Models within digital art, using the artefact as a way of expanding the applications of spherical perspectives. In the next entry (Testing), I will list the improvements and modifications for IMWYM 2. That list will consider what I have studied since October 2021, the current goals for this project, the articles published so far, and what I have learned from the live testing. Finally, in the last entry of the a/r/cographic path (Intervention), I will give the definition of IMWYM 2, with its final requirements and schemes for the installation.

References

Olivero, L. F., & Araújo, A. B. (2021). I’m watching you/me: Live drawing and VR visualization of spherical perspectives using TouchDesigner. Proceedings of the 10th International Conference on Digital and Interactive Arts (ARTECH 2021), Aveiro, Portugal. ACM. https://doi.org/10.1145/3483529.3483778
