echoscape
What if your favorite song looked like your favorite park?
You could wander through verse 1, climb the pre-chorus hill, see and feel the beat drop as you drop with it, and wait around the corner for the bridge. Yes, a literal bridge!
Echoscape is a speculative, immersive design platform that plays on this idea, combining the visual and aural mediums in a virtual space and allowing users to create a musical landscape that directly represents the physicality of music.
What began as a technical research journey became a deep dive into philosophical theories of cognition and the embodied ways in which we make meaning. As such, the design principles I adopted draw directly from Lakoff and Johnson’s work on conceptual and image metaphors.
We cannot clearly separate our understanding and conceptualization of music from our experience of it. We do not merely experience a musical work and then understand it. There is not experience first, followed by our grasp of the meaning of that experience. Rather, our understanding is woven into the fabric of our experience. Our understanding is our way of being in and making sense of our experience. Thus the way we experience a piece of music will depend importantly on how we understand it, and our understanding is intimately tied to our embodiment – that is, to our sensory-motor capacities and to our emotional makeup. The grounding of metaphors in bodily experience suggests possible universal structures (of bodily perception and movement) for understanding music.
— Steve Larson
- Studying past projects involving music and virtual reality to observe the utilization of space and the design challenges faced
- Noting how the physicality and fluidity of music was preserved while also accounting for the presence of discrete sound objects
- Exploring the cognition behind music learning and experimenting with different methods to incorporate embodied metaphors into my project
I want to lay a pathway for music creation that moves away from traditional notation systems and music theory, while still implementing and respecting the concepts they represent. I never had the opportunity to learn a musical instrument or be involved in anything musical growing up, partly based on an assumption that one needed extensive musical knowledge to be able to create.
However, in my experience with self-directed learning, I always find it more valuable to dive in first and figure out how to name what I did later. One example is my experience with the English language. While I have always loved reading, to this day I don’t consciously recall any of the grammatical rules that go into the construction of language. Asking me basic grammatical questions will leave me confounded, but proofreading and finding grammatical mistakes is something I can manage in my sleep. It’s an interesting dynamic: being knowledgeable enough in a language to create and to know why you make certain choices, while at the same time not knowing how to verbalize that innate sense of “knowing”.
My experience with creating music has been largely the same: while music theory and notation are important foundations of the language of music, knowing them is not a prerequisite for creation. When I did start playing around with music, I was too stubborn and impatient to learn the art of sight reading before sitting down at a piano. So, as one would normally do, I decided to make my own musical notation system for the concepts I could audiate but not verbalize. When I later sat down to analyze what I had made, I was able to link the patterns and concepts I had subconsciously internalized to existing terms professionals use when discussing music theory. This kind of ground-up processing and learning has informed the groundwork of my thesis: I am not merely aiming to create a product that lets you experience music in a novel way, but rather inventing a new kind of musical language based on physical movement and spatial interaction.
Research Areas
click on the branches to explore the sources