master
Your Name 5 years ago
parent e8739e0665
commit eb16ef51ef

@ -16,7 +16,7 @@
</head>
<body>
<div id="container">
<!-- <div id="body"> -->
<div id="left">
<h1>
<div id="bigpcb">
@ -24,10 +24,9 @@
<img class="bigpcb" src="img/pcbimg.jpg"/>
</a>
</div>
<marquee class="maintitle" behavior="scroll" direction="left" height="250px"><embed class="output" src="img/inpout.gif" /></marquee>
</h1>
<a href="#damla_module"><div id="damla_T"><h1 class="name_T">Carbon</h1></div></a>
<h2 class="student">by Damlanur Bilgin</h2>
<p class= "p-style">Carbon is a device that translates graphite markings on paper into signals that manipulate sound and visuals. Carbons interface is pencil, paper, and an LED screen that reflects the user's marks on paper and translates signals from other modules into light and color.
@ -38,6 +37,7 @@
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p class= "p-style">The knowledge of how to use pencil and paper is much more widespread than the knowledge of playing an instrument. Replacing the interface of a synth with a sheet of paper and a pencil opens this device up to people who wouldnt know how to interact with a musical instrument. The user can make decisions based on the way they want to move their hand or the shape of marks they want to leave on the paper. In a way, Carbon is also a translator between audio and visual. A musician can use the sound output of the synth to guide their drawing in the same way an illustrator can use shapes on paper to control sound.
</p>
<br>Carbon is born out of a desire to interface with a medium one is unfamiliar with. The lack of technical knowledge in music that started out as an insecurity ended up guiding me through this project, exploring how I could interact with the unfamiliar through the familiar.</p>
</div>
<div class="links">
@ -56,7 +56,7 @@
<br>By connecting the available outputs to inputs and using the adjustable knobs on the interface, the user has the option to mix, modify and let the samples from both analog and digital sources interact with each other to create new, unexpected sounds.
<br> >Note 1: Although both are only capable of playing lo-fi** samples, the analog device can make recordings and play them back directly, while the digital module's behavior is fully programmable, allowing it to make use of feedback, phasing, gate triggers, pitch control functions and more, configurable as desired.
<br>How it works: A speech sample (saying IN-SIDE OUT) is cut into pieces so the separate words can be divided between the digital module (programmed onto it) and the analog tape loop* (recorded onto its magnetic tape). While the analog tape loop is prominent and continuously plays the same slice, the digital module runs a program that allows the recorded sample to start playback on a trigger and jump between different starting positions and loop sizes. Played together, random flakes and new combinations of the separate words are constantly being generated in real time. They correct and complement each other, making new combinations of the 3 words: IN-SIDE-OUT.</p>
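<!-- A rough Arduino-style sketch of the playback logic described above, offered as a hedged illustration: pin numbers, buffer size and the placeholder waveform are assumptions, not the module's actual firmware. On each trigger it picks a random starting position and loop size inside a stored sample and plays that slice back at roughly 8 kHz over PWM.

const int TRIG_PIN = 2;                 // gate/trigger input (assumed pin)
const int AUDIO_PIN = 9;                // PWM audio out (assumed pin)
const unsigned int SAMPLE_LEN = 1024;   // a short slice; the real module holds more
byte sample[SAMPLE_LEN];                // preloaded / recorded audio

void setup() {
  pinMode(TRIG_PIN, INPUT);
  pinMode(AUDIO_PIN, OUTPUT);
  randomSeed(analogRead(A0));                        // noise as a seed
  for (unsigned int i = 0; i < SAMPLE_LEN; i++)      // placeholder waveform
    sample[i] = 128 + 100 * sin(2 * PI * i / 40.0);
}

void loop() {
  if (digitalRead(TRIG_PIN) == HIGH) {               // wait for a trigger
    unsigned int start = random(SAMPLE_LEN);         // random starting position
    unsigned int len = random(128, SAMPLE_LEN);      // random loop size
    for (unsigned int i = 0; i < len; i++) {
      analogWrite(AUDIO_PIN, sample[(start + i) % SAMPLE_LEN]);   // crude PWM playback
      delayMicroseconds(125);                        // about 8000 Hz
    }
  }
}
-->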
<button type="button" class="collapsible">Read More >> </button>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p class= "p-style">HOW: more detailed (for website only)
<br>The starting point of this research was an analysis of the Meergranen* module, originally designed as a sample playback device to function within a modular Eurorack** setup. Testing its capabilities exposed a lot of its technical limitations: the module could only hold a preloaded 4-second sample at a low audio resolution (8000 Hz). Besides that, being new to this technology, electronics, and programming, it frustrated me that I couldn't comprehend what was happening inside this module and what caused these limitations. Instead of opening it up (I had just manually assembled and soldered it myself), I had to connect this PCB full of weird electronic components to a computer to see what complex code it was running to 'just play a crappy sample'. But I noticed it could do very interesting things too; it allowed input signals to modulate the playback speed and influence other behaviors, producing unexpected results.
@ -79,8 +79,9 @@
<a href="#avital_module"> <div id="avital_T"><h1 class="name_T">GENERATOR</h1></div></a>
<h2 class="student">by Avital Barkai</h2>
<p class= "p-style">Generator is a module that translates voltage into video signal. The voltage can come both from the power source of the module and the outputs of other modules that can connect to it. It is both self-controlled and sound-controlled so it can be a stand alone object or part of a bigger constellation. It is an image maker that explores the visual possibilities within the limitations of the hardware and the code. Coding was a substantial part of this process sometimes pointing me in the right direction, sometime presenting setbacks and frustrations and and at other times creating unexpected outcomes. The goal at first was unknown and mysterious and I came to realised that I will discover it as I go. Once I got the module to work with the LCD screen, it became my canvas. As I progressed, I realised I wanted the user of the module to be able to generate images and examine the possibilities of this medium. I created this version of the “etch a sketch”, in which the user alternates between the sense of control and randomness. On the one hand the rules of the modules are mostly clear there are 5 brushes, you can determine their size, and you can draw across the X and the Y axis. On the other hand it takes some time to learn how to control it, there are many options to discover and it can be influenced by other modules' outputs that cannot be controlled directly. </p>
<button type="button" class="collapsible">Read More</button>
<p class= "p-style">Generator is a module that translates voltage into video signal. The voltage can come both from the power source of the module and the outputs of other modules that can connect to it. It is both self-controlled and sound-controlled so it can be a stand alone object or part of a bigger constellation. It is an image maker that explores the visual possibilities within the limitations of the hardware and the code. Coding was a substantial part of this process sometimes pointing me in the right direction, sometime presenting setbacks and frustrations and and at other times creating unexpected outcomes. The goal at first was unknown and mysterious and I came to realised that I will discover it as I go. Once I got the module to work with the LCD screen, it became my canvas. As I progressed, I realised I wanted the user of the module to be able to generate images and examine the possibilities of this medium. I created this version of the “etch a sketch”, in which the user alternates between the sense of control and randomness. On the one hand the rules of the modules are mostly clear there are 5 brushes, you can determine their size, and you can draw across the X and the Y axis. On the other hand it takes some time to learn how to control it, there are many options to discover and it can be influenced by other modules' outputs that cannot be controlled directly.
</p>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p class= "p-style"> Finally, the screen clears every 10-12 seconds so you have a limited time to draw your image. The act of refreshing the screen seemed natural because every image we make is unique. It is almost impossible to make exactly the same image, so each one created is one of a kind but also temporary and fleeting, soon to be replaced by another image. All of these combined reference the experience of creative coding. There is a magic in randomness, producing while exploring and getting unexpected outcomes. My wish was to embed a personal perspective happy “mistakes”, making adjustments while working and viewing the tools that we use as partners of the process and not just a means to an end. I encourage whoever builds this module to explore the coding and try to see where she/he can take it, what can happen while playing with code.</p>
</div>
@ -99,7 +100,7 @@
<br>Music is in itself something very intuitive and emotional, and as it is profoundly satisfying and an excellent method of emotional self-reflection, the making of it should be accessible to a broad audience. This approach aims not only to provide a new controller for musicians, but also to put the creation of music into the lives of individuals as a practice of leisure and self-realization.
<br>Dancing, usually a way to deal with music after the process of producing it has ended, is a direct and very personal translation of music. Provided the necessary interfaces are available, making music could and should be as simple as dancing to it.
<br>The GLARE module is controlled by gestures only and thereby works in a very intuitive way. The movements controlling the auditory content can resemble the motion of dancing.</p>
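<!-- A minimal gesture-to-sound sketch in the spirit of the description above, assuming an ultrasonic distance sensor such as an HC-SR04 (the text does not name GLARE's actual sensor); hand distance is mapped to the pitch of a simple tone, so moving a hand directly shapes the sound.

const int TRIG_PIN = 3, ECHO_PIN = 4, SPEAKER_PIN = 9;   // assumed pins

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);                  // send a short ultrasonic ping
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long cm = pulseIn(ECHO_PIN, HIGH, 30000) / 58; // echo time to distance in cm
  if (cm > 0 && cm < 60)
    tone(SPEAKER_PIN, map(cm, 0, 60, 880, 110)); // a nearer hand gives a higher pitch
  else
    noTone(SPEAKER_PIN);                         // no hand, no sound
  delay(30);
}
-->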
<button type="button" class="collapsible">Read More</button>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p class= "p-style"></p>
</div>
@ -119,7 +120,7 @@
<br>However, orality is closely interlinked with sound as a carrier of language, which invests the act of speech with multitudes of aesthetic qualities. The sonic and phonetic dimensions of language are what articulate speech, while simultaneously imposing its ephemerality. As Walter Ong fundamentally states:
<br>“All sensation takes place in time, but sound has a special relationship to time unlike that of the other fields that register in human sensation. Sound exists only when it is going out of existence. It is not simply perishable but essentially evanescent, and it is sensed as evanescent. When I pronounce the word permanence, by the time I get to the -nence, the perma- is gone, and has to be gone.”1
<br>It is then important to consider the recording of language as not only a practice of writing, but also one of speaking and listening.</p>
<button type="button" class="collapsible">Read More</button>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p class= "p-style">voice.say(spFUNCTION);
<br>The voice appears here as an electronic anomaly: a synthetic placeholder for a missing vocal anatomy. You are now faced with a device which is able to speak, a disembodied voice sounding from an electronic circuit. The voice struggles to articulate through the constraints of a lo-fi sound output. Some sounds fade and are left to exist only as the memories of certain phonemes in the listener's cognitive effort. Listening, however, is performed without the ability to localize the precise source of the sound. It is therefore an acousmatic voice with origins unknown. As Mladen Dolar puts it in What's in a Voice?:
@ -208,7 +209,7 @@
</p>
<button type="button" class="collapsible">Read More</button>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p class= "p-style">A physical object, which goals are to generate a practice of storytelling, inspired by the ways disparate narratives can come together to create inroads into the unknown (or the obvious).
<br>With this in mind, this module is an arena to explore how protocols can induce new forms of inventiveness in the act of storytelling, grounded in the cohabitation of a multiplicity of standpoints, rather than a linear, all-encompassing narrative.
@ -244,7 +245,7 @@
<h2 class="student">by Sandra Golubjevaite</h2>
<p class= "p-style"> >>>> okokokok >> what >> dont be scared >> seven [7] is a straight forward .print&.read device >> you can play the prewritten poem [seven.ino] or you can write your own >> [7] can manipulate text >> it can send an outgoing message or be interrupted by an incoming one >> [7] enjoys repetition, coincidence & a gentle touch >> to channel text the module needs to be connected to a TV screen through a video input >> a video signal is broadcasted and can be listened to in mono >>>>>> mhmhmhmh >> how >> how i approached hardware >> 7 knobs & 7 buttons arranged in no particular hierarchy >> i wanted to create a humble device with clear manual functions >> an interface that makes you feel in control >> this urge became clear after getting familiar with a term calm technology during the Special Issue X >> calm technology - a type of information technology where the interaction between the technology and its user is designed to occur in the user's periphery rather than constantly at the centre of attention >> how >> how i approached software >> working within the framework of the tv.out library and a television monitor determined certain features of the program, such as its esthetics and its interface >> with a subject in mind the content of the poem [seven.ino] was developed and written while learning how to code with Arduino IDE >> the way the poem unfolds to a user depends on his/hers interaction with [7] >>>>>> ghhrrghrrr >> </p>
<button type="button" class="collapsible">Read More</button>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p> >> why >> at the beginning of the Special Issue X i was intrigued by DadaDodo >> DadaDodo is a program that analyses texts for word probabilities and then generates random sentences based on that >> sometimes these sentences are nonsense but sometimes they cut right through to the heart of the matter and reveal hidden meanings >> i was also interested in the experimental poetry examples mentioned in Florian Cramer's “words made flesh” >> but why >> repetition of text is a method that i practice during live vocal performances >> a partial looping of a poem functions as a transition and/or an emphasis >> as a poet i am interested in the life of a written static poem-block >> when a poem does not have a voice present how could it still rustle? >> [7] is a first prototype towards that idea <<<<<<</p>
</div>
@ -256,16 +257,16 @@
</div>
<a href="#mika_module"><div id="mika_T" class="moduletext"><h1 class="name_T">TXX.UO</h1></div></a>
<a href="#mika_module"><div id="mika_T"><h1 class="name_T">TXX.UO</h1></div></a>
<h2 class="student">by Mika Motskobili</h2>
<p>Txx.uo consolidates two contrasting radio frequency implementation modes: an RFID reader scans cards/objects containing RFID tags using radio waves, and an LCD screen displays a Q-code* [an internationally established three-letter abbreviation used in radio communication].
<p class= "p-style">Txx.uo consolidates two contrasting radio frequency implementation modes: RFID reader scans the cards/objects containing RFID tags using radio waves and LCD screen displays a Q-code* [internationally established three-letter abbreviation used in radio communication].
<br>*A particular Q-code denotes a question when it is followed by a question mark and references an answer [statement] when it's not:
<br>QRU? : Have you anything for me?
<br>QRU : I have nothing for you.
<br>RFID technology is used for object identification, authentication and security, whereas a Q-code is transmitted on a specific radio frequency by a radio operator and can be intercepted by anyone who is tuned in to the same frequency.
<br>This module converts the data received from a scanned card into a sound and a binary code. The binary signal is then transfigured into LED blinking and is also funneled to another module through an output channel. The top row of the LCD screen displays a specific question from the Q-code list, depending on the knob position; when a signal is received from another module, a random Q-code answer is shown on the second row of the screen.</p>
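<!-- A hedged sketch of the behaviour described above, assuming an MFRC522 RFID reader and a 16x2 character LCD; all pin numbers and the Q-code lists are placeholders, not the module's actual wiring or firmware.

#include <SPI.h>
#include <MFRC522.h>
#include <LiquidCrystal.h>

MFRC522 rfid(10, 9);                      // SS, RST (assumed pins)
LiquidCrystal lcd(8, 7, 6, 5, 4, 3);      // assumed LCD wiring
const int KNOB_PIN = A0;                  // selects a Q-code question
const int GATE_IN = 2;                    // incoming signal from another module
const int LED_PIN = A1;                   // blinks the binary code
const int OUT_PIN = A2;                   // binary code sent to the next module

const char* questions[] = {"QRU?", "QTH?", "QRZ?", "QSL?"};
const char* answers[]   = {"QRU", "QTH", "QRZ", "QSL"};

void setup() {
  SPI.begin();
  rfid.PCD_Init();
  lcd.begin(16, 2);
  pinMode(GATE_IN, INPUT);
  pinMode(LED_PIN, OUTPUT);
  pinMode(OUT_PIN, OUTPUT);
}

void loop() {
  // Top row: the question chosen by the knob position.
  lcd.setCursor(0, 0);
  lcd.print(questions[map(analogRead(KNOB_PIN), 0, 1023, 0, 3)]);

  // A signal from another module: show a random answer on the second row.
  if (digitalRead(GATE_IN) == HIGH) {
    lcd.setCursor(0, 1);
    lcd.print(answers[random(4)]);
  }

  // A scanned card is turned into a blinking binary code and an output signal.
  if (rfid.PICC_IsNewCardPresent() && rfid.PICC_ReadCardSerial()) {
    for (byte i = 0; i < rfid.uid.size; i++) {
      for (int b = 7; b >= 0; b -= 1) {
        int level = (rfid.uid.uidByte[i] >> b) & 1;
        digitalWrite(LED_PIN, level);
        digitalWrite(OUT_PIN, level);
        delay(60);
      }
    }
    digitalWrite(LED_PIN, LOW);
    digitalWrite(OUT_PIN, LOW);
  }
}
-->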
<button type="button" class="collapsible">Read More</button>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p>Components and configuration
<br>⯐ : Solder
@ -304,10 +305,10 @@
</div>
<a href="#anna_module"><div id="anna_T" class="moduletext"><h1 class="name_T">VISIBLE SPEECH</h1></div></a>
<a href="#anna_module"><div id="anna_T"><h1 class="name_T">VISIBLE SPEECH</h1></div></a>
<h2 class="student">by Anna Sandri</h2>
<p>Visible Speech employs phonetic vocabularies and other oddities to visually reproduce conversations, not only as a stand-alone unit, but also as part of a collective act when combined with other modules.
<p class= "p-style">Visible Speech employs phonetic vocabularies and other oddities to visually reproduce conversations. Not only as a stand alone unit, but also as a part of a collective act when combined with other modules.
<br>It communicates by using constructed and non-constructed languages, through alphabets read by both humans and machines.
<br>The Visible Speech module comes from a fascination with constructed languages (languages that have been consciously created), phonetic translations of speech and the never-ending human desire to elaborate an ideal universal language.
<br>This leading interest has been translated into the structure which sustains the module's technical core and interface: an instrument built and programmed with its own particular kind of widespread alphabet, human-developed and machine-readable.
@ -316,9 +317,9 @@
<br>He was trying to create a vocabulary able to rationalise universal concepts and numbers, a pattern similar to the machine language used in today's encoding systems based on binary arithmetic.
<br>Today, four centuries after Leibniz's utopian system was theorised, we can ask to what degree his dream has been accomplished.
<br>Are our machines speaking the ultimate Characteristica Universalis?</p>
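<!-- A tiny illustration of the "human developed and machine readable" idea above: the same word written as letters and as the binary code a machine reads. This is a generic Arduino example over the serial monitor, not Visible Speech's own encoding.

void setup() {
  Serial.begin(9600);
  const char* msg = "speech";
  for (int i = 0; msg[i] != '\0'; i++) {
    Serial.print(msg[i]);
    Serial.print(" = ");
    for (int b = 7; b >= 0; b -= 1)    // the 8 bits a machine reads for each letter
      Serial.print((msg[i] >> b) & 1);
    Serial.println();
  }
}

void loop() {}
-->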
<button type="button" class="collapsible">Read More</button>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p>
<p class= "p-style">
<br>Leibniz's ideal language was conceived to be effectively expressive so as to become universal.
<br>Today, by contrast, we perhaps face the opposite condition. We can see the proliferation of too many universal languages, aiming for the same result but differing at their cores.
<br>The dream of a Characteristica Universalis seems stuck in the complexity arising from the duality of human languages and machine languages, caught between the aim for entirety and the impossibility of reducing plurality.
@ -367,10 +368,10 @@
<a class="link_title" href="https://pzwiki.wdka.nl/mediadesign/User:Peach/issue-10-lfp" target="_blank">Personal Website</a>
</div>
<a href="#tisa_module"><div id="tisa_T" class="moduletext"><h1 class="name_T">DISTRACTION MANAGER</h1></div></a>
<a href="#tisa_module"><div id="tisa_T"><h1 class="name_T">DISTRACTION MANAGER</h1></div></a>
<h2 class="student">by Tisa Neža Herlec</h2>
<p> How are you?
<br>How is your posture?
<p class= "p-style"> How are you?
<br>How is your posture?
<br>Are you well hydrated?
<br>Do you have to go to the toilet?
<br>Are you hungry?
@ -387,7 +388,7 @@
</p>
<button type="button" class="collapsible">Read More >></button>
<div class="extended">
<p> It only works when and if the user is successfully triggered to complete their steps in the protocol, asking and answering their own set of questions.
<p class= "p-style"> It only works when and if the user is successfully triggered to complete their steps in the protocol, asking and answering their own set of questions.
If the user fails to internalize the protocol, ignoring the sound, failing to reach a symbiotic union with the device, the DM is deemed to be completely dysfunctional and futile.
@ -486,9 +487,7 @@
<a class="link_title" href="https://pzwiki.wdka.nl/mediadesign/User:Peach/issue-10-lfp" target="_blank">Personal Website</a>
</div>
</div>
</div>
<div id="right">
@ -708,7 +707,8 @@
</div>
</div>
</div>
<!-- </div> -->
<script>

@ -59,6 +59,11 @@ a:visited{
text-transform: uppercase;
}
.name_T:hover {
text-decoration: none;
color: #21632c;
}
/* style edits */
@ -75,7 +80,6 @@ body
margin-bottom: 0px;
margin-right: 0px;
margin-left: 0px;
}
div#left {
@ -85,7 +89,7 @@ div#left {
display: inline-block;
margin-left: auto;
margin-right: auto;
height:700px;
height:100vh;
overflow-y: scroll;
/*! padding-bottom: 200px; */
background-color: white;
@ -99,7 +103,7 @@ div#right {
display: inline-block;
margin-left: auto;
margin-right: auto;
height:700px;
height:100vh;
overflow-y: scroll;
/*! padding-bottom: 200px; */
background-color: #21632c;
@ -108,7 +112,7 @@ div#right {
.p-style{
/* letter-spacing: 1px; */
margin-left: 1vw;
margin-right: 1vw;
margin-right: 2vw;
margin-top: 1vh;
margin-bottom:1vh;
font-size: 13pt;
