diff --git a/.DS_Store b/.DS_Store new file mode 100644 index 0000000..05d4c53 Binary files /dev/null and b/.DS_Store differ
diff --git a/floppies/.DS_Store b/floppies/.DS_Store new file mode 100644 index 0000000..21d80f8 Binary files /dev/null and b/floppies/.DS_Store differ
diff --git a/floppies/claudia/noweb/.DS_Store b/floppies/claudia/noweb/.DS_Store index cdfa72d..0079a00 100644 Binary files a/floppies/claudia/noweb/.DS_Store and b/floppies/claudia/noweb/.DS_Store differ
diff --git a/floppies/franc/noweb/2.gif b/floppies/franc/noweb/2.gif new file mode 100644 index 0000000..3334802 Binary files /dev/null and b/floppies/franc/noweb/2.gif differ
diff --git a/floppies/franc/noweb/3.gif b/floppies/franc/noweb/3.gif new file mode 100644 index 0000000..8203717 Binary files /dev/null and b/floppies/franc/noweb/3.gif differ
diff --git a/floppies/franc/noweb/edited/.DS_Store b/floppies/franc/noweb/edited/.DS_Store new file mode 100644 index 0000000..5008ddf Binary files /dev/null and b/floppies/franc/noweb/edited/.DS_Store differ
diff --git a/floppies/franc/noweb/edited/10-optflow.jpg b/floppies/franc/noweb/edited/10-optflow.jpg new file mode 100644 index 0000000..e6f7431 Binary files /dev/null and b/floppies/franc/noweb/edited/10-optflow.jpg differ
diff --git a/floppies/franc/noweb/edited/11.jpg b/floppies/franc/noweb/edited/11.jpg new file mode 100644 index 0000000..80b16cd Binary files /dev/null and b/floppies/franc/noweb/edited/11.jpg differ
diff --git a/floppies/franc/noweb/edited/2-ableton-kinect.jpg b/floppies/franc/noweb/edited/2-ableton-kinect.jpg new file mode 100644 index 0000000..ad9faf2 Binary files /dev/null and b/floppies/franc/noweb/edited/2-ableton-kinect.jpg differ
diff --git a/floppies/franc/noweb/edited/5-pd.jpg b/floppies/franc/noweb/edited/5-pd.jpg new file mode 100644 index 0000000..b2aa1a9 Binary files /dev/null and b/floppies/franc/noweb/edited/5-pd.jpg differ
diff --git a/floppies/franc/noweb/edited/6-pd.jpg b/floppies/franc/noweb/edited/6-pd.jpg new file mode 100644 index 0000000..8712acf Binary files /dev/null and b/floppies/franc/noweb/edited/6-pd.jpg differ
diff --git a/floppies/franc/noweb/edited/7-optflow.jpg b/floppies/franc/noweb/edited/7-optflow.jpg new file mode 100644 index 0000000..1f70137 Binary files /dev/null and b/floppies/franc/noweb/edited/7-optflow.jpg differ
diff --git a/floppies/franc/noweb/edited/8-optflow.jpg b/floppies/franc/noweb/edited/8-optflow.jpg new file mode 100644 index 0000000..a3b4d1c Binary files /dev/null and b/floppies/franc/noweb/edited/8-optflow.jpg differ
diff --git a/floppies/franc/noweb/floppy-final2.png b/floppies/franc/noweb/floppy-final2.png new file mode 100644 index 0000000..6b33be6 Binary files /dev/null and b/floppies/franc/noweb/floppy-final2.png differ
diff --git a/floppies/franc/noweb/index.html b/floppies/franc/noweb/index.html index bc6fe43..700dec9 100644
--- a/floppies/franc/noweb/index.html
+++ b/floppies/franc/noweb/index.html
@@ -1 +1,651 @@
-GREAT JOB!
+Pushing the Score
+ +

+

+

+ + +
+

PUSHING THE SCORE

+
+ + +

+ CONCEPT ⟶ +
+ BODY ⟶ +
+ SAMPLES ⟶ +

+
+ + + +

+ www.issue.xpub.nl/02/ www.deplayer.nl/ +

+ + + + +

+ +
CONCEPT

+ +

+ +
BODY

+ +

+ +
SAMPLES

+ +
Technology is providing us with new ways to shape our perception of space, while at the same time turning our bodies into gadgets. This not only changes our spatial awareness but also extends our senses beyond what nature gave us. Moreover, control systems that regulate and command specific behaviours can be very practical tools for improving physical functionality or translating its data. This experiment, for instance, employs "Optical Flow", which detects the motion of image objects between frames, and "Open Sound Control (OSC)", which makes it possible to exchange and format data between different programs, for instance from Python to Puredata. Although the possibilities for overcoming human physical or cognitive limitations by plugging a body into an electronic or mechanical device are still largely hypothetical and may extend beyond our imagination, technology is continuously turning the abstract or fictional conception of "cybernetics" into something more concrete. The communication between automated and living systems keeps evolving, producing ever more sophisticated engineered tools that might enable us to increase our knowledge and reshape our perception through deeper experiences. In this experiment, the possibility of controlling data through motion in space, independent of physical contact, opens up new creative and pragmatic alternatives for facing both technological and communication constraints.

This body analyses human motion in space and detects it using "Optical Flow" in Python, by means of a series of predesigned multidirectional interpreters. These interpreters are made up of a series of points (intersections) forming a grid that movement intersects with. Motion is detected in the form of numeric values, which are automatically transmitted and formatted into a graphic array in Puredata. The array arranges these values and generates a polygonal waveform from the received coordinates ("x" ranging from 0 to 10 and "y" from -1 to 1). This drives an "oscillator" object, which defines the frequency of the tone, together with a "metro" object, which sets its duration in milliseconds and so keeps iterating the audio (re-writing it in the display). The intersections and the graphic array (together with the entire Puredata patch) become an interactive notation system, while people become the instrument/tool that triggers it.
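To make the mapping concrete, here is a condensed sketch of the sending side described above; it is an extract rather than a replacement for the full opt_flow.py script further down, and it reuses that script's assumptions: pyOSC (Python 2), an 8-pixel grid of intersections, and a Puredata patch listening for "/dot" messages on port 9001.

import OSC

# Puredata listens for OSC on this port (same as in the full script below)
client = OSC.OSCClient()
client.connect(('127.0.0.1', 9001))

def send_dot(x1, y1):
    # x1, y1: pixel position of a grid intersection where motion was detected
    normx = x1 / 8 - 4                # 38..118 px -> 0..10 (integer division, Python 2)
    normy = 1 - ((y1 / 8 - 4) / 3.0)  # 38..86 px  -> 1..-1
    msg = OSC.OSCMessage()            # one OSC message per moving intersection
    msg.setAddress("/dot")
    msg.append(normx)
    msg.append(normy)
    client.send(msg)                  # Puredata writes the pair into its graphic array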
X / Y INTERSECTIONS
SYSTEM: 1 PYTHON, 2 OPTICAL FLOW, 2 INTERSECTIONS, 3 OSC MSG SEND
5 PUREDATA EXTENDED, 6 OSC MSG RECEIVE, 7 GRAPHIC ARRAY, 8 OSC / METRO
X range = 0 to 10, Y range = -1 to 1, Waveform

OPTICAL FLOW

#!/usr/bin/env python
import numpy as np
import cv2, math
import video

help_message = '''
USAGE: opt_flow.py [<video_source>]

Keys:
 1 - toggle HSV flow visualization
 2 - toggle glitch
'''

# def draw_flow(img, flow, step=4): # size grid
#     h, w = img.shape[:2]
#     y, x = np.mgrid[step/2:h:step, step/2:w:step].reshape(2,-1)
#     fx, fy = flow[y,x].T
#     lines = np.vstack([x, y, x+fx, y+fy]).T.reshape(-1, 2, 2)
#     lines = np.int32(lines + 0.5)
#     vis = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)
#     cv2.polylines(vis, lines, 0, (0, 0, 255)) # BGR
#     for (x1, y1), (x2, y2) in lines:
#         cv2.circle(vis, (x1, y1), 1, (0, 255, 0), -1)
#     return vis

import OSC
# from pythonosc import osc_message_builder
# from pythonosc import udp_client
import time

def send_flow0(img, flow, step=4):  # size grid
    h, w = img.shape[:2]
    y, x = np.mgrid[step/2:h:step, step/2:w:step].reshape(2,-1)
    fx, fy = flow[y,x].T
    # print "fx, fy", fx, fy
    lines = np.vstack([x, y, x+fx, y+fy]).T.reshape(-1, 2, 2)
    lines = np.int32(lines + 0.5)
    vis = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)
    flines = []
    for (x1, y1), (x2, y2) in lines:
        # print ("y1", y1)
        # keep only the flow vectors that start on one of the predefined intersections
        if (x1 == 38 or x1 == 46 or x1 == 54 or x1 == 62 or x1 == 70
                or x1 == 78 or x1 == 86 or x1 == 94 or x1 == 102
                or x1 == 110 or x1 == 118) and y1 in range(38, 90, 8):
            flines.append(((x1, y1), (x2, y2)))
            normx = x1 / 8 - 4
            normy = 1 - ((y1 / 8 - 4) / 3.0)
            dx = x2 - x1
            dy = y2 - y1
            m = int(math.sqrt((dx*dx) + (dy*dy)))
            if m > 2:
                print ("dot", (normx, normy))
                msg = OSC.OSCMessage()
                msg.setAddress("/dot")
                # msg.append(dx)
                # msg.append(dy)
                # msg.append(m)
                msg.append(normx)
                msg.append(normy)
                client.send(msg)
                # client.send_message("/franc", m)
    flines = np.int32(flines)
    cv2.polylines(vis, flines, 0, (0, 40, 255))  # BGR
    for (x1, y1), (x2, y2) in flines:
        cv2.circle(vis, (x1, y1), 1, (0, 255, 0), -1)
    return vis

# cv2.rectangle(img, pt1, pt2, color[, thickness[, lineType[, shift]]])

if __name__ == '__main__':
    import sys
    print(help_message)
    try:
        fn = sys.argv[1]
    except:
        fn = 0

    # connect to pd: init the OSC client
    client = OSC.OSCClient()
    client.connect(('127.0.0.1', 9001))  # first argument is the IP of the host, second argument is the port to use
    # data = "hello"
    # client = udp_client.SimpleUDPClient("127.0.0.1", 9001)

    # connect camera
    # cam = video.create_capture(fn)
    cam = video.create_capture("0:size=160x120")  # canvas size in pixels
    ret, prev = cam.read()
    prevgray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    cur_glitch = prev.copy()

    while True:
        # print "GRAB FRAME"
        ret, img = cam.read()
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prevgray, gray, 0.5, 3, 15, 3, 5, 1.2, 0)
        prevgray = gray
        cv2.imshow('flow', send_flow0(gray, flow))
        ch = 0xFF & cv2.waitKey(5)
        if ch == 27:
            break
    cv2.destroyAllWindows()

To explore the connection between motion and sound, experiments were carried out with different software and tools, which substantially strengthened the material gathered in this project. For instance, the Kinect sensor was tested together with Synapse, which receives input data from the Kinect and sends it on to Ableton or Max MSP. Similarly, motion detection was explored alongside "color detection" in Puredata, which brought up more interesting alternatives. Sound recording and a feedback loop were also tested with this method, although mechanically it was hardly accurate. Finally, with "Optical Flow", the work was reconfigured into a broader way of interacting with data.

diff --git a/floppies/franc/noweb/noise.gif b/floppies/franc/noweb/noise.gif new file mode 100644 index 0000000..0b682e6 Binary files /dev/null and b/floppies/franc/noweb/noise.gif differ
diff --git a/floppies/franc/noweb/pd.png b/floppies/franc/noweb/pd.png new file mode 100644 index 0000000..1f579f6 Binary files /dev/null and b/floppies/franc/noweb/pd.png differ
diff --git a/floppies/franc/noweb/style.css b/floppies/franc/noweb/style.css new file mode 100644 index 0000000..def4122
--- /dev/null
+++ b/floppies/franc/noweb/style.css
@@ -0,0 +1,308 @@
+
+
\ No newline at end of file
diff --git a/floppies/margreet/.DS_Store b/floppies/margreet/.DS_Store new file mode 100644 index 0000000..2c7195a Binary files /dev/null and b/floppies/margreet/.DS_Store differ
diff --git a/floppies/margreet/LICENSE.txt b/floppies/margreet/LICENSE.txt new file mode 100644 index 0000000..11b3927
--- /dev/null
+++ b/floppies/margreet/LICENSE.txt
@@ -0,0 +1,23 @@
+A sonification of the Dutch elections March 2017; based on the hashtags #gestemd and #ikstem
+Creative Commons - Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
+
+You are free to:
+Share — copy and redistribute the material in any medium or format
+Adapt — remix, transform, and build upon the material
+for any purpose, even commercially.
+The licensor cannot revoke these freedoms as long as you follow the license terms.
+
+Under the following terms:
+Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
+ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
+
+More information can be found here:
+https://creativecommons.org/licenses/by-sa/4.0/
+
+///////////////////////////////////////////////////////////////////////////
+
+TGC (Terra Gamma Circulaire) scripts and config files
+WTFPL (Do What the Fuck You Want To Public License).
+
+More information can be found here:
+https://en.wikipedia.org/wiki/WTFPL
\ No newline at end of file
diff --git a/floppies/margreet/README.txt b/floppies/margreet/README.txt new file mode 100644 index 0000000..651cbfb
--- /dev/null
+++ b/floppies/margreet/README.txt
@@ -0,0 +1,17 @@
+Author: Margreet Riphagen
+Date: 2017
+Publication: Special Issue #2
+Publication launch: Tetra Gamma Circulaire #3 at De Player in Rotterdam (24th of March 2017)
+
+Title: A sonification of the Dutch elections March 2017; based on the hashtags #gestemd and #ikstem
+
+Description:
+In the run-up to the 2017 general elections in the Netherlands on Wednesday 15 March 2017, a lot of Twitter traffic was generated. Literally millions of tweets were sent that day over the Internet.
+
+This sonification entails three kinds of scores;
+a) for the whole tweet,
+b) for the hashtag ikstem (#ikstem), and
+c) for the hashtag gestemd (#gestemd).
+
+Thanks to:
+All PZI tutors, fellow students and Jan-Kees van Kampen
\ No newline at end of file
diff --git a/floppies/margreet/main.pd b/floppies/margreet/main.pd new file mode 100644 index 0000000..5d4eef1
--- /dev/null
+++ b/floppies/margreet/main.pd
@@ -0,0 +1,123 @@
+#N canvas 457 24 739 687 10;
+#X declare -lib unpackOSC;
+#X text -22 232 #ikstem;
+#X text 55 233 #gestemd;
+#X obj -69 492 dac~;
+#X obj -37 134 unpack s s s;
+#X obj 441 -2 loadbang;
+#X obj -36 -28 import unpackOSC;
+#X msg 441 31 \; pd dsp 1;
+#X obj -37 40 mrpeach/udpreceive 127.0.0.1 4000;
+#X obj -37 66 mrpeach/unpackOSC;
+#X obj 191 87 unpackOSC;
+#X obj 191 64 udpreceive 127.0.0.1 4000;
+#X text 160 318 comment;
+#X obj 157 254 bng 15 250 50 0 empty empty empty 0 -6 0 8 -262144 -1
+-1;
+#X obj 89 254 bng 15 250 50 0 empty empty empty 0 -6 0 8 -262144 -1
+-1;
+#X text 45 133 tag \, hashtag \, time;
+#X obj -3 162 select #ikstem #gestemd both;
+#X obj 0 252 bng 15 250 50 0 empty empty empty 0 -6 0 8 -262144 -1
+-1;
+#N canvas 692 710 276 254 ikstemenv 0;
+#X obj 72 80 line~;
+#X msg 81 45 0 50;
+#X msg 72 -15 1 1;
+#X text 99 -13 1 msec attack;
+#X text 118 44 50 msec release;
+#X obj 88 15 del 10;
+#X obj 72 -44 inlet;
+#X obj 25 170 outlet~;
+#X obj 25 -45 inlet~;
+#X obj 25 146 *~;
+#X text 134 14 10 msec sustain;
+#X connect 0 0 9 1;
+#X connect 1 0 0 0;
+#X connect 2 0 0 0;
+#X connect 5 0 1 0;
+#X connect 6 0 2 0;
+#X connect 8 0 9 0;
+#X connect 9 0 7 0;
+#X restore -69 344 pd ikstemenv;
+#X text -16 289 comment;
+#N canvas 692 710 276 254 bothenv 0;
+#X obj 72 80 line~;
+#X obj 72 -44 inlet;
+#X obj 25 170 outlet~;
+#X obj 25 -45 inlet~;
+#X obj 25 146 *~;
+#X text 137 44 5 second release;
+#X text 114 -14 500 msec attack;
+#X msg 72 -15 1 50;
+#X obj 88 15 del 100;
+#X text 144 15 100 msec sustain;
+#X msg 81 45 0 2000;
+#X connect 0 0 4 1;
+#X connect 1 0 7 0;
+#X connect 3 0 4 0;
+#X connect 4 0 2 0;
+#X connect 7 0 0 0;
+#X connect 8 0 10 0;
+#X connect 10 0 0 0;
+#X restore 100 344 pd bothenv;
+#X obj 100 467 snapshot~;
+#X obj 112 435 metro 100;
+#X floatatom 100 493 0 0 0 0 - - -;
+#N canvas 692 710 276 254 gestemdenv 0;
+#X obj 72 80 line~;
+#X text 99 -13 1 msec attack;
+#X text 118 44 50 msec release;
+#X obj 72 -44 inlet;
+#X obj 25 170 outlet~;
+#X obj 25 -45 inlet~;
+#X obj 25 146 *~;
+#X text 134 14 10 msec sustain;
+#X msg 81 45 0 700;
+#X obj 88 15 del 100;
+#X msg 72 -15 1 10;
+#X connect 0 0 6 1;
+#X connect 3 0 10 0;
+#X connect 5 0 6 0;
+#X connect 6 0 4 0;
+#X connect 8 0 0 0;
+#X connect 9 0 8 0;
+#X connect 10 0 0 0;
+#X restore 14 344 pd gestemdenv;
+#X obj 125 409 tgl 15 0 empty empty empty 17 7 0 10 -262144 -1 -1 0
+1;
+#X text -69 250 triggers;
+#X text 142 233 both;
+#X text 146 410 monitor output on/off;
+#X text -41 5 apperently \, on OSX \, one sometimes needs to append
+mrpeach/ ...;
+#X text 190 45 normally \, this should suffice:;
+#X obj 101 286 osc~ 150;
+#X obj -69 286 osc~ 7000;
+#X obj 14 286 osc~ 500;
+#X text 174 296 sinewaves envelopes (triggered by incoming OSC) double
+click pd ..env objects to change envelopes;
+#X connect 3 1 15 0;
+#X connect 4 0 6 0;
+#X connect 7 0 8 0;
+#X connect 8 0 3 0;
+#X connect 10 0 9 0;
+#X connect 12 0 19 1;
+#X connect 13 0 23 1;
+#X connect 15 0 16 0;
+#X connect 15 1 13 0;
+#X connect 15 2 12 0;
+#X connect 16 0 17 1;
+#X connect 17 0 2 0;
+#X connect 17 0 2 1;
+#X connect 19 0 20 0;
+#X connect 19 0 2 0;
+#X connect 19 0 2 1;
+#X connect 20 0 22 0;
+#X connect 21 0 20 0;
+#X connect 23 0 2 0;
+#X connect 23 0 2 1;
+#X connect 24 0 21 0;
+#X connect 30 0 19 0;
+#X connect 31 0 17 0;
+#X connect 32 0 23 0;
diff --git a/floppies/margreet/main.py b/floppies/margreet/main.py new file mode 100644 index 0000000..350ad53
--- /dev/null
+++ b/floppies/margreet/main.py
@@ -0,0 +1,68 @@
+from __future__ import print_function
+import csv, os, sys
+from datetime import datetime
+from time import sleep
+import OSC
+
+# open a connection to pd
+client = OSC.OSCClient()
+address = '127.0.0.1', 4000 # 57120==SC
+client.connect( address ) # set the address for all following messages
+print ("1.client stderr", file=sys.stderr)
+
+msg = OSC.OSCMessage() # OSCresponder name: '/touch'
+msg.setAddress("/twitter-ikstem")
+#msg.append('hello from python')
+#client.send(msg)
+
+#os.system('xzcat /media/floppy/twittersonification.csv.xz > /tmp/twittersonification.csv') #floppydisk
+os.system('xzcat twittersonification.csv.xz > /tmp/twittersonification.csv') #lokaal
+
+then = None
+with open('/tmp/twittersonification.csv', 'rU') as csvfile:
+    file = csv.DictReader(csvfile)
+    print ("2.opening file stderr", file=sys.stderr)
+
+    #csv.DictReader(csvfile)(["time"] + ['time'])
+    i=0
+    sleep_time=1
+    for row in file:
+        print ("3.row stderr", file=sys.stderr)
+
+        i+=1
+        t = row['time']
+        print ("4.time stderr", file=sys.stderr)
+        t = float(t)
+        now = datetime.fromtimestamp(t) #[]dictreader reads the rowheader
+        #print now, row
+        if then:
+            ti = (now-then).total_seconds()
+            #print ti/100
+            sleep_time = ti/1000
+        print ("5.msg stderr", file=sys.stderr)
+
+        #msg.append( row['text'].lower() )
+        #client.send(msg)
+        #msg.clearData()
+
+        if "#ikstem" in row['text'].lower() and '#gestemd' in row['text'].lower():
+            msg.append( ['both', str(now)] )
+        elif '#ikstem' in row['text'].lower():
+            msg.append( ['#ikstem', str(now) ])
+        elif '#gestemd' in row['text'].lower():
+            msg.append( ['#gestemd', str(now)] )
+
+        # msg.append( [ str(row['text']) ] )
+
+        # send an osc message to pd
+        #print msg#[ i, row['text'], str(now)]
+        sleep(sleep_time)
+        client.send(msg)
+        msg.clearData()
+        then = now
+print ("6.last print stderr", file=sys.stderr)
diff --git a/floppies/margreet/noweb/.DS_Store b/floppies/margreet/noweb/.DS_Store new file mode 100644 index 0000000..5008ddf Binary files /dev/null and b/floppies/margreet/noweb/.DS_Store differ
diff --git a/floppies/margreet/noweb/index.html b/floppies/margreet/noweb/index.html new file mode 100644 index 0000000..f3d6093
--- /dev/null
+++ b/floppies/margreet/noweb/index.html
@@ -0,0 +1,236 @@
+twitter sonification
+Sonification of the Dutch elections 2017
+

Sonification of the Dutch elections 2017

+ +

In the run-up to the 2017 general elections in the Netherlands on Wednesday 15 March 2017, a lot of Twitter1 traffic was generated. Literally millions of tweets were sent that day over the Internet.

+

+
+

The online political battle

+ +

En meteen is daar de tweet: #waarwasBuma2 +(And right after there was the tweet: #wherewasBuma)

+ +

This research is inspired by an article in the NRC (27 February 2017) about the online political battle. Just like in the United States, parties attempt to reach voters through social media and to frame their political opponents. The parties are using social media more than ever in this year's elections, trying to convey their message to an unprecedented number of undecided voters and to frame opponents with catchy slogans or hashtags.

+ +

Seeing all these tweets pass by so quickly reminds me of a cascade of data, and that became the starting point of a sonification3 that perceptualizes this huge amount of data in a score.
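As a reading aid, the sketch below condenses the pacing idea behind that cascade, taken from the main.py script included earlier in this diff: the real interval between two consecutive tweets is divided by 1000 and used as the pause before the next trigger, so a whole election day of Twitter traffic plays back as an accelerated stream.

import csv
from datetime import datetime
from time import sleep

then = None
with open('/tmp/twittersonification.csv', 'rU') as csvfile:   # path used by main.py
    for row in csv.DictReader(csvfile):
        now = datetime.fromtimestamp(float(row['time']))
        if then is not None:
            # compress the time axis: 1000 real seconds become 1 playback second
            sleep((now - then).total_seconds() / 1000)
        # ...build and send the OSC message for this tweet here (see main.py)...
        then = now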

+
+ +

Pushing the score

+ +

This sonification entails three kinds of scores: a) for the whole tweet, b) for the hashtag ikstem (#ikstem), and c) for the hashtag gestemd (#gestemd).

+ +

a) It captures tweets sent during the day of the elections, between 07.30 and 21.00, when the polling stations were open. In total 47,613 tweets were captured. Some examples of the tweets sent:

+ +

Ik wist niet wat ik moest stemmen, dus heb ik uiteindelijk maar een bootje gevouwen van het stembiljet #ikstem #tk2017 #gestemd https://t.co/KBqLBkYrpV
(I didn't know what to vote for, so in the end I just folded the ballot paper into a little boat.)

+ +

Grappig! RT @Mvan_berkel: In Leiden is rekening gehouden met zwevende kiezers. #ikstem #TweedeKamerverkiezingen https://t.co/Un8uJfNZ0v
(Funny! RT @Mvan_berkel: In Leiden they have made allowances for floating voters.)

+ +

Met volle trotst en vrolijkheid voor de eerste keer gaan stemmen vandaag #ikstem
(Voting for the first time today, with great pride and joy.)

+ +

b + c) A hashtag is used to group relevant topics by keyword or phrase, making it easier to find and follow tweets from people who are talking about the same thing. In this sonification I used #ikstem and #gestemd. Each is given a different kind of sound.
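The sketch below condenses the tagging logic of main.py (earlier in this diff): each tweet is classified by its hashtags and sent to Pure Data as an OSC message on port 4000, where the patch's [select #ikstem #gestemd both] object routes it to one of the three sounds.

import OSC

client = OSC.OSCClient()
client.connect(('127.0.0.1', 4000))      # the port main.pd listens on (udpreceive 4000)

def classify_and_send(text, timestamp):
    # tag the tweet text and forward it to the Pure Data patch
    text = text.lower()
    msg = OSC.OSCMessage()
    msg.setAddress("/twitter-ikstem")    # OSC address used by main.py
    if '#ikstem' in text and '#gestemd' in text:
        msg.append(['both', timestamp])
    elif '#ikstem' in text:
        msg.append(['#ikstem', timestamp])
    elif '#gestemd' in text:
        msg.append(['#gestemd', timestamp])
    client.send(msg)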

+
+
+

Used hardware and software

+ +

Python, specific libraries: csv, os and OSC

+ +

Pure Data

+ +

RaspberryPi and Pi skin conductivity

+ +

Flyer: https://issue.xpub.nl/02/

+
+ +

References

+
+
+
  1. Twitter is an online news and social networking service where users post and interact with messages, "tweets," restricted to 140 characters (http://twitter.com).
  2. "En meteen is daar de tweet: #waarwasBuma" (And immediately the tweet appears: #wherewasBuma), De politieke strijd online (The political battle online), by Andreas Kouwenhoven & Hugo Logtenberg, NRC, 27 February 2017, 21:05 (https://www.nrc.nl/nieuws/2017/02/27/en-meteen-is-daar-de-tweet-waarwasbuma-7033073-a1547979).
  3. Sonification is the use of non-speech audio to convey information or perceptualize data. Auditory perception (the sensory system for the sense of hearing) has advantages in temporal, spatial, amplitude, and frequency resolution that open possibilities as an alternative to visualization techniques.
+
+
+
+
\ No newline at end of file
diff --git a/floppies/margreet/noweb/stemicoon.svg b/floppies/margreet/noweb/stemicoon.svg new file mode 100644 index 0000000..3048c7e
--- /dev/null
+++ b/floppies/margreet/noweb/stemicoon.svg
@@ -0,0 +1 @@
+Asset 11
\ No newline at end of file
diff --git a/var/www/static/404-floppy-not-found.html b/var/www/static/404-floppy-not-found.html index 4559f26..78dc27f 100644
@@ -3,6 +3,7 @@
 INSERT FLOPPY!!!
+