Merge branch 'master' of git.xpub.nl:/var/www/git.xpub.nl/repos/tgc3

master
Nadine Rotem-Stibbe 7 years ago
commit f581fd2f4a

.DS_Store (binary, not shown)

.gitignore

@ -71,4 +71,5 @@
*.pyc
.DS_Store
*DS_Store
._*

@ -1,17 +0,0 @@
<!DOCTYPE html>
<html>
<head>
<link rel="stylesheet" type="text/css" href="style.css">
<title>The fine line - Existentialism and spirituality</title>
</head>
<body>
<div id="wrap">
</div>
</body>
</html>

@ -15,7 +15,7 @@
<p>This is a conceptual project based on a reflection around the sentence “The <b>FINE LINE</b> between <b>NOTHING MATTERS</b> and <b>EVERYTHING MATTERS</b>”. It took me a while to figure out how to translate this sentence into something. At the beginning, everything had to make perfect sense and had to have a reason to be. Among mind maps, research, crazy complicated ideas, and loads of thought, it became obvious that the concept was being explored only from an “EVERYTHING MATTERS” perspective. Half of the sentence, “<b>NOTHING MATTERS</b>”, was being left behind. But how to make a project based on meaninglessness? Suddenly, it clicked: <b>ANYWAY, WE ARE GOING TO DIE.</b> This sentence emptied the project of any meaning it could have, placing me on the other side of the line. And then a process began: my mind tried to <b>MAKE SENSE</b> of everything, saying things like “If everyone thought this way, society would be aimless”. Then, on purpose, I jumped to the other side: “Yes, whatever; even so, we are going to die”. Quickly my mind tried to fix it again with other excuses in order to find meaning. And rushing, again on purpose, I shifted sides. This happened several times. I could feel the <b>TENSION</b>, the <b>POLARITIES</b>, the <b>OPPOSITION</b>: I was feeling the line between everything matters and nothing matters. </p>
<p style="margin-bottom:30px;">The idea of this project is to represent this line digitally and sonorously, and to let the audience play with it. <b>SEEKING BALANCE AND BREAKING IT</b>. <b>HARMONIOUS</b> sounds and the <b>DISRUPTION</b> of them. Experiencing how close <b>OPPOSITES</b> can be to each other. </p>
<ul><a href="../fine-line/index.html" style="font-size:30px">EXPLORE THE LINE</a></ul>
<ul><a href="fine-line/index.html" style="font-size:30px">EXPLORE THE LINE</a></ul>
</div>
</body>
</html>

@ -220,7 +220,7 @@ if(mouseY <= height/2) {
background(255,255,255,opac); // alpha
// draw the shape of the waveform
drawWaveform();
//drawWaveform();
}
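The explanation page above frames the interaction as harmonious sound on the line and disruption away from it, which the p5.js sketch realises with the mouse position and a waveform display. Purely as an illustration (this is not the project's sketch.js, and every number in it is an arbitrary choice), the same idea can be sketched in Python by detuning a second oscillator against a 440 Hz tone in proportion to the distance from the line and writing the result to a WAV file:

#!/usr/bin/env python
# Illustrative only: the further you are from the "line", the more a second
# oscillator is detuned against 440 Hz, moving from harmony to audible beating.
import math, struct, wave

SR = 44100

def render(offset_from_line, seconds=1.0, path="line.wav"):
    # offset_from_line in [0, 1]; 0 = on the line, 1 = far from it
    detune = 88.0 * offset_from_line          # up to ~88 Hz of detune (arbitrary)
    frames = []
    for n in range(int(SR * seconds)):
        t = n / float(SR)
        sample = 0.4 * math.sin(2 * math.pi * 440 * t) \
               + 0.4 * math.sin(2 * math.pi * (440 + detune) * t)
        frames.append(struct.pack("<h", int(sample * 0.5 * 32767)))
    w = wave.open(path, "wb")
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(SR)
    w.writeframes(b"".join(frames))
    w.close()

render(0.0, path="on_the_line.wav")    # a single, harmonious tone
render(1.0, path="off_the_line.wav")   # the two oscillators beat against each other

On the line the two oscillators coincide; off it, beating becomes audible, which is roughly the balance/disruption contrast the text describes.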

@ -4,16 +4,16 @@
<link rel="stylesheet" type="text/css" href="style.css">
<script language="javascript" type="text/javascript" src="../fine-line/libraries/p5.min.js"></script>
<script language="javascript" src="../fine-line/libraries/p5.dom.min.js"></script>
<script language="javascript" src="../fine-line/libraries/p5.sound.min.js"></script>
<script language="javascript" type="text/javascript" src="sketch.js"></script>
<!-- <script language="javascript" type="text/javascript" src="fine-line/libraries/p5.min.js"></script>
<script language="javascript" src="fine-line/libraries/p5.dom.min.js"></script>
<script language="javascript" src="fine-line/libraries/p5.sound.min.js"></script>
<script language="javascript" type="text/javascript" src="sketch.js"></script> -->
<title>The fine line</title>
</head>
<body>
<div id="canvasp5"></div>
<!-- <div id="canvasp5"></div> -->
<div id="wrap">
@ -28,7 +28,7 @@
<!--<ul><a href="essay.html">Existentialism and Spirituality</a></ul>-->
<ul><a href="explanation.html">Presentation of the line</a></ul>
<!--<ul><a href="score.html">Score to find the line</a></ul>-->
<ul><a href="../fine-line/index.html" style="font-size:30px">THE FINE LINE</a></ul>
<ul><a href="fine-line/index.html" style="font-size:30px">THE FINE LINE</a></ul>
</li>

@ -1,17 +0,0 @@
<!DOCTYPE html>
<html>
<head>
<link rel="stylesheet" type="text/css" href="style.css">
<title>The fine line - Score</title>
</head>
<body>
<div id="wrap">
</div>
</body>
</html>

@ -1 +1,22 @@
FLOPPYLEFT - 2017
Nietzsche Public License v0.6
Copyright <2017> <Giulia de Giovanelli>
Copyright, like God, is dead. Let its corpse serve only to guard against its
resurrection. You may do anything with this work that copyright law would
normally restrict so long as you retain the above notice(s), this license, and
the following misquote and disclaimer of warranty with all redistributed
copies, modified or verbatim. You may also replace this license with the Open
Works License, available at the http://owl.apotheon.org website.
Copyright is dead. Copyright remains dead, and we have killed it. How
shall we comfort ourselves, the murderers of all murderers? What was
holiest and mightiest of all that the world of censorship has yet owned has
bled to death under our knives: who will wipe this blood off us? What
water is there for us to clean ourselves? What festivals of atonement,
what sacred games shall we have to invent? Is not the greatness of this
deed too great for us? Must we ourselves not become authors simply to
appear worthy of it?
- apologies to Friedrich Wilhelm Nietzsche
No warranty is implied by distribution under the terms of this license.

@ -4,13 +4,13 @@ Title: Adopt A Walk
Description:
By stealing the walk of another person, do I become someone else?
This is an audio guide for an experiment in gait analysis.
If you never heard about this term, gait analysis is the study of patterns of walk during ambulation used by new surveillance biometric technologies.
If you have never heard of the term, gait analysis is the study of walking patterns as used by new biometric surveillance technologies.
People are asked to walk following a series of spoken instructions. The walks are stored temporarily on a page where you're invited to “adopt a walk” of another person.
“Have you ever tried to look at the way a person walk as a way to identify her?
Surveillance technologies are using the homogenic perception of human being as a model for their mechanics.”
“Have you ever tried to identify someone by the way they walk?
Surveillance technologies use a homogenised perception of human beings as a model for their mechanics.”
With this experiment you're invited to observe the characteristics of the walks of your fellows and adopt them. Try to imagine that by modifying your walk you could escape the detection of your identity.
With this experiment you're encouraged to observe the characteristics of other walks and adopt them.
In adopting a different walk, do you become someone else?
Will it be your identity or someone else's that is detected by these surveillance algorithms?
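As a rough sketch of the capture flow this description implies, and reusing the parameters that appear in the scripts further down (espeak prompts in the en-gb voice, 50 frames at 320x240 saved under /var/www/static/gait), the loop below approximates one recording pass. It is not the project's own code, and it assumes OpenCV 3+ and a camera at index 0:

#!/usr/bin/env python
# Sketch of one "adopt a walk" recording pass: speak an instruction,
# grab ~50 frames from the camera, store them under a timestamped name.
import os, subprocess, time
import cv2

CLIP_DIR = "/var/www/static/gait"   # where index.cgi looks for clips

subprocess.call(["espeak", "Walk towards me in a straight line.",
                 "-v", "en-gb", "-s", "150"])

basename = os.path.join(CLIP_DIR, time.strftime("%Y-%m-%d-%H-%M-%S"))
cam = cv2.VideoCapture(0)
out = cv2.VideoWriter(basename + ".avi",
                      cv2.VideoWriter_fourcc(*"XVID"), 4, (320, 240))
for _ in range(50):
    ok, frame = cam.read()
    if not ok:
        break
    out.write(cv2.resize(frame, (320, 240)))
out.release()
cam.release()

The real pipeline in voiceguide.sh additionally converts the .avi to .mp4 and grabs a .jpg thumbnail with ffmpeg before index.cgi lists the clips.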

@ -1,12 +0,0 @@
#N canvas 296 315 450 300 10;
#X obj 37 104 osc~ 440;
#X obj 37 146 dac~;
#X obj 161 74 loadbang;
#X msg 161 111 \; pd dsp 1;
#X obj 37 36 netreceive 3000;
#X obj 46 62 print;
#X connect 0 0 1 0;
#X connect 0 0 1 1;
#X connect 2 0 3 0;
#X connect 4 0 5 0;
#X connect 4 0 0 0;

@ -1,12 +1,7 @@
#!/usr/bin/env python
#!/usr/bin/python
import os, random, time
import subprocess
while True:
    freq = str(random.randint(0,10)*110)
    print(freq)
    os.system('echo "'+freq+';" | pdsend 3000')
    time.sleep(0.25)
    subprocess.call(["scripts/voiceguide.sh"], cwd="/media/floppy")
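The deleted Pd patch above listens on [netreceive 3000], and the old main.py shells out to pdsend for every message. As a minimal sketch, assuming Pd is running locally with [netreceive 3000] in its default TCP mode, the same semicolon-terminated FUDI message can be sent from Python over a plain socket instead of spawning a shell each time:

#!/usr/bin/env python
# Sketch: send random frequencies straight to [netreceive 3000] in Pd.
import random, socket, time

s = socket.create_connection(("localhost", 3000))
try:
    while True:
        freq = random.randint(0, 10) * 110           # multiples of 110 Hz, as in main.py
        s.sendall(("%d;\n" % freq).encode("ascii"))  # FUDI: message terminated by ";"
        time.sleep(0.25)
finally:
    s.close()

Functionally this matches echo "440;" | pdsend 3000 without a subprocess per message.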

@ -5,7 +5,11 @@ import cgitb; cgitb.enable()
from jinja2 import Template
# Directory => ITEMS list (all files with a timestamp name, grouped)
ff = os.listdir("clips")
path = "/var/www/static/gait"
try:
    ff = os.listdir(path)
except OSError:
    ff = []
tpat = re.compile(r"^(\d\d\d\d)-(\d\d)-(\d\d)-(\d\d)-(\d\d)-(\d\d)")
items = {}
for f in ff:
@ -22,7 +26,8 @@ items = [items[key] for key in sorted(items, reverse=True)]
for i in items[10:]:
    for f in i.items():
        print "deleting ", f
        # 10 os.unlink(f)
        fp = os.path.join(path, f)
        os.unlink(fp)
# dump the data (debugging)
# print "Content-type: text/plain"
# print ""
@ -34,6 +39,7 @@ print ""
print Template(u"""<html>
<head>
<title>ADOPT A WALK</title>
<meta charset="utf-8">
<link rel="stylesheet" type="text/css" href="../styles/main.css">
</head>
<body>
@ -63,7 +69,7 @@ print Template(u"""<html>
<div class="movies">
{% for i in items %}
<a href="../clips/{{i.mp4}}"><img src="../clips/{{i.jpg}}" /></a>
<a href="/static/gait/{{i.mp4}}"><img src="/static/gait/{{i.jpg}}" /></a>
<p>{{i.mp4}}</p>
{% endfor %}
</div>

@ -1 +1,8 @@
GREAT JOB!
<html>
<head>
<meta http-equiv="refresh" content="0;url=/cgi-bin/index.cgi" />
</head>
<body>
<a href="/cgi-bin/index.cgi">start</a>
</body>
</html>

@ -1,6 +1,6 @@
@font-face {
font-family: "sporting_grotesque_gras-webfont";
src:"../fonts/sporting_grotesque_normal.otf";
src: url("../fonts/sporting_grotesque_normal.otf");
}
body {

@ -4,13 +4,15 @@ from __future__ import print_function
import cv2, os, sys, time
import numpy as np
from argparse import ArgumentParser
from picamera.array import PiRGBArray
from picamera import PiCamera
p = ArgumentParser("")
p.add_argument("--video", type=int, default=0, help="video, default: 0")
p.add_argument("--output", default=None, help="path to save movie, default: None (show live)")
p.add_argument("--width", type=int, default=640, help="pre-detect resize width")
p.add_argument("--height", type=int, default=480, help="pre-detect resize height")
p.add_argument("--width", type=int, default=160, help="pre-detect resize width")
p.add_argument("--height", type=int, default=128, help="pre-detect resize height")
p.add_argument("--fourcc", default="XVID", help="MJPG,mp4v,XVID")
p.add_argument("--framerate", type=float, default=25, help="output frame rate")
p.add_argument("--show", default=False, action="store_true")
@ -18,9 +20,9 @@ p.add_argument("--frames", type=int, default=100)
args = p.parse_args()
fourcc = None
cam = cv2.VideoCapture(args.video)
cam.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, args.width)
cam.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, args.height)
#cam = cv2.VideoCapture(args.video)
#cam.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, args.width)
#cam.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, args.height)
if args.output:
    try:
@ -32,22 +34,31 @@ if args.output:
else:
    out = None
while True:
    ret, prev = cam.read()
    prevgray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    if prevgray.shape == (args.height, args.width):
        break
print ("Starting camera", file=sys.stderr)
cam = PiCamera()
framesize = (args.width, args.height)
cam.resolution = framesize
cam.framerate = 32
rawCapture = PiRGBArray(cam, size=framesize)
# allow the camera to warmup
time.sleep(0.25)
count = 0
try:
    while True:
        ret, frame = cam.read()
    # while True:
    #     ret, frame = cam.read()
    for frame in cam.capture_continuous(rawCapture, format="bgr", use_video_port=True):
        # print "GRAB FRAME"
        frame = frame.array
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        ret, t = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
        frame = cv2.cvtColor(t, cv2.COLOR_GRAY2BGR)
        # flow = cv2.calcOpticalFlowFarneback(prevgray, gray, 0.5, 3, 15, 3, 5, 1.2, 0)
        # prevgray = gray
        # clear the stream in preparation for the next frame (important for picamera!)
        rawCapture.truncate(0)
        if out != None:
            out.write(frame)

@ -0,0 +1,73 @@
#!/usr/bin/env python
from __future__ import print_function
import cv2, os, sys, time
import numpy as np
from argparse import ArgumentParser

p = ArgumentParser("")
p.add_argument("--video", type=int, default=0, help="video, default: 0")
p.add_argument("--output", default=None, help="path to save movie, default: None (show live)")
p.add_argument("--width", type=int, default=640, help="pre-detect resize width")
p.add_argument("--height", type=int, default=480, help="pre-detect resize height")
p.add_argument("--fourcc", default="XVID", help="MJPG,mp4v,XVID")
p.add_argument("--framerate", type=float, default=25, help="output frame rate")
p.add_argument("--show", default=False, action="store_true")
p.add_argument("--frames", type=int, default=100)
args = p.parse_args()

fourcc = None
cam = cv2.VideoCapture(args.video)
cam.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, args.width)
cam.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, args.height)

if args.output:
    try:
        fourcc = cv2.cv.CV_FOURCC(*args.fourcc)
    except AttributeError:
        fourcc = cv2.VideoWriter_fourcc(*args.fourcc)
    out = cv2.VideoWriter()
    out.open(args.output, fourcc, args.framerate, (args.width, args.height))
else:
    out = None

while True:
    ret, prev = cam.read()
    prevgray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    if prevgray.shape == (args.height, args.width):
        break

count = 0
try:
    while True:
        ret, frame = cam.read()
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        ret, t = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
        frame = cv2.cvtColor(t, cv2.COLOR_GRAY2BGR)
        # flow = cv2.calcOpticalFlowFarneback(prevgray, gray, 0.5, 3, 15, 3, 5, 1.2, 0)
        # prevgray = gray
        if out != None:
            out.write(frame)
        count += 1
        if args.show:
            cv2.imshow('display', frame)
            if cv2.waitKey(5) & 0xFF == ord('q'):
                break
        if args.frames != None:
            if (count >= args.frames):
                break
except KeyboardInterrupt:
    pass

print ("\nCleaning up... Wrote", count, "frames")
if out:
    out.release()
if args.show:
    cv2.destroyAllWindows()

@ -1,22 +1,27 @@
v=-v en-gb+f5 -s 150
espeak "Tetra Gamma Gait Analysis " -v en-gb +f5 -s 150
# ensure the record folder exists
mkdir -p /var/www/static/gait
v=-v en-gb -s 150
espeak "Tetra Gamma Gait Analysis " -v en-gb -s 150
sleep 1
espeak "Be ready for the security check." -v en-gb+f4 -s 150
espeak "Be ready for the security check." -v en-gb -s 150
sleep 1
espeak "Please state your name:" -v en+f4 -s 150
espeak "Please state your name:" -v en-gb -s 150
sleep 1
espeak "Position yourself 2 to 3 meters away from me." -v en-gb+f4 -s 150
espeak "Position yourself 2 to 3 meters away from me." -v en-gb -s 150
sleep 2
espeak "Walk towards me in a straight line ." -v en+f4 -s 150
sleep 0.2
espeak "Walk towards me in a straight line ." -v en-gb -s 150
sleep 0.1
mpv sweep_up.wav
basename=clips/$(date +%Y-%m-%d-%H-%M-%S)
basename=/var/www/static/gait/$(date +%Y-%m-%d-%H-%M-%S)
echo recording $basename.avi...
python scripts/recordwalk.py --output $basename.avi --frames 50 --framerate 4 --width 320 --height 240
# convert to mp4
@ -27,13 +32,16 @@ ffmpeg -i $basename.avi -vframes 1 -ss 0.5 -y $basename.jpg
mpv sweep_up.wav
espeak "Position yourself 2 to 3 meters away from me." -v en-gb -s 150
sleep 2
espeak "Walk towards me on a zig zag line.
" -v en+f4 -s175
sleep 0.2
" -v en-gb -s150
sleep 0.1
mpv sweep_up.wav
basename=clips/$(date +%Y-%m-%d-%H-%M-%S)
basename=/var/www/static/gait/$(date +%Y-%m-%d-%H-%M-%S)
echo recording $basename.avi...
python scripts/recordwalk.py --output $basename.avi --frames 50 --framerate 4 --width 320 --height 240
# convert to mp4
@ -44,9 +52,7 @@ ffmpeg -i $basename.avi -vframes 1 -ss 0.5 -y $basename.jpg
mpv sweep_up.wav
espeak "Thank you for your cooperation" -v en+f4 -s175
espeak "Thank you for your cooperation" -v en-gb -s175
sleep 1
mpv sweep_up.wav

@ -1,14 +1,15 @@
#N canvas 553 37 553 723 10;
#X declare -lib unpackOSC;
#X text 360 432 attack;
#X text 431 433 release;
#X obj 362 569 line~;
#N canvas 398 23 553 723 10;
#X declare -lib OSC;
#X declare -lib net;
#X text 372 516 attack;
#X text 443 517 release;
#X obj 374 653 line~;
#X obj 176 638 *~;
#X obj 361 452 bng 15 250 50 0 empty empty empty 0 -6 0 8 -262144 -1
#X obj 373 536 bng 15 250 50 0 empty empty empty 0 -6 0 8 -262144 -1
-1;
#X obj 435 455 bng 15 250 50 0 empty empty empty 0 -6 0 8 -262144 -1
#X obj 447 539 bng 15 250 50 0 empty empty empty 0 -6 0 8 -262144 -1
-1;
#X msg 371 480 stop;
#X msg 383 564 stop;
#X text 70 455 #ikstem;
#X text 188 454 #gestemd;
#X obj 176 693 dac~;
@ -16,45 +17,58 @@
#X obj 19 243 print;
#X obj 133 345 select #ikstem both;
#X text 264 345 #gestemd;
#X obj 175 539 osc~ 450;
#X obj 361 503 del 50;
#X msg 432 526 0 250;
#X msg 296 503 30 30;
#X obj 476 483 del 50;
#X msg 361 527 1 500;
#X msg 195 477 400;
#X msg 68 480 150;
#X obj 373 587 del 50;
#X msg 444 610 0 250;
#X msg 308 587 30 30;
#X obj 488 567 del 50;
#X msg 373 611 1 500;
#X obj 212 198 loadbang;
#X obj 364 23 import unpackOSC;
#X msg 211 244 \; pd dsp 1;
#X text 16 34 comment;
#X text 176 446 comment;
#X obj 174 596 *~ 10;
#X obj 34 92 unpackOSC;
#X obj 118 585 *~ 10;
#X obj 34 48 udpreceive 127.0.0.1 4000;
#X connect 2 0 3 0;
#X obj 34 92 unpackOSC;
#X obj 364 23 import OSC;
#X obj 365 55 import net;
#X obj 149 148 print;
#X obj 182 539 osc~ 880;
#X msg 195 477 880;
#X obj 271 500 line~;
#X msg 266 434 1 10;
#X msg 318 494 0 30;
#X obj 318 451 b;
#X obj 318 473 delay 10;
#X msg 68 480 220;
#X text 16 34;
#X connect 3 0 9 0;
#X connect 3 0 9 1;
#X connect 4 0 16 0;
#X connect 4 0 14 0;
#X connect 4 0 17 0;
#X connect 4 0 15 0;
#X connect 4 0 18 0;
#X connect 5 0 16 0;
#X connect 5 0 15 0;
#X connect 5 0 6 0;
#X connect 6 0 15 0;
#X connect 10 1 11 0;
#X connect 6 0 14 0;
#X connect 10 1 12 0;
#X connect 10 2 11 0;
#X connect 12 0 21 0;
#X connect 12 0 4 0;
#X connect 12 1 20 0;
#X connect 12 1 4 0;
#X connect 14 0 27 0;
#X connect 15 0 19 0;
#X connect 12 0 35 0;
#X connect 12 0 31 0;
#X connect 12 1 29 0;
#X connect 12 1 31 0;
#X connect 14 0 18 0;
#X connect 15 0 2 0;
#X connect 16 0 2 0;
#X connect 17 0 2 0;
#X connect 18 0 16 0;
#X connect 19 0 2 0;
#X connect 20 0 14 0;
#X connect 21 0 14 0;
#X connect 27 0 3 0;
#X connect 28 0 10 0;
#X connect 17 0 15 0;
#X connect 18 0 2 0;
#X connect 19 0 20 0;
#X connect 23 0 24 0;
#X connect 24 0 10 0;
#X connect 24 0 27 0;
#X connect 28 0 22 0;
#X connect 28 0 3 0;
#X connect 29 0 28 0;
#X connect 30 0 3 1;
#X connect 31 0 30 0;
#X connect 31 0 33 0;
#X connect 32 0 30 0;
#X connect 33 0 34 0;
#X connect 34 0 32 0;
#X connect 35 0 28 0;

@ -42,7 +42,7 @@ with open('/tmp/twittersonification.csv', 'rU') as csvfile:
if then:
    ti = (now-then).total_seconds()
    #print ti/100
    sleep_time = ti/1000
    sleep_time = ti/40
print ("5.msg stderr", file=sys.stderr)

@ -1 +1,19 @@
FLOPPYLEFT - 2017
Creative Commons License
This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.
You are free to:
Share — copy and redistribute the material in any medium or format
Adapt — remix, transform, and build upon the material
for any purpose, even commercially.
The licensor cannot revoke these freedoms as long as you follow the license terms.
Under the following terms:
Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
Notices:
You do not have to comply with the license for elements of the material in the public domain or where your use is permitted by an applicable exception or limitation.
No warranties are given. The license may not give you all of the permissions necessary for your intended use. For example, other rights such as publicity, privacy, or moral rights may limit how you use the material.

@ -1,7 +1,11 @@
Author: Slavoj Žižek
Date: 1989
Title: The Sublime Object of Floppy
Author: Max Franklin
Date: 2017
Title: euclid
Description:
And so on, and so on, and so on.
A chaotic software and hardware synthesiser and generative sequencer, designed to explore improvisation and musical interactivity.
To play, touch the metal contacts, connecting them. The more conductive you are, the more you will be able to affect the instrument.
The state of your composition is recorded and displayed here as a downloadable score, never to be played nor heard again.
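The score lines that euclid writes out (see the score.txt diff further down, e.g. "r4 16 9 0;") read like Euclidean rhythm parameters: steps, pulses and rotation per voice, with "melody" and "bass" lines holding note numbers. That interpretation is an assumption on my part, but if it holds, the rhythm patterns can be regenerated with a standard Euclidean/Bjorklund-style spread; a minimal sketch:

#!/usr/bin/env python
# Assumed reading of an "rN steps pulses rotation" score line as a Euclidean rhythm.
def euclid(steps, pulses, rotation=0):
    # Bresenham-style spread: maximally even placement of `pulses` onsets
    # over `steps`, equivalent to Bjorklund's algorithm up to rotation.
    pattern = [1 if (i * pulses) % steps < pulses else 0 for i in range(steps)]
    return pattern[rotation:] + pattern[:rotation]

print(euclid(16, 9))   # e.g. the line "r4 16 9 0;" -> 9 hits over 16 steps

How the melody and bass numbers map to pitches is not recoverable from this diff, so they are left alone here.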

@ -1,4 +1,4 @@
#N canvas -95 222 2148 1345 10;
#N canvas 154 24 2148 1345 10;
#X declare -lib net;
#X declare -lib osc;
#X obj -7084 50 unpackOSC;
@ -698,9 +698,9 @@ Max 1000ms;
#X obj 199 241 f;
#X obj 232 240 + 1;
#X obj 199 432 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 3;
-1 -1 0;
#X obj 199 634 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 1;
-1 -1 0;
#X obj 251 305 int;
#X obj 304 310 int;
#X floatatom 307 365 5 0 0 0 - - -;
@ -762,9 +762,9 @@ Max 1000ms;
#X obj 199 241 f;
#X obj 232 240 + 1;
#X obj 199 432 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 7;
-1 -1 12;
#X obj 199 634 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 7;
-1 -1 11;
#X obj 251 305 int;
#X obj 304 310 int;
#X floatatom 307 365 5 0 0 0 - - -;
@ -826,9 +826,9 @@ Max 1000ms;
#X obj 199 241 f;
#X obj 232 240 + 1;
#X obj 199 432 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 8;
-1 -1 9;
#X obj 199 634 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 7;
-1 -1 9;
#X obj 251 305 int;
#X obj 304 310 int;
#X floatatom 307 365 5 0 0 0 - - -;
@ -890,9 +890,9 @@ Max 1000ms;
#X obj 199 241 f;
#X obj 232 240 + 1;
#X obj 199 432 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 16;
-1 -1 9;
#X obj 199 634 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 16;
-1 -1 8;
#X obj 251 305 int;
#X obj 304 310 int;
#X floatatom 307 365 5 0 0 0 - - -;
@ -1116,9 +1116,9 @@ Max 1000ms;
#X obj 199 241 f;
#X obj 232 240 + 1;
#X obj 199 432 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 2;
-1 -1 1;
#X obj 199 634 hradio 15 1 0 18 empty empty empty 0 -8 0 10 -262144
-1 -1 2;
-1 -1 1;
#X obj 251 305 int;
#X obj 304 310 int;
#X floatatom 307 365 5 0 0 0 - - -;
@ -1695,6 +1695,7 @@ Max 1000ms;
#X obj -5548 157 metro 10000;
#X obj -7084 26 udpreceive 127.0.0.1 3000;
#X msg -5510 640 write noweb/score/score.txt;
#X msg -6589 64 \; pd dsp 1 \;;
#X connect 0 0 1 0;
#X connect 1 0 144 0;
#X connect 1 1 145 0;
@ -1710,6 +1711,7 @@ Max 1000ms;
#X connect 7 0 214 0;
#X connect 7 0 212 0;
#X connect 7 0 235 0;
#X connect 7 0 238 0;
#X connect 9 0 10 0;
#X connect 9 0 33 0;
#X connect 9 0 104 0;

@ -1,7 +1,7 @@
r4 16 7 0;
r4 18 6 0;
r3 18 7 0;
r2 9 2 0;
r1 7 8 0;
melody 51 43 34 26 37 43 33;
bass 73 75 74 86 87 70 86;
r4 16 9 0;
r4 16 10 0;
r3 17 7 0;
r2 7 5 0;
r1 6 6 0;
melody 71 63 76 85 89 80 82;
bass 64 62 72 68 60 57 45;

@ -6,7 +6,7 @@ FLOPPY="/media/floppy"
MAINPY="${FLOPPY}/main.py"
PYRUN="python ${MAINPY}"
MAINPD="${FLOPPY}/main.pd"
PDRUN="pd -lib import -path /usr/local/lib/pd-externals/net/ -path /usr/local/lib/pd-externals/osc/ -lib osc -lib udpreceive -oss -r 48000 -rt -nogui ${MAINPD}"
PDRUN="pd -lib import -path /usr/local/lib/pd-externals/net/ -path /usr/local/lib/pd-externals/osc/ -path /usr/lib/pd/extra/cyclone -lib osc -lib udpreceive -oss -r 48000 -rt -nogui ${MAINPD}"
stdbuf -oL -- udevadm monitor --udev -p ${FD} | while read -r -- STATE _ _ _ _
do
@ -35,11 +35,15 @@ do
pkill -9 python
fi
if [ "$(pgrep -f '^pd')" ]
then
echo "pd still running ... killing"
pkill -9 pd
fi
then
echo "pd still running ... killing"
pkill -9 pd
fi
if [ "$(pgrep -f '^mpv')" ]
then
echo "mpv still running ... killing"
pkill -9 mpv
fi
fi
fi
done
