# OuNuPo Make

Software experiments for the OuNuPo bookscanner, part of Special Issue 5

https://issue.xpub.nl/05/

https://xpub.nl/

## License

## Authors

Natasha Berting, Angeliki Diakrousi, Joca van der Horst, Alexander Roidl, Alice Strete and Zalán Szakács.

## Clone Repository

`git clone https://git.xpub.nl/repos/OuNuPo-make.git`

## General dependencies

* Python3
* GNU make
* Python3 NLTK: `pip3 install nltk`

# Make commands

## N+7 (example): Author

Description: Replaces every noun with the 7th noun that follows it in a dictionary. Inspired by the Oulipo technique of the same name. A minimal sketch of the idea follows the dependency list below.

run: `make N+7`

Specific Dependencies:

* a
* b
* c
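As an illustration only (this is not the N+7 script shipped in this repository), a minimal Python 3 sketch of the constraint could look like this. Using NLTK's `words` corpus as the dictionary, filtering on `NN*` tags and tokenizing with `word_tokenize` are assumptions made for the example.

```python
# Hypothetical sketch of the N+7 constraint, not the script in this repo.
# Assumes nltk.download('punkt'), nltk.download('averaged_perceptron_tagger')
# and nltk.download('words') have been run.
import bisect

from nltk import pos_tag, word_tokenize
from nltk.corpus import words

# An alphabetically sorted word list stands in for "the dictionary".
DICTIONARY = sorted(set(w.lower() for w in words.words()))

def n_plus_7(text, offset=7):
    """Replace every noun with the word `offset` entries further on in the list."""
    out = []
    for token, tag in pos_tag(word_tokenize(text)):
        if tag.startswith('NN'):  # NN, NNS, NNP, NNPS: nouns only
            i = bisect.bisect_left(DICTIONARY, token.lower())
            out.append(DICTIONARY[(i + offset) % len(DICTIONARY)])
        else:
            out.append(token)
    return ' '.join(out)

if __name__ == '__main__':
    print(n_plus_7("The scanner reads a page of the book."))
```

Only the nouns are swapped; punctuation and all other words pass through unchanged. The actual make target may tokenize, tag and look words up differently.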
## Sitting inside a pocket(sphinx): Angeliki

Description: Speech recognition feedback loops using the first sentence of a scanned text as input.

run: `make ttssr-human-only`

Specific Dependencies:

* PocketSphinx package: `sudo aptitude install pocketsphinx pocketsphinx-en-us`
* PocketSphinx Python module: `sudo pip3 install PocketSphinx`
* Build dependencies: `sudo apt-get install gcc automake autoconf libtool bison swig python-dev libpulse-dev`
* SpeechRecognition: `sudo pip3 install SpeechRecognition`
* termcolor: `sudo pip3 install termcolor`
* PyAudio: `pip3 install pyaudio`

## Reading the Structure: Joca

Description: Uses OCR'ed text as input and labels each word for part of speech, stopwords and sentiment. It then generates a reading interface in which words with a specific label are hidden. The output can be saved as a poster or exported as JSON containing the full data set.

run: `make reading_structure`

Specific Dependencies:

* NLTK (http://www.nltk.org/install.html)
* NLTK modules: nltk.tokenize.punkt, ne_chunk, pos_tag, word_tokenize, sentiment.vader
* NLTK data: nltk.download('vader_lexicon') (https://www.nltk.org/data.html)
* weasyprint (http://weasyprint.readthedocs.io/en/latest/install.html)
* jinja2 (http://jinja.pocoo.org/docs/2.10/intro/#installation)
* font: PT Sans (open-source font, https://www.fontsquirrel.com/fonts/pt-serif)
* font: Ubuntu Mono (open-source font, https://www.fontsquirrel.com/fonts/ubuntu-mono)

## Erase / Replace: Natasha

Description: Receives your scanned pages in order, then analyzes each image and its vocabulary. It finds and crops the least common words and either erases them or replaces them with the most common words, producing a PDF of increasingly distorted scan images.

for the erase script run: `make erase`

for the replace script run: `make replace`

Specific Dependencies:

* NLTK English corpus:
  * run the NLTK downloader: `python -m nltk.downloader`
  * select the "Corpora" menu
  * select "stopwords"
  * click "Download"
* Python Imaging Library (PIL): `pip3 install Pillow`
* PDF generation for Python (FPDF): `pip3 install fpdf`
* html5lib: `pip3 install html5lib`

Notes & Bugs: This script is very picky about the input images it can work with. For best results, use high-resolution images in the RGB colorspace. Errors can occur when image modes do not match or when Tesseract cannot successfully produce HOCR files.

## carlandre: Alice

Description: Generates concrete poetry from a text file. If you are connected to a printer located at /dev/usb/lp0, you can print the poem.

run: `make carlandre`

Dependencies:

* pytest (documentation: https://docs.pytest.org/en/latest/getting-started.html)

## over/under: Alice

Description: An interpreted programming language, written in Python 3, which translates basic weaving instructions into code and applies them to text.

run: `make overunder`

Instructions: over/under works with specific commands that execute specific instructions. When it runs, an interpreter opens:

* To load your text, type 'load'. This is necessary before any other instructions. Every time you load the text, the previous instructions are discarded.
* To see the line you are currently on, type 'show'.
* To start your pattern, type 'over' or 'under', each followed by an integer and separated by commas, e.g. over 5, under 5, over 6, under 10.
* To move on to the next line of text, press enter twice.
* To see your pattern, type 'pattern'.
* To save your pattern to a text file, type 'save'.
* To leave the program, type 'quit'.

## oulibot: Alex

Description: Chatbot that helps you write a poem based on the text you insert, by giving you constraints. A minimal irc.bot plumbing sketch follows the dependency list below.

run: `make oulibot`

Dependencies:

* irc.bot: `pip3 install irc_client`
* NLTK: `pip3 install nltk && python3 -m nltk.downloader`
* rake_nltk: `pip3 install rake_nltk`
* nltk.tokenize, nltk.corpus (NLTK modules)
* textblob: `pip3 install textblob`
* PIL: `pip3 install Pillow`
* numpy: `pip3 install numpy`
* tweepy: `pip3 install tweepy`
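The dependency list above names irc.bot. Purely to illustrate that plumbing, and not oulibot's actual behaviour, a minimal bot skeleton might look like the following; the server, channel, nickname and the placeholder constraint reply are all invented for this example.

```python
# Hypothetical irc.bot skeleton, not the oulibot shipped in this repository.
# Assumes the package providing the irc.bot module is installed (see above).
import irc.bot


class PoemBotSketch(irc.bot.SingleServerIRCBot):
    def __init__(self, channel, nickname, server, port=6667):
        super().__init__([(server, port)], nickname, nickname)
        self.channel = channel

    def on_welcome(self, connection, event):
        # Join the channel once the server accepts the connection.
        connection.join(self.channel)

    def on_pubmsg(self, connection, event):
        # Reply to every public message with a placeholder constraint;
        # oulibot's real constraint logic (rake_nltk, textblob, ...) is not
        # reproduced here.
        tokens = event.arguments[0].split()
        first_word = tokens[0] if tokens else "poem"
        connection.privmsg(self.channel,
                           "Constraint: write a line that avoids the word '%s'." % first_word)


if __name__ == '__main__':
    # Invented connection details, for illustration only.
    PoemBotSketch('#oulibot-test', 'oulibot-sketch', 'irc.example.org').start()
```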