End of Year Video Abstract

Now that I have put together my video abstract for this year’s project I am able to reflect on my concept and process retrospectively. Overall I am pleased with the outcome. I have only scratched the surface of the broader subject of digital and machine intimacy, and having spent the past year working within this conceptual area I am leaving with more questions and possible areas of exploration that I want to revisit.

I think I did well this year to narrow my focus exclusively to the relationship between textiles and digital agents. Had I chosen a wider area of focus I would quickly have become overwhelmed, as this area of thought has a great many sub-strands and is still emerging conceptually as technology continues to advance. Contrasting digital agents with the historical significance of textiles in human intimacy and textile production allowed me to hone in on a very specific form of human interaction with the outside world, one that translated well into physical computing and the emerging field of capacitive and digital textiles.

Looking back to the start of the year my concept has changed somewhat, but I have managed to stay within the realms of what I initially set out to do. At first I was very focused on the communal aspect of textile creation in history and on exploring the human form through the capacitive textiles I was making. As I progressed into more digital avenues of exploration, however, my focus shifted to the intermediary processes between the textile sensors and my digital outputs (i.e. Arduino, data, machine learning, etc.), and while working with these intermediary processes I discovered a great deal of interesting and meaningful interactions.

I learned a great deal of practical skills from this project. My physical fabrication skills have become stronger as a result of the very hands-on process I chose this year, and my programming abilities have improved. I want to continue to develop these skills as I work on future projects. In terms of my overall workflow, this year I made a conscious decision to move away from extensive passive research and towards hands-on material and production experiments. This has been a breakthrough of sorts, as it allowed me to get out of my own head and let the process inform my next steps.

*Subtitles for the video are available via the YouTube caption facility.*

Experimenting with Clay

As part of my final bout of experimentation with my mechanical arm I wanted to revisit clay as a material, because it can be directly imbued with touch. I wanted to re-explore touch-based output because of the connection between my capacitive textiles and touch input; I felt there was potential for some sort of poetic expression of touch running from the input to the output of the entire process.

After mounting my motors, the arm’s movement is more powerful and has more force behind it, although the range of movement has been reduced because the motors are now stationary. This limits the overall range of expression but makes the arm itself more stable. In future I would like to explore ways of mounting the motors on runners, which would allow for a greater range of movement.

This iteration of the arm was designed to be used with a pen for drawing sequences, but it can still be used to create physical imprints in clay. To get around some of the arm’s physical and design constraints I decided to give it clay tools, allowing it to create a wider range of marks.

While running through the clay sequence with my data sets I realised that this arm design, while it can still be used with clay, is not optimised for the purpose. It struggled to create clear imprints and had a tendency to launch the pieces of clay off its surface and across the room rather than exhibiting a smooth mark-making process. This is an issue because the behaviour is caused by the arm itself rather than the data sets driving the motors. In future, when making mechanical arms or devices intended to work with clay and other malleable materials, I need to design this functionality in throughout the fabrication process; multi-functionality and multi-expression are things I aim to build into future contraptions.

Studio Run and Documentation

This week I took more time to install my mechanical arm in the studio and give it another few run-throughs, in order to collect output drawings and to observe and troubleshoot the overall process. I have now been able to run through all of my data sets several times with the mechanical drawing arm, so I can compare the drawings from the different data sets. While there are still some technical and hardware tweaks to be made to reduce accidental variation in the way the arm moves, I can already see some differences between the data sets at this early stage.

At this point I have run through all three of my data sets and have mechanically drawn outputs for each, which means I can visually compare the outputs against one another. Granted, there are still some variables that need to be taken care of (i.e. stability of the arm, mounting the motors so they are stationary, etc.), but despite this I can still see variations between each data set in the drawings and throughout the run-time of the mechanical arm. For example:

Ailsa’s dataset: short pattern lengths, zig-zagging motions, broad strokes

Jordanne’s dataset: longer sweeps, tight looping, random directions, stays stationary for longer periods of time

Laurajane’s dataset: more chaotic movement, longer sweeps, straighter lines

Dataset: Laurajane [076097117114097106097110101013101]

Dataset: Jordanne [106111114100097110110013010]

Dataset: Ailsa [097105108115097]

• Tutor feedback
• Troubleshooting hardware
• Ordering servo brackets
• Output results
• Material experiments: different paper, wet paper, etc.

After having spoken with my tutors and running through the drawing process a few more times, I realise that it is important for me to manage the physical variables (i.e. mount the motors, make the arm more stable, etc.) so that I am able to explore and visualise the data to the fullest. My next steps will be mounting my motors securely on the drawing surface with brackets, to give the arm increased stability, and working through any troubleshooting that comes up.

While experimenting with the mechanical arm I wanted to conduct some material experiments. Something that bothered me was that in my previous drawings there was no way of knowing whether the pen had been stationary for a period of time. To overcome this I soaked some heavyweight paper in water so that the ink could bleed and flow, allowing for more expressive forms of mark making. This worked well in giving the pen and ink additional mark-making capacity and led to some interesting results. While observing the drawing process I noticed that the paper would only stay wet for around 5-10 minutes, which at first I thought would be an issue. After allowing the arm to go through a few cycles with the wet paper, however, I realised that the drying paper was actually adding another layer of storytelling to the marks: it highlighted where the pen started drawing and where it ended, and gave some indication of how long the entire drawing process took and whether some parts of the drawing took more time than others. I will likely experiment with other pens and mark-making methods, but I think the wet paper technique worked well and I will be revisiting it in future stages.

So far I am pleased with the results of my mechanical arm outputs, but I know there are some additional tweaks to be made. I need to solder the motor circuit so that the wiring and components are stable and the circuit can be made into a smaller, more durable form factor. Once I have fabricated my final circuit and finessed the mechanical arm I plan to give each data set another run-through, and it is these output drawings that I will take forward as realised solutions.

Studio Testing and Documentation

This week I have been parsing the user data I received back from field and postal use, and I now have three usable data sets that I can run through my mechanical arm, giving me three separate data streams to work with and compare.

Setting up the installation was fairly straightforward, largely because I had designed the mechanical arm to be portable and light. Once I set up and began running the arm I noticed a few tweaks that needed to be made. For a start, the arm still could not support its own weight, so I removed some of the heavier supports to reduce the strain. This eased the issue somewhat; however, having removed the supports I will have to be careful with the overall force from the motors to ensure the arm does not snap or bend. Weight distribution continues to be an issue and I will have to look into it further.

To document the arm I chose to take a series of shots of a real-time installation run, which meant I could review the installation holistically and see what was working and what needed further adaptation. I also noticed that the overall functionality of the mechanical arm was glitchy: it would stop working after 20 seconds, the motors wouldn’t respond to every data point, and so on. This was fixed by swapping the adapter between the Arduino board and my laptop for a non-Apple adapter and by adding an external battery.

    Real-time documentation of mechanical arm output, driven by touch data

Overall I am pleased with the outcome of this initial run-through; I think the mechanical arm creates an interesting series of marks, and it is interesting to see how the arm translates the human touch data I captured. At present there are commonalities between drawings from the same data set, but due to the continued troubleshooting the outputs still display some variation from one another. This should hopefully be rectified once I have finished troubleshooting the arm and it can run through a sequence with minimal human intervention.

    My next steps are to continue troubleshooting the output device and get it to a point where it can run without me having to make any tweaks or adjustments. Once I have achieved this I will perform and document a final series of drawings in a suitable space which will serve as the true baseline for what the touch data looks like on paper having been interpreted by the output device.

    Mechanical Arm Mounting and Data Streams

This week I have been parsing my user input data and constructing the finalised mechanical arm. My first step was to consult the Kinetics department in the Media Studio to discuss details of construction, choosing parts, and so on. They recommended that with this iteration of the mechanical arm I pay attention to its overall strength and integrity. In my previous models structural integrity and movement were key problems, so Kinetics recommended I use a more rigid material along with washers and spacers on my screws to maintain tension and reduce wear and tear in the hinges and moving parts (metal screws wear away at softer wooden parts; washers and metal channels for the screw holes negate this). Additionally, they recommended that I double up and overlap some of the parts to add rigidity to the arm.

I attached the mount to a foldable side table I had at home; the table is fairly sturdy and portable, which I wanted as I will be unable to store this in the studio due to COVID restrictions. The mount piece is from one of my previous prototypes and I haven’t made many drastic changes to its design. I resized it, added additional screw holes, and made multiple copies of the standing mount piece so that I could glue them together into one thick, solid piece.

When installing the mount I made a series of pilot holes and simply screwed the mount to the table. When constructing the arm I attached the standing mount piece so that the entire arm could be detached from the table, making it even more portable; this would also allow for interchangeable mechanical arm configurations. I used nylock nuts and spacers to get the desired tension in my hinges, which has taken some of the strain off the moving sections of the arm and will give it longer-lasting performance. I have also constructed a box to conceal my circuits.

I am happy with the results of this fabrication, although there are still some points that need troubleshooting. I need to revisit my method for mounting my motors, as the motors seem to be unscrewing the hinges instead of providing movement; I may have to address this by adding bell-cranks between the motor and the arm pieces. There is also some bowing in the arm due to its overall weight; the measures I took to strengthen the arm have helped offset this, but more needs to be done to support it. This is definitely going in the right direction, however, and I feel the piece will be finished after a few more tweaks.

    Below is the code which links my touch data from the input device I made to the motors, allowing me to create unique expressions using the mechanical arm for each user.

    https://github.com/Noxibus/dataToMotors

    //Processing code : CSV to Serial Write
    
    import processing.serial.*;
    Serial myPort; 
    
    Table table;
    TableRow row;
int timer = 0;
int timerDelay = 1000; // 1000 milliseconds / 1 second
int i = 0;             // current row index
float val;             // mapped value written out over serial
int mapMin = 0;
int mapMax = 2000;
    
    
void setup() {
  printArray(Serial.list());          // list the available ports once
  String portName = Serial.list()[5]; // change to the correct port
      myPort = new Serial(this, portName, 9600);
      size(500, 500); //testing screen
      background(0);
    
      //initialise the CSV file. 
      table = loadTable("DATALOG.csv", "header");
      row = table.getRow(0);
     
    }
    
void draw() {
  readCSV();
  myPort.write(int(val) + ";"); // send the value followed by the end-of-packet marker
  println(int(val));
}
    
    void readCSV() { 
      if (millis() >= timer) {
       
        timer = millis() + timerDelay;
    
        row = table.getRow(i); // get a new row of data
        val = map(row.getInt("Value"), mapMin, mapMax, 0, 180);
    
        // increment row counter
        i++;
    
        //  check to see if at end of data
        if (i == table.getRowCount()) {
          i = 0;//if so, loop back to first row
        }
      }
    }
    //Arduino Code: Processing incoming serial data and feeding data to motors
    // sourced from http://www.esologic.com/parsing-serial-data-sent-to-arduino/
    #include <Servo.h>
    
    // code to extract String of data from processing
    const char EOPmarker = ';'; //This is the end of packet marker
    char serialbuf[32]; //This gives the incoming serial some room. Change it if you want a longer incoming.
    
    #include <string.h> // we'll need this for subString
    #define MAX_STRING_LEN 20 // like 3 lines above, change as needed.
    
    
    Servo myServo;
    void setup() {  
      Serial.begin(9600); //start the serial comms
      myServo.attach(8);
    }
    
    void loop() {
    
      if (Serial.available() > 0) { //makes sure something is ready to be read
        static int bufpos = 0; //starts the buffer back at the first position in the incoming serial.read
    char inchar = Serial.read(); //reads one byte (Serial.read() only inputs one byte at a time)
        if (inchar != EOPmarker) { //if the incoming character is not the byte that is the incoming package ender
      serialbuf[bufpos] = inchar; //the buffer position in the array gets assigned the current read
      bufpos++; //once that has happened the buffer advances, doing this over and over until the end of packet marker is read.
        }
    
        else { //once the end of package marker has been read
          serialbuf[bufpos] = 0; //restart the buff
          bufpos = 0; //restart the position of the buff
    
          //atoi = convert string  to number.
      int val = atoi(subStr(serialbuf, ",", 1)); // receive the 1st variable, this is getting written to the motor
      Serial.println(val); //check what the step amount is; might need some conversions.
          myServo.write(val);
          delay(100);
    
        }
      }
    }
    // function needed to extract the string of data from processing.
    char* subStr (char* input_string, char *separator, int segment_number) {
      char *act, *sub, *ptr;
      static char copy[MAX_STRING_LEN];
      int i;
      strcpy(copy, input_string);
      for (i = 1, act = copy; i <= segment_number; i++, act = NULL) {
        sub = strtok_r(act, separator, &ptr);
        if (sub == NULL) break;
      }
      return sub;
    }

    Collecting User Data

Now that I have a functioning input device I am able to collect user data from people outwith the context of my physical installation. The device is rudimentary in nature but it fulfils the purpose I intended for it. I’ll admit this initial finished product is rough around the edges: when fabricating the enclosure I hadn’t initially intended to use the Trill breakout board, which meant there were additional wires and components to account for. The Trill board itself fit nicely alongside the datalogger and the circuit, but the wires took up a great deal of space within the enclosure, something to bear in mind for future projects. The battery is also bigger than I had accounted for, meaning it would not fit within the enclosure, so I mounted it to the outside using rubber bands and electrical tape.

Overall, I am happy with the functionality of the device, though in future I need to pay more attention to the overall form factor of my enclosures and their user interfaces. While this input device is friendlier than the stand-alone circuit, it still has some way to go in terms of usability and ease of use.

    In order to provide the most seamless experience for users I created this leaflet with a photographic step-by-step guide for setting up and using the input device. Given that the device itself still requires additional tweaks to make it completely user friendly I felt that providing a visual guide was essential if users were to interact with the device in any meaningful way.

    Input Device User Instruction Leaflet

For the process of collecting user data I decided both to meet people outdoors in COVID-safe settings and to post the kit to those further afield. I was able to meet with two users in the nearby vicinity and record them using the device first-hand. This was helpful as I could walk users through the process and see how they interacted with the device directly.

Users were intimidated at first by the appearance of the kit, which is understandable considering it still looks fairly mechanical. After they got to grips with the device, however, the process of use and data collection went smoothly. For the users I met outdoors I was able to direct them through the process and control the documentation of the usage, which definitely had a knock-on effect on the overall usability of the device and the parameters of use. I predict that when my postal users come to use the device I will still be able to collect data, but the overall process will require more remote troubleshooting than I would have initially wanted.

From my initial user experiments it was interesting to see how different people approached the input process and the ways in which they interacted with the piece of capacitive fabric. For the outdoor experiments I opted to use the same fabric sample to keep this variable constant and maintain consistency between tests with different users. This meant I could clearly see differences in how people manipulated the capacitive fabric reflected in the numerical data.

The nature of the capacitive fabric samples (i.e. colour, texture, size, malleability, etc.) encourages users to interact with them on a purely tactile basis. I think this method of collecting data is interesting because users may not be aware of their own style of interaction and the nuances with which they manipulate the touch surface; they respond to it in an almost subconscious manner, something which is reminiscent of automatism, the surrealist practices I have explored previously, and the relationship between machine art and surrealism.

Going into this process I expected users to interact with the samples in drastically different ways from one another; I was expecting extreme movements and perhaps some heavy-handedness. I think this personal bias comes from my time working with capacitive materials in front of a serial monitor, where I receive real-time feedback without intermediary processes and tend to subject my pieces to extremes in order to test them. I expected users to behave the same way, but instead the movements and touch they exhibited were generally more instinctive and displayed a less extreme range of movement. This will be interesting to observe when the data is connected to my drawing arm; my hypothesis is that these differences in touch and use will be displayed in the drawing outputs of the mechanical arm.

Working with users has also given me an idea for the naming conventions for my drawing outputs. Since I know the names of my users I can manipulate the characters in their names and reappropriate them into digital translations. I have translated the names of my users into ASCII numerical codes, as I feel this is a poetic descriptor for the final outputs of this entire mechanical process of transcribing human touch and intimacy into machine and digital constructs (a sketch of the translation follows the examples below).

    For example:

    Ailsa : 065 105 108 115 097 013 010 013 010

    Jordanne: 074 111 114 100 097 110 110 101 013 010

    Laurajane: 076 097 117 114 097 106 097 110 101 013 010 013 010

    Julie: 074 117 108 105 101 013 010 013 010 013 010

    Amber (me): 065 109 098 101 114 013 010 013 010 013 010 013 010

    (ASCII Translator found here)
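As a concrete illustration of the naming convention, here is a minimal sketch (plain C++, written for this post; the zero-padding and the trailing carriage-return/line-feed bytes are my reading of the 013 010 runs in the codes above) that translates a name into the same style of ASCII numerical code:

// Hypothetical sketch: translate a name into zero-padded ASCII codes,
// mirroring the naming convention above.
#include <cstdio>
#include <cstring>

int main() {
  const char *name = "Ailsa\r\n"; // the \r\n bytes account for the trailing 013 010
  for (size_t i = 0; i < strlen(name); i++) {
    printf("%03d ", name[i]); // zero-pad to three digits, e.g. 'A' -> 065
  }
  printf("\n");
  return 0;
}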

My next steps will be to get the input device returned to me with user data from one of my volunteers. Then I will take some time to create my own input example, so I can compare my data (that of someone who knows how the system works, the creator of the piece) against my users’. I want to do this so that I can draw conclusions about my own bias towards the system and push the parameters of the device.

Once I have received all of my data streams I will parse the data into a format compatible with my servo motors, and then run some real-time tests with my mechanical arm.
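The core of that parsing step is the same mapping my Processing sketch performs: scaling raw readings into the servo’s 0-180 degree range and clamping anything outside it. A minimal sketch of the idea (the 0-2000 input range mirrors the mapMin/mapMax values in my Processing code and is an assumption based on my observed readings):

#include <cstdio>

// Scale a raw capacitive reading into the servo's 0-180 degree range
// and clamp out-of-range spikes. The 0-2000 input range is an assumption.
int rawToAngle(long raw) {
  long angle = raw * 180L / 2000L; // linear rescale, same idea as Arduino's map()
  if (angle < 0) angle = 0;        // clamp below the range
  if (angle > 180) angle = 180;    // clamp above the range
  return (int)angle;
}

int main() {
  long samples[4] = {0, 512, 1900, 2400}; // example raw readings
  for (long s : samples) {
    printf("%ld -> %d degrees\n", s, rawToAngle(s));
  }
  return 0;
}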

    Sound credits: https://freesound.org/people/Mativve/sounds/416778/#comments

    Input Device Fabrication (cont.)

This week has been one of broad-spectrum troubleshooting across all areas of my project. Firstly, I have been working to fabricate my input device and get it to a point where it runs smoothly and is user friendly. Secondly, I have been working on ways to parse my capacitive data and send it out to the motors on my mechanical arms, so that I can get responsive real-world feedback from my touch-trace data. Lastly, I have been considering the physical fabrication and staging of my input device: how I am going to solder it, how I am going to present it within an enclosure, and how to walk users through the process of capturing their touch data.

I have been using Fritzing to visualise my input device circuit and potential PCB schematics. This has been helpful as it gives me a clear point of reference when moving between breadboards and static soldered circuit boards. I probably won’t make a full PCB-style circuit, but examining the potential applications of a PCB in the context of my input circuit has helped me visualise and problem-solve some aspects of my final circuit schematic.

My circuit schematic is based on the Teensy guideline circuits published by Bela for use with their Trill sensors. I used this method as it incorporates 4.7k resistors to make the sensors compatible with lower-power boards such as my Adalogger, which runs at 3.3v. Without the resistors in the circuit the Trill Craft and the Adalogger simply don’t talk to one another, which results in a series of errors being printed to the Serial Monitor.
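A quick way to check whether the two boards are talking at all is a generic I2C scanner sketch (a standard diagnostic, not something from the Bela guide): if the Trill is wired and powered correctly it should show up at an address.

// Generic I2C scanner: prints the address of every device that
// acknowledges on the bus. Useful for checking the Trill is visible
// before debugging anything else.
#include <Wire.h>

void setup() {
  Wire.begin();
  Serial.begin(115200);
  while (!Serial) {}
  for (byte addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) { // 0 = device acknowledged
      Serial.print("Found device at 0x");
      Serial.println(addr, HEX);
    }
  }
}

void loop() {}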

Bela have a useful Processing library which allows input data to be ported into Processing; the example sketches in the library (above) allow touch interactions from the Trill sensor to be easily visualised. At a later date I would like to experiment with this library more to create data visualisations driven by the Trill sensor, which could be useful for real-time works.

My first attempt at soldering the input circuit wasn’t successful, but having made this failed prototype I could do some hands-on problem solving and assess what the problem was. With this first circuit I thought my ground and power wires would be connected properly, given that I had kept them isolated as in my breadboard examples. The issue, I realised, was that there needed to be a spacer between the resistor and my power lines, which is what I implemented in the second circuit iteration, and that seemed to work. Additionally, I now realise that any insulating should happen only after I am certain the circuit is working and joined correctly, as the hot glue I used as an insulator concealed any potential breaks or gaps in the circuit.

My second circuit attempt was much more successful. I used lines of solder on the back of the board to act like the connection lines in the breadboard; these will be insulated before user use. I opted for wires with pre-installed headers for ease of installation during rapid prototyping, though for future iterations of this circuit I would use stripped wires as in the first prototype. A few lines within the circuit still require minor fixes, but for the most part this configuration seems to be successful.

The input device looks like it will fit inside the 3D-printed enclosure, albeit snugly. I had initially printed the enclosure with the intention of holding only a battery and the Adalogger, prior to the integration of the Trill Craft; thankfully I had printed several different lid types, and the lid featured in the images above gives some additional height and space to accommodate boards and parts. It is likely that I will have to shorten the wires to keep the internals of the device compact.

While testing my circuit I brought in capacitive fabric samples to test the overall functionality of the input device, which in itself raised some interesting issues. In the serial monitor I was able to see inputs from all pins, including the pin with my attached material. This was not reflected in the data file, however: in the SD card stream I was getting streams of 0s with no data inputs. This problem took some time to solve and I am still working towards a ‘perfect’ solution, but for now I have been able to patch up the cracks and create something with barebones functionality. Below is my initial piece of code, which was bringing back those SD card errors:

    //trill craft x adalogger
    //why is the sd card section of the code not working
    #include <SPI.h>
    #include <SD.h>
    #include <Trill.h>
    
    const int chipSelect = 4; //make sure this is the right pin for adalogger
    
    Trill trillSensor;
    int data = 0;
    
    void setup() {
      Serial.begin(115200);
    
      while (!Serial) {// wait for serial port to connect. Needed for native USB port only
      }
      Serial.print("Initializing SD card...");
      if (!SD.begin(chipSelect)) {
        Serial.println("Card failed, or not present");
        while (1);
      }
      Serial.println("card initialized.");
    
      int ret = trillSensor.setup(Trill::TRILL_CRAFT);
      if (ret != 0) {
        Serial.println("failed to initialise trillSensor");
        Serial.print("Error code: ");
        Serial.println(ret);
      }
    }
    
    void loop() {
      delay(100);
      trillSensor.requestRawData();
    
    //  while (trillSensor.rawDataAvailable() > 0) {
      while (trillSensor.rawDataAvailable()) {
        data = trillSensor.rawDataRead();
        if (data < 1000)
          Serial.print(0);
        if (data < 100)
          Serial.print(0);
        if (data < 10)
          Serial.print(0); //FOR LATER
        Serial.print(data);
        Serial.print(" ");
      }
     
  // File dataFile = SD.open("trillData.txt", FILE_WRITE);
  File dataFile = SD.open("datalog.txt", FILE_WRITE);
  if (dataFile) {
    // IF DATA > 10 = SAVE DATA (leave 0)
    dataFile.print(data); // only one line of data being printed to file
    dataFile.println(",");
    dataFile.close();
    Serial.println(data);
  } else {
    Serial.println("error opening file"); // error fixed by changing the name back to datalog, not sure why
  }
      delay(50);
    
    }

I went over to the Bela forum to ask whether other users had any experience isolating data from individual pins. I received a very in-depth response from the user @giuliomoro, who outlined some steps I could take using the Wire library on Arduino to isolate offset bytes and perform the pin readings manually. This process is quite technically involved and would take some time for me to learn, configure, and troubleshoot, so for now I am opting for the simpler solution of writing all elements of the data stream to the SD card. When I have time I would like to experiment with these more complex methods for a cleaner and more precise process.
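For reference, the general shape of that manual approach would be something like the sketch below. This is only my rough understanding of the suggestion; the I2C address, bytes-per-pin, and offsets are placeholders rather than the Trill Craft’s actual register layout.

// Rough sketch of reading one pin manually over I2C. The address and
// byte layout are placeholders, NOT the Trill Craft's documented registers.
#include <Wire.h>

const int DEVICE_ADDR = 0x30;  // placeholder device address
const int PIN_OF_INTEREST = 3; // which sensing channel to isolate
const int BYTES_PER_PIN = 2;   // assuming two bytes per reading

void setup() {
  Wire.begin();
  Serial.begin(115200);
}

void loop() {
  // request enough bytes to reach the pin we care about
  Wire.requestFrom(DEVICE_ADDR, (PIN_OF_INTEREST + 1) * BYTES_PER_PIN);
  // skip the offset bytes belonging to earlier pins
  for (int i = 0; i < PIN_OF_INTEREST * BYTES_PER_PIN; i++) {
    Wire.read();
  }
  int high = Wire.read(); // assemble the two bytes of the reading we want
  int low = Wire.read();
  int value = (high << 8) | low;
  Serial.println(value);
  delay(100);
}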

Here is what my code looks like now; I am receiving reliable data streams from the device. The drawback of this method is that I can only work with one piece of capacitive fabric at a time, as the data streams from all pins are blurred together into one stream. This is where the methods above would likely be useful.

    //trill craft x adalogger - ALMOST WORKING
    
    #include <SPI.h>
    #include <SD.h>
    #include <Trill.h>
    
    const int chipSelect = 4; //make sure this is the right pin for adalogger
    
    Trill trillSensor;
    int data = 0;
    
    void setup() {
      Serial.begin(115200);
    
      while (!Serial) {// wait for serial port to connect. Needed for native USB port only
      }
      Serial.print("Initializing SD card...");
      if (!SD.begin(chipSelect)) {
        Serial.println("Card failed, or not present");
        while (1);
      }
      Serial.println("card initialized.");
    
      int ret = trillSensor.setup(Trill::TRILL_CRAFT);
      if (ret != 0) {
        Serial.println("failed to initialise trillSensor");
        Serial.print("Error code: ");
        Serial.println(ret);
      }
    }
    
void loop() {
  delay(100);
  trillSensor.requestRawData();

  // this is the part that seems to have made things work:
  // opening and writing the file inside the read loop
  while (trillSensor.rawDataAvailable()) {
    data = trillSensor.rawDataRead();
    File dataFile = SD.open("datalog.txt", FILE_WRITE); // error fixed by changing the name back to datalog, not sure why
    if (dataFile) {
      if (data < 10) {
        Serial.print(0); // zero-pad single-digit values so entries stay aligned
        dataFile.print(0);
      }
      Serial.print(data);
      dataFile.print(data);
      Serial.print(",");
      dataFile.print(",");
      dataFile.close();
    }
    delay(50);
  }
}
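One possible later refinement, short of the full Wire-library approach, would be to tag each reading with a channel index as it is logged so that individual pieces of fabric could be separated out afterwards. A sketch of that idea, assuming rawDataRead() returns the channels in a fixed order on every requestRawData() cycle:

// Sketch only: log each reading as channel:value so pins can be
// separated afterwards. Assumes a fixed channel order per request.
#include <SPI.h>
#include <SD.h>
#include <Trill.h>

const int chipSelect = 4;
Trill trillSensor;

void setup() {
  Serial.begin(115200);
  if (!SD.begin(chipSelect)) {
    while (1); // halt if the card is missing
  }
  trillSensor.setup(Trill::TRILL_CRAFT);
}

void loop() {
  delay(100);
  trillSensor.requestRawData();
  File dataFile = SD.open("datalog.txt", FILE_WRITE);
  int channel = 0;
  while (trillSensor.rawDataAvailable()) {
    int data = trillSensor.rawDataRead();
    if (dataFile) {
      dataFile.print(channel); // channel index first...
      dataFile.print(":");
      dataFile.print(data);    // ...then the raw value
      dataFile.print(",");
    }
    channel++;
  }
  if (dataFile) {
    dataFile.println(); // one line per scan cycle
    dataFile.close();
  }
}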

My next steps with the input device are to have it fully functioning and ready for users. Once I am happy with the physical circuit and the code I will be meeting people outdoors in parks, where it is possible to socially distance, and posting the device to people who are able to document their process of using it. I will also be creating a set of photographic instructions to help users through the process.

The incorporation of the Adalogger, and remote data collection in line with social distancing more generally, has proven a bigger challenge than I initially thought it would be, and it has taken me quite some time to combine the different elements of the device, troubleshoot, and realise the user-facing elements of the process. This would undoubtedly have taken less time if I had been able to involve users in a real-time installation, as working with the Trill Craft is much more straightforward with the Arduino Uno and Processing data-parsing methods. While this process has been challenging, it has given me a chance to work with new tools and workflows, which I think is vital to the learning process, and I am at a point in the fabrication where I almost have a finished outcome.

    Mechanical Arm Drawings

In response to my tutorial feedback I have started documenting my mechanical arms during the drawing process. The first set-up I am documenting is my most basic arm prototype running on training data. I think this is a good piece to kick off the series, as it provides a baseline output to refer back to when I begin real-time documentation of my more bizarre configurations with the unpredictable data I will be receiving from my input device.

The marks made in this initial experiment are interesting in their repetition, and the variation in pen pressure across strokes is also worth noting. These visual qualities are things I will be studying as I expand my prototyping in this area.

My mechanical arms are somewhat removed from the engineered aesthetic and functionality of conventional mechanical devices. Instead, they bear more resemblance to human body parts: arm bones, hands, tendons, fingers. This links back to the practice of Renato Dib, which I explored earlier in the year, and the ways in which he uses textiles to emulate parts of the human body to explore facets of the human experience and intimacy. This is no accident; I wanted somewhat anthropomorphised configurations that create conceptual connections between humans and machines at a visual level, while the back-end processes of data parsing and classification interpret human touch and intimacy.

The visual output created by this mechanism is reminiscent of automatist Surrealist work, a relationship I think is worth exploring. I have explored automatism within the scope of art in previous projects and have found it to be very prevalent within a lot of new media art forms. Surrealist automatism refers to bodily movements and thoughts controlled by the subconscious mind (not to be confused with automatic processes in the body such as breathing). These phenomena were initially explored in the early field of psychology by psychoanalysts such as Sigmund Freud, who used automatic drawing and writing to explore the subconscious mind. Automatism was then explored in visual art by the likes of Max Ernst, Joan Miró, and André Masson.

Joan Miró, The Sun (El Sol), 1949 (Image from MoMA)

André Breton described the process of automatism as “…the dictation of thought in the absence of all control exercised by reason and outside all moral or aesthetic concerns.” (The Manifesto of Surrealism, 1924). I feel that these properties of automatism, namely the absence of control and of aesthetic concern, can be observed within the mechanical processes of new media and machine learning art.

Automatism and machine art intersect in an interesting way: machines do not have subconscious processes, they are comprised of the sets of instructions we have given them, and in a way they mirror our subconscious more than we may realise. Neural networks and ML training data are constructed from elements which contain human intent and bias, and this is visible in their output; a notable example is the increasing prevalence of human bias and prejudice being observed in AI systems.

André Masson (1896-1987), Automatic Drawing, 1938 (pen on paper)
André Masson (1896-1987), Automatic Drawing, 1924 (pen on paper)

    There is an undeniable cross-over between Surrealist practice and Machine Learning Art. Both processes rely upon processes of automatism and the abstraction of objects (or data) in order to create an output. Processes observed within Machine Learning and New Media Art bear some resemblance to the Surrealist technique of Frottage (popularised by Max Ernst in 1925) which involves a process of taking graphite/charcoal/pastel rubbings of objects and transferring these textures to canvas where the artist would then abstract and deconstruct the textures further to realise forms and shapes.

    My mechanical pieces and capacitive experiments exhibit a similar process of object remediation, abstraction, and transformation. This technique of Frottage can be loosely applied to the conceptual grounding of parsing touch traces through mechanical processes which further abstract and disseminate the data in order to undergo an artistic process.

    The Frottage process (Image: Philipp Schmitt)
Simplified machine learning art process (Image: Philipp Schmitt)

In an entry featured in Plot(s): The Journal of Design Studies [Volume V, 2018], Philipp Schmitt discusses the relationship between AI and Surrealism in the article Augmented Imagination: Machine Learning Art as Automatism. Schmitt details elements of Surrealist practice and compares the process of machine learning art to the process of Frottage, outlining similarities between the two (see above).

    Many images produced by artists using ML…are similar in their visceral evocative qualities, their organic “brushstrokes” and textures, their surreal creatures and objects. They are usually made to depict familiar objects and draw from visual material of this world, but have something alien to them that is highly fascinating.

Philipp Schmitt (2018), Augmented Imagination: Machine Learning Art as Automatism.

Machine learning and new media art offer a new Surrealist toolkit with which artists can draw from automatist, Surrealist, and technological practice alike. I am keen to explore the distinctly mechanical processes and mark-making procedures exhibited by my mechanical arms and to frame them in a somewhat Surrealist context.

    Further Reading:

    https://adht.parsons.edu/designstudies/plot/augmented-imagination-machine-learning-art-as-automatism/

    https://www.moma.org/learn/moma_learning/themes/surrealism/tapping-the-subconscious-automatism-and-dreams/

    https://www.tate.org.uk/art/art-terms/a/automatism

    Capacitive Input Device

Having had some time to experiment with the Adafruit 32u4 Adalogger, I feel I am now able to incorporate more elements which will ultimately make for a smoother user experience when I hand out my input device. Since my previous experiments with the data logger and capacitive sensing I have realised that if I want to capture multiple inputs I will have to reconsider how I manage my capacitive sensors.

Prior to this I had been wiring my capacitive sensors to a resistor soldered between two wires. This method was causing a lot of interference and was prone to failure due to breaks in the wire around the soldered resistor. This made me consider breakout boards as an alternative solution for installing my capacitive sensors, and during tutorials Jen pointed me towards the Trill Craft breakout board, which relies on I2C communication. The Trill Craft is useful because it is adaptable and removes the need for a great deal of soldering, which saves me time during prototyping and troubleshooting.

Image from <https://learn.bela.io/products/trill/get-started-with-trill/#teensy>: This is the reference I am using to wire the Trill Craft to my BLE Sense and Adalogger

From my time getting to grips with the Trill Craft I have found it to be quite versatile, and it is compatible with a lot of different Arduino boards. So far I have tried it with my Arduino Uno, BLE Nano 33 Sense, and Adafruit 32u4 Adalogger. It is easiest to configure with the Uno, as the Bela website already has clear documentation for it: the site outlines which pins to use on the Uno for SDA and SCL communication, and the set-up is straightforward. Working with my other boards is slightly trickier as they run at a lower voltage than the Uno (3.3v instead of 5v), which means I have to work more with resistors in order to get a stable connection between the serial monitor and the capacitive sensing pins on the Trill Craft. The site has a guide for working with Teensy boards, which I have been using for these lower-voltage mini boards; much of the advice given for the Teensy translates to the BLE and the Adalogger.

The serial data stream from the Uno board is consistent, and clear variation can be seen when you touch the different pins on the board. The data streams from the Adalogger still need some tweaks: I have to play around with resistors because, due to the board’s lower power output, there are technical issues to address before I get a clear and responsive data stream from all the pins. At present it appears some of the pins are not connected to the stream, which can be seen below in the values that remain at 0 despite being touched. This is likely to be fixed by using the correct resistors.

On the Adalogger I have connected the Trill directly to the SDA and SCL pins, as the reference chart below shows no other appropriate I2C communication pins. The fact that I am getting any data back from these pins at all tells me I have made the right choice, and hopefully the whole issue can be resolved by paying attention to voltage levels and changing resistors.

Image from <https://learn.adafruit.com/assets/46241>: Reference sheet for the Adalogger

    *Artist reference – Machine Learning Classification – Plants <https://medium.com/@narner/talking-to-plants-touch%C3%A9-experiments-1087e1f04eb1>*

    https://www.instructables.com/Touche-for-Arduino-Advanced-touch-sensing/

    Revisiting Machine Learning

While I am waiting for my motor arm parts to be cut and my data logger to arrive in the mail, I want to use this week to focus on further experimentation with data visualisation and to revisit the machine learning methodology I was working with back in November. While I don’t want these machine learning processes to be the central focus of my project, I am intrigued by the potential of some methods in the ml5 library that I am yet to explore, and I would like to revisit gestural classification as a means to drive my mechanical output and data visualisation.

Back in November I had success with gestural classification using the BLE Sense board. I think this still has potential outwith its initial scope: instead of classifying hand gestures, I can repurpose the system to classify different types of interaction with my capacitive fabric (this would also probably work with my capacitive sensing examples which run off my Arduino Uno/Adafruit logger); a rough sketch of the idea follows below.
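As an illustration of what that repurposing could look like (a sketch written for this post, not my trained November model; the features, labels, and centroid values are all illustrative assumptions), a window of capacitive readings could be summarised into simple features and matched against per-interaction centroids:

// Sketch: nearest-centroid classification of a window of capacitive
// readings. Feature set, labels, and centroids are illustrative only.
#include <cstdio>

struct Features { float mean; float variance; };

// summarise a window of raw readings into mean and variance
Features extract(const float *window, int n) {
  float sum = 0, sq = 0;
  for (int i = 0; i < n; i++) sum += window[i];
  float mean = sum / n;
  for (int i = 0; i < n; i++) sq += (window[i] - mean) * (window[i] - mean);
  return { mean, sq / n };
}

// centroids would come from labelled recordings of each interaction style
const Features centroids[2] = { {200.0f, 50.0f}, {900.0f, 400.0f} };
const char *labels[2] = { "stroke", "press" };

int classify(Features f) {
  int best = 0;
  float bestDist = 1e9f;
  for (int c = 0; c < 2; c++) {
    float dm = f.mean - centroids[c].mean;
    float dv = f.variance - centroids[c].variance;
    float dist = dm * dm + dv * dv; // squared distance in feature space
    if (dist < bestDist) { bestDist = dist; best = c; }
  }
  return best;
}

int main() {
  float window[4] = {850, 920, 880, 940}; // a burst of raw readings
  Features f = extract(window, 4);
  printf("classified as: %s\n", labels[classify(f)]);
  return 0;
}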

Having taken some more time to explore the potential outputs of different ml5 algorithms, there are a few things I have found that interest me. While they may not be feasible to implement during my studio project, I think the methods within these systems can augment my studio practice and my practice after graduation.

There is a machine learning model available through the ml5 framework called Handpose which I think could have a potential application in my project. Rather than using touch-based sensing, which has underpinned my entire input process this year, it uses computer vision to detect hands in its field of view and assign a skeleton mesh to the captured hand. This could provide a new take on the identification of fabric interactions, and it would allow me to dip my toes into the field of computer vision, something I expressed interest in earlier this year.

    Screenshot from the ML5 Handpose library reference page, showing the model in use. The model can detect one hand at a time and maps out a skeleton mesh comprised of 21 points in 3D space.

One drawback of the Handpose model is that it can only track one hand at a time, which is problematic: when interacting with the capacitive textile samples people are likely to use two hands, which could potentially cause the system to fail and would mean that not all of the interaction data is tracked.

Another potentially interesting model available through the ml5 framework is the CVAE (Conditional Variational Auto-Encoder). The model learns to encode data into a smaller representation (similar to image compression). The documentation on the website describes the model as a variational auto-encoder with the ability to generate new images based upon the training data (which in my case could be my serial data).

    “Autoencoders are neural networks that are capable of creating sparse representations of the input data…What is even better is a variant that is called the variational auto-encoder that not only learns these sparse representations, but can also draw new images as well…”

While I am still unsure as to the feasibility of this as a workflow within this project, I think it is worth exploring. If this algorithm can create visual representations of numerical serial data it could be a way not only to create rich data visualisations, but to incorporate the computer’s ‘brain’, or intent, within the output of my work.