
For my final project, I constructed an interactive wall that showcases the multiplicity of narratives surrounding women, their pain, and their conflicted relationships with emotion.

I asked several women in my life to record themselves talking about their relationship with sadness and grief. Touching various points on the wall triggers audio clips from those interviews. You can see the progress of this project here and here.

Check out the video:

I was initially inspired by Charlotte Perkins Gilman’s short story “The Yellow Wallpaper.” Published in 1892, the story is written in the style of a diary kept by a woman who, failing to find joy in marriage and motherhood, is sent to live in a room alone in the country in an effort to “cure” her ineptitude. She wants to write, but her husband and her doctor forbid it. Confined to her bedroom, the protagonist watches the patterns on the faded yellow wallpaper come to life, eventually precipitating her descent into insanity.

Here’s a sampling of some of the things that were said:

“I think there’s a stereotype in our culture that emotions are bad and they are weak.”

“I expect myself to push forward and to be better and to move on.”

“I’ve come up with this theory that it’s best to feel whatever emotion feels most urgent at that time and if it’s sadness then so be it. I think that sitting with sadness and getting to know its roots — it’s huge.”

For the fabrication of the wall, I covered a canvas with some vintage floral wallpaper from the 1920s. I liked that wallpaper is something we associate with domesticity, a quality and a space that has historically been associated with women.

As I mentioned last week, behind the canvas is a web of wires connected to the SparkFun Capacitive Touch Sensor Breakout, driven over an I2C interface. The Arduino code was drawn largely from example code I found online at bildr for using the touch sensor. Using what we learned about serial communication, I was able to connect a p5.js sketch that plays the appropriate audio interviews when the corresponding flower is touched.
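
For reference, here is a rough sketch of what the Arduino side of a setup like this can look like. This is not the exact code I ran (mine was adapted from the bildr example); it uses Adafruit’s MPR121 library instead, and the flower count and LED pin numbers are placeholders:

#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();  // SparkFun's breakout is built around the MPR121
uint16_t lastTouched = 0;

const int NUM_FLOWERS = 6;                            // placeholder: number of wired flowers
const int ledPins[NUM_FLOWERS] = {3, 4, 5, 6, 7, 8};  // placeholder LED pins

void setup() {
  Serial.begin(9600);
  if (!cap.begin(0x5A)) {            // 0x5A is the MPR121's default I2C address
    while (1);                       // stop here if the sensor isn't found
  }
  for (int i = 0; i < NUM_FLOWERS; i++) {
    pinMode(ledPins[i], OUTPUT);
    digitalWrite(ledPins[i], HIGH);  // LEDs stay lit until a flower is touched
  }
}

void loop() {
  uint16_t touched = cap.touched();  // bitmask of the 12 electrodes

  for (int i = 0; i < NUM_FLOWERS; i++) {
    bool isTouched = touched & (1 << i);
    bool wasTouched = lastTouched & (1 << i);

    digitalWrite(ledPins[i], isTouched ? LOW : HIGH);  // LED goes dark while touched

    if (isTouched && !wasTouched) {
      Serial.println(i);             // tell the p5.js sketch which clip to start
    }
  }
  lastTouched = touched;
  delay(50);
}

On the p5.js side, the sketch just reads each incoming index over serial and plays the matching interview clip.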

After play testing the wall last week, I decided to also add LED lights behind each flower that turn off when you are touching them. I also added a pair of headphones in order to create an environment in which the participant/listener feels an intimacy with the speaker.

Overall, I was really happy with the quality of the audio that I got and the simplicity of the interaction. My friends and family who recorded themselves were generous and thoughtful. I think that comes through when you listen to the audio.


If I were to do this project again, I would change the fabrication of the wall to make it more elegant and beautiful. Right now the wall has a DIY feel – which I like – but if this were to become a real installation piece it would require some rethinking.


 

“Pain is everywhere and nowhere. Post-wounded women know that postures of pain play into limited and outmoded conceptions of womanhood…I know these dialects because I have spoken them; I know these post-wounded narrators because I have written them. I wonder now: What shame are they sculpted from?”

– Leslie Jamison, “A Grand Unified Theory of Female Pain,” The Empathy Exams

For my final project in pcomp, I intend to explore the narratives we construct surrounding women, pain, and the erasure of the self.

My project was initially inspired by Charlotte Perkins Gilman’s short story “The Yellow Wallpaper.” Published in 1892, the story is written in the style of a diary kept by a woman who, failing to find joy in marriage and motherhood, is sent to live in a room alone in the country in an effort to “cure” her ineptitude. She wants to write, but her husband and her doctor forbid it. Confined to her bedroom, the protagonist watches the patterns on the faded yellow wallpaper come to life, eventually precipitating her descent into insanity.

For my final project, I will construct a wall covered in yellow, faded floral wallpaper that participants can touch and interact with. When a participant touches an individual flower on the wallpaper, an LED will light up and an audio recording will play. The audio recordings are stories that I will record from women describing their personal experiences with love and pain.

I am still refining the conceptual piece of this project, which is requiring me to talk to many of my friends about what kinds of stories they would find most interesting. For now, these are the questions I’m considering:

  • Tell me about a time when you most felt loved.
  • How long will you let yourself be sad about something? Do you think there is an appropriate timeline?
  • Tell me about a time when you felt the most known.
  • Do you have any wounds from experiences long ago that you still carry around with you?

For the physical interaction, I will use conductive thread and wires behind the wallpaper to connect the center of each flower to a SparkFun capacitive touch sensor, which I have already tested out.

When the flower is touched, the audio plays and the LED turns on. When it is not touched, the audio and LED are off.

Schematic for hooking up the touch sensor:


Here is a link to my Bill of Materials (BOM), which is still being updated.

Here is my timeline.

 

 


The aspirational version of my water harp.

The project I proposed last week was ambitious to say the least. In my project proposal, I stated that I wanted to build an entire interaction around the tactile experience of running one’s fingers through a stream of water.

In reality, there were a lot of obstacles I hadn’t anticipated, and I realized that the project I thought I’d be building required a longer time frame for testing ideas. I still love the concept, but I will need to keep iterating on the project before it moves forward.

Here’s what I built:

The (water) harp. from Rebecca Ricks on Vimeo.

That being said, I think I built something pretty cool even if it was only one piece of what I’d planned to build.

The initial plan.

After I nailed down the concept, I talked to Pedro about the different kinds of sensors that were available to me. We discussed several options: photosensors, lasers, and so on. Since I was really looking to build a series of simple switches, he suggested I keep things simple by using what is called an end switch. I decided that I wanted the water to fall on 10 switches. As the participant interacted with the water, it would trigger different sounds.


Step one: Bend the plexiglass into a shape that would create a waterfall wall of water.

I sketched out a few different ideas for the shape of the plexiglass. Ultimately I decided it would make the most sense to build a waterfall that would stand on its own and sit on a tabletop surface. Bending the plexiglass with the plastic heater was a laborious process, but I was able to get it into a shape that I liked.


Step two: Test out the waterfall with different configurations.

The initial plan was to set up a system whereby the water drips straight off the plastic into a container and is then pumped back up to the top, where it drips out of a pipe with holes drilled in it. I set up the components – piping, pump, acrylic – and started testing the water.


The result of my experimentation was extremely frustrating. It seemed like there were so many factors I had failed to consider when I’d decided to work with water. First of all, the water made a huge mess, which I hadn’t anticipated. More importantly, water has an affinity for plastic and acrylic, so I wasn’t getting the consistent, blanket-like waterfall shape I’d planned on working with.

It seemed like everyone on the floor had ideas about hydrodynamics and water pressure. I tested out different materials for making a lip for the acrylic, but nothing seemed to even out the stream.

Step three: Build the hardware components.

After three days of testing the waterfall, I decided to shift gears and begin building the actual switches and the circuits that would connect to the Arduino.

I laser cut some acrylic “keys” that would serve as an extension of the end switches, which the waterfall would be hitting. I also laser cut a board with ten holes to fit the switches. I soldered the switches to wires that led to the breadboard, which connected the 10 switches to digital pins 3-11.
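
The exact wiring isn’t documented here, so this is just a sketch of how the switch-reading code could work. It assumes each end switch sits between a digital pin and ground with the internal pull-up enabled, and the pin numbers are placeholders:

const int NUM_KEYS = 10;
const int keyPins[NUM_KEYS] = {3, 4, 5, 6, 7, 8, 9, 10, 11, 12};  // placeholder pins
int lastState[NUM_KEYS];

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_KEYS; i++) {
    pinMode(keyPins[i], INPUT_PULLUP);  // each switch pulls its pin LOW when pressed
    lastState[i] = HIGH;
  }
}

void loop() {
  for (int i = 0; i < NUM_KEYS; i++) {
    int state = digitalRead(keyPins[i]);
    if (state == LOW && lastState[i] == HIGH) {
      Serial.println(i);                // new press: tell p5.js which key was hit
    }
    lastState[i] = state;
  }
  delay(10);                            // crude debounce
}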


The wires were connected correctly, but I knew I would need to figure out a way to protect the hardware from getting wet. That would prove to be a really important issue if I ever got the waterfall to actually work.

I did like the feeling of pushing on the keys. You can push them in a wave pattern, mimicking the feeling of water falling on them. It felt sufficiently tactile, and since I was in a time crunch, I decided to adjust my concept slightly to account for the fact that I still hadn’t figured out the best way to make the water fall evenly.


Step four: Write the code and add the sounds in p5.js.

I tossed around a few different ideas for the types of sounds I wanted to play. I thought about playing funny noises, spooky noises, water noises, human voices, and various tones, but the piece of music I kept returning to was Richard Wagner’s Vorspiel (overture) from Das Rheingold, the first opera in his Ring Cycle.

The opening of the opera is a realization of emergence, of becoming as process. Wagner was obsessed with origin stories and stripping away stories to their mythic core. Unlike Beethoven’s chaos, Wagner’s music begins with a monotonous E flat, building into more and more complex figurations of the chord of E flat major, meant to mimic the motion of the Rhine, the river that runs through Germany. The piece lasts 136 bars and approximately four minutes.

There is something very watery about the piece of music. In his book Decoding Wagner, Thomas May writes: “The swirling textures of sound readily transmit the idea of water rushing and complement the music’s quickening into life.”

I chopped up the overture into 10 distinct “parts” that would correspond to the 10 keys. The result would be a layering of sounds as you run your hands over the keys.


 

Photograph by Eric Rose.

In keeping with the general tenor of my physical computation projects, I will continue to look at creative ways to provoke interactions with water.

I want a lot of my future projects to be an exploration of cymatics – a subset of modal vibrational phenomena in which a surface is vibrated and different patterns emerge in some kind of medium (paste, liquid, water, etc). Cymatics is essentially a process by which soundwaves are made visible. I like the idea of measuring a person’s heart rate and then visualizing that vibration pattern in a liquid, for instance.

The more I thought about this midterm project, though, the more I was struck by the delightful feeling of running one’s fingers through a steady stream of water. I want to build the entire interaction around that tactile experience.

So here’s my proposal: I plan to build a water harp. This is the initial sketch of the project:


The harp will consist of a rounded plexiglass board that water flows over, creating a waterfall effect. The water will hit a series of 8 sensors (moisture sensors, photosensors, or some other conductive material). A water pump will carry the water back up to the top.

Each sensor will be paired with a sound of a different frequency that will play from the computer using p5.js. I’m still trying to decide what kind of sound will be best suited to this project. It could be a series of different noises triggered by each sensor (such as rainfall, thunder, rivers, etc). I was also thinking a lot about using human voices singing at different pitches that would then harmonize with each other.

When the participant runs his/her hand through the waterfall, it will create gaps in the water, triggering different sensors. Overall, I want the experience to be as tactile and delightful as possible.


I’ve spent about six years living in Utah, where the climate is arid and drought is a constant concern. According to the U.S. Drought Monitor, about a quarter of the state continues to experience severe (D2) level drought. The region often doesn’t receive the rainfall it needs to keep its reservoirs at capacity.

In keeping with my interest in humans’ relationship with their physical environment and ecological processes, I decided that I wanted my next project to collect information about precipitation.

Using a rain gauge that my friend Joao Costa had lent me, I was able to measure the accumulation of rainfall over time.

A rain gauge is a self-emptying tipping bucket that collects and dispenses water. It allows you to display daily and accumulated rainfall as well as the rate of rainfall. The gauge essentially acts as a switch, making contact when a specified amount of water enters the bucket.

Rain collects at the top of the bucket, where a funnel channels the precipitation into a small seesaw-like container. After 0.011 inches (0.2794 mm) of rain has fallen, the lever tips and dumps the collected water. Each tip momentarily closes a switch, sending an electrical pulse back to the Arduino, where it can be counted on an interrupt input.
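
Counting those pulses is straightforward with an interrupt. The sketch below is only an illustration; it assumes the gauge’s switch is wired between digital pin 2 and ground, which may not match the actual hookup:

volatile unsigned int rainTipperCounter = 0;  // one count per bucket tip

const int rainPin = 2;  // assumed: gauge switch between pin 2 and ground

void countTip() {
  rainTipperCounter++;  // runs each time the bucket tips and the switch closes
}

void setup() {
  Serial.begin(9600);
  pinMode(rainPin, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(rainPin), countTip, FALLING);
}

void loop() {
  Serial.print("Bucket tips: ");
  Serial.println(rainTipperCounter);
  delay(1000);
}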


Once I decided that I wanted to measure the precipitation using the rain gauge, I did some research into the particular gauge I was using. According to its datasheet, the gauge connects through an adaptor, and only the two center conductors are actually used. I hooked up the rain bucket like so:


What’s really powerful about the rain gauge is that it can measure cumulative rainfall over a period of time. I decided that I wanted to connect an LCD screen to the Arduino as an output in order to display the amount of precipitation.

Connecting the LCD screen was very difficult because of the number of wires that needed to be connected to the screen (eight!). The LCD screen also required a potentiometer to control the contrast of the display, so I added that to the breadboard.


Once the setup was complete, I wrote the Arduino program that would display the precipitation information I wanted. I figured out how to set up the LCD screen to display text on a single line.

Next, I needed to figure out how to display the actual amount of precipitation that had fallen into the bucket. To do so, I created a variable “rainTipperCounter.” Every time the see-saw in the gauge filled with water and tipped, the count went up by one.


I knew that each time the count increased, 0.011 inches of rainfall had collected in the rain gauge. I programmed the LCD to display the rainTipperCounter, multiplied by 0.011, so that the actual amount of accumulated precipitation was displayed.
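
My original sketch isn’t reproduced here, but the display logic amounts to something like the following. The LCD pin assignments are placeholders; swap in whichever pins match the wiring:

#include <LiquidCrystal.h>

LiquidCrystal lcd(12, 11, 7, 6, 5, 4);  // placeholder pins: RS, EN, D4, D5, D6, D7

volatile unsigned int rainTipperCounter = 0;
const int rainPin = 2;                  // gauge switch, as in the sketch above
const float inchesPerTip = 0.011;       // rainfall per bucket tip

void countTip() {
  rainTipperCounter++;
}

void setup() {
  pinMode(rainPin, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(rainPin), countTip, FALLING);
  lcd.begin(16, 2);                     // 16x2 character display
  lcd.print("Rainfall (in):");          // first line stays fixed
}

void loop() {
  lcd.setCursor(0, 1);                               // second line
  lcd.print(rainTipperCounter * inchesPerTip, 3);    // accumulated inches, 3 decimals
  delay(1000);
}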



And just like that, I’d set up a simple rain gauge that tracked cumulative precipitation. It wasn’t raining outside today so I had to test the switch by pouring water into the gauge. Here’s how the final product turned out:

Measuring cumulative rainfall with a rain gauge from Rebecca Ricks on Vimeo.


“Art to me is our expression of being in love with (and fearing for) our world — our efforts to capture and predict the patterns, colours, movement we see around us,” says environmental strategist Dekila Chungyalpa.

Those words were written to coincide with a massive art installation piece Ice Watch by Icelandic artist Olafur Eliasson. For the piece, Eliasson obtained huge chunks of Arctic ice and installed them in front of Copenhagen’s city hall, where they slowly melted, a powerful reminder to the public of the reality of climate change.

I’ve been thinking a lot lately about ways in which we can sonify many of the natural geological processes that are simply not audible to human ears: the sound of water levels gradually dropping, the sound of tectonic plates sliding, the sounds of topography and mountain ranges, the sounds of glaciers melting, for instance. These objects have their own internal auditory patterns and acoustics.

Eliasson’s installation piece, along with Paul Kos’ “The Sound of Ice Melting,” inspired my analog assignment this week. I wanted to generate a sound from the process of ice chunks melting.

To do so, I first purchased a simple rain/moisture sensor that would function as the analog input in the circuit. The sensor is essentially a variable resistor: the amount of resistance varies based on the amount of water/moisture present.


The board that senses the presence of moisture.

Using the laser cutter (first time, yeah!) I cut a piece of acrylic to mount the sensor on. I had to melt hot glue onto some of the wires to make sure the water wouldn’t interfere with the electricity.

I connected the sensor to the Arduino board via analog pin #A0 and then connected the Piezo as a digital output from the digital pin #3.


After wiring up the board, I needed to test the sensor to see what range of signal values I would be dealing with, so I wrote a short sketch that printed those values to the serial monitor. They ranged from 0 to 1023.
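
That calibration step only takes a few lines. As in the full code below, this assumes the sensor’s analog output is wired to A0:

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensorValue = analogRead(A0);  // 0-1023; lower readings mean more moisture on this board
  Serial.println(sensorValue);       // watch the range in the serial monitor
  delay(100);
}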


Eventually, I knew that I wanted the sounds emitted by the dripping ice to create a sense of urgency or anxiety. To do so, I needed to change the code so that the tone sped up.

After determining the range of values available, I decided to write some additional code that would change the delay between tones based on how much water was present. When there was only a tiny droplet, the Piezo would buzz at a slow rhythm. As more water dripped onto the sensor, the rhythm would speed up.


Finally, I froze water in different sized chunks to create the ice. I put the chunks in fish netting and dangled them above the sensor, letting them melt at room temperature. Gradually the sounds sped up as the ice melted more quickly.


Here is the final product! 

The sound of melting ice. from Rebecca Ricks on Vimeo.

Full code below.

#define rainSensor A0   // moisture sensor on analog pin A0
#define buzzer 3        // Piezo on digital pin 3

void setup() {
  Serial.begin(9600);
  pinMode(buzzer, OUTPUT);  // analog pins don't need pinMode() for analogRead()
}

void loop() {
  int sensorValue = analogRead(rainSensor);  // 0 (wettest) to 1023 (driest)
  Serial.println(sensorValue);
  delay(100);

  // The wetter the sensor, the shorter the gap between beeps.
  if (sensorValue == 0) {
    tone(buzzer, 440);        // fully wet: continuous tone
  } else if (sensorValue < 300) {
    tone(buzzer, 440);
    delay(20);
    noTone(buzzer);
    delay(20);
  } else if (sensorValue < 600) {
    tone(buzzer, 440);
    delay(50);
    noTone(buzzer);
    delay(50);
  } else if (sensorValue < 900) {
    tone(buzzer, 440);
    delay(100);
    noTone(buzzer);
    delay(100);
  } else {                    // driest end of the range: slow, sparse beeps
    tone(buzzer, 440);
    delay(200);
    noTone(buzzer);
    delay(200);
  }
}

Overall, I had hoped to do a lot more with this project. For instance, I would have liked to change the setup so that the melting ice interacted with the sensor in a more interesting way (beyond just dangling above it).


This week, we learned how to build a simple circuit using the Arduino Uno device. We set up a circuit in which a push button functioned as the switch, causing an LED light to turn on and off. Simple enough.

For homework this week, we were challenged to design our own switch. There were so many possibilities that I spent a good amount of time thinking about what kind of human interaction I wanted to use to trigger the switch. There are endless ways we interact with our environment: poking, tapping, blowing, touching, blinking.

Ultimately I decided that it would be interesting if the interaction were watering a plant. The act of watering would complete the circuit, triggering an alarm to signal that the plant had received adequate water. In other words, I wanted to create a system where a plant starts yelling at you when you’ve overwatered it.


At first, I imagined setting up some type of pulley system in which the weight of the water pulled the pot down to the ground and connected a piece of metal on the bottom of the pot with a piece of metal on the ground.


I quickly realized that a much simpler solution was available. Since water is conductive (to some degree – it does have resistance), I decided that I would put the plant on a plate and let the water trickle out of the bottom of the pot. The ends of the wires would sit on the plate, and when the puddle formed, it would touch both of the wires and complete the circuit.


 

It took some trial and error, but I was able to complete a basic circuit using the plant-watering action. One obstacle was that I had to dissolve some salt into the water in order for the water to be conductive enough to let the current flow through the puddle (that’s probably a gardening 101 no-no). Another obstacle was that the wires kept oxidizing, which meant I had to keep snipping off the ends before each trial.

I jumped ahead in the lesson and learned how to set up a circuit that communicates with the Arduino code so that a digital input triggers a Piezo. I connected the wires so that the digitalRead() function would take its input from pin 7 and the output would come out of pin 3, which was connected to the Piezo.


The result was that the Piezo played some sounds. I wrote some commands that played different tones on the Piezo. I really wanted to play the song “Psycho Killer” by the Talking Heads, so I wrote some code that produced a series of tones that corresponded to the chorus of the song.

The tone() function takes the pin number and a frequency in hertz, which corresponds to a musical note. The delay() function creates the pauses between the notes.

Here’s what the full code looked like:

int sound;  // state of the switch: 1 when the water puddle closes the circuit

void setup() {
  pinMode(7, INPUT);   // switch input on pin 7
  pinMode(3, OUTPUT);  // Piezo on pin 3
}

void loop() {
  sound = digitalRead(7);
  if (sound == 1) {
    // A rough approximation of the "Psycho Killer" chorus.
    // Note: without a noTone() between them, consecutive tone() calls at the
    // same frequency run together into one long note.
    tone(3, 440);
    delay(500);
    tone(3, 440);
    delay(500);
    tone(3, 440);
    delay(500);
    tone(3, 440);
    delay(1000);
    tone(3, 413);
    delay(500);
    tone(3, 413);
    delay(500);
    tone(3, 413);
    delay(1000);

    tone(3, 390);
    delay(500);
    tone(3, 440);
    delay(500);
    tone(3, 390);
    delay(500);
    tone(3, 520);
    delay(1000);
    tone(3, 420);
    delay(100);
    tone(3, 390);
    delay(500);
    tone(3, 420);
    delay(500);
    tone(3, 390);
    delay(500);
    tone(3, 520);
    delay(1000);
    tone(3, 390);
    delay(500);
  } else {
    noTone(3);
  }
}

Honestly, I wasn’t entirely successful at playing the Talking Heads song because it never actually ran the way I imagined it would, but I did succeed in getting the Piezo to play a really funny sound when the circuit was completed.

Here is the final product:

Watering a plant with Arduino from Rebecca Ricks on Vimeo.

 

 

Robert Irwin at Pace Gallery. “Red Drawing, White Drawing, Black Painting.” 2009. Source.

How do we define physical interaction?

According to Chris Crawford, interaction is a cyclic process that requires two actors, human or otherwise, who alternately listen, think, and speak. In this way, physical interaction can be considered a kind of two-way conversation. Interactivity, he argues, is a deliberate behavior that developed in animals millions of years ago to help them better retain information about their environment. Crawford is quick to clarify that not every encounter is interactive, however. For instance, reading a book is participatory, not interactive, because it is a one-sided communication.

While this definition might seem fairly straightforward, many designers contend that it is too narrow. In his piece “A Brief Rant on the Future of Interaction Design,” Bret Victor suggests that many interactive designers are too quick to abandon the tactile in favor of the visual. Victor argues that our interactions with everyday objects count as interactive because those objects offer us physical feedback. Holding a cup of water, for example, is a tactile form of physical interaction that gives us more information than interacting with an app on an iPhone.

I tend to align myself more closely with Victor’s definition of interactivity – after all, we are constantly receiving physical feedback from our environment that engages all of our senses, not just our vision. I don’t think designers should be so quick to exclude these other kinds of encounters from the definition of physical interaction.

What is good physical interaction?

Regardless of how you choose to define interactivity, good physical interaction needs to be intuitive, engaging the various “languages” that humans already use to interact with the world, whether that be auditory, visual, or kinesthetic. A good interactive designer aims to minimize the memory load and the amount of mental effort humans need to put into the interaction.

In his book, Crawford argues that interactivity engages the mind more powerfully than any other form of expression. Research seems to back this assertion; studies suggest that interactive classroom environments help students better retain information.

What about non-interactive digital technology?

Most digital technology is to some degree interactive. However, I agree with Crawford that there are many experiences we have branded as “interactive” that aren’t really two-sided interactions. Reading a Kindle might require the user to swipe, but it’s otherwise pretty non-interactive. Watching a movie in a movie theater or on Netflix doesn’t require much interactivity.

In the future, we’ll witness the emergence of new forms of technology that turn what was once a non-interactive experience (for instance, watching a movie) into an interactive, participatory experience. Several films at Sundance last year explored this new form of storytelling, including “The Source (Evolving)” and “I Love Your Work.” I think we’ll continue to see artists deconstruct the interactive/non-interactive binary.