Measuring cumulative rainfall with a rain gauge.


I’ve spent about six years living in Utah, where the climate is arid and drought is a constant concern. According to the U.S. Drought Monitor, about a quarter of the state continues to experience severe (D2) level drought. The region often doesn’t receive the rainfall it needs to keep its reservoirs at capacity.

In keeping with my interest in humans’ relationship with their physical environment and ecological processes, I decided that I wanted my next project to collect information about precipitation.

Using a rain gauge that my friend Joao Costa had lent me, I was able to measure the accumulation of rainfall over time.

This rain gauge is a self-emptying tipping bucket that collects and dispenses water. It lets you track daily and accumulated rainfall as well as the rate of rainfall. Electrically, the gauge acts as a switch, making momentary contact each time a specified amount of water enters the bucket.

Rain collects at the top of the bucket, where a funnel channels the precipitation into a small seesaw-like container. After 0.011 inches (0.2794 mm) of rain has fallen, the lever tips and dumps the collected water, briefly closing the switch. That pulse is sent to the Arduino, which can count each tip using what’s called an interrupt input.
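In software terms, each tip is a momentary switch closure, so measuring rainfall is just counting rising edges of that signal. Here’s a minimal sketch of the idea in plain JavaScript (the function name and sample data are mine, purely for illustration; on the Arduino, an interrupt handler does this job):

```javascript
// Count bucket tips from a series of switch readings (1 = contact closed).
// Each 0 -> 1 transition is one tip of the seesaw.
function countTips(readings) {
  let tips = 0;
  let previous = 0;
  for (const reading of readings) {
    if (reading === 1 && previous === 0) {
      tips += 1; // rising edge: the lever just tipped
    }
    previous = reading;
  }
  return tips;
}
```

Counting edges rather than samples means a long switch closure still registers as a single tip.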


Once I decided that I wanted to measure precipitation using the rain gauge, I did some research into the particular gauge I was using. According to its datasheet, the rain gauge connects through an adaptor, and only the two center conductors carry the switch signal. I hooked up the rain bucket like so:

[Photo: the rain gauge wired to the Arduino]

What’s really powerful about the rain gauge is that it can measure cumulative rainfall over a period of time. I decided to connect the LCD screen to the Arduino as an output in order to display the amount of precipitation.

Connecting the LCD screen was very difficult because of the number of wires that needed to be connected to the screen (eight!). The LCD screen also required that I set up a potentiometer to control the brightness of the screen, so I added that to the breadboard.


Once the setup was complete, I wrote the Arduino program that would display the precipitation information I wanted. I figured out how to set up the LCD screen to display text on a single line.

Next, I needed to figure out how to display the actual amount of precipitation that had fallen into the bucket. To do so, I created a variable, “rainTipperCounter.” Every time the seesaw in the gauge filled with water and tipped, the count went up by one.


I knew that each time the count increased, 0.011 inches of rainfall had collected in the rain gauge. I programmed the LCD to display the rainTipperCounter, multiplied by 0.011, so that the actual amount of accumulated precipitation was displayed.
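The conversion itself is simple multiplication. A quick sketch of the math in JavaScript (the function names are mine, not from the actual Arduino program):

```javascript
const INCHES_PER_TIP = 0.011; // from the rain gauge datasheet

// Convert the tip count into accumulated rainfall, in inches.
function rainfallInches(rainTipperCounter) {
  return rainTipperCounter * INCHES_PER_TIP;
}

// The same accumulation in millimeters (0.011 in = 0.2794 mm).
function rainfallMillimeters(rainTipperCounter) {
  return rainTipperCounter * 0.2794;
}
```

So 100 tips of the seesaw corresponds to 1.1 inches of accumulated rain.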



And just like that, I’d set up a simple rain gauge that tracked cumulative precipitation. It wasn’t raining outside today, so I had to test the switch by pouring water into the gauge. Here’s how the final product turned out:

Measuring cumulative rainfall with a rain gauge from Rebecca Ricks on Vimeo.

Junk food heaven. Or, my first game in p5.


This week’s assignment for computational media was to build something that uses (1) a button or slider and (2) an algorithm for animation.

I immediately knew that however this project played out, I wanted it to involve both junk food and the Beach Boys.

Play the game here!

I was assigned to work with Jamie Ruddy. Jamie and I decided within the first 30 seconds of our conversation that we wanted to build a simple game using interactive elements and animation. Because both of us were drawn to the playful Japanese emojis that we use when we text, we designed a simple game in which the player moves a tongue that catches various pieces of food: hamburgers, apples, eggplant, etc. The player gains points for eating healthy food and loses points for eating junk food.

Here’s the initial sketch I drew:

[Initial sketch]

There were a lot of steps to build this game. First, we had to create a welcome screen that changed to the game screen once you clicked a button.


Then, we had to create an assets library of PNG images to load into the game. Each piece of food would drop into the screen where the x-coordinate was random and the y-coordinate started at 0 and increased at each frame.


Then came the most difficult step in the entire process. In order to animate the emojis and make them interact with one another, we had to create an object. This was a first for me, so I spent a lot of time reviewing the p5 reference library’s explanation of how to create an object.

Within the object, we created a constructor function, Food(), that displayed each item of food, made it drop at its own rate, and checked whether it had touched the tongue. It also checked whether the food had hit the bottom of the screen (y = height); if so, the food restarted at the top of the screen (y = 0).
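Stripped of the p5 drawing calls, the logic inside that object looks roughly like this (a simplified sketch with invented parameter names, not our exact code):

```javascript
// One falling piece of food, minus the p5 drawing calls.
// canvasWidth/canvasHeight stand in for p5's global width/height.
function Food(canvasWidth, canvasHeight, speed) {
  this.x = Math.floor(Math.random() * canvasWidth); // random x, like random(width)
  this.y = 0;                                       // start at the top of the screen
  this.speed = speed;                               // each food drops at its own rate

  // Called once per frame: fall, and restart at the top after hitting the bottom.
  this.update = function () {
    this.y += this.speed;
    if (this.y >= canvasHeight) { // y = height: start over at y = 0
      this.y = 0;
      this.x = Math.floor(Math.random() * canvasWidth);
    }
  };

  // Has the food reached the tongue? (a simple overlap test)
  this.touches = function (tongueX, tongueY, tongueWidth) {
    return Math.abs(this.x - tongueX) < tongueWidth / 2 && this.y >= tongueY;
  };
}
```

In the real game, draw() walks over every food object each frame, calling update() and drawing the corresponding PNG at (x, y).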

From there, we needed to make sure the score increased or decreased based on which object the tongue touched. For instance, the apple would add 10 points and the hamburger would subtract 10. Get to 50 points and you win.


And for a bonus surprise: Eat the poop emoji and you just straight up lose the entire game (because, gross).
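Taken together, the scoring rules amount to a small state update. Here’s a sketch of that logic (the function and return-value names are mine; the point values, the 50-point win condition, and the poop rule come from the game):

```javascript
// Apply one caught item to the score and report the resulting game state.
function applyCatch(score, item) {
  if (item === 'poop') {
    return { score: score, state: 'lose' }; // instant game over (because, gross)
  }
  const points = { apple: 10, hamburger: -10 }; // healthy food up, junk food down
  const newScore = score + (points[item] || 0);
  return { score: newScore, state: newScore >= 50 ? 'win' : 'playing' };
}
```

Keeping the rules in one pure function like this makes them easy to test apart from the animation loop.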


To make the game more fun, I added the Beach Boys’ “Kokomo” to the sketch. It was simple to load the mp3 by using the preload() function.


Success! There’s a lot more I’d like to do with this game. For instance, I’d like there to be a location-based animation whenever the food hits the tongue (right now the screen flashes white).

Here’s a video of how the game works:


Homelessness in Greenpoint: Initial storyboard.

The homeless population has always had a persistent presence in New York, but in the past several years homelessness in the city has skyrocketed. According to recent reports, the number of homeless people in city shelters has risen from 53,000 to 60,000 during Mayor Bill de Blasio’s term.

According to de Blasio, people are increasingly losing their permanent housing due to a perfect storm of plateauing incomes and a hot real estate market reaching into new neighborhoods, like Bushwick or Greenpoint, where landlords once accepted homeless families and their subsidies. Simply put, many families can’t keep up and end up in the shelter system.

With that in mind, our group is interested in examining how the homelessness crisis is playing out in one particular street corner in Greenpoint, Brooklyn: the intersection of Milton and Manhattan.

There are two churches on the street, one of which sponsors a daily food pantry for the neighborhood’s homeless population. Alongside the lines of homeless people, however, are the vestiges of gentrification: designer coffee shops, a CVS, boutiques, and a bank. This is just one of the battlegrounds where New Yorkers are increasingly losing their housing and entering the shelter system.

Here is the initial storyboard for the 3-5 minute film we will make:

[Storyboard sketches]

Sonifying ice melting.


“Art to me is our expression of being in love with (and fearing for) our world — our efforts to capture and predict the patterns, colours, movement we see around us,” says environmental strategist Dekila Chungyalpa.

Those words were written to coincide with Ice Watch, a massive art installation by Icelandic artist Olafur Eliasson. For the piece, Eliasson obtained huge chunks of Arctic ice and installed them in front of Copenhagen’s city hall, where they slowly melted, a powerful reminder to the public of the reality of climate change.

I’ve been thinking a lot lately about ways in which we can sonify many of the natural geological processes that are simply not audible to human ears: the sound of water levels gradually dropping, the sound of tectonic plates sliding, the sounds of topography and mountain ranges, the sounds of glaciers melting, for instance. These objects have their own internal auditory patterns and acoustics.

Eliasson’s installation piece, along with Paul Kos’s “The Sound of Ice Melting,” inspired my analog assignment this week. I wanted to generate a sound from the process of ice chunks melting.

To do so, I first purchased a simple rain/moisture sensor that would function as the analog input in the circuit. The sensor is essentially a variable resistor: the amount of resistance varies based on the amount of water/moisture present.


The board that senses the presence of moisture.

Using the laser cutter (first time, yeah!) I cut a piece of acrylic to mount the sensor on. I melted hot glue onto some of the wires to make sure the water wouldn’t interfere with the electricity.

I connected the sensor to the Arduino board via analog pin #A0 and then connected the Piezo as a digital output on digital pin #3.


After wiring up the board, I needed to test the sensor to see what range of signal values I would be dealing with. A short sketch printed those values to the serial monitor; they ranged from 0 to 1023.


I knew that I wanted the sounds emitted by the dripping ice to create a sense of urgency or anxiety. To do so, I needed to change the code so that the tone sped up as more water collected.

After determining the range of values available, I decided to write some additional code that would change the delay between tones based on how much water was present. When there was only a tiny droplet, the Piezo would buzz at a slow rhythm. As more water dripped onto the sensor, the rhythm would speed up.
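That mapping from sensor reading to rhythm boils down to a single function. Here’s the idea sketched in JavaScript (the thresholds are roughly those in my Arduino code; the function name is mine):

```javascript
// Map a moisture reading (0 = soaked, 1023 = dry) to the pause between buzzes, in ms.
// A delay of 0 means the tone plays continuously.
function buzzDelay(sensorValue) {
  if (sensorValue === 0) return 0;  // soaked: continuous tone
  if (sensorValue < 300) return 20; // lots of water: frantic rhythm
  if (sensorValue < 600) return 50;
  if (sensorValue < 900) return 100;
  return 200;                       // nearly dry: slow, calm rhythm
}
```

Chaining the comparisons this way (each branch only checks the upper bound) guarantees every possible reading falls into exactly one bucket.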


Finally, I froze water in different sized chunks to create the ice. I put the chunks in fish netting and dangled them above the sensor, letting them melt at room temperature. Gradually the sounds sped up as the ice melted more quickly.


Here is the final product! 

The sound of melting ice. from Rebecca Ricks on Vimeo.

Full code below.

#define rainSensor A0
#define buzzer 3

void setup() { // analog input from rainSensor, digital output to buzzer
  Serial.begin(9600);
  pinMode(buzzer, OUTPUT); // analog pins don't need a pinMode() call
}

void loop() {
  int sensorValue = analogRead(rainSensor); // read the analog input from pin A0 (0 to 1023)
  Serial.println(sensorValue);
  delay(100);

  // Map the reading to the rhythm of the buzzer: the wetter the sensor, the faster the beat.
  if (sensorValue == 0) {
    tone(buzzer, 440); // soaked: continuous tone
  } else if (sensorValue < 300) {
    tone(buzzer, 440);
    delay(20);
    noTone(buzzer);
    delay(20);
  } else if (sensorValue < 600) {
    tone(buzzer, 440);
    delay(50);
    noTone(buzzer);
    delay(50);
  } else if (sensorValue < 900) {
    tone(buzzer, 440);
    delay(100);
    noTone(buzzer);
    delay(100);
  } else {
    tone(buzzer, 440);
    delay(200);
    noTone(buzzer);
    delay(200);
  }
}

Overall, I had hoped to do a lot more with this project. For instance, I would have liked to change the setup so that the melting ice interacted with the sensor in a more interesting way (beyond just dangling above it).

Learning to animate in p5.


Last week we drew static drawings from primitive shapes. This week we were assigned the task of animating our sketches in p5 using a set of new functions, including frameRate(), rotate(), and translate(), or new variables like mouseX and mouseY.

To complete the assignment, our sketch needed to contain: (1) One element that changed over time independent of the mouse; (2) One element controlled by the mouse; and (3) One element that’s different every time you run the sketch.

I’ve been a cyclist for a few years now. In Salt Lake City, where I lived for the past two years, I loved biking up into the canyon in the late afternoon before sunset. That experience was the inspiration behind my sketch this week, in which a bike rides into the mountains.

See my final sketch here.

Here was my initial sketch:

I started planning out the sketch in Adobe Illustrator to get a better sense of the composition of the drawing.


My plan was to make the sun and its rays rotate in a circle independent of the mouse. That actually proved a lot more difficult than I’d anticipated. The action required me to use a new set of functions, including push(), translate(), rotate(), and pop(). Here are the lines of code I wrote that drew the sun:

push();
translate(350,100);
rotate(rads);

for (var d = 0; d < 10; d++) {
  noStroke();
  fill(206+d,188+d,122+d);
  ellipse(0,0,175,175);

  stroke(206+d,188+d,122+d);
  strokeWeight(3);

  for (var i = 0; i < 36; i++) {
    line(0,0,x,y);
    rotate(PI/20);
  }
}
pop();

rads = rads + 1.57;

I started layering on primitive shapes for the mountains and bicycle using the beginShape() function we learned in class. I decided to make the bicycle a different color every time you load the page by declaring var r, var g, and var b and then assigning them in setup():

r = 230;
g = random(100,200);
b = random(100,200);

Then the most difficult task lay ahead: Getting the bicycle to move at a variable speed controlled by the mouse. I tried a lot of different things, including making an object out of the bicycle using some new tricks we learned in Javascript, but ultimately this is the code that really worked out:

var speedX = 1;

var speedY = 1;

//bicycle

stroke(r,g,b,255);
strokeWeight(5);
noFill();

ellipse(225+speedX,520-speedY,100,100);
ellipse(400+speedX,490-speedY,100,100); 
ellipse(225+speedX,520-speedY,10,10); 
ellipse(400+speedX,490-speedY,10,10); 
quad(225+speedX,520-speedY,305+speedX,505-speedY,350+speedX,435-speedY,265+speedX,450-speedY); 
line(260+speedX,445-speedY,305+speedX,505-speedY); 
line(400+speedX,490-speedY,352+speedX,435-speedY); 
ellipse(305+speedX,505-speedY,30,30);
quad(250+speedX,450-speedY,245+speedX,440-speedY,275+speedX,440-speedY,275+speedX,445-speedY);

speedX+=map(mouseX,0,width,1,5); 
speedY+=0.15;

What’s happening above is that I’ve created two new variables, speedX and speedY, that determine the velocity of the bike in relation to my mouse. The last two lines update them each frame: map() converts mouseX into an increment between 1 and 5 pixels, so when mouseX is high (i.e. it’s on the far right of the sketch), the bike moves more quickly.
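Under the hood, p5’s map() is just linear interpolation. Conceptually it’s equivalent to something like this (a simplified reimplementation, ignoring the optional bounds clamping p5 offers):

```javascript
// Re-map a value from one range to another, like p5's map().
function mapRange(value, start1, stop1, start2, stop2) {
  // How far along the input range is the value (0 to 1)? Scale that into the output range.
  return start2 + (stop2 - start2) * (value - start1) / (stop1 - start1);
}
```

With a 1100-pixel-wide canvas, a mouse in the middle (mouseX = 550) advances the bike by 3 pixels per frame.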

Here’s a video of the animation.

Full code below.

var fr = 30;
var x = 1200;
var y = 0;
var rads = 0; // declare variable rads, angle at which sun will rotate
var speedX = 1; //declare variable speedX
var speedY = 1; //declare variable speedY
var r;
var g;
var b;
function setup() {
createCanvas(1100,600);
background(252,240,232);
r = 230;
g = random(100,200);
b = random(100,200);

}

function draw() {

//draw sun and rays and make them rotate.
push();
translate(350,100);
rotate (rads);

for (var d = 0; d < 10; d ++){
noStroke();
fill(206+d,188+d,122+d);
ellipse(0,0,175,175);

stroke(206+d,188+d,122+d);
strokeWeight(3);

for (var i = 0; i < 36; i ++) {
line(0,0,x,y);
rotate(PI/20);

}
}
pop ();

rads = rads + 1.57;

//mountains

//mountains layer one
stroke(136,167,173);
fill(136,167,173);
beginShape();
vertex(0,600);
vertex(0,400);
vertex(200,300);
vertex(300,350);
vertex(400,250);
vertex(500,325);
vertex(600,100);
vertex(750,200);
vertex(875,60);
vertex(1000,150);
vertex(1100,100);
vertex(1100,600);
endShape();

//mountains layer two
stroke(92,109,120,100);
fill(92,109,120,100);
beginShape();
vertex(0,600);
vertex(0,400);
vertex(275,375);
vertex(350,400);
vertex(425,375);
vertex(575,375);
vertex(800,200);
vertex(900,300);
vertex(1100,250);
vertex(1100,600);
endShape();

//mountains layer three
stroke(92,109,112,200);
fill(92,109,112,200);
beginShape();
vertex(0,600);
vertex(0,550);
vertex(500,400);
vertex(575,425);
vertex(600,400);
vertex(800,400);
vertex(875,300);
vertex(925,375);
vertex(1100,300);
vertex(1100,600);
endShape();

//mountains layer four
stroke(213,207,225,25);
fill(213,207,225,25);
triangle(0,600,1100,425,1100,600);

//bicycle

stroke(r,g,b,255);
strokeWeight(5);
noFill();

ellipse(225+speedX,520-speedY,100,100); //bike wheel
ellipse(400+speedX,490-speedY,100,100); //bike wheel
ellipse(225+speedX,520-speedY,10,10); //inner bike wheel
ellipse(400+speedX,490-speedY,10,10); //inner bike wheel
quad(225+speedX,520-speedY,305+speedX,505-speedY,350+speedX,435-speedY,265+speedX,450-speedY); //frame
line(260+speedX,445-speedY,305+speedX,505-speedY); //frame
line(400+speedX,490-speedY,352+speedX,435-speedY); //frame
ellipse(305+speedX,505-speedY,30,30); //frame
quad(250+speedX,450-speedY,245+speedX,440-speedY,275+speedX,440-speedY,275+speedX,445-speedY);

speedX+=map(mouseX,0,width,1,5); //x coordinate of the mouse determine speed of the bike
speedY+=0.15;
}


The plant that tells you when you’re overwatering it.


This week, we learned how to build a simple circuit using the Arduino Uno device. We set up a circuit in which a push button functioned as the switch, causing an LED light to turn on and off. Simple enough.

For homework this week, we were challenged to design our own switch. There were so many possibilities that I spent a good amount of time thinking about what kind of human interaction I wanted to initiate the switch. There are endless ways that we interact with our environment; for instance, poking, tapping, blowing, touching, and blinking.

Ultimately I decided that it would be interesting if the interaction were watering a plant. The action would complete the circuit, producing an outcome that would act as an alarm that the plant had received adequate water. In other words, I wanted to create a system where a plant starts yelling at you when you’ve overwatered it.


At first, I imagined setting up some type of pulley system in which the weight of the water pulled the pot down to the ground and connected a piece of metal on the bottom of the pot with a piece of metal on the ground.


I quickly realized that a much simpler solution was available. Since water is conductive (to some degree – it does have resistance), I decided that I would put the plant on a plate and let the water trickle out of the bottom of the plant. The ends of the wires would sit on the plate and when the puddle formed, it would touch both of the wires and complete the circuit.



It took some trial and error, but I was able to complete a basic circuit using the plant watering action. One obstacle was that I had to dissolve some salt into the water in order for the water to be conductive enough to allow the electrical current to flow through the puddle (that’s probably a gardening 101 no-no). Another obstacle was that the wires kept oxidizing, which meant I had to keep snipping off the tops for each trial.

I jumped ahead in the lesson and learned how to set up a circuit that communicated with the Arduino software, sending a digital signal to a Piezo. To do so, I connected the wires so that the digitalRead() function would get its input from pin #7 and the output would come out of pin #3, which was connected to the Piezo.


The result was that the Piezo played sounds. I really wanted to play the song “Psycho Killer” by the Talking Heads, so I wrote some code that produced a series of tones corresponding to the chorus of the song.

The tone() function takes the pin # and a frequency in hertz, which corresponds to a musical note. The delay() function is what creates the pauses between the notes.

Here’s what the full code looked like:

int sound;

void setup() {
  pinMode(7,INPUT);
  pinMode(3,OUTPUT);
}

void loop() {
  sound = digitalRead(7);
  if (sound==1) {
    tone(3,440);
    delay(500);
    tone(3,440);
    delay(500);
    tone(3,440);
    delay(500);
    tone(3,440);
    delay(1000);
    tone(3,413);
    delay(500);
    tone(3,413);
    delay(500);
    tone(3,413);
    delay(1000);

    tone(3,390);
    delay(500);
    tone(3,440);
    delay(500);
    tone(3,390);
    delay(500);
    tone(3,520);
    delay(1000);
    tone(3,420);
    delay(100);
    tone(3,390);
    delay(500);
    tone(3,420);
    delay(500);
    tone(3,390);
    delay(500);
    tone(3,520);
    delay(1000);
    tone(3,390);
    delay(500);

  } else {
    noTone(3);
  }
}

Honestly, I wasn’t entirely successful at playing the Talking Heads song because it never actually ran the way I imagined it would, but I did succeed in getting the Piezo to play a really funny sound when the circuit was completed.

Here is the final product:

Watering a plant with Arduino from Rebecca Ricks on Vimeo.


Sunday Mass: An exploration of religious ecotones in Bed-Stuy.

The interior of a Baptist church on Fulton Avenue.

I moved to Bed-Stuy, Brooklyn three weeks ago, knowing very little about the religious landscape of my neighborhood. On my first night, I realized that there is a synagogue on my block in which the congregation is composed entirely of Ethiopian Jews. Three blocks away is an unusual Egyptian temple, where a small sect of black Muslims who call themselves the Nuwaubian Nation (a group that developed concurrently with the Nation of Islam) continue to worship. My second night, I heard the call to prayer sounding while I was at dinner a few blocks away. Religion is part of the culture and it’s what holds the neighborhood together.

With that in mind, I was interested in exploring how the different religious communities in Bed-Stuy meet, worship, interact, sing, and pray. I decided to call these liminal spaces religious ecotones. In ecology, an ecotone is a transition zone between two biomes. In other words, it is the space in which two communities meet and integrate.

In our audio recording, Katie and I sought to capture what worship sounds like across religions in Bed-Stuy. Each congregation has its own form of “mass” – a meeting in which the members pray and listen to their religious leaders speak. We captured some sounds from a Catholic church, a mosque, a Baptist church, and a Jewish friend during their different periods of worship.

The outcome was a patched-together narrative of what worship sounds like across the neighborhood.

Interestingly, I found that this project proved to be a good complement to this week’s reading, the short sci-fi piece “The Machine Stops” by E.M. Forster. The story explores a post-apocalyptic world in which humans are dependent on technology for all their needs, including social interaction. People in this world have a huge network of friends, many of whom they haven’t met. The story effectively predicts the rise of online culture and social media, as our generation increasingly flees traditional centers of community (such as religious spaces) and creates new, different communities online.

Fatima in the women’s section of a mosque on Fulton Avenue.

Churches, mosques, synagogues, and meditation classes are spaces in which individuals have in-person social interactions. They are also gathering places for religious communities in which intergenerational conversations can take place. I think our society needs both types of meeting spaces – those that occur online and those that are located within our communities.

Redesign of a sign.

For this week’s assignment, we were told to keep an eye out for examples of good and bad design on signage around the city. I began looking for confusing or wordy street signs, but I quickly realized that New York City’s Department of Transportation has been remarkably comprehensive with its redesign of all the street signs. There was a lot of uniformity, especially in NYC’s subway system. Here’s an example of an ad from the MTA that educates passengers about appropriate behavior on the subway:

I thought the MTA ad was visually eye-catching, clever, and humorous. It communicated a lot of information in a few words. The left alignment of the text makes the sign more readable. The font choice – Helvetica – is consistent with other signage in the subway system and also makes the sign readable.

As I began looking for examples of less effective signage, I turned my attention to signs advertising goods or services and I found some really interesting examples of bad design.

A truck advertising plumbing services. Although the logo isn’t terrible, the hierarchy of information on this truck makes your eye move around before you can find the important information. I don’t even know what I’m looking for.

An ad on the subway for educational services. The font, the spacing, and the choice of photographs and clip art add up to a really ineffective advertisement. The placement of the “E” and the “L” in “Electrical” makes little sense.

A corner bodega. The sign isn’t terrible; however, I think the hierarchy of information could be improved.

I decided to re-design the last sign, which advertises a grocery store that is open 24 hours a day. The store name, Havemeyer Grocery, should be most prominent, followed by the store hours (around the clock). The store offerings aren’t as important to include on the sign. Here’s the revised sign I designed:

[Redesigned Havemeyer Grocery sign]

Portrait of a classmate with p5.js.

This week’s project was an exercise in patience because, to paraphrase the poet Mary Ruefle, I’m just a handmaiden with a broken urn when it comes to writing code. I felt like I learned a lot, though, and I’m slowly developing a process for these assignments.

In our first class, we were introduced to the basic functions in p5.js, function setup() and function draw(). Our assignment this week was to draw a portrait of one of our classmates using only primitive shapes in p5.js (translation: no complex shapes, no animation, no interactive elements).

Here’s a link to the final portrait.

My process was simple: (1) Sketch out the drawing in Illustrator; (2) Create a quick outline in p5.js; and (3) Code like hell.

I found it helpful before even writing a line of code to use Illustrator to make a sketch so I had a better sense of what kinds of shapes I’d be drawing. In terms of color, I used the site Coolors to generate the RGB values that matched the color scheme I envisioned.

Here’s my initial sketch and the outline of shapes I planned to use:

[Initial sketch and shape outline]

From there, I started coding.

Overall, I wanted the portrait to be symmetrical so I was doing a lot of math in my head to calculate some of the x and y coordinates for each shape that I drew. Just about all the shapes I drew were various combinations of quad(), rect(), ellipse(), triangle(), and line(). There was a lot of trial and error, since I was pretty much guessing at where different points would be on the canvas.
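Most of that mental math was mirroring points across the vertical centerline of the face (x = 250 on my canvas). A tiny helper function could have done it for me (hypothetical; I did it by hand):

```javascript
// Reflect an x coordinate across a vertical center line.
function mirrorX(x, centerX) {
  return 2 * centerX - x;
}
```

For example, the left-ear ellipse sits at x = 175, and 2 × 250 − 175 = 325, which is exactly where the right-ear ellipse is drawn.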

That proved to be a massive headache since I was constantly trying to remember which line of code corresponded to which shape in the drawing and which coordinate corresponded to which point on the shape. It’s no big deal when you have three shapes, but when you are drawing 30+ shapes it can get messy.

I found it helpful to organize my lines of code with some subheadings: //face, //hair, //shirt, etc. I think that in the future I will create a much more detailed system for labeling shapes in my code.

I quickly realized that the portrait I’d sketched out at the beginning of the project was going to be an ambitious undertaking with the limited tools that were available to me in p5.js (since we could only use functions that drew primitive shapes). It took me longer than I’d anticipated, but I found that I was able to add a lot of detail to the portrait despite these limitations.

Here’s a screenshot of the final portrait:

[Screenshot of the final portrait]

Overall, the project was an excellent exercise in learning the basics of p5. Here’s my full code:

function setup() {
createCanvas(1000,1000);
background(132,220,207);
}

function draw() {

//shirt
stroke(33,131,128);
fill(33,131,128);
quad(50,450,80,350,420,350,450,450);
quad(80,350,190,300,310,300,420,350);
stroke(130,150,133);
fill(130,150,133);
triangle(190,300,290,400,220,395);
triangle(310,300,290,400,330,350);
stroke(60,132,131);
fill(60,132,131);
ellipse(232,382,10,10);
stroke(97,61,193);
fill(97,61,193);
quad(120,330,160,312,190,450,155,450);
quad(400,337,350,317,375,450,400,450);

//face
stroke(230,202,171);
fill(230,202,171);
triangle(190,300,310,300,290,400);
quad(190,300,200,270,300,270,310,300);
quad(180,230,320,230,280,300,220,300);
rect(180,160,140,70);
quad(180,160,320,160,300,90,200,90);

//hair
stroke(161,131,87);
fill(161,131,87);
quad(180,230,320,230,280,300,220,300);
triangle(180,230,180,190,199,230);
triangle(320,230,320,190,301,230);
quad(170,185,180,185,200,130,177,115);
quad(200,130,177,115,210,65,267,55);
quad(200,130,267,55,290,90,270,90);
quad(267,55,270,90,300,110,320,100);
triangle(270,90,250,110,260,90);
quad(300,110,320,100,330,140,310,150);
quad(310,150,330,140,320,185,315,185);
triangle(300,105,299,140,315,170);
triangle(200,130,196,155,180,175);
rect(207,149,23,4);
triangle(207,149,207,153,199,160);
rect(271,149,23,4);
triangle(294,149,294,153,303,160);

//face #2
stroke(230,202,171);
fill(230,202,171);
ellipse(175,185,26,50);
ellipse(325,185,26,50);
triangle(200,229,240,229,220,250);
triangle(300,229,260,229,280,250);
ellipse(250,260,33,11);

//facial features.
stroke(133,138,227,100);
strokeWeight(1.2);
fill(222,255,240);
quad(226,168,210,168,203,177,233,177);
quad(274,168,290,168,297,177,267,177);
stroke(86,88,87,190);
fill(86,88,87,190);
ellipse(223,172,8,7);
ellipse(287,172,8,7);
stroke(222,242,200);
fill(222,242,200);
ellipse(225,170,3,2);
ellipse(289,170,3,2);
stroke(133,138,227,0);
fill(133,138,227,110);
quad(203,177,233,177,226,181,210,181);
quad(297,177,267,177,274,181,290,181);
quad(243,230,259,230,267,223,235,223);
quad(267,223,235,223,242,218,260,218);
stroke(133,138,227,100);
strokeWeight(2.2);
line(258,217,258,163);
line(228,181,210,200);
line(272,181,290,200);
line(290,310,280,370);
line(230,313,260,350);

//hands
stroke(230,202,171);
fill(230,202,171);
rect(136,410,68,13);
rect(138,425,70,13);
rect(136,437,70,13);

//hair #2
stroke(161,131,87);
fill(161,131,87);
triangle(186,185,189,163,175,160);
triangle(314,185,311,163,325,160);
triangle(175,140,175,123,165,145);
triangle(210,67,190,79,210,75);
triangle(330,138,338,143,325,120);

//glasses
stroke(71,44,27,230);
strokeWeight(2.7);
noFill();
ellipse(217,175,44,24);
ellipse(284,175,44,24);
line(239,175,262,175);
line(195,175,182,170);
line(306,175,318,170);

}