This assignment is an opportunity to practice with sounds and visualizations.
You can start by cloning our multi-sketch template into a repository called HW10.
Since the computer is going to do all the work of filling our screen, please use: createCanvas(windowWidth, windowHeight).
As with previous assignments, please enable GitHub Pages on your repository and use Brightspace to submit the GitHub link to your HW10 repository.
This assignment will be graded on a scale from 0 to 10, taking the following criteria into account:
README.md file with information about the process.

In this exercise we will use the p5 sound library to create a LIVE visualization for a song.
Pick a song, any song. Feel free to create a song or use something that already exists. It should be about 1 minute long. If you use something longer, please edit it down to one minute.
Include the p5 sound library in the project, following the instructions on the sound tutorial or the code from class.
You can use this URL for the sound library:
https://cdn.jsdelivr.net/npm/p5@1.7.0/lib/addons/p5.sound.js
Load the song into your sketch, set it up so it plays automatically or on a mouse click, and then use p5.Amplitude and p5.FFT objects to get volume and frequency information from the audio being played.
Once you have volume and frequency information from the live samples being played, use them to create patterns, drawings, movements, anything, on the canvas.
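Here is a minimal sketch of how the pieces could fit together. The file name song.mp3 is a placeholder, and mapping volume to a circle's size and bass energy to its color is just one arbitrary choice; your visualization should do something much more personal.

```javascript
// A minimal sketch, assuming a local audio file named "song.mp3" (placeholder name)
// and that the p5.sound addon is loaded after p5.js in index.html.
let song;
let amplitude;
let fft;

function preload() {
  song = loadSound('song.mp3'); // load the audio before setup() runs
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  amplitude = new p5.Amplitude(); // reports overall volume between 0.0 and 1.0
  fft = new p5.FFT();             // reports energy across frequency bands
}

function mousePressed() {
  // browsers require a user gesture before audio can start, so toggle playback on click
  if (song.isPlaying()) {
    song.pause();
  } else {
    song.play();
  }
}

function draw() {
  background(0);

  const level = amplitude.getLevel(); // current volume, 0.0 to 1.0
  fft.analyze();                      // must run before getEnergy()
  const bass = fft.getEnergy('bass'); // energy in the bass band, 0 to 255

  // one arbitrary mapping: volume drives size, bass drives color
  noStroke();
  fill(map(bass, 0, 255, 50, 255), 100, 200);
  circle(width / 2, height / 2, map(level, 0, 1, 10, height));
}
```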
Since we want this to be dope, make the canvas as big as the browser window:
createCanvas(windowWidth, windowHeight);
You don’t have to visualize 1000 frequencies, but your visualization should include at least one frequency and the volume.
Add a DOM element to change one aspect of your visualization.
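A minimal sketch of one way to do this, assuming the DOM element is a slider and that it scales the size of whatever you are already drawing (both are arbitrary choices):

```javascript
let sizeSlider;

function setup() {
  createCanvas(windowWidth, windowHeight);
  // createSlider(min, max, default, step) returns a DOM element you can read every frame
  sizeSlider = createSlider(0.5, 3, 1, 0.1);
  sizeSlider.position(20, 20);
}

function draw() {
  background(0);
  const scaleFactor = sizeSlider.value(); // current slider value
  // multiply your amplitude-driven sizes (or any other parameter) by scaleFactor here
}
```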
Make it personal. Make it interesting. Imagine how this could be used as visuals for a performance.
Include in your README information about the song that you picked, and how the song’s parameters are contributing to the visualization.
This exercise is similar to, but different from, the previous one.
Like above, we will use the p5 sound library to visualize a song, but this time, instead of visualizing the first minute of a song as it plays, we will create a static representation of the whole song.
Pick a song, any song. It can be the same song as in Exercise A, as long as it’s around 3 minutes long.
Load the song into your sketch using the loadSound() function, as above, but this time use getPeaks() to get all of the song’s samples at once, without playing the song.
The getPeaks() function will return information about the entire song, but the length of the array returned should be 5 times the width of the canvas. Use ALL of these samples to create a static representation of the song.
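A minimal sketch of the getPeaks() approach, again assuming a placeholder file name and a plain waveform rendering that you would replace with something more interesting:

```javascript
let song;

function preload() {
  song = loadSound('song.mp3'); // placeholder file name; the song never needs to play
}

function setup() {
  createCanvas(windowWidth, windowHeight);

  // by default getPeaks() returns roughly 5 * width samples, each between -1 and 1
  const peaks = song.getPeaks();

  background(255);
  stroke(0, 40);
  for (let i = 0; i < peaks.length; i++) {
    // spread every sample across the canvas width, so about 5 samples land on each column
    const x = map(i, 0, peaks.length, 0, width);
    const len = map(abs(peaks[i]), 0, 1, 0, height / 2);
    line(x, height / 2 - len, x, height / 2 + len);
  }

  noLoop(); // the image is static, so there is nothing to animate in draw()
}
```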
Since there are lots of samples to visualize, make the canvas as big as the browser window:
createCanvas(windowWidth, windowHeight);
A simple line graph won’t be enough. Make it personal. Make it interesting. You can even combine the sample information with text and/or images and other drawings to make a really nice poster or album cover.
Include in your README information about the song that you picked, and how the song’s samples are contributing to the visualization.
How physical is computing?
Keep this question in mind while you read, and write a 200-word response to the following:
As always, your response should be personal, meaning that you should be expressing your views and opinions about the text and not just summarizing it. You can use the following rubric to guide your response:
Please submit your response via Brightspace.
Alternatively, you may create a reading.md file in your HW10 repo and write your response in markdown. Just make sure to submit a link to the file using Brightspace.