Understanding Networks

Clients and RESTful API Servers for VJ Machine and Mood Jukebox

For the VJ machine, Ellen and Ridwan built the API server and the screen output. Their code is on Ellen’s repo. Alden and I built the controller interface. Basically, the controller sends POST requests to the server, which stores the state of the VJ machine, and the screen output constantly polls with GET requests to update the visuals. Our code for the VJ machine is here.
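The two clients' request patterns can be sketched roughly like this. The server URL, poll interval, and helper names here are assumptions for illustration; the endpoint paths follow our API spec further down.

```javascript
// Rough sketch of the two VJ clients. SERVER_URL and the 500 ms poll
// interval are illustrative assumptions, not the real deployment.
const SERVER_URL = "http://localhost:8080";
const stateUrl = () => `${SERVER_URL}/state/`;

// Controller client: POST a state change, e.g. switching visualization mode.
function setMode(mode) {
  return fetch(`${SERVER_URL}/mode/${mode}`, { method: "POST" });
}

// Screen-output client: poll the full state on an interval and hand
// each snapshot to a draw callback.
function pollState(onState, intervalMs = 500) {
  return setInterval(async () => {
    const res = await fetch(stateUrl());
    onState(await res.json());
  }, intervalMs);
}
```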

vj interface

We also worked with Vidia and Lucas on their Mood Jukebox. Their API is available on Vidia’s repo. We built their Jukebox website which functions both as the controller and the output client.

mood jukebox

The jukebox sends a GET request for a song based on the mood and receives a reply with a song chosen by the API server. The site then sends a POST request to add the new song to the playlist. The jukebox site polls the server with GET requests to keep the playlist updated and to check whether the jukebox is in play mode. To start playing music, the site makes a PUT request when the user hits the play button, which returns the first song in the playlist. All the songs (sung by Lucas) were served from the server, so we didn’t need to worry about querying another source for music.
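That GET → POST → PUT flow might look something like the sketch below. The endpoint names and server URL are assumptions on my part — the real routes live in Vidia’s repo.

```javascript
// Sketch of the jukebox client's request flow. The endpoint paths and
// server URL are hypothetical; see Vidia's repo for the real API.
const API = "http://localhost:3000";
const songUrl = (mood) => `${API}/song?mood=${encodeURIComponent(mood)}`;

// 1. Ask the server for a song matching the chosen mood.
async function getSongForMood(mood) {
  const res = await fetch(songUrl(mood));
  return res.json();
}

// 2. Add the returned song to the shared playlist.
function addToPlaylist(song) {
  return fetch(`${API}/playlist`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(song),
  });
}

// 3. Flip the jukebox into play mode; the server replies with the
// first song in the playlist.
function play() {
  return fetch(`${API}/state/play`, { method: "PUT" });
}
```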

Keep reading for more details on things we encountered while working on this project (CORS, Axios, and hiding your API key).


RESTful Control Surface Design - Specification

Alden and I are working on a control interface for people to VJ. We surveyed a couple ITPers on the floor about what makes a good VJ set. People liked when:

  • Visuals reflect the mood and emotional content of the music
  • Timing matches the music
  • You warn people if you’re gonna do a strobe effect
  • There is a dynamic arc or storyline to the visuals

Features and System Diagram

Given our constraints, we knew timing might be an issue, since REST might not provide the instantaneous feedback needed for syncing with music. Either way, we’re still doing it. Our spec calls for a controller that drives visuals (in a browser) that can be projected or shown on a screen. These are the user inputs we’d like to have on the controller.

  • gif mode (on/off)
    • keyword (text input)
  • abstract visual mode (on/off)
    • movement speed (range)
    • foreground color (HSL range)
    • background color (HSL range)
    • shapes/patterns (set of options)
      • rect, circle, triangle, squiggle, arc
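One way to picture these inputs is as a single state object on the controller. This sketch is purely illustrative — the field names and default values are made up, not the final API:

```javascript
// Hypothetical in-memory representation of the controller inputs above.
// Field names and defaults are illustrative assumptions.
const controllerState = {
  gifMode: false,
  gifKeyword: "",                         // text input for gif search
  abstractMode: false,
  speed: 120,                             // movement speed (range)
  foreground: { h: 0, s: 100, l: 50 },    // HSL range sliders
  background: { h: 240, s: 100, l: 10 },
  shapes: {                               // set of on/off options
    rect: true,
    circle: true,
    triangle: false,
    squiggle: false,
    arc: false,
  },
};
```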

We want the clients to communicate according to this system diagram. We are planning on using the Giphy API to pull content for the “gif mode”.

img of system diagram

Either a physical or digital interface will work for the controller (:

Update: Alden and I will be designing the control interface for the VJ visuals. Ellen and Ridwan will be designing the output visuals. The documentation for our RESTful API is available below. Here are the GET and POST requests that can be made by the clients.

POST Requests

Visualization Mode

POST /mode/{value}

value: 0 to turn off, 1 for gif mode, 2 for abstract visual mode

Gif Mode Keyword

POST /gifword/{string}

string: keyword that describes desired gif

Abstract Visual Movement Speed

POST /speed/{value}

value: choose from range 40-180 bpm to match music speed

Abstract Visual Foreground and Background Colors

POST /foreground?h={range}&s={range}&l={range}
POST /background?h={range}&s={range}&l={range}

“h” value: choose from range 0-360 for hue
“s” value: choose from range 0-100 for saturation
“l” value: choose from range 0-100 for lightness

Abstract Visual Shapes

POST /shapes?triangle={bool}&circle={bool}&square={bool}&squiggle={bool}&arc={bool}

bool: 0 to turn off or 1 to turn on shape visibility. Bool values for each shape are independent of each other.
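Put together, the color and shape parameters assemble into query strings like the ones above. A small sketch — these helpers just mirror the spec and aren’t part of the API itself:

```javascript
// Build the query strings for the color and shape POST endpoints,
// mirroring the parameter spec above (bools as 0/1).
function colorQuery(h, s, l) {
  return `?h=${h}&s=${s}&l=${l}`;
}

function shapesQuery(shapes) {
  // shapes: e.g. { triangle: 1, circle: 0, square: 1, squiggle: 0, arc: 1 }
  return "?" + Object.entries(shapes)
    .map(([name, on]) => `${name}=${on ? 1 : 0}`)
    .join("&");
}

// Produces paths like:
//   POST /foreground?h=200&s=80&l=50
//   POST /shapes?triangle=1&circle=0&square=1&squiggle=0&arc=1
```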

GET Requests

Current State

Use this to request the current status of the system.

GET /state/

will return the following:

  {
    "mode": {0, 1, 2},
    "gifword": {string},
    "gifurl": {URL},
    "speed": {40-180},
    "foreground": {
      "h": {0-360},
      "s": {0-100},
      "l": {0-100}
    },
    "background": {
      "h": {0-360},
      "s": {0-100},
      "l": {0-100}
    },
    "shapes": {
      "triangle": {bool},
      "circle": {bool},
      "square": {bool},
      "squiggle": {bool},
      "arc": {bool}
    }
  }
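On the output side, the h/s/l values in the state map straight onto a CSS color. A minimal sketch (the sample values are illustrative):

```javascript
// Turn the server's foreground/background values into a CSS hsl() string.
function toCss({ h, s, l }) {
  return `hsl(${h}, ${s}%, ${l}%)`;
}

// Example with made-up state values:
const state = { foreground: { h: 200, s: 80, l: 50 } };
console.log(toCss(state.foreground)); // "hsl(200, 80%, 50%)"
```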

Giphy URL

Use this to request the current gif URL so the control interface can display the gif that comes back from the Giphy server.

GET /gifurl/

will return the following:

  {
    "gifurl": {URL},
    "errors": {err}
  }

Packet Sniffing a Google Home

picture of my google home mini

I decided to packet sniff a Google Home Mini that my sister gave me last year for Christmas. When I told her I didn’t want to use it because Google was probably collecting all our data, she was pretty upset that I didn’t appreciate her gift. I briefly used it out of guilt, but eventually it creeped me out too much so I quietly stopped using it (sorry, Ann). Now using Wireshark, I hope to report that my concerns are completely valid.


Traceroute of My Most Visited Sites

I started investigating by looking at which sites I visit most often. If you use Chrome, you can go to chrome://site-engagement/ and see which sites you use most frequently. It’s a little creepy that Chrome tracks your engagement on all the sites you browse, but really that’s just the tip of the iceberg.

I’m interested in how location affects the traceroute, so I’m also going to compare Google in the US (google.com) and Google in Taiwan (google.com.tw). I also looked at my personal site (beverlychou.com), which is hosted through DreamHost, and my blog (itp.beverlychou.com), which is hosted through GitHub Pages. I expect them to take different paths: the domain is registered with Namecheap, which points to DreamHost, and from there the subdomain for my ITP blog points to GitHub Pages. I collected data at three locations: ITP, my home, and a coffee shop with public wifi.


Websocket Game Controller

gif of controller

I used the Feather Huzzah ESP8266 board to create my game controller. Here is an overview of what happens when using the controller.

  1. The controller connects to the itpsandbox WiFi network.
  2. The user hits a button to activate the connection to the server.
  3. The LED inside the button lights up when the controller is connected to the server.
  4. Rotary encoder dials on the top and side let the user scroll their paddle up/down and left/right.
  5. When you’re done playing the game, you can hit the button again to disconnect from the server, and the LED will turn off.
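Step 4 boils down to accumulating encoder deltas and clamping the paddle to the canvas. Here’s a hypothetical sketch of how the receiving side might apply an encoder message — the message shape, canvas bounds, and field names are all assumptions, not the actual game-server code:

```javascript
// Hypothetical handling of encoder messages on the game-server side.
// Each message carries a signed delta; the paddle is clamped to the canvas.
const BOUNDS = { width: 640, height: 480 }; // assumed canvas size
const paddle = { x: 320, y: 240 };

function clamp(v, min, max) {
  return Math.min(Math.max(v, min), max);
}

// e.g. message = { axis: "y", delta: -3 }  (one encoder click up)
function applyEncoder(message) {
  if (message.axis === "x") {
    paddle.x = clamp(paddle.x + message.delta, 0, BOUNDS.width);
  } else {
    paddle.y = clamp(paddle.y + message.delta, 0, BOUNDS.height);
  }
  return paddle;
}
```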

Read more for a more detailed breakdown of the build & code.