The HTTP-based method for controlling lights that we learned this week let me realize a concept I first envisioned several weeks ago: an infrared emitter mounted on a pair of glasses (ideally, an eye tracker would capture the user's focus) selects the piece of furniture to be controlled, and embodied gestures then perform the remote-control actions. The sketch below shows a person selecting a lamp simply by gazing at it, then turning it on by moving a hand up and down to mimic pulling a switch.

61477d21ac5477b5cef48b493698ad88.jpg

Personally, I felt this concept was a bit too ambitious for this week's assignment, especially since even if I had ordered the necessary IR transmitter and receiver, they would not have arrived in time. Consequently, I opted for a simpler approach: turning the light on and off through physical gestures alone. The code and photos of the physical interface are below:

Code

#include <SPI.h>
#include <WiFiNINA.h>
#include <ArduinoHttpClient.h>
#include <Arduino_LSM6DS3.h> 
#include "arduino_secrets.h"

char hueHubIP[] = "172.22.151.226";
String hueUsername = "QL6AHnxaf3-3YER9xoBCp8DRDsfweUcfuuAtmqP5";
int lightNumber = 9;

const float PULL_THRESHOLD = -1.5;  // X-axis acceleration (in g) that counts as a "pull" gesture
const int COOLDOWN_MS = 1500;       // minimum time between toggles, to avoid double triggers

WiFiClient wifi;
HttpClient client = HttpClient(wifi, hueHubIP, 80);
bool lightOn = false;
unsigned long lastTrigger = 0;

void setup() {
  Serial.begin(9600);

  // connect to the WiFi network defined in arduino_secrets.h
  int status = WL_IDLE_STATUS;
  while (status != WL_CONNECTED) {
    status = WiFi.begin(SECRET_SSID, SECRET_PASS);
    delay(5000);
  }
  Serial.println("WiFi connected");

  if (!IMU.begin()) {
    Serial.println("IMU initialization failed");
    while (1);
  }
  Serial.println("IMU OK");
}

void loop() {
  float ax, ay, az;

  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(ax, ay, az);

    Serial.print("X:"); Serial.print(ax);
    Serial.print(" Y:"); Serial.print(ay);
    Serial.print(" Z:"); Serial.println(az);  

    unsigned long now = millis();
    if (ax < PULL_THRESHOLD && (now - lastTrigger) > COOLDOWN_MS) {
      lastTrigger = now;
      lightOn = !lightOn;
      sendHueRequest(lightOn);
      Serial.println(lightOn ? "light on" : "light off");
    }
  }

  delay(50);
}

void sendHueRequest(bool on) {
  String body = "{\"on\":" + String(on ? "true" : "false") + "}";
  String path = "/api/" + hueUsername + "/lights/" + lightNumber + "/state";

  client.beginRequest();
  client.put(path);
  client.sendHeader("Content-Type", "application/json");
  client.sendHeader("Content-Length", body.length());
  client.beginBody();
  client.print(body);
  client.endRequest();

  Serial.println("HTTP: " + String(client.responseStatusCode()));
}
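For reference, the PUT request this sketch sends to the Hue bridge looks roughly like the following on the wire (the bridge IP, username, and light number come from the variables above):

```http
PUT /api/QL6AHnxaf3-3YER9xoBCp8DRDsfweUcfuuAtmqP5/lights/9/state HTTP/1.1
Host: 172.22.151.226
Content-Type: application/json
Content-Length: 11

{"on":true}
```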

Physical Interface

image.png

bd46f16e77726ed169c66f524b907609.jpg

Local Network Concept

In my original concept, whenever a piece of smart furniture is selected, it signals the central network hub that it has been chosen and is in standby mode. All subsequent gesture commands are routed to that piece of furniture for as long as it remains selected. The wrist-mounted gesture detector only detects the current user's movements and reports them to the hub; the hub then translates those gestures into specific commands and forwards them to the smart furniture.

Frame 22.png

The advantage of this approach is that, much like MQTT, it decouples the sending and receiving layers, allowing action data and commands to be separated. Of course, the details will need further consideration at a later stage.