Last week I talked to a friend about putting some real-time data on my website, primarily for novelty but also because I thought it was an exciting problem to build a solution for.

The idea is to capture some real-time data, get it over to a server securely, and figure out how to present it with some consideration for latency and bandwidth. Here's an account of how I built it.

The design

When thinking about how to design this for the website, my main objective was for it to blend into the existing aesthetic and get out of the way the moment you continue browsing. All it's meant to be is a quirky detail.

Choices like making it a background feed on just the homepage and fading it out as you scroll down to the content made it, for lack of a better word, less obnoxious.

I also wanted to limit my exposure in terms of privacy, which is why the feed shows my workspace, is low-resolution and grayscale, streams at a fairly low frame rate, and can be easily paused without breaking the experience on the website.

The hardware

To learn more about the hardware development tools I used for this project, look here.

The camera is low-quality, but that's ideal for this purpose. A lower-quality, somewhat fuzzy image is less distracting and hides a lot of detail. I wrote a small program for the chip that reads the camera frame buffer at a set interval, crops the image, and sends it to a microservice that serves it to the website.
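
The firmware itself is specific to the camera module, but the upload side of the loop is simple: grab a frame, crop it, and POST the JPEG to the ingest endpoint at a fixed interval. Here's a minimal sketch of that flow in Go; the endpoint URL, the bearer token, and the grabFrame helper (which just reads a file so the sketch runs) are placeholders, not the real setup.

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"os"
	"time"
)

// grabFrame stands in for reading a cropped JPEG out of the camera's
// frame buffer; here it just loads a file so the sketch is runnable.
func grabFrame() ([]byte, error) {
	return os.ReadFile("frame.jpg")
}

func main() {
	// The ingest URL and token are placeholders, not the real endpoint.
	const ingestURL = "https://example.com/api/frame"
	const token = "secret-device-token"

	for range time.Tick(2 * time.Second) { // one frame every 2s = 30 frames/min
		frame, err := grabFrame()
		if err != nil {
			fmt.Println("capture failed:", err)
			continue
		}

		req, err := http.NewRequest(http.MethodPost, ingestURL, bytes.NewReader(frame))
		if err != nil {
			fmt.Println("building request failed:", err)
			continue
		}
		req.Header.Set("Content-Type", "image/jpeg")
		req.Header.Set("Authorization", "Bearer "+token)

		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			fmt.Println("upload failed:", err)
			continue
		}
		resp.Body.Close()
	}
}
```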

Something to remember with a project like this is that you never want to expose a camera, or any other device on your local network, directly to the internet. It's also a good idea to assume that anything you push to the server could be recorded and made public outside the context of your project.

The software

For the server, I picked Go for its convenience and low performance overhead for something like this, and wrote a pair of small microservices.

If you're here for the specifics, one of the Go services receives images from the camera and stores a copy of the latest frame. The second service serves the newest frame to my website to power the live feed.
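
To make that concrete, here's a minimal sketch of those two responsibilities collapsed into a single Go process; the /ingest and /frame paths, the port, and the 1 MB size cap are assumptions for illustration rather than the actual API.

```go
package main

import (
	"io"
	"log"
	"net/http"
	"sync"
)

// latestFrame holds the most recent JPEG received from the camera.
var (
	mu          sync.RWMutex
	latestFrame []byte
)

func main() {
	// The camera pushes frames here (behind auth in a real setup).
	http.HandleFunc("/ingest", func(w http.ResponseWriter, r *http.Request) {
		if r.Method != http.MethodPost {
			http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
			return
		}
		body, err := io.ReadAll(io.LimitReader(r.Body, 1<<20)) // cap at 1 MB
		if err != nil || len(body) == 0 {
			http.Error(w, "bad frame", http.StatusBadRequest)
			return
		}
		mu.Lock()
		latestFrame = body
		mu.Unlock()
		w.WriteHeader(http.StatusNoContent)
	})

	// The website polls this endpoint for the newest frame.
	http.HandleFunc("/frame", func(w http.ResponseWriter, r *http.Request) {
		mu.RLock()
		frame := latestFrame
		mu.RUnlock()
		if frame == nil {
			http.Error(w, "no frame yet", http.StatusNotFound)
			return
		}
		w.Header().Set("Content-Type", "image/jpeg")
		w.Header().Set("Cache-Control", "no-store")
		w.Write(frame)
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```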

When someone visits the site, the front-end polls the API for the latest frame and fades each new frame into view, recreating a video-like feed. It's real-time to within about 10 seconds when the camera is active, and when I pause the camera, the website loops through a pre-captured sequence of images.

v1

My current approach of recreating a live feed from a sequence of images, rather than running an actual live video stream, has its pros and cons. For one, serving images from a server is much less complicated than setting up a video stream. Additionally, I could host at least a couple thousand live viewers off a small server (2 vCPUs, 2 GB of memory).

Serving images that are ~20-30 KB each at a rate of 30 frames per minute comes to roughly 1 MB per minute in bandwidth (~25 KB × 30 frames ≈ 750 KB). For reference, a video stream of roughly the same quality but at a much higher frame rate only costs about ~2 MB per minute, so there's definitely room for improvement. For a future upgrade, I'm hoping to take a page from video compression algorithms and see how far I can compress the individual frames, or whether I can send diffs between frames instead of full images.
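
One cheap approximation of the diff idea, sketched below: compare each new frame against the previous one and skip the upload entirely when almost nothing changed. This isn't what runs today, and proper diffing (sending only the changed regions) would go further; the file names and the 2% threshold are arbitrary.

```go
package main

import (
	"fmt"
	"image"
	"image/jpeg"
	"os"
)

// loadJPEG decodes a JPEG file into an image.
func loadJPEG(path string) (image.Image, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()
	return jpeg.Decode(f)
}

// changedFraction returns the fraction of pixels whose brightness moved by
// more than a small amount between two frames of the same size.
func changedFraction(prev, curr image.Image) float64 {
	b := curr.Bounds()
	changed := 0
	for y := b.Min.Y; y < b.Max.Y; y++ {
		for x := b.Min.X; x < b.Max.X; x++ {
			pg := gray(prev.At(x, y).RGBA())
			cg := gray(curr.At(x, y).RGBA())
			if diff := pg - cg; diff > 2048 || diff < -2048 {
				changed++
			}
		}
	}
	return float64(changed) / float64(b.Dx()*b.Dy())
}

// gray collapses 16-bit RGBA channels into a single brightness value.
func gray(r, g, b, _ uint32) int {
	return int(r+g+b) / 3
}

func main() {
	prev, err := loadJPEG("prev.jpg")
	if err != nil {
		panic(err)
	}
	curr, err := loadJPEG("curr.jpg")
	if err != nil {
		panic(err)
	}
	if changedFraction(prev, curr) < 0.02 {
		fmt.Println("frame barely changed; skip the upload")
	} else {
		fmt.Println("frame changed; upload it")
	}
}
```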

v2

My second approach to handling the video stream is to serve an HLS feed. You can pipe a stream of images through ffmpeg to produce the video stream and then serve that. I'll report back on how it goes.
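
I haven't built this yet, but the rough shape would be piping JPEG frames into ffmpeg's stdin and letting it write the HLS playlist and segments to disk for the web server to pick up. Here's a sketch of that wiring in Go, assuming ffmpeg is installed and using a latest.jpg file as a stand-in for the frame source.

```go
package main

import (
	"io"
	"log"
	"os"
	"os/exec"
	"time"
)

func main() {
	// ffmpeg reads a stream of JPEGs on stdin and writes an HLS playlist
	// plus a rolling window of segments into ./hls. The flags are a
	// starting point, not tuned values.
	if err := os.MkdirAll("hls", 0o755); err != nil {
		log.Fatal(err)
	}
	cmd := exec.Command("ffmpeg",
		"-f", "image2pipe", // input is a pipe of images
		"-framerate", "1", // roughly one frame per second
		"-i", "pipe:0",
		"-c:v", "libx264",
		"-pix_fmt", "yuv420p",
		"-f", "hls",
		"-hls_time", "4", // 4-second segments
		"-hls_list_size", "5", // keep a short rolling playlist
		"-hls_flags", "delete_segments",
		"hls/stream.m3u8",
	)
	cmd.Stderr = os.Stderr

	stdin, err := cmd.StdinPipe()
	if err != nil {
		log.Fatal(err)
	}
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}

	// Feed the newest frame to ffmpeg once per second. Here it is re-read
	// from disk; in practice it would come from the ingest service.
	for range time.Tick(time.Second) {
		f, err := os.Open("latest.jpg")
		if err != nil {
			log.Println("no frame available:", err)
			continue
		}
		if _, err := io.Copy(stdin, f); err != nil {
			log.Fatal("ffmpeg pipe closed:", err)
		}
		f.Close()
	}
}
```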

A couple of notes on enhancing privacy

As this Reddit user points out, sharing a live feed of yourself on the internet is not to be taken lightly. You could inadvertently leak information about yourself or others in your environment, so it's worth taking these implications seriously.

I picked my workspace as the subject because I didn't mind working in public. It was also the type of presence I was aiming for aesthetically, but I'll likely point it somewhere else in the future.

If you've accepted the exposure and risk involved with live streaming, here are a few things I built into the streaming service that I hope enhance privacy further.

  • The camera hardware is not accessible over the internet.
  • A hardware shutoff, so the camera can be physically disabled. When the camera is not active, the website falls back to looping a pre-recorded sequence.
  • The video is only sometimes live. The service loops pre-recorded sequences at random intervals to discourage anyone from tracking my activity over time; a rough sketch of that switching logic follows this list. It's a fair trade-off for additional privacy and safety.
  • The frame is composed to minimize anything that could help someone geolocate me: no landmarks, easily identifiable buildings, photos, or personal items in view.
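
Here's a rough sketch of how that live/pre-recorded switching can work on the server, assuming a directory of pre-recorded frames and a simple camera-active flag; the decoy probabilities and durations are made-up numbers, and the real service differs in the details.

```go
package main

import (
	"fmt"
	"math/rand"
	"os"
	"path/filepath"
	"sort"
	"time"
)

// frameSource decides which frame visitors see: the latest live frame when
// the camera is active and we're not in a decoy window, otherwise the next
// frame of a pre-recorded loop.
type frameSource struct {
	cameraActive bool      // flipped off by the hardware shutoff
	decoyUntil   time.Time // random windows where old footage loops anyway
	loopFrames   []string  // pre-recorded frames, played in order
	loopIndex    int
}

func (s *frameSource) next(liveFrame []byte) ([]byte, error) {
	now := time.Now()

	// Occasionally start a random decoy window even while live.
	if s.cameraActive && now.After(s.decoyUntil) && rand.Intn(100) == 0 {
		s.decoyUntil = now.Add(time.Duration(5+rand.Intn(10)) * time.Minute)
	}

	if s.cameraActive && now.After(s.decoyUntil) && liveFrame != nil {
		return liveFrame, nil
	}

	// Fall back to the pre-recorded loop.
	if len(s.loopFrames) == 0 {
		return nil, fmt.Errorf("no pre-recorded frames available")
	}
	frame, err := os.ReadFile(s.loopFrames[s.loopIndex])
	s.loopIndex = (s.loopIndex + 1) % len(s.loopFrames)
	return frame, err
}

func main() {
	frames, _ := filepath.Glob("prerecorded/*.jpg")
	sort.Strings(frames)
	src := &frameSource{cameraActive: true, loopFrames: frames}

	// With no live frame available, the pre-recorded loop is served.
	frame, err := src.next(nil)
	fmt.Println(len(frame), err)
}
```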

Some stats

It caught me by surprise how many folks were interested in this. You don't often see people sharing a casual stream outside of places like Twitch or YouTube. In hindsight, I should've added a wave button so visitors could interact with the feed and say hi.

  • ~9,000 visitors on the site over the weekend; I didn't have logging set up on the first day, so the actual number is probably closer to double that.
  • At the peak, there were close to 100 concurrent viewers.
  • ~40 GB in bandwidth for the whole ordeal.


That's all there is to it. If you end up building something similar or have any thoughts about this, I'd love to hear about it.