Massive reorganization of posts.

2024-10-15 15:42:42 +00:00
parent 1ee853844c
commit b564e7249a
146 changed files with 19 additions and 1 deletion

---
title: Remotely controlled blinkenlights with arduino
date: 2020-05-15
tags:
- arduino
- electronics
- esp8266
- golang
- programming
- technology
---
{{< raw >}}
<p class="l-deprecation_warning">
This post sat in drafts for years, and is being published as-is. I might update it at some point, but probably not. Sorry!
</p>
{{< /raw >}}
There are a lot of reasons I might not want someone to knock on or open my office door.
I work from home, which means meetings get interrupted. I have sensory processing issues,
which means sometimes I can barely deal with human voices. I have an anxiety disorder,
which means other times I can't deal with, well, human interaction.
Maybe it's stir-craziness during quarantine, or maybe this is just who I am, but I decided
to solve this problem in the most convoluted way I could think of. So I wrote a server and
Arduino client for controlling GPIO pins over WiFi, and built myself a status indicator light.
## The build
Here is the initial prototype:
{{< imgproc "breadboard.jpg" >}}
Breadboard with wires, LEDs, resistors, and a microcontroller.
{{< /imgproc >}}
After some cleanup (and adding a fourth light) we get this nice little box:
{{/*< imgproc "wiring.jpg" />*/}}
{{/*< imgproc "finished.jpg" />*/}}
## The code
I tried to keep the code generic, but not too generic. I could have had the webhook server just record and relay
an arbitrary string, for instance. But the server needs to be aware of 'momentary' switches (for future ideas
I have), and I want some sort of input validation, so instead the server expects a simple little JSON array.
The client code currently only works on ESP8266 chips. (specifically, my prototype is an Adafruit Feather Huzzah)
One 'fun' thing about the ESP8266, and more broadly Arduino programming in general, is that there are a lot of
different libraries that can be used to achieve the same thing, with varying levels of ease and documentation. Getting
HTTPS to work was surprisingly challenging, and I briefly went down a rabbit-hole of Arduino filesystems and hex-encoded
CA certificates before discovering BearSSL's WiFiClientSecure library.
The server code is [here](https://git.annabunch.es/annabunches/gpio-webhook-server) and the client code is
[here](https://git.annabunch.es/annabunches/gpio-webhook-arduino).
## Triggering the webhook
Anything that can POST to a URL can be used to write to the webhook and change the light. I'm using [IFTTT](https://ifttt.com)'s
Amazon Alexa integration, which seems to be the easiest way to get an Alexa to send a webhook.
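For testing without IFTTT, any HTTP client will do. Here's a sketch using `curl`; the URL is a placeholder for wherever your webhook server is actually listening, and the payload is the JSON array of 0/1 values described above, one per light:

```shell
# Placeholder URL -- substitute your webhook server's real address and path.
WEBHOOK_URL="https://example.com/webhook"

# One element per configured pin, in PIN_MAP order.
PAYLOAD='[1,0,1,0]'

# POST the payload as JSON; don't abort the script if the server is unreachable.
curl -s -X POST -H "Content-Type: application/json" \
  -d "$PAYLOAD" "$WEBHOOK_URL" || echo "request failed (server unreachable?)"
```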
## Understanding the pin mapping
The `PIN_MAP` variable in `config.h` needs to be configured based on the particular board you're using, and which GPIO pins
you've decided to use for that board. As mentioned above, this project is using a Feather Huzzah with the ESP8266 chip.
The pinout for that board looks like this:
{{< imgproc pinout.png >}}
Pinout diagram for Feather Huzzah ESP8266. Copyright Danny Nosonowitz. Original available [here](https://learn.adafruit.com/assets/46249).
{{< /imgproc >}}
I'm using the "bottom" four pins on the right-hand side, pins 4, 5, 2, and 16. So `PIN_MAP` looks like this:
{{< highlight C >}}
{
    {4, 1},  // green
    {5, 1},  // yellow
    {2, 1},  // red
    {16, 1}  // blue
};
{{< /highlight >}}
The order in this array determines which pin maps to each webhook data point. For example, if the webhook data received looks like `[1,0,1,0]`, that would turn on the green and red lights.
## Security Concerns
Don't use this code for anything critical! The only protection this has against malicious input is the webhook
URL being kept a secret. Assuming your webhook server is either kept on your local network or secured behind an HTTPS proxy,
that should be a reasonable guarantee, but it's still a fairly shaky amount of security. I considered requiring POST requests
to be signed by a secret certificate of some sort, but that would rule out using something as simple as IFTTT to send the webhooks. So instead,
I've sacrificed a measure of security for ease of use.
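Given that the URL is the whole secret, it should at least be unguessable. One way to generate a random path component is sketched below; how you wire it into your server's configuration is up to you, and this particular scheme is just an illustration, not something the project mandates:

```shell
# Generate 16 random bytes as 32 hex characters for a secret URL path.
# The "/hook-" prefix is purely illustrative.
SECRET=$(head -c 16 /dev/urandom | od -An -tx1 | tr -d ' \n')
echo "Webhook path: /hook-$SECRET"
```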

---
title: "How I Host Websites (plus musings about Web 2.0 and social media)"
date: 2023-09-13
tags:
- meta
- infrastructure
- web hosting
- containers
- static site generators
- rant
---
For years, I struggled to find the right solution for hosting a simple blog. Nothing quite fit my needs; every option was either too opinionated, not featureful enough, or way too much infrastructure overhead for what I wanted.
<!--more-->
Tim Berners-Lee created the Web in 1989. It hasn't been strictly downhill from there, but we *have* lost something that feels crucial. This isn't just nostalgia; there are tools that do an admirable job of meeting many use cases, but there are a few use cases that should be better served than they are. So I want to talk about the websites I manage, how I chose the tools that I use, and what I feel is missing in 2023.
### The history
But first, some very broad history. In the early days of the web, everyone was writing HTML by hand. You could grab an account on Geocities, write up some nice static webpages, and go about your day. Early dynamic content was done via [CGI](https://en.wikipedia.org/wiki/Common_Gateway_Interface), which allowed you to plug more-or-less arbitrary backend code in any programming language that could speak CGI.[^1] So at this point, even getting some static content[^2] online where other people could see it required a decent chunk of technical knowledge, and anything with dynamic content like user comments often required substantially more.
Eventually Content Management Systems, of which blogging platforms like Wordpress are a subset, came along. These made it much easier to get a website with dynamic content up and running, and more importantly made it easy for users to create pages without a great deal of technical knowledge. This meant that one "admin" could easily host a website that had dozens of non-technical contributors.
And eventually, these systems (especially Wordpress) became the standard way to host a website. Hosted blogs (such as Livejournal and, more recently and perhaps relevantly, Medium) came along, and then of course the rise of social media made blogs largely irrelevant. But Wordpress offered a lot more in the way of customizability, and a presence that wasn't tied to a hosted platform. Wordpress is, in many ways, the last way to easily create an online space that feels like it belongs wholly to the author, and not just content in a social media content mill. There are things like Squarespace, but they have premium price tags and are almost universally geared toward small businesses, not individuals who just want a web presence.
### The problem
Wordpress is genuinely a good tool. But Wordpress is a very *heavy* application. It requires a database, supports comments, and has hundreds of themes and plugins, all installable from its convenient admin interface. This both makes it resource-intensive *and* increases its attack surface substantially. Wordpress is probably the number one target of malicious scripts on the Internet, because it comes with non-optional dynamic features.
For many sites, that's a perfectly fine tradeoff. If comments or other features that make a database particularly useful are a crucial part of your site's experience, or there are plugins that are indispensable for your particular needs, then Wordpress is great.
But the web is *bloated*. Loading a simple page with text and a few images doesn't need to be a slow, multi-megabyte experience. Wordpress, CMSes, hosted blogging sites, and social media are often major overkill in terms of resource usage and page size. The ever-increasing availability of processing power and bandwidth (in the developed world, at least) has made programmers abandon efficiency and optimization in a lot of cases. We've learned to live with seemingly simple sites that take ages to load, and it really doesn't have to be this way.
If your needs are *simple*, if all you want is a place that feels like your own to write long-form thoughts on one or more topics, or post some pictures or links to things you find interesting, then Wordpress is like using a sledgehammer to drive in a thumbtack. Which is where static site generators would like to help.
Static site generators are intended to fill exactly this use case. They take a convenient, "human-readable" directory of content and create a set of simple, static webpages that you can upload to a webserver of your choice. The result is a lightning-fast website that can handle a ton of traffic.
Unfortunately, these generators require a non-trivial level of technical knowledge to use. You need to know, at a minimum, how to use markdown, a command line, `scp` or `ftp`, and probably how to write a little code. Ideally you would also understand source control and a host of other conceptual tools. Static generators are *great* for the technically savvy, and as a result are frequently used for technically-leaning blogs and websites.
But there's a gap between these two user bases. Don't want to contribute to the bloat of the web or deal with the fragility of a database just to put your thoughts somewhere that Musk and Zuckerberg can't touch them? Well, there's Github pages, but that still requires writing your own HTML or using a static generator. (plus [Github is best avoided](/posts/2019-12-04-please-stop-using-github/)) There's Medium or Cohost, but they are still unnecessarily bloated for a lot of people[^3], and they lack the personal feeling of "here is a space on the web that is solely mine."
### My solutions
Of course, I have a decent level of technical ability, so some of my sites, including this one, are built with a static site generator. ([Hugo](https://gohugo.io/) is the one I prefer these days) You can see the source for this site on my [git server](https://git.annabunches.net/annabunches/annabunches.net). The built site does include some javascript, mostly for Google Analytics, but I could pare that down even further if I wanted to.
I could deploy these sites to a simple webserver over SSH or FTP, but instead I build the sites directly into `nginx` docker containers and deploy them into a docker environment running on [Linode](https://www.linode.com/) using `docker-machine` and the wonderful [jwilder/nginx-proxy](https://hub.docker.com/r/jwilder/nginx-proxy) image. (I have to have *some* sort of overkill in my workflow or I feel unsatisfied. But this overkill doesn't affect the end user experience!)
Using containerization gives me a lot of clean, reproducible configuration; just about everything is configured as code, with very little in the way of state on the webserver itself. I can redeploy these onto a new host with zero loss of content at any time, should the need arise.
Plus, my deploy script can just run an elaborate version of this:
```
docker build -t my-site .   # image name is a placeholder
docker push my-site         # assumes a registry is configured
docker-compose pull         # these two run on the docker host
docker-compose up -d
```
Which, for a personal blog, feels like just the right amount of infrastructure to deploy.
I also host several Wordpress instances, most notably [Eruditorum Press](https://www.eruditorumpress.com/). I use some of the same infrastructure tricks, but these sites are largely authored by non-technical users and have a strong need for comments, so the static site solutions above aren't feasible. So Wordpress feels like the right solution there.
### In Conclusion
Ok, so this was mostly just a rant about website bloat. And it's possible I'm missing obvious solutions or am blowing the problem out of proportion. Maybe I've just gotten tired of websites being slow. But it really feels like we ought to have created something that's easy to use without technical expertise, but also produces clean, fast websites for people with simple use cases.
[^1]: And if your favorite language didn't have a library for CGI, you could write that support in yourself!
[^2]: Like, say, your [Revolutionary Girl Utena fanpage](https://www.geocities.ws/k_weird_girl/rgu.html).
[^3]: I loaded a random page on Medium with no significant interactivity and got at least 400 kB of javascript; Cohost was somewhat worse. That may not seem like much, but remember that some people are still reading your blog on a 2G mobile connection.

---
title: Please Stop Using Github
date: 2019-12-04
tags:
- github
- ice
- activism
---
I mentioned on [twitter] that I've stopped using Github because of their continued support of ICE.
To be clear, it's not just that ICE uses Github. It's that they think [supporting ICE is okay] because
ICE is responsible for things other than putting children in cages. Their statement misses the point
in so many ways, I have very little hope they'll ever see past their own justifications on this.
So, no more Github. This has some very real impact for me - I've used Github for a long time, used
its social media features, and been a paid subscriber for years. Dropping Github is annoying. But we
**must** stop supporting organizations that are in turn supporting fascism. I encourage anyone reading
this to follow my lead and give up on Github.
Good alternatives exist: [Gitlab] has a free tier very similar to Github's if you want a drop-in hosted replacement.
[gitea] is what I've opted for, because at this point I just feel *exhausted* by large, hosted solutions; it's a lovely,
community-driven open source project. It has a docker image that is very easy to set up, and it supports U2F/webauthn.
And of course, you can also just keep bare git repositories on a server and push to them with SSH, although that loses
all of the convenience and discoverability of a git solution with a web interface.
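For reference, the bare-repo workflow is just a couple of commands. The sketch below runs entirely locally so it's easy to try; for a real server you'd create the bare repo over SSH and use a `user@host:path` remote instead. All paths and names here are made up:

```shell
# "Server" side: create a bare repository. Over SSH this would be
# something like: ssh user@host 'git init --bare ~/repos/blog.git'
git init --bare /tmp/blog.git

# Client side: clone, commit, push. With a remote server the clone URL
# would be user@host:repos/blog.git instead of a local path.
git clone /tmp/blog.git /tmp/blog-checkout
cd /tmp/blog-checkout
echo "hello" > post.md
git add post.md
git -c user.name=demo -c user.email=demo@example.com commit -m "first post"
git push origin HEAD
```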
[twitter]: https://twitter.com/annabunches/status/1202361836852654082
[supporting ICE is okay]: https://github.blog/2019-10-09-github-and-us-government-developers/
[gitlab]: https://about.gitlab.com/
[gitea]: https://gitea.io/en-us/