LiveView on Nerves

I’ve played with Nerves for almost as long as I’ve been learning and using Elixir. Nerves is a fantastic way of working with hardware on the BEAM virtual machine, and it is great fun for hobbyist projects on devices like the Raspberry Pi. Phoenix LiveView is currently my favorite way of making full-stack web development cohesive while keeping its complexity as low as possible. I haven’t run into any short, compelling demos of getting these two started together. There is also a video covering this exact project that I made.

It is not exceedingly hard to do, and there are many compelling first demos one could arguably make. This is just one way to do something fun with Nerves and LiveView in a fairly minimal way. I have also made a slightly more involved demo project for an event that you can find on my GitHub. That one includes Tailwind CSS, basic Ecto migrations with SQLite and some other niceties, but it is just another demo. The docs for that one are sparse and steps are likely missing. You have been warned.

Let’s stick to the slightly simpler demo and unpack the process.

Let’s get going

You will need a Raspberry Pi device, an SD card reader and some patience with me. I ran this on a Pi Zero W and a Pi 400.

The fundamental steps are taken from the Nerves documentation on User Interfaces, which covers setting up a “poncho” project with Phoenix.

First, make sure you have installed Nerves and have Elixir working.

mkdir keybored
cd keybored
mix phx.new keybored_ui --module KeyboredUI --no-ecto --no-mailer

This gives us a project directory for the whole thing, and we create a Phoenix project inside of it. We make sure the module name is properly capitalized, and then we exclude Ecto (no database, plz) and the default mailer (no email, plz). Ecto would require a few extra steps to handle; the mailer just isn’t used.

I want a way of providing interesting and useful inputs to play with, so we will bring in a special Linux-only dependency. It will work on your host machine if it runs Linux, assuming you put yourself in the input group. It will also work very nicely on the Raspberry Pi with Nerves, as Nerves is built on Linux.

In your mix.exs, find the deps function and add this one:

elixir keybored_ui/mix.exs
{:input_event, "~> 1.0"}

Then run:

mix deps.get

InputEvent uses the Linux Input subsystem userspace API (apparently) and there is some neat reference here. Essentially it lets you get events from things like buttons, keyboards, mice, touchscreens and such in your system.
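To give a sense of the shapes involved, here is a sketch of what the library hands you. The device path, the info struct contents and the event tuples are illustrative (an assumed keyboard on event0), so expect the exact values to differ on your machine:

```elixir
iex> InputEvent.enumerate()
[{"/dev/input/event0", %InputEvent.Info{name: "USB Keyboard", ...}}]

iex> InputEvent.start_link("/dev/input/event0")
{:ok, #PID<0.210.0>}

# The calling process then receives messages shaped like:
# {:input_event, "/dev/input/event0", [{:ev_key, :key_a, 1}, {:ev_key, :key_a, 0}]}
```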

This calls for a GenServer

Time to make a nice and rough GenServer to capture those events. If you want to run this part of the Elixir application on a non-Linux host you’ll need to fake some events on your own.

Create the file lib/keybored_ui/inputter.ex and write up the following:

elixir keybored_ui/lib/keybored_ui/inputter.ex
defmodule KeyboredUI.Inputter do
  use GenServer

  def start_link(_) do
    GenServer.start_link(KeyboredUI.Inputter, nil, name: KeyboredUI.Inputter)
  end

  @impl true
  def init(_) do
    devices =
      InputEvent.enumerate()
      |> Map.new(fn {device, info} ->
        {:ok, _pid} = InputEvent.start_link(device)
        {device, info}
      end)

    {:ok, devices}
  end

  @impl true
  def handle_call(:fetch, _from, devices) do
    {:reply, devices, devices}
  end

  @impl true
  def handle_info({:input_event, _device, _values} = event, devices) do
    Phoenix.PubSub.broadcast!(KeyboredUI.PubSub, "inputs", event)
    {:noreply, devices}
  end
end
This GenServer starts and registers itself under a name. In the init function it does the wildest possible thing and gets the list of all devices available to InputEvent. All of them. For each device it then starts a link to it, which means it will start sending messages to us.

We store the list of devices since there is some good stuff in there, the name of the device for example. Take a look at the data though; it also provides a bunch of information about the events you can expect from the device.

Then we implement one call-handler which will let us fetch the device list. We’ll use it later.

The other handler is the input event handler. The only thing that it does is take the event and broadcast it to the “inputs” topic via Phoenix PubSub.

Then go to your lib/keybored_ui/application.ex and add it to the supervision tree:

elixir keybored_ui/lib/keybored_ui/application.ex
children = [
  # ... the children the generator created ...
  KeyboredUI.Inputter
]

Next we build a LiveView

Create the folder lib/keybored_ui_web/live and the file input_live.ex in it. It should look as follows:

elixir keybored_ui/lib/keybored_ui_web/live/input_live.ex
defmodule KeyboredUIWeb.InputLive do
  use KeyboredUIWeb, :live_view

  @impl true
  def mount(_params, _session, socket) do
    devices = GenServer.call(KeyboredUI.Inputter, :fetch)

    Phoenix.PubSub.subscribe(KeyboredUI.PubSub, "inputs")

    {:ok, assign(socket, devices: devices, events: [], dot: {50, 50})}
  end

  @impl true
  def handle_info({:input_event, device, values}, socket) do
    events = [{device, values} | socket.assigns.events]
    dot = process_movements(socket.assigns.dot, values)

    {:noreply, assign(socket, events: events, dot: dot)}
  end

  defp process_movements(dot, []) do
    dot
  end

  defp process_movements({x, y} = dot, [value | values]) do
    dot =
      case value do
        {:ev_rel, :rel_x, points} -> {x + points, y}
        {:ev_rel, :rel_y, points} -> {x, y + points}
        {:ev_key, :key_up, 0} -> {x, y - 5}
        {:ev_key, :key_down, 0} -> {x, y + 5}
        {:ev_key, :key_left, 0} -> {x - 5, y}
        {:ev_key, :key_right, 0} -> {x + 5, y}
        _ -> dot
      end

    process_movements(dot, values)
  end

  def render(assigns) do
    ~H"""
    <svg viewBox="0 0 100 100" style="position: absolute; top: 0; left: 0; height: 100vh; width: 100vw;">
      <circle cx={elem(@dot, 0)} cy={elem(@dot, 1)} r="6" />
    </svg>
    <div style="position: relative; max-height: 800px; overflow: hidden;">
      <%= for {d, e} <- Enum.take(@events, 100) do %>
        <div><%= @devices[d].name %>: <%= inspect(e) %></div>
      <% end %>
    </div>
    """
  end
end
As we mount the LiveView we pull the list of devices and subscribe to the “inputs” topic. We set up some initial state with the list of devices, an empty list of events and … a dot?

We add the handle_info/2 callback matching the input events we expect to be working with. We add the event information to the events list, and for particular events we let a function called process_movements/2 update the dot. Then we update the assigns.
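To make the dot arithmetic concrete, here is a standalone sketch of the same reduction process_movements/2 performs, using a made-up batch of one relative-x move and one arrow-key press:

```elixir
# A hypothetical event batch: mouse moved 3 units right, then the up arrow fired
values = [{:ev_rel, :rel_x, 3}, {:ev_key, :key_up, 0}]

dot =
  Enum.reduce(values, {50, 50}, fn value, {x, y} = dot ->
    case value do
      {:ev_rel, :rel_x, points} -> {x + points, y}
      {:ev_rel, :rel_y, points} -> {x, y + points}
      {:ev_key, :key_up, 0} -> {x, y - 5}
      {:ev_key, :key_down, 0} -> {x, y + 5}
      {:ev_key, :key_left, 0} -> {x - 5, y}
      {:ev_key, :key_right, 0} -> {x + 5, y}
      _ -> dot
    end
  end)

# dot is now {53, 45}: x moved 3 to the right, y moved 5 up (SVG y grows downward)
```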

Our render-function renders the assigns by showing an SVG dot on the screen and printing out some recent events.

Let’s add it to the router.ex file by replacing the existing entry for "/" with:

elixir keybored_ui/lib/keybored_ui_web/router.ex
live "/", InputLive

That’s it for the LiveView part. If you are on Linux you can test this on your host by making sure your user is in the input group with sudo usermod -a -G input <username>, restuffing your shell with newgrp input and then running mix phx.server.

On to the Nerves!

More Nerves!

We need to generate the Nerves firmware project next to the keybored_ui project. This gives us the foundation for a poncho-style project: a way of structuring related Elixir projects where the defining feature is that it isn’t an Umbrella project. It also lets you run the UI part of your application without fiddling with the Nerves firmware config. In this case it mostly speeds up project creation; we can use the Nerves generator and the Phoenix generator without needing to merge their efforts.

In our main keybored folder:

mix nerves.new keybored_firmware
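For orientation, the poncho layout we end up with is simply two sibling Mix projects in one directory, sketched here:

```
keybored/
├── keybored_ui/         # the Phoenix LiveView project
└── keybored_firmware/   # the Nerves firmware project, which depends on keybored_ui
```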

Go into the new project’s mix.exs file and add your UI project as a dependency, turning these two projects into a poncho:

elixir keybored_firmware/mix.exs
{:keybored_ui, path: "../keybored_ui"},

Now we want to slightly change our usage of esbuild in the UI project.

In our keybored_ui/mix.exs we change the esbuild line to read:

elixir keybored_ui/mix.exs
{:esbuild, "~> 0.3", runtime: Mix.env() == :dev && Mix.target() == :host},

Specifically, we are adding the bit about Mix.target() == :host so that esbuild only runs during dev work on the host machine, not on the device.

In the keybored_firmware project run:

shell keybored_firmware/
mix deps.get

Now for some configuration. We will set up the WiFi, unless you know your USB tethering and want to do that instead. WiFi is reasonable enough to set up, but tethering by USB is still immensely useful if something goes wrong, so on a Pi Zero, Pi 3A+ or Pi 4 you should be able to take advantage of that.

In the firmware project we hit keybored_firmware/config/config.exs to get our config sorted out:

elixir keybored_firmware/config/config.exs

config :vintage_net,
  regulatory_domain: "SE",
  config: [
    {"usb0", %{type: VintageNetDirect}},
    {"eth0",
     %{
       type: VintageNetEthernet,
       ipv4: %{method: :dhcp}
     }},
    {"wlan0",
     %{
       type: VintageNetWiFi,
       vintage_net_wifi: %{
         networks: [
           %{
             key_mgmt: :wpa_psk,
             ssid: "Kontoret",
             psk: "underjord"
           }
         ]
       },
       ipv4: %{method: :dhcp}
     }}
  ]

# config from the nerves UI guide
config :keybored_ui, KeyboredUIWeb.Endpoint,
  url: [host: "nerves.local"],
  http: [port: 80],
  cache_static_manifest: "priv/static/cache_manifest.json",
  secret_key_base: "HEY05EB1dFVSu6KykKHuS4rQPQzSHv4F7mGVB/gnDLrIu75wE/ytBXy2TaL3A6RA",
  live_view: [signing_salt: "AAAABjEyERMkxgDh"],
  check_origin: false,
  render_errors: [view: KeyboredUIWeb.ErrorView, accepts: ~w(html json), layout: false],
  pubsub_server: KeyboredUI.PubSub,
  # Start the server since we're running in a release instead of through `mix`
  server: true,
  # Nerves root filesystem is read-only, so disable the code reloader
  code_reloader: false

# Use Jason for JSON parsing in Phoenix
config :phoenix, :json_library, Jason


For the WiFi I suggest you change the SSID and PSK in the configuration to match your actual WiFi. OR, and this is certainly an option, you change your WiFi access point to have the same credentials as mine. I’m saying, you have options.

The latter part is essentially copy-pasted from the Nerves UI guide. Keep your secret key base very secret, don’t betray your signing salt, use this for fun not function. The reason we set up the full Phoenix config here is that config in a dependency, such as keybored_ui which we treat as one, is not inherited by your application. We need this stuff in here or Phoenix has no clue what it should do.
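If you would rather not ship the values I published, Phoenix can mint fresh ones. The mix phx.gen.secret task prints a random secret of a given length (64 characters by default), which suits both settings:

```shell
mix phx.gen.secret      # paste the output in as secret_key_base
mix phx.gen.secret 32   # a shorter value works for the LiveView signing_salt
```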

Back to keybored_ui and run the following:

shell keybored_ui/
export MIX_ENV=dev
export MIX_TARGET=host
mix deps.get
mix assets.deploy

This should build the static assets we need for the firmware project. In keybored_firmware we run the following snippet. You can prepare by putting the SD card in the card reader. Please note the MIX_TARGET and match it to your intended hardware device:

shell keybored_firmware/
export MIX_ENV=dev
export MIX_TARGET=rpi0 # or rpi3a, rpi4 or what matches your hardware
mix deps.get # Downloads the appropriate system
mix firmware # Compiles the firmware
mix firmware.burn # Burn to the card, if you have multiple devices it asks

After burning, shove the card in the Pi. Give it power and within a little while it should get on the configured WiFi, or show up via USB tethering if you went that route. You should find it at http://nerves.local, which works on everything except Android devices because … boooh. If something went wrong, tethering is your best bet. You can reach the device via ssh nerves.local if you have a network connection to it. Otherwise, reburn? I don’t know, troubleshooting is out of scope. Poke me in the #nerves channel on the Elixir Slack if you want.

If you plug a keyboard, mouse or other input device into this Pi it should let you steer the dot via arrows or pointer events, show the events captured and give you a sense of what LiveView could do for you. If you want to make code changes, that’s surprisingly simple. Edit the code.

shell keybored_firmware/
mix firmware # Recompile
mix firmware.gen.script # Generate an upload script (only necessary once)
./upload.sh nerves.local # Upload new firmware over SSH

I think that’s it!

Try LiveView, try Nerves, try ‘em together. I enjoy them both immensely. Knock yourself out. If you want more updates about Nerves, get the Nerves Newsletter. If you want more of my shenanigans, get my weekly no-tracking newsletter.

If you have questions, thoughts or more of a comment really, let me know at or on Twitter where I’m @lawik.