Case Study: Inky - An Elixir library

Underjord is a tiny, wholesome team doing Elixir consulting and contract work. If you like the writing you should really try the code. See our services for more information.

This is a post covering the creation and refinement of an open source project within the Elixir ecosystem. More words than code. Be warned.

I bought* a display from Pimoroni called the Inky PHAT, the red version. It is an eInk display. That means it can show an image even without power being supplied to the screen. In this case it can show black, white and red. It refreshes very slowly (12 seconds or more) but is a cool and potentially useful piece of tech.

* I got it as a replacement after buying another eInk device that was dead on arrival and when I got in touch it was no longer in stock. Fair enough.

I tried it on Raspbian to make sure it worked. But to be honest, most of my interest in the Raspberry Pi tends towards using Nerves these days. So I wanted to get it working there. The Pimoroni libraries for Inky are in Python. Getting Python going on Nerves is definitely feasible, but I felt like trying to port the library to Elixir instead to avoid pulling in all of Python. Inky isn't a huge code base. A good learning experience, I figured.

I'll try to cover the steps the library has gone through.

  • Get it working - A straight port from Python
  • Cleaning it up - Initial refactor, a bit more Elixir
  • Make it pretty? - Refactoring towards testability
  • A use-case, the Scenic Driver
  • Make it beautiful - Isolating state in a GenServer
  • A curiosity - The host development library
  • Publishing
  • What's next?

Get it working - A straight port from Python

GitHub reference: Specific commit

Getting it actually working took a bit of work. I hadn't had a chance to do much Elixir previously, and I had no real specification for the hardware aside from the Python library. I had to understand what the Python library did, down to the finicky details, to actually get pixels, correctly aligned, to show up on the screen.

The ElixirALE and, later, Circuits libraries were quite nice to work with, both for GPIO and SPI. So no problems there.

The Python library is object-oriented in style and simply provides a class that you instantiate with some configuration for your specific device (it also supports the wHAT form factor and the yellow variant in addition to red). So I started by making a straight port that worked quite close to what the original class did. Inky.setup takes some configuration and generates a state struct. The state struct is then passed around for all further calls.

The flow of the application was entirely in line with the Pimoroni Inky version I worked from at the time. Same level of abstractions, same binary magic numbers/commands in the long Inky.update. I stripped out some things to try to isolate some problems I had (anything touching the wHAT or yellow displays was removed).

Testing was done by running on hardware and printing different colored stripes on the display and hoping that they would line up correctly. After a while they did. Finally.

This is as far as you would need to go to just get your thing working, and it is perfectly valid to stop there. If you do, give a shout and a GitHub link in the relevant channel on the elixir-lang Slack and you've increased the chance of someone else getting value out of your work.

Cleaning it up - Initial refactor, a bit more Elixir

GitHub reference: https://github.com/pappersverk/inky/tree/0.1/lib

A friend I made at ElixirConf EU (2019, Prague) offered to review the code once I had it working, and I got some good feedback there. The focus became readability and breaking the incredibly specific sequences of magic numbers out into discrete commands.

The setup function went from this:

elixir
  def setup(state \\ nil, type, luts_color)
      when type in [:phat, :what] and luts_color in [:black, :red, :yellow] do
    state =
      case state do
        %State{} ->
          state

        nil ->
          {:ok, dc_pid} = GPIO.open(@dc_pin, :output)
          {:ok, reset_pid} = GPIO.open(@reset_pin, :output)
          {:ok, busy_pid} = GPIO.open(@busy_pin, :input)
          # GPIO.write(gpio_pid, 1)
          {:ok, spi_pid} = SPI.open("spidev0." <> to_string(@cs0_pin), speed_hz: 488_000)
          # Use binary pattern matching to pull out the ADC counts (low 10 bits)
          # <<_::size(6), counts::size(10)>> = SPI.transfer(spi_pid, <<0x78, 0x00>>)
          %State{
            dc_pid: dc_pid,
            reset_pid: reset_pid,
            busy_pid: busy_pid,
            spi_pid: spi_pid,
            color: luts_color
          }
      end

    GPIO.write(state.reset_pid, 0)
    :timer.sleep(100)
    GPIO.write(state.reset_pid, 1)
    :timer.sleep(100)

    state =
      case type do
        :phat -> InkyPhat.update_state(state)
        :what -> InkyWhat.update_state(state)
      end

    soft_reset(state)
    busy_wait(state)
    state
  end

And was changed into this:

elixir
  def setup(state \\ nil, type, luts_color)
      when type in [:phat, :what] and luts_color in [:black, :red, :yellow] do
    state
    |> init_state(luts_color)
    |> init_reset
    |> init_type(type)
    |> setup_derived_values
    |> soft_reset
    |> busy_wait
  end

Which felt good and was easier to follow.

Inky.update was a monster that looked like this:

elixir
  defp update(state, buffer_a, buffer_b) do
    setup(state, state.type, state.color)

    ## Straight ported from python library, I know very little what I'm doing here

    # little endian, unsigned short
    packed_height = [
      :binary.encode_unsigned(Enum.fetch!(state.resolution_data, 1), :little),
      <<0x00>>
    ]

    # Skipped map ord thing for packed_height..
    # IO.puts("Starting to send shit..")

    # Set analog block control
    # IO.inspect("# Set analog block control")
    send_command(state, 0x74, 0x54)

    # Set digital block control
    # IO.inspect("# Set digital block control")
    send_command(state, 0x7E, 0x3B)

    # Gate setting
    # IO.inspect("# Gate setting")
    send_command(state, 0x01, :binary.list_to_bin(packed_height ++ [0x00]))

    # Gate driving voltage
    # IO.inspect("# Gate driving voltage")
    send_command(state, 0x03, [0b10000, 0b0001])

    # Dummy line period
    # IO.inspect("# Dummy line period")
    send_command(state, 0x3A, 0x07)

    # Gate line width
    # IO.inspect("# Gate line width")
    send_command(state, 0x3B, 0x04)

    # Data entry mode setting 0x03 = X/Y increment
    # IO.inspect("# Data entry mode setting 0x03 = X/Y increment")
    send_command(state, 0x11, 0x03)

    # Power on
    # IO.inspect("# Power on")
    send_command(state, 0x04)

    # VCOM Register, 0x3c = -1.5v?
    # IO.inspect("# VCOM Register, 0x3c = -1.5v?")
    send_command(state, 0x2C, 0x3C)
    send_command(state, 0x3C, 0x00)

    # Always black border
    # IO.inspect("# Always black border")
    send_command(state, 0x3C, 0x00)

    # Set voltage of VSH and VSL on Yellow device
    if state.color == :yellow do
      send_command(state, 0x04, 0x07)
    end

    # Set LUTs
    # IO.inspect("# Set LUTs")
    send_command(state, 0x32, get_luts(state.color))

    # Set RAM X Start/End
    # IO.inspect("# Set RAM X Start/End")
    send_command(state, 0x44, :binary.list_to_bin([0x00, trunc(state.columns / 8) - 1]))

    # Set RAM Y Start/End
    # IO.inspect("# Set RAM Y Start/End")
    send_command(state, 0x45, :binary.list_to_bin([0x00, 0x00] ++ packed_height))

    # 0x24 == RAM B/W, 0x26 == RAM Red/Yellow/etc
    for data <- [{0x24, buffer_a}, {0x26, buffer_b}] do
      {cmd, buffer} = data

      # Set RAM X Pointer start
      # IO.inspect("# Set RAM X Pointer start")
      send_command(state, 0x4E, 0x00)

      # Set RAM Y Pointer start
      # IO.inspect("# Set RAM Y Pointer start")
      send_command(state, 0x4F, <<0x00, 0x00>>)
      # IO.inspect("# Buffer thing")
      send_command(state, cmd, buffer)
    end

    # Display Update Sequence
    # IO.inspect("# Display Update Sequence")
    send_command(state, 0x22, 0xC7)

    # Trigger Display Update
    # IO.inspect("# Trigger Display Update")
    send_command(state, 0x20)

    # Wait Before Deep Sleep
    :timer.sleep(50)
    busy_wait(state)

    # Enter Deep Sleep
    # IO.inspect("# Enter deep sleep")
    send_command(state, 0x10, 0x01)
  end

After the refactor, breaking out the different commands into separate functions, we had:

elixir
  defp update(state, buffer_a, buffer_b) do
    state
    |> setup(state.type, state.color)
    |> set_analog_block_control
    |> set_digital_block_control
    |> set_gate
    |> set_gate_driving_voltage
    |> dummy_line_period
    |> set_gate_line_width
    |> set_data_entry_mode
    |> power_on
    |> vcom_register
    |> set_border_color
    |> configure_if_yellow
    |> set_luts
    |> set_dimensions
    |> push_pixel_data_to_device(buffer_a, buffer_b)
    |> display_update_sequence
    |> trigger_display_update
    |> wait_before_sleep
    |> deep_sleep
  end

I found it gave a significantly clearer overview of the sequence of events. There was also a large amount of pure spring-cleaning: removing printing that was no longer relevant and comments that had become redundant.

The larger structure of the library remained roughly the same. But it became quite a bit more readable using the Elixir pipe syntax and by breaking the whole thing up into functions that clarified different purposes.

Make it pretty? - Refactoring towards testability

GitHub reference: https://github.com/pappersverk/inky/tree/0.2/lib/inky (and onward through 0.3)

In the tags 0.2 and 0.3 we can see that Nyaray joined the effort. We'd spoken in the #nerves channel on the elixir-lang Slack and also met up during Code BEAM STO (2019, Stockholm), after which he gave me some help getting things in order. Then, increasingly, he brought his FP knowledge to bear on this thing.

What we started to do was break the command logic apart from the SPI/GPIO calls. The goal was to be able to test things without the device and get some protection against regressions. The library was fully functional, but relatively small changes risked screwing things up, and verifying on hardware every time takes a lot of time.

This also gave birth to the idea in my head that we might be able to give the developer a host development option. But how could we do that in the best way, with minimal dependencies? I'd heard there was a UI lib in Erlang. Was it wx? Let's put a pin in that.

Along the way there was less passing around of the entire state; instead we would pass just the relevant data, making functions easier to reason about. We also spent some time stripping out the vestiges of a default Elixir application to make it into a proper library, something you can include in other things. My little library grew up.

Anyway, with time we (well, Nyaray) even got the tests in there. A pluggable IO module is used in the tests instead of the actual hardware, allowing them to run fast on any machine. And then we added CircleCI to the repository. Glorious.
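The pluggable-IO pattern is worth sketching. Below is a minimal, hypothetical version of the idea, not Inky's actual modules (the real module and callback names differ): driver code depends on a behaviour, and tests plug in an implementation that records calls instead of touching SPI/GPIO.

```elixir
# Hypothetical sketch of a pluggable IO layer; Inky's real module and
# callback names differ.
defmodule DisplayIO do
  @callback write_command(io_state :: term, command :: byte) :: term
  @callback sleep(io_state :: term, ms :: non_neg_integer) :: term
end

defmodule RecordingIO do
  @behaviour DisplayIO

  # For tests: accumulate every call so assertions can check the
  # exact command sequence, no hardware required.
  @impl true
  def write_command(calls, command), do: [{:command, command} | calls]

  @impl true
  def sleep(calls, ms), do: [{:sleep, ms} | calls]
end
```

Driver functions then take the IO module as part of their state, so swapping real hardware for the recorder is a one-line change in test setup.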

A use-case, the Scenic Driver

We've been quite curious about the Scenic UI framework for a while. Both of us have done some hobby project work with Scenic and want to do more. For a while I'd been considering using Scenic to render to the Inky, mostly to avoid rendering text myself.

For Scenic to render to a display it needs a driver. Implementing a driver usually means rendering the resolved Scenic graph to some kind of OpenGL subsystem. But there is already a driver for the Raspberry Pi. It is intended for the official Pi touchscreen display, but it generates a framebuffer at /dev/fb0. Those are useful.
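A framebuffer is just raw pixel data, so consuming one mostly comes down to binary pattern matching. As an illustration only (the actual pixel format depends on how the Pi is configured; the 16-bit RGB565 layout and module name here are assumptions, not the driver's code), decoding could look like:

```elixir
defmodule Fb565 do
  # Decode little-endian RGB565 pixels from a framebuffer binary into
  # {r, g, b} tuples scaled up to the 0..255 range.
  def decode(<<pixel::little-16, rest::binary>>) do
    # Repack the integer so the bit fields line up: 5 red, 6 green, 5 blue
    <<r::5, g::6, b::5>> = <<pixel::16>>
    [{r * 8, g * 4, b * 8} | decode(rest)]
  end

  def decode(<<>>), do: []
end
```

On device you would feed it the contents of /dev/fb0; in a test you can feed it a binary literal.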

Sidebar: I attended the Nerves training session at ElixirConf EU where Justin Schneck (@mobileoverlord) ran us through a rather in-depth game project using Scenic, the Adafruit OLED Bonnet, Nerves and NervesHub. They made a simple Scenic driver for the OLED Bonnet's black-and-white display, mostly to avoid having to render text on their own but also for doing UI in general.

Taking a look at their driver, it was fairly straightforward to remove some input-related code and use our Inky library to send data to our device instead of the OLED screen.

The most challenging thing was managing pixel color, which the OLED driver bypassed entirely. But I wanted the accent color to work as intended. Some pixels I expected to be white were off-white, and not all blacks were equal; there was some blending and some anti-aliasing. But with some thresholds it worked. The display could show text. It can probably render images (badly) as well, we just haven't tried it. From some discussion with Frank Hunleth of Nerves fame, I have the feeling that the odd colors are probably related to some kind of color adjustments being done near or on the GPU before the framebuffer is provided to us.
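The thresholding idea itself is simple. Here is an illustrative sketch with made-up cutoff values, not the driver's actual logic: each RGB pixel gets snapped to one of the three colors the display can show.

```elixir
defmodule PixelThreshold do
  # Map an {r, g, b} tuple (channels 0..255) onto the three colors the
  # display supports. The cutoff values are arbitrary assumptions.
  def classify({r, g, b}) do
    cond do
      # Strongly red pixels become the accent color
      r > 128 and g < 100 and b < 100 -> :accent
      # Bright pixels, including off-white ones, snap to white
      (r + g + b) / 3 > 128 -> :white
      # Everything else, including not-quite-black, becomes black
      true -> :black
    end
  end
end
```

Anti-aliased edges get forced to one side or the other this way, which is exactly the "with some thresholds it worked" behavior described above.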

I've even added some configuration where you can let the driver automatically create shading by dithering some of the in-between colors. Anyway, the driver works. We can now render UI, text and such to the Inky display. With Scenic that means you can even calculate what text size you would need to be able to fit text on the display and such, using the font metrics the framework provides. The possibilities are pretty much endless.

You can find the driver here:

GitHub project: https://github.com/pappersverk/scenic_driver_inky

GitHub sample application: https://github.com/lawik/sample_scenic_inky

Make it beautiful - Isolating state in a GenServer

We'd been planning since way back to put the entire thing in a GenServer. There is no reason why your application should need to know about the display state of your Inky or care about the internal operations on that state. In the latest set of changes, Nyaray brought the library into a GenServer after some lively discussion to settle on an API. It now exposes start_link, set_pixels, show and stop.
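To show why that shape is nice, here is a toy GenServer in the same spirit. This is not Inky's implementation, just an illustration: callers get a tiny API while the pixel buffer stays private inside the process.

```elixir
defmodule ToyDisplay do
  use GenServer

  # Public API: the caller never sees the buffer itself.
  def start_link(opts \\ []), do: GenServer.start_link(__MODULE__, opts)
  def set_pixels(pid, pixels), do: GenServer.call(pid, {:set_pixels, pixels})
  def show(pid), do: GenServer.call(pid, :show)
  def stop(pid), do: GenServer.stop(pid)

  @impl true
  def init(_opts), do: {:ok, %{pixels: %{}, pushed: 0}}

  @impl true
  def handle_call({:set_pixels, pixels}, _from, state) do
    # Merge the new pixels into the private buffer
    {:reply, :ok, %{state | pixels: Map.merge(state.pixels, pixels)}}
  end

  def handle_call(:show, _from, state) do
    # The real library would push the buffer to the device here
    {:reply, :ok, %{state | pushed: state.pushed + 1}}
  end
end
```

All the device state lives in one process, and your application only ever calls the four public functions.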

This has included more work on internal separation where the Commands module has been replaced by a pluggable HAL (Hardware Abstraction Layer) which allows another layer of tests along with some other niceties.

This is likely to be the API for Inky for the foreseeable future. And it will allow us to keep the library well covered by tests, making maintenance easier.

A curiosity - The host development library

GitHub: https://github.com/pappersverk/inky_host_dev

I mentioned wx; let's pull that pin back out and take a look. wx is an Erlang standard library thing where we have access to wxWidgets by default for native cross-platform GUI creation.

I decided to use it to make a simulation of the Inky for development on your "host", that is, your developer machine. This means you get a good-enough facsimile to work against while trying to convince your Inky to do your bidding, without having to push firmware to the device constantly, since that is rather time-consuming even with all the conveniences of Nerves. It is a separate dependency that you can add for development only. It was a fun thing to make and hopefully it ends up useful to someone. I was quite pleased to be able to do this using the standard library. I imagine I'll revisit wx at some point. I'm a sucker for pain.

Note: if you are using Scenic anyway, you already have rendering on the host with the glfw driver. That driver is significantly more capable than the Inky display, but it is what you'd use to get everything else that Scenic offers on the host. The Inky Scenic driver will not work on your host machine since it is specific to the Raspberry Pi environment.

Publishing

We matched up our libraries, ran our tests, made sure everything was good and then went through the mild process of publishing packages to Hex. It was very straightforward. My only real issue was a brief confusion about whether we needed an organization or not. We didn't. Other than that, smooth sailing.
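For reference, the Hex-specific part of publishing is mostly a package entry in mix.exs plus a `mix hex.publish`. The excerpt below is illustrative; the names, version and metadata are placeholders, not Inky's actual configuration.

```elixir
# Illustrative mix.exs excerpt; :my_lib and the metadata values are
# placeholders, not the real Inky configuration.
defmodule MyLib.MixProject do
  use Mix.Project

  def project do
    [
      app: :my_lib,
      version: "0.1.0",
      elixir: "~> 1.9",
      description: "A library for driving an eInk display",
      package: package(),
      deps: []
    ]
  end

  # Hex reads license and link metadata from this keyword list
  defp package do
    [
      licenses: ["Apache-2.0"],
      links: %{"GitHub" => "https://github.com/example/my_lib"}
    ]
  end
end
```

With that in place, `mix hex.publish` validates the metadata and pushes the package (and the docs, if you have ex_doc configured).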

We've created an org called pappersverk where we gather this stuff and possibly some other projects. Keep an eye out.

What's next?

Performance optimizations

Nyaray has been going wild on benchmarking, measuring and optimizing some of our slower code paths lately, so I imagine that'll be in a release soon. The hardware remains quite slow. The best thing you can do speed-wise is to run it in a black-and-white configuration to avoid the time-consuming accent color. I think it cuts 5-10 seconds from the visible refresh time, though I haven't tried it recently.

Community contributions

One community member got in touch because his wHAT wasn't working. Since neither I nor Nyaray have that hardware, this was an incredible help in ironing out the issues. I had simply removed too much code when building the first version and it never made its way back in. It should now work with yellow devices and wHAT form factors. Last I saw, these fixes were being merged, so they will probably be ready for the next release.

Version updates and regressions

Another community member got in touch about some hard errors with the Scenic driver. It turns out things broke with Elixir 1.9 and Nerves 1.5. We tracked down an upstream issue with the framebuffer capture module, which led to Frank Hunleth tracking down some missing functionality in the RPi firmware version that Nerves was shipping.

Final thoughts

I must say it has been incredibly cool to hear about people using our library. Someone is experimenting with using it for a veterinarian system where they need signs with the names of the animals and such (you can find the post on the Elixir Forum). And people have been getting in touch to let us know about problems, helping us verify the issues and drive towards a resolution. That has been very satisfying.

My post with the guide about this library also hit the Hacker News front page and went here and there on the Internet, which lends some weight to my theory that a lot of people are paying attention to Elixir right now. Or people love eInk displays. Or both?

Thanks for reading about the process and let me know if you have any particular thoughts, corrections or feedback at lars@underjord.io. I'm also open to talking business if you need some help with some of the things I do.

Underjord is a four-person team doing Elixir consulting and contract work. If you like the writing, you should really try the code. See our services for more information.

Note: Or try the videos on the YouTube channel.