"I will stone you, stone you.."

Did you know that I'm putting together a conference? Probably. Did you know that the waiting list of people wanting to be kept updated on the progress is now almost as full as the conference itself can get? We are planning for a maximum of 150 people, and the list of interested people willing to hand us an email is at 123, just a month after announcing. I could not be more pleased. And I am sweating mildly. This is gonna be fun.

The only official Elixir and Nerves swag is available for pre-order/order at oswag.org. Now shipping with stickers :)

How Stuff Gets Done (sometimes), In Open Source

I won't hide the point. Relationships. It is always relationships. So let's try to unpack the journey of our fledgling support for the Raspberry Pi AI Kit, which has just reached a very nice technical milestone.

Every effort starts somewhere. In the Nerves project, the Raspberry Pi has had support for a very long time, and significant effort has been put in by core team members and community contributors despite it making up a fairly small part of the commercial use of Nerves. The Raspberry Pi has found a place in industry and consumer devices, but there are many alternative chips. The support for Raspberry Pi in Nerves was primarily in service to hobbyists and enthusiasts. That also serves as a good entry point and helps with adoption, so it serves many purposes.

Raspberry Pi doesn't usually release all that much hardware. A few cameras over the years, and their camera tooling has been a bit of a journey. More recently they've been on a tear. The devices that need support from software are the AI Kit (the Hailo 8L is the part), the AI Camera (a Sony accelerator plus a Pi chip) and the AI HAT+ (the Hailo 8, not the 8L). Prior to this, the easy-to-use accelerator story started and ended with the Coral TPU, a Google technology you could get as a USB dongle.
It is usefully fast but also getting a bit old, and it hasn't seen any revisions. It performs at 4 TOPS, while the Hailo 8L does 13 TOPS and the Hailo 8 doubles that at 26 TOPS. TOPS being an incredibly scientific and not at all gameable measurement... Anyway.

With hype being what it is, and experimentation with ML being quite fun, we certainly want people to be able to use their brand new AI Kit with their brand new Pi 5 (or 4, I believe). What will it take to support it? Surely Hailo must provide a Linux driver that exposes a standard accelerator-class interface to be accessed over DRM and Vulkan? Of course not. That's for suckers. Apparently. They provide a PCIe driver to talk to the thing, and then I suppose the firmware is closed blobs. On top of that they have a whole suite of custom tooling for building models in their proprietary model format. They ship a bunch of models for easy integration via Python with GStreamer and whatnot. And to their credit, they provide open source bindings at the C/C++ level. So that's a path for a NIF and using it from Elixir. There is also a CLI tool that can at least be used to test that things are working.

First order of business. Someone in the Elixir Slack asked about the AI Kit, and Frank mentioned that the output of "lsmod" from a Raspberry Pi OS install with the AI Kit operational would be good for getting an idea of the hardware/driver needs. So I grabbed my newly acquired kit and performed the song and dance. The only output of relevance was "hailo_pci".

Various people expressed interest, and as part of my efforts to snipe the best nerd for the job I sent Cocoa Xu a Pi 5 with an AI Kit. She was, reasonably enough, busy with actually having a job, moving and helping explain cc_precompiled to me. The effort sat still for a minute. Then Gus Workman expressed some interest and I gently shoved him at the problem.
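If you want to repeat that check on your own device, it is a one-liner. The sketch below filters a captured lsmod sample rather than running against real hardware; the module sizes and neighboring modules are illustrative stand-ins, but "hailo_pci" is the module name as seen above.

```shell
# On a Raspberry Pi OS install with the AI Kit working, you would run:
#   lsmod | grep hailo
# Here we grep a captured (abbreviated, illustrative) lsmod sample to show
# the one relevant line: the hailo_pci kernel module.
sample='Module                  Size  Used by
snd_soc_hdmi_codec     24576  1
hailo_pci              98304  0
v3d                   106496  2'
echo "$sample" | grep hailo
```

On a real device, `modinfo hailo_pci` will also tell you the driver version, which turns out to matter later in this story.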
He found that Buildroot has integrated support for the kernel driver and started to look at building the "hailort" repository, which would provide the CLI tool and the bindings. He got a custom package for Buildroot set up but had some trouble with the build. If someone knows how to fix a build of way too much C/C++, it is Cocoa. At the same time, Paulo Valente and other folks expressed interest in doing things on the AI Kit with Elixir. Poking these various people, I finally pulled them together into a group DM. Conspiracy happened over at the Erlang Ecosystem Foundation's Slack.

Cocoa, bless her, hacked away at the build problems based on Gus' foundation. Then, when it finally built and ran without missing symbols, there were some version mismatches between Buildroot's driver and the runtime tool, we didn't have the firmware blob in place, and a few things like that. I spent a little time wrapping that up. And we could successfully run the CLI tool with a model for the first time.

That's where things stand right now. There is a lot more to do in terms of packaging this up, providing some useful documentation and figuring out good ways to run inference without leaving Elixir. Whether people want to build NIF bindings or start on an Nx backend for this remains to be seen. If we can get Ortex to build under Nerves we could use the Hailo ONNX Execution Provider as well (if you know Rust tooling and cross-compilation, please help, it may be easy).

What made this happen was shared enthusiasm and optimistic collaboration. No one had to do this; the strongest drivers for people were to make some conference talks happen. And people come and go, fade in and out. Occasionally someone gives a holler or needs help. My major contribution was that I really wanted it to work and was willing to bother people about it and connect the people who were going at it independently. Maybe they all just did it to put a smile on my face. I doubt that was it, but it did do that.
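A driver/runtime version mismatch like the one above is easy to spot once you know to check for it. This is a hedged sketch of that comparison; the version strings are made up, and on a device you would pull them from something like `modinfo hailo_pci` and the runtime tool's version output rather than hardcoding them.

```shell
# Hypothetical version strings for illustration only. On-device you would
# obtain these from the kernel module info and the hailort CLI, respectively.
driver_ver="4.17.0"
runtime_ver="4.18.0"

# The kernel driver and the userspace runtime generally need to agree,
# so flag any difference loudly.
if [ "$driver_ver" = "$runtime_ver" ]; then
  echo "driver and runtime match ($driver_ver)"
else
  echo "mismatch: driver $driver_ver vs runtime $runtime_ver"
fi
```

In the Buildroot case the fix is to pin the kernel driver package and the hailort package to matching releases, which is part of what the wrapping-up work involved.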
Some things get built because they are needed at someone's job and can be released in the open. Other things get done because someone has a grand vision and tries to build The Thing. Then there are some that are just put together ad hoc by roving bands of developers. This was that. Connecting over something that we want to make work.

If you are curious about any of this, or if you can fix Ortex for me, you can reach me on the Fediverse where I'm @lawik@fosstodon.org, or by responding to this email to lars@underjord.io.

Thank you for reading. I appreciate it.