LLMs suck for optimists

I find that they can do useful things. They can perform some really tedious work with little effort. They could enable mind-boggling levels of new accessibility in tech. They could help us all wrangle unstructured data like human language.

Being excited about LLMs or "AI" is not an option. For me. It may be an option for you, but I'd argue it shouldn't be. This technology is brought to us by the people, the companies, who've repeatedly shown that they don't have an ounce of moral fiber. Where disruption is the game. Break the bone just to see if we can profit from making it heal differently. Better is not the goal. Cheaper is sometimes the goal. Profitable is sometimes the goal. But generally it is disruption for disruption's sake. We know we can't trust their intentions. Unless you are some kind of market-liberal type of person, in which case I guess terrible is great, actually. I personally prefer living in a society.

The technology is not mean and does not have bad intent, because it is not sentient. It is full of bad bias because it reflects the output of humanity. A deeply imperfect tool. The people behind the major companies working on it are manipulative, extractive, and playing big money and influence games. You should distrust them deeply. And they will not keep releasing competitive open models forever, and we can't afford to train our own.

I will learn to use the tool where I find it sensible, because being bad at things is not part of my job description and closing my eyes has never prevented the world from changing. It is unclear if the tech markets will recover from the current downturn. A pendulum swing would require the ability to move in the other direction. Let's see:
When a dev team is straining under stress and pressure to ship, they do worse work. Corners get cut, mistakes get made, important matters are left unconsidered. Most of the time, when faced with pressure, we sacrifice something of ourselves to achieve results. So a lot of companies will feel like their new strategy of having fewer devs is working just as well as the larger team did. Because they can't judge software quality, and they don't see, or care about, the human cost.

This is what I think LLMs will cause in development. And in the wider world. Slop. Finding information will be even worse, because providing something that looks like information is now as good as providing actual information. There are no consequences for massive accidental misinformation in the service of serving ads. Entire new world wide webs will be generated, of completely unclear provenance and at very little cost. Customer support will be worse than it has ever been. Product descriptions will become hilariously bad, except we won't find it hilarious because we'll be the ones trying to read them. You will end up using products in your work that someone spent a weekend pretending they could build, and which they then sold to your boss.

I like to be optimistic. I like to be curious and excited about tech. I think LLMs are interesting and cool as a technology. It is wild what they've managed to create. And I'll deal with the world it brings, because I have no choice in that. But I am not optimistic about the consequences and outcomes. Things becoming cheaper to do, but with worse quality, does not trickle down the way one might hope. We get the worse and they get the cheaper.

---

Have a good weekend. Talk to some humans. Consult an encyclopedia. Enjoy something that was carefully crafted. Put your fingers in the dirt, if you like that sort of thing. I'm going to work on building our greenhouse and put in extra hours trying to gather about 180 of my peers in person.

Thanks for reading. I appreciate it.
September 10-12, Varberg, Sweden. An Elixir conference that is just a little bit different, featuring the first ever NervesConf EU. Check it out at goatmire.com.

The officially blessed Elixir and Nerves shirts are ready; you can buy them at oswag.org, our little shirt operation.