
Lyra: Fast, In-memory, Full-text Search Engine

This interview is part of our OSS Spotlight series, where we showcase founders of fast-growing open-source projects that are solving genuinely unique problems and seeing strong community adoption.

Sudip Chakrabarti spoke to Michele Riva, creator of Lyra, a fast, in-memory, typo-tolerant, full-text search engine.

Michele shared with us his motivation behind creating Lyra and how he is keeping up with the project's rapid growth.

Tell us about yourself. What did you do before, and what was your journey until you started working on Lyra?

Michele: I have had a nontraditional start to my career. I chose not to go to college; instead, I attended an experimental high school in Italy - the only one of its kind in the country - that was dedicated to teaching practical applications. For the first two years, I studied diverse subjects such as cinema and photography, but I specialized in IT for the final three years. My first job out of high school was in Florence, where I worked on a Java-based CMS and learned a lot about the intricacies of programming. Since then I have built my expertise as a JavaScript developer working on all aspects of the Jamstack, and I have become a Google Developer Expert and a Microsoft MVP. I am currently working as a Senior Software Architect at NearForm. One fun fact about me is that I am also the author of “Real-World Next.js,” a book on how to build scalable, high-performance, modern web applications.

Michele being interviewed in his senior year of high school about a mobile app he had built to help people with ADHD stay focused while studying by listening to binaural beats

Tell us about your motivation to create Lyra. Did the world need another full-text search engine after Elastic?

Michele: I started working on Lyra somewhat accidentally. A few months ago, I was invited to give a talk on search engines at the WeAreDevelopers conference - one of the largest developer conferences in the world. In the past, I had worked with search engines like Elastic, Algolia, and a few of the newer, alternative full-text search engines. However, I have always wondered why all the major full-text search engines are difficult to deploy and scale, and why their footprints are way too heavy to run on the edge. Channeling my inner Richard Feynman here - “what I cannot create, I do not understand” - I decided to build a full-text search engine of my own to see if I could really build something that was highly performant, very easy to scale, and yet tiny enough to run on the edge. That’s how Lyra was born. I named it after the Lyra constellation due to its distributed and highly scalable nature.

Tell us some of the history of the project. When did you start it, what were the early days like? What made you decide to open-source the project?

Michele: In the beginning, Lyra was a very simple project written in JavaScript and was only a tiny fraction of what it is today. As soon as I realized the potential of the project based on some early user feedback, I teamed up with Paolo Insogna (a member of the Node.js core team and an expert in JavaScript performance) and rewrote most of the code over a few weeks. We released the first stable version in mid-August. In the first two days, the project got more than 1,000 stars on GitHub, and more than 3,000 stars in the first week! It was a bit crazy considering there were already several other search engines out there, but also understandable since none of those can run on the edge.

The decision to open source Lyra was a no-brainer. Since I do not have a college degree, whatever I have learnt as a developer I have learnt from the open-source community. So, I feel a strong sense of responsibility to give back to the community as much as I can. Lyra started as a way for me to teach others algorithms and data structure optimizations. It is my way of giving back to everyone who comes after me.

Metrics showing the growth of the Lyra open-source project (via the Decibel OSS Terminal)

What problem are you solving with Lyra? Why would someone use Lyra and not one of the other search engines out there?

Michele: Lyra is a modern, dependency-free, full-text search engine written in TypeScript. It has been built with speed in mind and completes most search lookups in microseconds. I built Lyra with the main goal of being able to run it on edge networks, such as AWS Lambda@Edge, Cloudflare Workers, and Netlify Functions. Lyra is really good for applications that need low latency and fast response times. For example, if you are traveling by car and switching from one cell tower to another, having low latency on the edge makes a world of difference to your user experience. Lyra is built to deliver that. Also, Lyra requires essentially no management when it comes to scaling: it scales with the infrastructure automatically. This is a huge advantage, especially on edge networks, where individual edge nodes can be ephemeral.
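To make that concrete, here is a minimal sketch of what indexing and searching with Lyra looks like. It assumes the early @lyrasearch/lyra package and its create/insert/search functions; import paths and signatures have changed across versions (newer releases are asynchronous), so treat the details as illustrative rather than authoritative.

```typescript
// Minimal sketch of Lyra usage, based on the early @lyrasearch/lyra API.
// Exact package name and signatures may differ between versions.
import { create, insert, search } from "@lyrasearch/lyra";

// Create an in-memory index by declaring the searchable document schema.
const db = create({
  schema: {
    title: "string",
    author: "string",
  },
});

// Insert a couple of documents; everything lives in memory.
insert(db, { title: "The Lyra constellation", author: "Michele Riva" });
insert(db, { title: "Full-text search on the edge", author: "Paolo Insogna" });

// Run a full-text search across all indexed properties.
const results = search(db, {
  term: "search",
  properties: "*",
});

console.log(results.hits); // matching documents; lookups typically complete in microseconds
```

Because the index is a plain, dependency-free, in-memory data structure, the same snippet can in principle run unchanged inside a Cloudflare Worker, a Lambda@Edge handler, or a browser.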

Is there something unique about Lyra’s design that you would like to highlight for other open-source creators?

Michele: While designing Lyra, we spent a lot of time thinking about how we could make Lyra handle both greenfield and brownfield use cases, and we made several design decisions that I feel are greatly benefiting us now. For example, Lyra supports every single JavaScript runtime out there - Node.js, Deno, Bun, web browsers, V8 isolates, you name it. This allows a user to keep their existing infrastructure - say, on AWS - deploy their edge workload on a different edge infrastructure - say, Cloudflare - and have Lyra be the glue between the two. I believe that Lyra’s ability to serve brownfield use cases is greatly helping its adoption because users do not have to replace their existing infrastructure to use Lyra. I often see new open-source projects focus primarily on greenfield use cases, which slows down adoption because those projects then have to catch users at the right place at the right time.

What have you learned since launching the project that you wish you had known when you started? What is one thing you would have done differently, knowing what you know now?

Michele: I would’ve started working on adding plug-ins to Lyra much earlier. One example would be a plug-in that persists data so it can be exported in multiple formats (JSON, binary, etc.). Another would be a plug-in that hosts data on AWS S3 so Lyra could scale out indefinitely. These are the plug-ins that a lot of Lyra users - both individuals and organizations - are clamoring for. Had I anticipated how quickly Lyra would catch on, I would have added some of those most commonly requested plug-ins much earlier. That said, the Lyra community is already chipping in and building some of those plug-ins, such as the Fastify integration.
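As a rough illustration of the persistence idea, the sketch below serializes an index snapshot to JSON on disk and restores it. The helper names and the SerializedIndex type are hypothetical stand-ins invented for this example; they are not part of Lyra or its official plug-ins, which may expose a different API and additional formats.

```typescript
// Hypothetical sketch of a persistence plug-in: serialize an in-memory index
// snapshot to JSON so it can be stored (on disk, in S3, etc.) and restored later.
// The helpers and types below are illustrative only, not Lyra's actual plug-in API.
import { writeFile, readFile } from "node:fs/promises";

// Stand-in type for whatever state a search instance exposes for export.
type SerializedIndex = Record<string, unknown>;

async function saveIndexToJSON(index: SerializedIndex, path: string): Promise<void> {
  // Persist the exported index state as a JSON document.
  await writeFile(path, JSON.stringify(index), "utf-8");
}

async function loadIndexFromJSON(path: string): Promise<SerializedIndex> {
  // Read the JSON document back and parse it into index state that could
  // then be handed to a fresh search instance on another node.
  const raw = await readFile(path, "utf-8");
  return JSON.parse(raw) as SerializedIndex;
}
```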

What other OSS projects (besides the usual suspects) do you admire most and why?

Michele: I’d still mention Elastic even though it is a so-called “usual suspect,” because it was the project that made me fall in love with full-text search. Among emerging projects, I am a big fan of Fastify, a popular web development framework for Node.js. Seeing the rate at which people are adopting Fastify and moving away from Express.js, which has historically been a very popular Node.js framework, has been truly inspiring. Also, I believe great people = great products, and Fastify has some of the best developers I know.

What advice would you have for someone who is thinking of starting a new open-source project?

Michele: My only advice is that, if you plan to work in open source, make sure you build what you love and really care about, even if it might not be the most glamorous project. For example, I am essentially reinventing the wheel by building yet another search engine, but I really love the problem space and I am having a lot of fun doing so. Similarly, if you want to contribute to any open-source project, pick one that you deeply care about instead of just making inconsequential contributions to high-profile projects.