Today Pinecone announces its $100 million Series B financing. All of us at Wing congratulate the team on this achievement, and welcome new investor Andreessen Horowitz, and our longtime friend Peter Levine, to the Pinecone family.
It’s been more than four years since I met Pinecone’s founder and CEO Edo Liberty, and 2.5 years since I led the company’s seed financing. You can read the rationale behind that initial investment here; the original logic still holds. Meanwhile, almost everything else in the technology world seems to be changing at breakneck pace. Driving that change, of course, is the stunning emergence of AI.
The power of vectors
Edo deserves tremendous credit for his prescience. In our earliest conversation, he rattled off a list of AI-first workloads (from my notes in January 2019: “search and rank”, “near dup detection,” “media recommendation,” “personalization,” “anomaly detection,” “root cause analysis”) and asserted that “they are all the same under the hood.”
The opportunity he saw was not to make better models, but to run them better in production. I admit—it took me a while to get it. It didn’t help that when I asked Edo what his core technology (the “vector database”) would be exceptionally good for, he shrugged and said, “Pretty much everything, man!”
Turns out he was right.
The vector representation of data, together with the processing it requires, is the lingua franca of AI and truly horizontal in its impact. An ever-expanding array of data, models, and applications benefit mightily from this new information architecture, and Pinecone has emerged as the leading data platform upon which they are built. But as insightful as Edo was, even he could not have predicted the tsunami of adoption unleashed by generative AI.
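The "same under the hood" observation can be made concrete: each of the workloads Edo listed reduces to a nearest-neighbor query over embedding vectors. A minimal sketch (illustrative only, with made-up data and a hand-rolled cosine-similarity search standing in for a real vector database):

```python
import numpy as np

def nearest_neighbors(query, vectors, k=3):
    """Return indices of the k stored vectors most similar to the query (cosine similarity)."""
    q = query / np.linalg.norm(query)
    m = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    scores = m @ q                      # cosine similarity against every stored vector
    return np.argsort(-scores)[:k]      # highest-scoring indices first

# The same primitive serves many AI-first workloads:
# - search/rank:        query = embedded search string, vectors = embedded documents
# - recommendation:     query = a user's taste vector, vectors = catalog items
# - near-dup detection: query = a new item; flag neighbors above a similarity threshold
# - anomaly detection:  a point whose nearest neighbors are all distant is anomalous
rng = np.random.default_rng(0)
corpus = rng.normal(size=(100, 8))              # stand-in for 100 embedded items
query = corpus[42] + 0.01 * rng.normal(size=8)  # a near-duplicate of item 42
print(nearest_neighbors(query, corpus, k=1))    # item 42 ranks first
```

What a production vector database adds on top of this primitive is approximate indexing at scale, filtering, and freshness, but the query shape is the same.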
Generative AI, generative momentum
Generative AI (and the large language models that enable it) has a long research history, but it was the addition of a simple interface that let it burst into popular consciousness.
ChatGPT was not a technical breakthrough, but it has been of profound significance in making AI accessible to users and developers. Since its release, we have experienced a Cambrian explosion of innovation. The number of developers using LLMs in all sorts of creative ways is growing exponentially, and those developers need a data platform of precisely the sort Edo and his team began building more than four years ago.
One of the most interesting aspects of Pinecone’s business is its product-led growth motion. When Wing first invested, there was some debate within the team as to what the most appropriate go-to-market motion might be. Should a direct sales effort be mounted towards major accounts? Or a bottom-up motion emphasizing developer adoption? While both are in play to at least some degree, Pinecone has focused on enabling developers via PLG with meaningful results. The success of this motion bodes well for the pervasiveness of adoption and the ultimate efficiency of the business.
Long-term thinking, and long-term memory for AI
Pinecone’s vision is to be the long-term memory for AI. As sophisticated as today’s LLMs are, they lack a distinct memory system. To quote Edo yet again: “Today’s LLM has attempted to bake all of humanity’s knowledge into the weights of the neural net!”
Model effectiveness and its development can be massively improved by separating the knowledge base from the model itself. Pinecone enables this more modular AI architecture, with important implications for the pace of AI progress and the breadth of its application.
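The separation Edo describes is the pattern now commonly called retrieval augmentation: rather than relying on facts frozen into a model's weights, an application retrieves relevant knowledge from an external store at query time and supplies it to the model as context. A toy sketch of the idea, with a small NumPy-backed class standing in for a vector database (the class, vectors, and texts here are all hypothetical, not Pinecone's actual API):

```python
import numpy as np

class KnowledgeBase:
    """Toy external knowledge base: in production this role is played by a
    vector database; here a NumPy array stands in for it."""

    def __init__(self, dim=4):
        self.vectors = np.empty((0, dim))
        self.texts = []

    def upsert(self, vector, text):
        self.vectors = np.vstack([self.vectors, vector])
        self.texts.append(text)

    def query(self, vector, top_k=2):
        # Cosine similarity against every stored vector; return the best texts.
        m = self.vectors / np.linalg.norm(self.vectors, axis=1, keepdims=True)
        q = vector / np.linalg.norm(vector)
        best = np.argsort(-(m @ q))[:top_k]
        return [self.texts[i] for i in best]

def answer(question_vector, question_text, kb):
    # Retrieve relevant knowledge, then hand it to the model as context,
    # instead of relying on facts baked into the model's weights.
    context = kb.query(question_vector, top_k=1)
    prompt = "Context:\n" + "\n".join(context) + f"\n\nQuestion: {question_text}"
    return prompt  # in a real system this prompt is sent to an LLM

kb = KnowledgeBase()
kb.upsert(np.array([1.0, 0.0, 0.0, 0.0]), "Pinecone is a vector database.")
kb.upsert(np.array([0.0, 1.0, 0.0, 0.0]), "ChatGPT popularized LLMs.")
print(answer(np.array([0.9, 0.1, 0.0, 0.0]), "What is Pinecone?", kb))
```

Because the knowledge lives outside the model, it can be updated instantly and per-customer, without retraining, which is the architectural point of the paragraph above.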
Investing in our AI-first future
I was honored to make a seed investment in Snowflake back in January 2013, and have had the privilege of working closely with its amazing founders and team ever since. There are clear similarities between Snowflake and Pinecone, which is sometimes referred to as the “Snowflake of AI.”
But there are important differences too. Snowflake targeted the large pre-existing market for SQL analytics; Pinecone is catalyzing an entirely new class of AI-first applications. Snowflake unlocked pent-up demand by eliminating the friction of legacy data warehousing. Pinecone enables voracious new demand from developers eager to use AI in first-of-their-kind products.
It’s thrilling to imagine a future built on this new paradigm. There will be a host of new applications built on the new stack featuring the Pinecone data platform and LLM-based AI. That’ll require a new ecosystem of tooling, in some ways mirroring elements of prior data stacks, and in others meeting totally new needs native to the vector / LLM architecture.
Wing is actively investing in opportunities up and down the AI-first technology stack, from infrastructure to applications, and would love to collaborate with founders who are as excited as we are by the possibilities of the AI-first transformation.