Planning for Near-Free AI
Open-source LLMs are thriving. Since the summer of 2023, OS LLMs have rapidly approached human-level performance across a suite of benchmarks.
Even as proprietary frontier models keep improving, open-source (or at least open-access) LLMs continue to close the gap.
Self-hosting LLMs can help protect customer data, guard against unannounced fluctuations in model behavior, and build more defensible technology. Surely, though, this option is complicated and costly? Quite the opposite: the price of running GPT-3-level models is in freefall.
What would you build if GPT-4 were 10x cheaper? What about 100x cheaper? 100x faster?
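To make those multipliers concrete, here is a quick back-of-the-envelope cost sketch. The baseline price and token count below are illustrative assumptions for a typical request, not quoted rates from any provider:

```python
# Illustrative cost arithmetic for a single LLM request.
# All prices and token counts are hypothetical assumptions.
def request_cost(tokens: int, price_per_million_tokens: float) -> float:
    """Cost in dollars for a request of `tokens` tokens."""
    return tokens * price_per_million_tokens / 1_000_000

baseline = 30.0  # assumed $/1M tokens for a frontier model
tokens = 2_000   # assumed prompt + completion size

for factor in (1, 10, 100):
    price = baseline / factor
    print(f"{factor:>3}x cheaper: ${request_cost(tokens, price):.6f} per request")
```

At 100x cheaper, a request that once cost six cents costs six hundredths of a cent, which turns per-request pricing decisions into rounding errors.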
This is not an outlandish vision but a near-inevitable trend. As we hurtle toward a future where human-level intelligence is virtually free and abundant, how can we prepare?
What if every student had a genius-level tutor, every SMB had Fortune 500-level marketing resources, and every online community had Stripe-level engineering? How do we ensure that these trends serve to close digital gaps rather than widen them?
Our future is not written in stone; it's coded in bytes and shaped by every line of code we write, every application we dream up, and every conversation we have about what comes next.
Whether you're looking to streamline existing processes, unlock new levels of personalization, or pioneer entirely new services, we’re happy to chat about what’s possible.