AI Food Fights in the Enterprise
Earlier this year we met with our Limited Partners. Their top question was “will the AI transition destroy your existing cloud companies?”
We began with a strong default of “no.” The classic battle between startups and incumbents is a horse race between startups building distribution and incumbents building product. Can the young companies with coo...
Pat Grady • Generative AI’s Act O1
Unlike consumers, enterprises want control over how their data is used and shared with companies, including the providers of AI software. Enterprises have spent a lot of effort consolidating data from different sources and bringing it in-house (this article Partner integrations + System of Intelligence: Today’s deepest Moat by fellow Medium autho...
AI Startup Trends: Insights from Y Combinator’s Latest Batch
Nicolay Gerold added
But just like the internet, someone will show up later and think about something like Uber and cab driving. Someone else showed up and thought, “hey, I wanna check out my friends on Facebook.” Those end up being huge businesses, and it’s not just going to be one model that OpenAI or Databricks or Anthropic or someone builds, and that model will dom...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
Nicolay Gerold added
Not sure about this one. Just an interesting snippet. I figure there are some reinforcing loops in the data, where the models get better with more data, attracting more users, generating more data. At the same time, I believe there are huge advantages in the knowledge of how to train models and how to manage inference at scale, which makes a huge difference. I do not see anyone catching up to OpenAI at the moment, especially with their new fine-tuning offering.
An interesting factor might be figuring out the right data mix for pre-training and using better screening to weed out unwanted behavior. Whoever can figure that out at scale might have a huge advantage, if they can keep it a secret.
If you made a thousand versions of an LLM, that’s good at a thousand different things, and you have to load each of those into the GPUs and serve them, it becomes very expensive. The big holy grail right now that everybody’s looking for is: are there techniques, where you can just do small modifications where you can get really good results? There...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
Nicolay Gerold added
PEFT in a nutshell.
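To make the “small modifications” idea concrete, below is a minimal, self-contained sketch of LoRA, the best-known PEFT technique: the pretrained weights stay frozen and only two tiny low-rank matrices are trained, so a thousand task-specific variants amount to a thousand small adapters sharing one copy of the base model on the GPUs. The layer size, rank, and scaling here are illustrative placeholders, not something from the quote.

```python
# Minimal LoRA sketch in plain PyTorch: freeze the big weight matrix W and
# learn only a small low-rank update B @ A on top of it.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)       # frozen pretrained weight
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        # The only trainable parameters: two small matrices of rank `rank`.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        # frozen path + small trainable low-rank correction
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale

layer = LoRALinear(nn.Linear(4096, 4096))  # illustrative layer size
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable: {trainable:,} of {total:,} params "
      f"({100 * trainable / total:.2f}%)")  # well under 1% of the weights
```

Because each task only needs its own A and B matrices, serving many specialized variants means swapping a few megabytes of adapter weights instead of reloading the whole model.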
The fight between creators and AI companies is fierce. The current paradigm in AI is to build bigger and bigger models, and there is, at least currently, no getting around the fact that they require vast data sets hoovered from the internet to train on.
Claudia added
If we want to understand what AIs are going to look like, I think the proto AI that we have are corporations. Corporations are sort of these funny little beasts. They’re not small. I guess they’re not little beasts, but they’re strange. It takes special training to have humans be able to fit within them. They’re made out of humans mostly but they’r...
Recode Staff • Full transcript: Internet Archive founder Brewster Kahle on Recode Decode
madisen added
The State of AI with Marc & Ben
open.spotify.com
muizz added
AI and the Internet are different phenomena.
The internet increased the leverage of computers by connecting them to other computers. It amplifies things that benefit from wider distribution: network effects.
LLMs are stateless calculators (computers). They amplify things that involve a lot of computational complexity: quick and reliable decision-making.
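One way to see the “stateless” point: every call to a chat model is an independent computation, so any conversational memory has to be kept by the client and re-sent on each turn. A rough sketch, assuming the official openai Python package; the model name and prompts are placeholders:

```python
# Rough sketch: the model keeps no state between calls, so the client
# resends the full conversation history on every turn.
from openai import OpenAI

client = OpenAI()
history = [{"role": "system", "content": "You are a terse assistant."}]

def ask(user_msg: str) -> str:
    history.append({"role": "user", "content": user_msg})
    resp = client.chat.completions.create(
        model="gpt-4o-mini",      # placeholder model name
        messages=history,         # the *entire* conversation, every time
    )
    answer = resp.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

# Drop `history` and the "memory" is gone — the model itself remembers nothing.
```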