r/MachineLearning - Reddit
People need to be more thoughtful building products on top of LLMs. The fact that they generate text is not the point.
thesephist.com · stream.thesephist.com
When we deliver a model, we make sure our API stays under X seconds of latency. Before even getting into how well LLMs perform at classification, I can tell you that with the currently available tech they are simply infeasible.
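The hard latency requirement described above can be enforced with an explicit time budget and a cheap fallback. A minimal sketch, assuming a hypothetical `slow_llm_classify` call and a keyword-based `heuristic_classify` as the fast fallback (both are illustrative stand-ins, not any real API):

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def slow_llm_classify(text):
    # Stand-in for a slow LLM API round trip.
    time.sleep(2.0)
    return "positive"

def heuristic_classify(text):
    # Cheap keyword heuristic used as the fast fallback.
    return "positive" if "good" in text else "negative"

def classify_with_budget(text, budget_s=0.5):
    # Run the LLM call in a worker thread so we can bound the wait.
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(slow_llm_classify, text)
    try:
        # Wait at most budget_s seconds for the LLM's answer.
        return future.result(timeout=budget_s)
    except TimeoutError:
        # Budget exceeded: abandon the slow call, answer cheaply.
        return heuristic_classify(text)
    finally:
        # Don't block on the orphaned worker thread.
        pool.shutdown(wait=False)
```

The point is that the SLA is met unconditionally: the caller never waits longer than the budget, and quality degrades gracefully instead of latency blowing up.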
LinuxSpinach · 5h ago
^ this. And especially classification as a task, because businesses don’t want to pay llm... See more
r/MachineLearning - Reddit
In general, I see LLMs used in two broad categories: data processing, more of a worker use case, where quality matters more than latency; and user interaction, where latency is a big factor. I think for the latter case a faster fallback is necessary. Or you escalate upwards: you first rely on a smaller, more... See more
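The escalation pattern the comment starts to describe is a model cascade: accept a small model's answer when it is confident, and only pay the large model's latency and cost otherwise. A minimal sketch, where `small_model` and `large_model` are hypothetical stand-ins returning `(label, confidence)` pairs:

```python
def small_model(text):
    # Stand-in for a fast, cheap classifier.
    return ("spam", 0.62)

def large_model(text):
    # Stand-in for a slower, higher-quality LLM classifier.
    return ("not_spam", 0.97)

def classify(text, threshold=0.8):
    # Accept the small model's answer only when it is confident
    # enough; otherwise escalate to the large model.
    label, conf = small_model(text)
    if conf >= threshold:
        return label, "small"
    label, conf = large_model(text)
    return label, "large"
```

With the stand-in confidences above, the default threshold of 0.8 forces an escalation, while a looser threshold of 0.5 accepts the small model's answer; tuning that threshold is how you trade average latency against quality.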
Discord - A New Way to Chat with Friends & Communities
LLMs are powerful tools, but they are fundamentally backwards-looking in their reliance on training data, with limited ability to reason toward novel conclusions. Thus, they are quite specifically unsuited to early-stage venture capital investments, where extreme idiosyncrasy grapples with realities that may be a decade away.
That’s not to say they... See more