LLMs
The new seed parameter enables reproducible outputs by making the model return consistent completions most of the time. This beta feature is useful for use cases such as replaying requests for debugging, writing more comprehensive unit tests, and generally having a higher degree of control over the model behavior. We at OpenAI have been using this...
New models and developer products announced at DevDay
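As a rough illustration, here's a minimal sketch of how the seed parameter might be used via the OpenAI Python SDK; the model name, seed value, and prompt are placeholders, and determinism is best-effort rather than guaranteed.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Passing the same seed (with otherwise identical parameters) makes the model
# return consistent completions most of the time; it is best-effort, not guaranteed.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",   # illustrative model name
    seed=12345,                   # fixed seed for (mostly) reproducible output
    temperature=0,
    messages=[{"role": "user", "content": "Give a one-line summary of RAG."}],
)

# system_fingerprint identifies the backend configuration; if it changes between
# calls, determinism across those calls should not be expected.
print(response.system_fingerprint)
print(response.choices[0].message.content)
```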
- Multiple indices. Splitting the document corpus up into multiple indices and then routing queries based on some criteria. This means that the search is over a much smaller set of documents rather than the entire dataset. Again, it is not always useful, but it can be helpful for certain datasets. The same approach works with the LLMs themselves.
Matt Rickard • Improving RAG: Strategies
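A minimal sketch of the multiple-indices idea above, assuming two hypothetical per-topic indices and a trivial keyword router; the VectorIndex class and its search method are stand-ins, not a specific library API.

```python
from typing import List

class VectorIndex:
    """Placeholder for a vector index built over one slice of the corpus."""
    def __init__(self, name: str, documents: List[str]):
        self.name = name
        self.documents = documents

    def search(self, query: str, k: int = 3) -> List[str]:
        # Stand-in for real similarity search scoped to this index only.
        words = [w for w in query.lower().split() if len(w) > 2]
        return [d for d in self.documents if any(w in d.lower() for w in words)][:k]

# Split the corpus into per-topic indices instead of one large index.
hr_index = VectorIndex("hr", ["Vacation policy: ...", "Expense reimbursement: ..."])
eng_index = VectorIndex("eng", ["Deployment runbook: ...", "On-call rotation: ..."])

def route(query: str) -> VectorIndex:
    # Trivial routing criterion; in practice this could be a classifier or an LLM call.
    hr_terms = ("vacation", "expense", "benefits", "payroll")
    return hr_index if any(t in query.lower() for t in hr_terms) else eng_index

query = "How do I file an expense report?"
index = route(query)
print(f"Routing to '{index.name}' index:", index.search(query))
```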
The OpenAI Assistants API offers more than a simple prompt-sharing interface; it provides a sophisticated framework for AI interactions. It allows for persistent conversation sessions with automatic context management (Threads), structured interactions (Messages and Runs), integration with various tools for enhanced capabilities, customization...
Discord - A New Way to Chat with Friends & Communities
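A minimal sketch of that Threads/Messages/Runs flow, assuming the beta Assistants endpoints in the OpenAI Python SDK; the assistant name, instructions, model, and polling loop are illustrative.

```python
import time
from openai import OpenAI

client = OpenAI()

# An Assistant bundles instructions, a model, and optional tools.
assistant = client.beta.assistants.create(
    name="Docs helper",  # illustrative
    instructions="Answer questions about internal documentation.",
    model="gpt-4-1106-preview",
)

# A Thread is a persistent conversation; context is managed by the API.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="Summarize the vacation policy."
)

# A Run executes the assistant against the thread; poll until it finishes.
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

# Messages are returned newest-first, so the assistant's reply comes back first.
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
```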
- Right now, GPTs are the easiest way of sharing structured prompts, which are programs, written in plain English (or another language), that can get the AI to do useful things. I discussed creating structured prompts last week, and all the same techniques apply, but the GPT system makes structured prompts more powerful and much easier to create,
Ethan Mollick • Almost an Agent: What GPTs can do
However, a key risk with several of these startups is the potential lack of a long-term moat. It is difficult to read too much into it given the stage of these startups and the limited public information available, but it's not difficult to poke holes in their long-term defensibility. For example:
- If a startup is built on the premise of taking base
AI Startup Trends: Insights from Y Combinator’s Latest Batch
Matei Zaharia, Omar Khattab, Lingjiao Chen, et al. • The Shift From Models to Compound AI Systems
Document search and synthesis
Scores of organizations want to harness generative AI so employees can easily find the most relevant documents through improved search results and summaries. For example, your organization can reduce the time it takes employees to find answers to common HR- and process-related questions. Internal manuals and sites are...
Donna Schut • The Prompt: Takeaways from hundreds of conversations about generative AI - part 1 | Google Cloud Blog
Jailbroken & Offline Appliances: It's becoming increasingly clear that we'll be able to interact with everyday appliances and devices with natural language. As locally run LLMs become more efficient and powerful, the prospect of having a conversation with your coffee machine in the morning isn't unreasonable. After all, who wants to tinker with...