In this session from the last BASTA!, you will discover the possibilities of Generative AI for semantic search in your business applications. Immerse yourself in the world of Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) and learn how to use these techniques effectively beyond simple tutorials.

Learn more about:
- Semantic search: The potential of Generative AI for searching your data
- Large Language Models (LLMs): How these AI models work and how they are applied in semantic search
- Retrieval-Augmented Generation (RAG): Combining LLMs with retrieval techniques for more precise search results
- Vectors and embeddings: Deepen your knowledge of the foundations of semantic search
- Vector databases: Use the power of these databases for efficient search
- Splitting when creating embeddings: Optimizing the performance of your semantic search
- Indexing: Learn the most important points for indexing your data correctly
- Tools: A comparison of different tools such as LangChain, Azure Cognitive Search, Qdrant, and Chroma
- Efficient RAG process: Integrate all the steps into your own application
- Meaningful answers: Provide your users with the information relevant to their questions

Speaker: Sebastian Gingter looks back on more than twenty years of experience as a professional software developer. In his work as a consultant at Thinktecture AG, however, his gaze is firmly directed towards the future: towards modern (web) technologies, both on the client with TypeScript, JavaScript and Angular, and on the server with JavaScript under Node.js or with C# and .NET Core. Since 2008, he has been passionately and entertainingly explaining these topics, speaking at international conferences and publishing specialist articles.

► To the BASTA! website: https://basta.net/
► The full program: https://basta.net/programm/
► Your ticket to success: https://basta.net/tickets/