Have you ever wondered how machines understand human language? The answer lies in EMBEDDINGS! In today's video, we're diving into the fascinating world of vector representations of words with Word2Vec and Skip-Gram. But that's not all... we're taking this theory to an EPIC DUEL with Yu-Gi-Oh cards! Get ready to transform Yu-Gi-Oh card text into powerful vectors that reveal the magic behind words and their context.

LLMs video: • It's not all about ChatGPT - Introduction to LLMs...
Prompt engineering: • Prompt Engineering Interacting with...
Embeddings: • The magic of Machine Learning: Embedd...
RAG: • BETTER and CHEAP: How RAG is...

Repository with the code: https://tcsg.dev/embeddings

SUPPORT ME:
Join the channel and enjoy benefits: https://www.youtube.com/@feregri_no/join
Buy me a coffee: https://www.buymeacoffee.com/feregrino

SOCIAL:
/feregri_no
https://twitch.com/feregri_no
https://github.com/fferegrino
https://kaggle.com/ioexception
https://feregri.no

TIMESTAMPS:
00:00:00 Start
00:01:02 Introduction to the problem
00:08:13 Introduction to embeddings
00:12:15 Creating embeddings with Skip-Gram
00:25:58 What is Graph2Vec
00:31:53 Creating embeddings with co-occurrences
00:35:29 Dataset for the project
00:37:57 Downloading the data
00:39:24 Inspecting the data
00:42:41 Processing the decks
00:45:38 Processing the card information
00:49:05 Generating the co-occurrence matrix
00:51:39 Generating the co-occurrence matrix in Python
00:54:16 Calculating embeddings using SVD
01:01:31 Calculating SVD with Python
01:02:36 Running vector queries
01:03:58 Vector database
01:06:06 Vector database usage example
01:16:05 Extra - generating embeddings using OpenAI
01:19:12 Conclusions
01:20:24 Farewell
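The pipeline covered in the timestamps (co-occurrence matrix → SVD → vector queries) can be sketched roughly like this. The decks and card names below are made-up toy data, and this is only an illustrative sketch with NumPy, not the code from the repository:

```python
import numpy as np

# Toy "decks": each deck is a list of card names (hypothetical data).
# Cards that appear in the same deck co-occur, like words sharing a context.
decks = [
    ["Dark Magician", "Dark Magical Circle", "Magician's Rod"],
    ["Dark Magician", "Magician's Rod", "Eternal Soul"],
    ["Blue-Eyes White Dragon", "The White Stone of Ancients"],
]

# Build the vocabulary and the (symmetric) co-occurrence matrix.
vocab = sorted({card for deck in decks for card in deck})
index = {card: i for i, card in enumerate(vocab)}
M = np.zeros((len(vocab), len(vocab)))
for deck in decks:
    for a in deck:
        for b in deck:
            if a != b:
                M[index[a], index[b]] += 1

# Truncated SVD: keep the top-k singular directions as dense embeddings.
k = 2
U, S, Vt = np.linalg.svd(M)
embeddings = U[:, :k] * S[:k]  # one k-dimensional vector per card

# Vector query: rank other cards by cosine similarity to a target card.
def most_similar(card, topn=2):
    v = embeddings[index[card]]
    norms = np.linalg.norm(embeddings, axis=1) * np.linalg.norm(v)
    sims = embeddings @ v / np.where(norms == 0, 1, norms)
    sims[index[card]] = -np.inf  # exclude the card itself
    return [vocab[i] for i in np.argsort(sims)[::-1][:topn]]
```

In a real deck dataset the matrix would be large and sparse, which is why the video also moves the query step into a vector database instead of scanning every row by hand.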