
Why do we need Positional Encoding in Transformers?

#transformers #positionalencodings #naturallanguageprocessing
Transformers, unlike LSTMs, do not inherently account for the order of tokens in a sequence. Positional encodings give the model a way to understand the position of each token, which is crucial for many natural language processing tasks. In this video, we answer the Why, What, and Where of Positional Encodings in Transformers.
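
For a concrete picture, here is a minimal NumPy sketch of the classic sinusoidal positional encoding from the original Transformer paper ("Attention Is All You Need"); the function name and shapes are illustrative and not taken from the video:

import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    # Build the fixed sine/cosine position table: each position gets a
    # unique vector, so the otherwise order-invariant self-attention
    # layers can tell token positions apart.
    positions = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                           # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                             # (seq_len, d_model)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])                        # even dims: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])                        # odd dims: cosine
    return pe

# The encoding is simply added to the token embeddings before the first layer:
# x = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)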

⏩ IMPORTANT LINKS
Research Paper Summaries:    • Simple Unsupervised Keyphrase Extract...  
Rotary Positional Encoding - arxiv.org/abs/2104.09864
A Simple and Effective Positional Encoding for Transformers - arxiv.org/abs/2104.08698
The Impact of Positional Encoding on Length Generalization in Transformers - arxiv.org/abs/2305.19466

Enjoy reading articles? Then consider subscribing to a Medium membership; it is just $5 a month for unlimited access to all free and paid content.
Subscribe now - prakhar-mishra.medium.com/membership

*********************************************
⏩ Youtube -    / @techvizthedatascienceguy  
⏩ LinkedIn - linkedin.com/in/prakhar21
⏩ Medium - medium.com/@prakhar.mishra
⏩ GitHub - github.com/prakhar21
*********************************************

⏩ Please feel free to share the content and subscribe to my channel -    / @techvizthedatascienceguy  

Tools I use for making videos :)
⏩ iPad - tinyurl.com/y39p6pwc
⏩ Apple Pencil - tinyurl.com/y5rk8txn
⏩ GoodNotes - tinyurl.com/y627cfsa

#techviz #datascienceguy #deeplearning #ai #openai #chatgpt #machinelearning #recommendersystems

About Me:
I am Prakhar Mishra and this channel is my passion project. I am currently pursuing my MS (by research) in Data Science. I have 4+ years of industry experience in Data Science and Machine Learning, with a particular focus on Natural Language Processing (NLP).