Row Details #2187

Data:

{
  "text": "I just started reading about the Transformer model. I have barely scratched the surface of this concept. For starters, I have the following 2 questions:\n\n1. How are positional encodings incorporated into the transformer model? I see that immediately after the word embedding, they have positional encoding. But I'm not getting in which part of the entire network it is being used.\n\n2. For a given sentence, the weight matrices of the query, key and value all have the length of the sentence itself as one of their dimensions. But the length of the sentence is a variable, so how do they handle this issue when they pass in subsequent sentences?",
  "label": "r/tensorflow",
  "dataType": "post",
  "communityName": "r/tensorflow",
  "datetime": "2023-07-12",
  "username_encoded": "Z0FBQUFBQm5LakwwMU5RUDlzSG9NYnRvcjlybHQxLUhaYTRaZGNQQnhRSmFPOE5USFc4MmdpeS1pSkVsNHY4S1RuakljeHlwXzZFRXpoVUtOc0F3bmhZT0g2WnVpTzNYT1NVd05kdUNNay1fb2Rwak9TMlFQWUU9",
  "url_encoded": "Z0FBQUFBQm5Lak9FcXQxQ1hfRnJUWW1qLTdDX3hMeGdNWUdZcXVxNC1SZ3hfLWV2X29aYU5RV25JU2VsLXFzQlBkYVNEV1Rfb2NsZTlPTU9WMERQaWtTNkVNUjY0TWM4bTN3VmxCWUNoQzN4a0NYZzlJSXc4NVdBdEZMUUZOTXFEMS12M1F4X1ZRSU9TX2VMZ1lVa0pwd3l3Y01fR2Q5TUZHUE4wR3ZxS0psdE5yRF94Vm5LUk02TVhpNGJnUWl2d2hHNHNjNHRtR0hU"
}
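
The two questions in this post can be illustrated with a short sketch (not from the post itself; a minimal NumPy illustration of the standard sinusoidal scheme). It shows (1) that the positional encoding is simply added to the word embeddings once at the input, before the first attention layer, and (2) that the query/key/value *weight* matrices are shaped by the model dimension only, so sentence length never enters the learned parameters:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Standard sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    positions = np.arange(seq_len)[:, None]              # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                   # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                     # (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

# Toy embeddings for a 5-token sentence; model dimension 8 (illustrative sizes).
seq_len, d_model, d_k = 5, 8, 8
embeddings = np.random.randn(seq_len, d_model)

# Q1: the encoding is ADDED to the embeddings once, at the input;
# every subsequent layer just consumes this sum.
x = embeddings + sinusoidal_positional_encoding(seq_len, d_model)

# Q2: the learned weight matrix is (d_model, d_k) — independent of seq_len.
# Only the ACTIVATIONS Q = x @ W_q vary with sentence length.
W_q = np.random.randn(d_model, d_k)
Q = x @ W_q                                              # (seq_len, d_k)
```

A sentence of a different length just produces a different number of rows in `x` and `Q`; `W_q` (and likewise `W_k`, `W_v`) is reused unchanged, which is how the model handles variable-length input.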