@abhi1868sharma
Last active November 28, 2021 12:57
This gist creates a sinusoidal positional encoding of shape (x.size(1), 512), where x is assumed to be a tensor (e.g. PyTorch) whose second dimension is the sequence length.
import numpy as np

def positional_embedding(pos, d_model=512):
    # exponents 2*i/d_model for i = 0 .. d_model/2 - 1, i.e. the 10000^(2*i/d_model) term
    dpow = np.array([2 * i / d_model for i in range(d_model // 2)])
    # the sin/cos pairs share the same denominator 10000^(2*i/d_model)
    denom = np.power(np.repeat(10000.0, d_model // 2), dpow)
    sin_series = np.sin(pos / denom)
    cos_series = np.cos(pos / denom)
    pe = np.empty(d_model)  # uninitialized array of size d_model
    # interleave: even indices get the sine terms, odd indices the cosine terms
    pe[0::2] = sin_series
    pe[1::2] = cos_series
    return pe

# x is assumed to be a PyTorch-style tensor; x.size(1) is the sequence length
pe_numpy = np.array([positional_embedding(i) for i in range(x.size(1))])
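The per-position loop above can also be written as one broadcasted NumPy call. A minimal, self-contained sketch (the sequence length 10 is an arbitrary assumption, since `x` is not defined in the gist) that checks the vectorized matrix against the loop version:

```python
import numpy as np

def positional_embedding(pos, d_model=512):
    # same construction as above: interleaved sin/cos with shared denominators
    dpow = np.array([2 * i / d_model for i in range(d_model // 2)])
    denom = np.power(np.repeat(10000.0, d_model // 2), dpow)
    pe = np.empty(d_model)
    pe[0::2] = np.sin(pos / denom)
    pe[1::2] = np.cos(pos / denom)
    return pe

def positional_matrix(seq_len, d_model=512):
    # build the whole (seq_len, d_model) matrix at once via broadcasting
    pos = np.arange(seq_len)[:, None]                 # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]              # (1, d_model/2)
    angle = pos / np.power(10000.0, 2 * i / d_model)  # (seq_len, d_model/2)
    pe = np.empty((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle)
    pe[:, 1::2] = np.cos(angle)
    return pe

pe_loop = np.array([positional_embedding(p) for p in range(10)])
assert np.allclose(pe_loop, positional_matrix(10))
```

The broadcasted form avoids the Python-level loop, which matters once the sequence length grows.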
def cosine_similarity(pe0, pe1):
    num = np.dot(pe0, pe1)
    denom = np.linalg.norm(pe0) * np.linalg.norm(pe1)
    return num / denom

cosine_similarity(pe_numpy[0], pe_numpy[1])  # neighbouring positions
cosine_similarity(pe_numpy[0], pe_numpy[9])  # distant positions
# the similarity between positions 0 and 1 is higher than between positions 0 and 9,
# which is a desirable property: neighbouring positions should have similar embeddings
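To see the fall-off more broadly, a self-contained sketch (sequence length 50 is an assumed value) that compares position 0 against increasingly distant positions:

```python
import numpy as np

def positional_embedding(pos, d_model=512):
    dpow = np.array([2 * i / d_model for i in range(d_model // 2)])
    denom = np.power(np.repeat(10000.0, d_model // 2), dpow)
    pe = np.empty(d_model)
    pe[0::2] = np.sin(pos / denom)
    pe[1::2] = np.cos(pos / denom)
    return pe

def cosine_similarity(pe0, pe1):
    return np.dot(pe0, pe1) / (np.linalg.norm(pe0) * np.linalg.norm(pe1))

pe = np.array([positional_embedding(p) for p in range(50)])
# similarity to position 0 at offsets 1, 9, and 25
sims = [cosine_similarity(pe[0], pe[k]) for k in (1, 9, 25)]
# similarity decreases as the offset grows
assert sims[0] > sims[1] > sims[2]
```

The low-frequency dimensions change slowly across positions, so nearby positions agree on most coordinates while distant ones diverge; that is what drives the decay.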