Jason Poulos jvpoulos

jvpoulos / Attention.py
Created August 25, 2017 20:36 — forked from luthfianto/Attention.py
A Keras Layer that implements an attention mechanism for temporal data. Supports masking. Follows the work of Raffel et al. [https://arxiv.org/abs/1512.08756]
from keras.layers.core import Layer
from keras import initializers, regularizers, constraints
from keras import backend as K


class Attention(Layer):
    def __init__(self,
                 kernel_regularizer=None, bias_regularizer=None,
                 kernel_constraint=None, bias_constraint=None,
                 use_bias=True, **kwargs):
        """