
keras-anomaly-detection code analysis: essentially just SAE / LSTM time-series prediction


keras-anomaly-detection

Anomaly detection implemented in Keras

The source code of the recurrent, convolutional, and feedforward auto-encoders for anomaly detection can be found in keras_anomaly_detection/library/recurrent.py, keras_anomaly_detection/library/convolutional.py, and keras_anomaly_detection/library/feedforward.py.

Anomaly detection is implemented with auto-encoders built from convolutional, feedforward, and recurrent networks, and can be applied to:

  • time-series data, to detect time windows that contain an anomalous pattern
    • LstmAutoEncoder in keras_anomaly_detection/library/recurrent.py
    • Conv1DAutoEncoder in keras_anomaly_detection/library/convolutional.py
    • CnnLstmAutoEncoder in keras_anomaly_detection/library/recurrent.py
    • BidirectionalLstmAutoEncoder in keras_anomaly_detection/library/recurrent.py
  • structured (i.e., tabular) data, to detect anomalous records
    • Conv1DAutoEncoder in keras_anomaly_detection/library/convolutional.py
    • FeedforwardAutoEncoder in keras_anomaly_detection/library/feedforward.py

Let's look at the LSTM model first:
    from keras.models import Sequential
    from keras.layers import Dense, LSTM

    def create_model(time_window_size, metric):
        model = Sequential()
        # Encoder: compress the whole input window into a single 128-dim state
        model.add(LSTM(units=128, input_shape=(time_window_size, 1), return_sequences=False))

        # Decoder: reconstruct all time_window_size values of the window
        model.add(Dense(units=time_window_size, activation='linear'))

        model.compile(optimizer='adam', loss='mean_squared_error', metrics=[metric])
        print(model.summary())
        return model
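
A minimal sketch of how such a model is fed (the toy signal and the make_windows helper are illustrative assumptions, not part of the library): the 1-D series is cut into sliding windows of length time_window_size, and each window is both the input (with a channel axis appended) and the reconstruction target.

    import numpy as np

    def make_windows(series, time_window_size):
        # Stack overlapping windows: shape (n_windows, time_window_size)
        return np.array([series[i:i + time_window_size]
                         for i in range(len(series) - time_window_size + 1)])

    series = np.sin(np.linspace(0, 50, 1000))            # toy 1-D signal
    windows = make_windows(series, time_window_size=64)
    model = create_model(time_window_size=64, metric='mae')
    # The window itself is the training target
    model.fit(windows[..., np.newaxis], windows, epochs=10, batch_size=32)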
      

Next, the feedforward model:

    from keras.layers import Input, Dense
    from keras.models import Model
    from keras import regularizers

    def create_model(self, input_dim):
        encoding_dim = 14
        input_layer = Input(shape=(input_dim,))

        # Encoder: squeeze each record down to encoding_dim, then encoding_dim // 2
        encoder = Dense(encoding_dim, activation="tanh",
                        activity_regularizer=regularizers.l1(10e-5))(input_layer)
        encoder = Dense(encoding_dim // 2, activation="relu")(encoder)

        # Decoder: expand back to the original number of features
        decoder = Dense(encoding_dim // 2, activation='tanh')(encoder)
        decoder = Dense(input_dim, activation='relu')(decoder)

        model = Model(inputs=input_layer, outputs=decoder)
        model.compile(optimizer='adam',
                      loss='mean_squared_error',
                      metrics=['accuracy'])
        return model
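
For the tabular case the usage is analogous: each record is its own reconstruction target. A minimal sketch, assuming X_train is an already-scaled numpy matrix of (mostly normal) records; the variable names are illustrative, not the library's API:

    import numpy as np

    X_train = np.random.rand(1000, 29)    # placeholder for scaled tabular features
    model = create_model(None, input_dim=X_train.shape[1])   # self is unused in the snippet
    # Train the auto-encoder to reproduce each record from itself
    model.fit(X_train, X_train, epochs=20, batch_size=64, validation_split=0.1)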
      

And the Conv1D (CNN) model:

    from keras.models import Sequential
    from keras.layers import Conv1D, GlobalMaxPool1D, Dense

    def create_model(time_window_size, metric):
        model = Sequential()
        # Encoder: 1-D convolution over the window, then global max pooling
        # collapses the time axis into a 256-dim feature vector
        model.add(Conv1D(filters=256, kernel_size=5, padding='same', activation='relu',
                         input_shape=(time_window_size, 1)))
        model.add(GlobalMaxPool1D())

        # Decoder: reconstruct all time_window_size values of the window
        model.add(Dense(units=time_window_size, activation='linear'))

        model.compile(optimizer='adam', loss='mean_squared_error', metrics=[metric])
        print(model.summary())
        return model
      

All of these models simply set the output to be the input itself; the anomalous points are then the samples whose reconstruction (prediction) error deviates strongly from the bulk of the data, e.g., the error falls beyond the ~90th percentile.
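
A minimal sketch of that thresholding step, reusing the model and windows from the LSTM sketch above (the 90th-percentile cut-off and variable names are illustrative assumptions, not the library's exact implementation):

    import numpy as np

    # Reconstruction error per window (mean squared error over the window)
    reconstructed = model.predict(windows[..., np.newaxis])
    errors = np.mean((windows - reconstructed) ** 2, axis=1)

    # Windows whose error exceeds the 90th percentile are flagged as anomalies
    threshold = np.percentile(errors, 90.0)
    anomalies = np.where(errors > threshold)[0]
    print('threshold = %.6f, %d anomalous windows' % (threshold, len(anomalies)))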
