Adversarial/LitRev

Literature Reviews on selected adversarial papers!

== Adversarial Training for Free! ==
* Conference: NeurIPS 2019
* URL: [https://arxiv.org/abs/1904.12843]
* Proposes "recycling" of the gradients for adversarial training (see the sketch below).
* Counts each "replay" of a minibatch as one (non-true) epoch, thereby reducing the total training time.
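
To make the recycling idea concrete, here is a minimal PyTorch sketch of the replay loop as summarized above; it is not the authors' code, and the names <code>model</code>, <code>loader</code>, <code>optimizer</code>, <code>epsilon</code>, and <code>m_replays</code> are illustrative assumptions. The point is that a single backward pass per replay yields both the weight gradients and the input gradient, and the input gradient is recycled to update the perturbation.

<syntaxhighlight lang="python">
import torch
import torch.nn.functional as F

def free_adversarial_training(model, loader, optimizer,
                              epsilon=8 / 255, m_replays=4, epochs=8):
    # Sketch only: each minibatch is replayed m_replays times, and every
    # replay counts as one (non-true) epoch, so the outer loop shrinks by
    # a factor of m_replays while the number of weight updates is unchanged.
    delta = None  # perturbation carried over between replays/minibatches
    for _ in range(epochs // m_replays):
        for x, y in loader:
            if delta is None or delta.shape != x.shape:
                delta = torch.zeros_like(x)
            for _ in range(m_replays):
                delta.requires_grad_(True)
                loss = F.cross_entropy(model(x + delta), y)
                optimizer.zero_grad()
                loss.backward()  # one pass: weight grads AND input grad
                grad = delta.grad.detach()
                optimizer.step()  # weight update from the shared backward pass
                # Recycle the input gradient: FGSM-style ascent step on the
                # perturbation, projected back into the epsilon-ball.
                delta = (delta.detach()
                         + epsilon * grad.sign()).clamp(-epsilon, epsilon)
</syntaxhighlight>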

== Fast is better than free: Revisiting adversarial training ==

== Adversarial Training Can Hurt Generalization ==
* Conference: ICML 2019 Workshop
* URL: [3]

== Initializing Perturbations in Multiple Directions for Fast Adversarial Training ==
* Conference: N/A
* URL: [4]

== Towards Understanding Fast Adversarial Training ==
* Conference: N/A
* URL: [5]

== Overfitting in adversarially robust deep learning ==
* Conference: ICML 2020
* URL: [6]

== Certified Adversarial Robustness with Additive Noise ==
* Conference: NeurIPS 2019
* URL: [7]

== Randomization matters: How to defend against strong adversarial attacks ==
* Conference: ICML 2020
* URL: [8]