Summer Reading Group Data Science Lab

We are a group of members of the Data Science Lab at Iowa State University. We get together to read and understand very recent papers on theoretical and algorithmic aspects of nonlinear parameter estimation and deep learning. We meet twice a week. Please see our Reading List and Schedule page for more details.

Week 11 - Tuesday, 8th August

Towards Understanding the Dynamics of Generative Adversarial Networks

In our 13th meet on Tuesday, Praneeth presented the paper titled [Towards Understanding the Dynamics of Generative Adversarial Networks](https://arxiv.org/abs/1706.09884).

Notes for the meeting are available here.

Week 10 - Tuesday, 1st August

Towards Understanding the Invertibility of Convolutional Neural Networks

In our 12th meet on Tuesday, Gauri presented the paper titled Towards Understanding the Invertibility of Convolutional Neural Networks.

Notes for the meeting will be posted here once available.

Week 8 - Friday, 28th July

Data Science Reading Group Seminars – Invited Talks

In our 11th meet on Friday, instead of our routine talks, we hosted invited speakers Ju Sun and Ludwig Schmidt as part of the Data Science Reading Group Seminars. Details below:

Talk 1 – 2:00 - 3:30 pm, Coover 3043

Speaker: Ju Sun, Postdoctoral Research Fellow at Stanford University

Title: “When Are Nonconvex Optimization Problems Not Scary?”

For more details, click here.

Talk 2 – 3:45 - 5:15 pm, Coover 3043

Speaker: Ludwig Schmidt, PhD student at MIT

Title: Faster Constrained Optimization via Approximate Projections (tentative)

Week 7 - Tuesday, 18th July

Stabilizing GAN Training with Multiple Random Projections

In our 10th meet on Tuesday, Viraj presented the paper titled Stabilizing GAN Training with Multiple Random Projections.

Notes for the meeting are available here.

Week 6 - Tuesday, 11th July

Maximal Sparsity with Deep Networks?

In our 9th meet on Tuesday, Thanh presented the paper titled Maximal Sparsity with Deep Networks?

Notes for the meeting are available here.