Neural Network Workshop
Revision as of 19:58, 23 December 2010
What: A hands-on workshop on Neural Networks, courtesy of the Machine Learning Group, covering both the theory and a practical implementation.
When: Tentatively January 26, 2011
Why: To raise Neural Network Awareness (NNA) and money for Noisebridge. Donations towards Noisebridge will be encouraged and appreciated.
Where: In the back classroom
Who: Anyone who wants to participate, either come and learn or help teach. Join the mailing list!
Workshop Agenda
- Math Preliminaries
  - Universal Function Approximators: Cybenko's Theorem
  - Linear Algebra: vectors, matrices
  - Optimization Theory: error functions, gradients
  - Machine Learning: regression, classification
- Neural Networks
  - Basic Architecture
  - Activation Functions
  - Error Functions and Output Layers
    - Regression (univariate and multivariate)
    - Classification (binary and multi-class, logistic and softmax)
  - Training
    - Backpropagation
- Implementation
  - Identifying Faces with Neural Nets
  - Other such ideas (TBD)
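Most of the agenda above can be illustrated end to end in a few dozen lines. The following is a minimal sketch in Python with numpy (the workshop's actual language and packages are not specified here): a one-hidden-layer network with sigmoid activations, a squared-error function, and backpropagation via plain gradient descent, trained on the classic XOR problem.

```python
import numpy as np

# Toy data: XOR, the classic example a single linear layer cannot fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    """Logistic activation function."""
    return 1.0 / (1.0 + np.exp(-z))

# Basic architecture: 2 inputs -> 4 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

# Mean squared error before training, for comparison.
loss0 = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)

lr = 0.5  # gradient-descent step size
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)    # hidden-layer activations
    out = sigmoid(h @ W2 + b2)  # network output

    # Backward pass: chain rule through the (unscaled) squared error
    # and both sigmoid layers; constant factors are folded into lr.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent parameter updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

loss = np.mean((out - y) ** 2)
print("MSE before training:", loss0, "after:", loss)
```

Swapping the error function and output activation (e.g. softmax with cross-entropy for multi-class problems) changes only the `d_out` term; the rest of the backward pass is unchanged.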
Software
This is just a list of packages used for constructing and training neural networks: