Neural Network Workshop
Revision as of 02:13, 28 December 2010
What: A hands-on workshop on neural networks, courtesy of the Machine Learning Group, covering both theory and practical implementation.
When: Tentatively January 26, 2011 7:00pm - 10:00pm
Why: To raise Neural Network Awareness (NNA) and money for Noisebridge. Donations are encouraged and appreciated.
Where: In the back classroom
Who: Anyone who wants to participate; come and learn or help teach. Join the mailing list! The math will be explained, but a complete understanding of it is not essential to work through the examples given.
Workshop Overview, Resources
- Math Preliminaries
  - Universal Function Approximators
  - Linear Algebra: vectors, matrices
  - Optimization Theory: error functions, gradients
  - Machine Learning: regression, classification
- Neural Networks
  - Basic Architecture
  - Activation Functions
    - Which activation functions are good? Hornik's 1991 Paper
  - Error Functions and Output Layers
    - Regression (univariate and multivariate)
    - Classification (binary and multi-class, logistic and softmax)
  - Training
    - Backpropagation
- Implementation
  - Identifying Faces with Neural Nets
  - ??? Other such ideas
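To give a concrete taste of the topics in the outline above (sigmoid activation functions, a squared-error function, and training by backpropagation), here is a minimal sketch in pure Python: a tiny 2-2-1 network learning XOR with per-example gradient descent. This is an illustrative toy, not the workshop's own code; the architecture, learning rate, and epoch count are arbitrary choices made for the example.

```python
import math
import random

def sigmoid(x):
    """Logistic activation function."""
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)

# 2 inputs -> 2 hidden units -> 1 output, small random initial weights.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

# The XOR problem: not linearly separable, so it needs the hidden layer.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    """Forward pass: returns hidden activations and the output."""
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)
    return h, y

def loss():
    """Total squared error over the training set."""
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

def train(epochs=5000, lr=1.0):
    """Backpropagation with per-example (stochastic) gradient descent."""
    global b2
    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            # Output-layer delta: d(0.5*(y-t)^2)/d(net) = (y-t) * y*(1-y).
            dy = (y - t) * y * (1 - y)
            # Hidden-layer deltas, propagated back through W2.
            dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
            # Gradient-descent updates.
            for j in range(2):
                W2[j] -= lr * dy * h[j]
                b1[j] -= lr * dh[j]
                for i in range(2):
                    W1[j][i] -= lr * dh[j] * x[i]
            b2 -= lr * dy

before = loss()
train()
after = loss()
```

The key step is computing the hidden-layer deltas from the output delta: that chain-rule recursion is what "backpropagation" refers to, and it generalizes to deeper networks.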
Software
This is just a list of packages used for constructing and training neural networks: