<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=73.15.212.163</id>
	<title>Noisebridge - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=73.15.212.163"/>
	<link rel="alternate" type="text/html" href="https://wiki.extremist.software/wiki/Special:Contributions/73.15.212.163"/>
	<updated>2026-04-11T02:34:39Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=Bike_Party_Light_Party&amp;diff=60708</id>
		<title>Bike Party Light Party</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=Bike_Party_Light_Party&amp;diff=60708"/>
		<updated>2017-09-29T00:14:57Z</updated>

		<summary type="html">&lt;p&gt;73.15.212.163: /* How do I sign up? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;font size=&amp;quot;7&amp;quot;&amp;gt;Bike Party!&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:EastBayBikeParty.jpg]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Bike Party&amp;lt;/b&amp;gt;, n: &amp;lt;em&amp;gt;A party on bikes&amp;lt;/em&amp;gt;&lt;br /&gt;
&lt;br /&gt;
East Bay Bike Party and the San Francisco Bike Party are monthly events in which people decorate their bikes and ride around town with good music, beer, and 810978479 blinky lights.&lt;br /&gt;
&lt;br /&gt;
How does this intersect with Noisebridge? Well, everyone likes bikes, and we&#039;ve got a couple of soldering irons! So let&#039;s have a Bike Party Light Party.&lt;br /&gt;
&lt;br /&gt;
== The Skinny ==&lt;br /&gt;
&lt;br /&gt;
Come to Noisebridge on a Friday night and learn how to decorate your bike with programmable LEDs. In particular, we&#039;ll be soldering Teensy 3.1 microcontrollers to WS2812B LED strips, then using zip ties to stick them on your two-wheeled steed. Everything gets powered with a USB battery pack. These things are super lightweight on power, and even the smallest of packs will make you go blink for hours at a time.&lt;br /&gt;
&lt;br /&gt;
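Curious what this looks like in code? Here&#039;s a minimal sketch of the sort of thing we&#039;ll build in the first session: a Teensy driving a WS2812B strip with the FastLED library. The pin number and strip length below are just assumptions for illustration; use whatever matches your own wiring.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
// Minimal FastLED sketch for a Teensy 3.1/3.2 + WS2812B strip.&lt;br /&gt;
// DATA_PIN and NUM_LEDS are placeholders; match them to your wiring.&lt;br /&gt;
#include &amp;lt;FastLED.h&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#define DATA_PIN 2    // pin the strip&#039;s data wire is soldered to&lt;br /&gt;
#define NUM_LEDS 60   // number of LEDs on the strip&lt;br /&gt;
&lt;br /&gt;
CRGB leds[NUM_LEDS];&lt;br /&gt;
&lt;br /&gt;
void setup() {&lt;br /&gt;
  FastLED.addLeds&amp;lt;WS2812B, DATA_PIN, GRB&amp;gt;(leds, NUM_LEDS);&lt;br /&gt;
  FastLED.setBrightness(64);  // modest brightness keeps the battery pack happy&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
void loop() {&lt;br /&gt;
  static uint8_t hue = 0;&lt;br /&gt;
  fill_rainbow(leds, NUM_LEDS, hue++, 7);  // scroll a rainbow down the strip&lt;br /&gt;
  FastLED.show();&lt;br /&gt;
  delay(20);&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
With the Arduino IDE, Teensyduino, and the FastLED library installed, this should give you a rainbow scrolling down the strip; the later sessions build fancier animations on top of the same skeleton.&lt;br /&gt;
&lt;br /&gt;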
October will have a class every week to cover these topics:&lt;br /&gt;
&lt;br /&gt;
* October 6 - Introduction to driving your first LEDs and mounting some strips on your bike. SF Bike Party starts at 8pm.&lt;br /&gt;
* October 13 - More basic animations and Stupid FastLED Tricks. East Bay Bike Party starts at 8pm.&lt;br /&gt;
* October 20 - Advanced color theory, FastLED tricks, and what makes for good animations. No bike party tonight.&lt;br /&gt;
* October 27 - Input controls. No bike party tonight either, but there&#039;s a special Critical Mass party the following week. The Noisebridge contingent will be dazzling after this.&lt;br /&gt;
&lt;br /&gt;
[[User:tdfischer|Victoria]] will be the host of these sessions. Every session starts at 6pm and runs until we&#039;re done.&lt;br /&gt;
&lt;br /&gt;
== What you&#039;ll need ==&lt;br /&gt;
&lt;br /&gt;
Buy one of these LED strips if you don&#039;t already have some lights: https://www.amazon.com/dp/B00ZHB9M6A/&lt;br /&gt;
&lt;br /&gt;
One of those packages will come with enough lights to cover your bike twice in blinky goodness.&lt;br /&gt;
&lt;br /&gt;
You&#039;ll also want to acquire a Teensy 3.1/3.2 device. Victoria will have some available for purchase each week.&lt;br /&gt;
&lt;br /&gt;
Don&#039;t forget your bike. That&#039;s kinda the whole point here.&lt;br /&gt;
&lt;br /&gt;
== How do I sign up? ==&lt;br /&gt;
&lt;br /&gt;
Add your name below:&lt;br /&gt;
&lt;br /&gt;
* Victoria Fierce&lt;br /&gt;
* Manish&lt;br /&gt;
* Steve&lt;/div&gt;</summary>
		<author><name>73.15.212.163</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59540</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59540"/>
		<updated>2017-07-09T19:11:52Z</updated>

		<summary type="html">&lt;p&gt;73.15.212.163: /* Ideas for future talks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Schedule ==&lt;br /&gt;
=== 7/6/17 - Talk and Discussion: Steve Young - Boltzmann Machines and Statistical Mechanics. ===&lt;br /&gt;
&lt;br /&gt;
PREREADINGS: &lt;br /&gt;
&lt;br /&gt;
[https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 MacKay - Information Theory, Inference, and Learning Algorithms] Chapter 43 on the Boltzmann machine. Chapter 42 on Hopfield networks.&lt;br /&gt;
&lt;br /&gt;
[[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] Lecture notes from Hinton&#039;s Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos [https://www.coursera.org/learn/neural-networks here]. They&#039;re also on [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 YouTube].&lt;br /&gt;
&lt;br /&gt;
== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we&#039;ll cover only lightly.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are at upper-division to grad level in machine learning and statistical mechanics. To get something out of them, you should know:&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of [https://libgen.unblocked.srl/book/index.php?md5=79185948E755CC9B9611346E586CD050 D. Lay&#039;s book])&lt;br /&gt;
*single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart&#039;s textbook)&lt;br /&gt;
*basics of statistics, including Bayesian statistics&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s MIT lecture notes])&lt;br /&gt;
&lt;br /&gt;
There are plenty of other places to learn this stuff. For example, you can review your&lt;br /&gt;
probability, stats and linear algebra from chapters 2 and 3 of [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow].&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Check out these cool links:&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech / Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
* [http://www.lps.ens.fr/~krzakala/LESHOUCHES2013/home.htm Les Houches] 2013 school on Statistical physics, Optimization, Inference and Message-Passing algorithms. Contains links to papers/talks.&lt;br /&gt;
* [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Videos] from Geoff Hinton&#039;s neural net course on Coursera.&lt;br /&gt;
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we&#039;re trying to build.&lt;br /&gt;
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
=== High Level Overviews ===&lt;br /&gt;
&lt;br /&gt;
Good large-scale overview of why the stat mech side is important&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;br /&gt;
&lt;br /&gt;
=== Interesting papers ===&lt;br /&gt;
&lt;br /&gt;
* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]&lt;br /&gt;
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]&lt;br /&gt;
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]&lt;br /&gt;
&lt;br /&gt;
== Books ==&lt;br /&gt;
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. These are the mirrors I use: [https://libgen.unblocked.srl/], [https://libgen.unblocked.li/scimag].&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang&#039;s] text is the bronze standard for grad level stat mech.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler&#039;s] text is supposedly great for stat mech, although I haven&#039;t read it.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven&#039;t looked at this yet, but it seems promising.&lt;br /&gt;
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]&lt;br /&gt;
*[https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]&lt;br /&gt;
&lt;br /&gt;
== Ideas for future talks ==&lt;br /&gt;
Here are some ideas for future talks. If you want to present one of these:&lt;br /&gt;
&lt;br /&gt;
A) Feel free to be as advanced as you like -- assume an audience of graduate students.&lt;br /&gt;
&lt;br /&gt;
but&lt;br /&gt;
&lt;br /&gt;
B) Don&#039;t feel pressured to go any faster than you want. If you think you can give a pedagogical &#039;for dummies&#039; talk in the course of an hour and a half, go for it!&lt;br /&gt;
&lt;br /&gt;
* Derive the capacity of a Hopfield net and understand this limitation intuitively&lt;br /&gt;
* Explain similarity/relationship/identity of Bayesian inference and maximum entropy formalism.&lt;br /&gt;
* Deep intuitive dive on Lagrangian duals and what they really do/mean in the context of statistical inference/machine learning/stat mech&lt;/div&gt;</summary>
		<author><name>73.15.212.163</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59539</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59539"/>
		<updated>2017-07-09T19:11:07Z</updated>

		<summary type="html">&lt;p&gt;73.15.212.163: /* Ideas for future talks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Schedule ==&lt;br /&gt;
=== 7/6/17 - Talk and Discussion: Steve Young - Boltzmann Machines and Statistical Mechanics. ===&lt;br /&gt;
&lt;br /&gt;
PREREADINGS: &lt;br /&gt;
&lt;br /&gt;
[https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 MacKay - Information Theory, Inference, and Learning Algorithms] Chapter 43 on the Boltzmann machine. Chapter 42 on Hopfield networks.&lt;br /&gt;
&lt;br /&gt;
[[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] Lecture notes from Hinton&#039;s Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos [https://www.coursera.org/learn/neural-networks here]. They&#039;re also on [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 YouTube].&lt;br /&gt;
&lt;br /&gt;
== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we&#039;ll cover only lightly.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are at upper-division to grad level in machine learning and statistical mechanics. To get something out of them, you should know:&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of [https://libgen.unblocked.srl/book/index.php?md5=79185948E755CC9B9611346E586CD050 D. Lay&#039;s book])&lt;br /&gt;
*single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart&#039;s textbook)&lt;br /&gt;
*basics of statistics, including Bayesian statistics&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s MIT lecture notes])&lt;br /&gt;
&lt;br /&gt;
There are plenty of other places to learn this stuff. For example, you can review your&lt;br /&gt;
probability, stats and linear algebra from chapters 2 and 3 of [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow].&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Check out these cool links:&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech / Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
* [http://www.lps.ens.fr/~krzakala/LESHOUCHES2013/home.htm Les Houches] 2013 school on Statistical physics, Optimization, Inference and Message-Passing algorithms. Contains links to papers/talks.&lt;br /&gt;
* [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Videos] from Geoff Hinton&#039;s neural net course on Coursera.&lt;br /&gt;
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we&#039;re trying to build.&lt;br /&gt;
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
=== High Level Overviews ===&lt;br /&gt;
&lt;br /&gt;
Good large-scale overview of why the stat mech side is important&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;br /&gt;
&lt;br /&gt;
=== Interesting papers ===&lt;br /&gt;
&lt;br /&gt;
* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]&lt;br /&gt;
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]&lt;br /&gt;
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]&lt;br /&gt;
&lt;br /&gt;
== Books ==&lt;br /&gt;
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. These are the mirrors I use: [https://libgen.unblocked.srl/], [https://libgen.unblocked.li/scimag].&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang&#039;s] text is the bronze standard for grad level stat mech.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler&#039;s] text is supposedly great for stat mech, although I haven&#039;t read it.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven&#039;t looked at this yet, but it seems promising.&lt;br /&gt;
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]&lt;br /&gt;
*[https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]&lt;br /&gt;
&lt;br /&gt;
== Ideas for future talks ==&lt;br /&gt;
Here are some ideas for future talks. If you want to present one of these:&lt;br /&gt;
&lt;br /&gt;
A) Feel free to be as advanced as you like -- assume an audience of graduate students.&lt;br /&gt;
B) But don&#039;t feel pressured to go any faster than you want. If you think you can give a pedagogical &#039;for dummies&#039; talk in the course of an hour and a half, go for it!&lt;br /&gt;
&lt;br /&gt;
* Derive the capacity of a Hopfield net and understand this limitation intuitively&lt;br /&gt;
* Explain similarity/relationship/identity of Bayesian inference and maximum entropy formalism.&lt;br /&gt;
* Deep intuitive dive on Lagrangian duals and what they really do/mean in the context of statistical inference/machine learning/stat mech&lt;/div&gt;</summary>
		<author><name>73.15.212.163</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59524</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59524"/>
		<updated>2017-07-07T21:17:17Z</updated>

		<summary type="html">&lt;p&gt;73.15.212.163: /* UPDATE: FIRST MEETUP 7/6/17 AT 7PM */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== SCHEDULE ==&lt;br /&gt;
=== 7/6/17 - Talk and Discussion: Steve Young - Boltzmann Machines and Statistical Mechanics. ===&lt;br /&gt;
&lt;br /&gt;
PREREADINGS: &lt;br /&gt;
&lt;br /&gt;
[https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 MacKay - Information Theory, Inference, and Learning Algorithms] Chapter 43 on the Boltzmann machine. Chapter 42 on Hopfield networks.&lt;br /&gt;
&lt;br /&gt;
[[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] Lecture notes from Hinton&#039;s Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos [https://www.coursera.org/learn/neural-networks here]. They&#039;re also on [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 YouTube].&lt;br /&gt;
&lt;br /&gt;
== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we&#039;ll cover only lightly.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are at upper-division to grad level in machine learning and statistical mechanics. To get something out of them, you should know:&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of [https://libgen.unblocked.srl/book/index.php?md5=79185948E755CC9B9611346E586CD050 D. Lay&#039;s book])&lt;br /&gt;
*single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart&#039;s textbook)&lt;br /&gt;
*basics of statistics, including Bayesian statistics&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s MIT lecture notes])&lt;br /&gt;
&lt;br /&gt;
There are plenty of other places to learn this stuff. For example, you can review your&lt;br /&gt;
probability, stats and linear algebra from chapters 2 and 3 of [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow].&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Check out these cool links:&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech / Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
* [http://www.lps.ens.fr/~krzakala/LESHOUCHES2013/home.htm Les Houches] 2013 school on Statistical physics, Optimization, Inference and Message-Passing algorithms. Contains links to papers/talks.&lt;br /&gt;
* [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Videos] from Geoff Hinton&#039;s neural net course on Coursera.&lt;br /&gt;
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we&#039;re trying to build.&lt;br /&gt;
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
=== High Level Overviews ===&lt;br /&gt;
&lt;br /&gt;
Good large-scale overview of why the stat mech side is important&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;br /&gt;
&lt;br /&gt;
=== Interesting papers ===&lt;br /&gt;
&lt;br /&gt;
* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]&lt;br /&gt;
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]&lt;br /&gt;
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]&lt;br /&gt;
&lt;br /&gt;
== Books ==&lt;br /&gt;
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. These are the mirrors I use: [https://libgen.unblocked.srl/], [https://libgen.unblocked.li/scimag].&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang&#039;s] text is the bronze standard for grad level stat mech.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler&#039;s] text is supposedly great for stat mech, although I haven&#039;t read it.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven&#039;t looked at this yet, but it seems promising.&lt;br /&gt;
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]&lt;br /&gt;
*[https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]&lt;/div&gt;</summary>
		<author><name>73.15.212.163</name></author>
	</entry>
</feed>