<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.extremist.software/index.php?action=history&amp;feed=atom&amp;title=Deepnet</id>
	<title>Deepnet - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.extremist.software/index.php?action=history&amp;feed=atom&amp;title=Deepnet"/>
	<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=Deepnet&amp;action=history"/>
	<updated>2026-04-05T02:24:50Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=Deepnet&amp;diff=59525&amp;oldid=prev</id>
		<title>Fineline: Created page with &quot; == SCHEDULE == === 7/6/17 - Talk and Discussion: Steve Young - Boltzmann Machines and Statistical Mechanics. ===  PREREADINGS:   [https://libgen.unblocked.srl/book/index.php?...&quot;</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=Deepnet&amp;diff=59525&amp;oldid=prev"/>
		<updated>2017-07-07T22:27:57Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot; == SCHEDULE == === 7/6/17 - Talk and Discussion: Steve Young - Boltzmann Machines and Statistical Mechanics. ===  PREREADINGS:   [https://libgen.unblocked.srl/book/index.php?...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&lt;br /&gt;
== SCHEDULE ==&lt;br /&gt;
=== 7/6/17 - Talk and Discussion: Steve Young - Boltzmann Machines and Statistical Mechanics. ===&lt;br /&gt;
&lt;br /&gt;
PREREADINGS: &lt;br /&gt;
&lt;br /&gt;
[https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 MacKay - Information Theory, Inference, and Learning Algorithms] Chapter 43 on the Boltzmann machine. Chapter 42 on Hopfield networks.&lt;br /&gt;
&lt;br /&gt;
[[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] Lecture notes from Hinton&amp;#039;s Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos [https://www.coursera.org/learn/neural-networks here]. They&amp;#039;re also on [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Youtube].&lt;br /&gt;
&lt;br /&gt;
== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&amp;#039;re focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we&amp;#039;ll cover only lightly.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are at the upper-division to graduate level in machine learning and statistical mechanics. To get something out of them, you should know:&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of [https://libgen.unblocked.srl/book/index.php?md5=79185948E755CC9B9611346E586CD050 D. Lay&amp;#039;s book])&lt;br /&gt;
*single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart&amp;#039;s textbook). &lt;br /&gt;
*basics of statistics, including Bayesian statistics&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&amp;#039;s MIT lecture notes])&lt;br /&gt;
&lt;br /&gt;
There are plenty of other places to learn this material. For example, you can review your&lt;br /&gt;
probability, statistics, and linear algebra from chapters 2 and 3 of [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow].&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Check out these cool links&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&amp;#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] or his [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] is worth a look as well.&lt;br /&gt;
* [https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&amp;#039;s blog]&lt;br /&gt;
* Stat Mech / Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
* [http://www.lps.ens.fr/~krzakala/LESHOUCHES2013/home.htm Les Houches] 2013 school on Statistical physics, Optimization, Inference and Message-Passing algorithms. Contains links to papers/talks.&lt;br /&gt;
* [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Videos] from Geoff Hinton&amp;#039;s neural net course on Coursera.&lt;br /&gt;
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we&amp;#039;re trying to build.&lt;br /&gt;
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
=== High Level Overviews ===&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;br /&gt;
&lt;br /&gt;
=== Interesting papers ===&lt;br /&gt;
&lt;br /&gt;
* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]&lt;br /&gt;
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]&lt;br /&gt;
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]&lt;br /&gt;
&lt;br /&gt;
== Books ==&lt;br /&gt;
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. I use these links for books and articles: [https://libgen.unblocked.srl/],  [https://libgen.unblocked.li/scimag].&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang&amp;#039;s] text is the bronze standard for grad level stat mech.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler&amp;#039;s] text is supposedly great for stat mech, although I haven&amp;#039;t read it.&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven&amp;#039;t looked at this yet, but it seems promising.&lt;br /&gt;
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]&lt;/div&gt;</summary>
		<author><name>Fineline</name></author>
	</entry>
</feed>