<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=72.182.125.7</id>
	<title>Noisebridge - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=72.182.125.7"/>
	<link rel="alternate" type="text/html" href="https://wiki.extremist.software/wiki/Special:Contributions/72.182.125.7"/>
	<updated>2026-04-05T13:41:16Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59978</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59978"/>
		<updated>2017-08-05T13:31:02Z</updated>

		<summary type="html">&lt;p&gt;72.182.125.7: /* 8/10/17 - Talk and Discussion: Dr. Sohaib Alam - Quantum Entanglement in Neural Network States */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
=== 8/10/17 - Informal Meetup ===&lt;br /&gt;
Bring papers you&#039;ve been reading, material/texts you&#039;ve been working through, questions to ask, and Things You Understand to teach.&lt;br /&gt;
&lt;br /&gt;
=== 7/6/17 - Talk and Discussion: Dr. Steve Young - Boltzmann Machines and Statistical Mechanics. ===&lt;br /&gt;
&lt;br /&gt;
PREREADINGS: &lt;br /&gt;
&lt;br /&gt;
[https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 MacKay - Information Theory, Inference, and Learning Algorithms] Chapter 43 on the Boltzmann machine. Chapter 42 on Hopfield networks.&lt;br /&gt;
&lt;br /&gt;
[[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] Lecture notes from Hinton&#039;s Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos [https://www.coursera.org/learn/neural-networks here]. They&#039;re also on [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 YouTube].&lt;br /&gt;
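As a warm-up for the Hopfield chapter, here is a minimal sketch of Hebbian storage and recall (assumes NumPy; the network size, pattern count, and noise level are invented toy values, not taken from MacKay):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                       # neurons, stored patterns (toy sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights: W = (1/N) sum_p x_p x_p^T, with zero self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    """Synchronous sign updates until a fixed point or the step limit."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1           # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Flip 10 of the 100 bits in the first pattern, then try to recover it
noisy = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
noisy[flip] = -noisy[flip]
recovered = recall(noisy)
overlap = float(np.mean(recovered == patterns[0]))
```

At this load (5 patterns in 100 neurons, well under the ~0.14N capacity MacKay derives) the corrupted pattern should relax back to the stored one; pushing P up toward and past that bound is a nice way to see the capacity limit empirically.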
&lt;br /&gt;
== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we&#039;ll cover only lightly.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are at upper division to grad level in machine learning and statistical mechanics. To be able to get something out of them, you should know&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of [https://libgen.unblocked.srl/book/index.php?md5=79185948E755CC9B9611346E586CD050 D. Lay&#039;s book])&lt;br /&gt;
*single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart&#039;s textbook)&lt;br /&gt;
*basics of statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
There are plenty of other places to learn this stuff. For example, you can review your&lt;br /&gt;
probability, stats and linear algebra from chapters 2 and 3 of [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow].&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Check out these cool links&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech / Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
* [https://libgen.unblocked.pub/book/index.php?md5=2E486AA1F672242DAA3D0B6116450D78 Proceedings] of the [http://www.lps.ens.fr/~krzakala/LESHOUCHES2013/home.htm Les Houches 2013 school] on Statistical physics, Optimization, Inference and Message-Passing algorithms. Fairly advanced. &lt;br /&gt;
* [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Videos] from Geoff Hinton&#039;s neural net course on Coursera.&lt;br /&gt;
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we&#039;re trying to build.&lt;br /&gt;
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy&lt;br /&gt;
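The maximum-entropy intro above pairs well with a tiny numerical check (a sketch, assuming NumPy/SciPy; the energy levels and the mean-energy constraint are invented for illustration): maximizing entropy subject to a fixed mean energy gives the Boltzmann form p_i proportional to exp(-lam * E_i), and the Lagrange multiplier lam is pinned down by the constraint.

```python
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])   # toy energy levels (assumption)
target_mean = 1.2                     # the constraint we impose

def mean_energy(lam):
    # Boltzmann weights for a given multiplier, normalized to a distribution
    w = np.exp(-lam * E)
    p = w / w.sum()
    return p @ E

# Root-find the multiplier so the mean-energy constraint is satisfied
lam = brentq(lambda l: mean_energy(l) - target_mean, -10.0, 10.0)
w = np.exp(-lam * E)
p = w / w.sum()
```

This is the same Lagrange-multiplier mechanics that shows up in the exponential-family blog post: the constraint function you fix determines the sufficient statistic, and the multiplier plays the role of the natural parameter (here, an inverse temperature).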
&lt;br /&gt;
These are my ongoing personal written working notes. They are a mess, but you can at least use them to see what I&#039;m working on.&lt;br /&gt;
&lt;br /&gt;
* [https://1drv.ms/o/s!AmTN0QVCYp0Og-BvczbkIA4xN1FfKg Steve&#039;s AI Learning Notes (read only, so people don&#039;t draw butts or flowers all over everything)]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
=== Overviews ===&lt;br /&gt;
&lt;br /&gt;
Good large scale overview of why the stat mech side is important&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;br /&gt;
&lt;br /&gt;
* Zdeborová et al. - Statistical physics of inference: thresholds and algorithms - [https://arxiv.org/abs/1511.02476 arXiv:1511.02476]&lt;br /&gt;
&lt;br /&gt;
=== Interesting papers ===&lt;br /&gt;
&lt;br /&gt;
* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]&lt;br /&gt;
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]&lt;br /&gt;
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]&lt;br /&gt;
* Deng et al. - Quantum Entanglement in Neural Network States [https://arxiv.org/abs/1701.04844 arXiv:1701.04844]&lt;br /&gt;
&lt;br /&gt;
== Books ==&lt;br /&gt;
&lt;br /&gt;
=== IMPORTANT NOTICE ON PIRACY AND INTELLECTUAL PROPERTY ===&lt;br /&gt;
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. I use these links for books and articles: [https://libgen.unblocked.srl/],  [https://libgen.unblocked.li/scimag].&lt;br /&gt;
&lt;br /&gt;
=== Stat Mech ===&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang&#039;s] text is the bronze standard for grad level stat mech.&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler&#039;s] text is supposedly great for stat mech, although I haven&#039;t read it.&lt;br /&gt;
&lt;br /&gt;
=== Stat Mech and Stat Inference ===&lt;br /&gt;
* [https://libgen.unblocked.pub/book/index.php?md5=4CFDD37EEDB946D6E944750F746DB72B Bishop - Pattern Recognition and Machine Learning] Great pedagogical introduction to the basics. Good treatment of exponential family.&lt;br /&gt;
&lt;br /&gt;
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven&#039;t looked at this yet, but it seems promising.&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=0AF108CB2FEFA5A20F7B186BC2C88656 Jaynes - Probability Theory, The Logic of Science] An excellent introduction to Bayesian reasoning and probability theory in general, from a very pedagogical, opinionated point-of-view. Dives into the motivations behind some information theory as well.&lt;br /&gt;
&lt;br /&gt;
== Ideas for future talks ==&lt;br /&gt;
Here are some ideas for future talks. If you want to present one of these,&lt;br /&gt;
&lt;br /&gt;
A) Feel free to be as advanced as you like -- assume an audience of graduate students.&lt;br /&gt;
&lt;br /&gt;
but&lt;br /&gt;
&lt;br /&gt;
B) Don&#039;t feel pressured to go any faster than you want. If you think you can give a pedagogical &#039;for dummies&#039; talk in the course of an hour and a half, go for it!&lt;br /&gt;
&lt;br /&gt;
* Derive capacity of Hopfield net and understand this limitation intuitively&lt;br /&gt;
* Explain similarity/relationship/identity of Bayesian inference and maximum entropy formalism.&lt;br /&gt;
* Deep intuitive dive on Lagrangian duals and what they really do/mean in the context of statistical inference/machine learning/stat mech&lt;/div&gt;</summary>
		<author><name>72.182.125.7</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59945</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59945"/>
		<updated>2017-08-02T07:19:29Z</updated>

		<summary type="html">&lt;p&gt;72.182.125.7: /* 8/10/17 - Talk and Discussion: Dr. Sohaib Alam - Quantum Entanglement in Neural Network States */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
=== 8/10/17 - Talk and Discussion: Dr. Sohaib Alam - Quantum Entanglement in Neural Network States ===&lt;br /&gt;
Dr. Alam will give an overview of [https://arxiv.org/abs/1701.04844 arXiv:1701.04844]. This meeting will have the feel of a journal club session, rather than a class.&lt;br /&gt;
&lt;br /&gt;
=== 7/6/17 - Talk and Discussion: Dr. Steve Young - Boltzmann Machines and Statistical Mechanics. ===&lt;br /&gt;
&lt;br /&gt;
PREREADINGS: &lt;br /&gt;
&lt;br /&gt;
[https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 MacKay - Information Theory, Inference, and Learning Algorithms] Chapter 43 on the Boltzmann machine. Chapter 42 on Hopfield networks.&lt;br /&gt;
&lt;br /&gt;
[[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] Lecture notes from Hinton&#039;s Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos [https://www.coursera.org/learn/neural-networks here]. They&#039;re also on [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 YouTube].&lt;br /&gt;
&lt;br /&gt;
== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we&#039;ll cover only lightly.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are at upper division to grad level in machine learning and statistical mechanics. To be able to get something out of them, you should know&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of [https://libgen.unblocked.srl/book/index.php?md5=79185948E755CC9B9611346E586CD050 D. Lay&#039;s book])&lt;br /&gt;
*single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart&#039;s textbook)&lt;br /&gt;
*basics of statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
There are plenty of other places to learn this stuff. For example, you can review your&lt;br /&gt;
probability, stats and linear algebra from chapters 2 and 3 of [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow].&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Check out these cool links&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech / Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
* [https://libgen.unblocked.pub/book/index.php?md5=2E486AA1F672242DAA3D0B6116450D78 Proceedings] of the [http://www.lps.ens.fr/~krzakala/LESHOUCHES2013/home.htm Les Houches 2013 school] on Statistical physics, Optimization, Inference and Message-Passing algorithms. Fairly advanced. &lt;br /&gt;
* [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Videos] from Geoff Hinton&#039;s neural net course on Coursera.&lt;br /&gt;
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we&#039;re trying to build.&lt;br /&gt;
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy&lt;br /&gt;
&lt;br /&gt;
These are my ongoing personal written working notes. They are a mess, but you can at least use them to see what I&#039;m working on.&lt;br /&gt;
&lt;br /&gt;
* [https://1drv.ms/o/s!AmTN0QVCYp0Og-BvczbkIA4xN1FfKg Steve&#039;s AI Learning Notes (read only, so people don&#039;t draw butts or flowers all over everything)]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
=== Overviews ===&lt;br /&gt;
&lt;br /&gt;
Good large scale overview of why the stat mech side is important&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;br /&gt;
&lt;br /&gt;
* Zdeborová et al. - Statistical physics of inference: thresholds and algorithms - [https://arxiv.org/abs/1511.02476 arXiv:1511.02476]&lt;br /&gt;
&lt;br /&gt;
=== Interesting papers ===&lt;br /&gt;
&lt;br /&gt;
* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]&lt;br /&gt;
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]&lt;br /&gt;
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]&lt;br /&gt;
* Deng et al. - Quantum Entanglement in Neural Network States [https://arxiv.org/abs/1701.04844 arXiv:1701.04844]&lt;br /&gt;
&lt;br /&gt;
== Books ==&lt;br /&gt;
&lt;br /&gt;
=== IMPORTANT NOTICE ON PIRACY AND INTELLECTUAL PROPERTY ===&lt;br /&gt;
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. I use these links for books and articles: [https://libgen.unblocked.srl/],  [https://libgen.unblocked.li/scimag].&lt;br /&gt;
&lt;br /&gt;
=== Stat Mech ===&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang&#039;s] text is the bronze standard for grad level stat mech.&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler&#039;s] text is supposedly great for stat mech, although I haven&#039;t read it.&lt;br /&gt;
&lt;br /&gt;
=== Stat Mech and Stat Inference ===&lt;br /&gt;
* [https://libgen.unblocked.pub/book/index.php?md5=4CFDD37EEDB946D6E944750F746DB72B Bishop - Pattern Recognition and Machine Learning] Great pedagogical introduction to the basics. Good treatment of exponential family.&lt;br /&gt;
&lt;br /&gt;
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven&#039;t looked at this yet, but it seems promising.&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=0AF108CB2FEFA5A20F7B186BC2C88656 Jaynes - Probability Theory, The Logic of Science] An excellent introduction to Bayesian reasoning and probability theory in general, from a very pedagogical, opinionated point-of-view. Dives into the motivations behind some information theory as well.&lt;br /&gt;
&lt;br /&gt;
== Ideas for future talks ==&lt;br /&gt;
Here are some ideas for future talks. If you want to present one of these,&lt;br /&gt;
&lt;br /&gt;
A) Feel free to be as advanced as you like -- assume an audience of graduate students.&lt;br /&gt;
&lt;br /&gt;
but&lt;br /&gt;
&lt;br /&gt;
B) Don&#039;t feel pressured to go any faster than you want. If you think you can give a pedagogical &#039;for dummies&#039; talk in the course of an hour and a half, go for it!&lt;br /&gt;
&lt;br /&gt;
* Derive capacity of Hopfield net and understand this limitation intuitively&lt;br /&gt;
* Explain similarity/relationship/identity of Bayesian inference and maximum entropy formalism.&lt;br /&gt;
* Deep intuitive dive on Lagrangian duals and what they really do/mean in the context of statistical inference/machine learning/stat mech&lt;/div&gt;</summary>
		<author><name>72.182.125.7</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59827</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=59827"/>
		<updated>2017-07-27T04:42:19Z</updated>

		<summary type="html">&lt;p&gt;72.182.125.7: /* 8/10/17 - Talk and Discussion: Dr. Sohaib Alam - Quantum Entanglement in Neural Network States */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
=== 8/10/17 - Talk and Discussion: Dr. Sohaib Alam - Quantum Entanglement in Neural Network States ===&lt;br /&gt;
Dr. Alam will give an overview of [https://arxiv.org/abs/1701.04844 arXiv:1701.04844]. This meeting will have the feel of a journal club session, rather than a class.&lt;br /&gt;
&lt;br /&gt;
=== 7/6/17 - Talk and Discussion: Steve Young - Boltzmann Machines and Statistical Mechanics. ===&lt;br /&gt;
&lt;br /&gt;
PREREADINGS: &lt;br /&gt;
&lt;br /&gt;
[https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 MacKay - Information Theory, Inference, and Learning Algorithms] Chapter 43 on the Boltzmann machine. Chapter 42 on Hopfield networks.&lt;br /&gt;
&lt;br /&gt;
[[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] Lecture notes from Hinton&#039;s Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos [https://www.coursera.org/learn/neural-networks here]. They&#039;re also on [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 YouTube].&lt;br /&gt;
&lt;br /&gt;
== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we&#039;ll cover only lightly.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are at upper division to grad level in machine learning and statistical mechanics. To be able to get something out of them, you should know&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of [https://libgen.unblocked.srl/book/index.php?md5=79185948E755CC9B9611346E586CD050 D. Lay&#039;s book])&lt;br /&gt;
*single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart&#039;s textbook)&lt;br /&gt;
*basics of statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
There are plenty of other places to learn this stuff. For example, you can review your&lt;br /&gt;
probability, stats and linear algebra from chapters 2 and 3 of [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow].&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Check out these cool links&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech / Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
* [https://libgen.unblocked.pub/book/index.php?md5=2E486AA1F672242DAA3D0B6116450D78 Proceedings] of the [http://www.lps.ens.fr/~krzakala/LESHOUCHES2013/home.htm Les Houches 2013 school] on Statistical physics, Optimization, Inference and Message-Passing algorithms. Fairly advanced. &lt;br /&gt;
* [https://www.youtube.com/playlist?list=PLoRl3Ht4JOcdU872GhiYWf6jwrk_SNhz9 Videos] from Geoff Hinton&#039;s neural net course on Coursera.&lt;br /&gt;
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we&#039;re trying to build.&lt;br /&gt;
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy&lt;br /&gt;
&lt;br /&gt;
These are my ongoing personal written working notes. They are a mess, but you can at least use them to see what I&#039;m working on.&lt;br /&gt;
&lt;br /&gt;
* [https://1drv.ms/o/s!AmTN0QVCYp0Og-BvczbkIA4xN1FfKg Steve&#039;s AI Learning Notes]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
=== Overviews ===&lt;br /&gt;
&lt;br /&gt;
Good large scale overview of why the stat mech side is important&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;br /&gt;
&lt;br /&gt;
* Zdeborová et al. - Statistical physics of inference: thresholds and algorithms - [https://arxiv.org/abs/1511.02476 arXiv:1511.02476]&lt;br /&gt;
&lt;br /&gt;
=== Interesting papers ===&lt;br /&gt;
&lt;br /&gt;
* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]&lt;br /&gt;
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]&lt;br /&gt;
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]&lt;br /&gt;
&lt;br /&gt;
== Books ==&lt;br /&gt;
&lt;br /&gt;
=== IMPORTANT NOTICE ON PIRACY AND INTELLECTUAL PROPERTY ===&lt;br /&gt;
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. I use these links for books and articles: [https://libgen.unblocked.srl/],  [https://libgen.unblocked.li/scimag].&lt;br /&gt;
&lt;br /&gt;
=== Stat Mech ===&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang&#039;s] text is the bronze standard for grad level stat mech.&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler&#039;s] text is supposedly great for stat mech, although I haven&#039;t read it.&lt;br /&gt;
&lt;br /&gt;
=== Stat Mech and Stat Inference ===&lt;br /&gt;
* [https://libgen.unblocked.pub/book/index.php?md5=4CFDD37EEDB946D6E944750F746DB72B Bishop - Pattern Recognition and Machine Learning] Great pedagogical introduction to the basics. Good treatment of exponential family.&lt;br /&gt;
&lt;br /&gt;
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven&#039;t looked at this yet, but it seems promising.&lt;br /&gt;
&lt;br /&gt;
* [https://libgen.unblocked.srl/book/index.php?md5=0AF108CB2FEFA5A20F7B186BC2C88656 Jaynes - Probability Theory, The Logic of Science] An excellent introduction to Bayesian reasoning and probability theory in general, from a very pedagogical, opinionated point-of-view. Dives into the motivations behind some information theory as well.&lt;br /&gt;
&lt;br /&gt;
== Ideas for future talks ==&lt;br /&gt;
Here are some ideas for future talks. If you want to present one of these,&lt;br /&gt;
&lt;br /&gt;
A) Feel free to be as advanced as you like -- assume an audience of graduate students.&lt;br /&gt;
&lt;br /&gt;
but&lt;br /&gt;
&lt;br /&gt;
B) Don&#039;t feel pressured to go any faster than you want. If you think you can give a pedagogical &#039;for dummies&#039; talk in the course of an hour and a half, go for it!&lt;br /&gt;
&lt;br /&gt;
* Derive capacity of Hopfield net and understand this limitation intuitively&lt;br /&gt;
* Explain similarity/relationship/identity of Bayesian inference and maximum entropy formalism.&lt;br /&gt;
* Deep intuitive dive on Lagrangian duals and what they really do/mean in the context of statistical inference/machine learning/stat mech&lt;/div&gt;</summary>
		<author><name>72.182.125.7</name></author>
	</entry>
</feed>