<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=72.182.43.214</id>
	<title>Noisebridge - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=72.182.43.214"/>
	<link rel="alternate" type="text/html" href="https://wiki.extremist.software/wiki/Special:Contributions/72.182.43.214"/>
	<updated>2026-04-06T08:55:52Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58812</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58812"/>
		<updated>2017-06-06T03:29:40Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Prerequisites */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are roughly at the upper-division to graduate level in machine learning and statistical mechanics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech//Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58811</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58811"/>
		<updated>2017-06-06T03:27:06Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Links */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are roughly at the graduate level in machine learning and statistical mechanics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* A great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* Anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* The venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech//Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58810</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58810"/>
		<updated>2017-06-06T03:26:43Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Links */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are roughly at the graduate level in machine learning and statistical mechanics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech//Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58809</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58809"/>
		<updated>2017-06-06T03:26:23Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Links */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are roughly at the graduate level in machine learning and statistical mechanics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
* Stat Mech//Machine Learning conference 2017 at Berkeley: [https://smml.io/ smml:2017]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58808</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58808"/>
		<updated>2017-06-06T03:15:58Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Prerequisites */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are ~at graduate level in machine learning and statistical mechanics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58807</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58807"/>
		<updated>2017-06-06T03:09:05Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Papers */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at graduate level in machine learning and theoretical physics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
Less emphasis on the physics, more emphasis on the stat mech &amp;lt;-&amp;gt; statistical inference&lt;br /&gt;
connection.&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58806</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58806"/>
		<updated>2017-06-06T03:07:00Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Papers */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at graduate level in machine learning and theoretical physics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
More stuff&lt;br /&gt;
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0190v1 arXiv:1311.0190v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58805</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58805"/>
		<updated>2017-06-06T03:04:55Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Papers */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at graduate level in machine learning and theoretical physics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
&lt;br /&gt;
More stuff&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58804</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58804"/>
		<updated>2017-06-06T03:03:57Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* What */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be covering, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at graduate level in machine learning and theoretical physics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
A good large-scale overview of why the stat mech side is important.&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58803</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58803"/>
		<updated>2017-06-06T02:47:29Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Papers */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be learning, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at graduate level in machine learning and theoretical physics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
* Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;br /&gt;
A good large-scale overview of why the stat mech side is important.&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58802</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58802"/>
		<updated>2017-06-06T02:47:01Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Papers */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be learning, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at graduate level in machine learning and theoretical physics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58801</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58801"/>
		<updated>2017-06-06T02:46:34Z</updated>

		<summary type="html">&lt;p&gt;72.182.43.214: /* Papers */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we&#039;ll be learning, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at graduate level in machine learning and theoretical physics. To be able to get something out of them, you should have at least undergrad proficiency in&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links (you can use these to figure out what to study to get up to speed)&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] at the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
A good large-scale overview of why the stat mech side is important:&lt;br /&gt;
Advani et al. - Stat mech of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115v1 arXiv:1301.7115v1]&lt;/div&gt;</summary>
		<author><name>72.182.43.214</name></author>
	</entry>
</feed>