<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=107.77.213.232</id>
	<title>Noisebridge - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=107.77.213.232"/>
	<link rel="alternate" type="text/html" href="https://wiki.extremist.software/wiki/Special:Contributions/107.77.213.232"/>
	<updated>2026-04-06T16:53:21Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58775</id>
		<title>NBDSM</title>
		<link rel="alternate" type="text/html" href="https://wiki.extremist.software/index.php?title=NBDSM&amp;diff=58775"/>
		<updated>2017-06-04T03:01:12Z</updated>

		<summary type="html">&lt;p&gt;107.77.213.232: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What ==&lt;br /&gt;
&lt;br /&gt;
nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].&lt;br /&gt;
&lt;br /&gt;
We&#039;re focused on theory. Implementation is fun too, but it draws on a set of skills mostly orthogonal to what we&#039;ll be learning, so our focus on it will be light.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
Our discussions are usually at the graduate level in machine learning and theoretical physics. To get something out of them, you should have at least undergraduate proficiency in:&lt;br /&gt;
&lt;br /&gt;
*linear algebra (at the level of D. Lay&#039;s book)&lt;br /&gt;
*single- and multi-variable calculus, and vector calculus (all of Stewart)&lt;br /&gt;
*statistics, including Bayesian inference&lt;br /&gt;
*statistical mechanics (at the level of [http://mcgreevy.physics.ucsd.edu/s12/index.html McGreevy&#039;s MIT lecture notes])&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
Here are some cool links; you can use these to figure out what to study to get up to speed.&lt;br /&gt;
&lt;br /&gt;
* a great [http://videolectures.net/deeplearning2016_ganguli_theoretical_neuroscience/ talk] by Ganguli at [https://sites.google.com/site/deeplearningsummerschool2016/ last year&#039;s deep learning summer school] in Montreal.&lt;br /&gt;
* anything recent by [https://arxiv.org/find/stat/1/au:+Ganguli_S/0/1/0/all/0/1 Ganguli] and the [https://ganguli-gang.stanford.edu/ Neural Dynamics and Computation Lab] is worth reading as well.&lt;br /&gt;
*[https://calculatedcontent.com/ Calculated Content]&lt;br /&gt;
* the venerable [http://colah.github.io/ colah&#039;s blog]&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
A good large-scale overview of why the stat mech side is important&lt;/div&gt;</summary>
		<author><name>107.77.213.232</name></author>
	</entry>
</feed>