The following is a list of resources useful for continuing the model in the "Thought Curvature" paper:
-
Universal Intelligence: A Definition of Machine Intelligence
-
Exponential expressivity in deep neural networks through transient chaos (Concerns MFT in Riemannian geometry)
-
Relation between Mean Field Theory and Quantum Field Theory (Concerns MFT & QFT)
-
Supersymmetric Field-Theoretic Modelson a Supermanifold (Concerns QFT, in the regime of Physics)
-
Microtubules: Montroll's kink and Morse vibrations (Concerns QFT, in the regime of Cognitive Science)
-
Fluctuation-Dissipation Theorem and Models of Learning (Concerns QFT, in the regime of Biophysics)
-
Rethinking Neural Networks: Quantum Fields and Biological Data (Concerns QFT, in the regime of Biophysics)
-
Quantum Aspects of Semantic Analysis and Symbolic Artificial Intelligence (Concerns a single-section, four-paragraph thought experiment on "supersymmetric LSAs" (section "Supersymmetry and Dimensional Reduction", page 4, right-hand side), i.e. supersymmetry-based singular value decomposition, absent neural networks/gradient descent. Most of the paper otherwise focuses on comparisons between non-supersymmetric LSA/singular value decomposition, traditional deep neural networks, and quantum information theory. LSAs are described as typically interpretable, classical, non-layered statistical machine learning methods used for dimensionality reduction, i.e. reducing the dimension of word representations to help extract meaning/relationships; they have not been demonstrated to have the capacity to explicitly learn deep hierarchical representations. Neural networks, by contrast, are observed as brain-inspired multi-layer systems, typically complex, and explicitly designed to learn deep/complex hierarchical representations, in line with estimates/demonstrations that biological brains possess deep hierarchical learning capacities.)
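As a minimal sketch of the non-supersymmetric baseline described above, classical LSA reduces a term-document count matrix via truncated singular value decomposition. The matrix below is toy data invented for illustration; the variable names are mine, not from the paper.

```python
import numpy as np

# Hypothetical term-document count matrix (terms x documents); toy data.
X = np.array([
    [2, 0, 1, 0],   # "quantum"
    [1, 0, 2, 0],   # "field"
    [0, 3, 0, 1],   # "neural"
    [0, 1, 0, 2],   # "network"
], dtype=float)

# Classical LSA: truncated SVD keeping k latent "concept" dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # rank-k reconstruction

# Documents embedded in the reduced k-dimensional concept space.
doc_embeddings = np.diag(s[:k]) @ Vt[:k, :]
print(doc_embeddings.shape)  # (2, 4)
```

Documents close in this reduced space are treated as semantically related, which is the dimensionality-reduction behavior the annotation above contrasts with layered neural networks.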
-
An unsupervised algorithm for learning Lie group transformations (Concerns Lie Algebra, in the regime of Deep Learning)
-
Learning Unitary Operators with Help From u(n) (Concerns Lie Algebra, in the regime of Deep Learning)
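The core idea behind learning unitary operators from the Lie algebra u(n) can be sketched as follows: parameterize a skew-Hermitian generator with real numbers and exponentiate it, so every parameter setting yields an exactly unitary matrix. This is a minimal numpy-only illustration of that construction, not the paper's specific parameterization; the function name is mine.

```python
import numpy as np

def unitary_from_u_n(params, n):
    """Map real parameters to a unitary matrix via the Lie algebra u(n).

    u(n) consists of skew-Hermitian matrices A (A^dagger = -A), and
    exp(A) is then unitary. We build A = iH from a Hermitian H and
    exponentiate through H's eigendecomposition (no scipy needed).
    """
    M = params.reshape(n, n)
    H = (M + M.T) / 2 + 1j * (M - M.T) / 2  # Hermitian by construction
    w, V = np.linalg.eigh(H)
    return V @ np.diag(np.exp(1j * w)) @ V.conj().T  # exp(iH), unitary

rng = np.random.default_rng(0)
n = 3
U_mat = unitary_from_u_n(rng.normal(size=n * n), n)
print(np.allclose(U_mat @ U_mat.conj().T, np.eye(n)))  # True: U is unitary
```

Because the map is differentiable in the real parameters, gradient descent on them stays on the unitary group, which is the practical appeal of the u(n) approach.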
-
NYUSeminars, JHD, 2 February 2017 (Concerns Berezinians and Darboux Transformations on the superline)
-
Construction of Hamiltonians by machine learning of energy and entanglement spectra (Concerns Hamiltonian Construction on classical computers)
-
Diffeomorphisms and orthonormal frames (Concerns pseusdogroup spins)
-
The supersymmetric tensor hierarchy of N = 1, d = 4 supergravity (Concerns QFT, particularly supergravity theory on some quantum tensor formalism. This is not to be confused for Symmetric tensors as seen in old Higher Order Symmetric Tensor papers, that dont concern superspace, but falsely label said symmetric tensors as "supersymmetric tensors". Pertinently, see this paper, describing the phenomena of "super" tensor labelling errors. Notably, even recent higher order tensor papers, that likewise dont concern superspace are still invalidly commiting the "super" labelling error, as described in the error indicating paper prior cited).
-
Graded Lie Algebras, Supersymmetry, and Applications (Concerns graded lie alegras, on supersymmetric tensors)
-
Supersymmetry Groups (Concerns gauge symmetries and supergroups)
-
On the weak N-dependence of SO(N) and SU(N) gauge theories in 2+1 dimensions (Concerns relationships between gauge groups, showing that "low N pairs of SO(N) and SU(N′) theories are known to possess the same Lie algebras", including "SU(2) and SO(3)"...)
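The quoted SU(2)/SO(3) coincidence can be checked numerically: the standard generators of su(2) and so(3) satisfy the same commutation relations (the same structure constants). A quick sketch with conventional basis choices:

```python
import numpy as np

def comm(A, B):
    return A @ B - B @ A

# su(2) basis: T_a = -i * sigma_a / 2, satisfying [T_a, T_b] = eps_abc T_c.
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]]),
         np.array([[1, 0], [0, -1]], dtype=complex)]
T = [-1j * s / 2 for s in sigma]

# so(3) basis: 3x3 rotation generators with the same structure constants.
L = [np.array([[0, 0, 0], [0, 0, -1], [0, 1, 0]], dtype=float),
     np.array([[0, 0, 1], [0, 0, 0], [-1, 0, 0]], dtype=float),
     np.array([[0, -1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)]

# Both bases obey [X_1, X_2] = X_3 (and cyclic permutations).
print(np.allclose(comm(T[0], T[1]), T[2]))  # True
print(np.allclose(comm(L[0], L[1]), L[2]))  # True
```

Identical structure constants mean identical Lie algebras, even though SU(2) and SO(3) differ as groups (SU(2) double-covers SO(3)).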
-
Advances in quantum machine learning in 2016 and in early 2017 (Concerns hundreds of quantum machine learning papers, with sources ranging from classical algorithms to quantum algorithms)
-
Early Visual Concept Learning with Unsupervised Deep Learning (Concerns a model that learns some degrees of freedom in terms of the laws of physics, for subsequent use in an RL context)
-
N=1 SQCD and the Transverse Field Ising Model (Concerns Ising Super-Hamiltonian notation)
-
Quantum Boltzmann machine using a quantum annealer (Concerns an extended quantum Boltzmann machine)
-
Group representations (Concerns a colourful overview of group representations)
-
Complex-Supermanifolds (Concerns commutative complex superalgebras)
-
Information dynamics of neural networks with the aid of supersymmetry fields as the microscopic thermal flow (Concerns an intriguing SUSY-breaking dynamical analysis of neural networks)
-
Hyperspherical Parameterization of Unitary Matrices (Concerns parameterization of special unitary matrices)
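As a toy instance of angle-based (hyperspherical-style) parameterization of special unitary matrices, three angles suffice for the 2x2 case SU(2). This is an illustrative sketch only, not the paper's general-n construction; the function name is mine.

```python
import numpy as np

def su2_from_angles(theta, phi1, phi2):
    """Angle-based parameterization of SU(2).

    Three angles yield a 2x2 special unitary matrix: rows are
    orthonormal and det = cos^2(theta) + sin^2(theta) = 1.
    """
    return np.array([
        [np.cos(theta) * np.exp(1j * phi1),   np.sin(theta) * np.exp(1j * phi2)],
        [-np.sin(theta) * np.exp(-1j * phi2), np.cos(theta) * np.exp(-1j * phi1)],
    ])

U2 = su2_from_angles(0.3, 1.1, -0.7)
print(np.allclose(U2 @ U2.conj().T, np.eye(2)))  # True: unitary
print(np.isclose(np.linalg.det(U2), 1.0))        # True: special (det = 1)
```

The appeal of such parameterizations is that the angles are unconstrained real numbers, so any optimizer over them stays exactly on the group.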
-
A study on neural learning on manifold foliations: the case of the Lie group SU(3) (Concerns neural learning on manifold foliations: the case of the Lie group SU(3); although I am instead interested in SU(m|n), or some ringed representation, for supersymmetric expression. This study does not partition the solution space in terms of Supercharges, which is required for supersymmetric expression.)
-
Gaussian elimination in unitary groups with an application to cryptography (Concerns Gaussian elimination in the scope of the generalized special unitary group, with respect to cryptography.)
-
Statistical Inference and String Theory (Concerns a "speculative remark" regarding supersymmetric statistical inference)
-
Supersymmetric Nonlinear Sigma Models (Concerns a quite clear description of the Grassmannian manifold)
-
Supersymmetric theory of stochastic dynamics (Concerns a nonlinear supersymmetric sigma model, distinguished by a unique diffusion operator)
-
Out-of-equilibrium dynamical mean-field equations for the perceptron model (Concerns an evaluation of the quality of the Perceptron (very similar to Yoshitake's 2000 paper), with the aid of supersymmetry fields)
-
Quantum Machine Learning Collection (Concerns yet another quantum machine learning resource list.)
-
Digital Memcomputing: from Logic to Dynamics to Topology (Concerns yet another resource that deals with using SUSY to explain the operational efficiency of digital memcomputing devices in particular.)
-
Criticality or Supersymmetry Breaking? (Concerns yet another resource that deals with using SUSY to explain the dynamics of the biological brain in particular.)
-
Branes with Brains: Exploring String Vacua with Deep Reinforcement Learning (Concerns an intriguing paper that uses deep reinforcement learning to try to discover new geometries for string theory.)
-
Supersymmetric Generalizations of Matrix Models (Concerns supersymmetric description of matrix models)
-
This multiplet-related paper, namely the "Multiplet Neural Network", is intriguing and reasonably useful, because my "Supersymmetric Artificial Neural Network" hypothesis concerns a type of "Supermultiplet Artificial Neural Network". Nathan's multiplet paper, however, does not appear to be strongly related to group theory.
-
Could "Holomorphic Supercurve Learning" be generalized from non-supersymmetric holomorphic complex-valued neural networks?
- The answer is reasonably yes:
- This 2018 holomorphic deep learning paper reports Cauchy-Riemann-satisfying, gradient-descent-compatible holomorphism. (See also Deep Complex Networks.)
- Furthermore, this holomorphic supercurve paper from physics concerns a generalization from holomorphic curves (reasonably seen in (1)) to holomorphic supercurves related to supersymmetry!
- This super Cauchy-Riemann equation resource on page 33 of 124 seems useful.
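The Cauchy-Riemann condition invoked in the bullets above can be checked numerically: for a holomorphic f, the finite-difference derivative taken along the real axis must match the derivative taken along the imaginary axis (divided by i). A minimal sketch, using tanh as an example of a holomorphic activation and conjugation as a non-holomorphic counterexample (function name is mine):

```python
import numpy as np

def cauchy_riemann_residual(f, z, h=1e-6):
    """Numerically estimate |d f / d z-bar|; ~0 iff f is holomorphic at z.

    Holomorphy requires the derivative along the real axis to equal the
    derivative along the imaginary axis divided by i -- this is the
    Cauchy-Riemann equations in finite-difference disguise.
    """
    d_real = (f(z + h) - f(z - h)) / (2 * h)
    d_imag = (f(z + 1j * h) - f(z - 1j * h)) / (2j * h)
    return abs(d_real - d_imag)

z = 0.4 + 0.3j
print(cauchy_riemann_residual(np.tanh, z) < 1e-6)  # True: tanh is holomorphic
print(cauchy_riemann_residual(np.conj, z) < 1e-6)  # False: conj is not
```

The residual for conjugation comes out to exactly 2 (the two directional derivatives are +1 and -1), which is why non-holomorphic networks fall back on Wirtinger-style calculus for gradient descent.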