Hey Readers,
I used to write a lot about the Birch and Swinnerton-Dyer (BSD) conjecture and Yang-Mills theory (my favourite phrase being supersymmetric Yang-Mills, N = 4 -> N = 6) in this topological space called "Research Circles" (back in 2010). I also used to write a lot about different machine learning approaches, including neural networks and Geoff Hinton's computer vision models (CNNs and the like), as ways to attack these 2 problems, or rather as a different way to look at them and generalize over N.
A few updates on these 2 problems -
BSD Conjecture -
1. Machine learning invariants of arithmetic curves - https://www.sciencedirect.com/science/article/pii/S0747717122000839
Yang-Mills -
1. Neural quantum states for supersymmetric quantum gauge theories - https://ml4physicalsciences.github.io/2021/files/NeurIPS_ML4PS_2021_46.pdf
Relatedly, I used to write a lot about machine learning and quantum neural networks. This paper suggests that these supersymmetric matrix models could become challenging benchmark tasks for developing new neural-ansatz architectures and algorithms. On such benchmarks one could tune the difficulty by choosing the number of bosons (and even add fermions) and the gauge group, and some tasks also have exact results or analytical predictions (e.g. at very large N). At large N and strong coupling, the wave function should contain details of quantum gravity, so an efficient numerical method would be both powerful and exciting.
2. Preserving gauge invariance in neural networks - https://inspirehep.net/literature/1995502
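To make the neural-quantum-state idea concrete, here is a minimal variational Monte Carlo sketch. It is a toy, nowhere near the supersymmetric matrix models in the paper: the "ansatz" is a single-parameter Gaussian psi_a(x) = exp(-a x^2) standing in for a neural network, and the Hamiltonian is the 1D harmonic oscillator (exact ground-state energy 0.5). A real neural quantum state would replace the parameter `a` with network weights, but the sample-then-update loop has the same shape.

```python
import numpy as np

# Toy variational Monte Carlo (VMC) for the 1D harmonic oscillator,
# H = -1/2 d^2/dx^2 + 1/2 x^2, whose exact ground-state energy is 0.5.
# Trial wavefunction: psi_a(x) = exp(-a x^2), a stand-in for a neural ansatz.

rng = np.random.default_rng(0)

def local_energy(x, a):
    # E_L(x) = (H psi_a)(x) / psi_a(x), computed analytically for the Gaussian.
    return a - 2 * a**2 * x**2 + 0.5 * x**2

def sample(a, n=4000, step=1.0):
    # Metropolis sampling from the probability density |psi_a|^2 ~ exp(-2 a x^2).
    x, xs = 0.0, []
    for _ in range(n):
        xp = x + step * rng.standard_normal()
        if rng.random() < np.exp(-2 * a * (xp**2 - x**2)):
            x = xp
        xs.append(x)
    return np.array(xs[n // 10:])  # drop burn-in

a = 1.5  # deliberately bad starting point; the exact optimum is a = 0.5
for _ in range(60):
    xs = sample(a)
    eloc = local_energy(xs, a)
    dlnpsi = -xs**2  # d log psi_a / d a
    # Standard VMC gradient estimator: 2 Cov(E_L, d log psi / d a)
    grad = 2 * ((eloc * dlnpsi).mean() - eloc.mean() * dlnpsi.mean())
    a -= 0.1 * grad  # plain gradient descent on the variational energy

energy = local_energy(sample(a), a).mean()
print(f"a = {a:.3f}, energy = {energy:.3f}")  # should approach a = 0.5, E = 0.5
```

The nice property driving methods like this is the "zero-variance principle": as the ansatz approaches an exact eigenstate, the local energy becomes constant across samples, so the Monte Carlo noise shrinks near the solution. Scaling this loop to gauged matrix models (with gauge invariance built into the network, as in the second paper above) is exactly where the open research is.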