Hi Readers,
I had a super-busy day at work yesterday and ended up writing a blog post at 1 AM. But today I have taken leave to spend some time chilling out.
--- One of the reasons I speak about Mirror Symmetry and Maryam Mirzakhani is this Forbes article: https://www.forbes.com/sites/startswithabang/2017/08/01/maryam-mirzakhani-a-candle-illuminating-the-dark/?sh=5aa7449836c1
Sometimes Forbes writes really cool articles, right? I hope they get the right influencers to spread awareness of such brilliant articles through influencer marketing, while understanding stubborn and non-stubborn players in social networks with opinion mining (Ref: https://arxiv.org/abs/1609.03465). It would be cool and optimal to spread awareness of math articles through math-based influencer marketing, right?
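(Just to make that aside a bit concrete: below is a minimal sketch of one standard way to model stubborn vs. non-stubborn players, plain opinion averaging where stubborn nodes never update. The model choice, the toy network, and all variable names are my own assumptions for illustration, not necessarily the model used in the referenced paper.)

import numpy as np

def step(opinions, adjacency, stubborn):
    # One averaging round: stubborn nodes keep their opinion,
    # non-stubborn nodes move to the mean opinion of their neighbours.
    new = opinions.copy()
    for i in range(len(opinions)):
        if not stubborn[i]:
            nbrs = np.nonzero(adjacency[i])[0]
            if len(nbrs) > 0:
                new[i] = opinions[nbrs].mean()
    return new

# Toy 4-node network: node 0 is a stubborn "influencer" holding opinion 1.0.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])
x = np.array([1.0, 0.0, 0.0, 0.0])
stubborn = np.array([True, False, False, False])
for _ in range(50):
    x = step(x, A, stubborn)
print(x)  # the non-stubborn opinions drift towards the stubborn node's opinion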
--- One of the brilliant papers co-authored by Maryam Mirzakhani is on volume growth on Teichmüller space, which speaks about the topological entropy of the Teichmüller geodesic flow (September 20, 2011: Lattice point asymptotics and volume growth on Teichmüller space, by Jayadev Athreya, Alexander Bufetov, Alex Eskin, and Maryam Mirzakhani). Kind of a cool paper to read before studying Geodesic CNNs (Geodesic Convolutional Neural Networks) a bit more...
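If I remember the result correctly (this is from memory, so treat it as a sketch rather than the paper's exact statement), the headline asymptotic looks roughly like this, where h is the topological entropy of the Teichmüller geodesic flow for genus-g surfaces with n punctures and \Lambda(X) is a factor depending on the centre point X:

% From memory; constants and normalisation may differ from the paper.
\[
  \mathrm{Vol}\bigl(B_R(X)\bigr) \;\sim\; \Lambda(X)\, e^{hR}
  \quad \text{as } R \to \infty,
  \qquad h = 6g - 6 + 2n .
\]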
--- The paper Hyperbolic spaces in Teichmüller Space by Christopher J. Leininger and Saul Schleimer explains how hyperbolic space almost-isometrically embeds into Teichmüller space (https://arxiv.org/abs/1110.6526), kind of a very smart inspiration from Mirzakhani's paper..
--- This is further adopted in Hyperbolic Neural Networks by Octavian-Eugen Ganea et al. (https://arxiv.org/abs/1805.09112), which focuses on hyperbolic embeddings: Möbius gyrovector spaces combined with the Riemannian geometry of the Poincaré model of hyperbolic spaces, which helps to embed sequential data and perform classification in hyperbolic space (see further Poincaré GloVe as well, and HyperE at Stanford).
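For a feel of what those Poincaré-ball operations actually compute, here is a minimal NumPy sketch (written from memory) of Möbius addition, the exponential map at the origin, and the geodesic distance that the paper's layers build on. The function names and the default curvature c = 1 are my own choices, not the paper's code.

import numpy as np

def mobius_add(x, y, c=1.0):
    # Möbius addition in the Poincaré ball with curvature parameter c.
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + (c ** 2) * x2 * y2
    return num / den

def expmap0(v, c=1.0):
    # Exponential map at the origin: maps a Euclidean (tangent) vector into the ball.
    norm = np.linalg.norm(v)
    if norm == 0:
        return v
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def poincare_dist(x, y, c=1.0):
    # Geodesic distance between two points of the Poincaré ball.
    diff = mobius_add(-x, y, c)
    return (2.0 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * np.linalg.norm(diff))

# Usage: embed two Euclidean vectors into the ball and measure their hyperbolic distance.
a = expmap0(np.array([0.1, 0.2]))
b = expmap0(np.array([-0.3, 0.05]))
print(poincare_dist(a, b))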
While this paper cites Sanja Fidler's paper on Order Embeddings, one of the best papers that has recently come out is from WeChat: Fully Hyperbolic Neural Networks (https://arxiv.org/pdf/2105.14686.pdf), one of the smartest points in the design being the fully hyperbolic attention layer.
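My rough recollection of that design point, sketched below: attention scores come from the squared Lorentzian distance between queries and keys, and aggregation is a Lorentzian centroid, i.e. the weighted sum of value points renormalised back onto the hyperboloid. The exact score function, scaling, and parametrisation in the paper may well differ, so treat this as an assumption-laden toy version, not the authors' implementation.

import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def to_hyperboloid(v):
    # Lift a Euclidean vector onto the unit hyperboloid (curvature -1) by solving for x0.
    x0 = np.sqrt(1.0 + np.sum(v * v, axis=-1, keepdims=True))
    return np.concatenate([x0, v], axis=-1)

def lorentz_centroid(weights, points):
    # Weighted sum of hyperboloid points, renormalised so the result lies on the hyperboloid.
    s = weights @ points
    denom = np.sqrt(np.clip(-lorentz_inner(s, s), 1e-9, None))[..., None]
    return s / denom

def hyperbolic_attention(q, k, v):
    # Squared Lorentzian distance d_L^2(q, k) = -2 - 2<q, k>_L gives the scores,
    # softmax-style normalisation gives the weights, centroid aggregation gives the output.
    d2 = -2.0 - 2.0 * lorentz_inner(q[:, None, :], k[None, :, :])
    scores = np.exp(-d2 / np.sqrt(q.shape[-1]))
    weights = scores / scores.sum(axis=-1, keepdims=True)
    return lorentz_centroid(weights, v)

# Usage with small random points lifted onto the hyperboloid.
rng = np.random.default_rng(0)
q = to_hyperboloid(rng.normal(size=(2, 4)) * 0.1)
k = to_hyperboloid(rng.normal(size=(3, 4)) * 0.1)
v = to_hyperboloid(rng.normal(size=(3, 4)) * 0.1)
print(hyperbolic_attention(q, k, v))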