Unveiling the Kissing Number Mystery Through AI Innovation
In a remarkable fusion of artificial intelligence and pure mathematics, a collaborative team from Peking University, Fudan University, and the Shanghai Academy of AI for Science has shattered longstanding barriers in the kissing number problem. Using their reinforcement learning system, PackingStar, they have established new lower bounds for this elusive quantity in dimensions 25 through 31, as well as a significant advance in dimension 13.
The kissing number problem, denoted τ(n) or K(n), asks for the maximum number of equal, non-overlapping spheres that can simultaneously touch a central sphere of the same radius in n-dimensional Euclidean space. First debated by Isaac Newton and David Gregory in 1694, with Newton positing 12 for three dimensions and Gregory arguing for 13, the controversy was resolved only in 1953, when Newton's answer was proven correct.
Historical Context and Known Kissing Numbers
The problem's allure lies in its simplicity masking profound depth. In low dimensions:
- Dimension 2: 6 (hexagonal packing).
- Dimension 3: 12 (icosahedral arrangement, proven 1953).
- Dimension 4: 24 (proven 2003 by Oleg Musin).
- Dimension 8: 240 (E8 lattice).
- Dimension 24: 196,560 (Leech lattice).
Beyond these exceptional dimensions, exact values remain unknown, and progress has long stalled. The stall underscores the challenge: as n increases, the 'curse of dimensionality' explodes the search space, rendering brute-force enumeration impossible. Enter AI, trained to explore the irregular, non-lattice structures that often dominate optimal packings.
PackingStar: Revolutionizing Sphere Packing with Reinforcement Learning
PackingStar reimagines the kissing number problem as a cooperative two-player matrix-completion game on the Gram matrix, whose entries are the pairwise cosines between unit vectors pointing from the central sphere to the kissing spheres' centers. This sidesteps coordinate instability and enables GPU-accelerated computation.
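To make the Gram-matrix viewpoint concrete, here is a small illustrative sketch (not the authors' code): a set of unit vectors is a valid kissing configuration in R^n exactly when every pairwise cosine is at most 1/2, the Gram matrix is positive semidefinite, and its rank is at most n. The function name and the 2D example below are ours, for illustration only.

```python
import numpy as np

def is_valid_kissing_gram(G, n, tol=1e-9):
    """Check a candidate Gram matrix for a kissing configuration in R^n."""
    if not np.allclose(np.diag(G), 1.0):           # all vectors must be unit
        return False
    off = G[~np.eye(len(G), dtype=bool)]
    if np.any(off > 0.5 + tol):                    # 60-degree separation
        return False
    eigvals = np.linalg.eigvalsh(G)
    if eigvals[0] < -tol:                          # positive semidefinite
        return False
    return np.sum(eigvals > tol) <= n              # realizable in R^n

# Hexagonal kissing configuration in dimension 2: six unit vectors at
# 60-degree increments, so neighboring pairs have cosine exactly 1/2.
angles = np.arange(6) * np.pi / 3
U = np.stack([np.cos(angles), np.sin(angles)], axis=1)
G = U @ U.T
print(is_valid_kissing_gram(G, n=2))  # True: tau(2) = 6 is attained
```

Working directly with G rather than coordinates is what lets the search stay numerically stable and dimension-agnostic.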
Step-by-Step Process:
- Feature Identification: Simulate low-dimensional tangencies to derive discrete cosine sets C1 (e.g., dominant angles).
- Matrix Initialization: Start partial Gram matrix G(m0) from C1 or priors like Leech lattice slices.
- Player 1 (Filler): Uses Monte Carlo Tree Search (MCTS) to add entries g ∈ A(m), ensuring positive semidefiniteness (PSD) and full rank via Cholesky checks.
- Player 2 (Corrector): Neural policy selects indices I to prune suboptimal entries, extracting refined submatrix G[I,I].
- Decomposition & Restart: Distill final matrix into substructures (e.g., symmetric frames), seeding new games for parallel exploration.
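The steps above can be sketched in highly simplified form. In this sketch a greedy random filler stands in for MCTS, a crowding heuristic stands in for the learned corrector policy, and an eigenvalue test replaces the Cholesky check; the cosine set C1 is a hypothetical stand-in, not the paper's.

```python
import numpy as np

# Hypothetical discrete cosine set (all values <= 1/2, as kissing requires).
C1 = [-1.0, -0.5, -0.25, 0.0, 0.25, 0.5]
rng = np.random.default_rng(0)

def psd_rank_ok(G, n, tol=1e-9):
    """New entries must keep G positive semidefinite with rank at most n."""
    w = np.linalg.eigvalsh(G)
    return w[0] >= -tol and np.sum(w > tol) <= n

def filler_step(G, n, tries=200):
    """Player 1 (filler): try to append one sphere, i.e. one Gram row."""
    m = len(G)
    for _ in range(tries):
        row = rng.choice(C1, size=m)               # candidate cosines
        H = np.zeros((m + 1, m + 1))
        H[:m, :m], H[m, :m], H[:m, m], H[m, m] = G, row, row, 1.0
        if psd_rank_ok(H, n):
            return H
    return None                                    # no feasible extension found

def corrector_step(G):
    """Player 2 (corrector): drop the most 'crowded' sphere to unblock play."""
    crowd = np.abs(G).sum(axis=0)
    keep = np.delete(np.arange(len(G)), np.argmax(crowd))
    return G[np.ix_(keep, keep)]

def play(n, rounds=50):
    G = np.eye(1)                                  # start with a single sphere
    best = 1
    for _ in range(rounds):
        H = filler_step(G, n)
        if H is None:
            G = corrector_step(G)                  # prune, then retry filling
        else:
            G = H
            best = max(best, len(G))
    return best

best_found = play(n=2)
print(best_found)  # in dimension 2 the true answer is 6
```

The real system replaces both heuristics with trained policies and restarts games from distilled substructures, but the fill/prune rhythm is the same.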
Led by Chengdong Ma and Yaodong Yang at Peking University's AI Institute, with Yuan Qi at Fudan, the team trained the system from scratch, yielding over 6,000 novel structures.
Record-Breaking Results in High Dimensions
PackingStar's triumphs include:
| Dimension (n) | New Lower Bound | Previous | Improvement |
|---|---|---|---|
| 25 | 197,056 | 197,048 | +8 |
| 26 | 198,550 | 198,512 | +38 |
| 27 | 200,044 | 199,976 | +68 |
| 28 | 204,520 | 204,368 | +152 |
| 29 | 209,496 | 208,272 | +1,224 |
| 30 | 220,440 | 219,984 | +456 |
| 31 | 238,350 | 232,874 | +5,476 |
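The improvement column is simply the difference between the new and previous bounds; a quick script confirms the arithmetic:

```python
# New vs. previous lower bounds from the table above, dimensions 25-31.
bounds = {
    25: (197_056, 197_048),
    26: (198_550, 198_512),
    27: (200_044, 199_976),
    28: (204_520, 204_368),
    29: (209_496, 208_272),
    30: (220_440, 219_984),
    31: (238_350, 232_874),
}
improvements = {n: new - old for n, (new, old) in bounds.items()}
print(improvements)  # the largest single jump is +5,476, in dimension 31
```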
In dimension 13, the rational kissing number reaches 1,146 (previously 1,130), the first improvement over structures known since 1971. Generalized bounds under the constraint cosine ≤ 1/4 or 1/3 also improved, e.g., K(12, 1/4) = 81.
Read the full arXiv paper for configurations.
Novel Structures and Paradigm Challenges
Beyond the bounds themselves, PackingStar uncovered non-antipodal configurations that defy the usual symmetry assumptions, algebraic links to finite simple groups, and cross-dimensional geometries. In dimension 25, a 496-sphere shell built from 28 8D frames plus a 24D cross hints at optimality. These structures have already inspired human refinements, e.g., better packings in dimension 22.
Over 6,000 structures have been cataloged, many rational (i.e., with rational cosines); these aid coding theory via spherical codes.
Implications for Mathematics and Beyond
While these are constructions rather than proofs, the configurations tighten lower bounds and can guide future proofs. Applications span:
- Coding Theory: Optimal sphere packings yield best error-correcting codes.
- Quantum Computing: High-dim lattices for qubit states.
- Signal Processing: Satellite comms packing signals without interference.
- Data Compression: Minimize bits via geometric efficiency.
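The coding-theory link can be made concrete: the directions to the kissing spheres form a spherical code, and its quality for signaling is governed by the minimum pairwise angle between codewords. A small sketch, using the 2D hexagonal configuration rather than the paper's high-dimensional data:

```python
import numpy as np

# The six direction vectors of the 2D hexagonal kissing configuration,
# treated as a spherical code of six codewords on the unit circle.
angles = np.arange(6) * np.pi / 3
U = np.stack([np.cos(angles), np.sin(angles)], axis=1)

cosines = U @ U.T
np.fill_diagonal(cosines, -1.0)        # ignore self-pairs
max_cos = cosines.max()                # the closest pair has the largest cosine
min_angle = np.degrees(np.arccos(max_cos))
print(round(min_angle, 1))  # 60.0 degrees: exactly the kissing threshold
```

A larger minimum angle means better-separated codewords, which is why improved kissing configurations translate into better spherical codes.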
Spotlight on Chinese Higher Education Leadership
Peking University, home to one of China's consistently top-ranked mathematics programs, leads via its AI Institute. Fudan University's AI Incubation Institute contributes complementary expertise, and SAIS, a national AI hub, bridges theory and application. The collaboration exemplifies China's 'Double First-Class' push, which is investing billions in AI-math fusion. For researchers eyeing such hubs, university jobs and research positions abound at PKU and Fudan.
Expert Perspectives and Global Impact
"AI reshapes mathematical intuitions," notes the team.
China now tops the Leiden Ranking 2025 in research impact, fueling feats like this one.
Future Horizons: AI-Math Synergy
Next: can upper bounds, or even proofs, emerge from the discovered structures? PackingStar scales to higher dimensions, perhaps eventually cracking dimension 32 and beyond. More broadly, reinforcement learning could target E8-style lattices and quantum codes, and Chinese universities are expanding their AI labs in anticipation.
Conclusion: A New Era in Geometric Discovery
PackingStar exemplifies how Peking University and Fudan are propelling the frontiers of global mathematics. Stay tuned for proofs validating these bounds.