Abstract
We present a new method for constructing Hadamard matrices that combines transformer neural networks with local search in the PatternBoost framework. Our approach is designed for extremely sparse combinatorial search problems and is particularly effective for Hadamard matrices of Goethals–Seidel type, where Fourier methods permit fast scoring and optimisation. For orders between 100 and 200, it produces large numbers of inequivalent Hadamard matrices, and for larger orders, it succeeds where local search from random initialisation fails. The largest example found by our method has order 252. In addition to these new constructions, our experiments reveal that the transformer can discover and exploit useful hidden symmetry in the search space.
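The fast Fourier scoring mentioned for Goethals–Seidel constructions can be sketched as follows. One searches for four ±1 sequences a, b, c, d of length m whose circulant matrices satisfy AAᵀ + BBᵀ + CCᵀ + DDᵀ = 4mI; since circulants are diagonalised by the discrete Fourier transform, this condition is equivalent to the four power spectra summing to 4m at every frequency, which can be checked with a handful of FFTs instead of matrix products. A minimal sketch under that standard equivalence (the function name `gs_score` and the max-deviation scoring rule are our own illustrative choices, not the paper's implementation):

```python
import numpy as np

def gs_score(a, b, c, d):
    """Score a candidate Goethals-Seidel quadruple of +-1 sequences.

    Circulant matrices are diagonalised by the DFT, so
    A A^T + B B^T + C C^T + D D^T = 4m I  is equivalent to
    |FFT(a)_k|^2 + |FFT(b)_k|^2 + |FFT(c)_k|^2 + |FFT(d)_k|^2 = 4m
    at every frequency k.  Returns the largest deviation from 4m;
    a score of 0 means the quadruple yields a Hadamard matrix of
    order 4m via the Goethals-Seidel array.
    """
    m = len(a)
    psd = sum(np.abs(np.fft.fft(np.asarray(x, dtype=float))) ** 2
              for x in (a, b, c, d))
    return float(np.max(np.abs(psd - 4 * m)))

# A known valid quadruple for m = 3 (gives a Hadamard matrix of order 12):
a = b = c = [1, -1, -1]
d = [-1, -1, -1]
score = gs_score(a, b, c, d)  # essentially 0, up to floating-point error
```

Because each candidate is scored in O(m log m) time, this spectral test is what makes large-scale local search over Goethals–Seidel quadruples feasible.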