ArXiv TLDR

S-LCG: Structured Linear Congruential Generator-Based Deterministic Algorithm for Search and Optimization

arXiv: 2605.05198

Ahmed Qasim Mohammed, Haider Banka, Anamika Singh

math.OC · cs.NE

TLDR

S-LCG is a deterministic optimization algorithm built on a structured Linear Congruential Generator; it comes within 1% of the global optimum in 83.3% of benchmark cases, outperforming eight state-of-the-art algorithms.

Key contributions

  • Introduces S-LCG, a deterministic optimization algorithm based on a structured Linear Congruential Generator.
  • Employs a two-level architecture with adaptive exploration-exploitation and bit splitting for multi-dimensional search.
  • Comes within 1% of the global optimum in 83.3% of 138 benchmark cases, outperforming eight state-of-the-art algorithms.
  • Offers a strictly reproducible optimization framework with only one sensitive parameter to tune.
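The bit-splitting idea in the contributions above can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the LCG constants are the classic Numerical Recipes values, and the 8-bits-per-dimension slicing is an assumed choice (S-LCG's actual generator and splitting scheme are not specified here).

```python
# Hypothetical sketch of bit splitting: one LCG state -> one d-dimensional
# point. Constants are the classic Numerical Recipes LCG, NOT the paper's.

M = 2**32          # modulus
A = 1664525        # multiplier
C = 1013904223     # increment

def lcg_next(state):
    """Advance the LCG by one step."""
    return (A * state + C) % M

def bit_split(state, d, bits_per_dim=8):
    """Slice the 32-bit state into d coordinates in [0, 1).

    Each coordinate takes `bits_per_dim` consecutive bits of a single
    state, rather than using d consecutive LCG outputs as coordinates,
    which is the construction that suffers from the Marsaglia lattice
    effect.
    """
    coords = []
    for i in range(d):
        chunk = (state >> (i * bits_per_dim)) & ((1 << bits_per_dim) - 1)
        coords.append(chunk / (1 << bits_per_dim))
    return coords

state = lcg_next(12345)
print(bit_split(state, d=4))
```

Because the generator is deterministic, the same seed always yields the same point, which is what makes runs strictly reproducible.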

Why it matters

S-LCG offers strict reproducibility, minimal tuning (a single sensitive parameter), and strong performance on complex benchmarks and constrained engineering design problems, making it a practical tool for real-world optimization tasks.

Original Abstract

This study presents a novel deterministic optimization algorithm based on a special variant of the Linear Congruential Generator (LCG). While conventional algorithms generally operate within the search space, the introduced technique follows a two-level architecture: an external loop adaptively balances exploration and exploitation, while an internal loop evaluates solutions. The name Structured Linear Congruential Generator (S-LCG) is motivated by the intrinsic structure of the generator, which enjoys a number of unique characteristics: 1) a memoryless scheme, which ensures non-overlapping sequences based on distinct seeds, thus avoiding evaluation redundancy; 2) a bit-splitting representation, which converts LCG states into multi-dimensional points to overcome the Marsaglia lattice effect; 3) adaptive exploration-exploitation of the generator space, which leads to implicit optimization of a smooth surrogate objective function; and 4) a constant information-gathering speed, which avoids premature convergence. Extensive testing on 26 benchmark functions across dimensions d = 2 to 30 demonstrates that S-LCG comes within 1% of the global optimum in 83.3% of 138 cases (100% at d = 2, 81.2% at d = 30), while the nearest competitor, GA, achieved 75.4%. Statistical validation shows that S-LCG outperforms eight cutting-edge binary algorithms. Furthermore, its practical value is confirmed by validation on three constrained engineering design problems. In the end, S-LCG offers an optimization framework that is strictly reproducible and requires only one sensitive parameter to be tuned.
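The two-level architecture described in the abstract can be sketched as a minimal, self-contained search loop. Everything below is an illustrative assumption, not the paper's method: the LCG constants, the seed schedule, the bit-splitting width, and the sphere benchmark are all stand-ins.

```python
# Minimal sketch of a two-level deterministic search in the spirit of
# S-LCG. All constants and the seed schedule are illustrative choices.

M, A, C = 2**32, 1664525, 1013904223   # classic LCG constants (assumption)

def point(state, d, bits=8):
    """Bit-split one LCG state into d coordinates in [0, 1)."""
    return [((state >> (i * bits)) & 0xFF) / 256 for i in range(d)]

def sphere(x):
    """Benchmark objective: minimum 0 at the center of [-5, 5]^d."""
    return sum((10 * xi - 5) ** 2 for xi in x)

def s_lcg_search(f, d, outer=50, inner=100):
    """Outer loop walks over distinct seeds; inner loop evaluates points."""
    best_x, best_f = None, float("inf")
    for seed in range(outer):            # outer loop: each seed starts the
        state = (2 * seed + 1) % M       # generator at a different phase
        for _ in range(inner):           # inner loop: candidate evaluation
            state = (A * state + C) % M
            x = point(state, d)
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

x, fx = s_lcg_search(sphere, d=3)
print(fx)  # deterministic: rerunning always prints the identical value
```

Note that this sketch omits the adaptive exploration-exploitation schedule the paper describes; it only shows the two-loop skeleton and why determinism makes every run exactly reproducible.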
