# Motivation

The spiral optimization (SPO) algorithm is a simple search concept inspired by spiral phenomena in nature. The focus on spiral phenomena stems from the insight that the dynamics generating logarithmic spirals combine diversification and intensification behavior: diversification works for a global search (exploration), while intensification enables an intensive search around the current best solution (exploitation).

Fig. 1: The blue spiral and red spiral are applicable to exploration (global search) and exploitation (intensive search), respectively.

## Algorithm

The SPO algorithm is a multipoint, gradient-free search algorithm that uses multiple spiral models, each described as a deterministic dynamical system. As the search points follow logarithmic spiral trajectories toward the common center, defined as the current best point, better solutions can be found and the common center is updated.

The general SPO algorithm for a minimization problem under the maximum iteration $k_{\max}$ (termination criterion) is as follows:
----------------------------------------------------------------------------------------------------------------------------------------------------

### [Spiral Optimization Algorithm]

[Step 0] Set the number of search points $m \geq 2$ and the maximum iteration number $k_{\max}$.

[Step 1] Place the initial search points $x_i (0) \in \mathbb{X} \subset \mathbb{R}^n~(i=1, \ldots, m)$ and determine the center $x^{\star}(0)= x_{i_\text{b} }(0)$,  $\displaystyle i_\text{b} =\mathop{\text{argmin}}_{i=1,\ldots,m} \{ f(x_{i}(0)) \}$, and then set $k = 0$.

[Step 2] Decide the step rate $r(k)$ by a rule.

[Step 3] Update the search points:
\begin{align*}
x_i(k+1) = x^{\star}(k) + r(k) R(\theta) (x_i(k) - x^{\star}(k))~~~(i=1, \ldots, m).
\end{align*}

[Step 4] Update the center:
\begin{align*}
x^{\star}(k+1) =
\begin{cases}
x_{i_\text{b}}(k+1)~\big( \text{if} ~f(x_{i_\text{b}}(k+1)) < f(x^{\star}(k)) \big),\\
x^{\star}(k)~~~~~~~~~\big(\text{otherwise} \big), \\
\end{cases}
\end{align*}
where $\displaystyle i_\text{b} = \mathop{\text{argmin}}_{i=1,\ldots,m} \{ f(x_{i}(k+1)) \}$.

[Step 5] Set $k := k+1$. If $k=k_{\max}$ is satisfied then terminate and output $x^{\star}(k)$. Otherwise, return to Step 2.
----------------------------------------------------------------------------------------------------------------------------------------------------
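As a concrete illustration, the steps above can be sketched in Python for the two-dimensional case, where the composite rotation $R(\theta)$ reduces to an ordinary plane rotation. The function name `spo_minimize` and the parameter choices ($\theta = \pi/4$, a constant step rate $r$) are illustrative assumptions, not part of the algorithm specification itself.

```python
import math
import random

def spo_minimize(f, bounds, m=10, k_max=200, theta=math.pi / 4, r=0.95, seed=0):
    """Minimal 2-D sketch of the SPO algorithm (illustrative settings).

    f: objective to minimize; bounds: [(lo, hi), (lo, hi)];
    m: number of search points; theta: rotation angle; r: constant step rate.
    """
    rng = random.Random(seed)
    c, s = math.cos(theta), math.sin(theta)

    # Step 1: place random initial points and take the best as the center.
    pts = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(m)]
    center = min(pts, key=f)[:]

    for _ in range(k_max):
        # Step 3: rotate each point about the center and contract by r.
        for p in pts:
            dx, dy = p[0] - center[0], p[1] - center[1]
            p[0] = center[0] + r * (c * dx - s * dy)
            p[1] = center[1] + r * (s * dx + c * dy)
        # Step 4: update the center only if a strictly better point appeared.
        best = min(pts, key=f)
        if f(best) < f(center):
            center = best[:]
    return center

# Usage: minimize the sphere function; the returned center lies near (0, 0).
sol = spo_minimize(lambda x: x[0] ** 2 + x[1] ** 2, [(-5, 5), (-5, 5)])
```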

## Settings

The search performance depends on the settings of the composite rotation matrix $R(\theta)$, the step rate $r(k)$, and the initial points $x_i(0)~(i=1,\ldots,m)$. Several effective settings with solid theoretical backgrounds have been proposed to date.
----------------------------------------------------------------------------------------------------------------------------------------------------

### [Setting 1 (Spiral Descent Direction Setting) by Ref.8]

This setting, proposed in Ref. 8 in 2016, is effective for high-dimensional problems under a maximum iteration number $k_{\max}$. The conditions on $R(\theta)$ and $x_i(0)~(i=1,\ldots,m)$ together ensure that the spiral models generate descent directions periodically, and the setting of $r$ exploits these periodic descent directions within the iteration budget $k_{\max}$.
•  Set $R(\theta)$ as follows:\begin{align*}R(\theta) = \begin{bmatrix}
0_{n-1}^\top &-1\\
I_{n-1}&0_{n-1}\\
\end{bmatrix}\end{align*} where $I_{n-1}$ is the  $(n-1)\times (n-1)$ identity matrix and $0_{n-1}$ is the $(n-1)\times 1$ zero vector.
• Place the initial points $x_i(0) \in \mathbb{X} \subset \mathbb{R}^n$ $(i = 1,\ldots, m)$ at random to satisfy the following condition:
\begin{align*}
\min_{i=1,\ldots,m} \{ \max_{j=1,\ldots,m} \bigl \{ \text{rank} \bigl[ d_{j,i}(0)~R(\theta)d_{j,i}(0)~~
\cdots~~R(\theta)^{2n-1}d_{j,i}(0) \bigr]\bigr\} \bigr\} = n
\end{align*}
where $d_{j,i}(0) = x_{j}(0) - x_i(0)$. Note that this condition is satisfied almost surely by random placement, so in practice the rank check can be omitted.
• Set $r(k)$ at Step 2 as follows:\begin{align*}r(k) = r =  \sqrt[ k_{\max} ]{\mathstrut \delta }~~~~\text{(constant value)}\end{align*}where $\delta > 0$ is sufficiently small, e.g., $\delta = 1/k_{\max}$ or $\delta = 10^{-3}$.
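For reference, this $R(\theta)$ and step rate can be written down directly. The function names below are illustrative; the final assertion checks that this particular matrix acts as a signed cyclic shift, so that $R(\theta)^{2n} = I$, which follows from its construction.

```python
import numpy as np

def setting1_rotation(n):
    """Composite rotation matrix of Setting 1 (Ref. 8):
    top row [0^T, -1], bottom block [I, 0]."""
    R = np.zeros((n, n))
    R[0, n - 1] = -1.0          # top-right entry is -1
    R[1:, : n - 1] = np.eye(n - 1)  # identity block below
    return R

def setting1_step_rate(k_max, delta=1e-3):
    """Constant step rate r = delta**(1/k_max) of Setting 1."""
    return delta ** (1.0 / k_max)

R = setting1_rotation(4)
# R maps e_1 -> e_2, ..., e_{n-1} -> e_n, e_n -> -e_1, hence R^(2n) = I.
assert np.allclose(np.linalg.matrix_power(R, 8), np.eye(4))
```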
----------------------------------------------------------------------------------------------------------------------------------------------------

### [Setting 2 (Convergence Setting) by Ref.10]

This setting, given in Ref. 10, ensures that the SPO algorithm converges to a stationary point as the maximum iteration number $k_{\max}$ tends to infinity. The settings of $R(\theta)$ and the initial points $x_i(0)~(i=1,\ldots,m)$ are the same as in Setting 1 above. The setting of $r(k)$ is as follows.
• Set $r(k)$ at Step 2 as follows:  \begin{align*}r(k) =
\begin{cases}
1~~(k^\star \le k \le  k^\star + 2n-1), \\
h~~(k \ge k^\star + 2n),
\end{cases}
\end{align*} where $k^\star$ is the iteration at which the center was most recently updated at Step 4 and $h = \sqrt[ 2n ]{\mathstrut \delta }$ with $\delta \in (0,1)$, e.g., $\delta = 0.5$. Accordingly, the following operations concerning $k^\star$ must be added to the algorithm:
• (Step 1) $k^\star = 0$.
• (Step 4) If $x^{\star}(k+1)\neq x^{\star}(k)$ then $k^\star = k+1$.
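This step-rate rule can be sketched as follows (the function name and signature are illustrative): $r$ stays at $1$ for the $2n$ iterations following the most recent center update at $k^\star$, then contracts with $h = \sqrt[2n]{\delta}$.

```python
def setting2_step_rate(k, k_star, n, delta=0.5):
    """Step rate rule of Setting 2 (Ref. 10).

    k: current iteration; k_star: iteration of the most recent
    center update; n: problem dimension; delta: contraction target.
    """
    if k <= k_star + 2 * n - 1:
        return 1.0                    # hold r = 1 for 2n iterations
    return delta ** (1.0 / (2 * n))   # then contract with h = delta**(1/(2n))
```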
----------------------------------------------------------------------------------------------------------------------------------------------------

## Future works

• The algorithms with the above settings are deterministic. Incorporating suitable random operations could therefore make the algorithm more powerful for global optimization.
• Finding an appropriate balance between diversification and intensification spirals, depending on the target problem class (including $k_{\max}$), is important for enhancing performance.

## References

1. K. Tamura and K. Yasuda, "Primary Study of Spiral Dynamics Inspired Optimization," IEEJ Transactions on Electrical and Electronic Engineering, vol. 6, no. S1, pp. 98-100, Jan. 2011.
2. K. Tamura and K. Yasuda, "Spiral Dynamics Inspired Optimization," Journal of Advanced Computational Intelligence and Intelligent Informatics, vol. 132, no. 5, pp. 1116-1122, Jul. 2011.
3. A. N. K. Nasir and M. O. Tokhi, "A novel hybrid spiral-dynamics random-chemotaxis optimization algorithm with application to modelling of a flexible robot manipulator," in Proc. 16th Int. Conf. Climb. Walk. Robots (CLAWAR), Sydney, Australia, 2013, pp. 667-674.
4. A. Ouadi, H. Bentarzi, and A. Recioui, "Optimal multiobjective design of digital filters using spiral optimization technique," SpringerPlus, vol. 2, no. 461, pp. 697-707, Sep. 2013.
5. L. Benasla, A. Belmadani, and M. Rahli, "Spiral optimization algorithm for solving combined economic and emission dispatch," Int. J. Elect. Power & Energy Syst., vol. 62, pp. 163-174, Nov. 2014.
6. K. Tamura and K. Yasuda, "A Parameter Setting Method for Spiral Optimization from Stability Analysis of Dynamic Equilibrium Point," SICE Journal of Control, Measurement, and System Integration, vol. 7, no. 3, pp. 173-182, May 2014.
7. K. A. Sidarto and A. Kania, "Finding all solutions of systems of nonlinear equations using spiral dynamics inspired optimization with clustering," JACIII, vol. 19, no. 5, pp. 697-707, Sep. 2015.
8. K. Tamura and K. Yasuda, "Spiral Optimization Algorithm Using Periodic Descent Directions," SICE Journal of Control, Measurement, and System Integration, vol. 9, no. 3, pp. 133-143, May 2016.
9. A. N. K. Nasir, R. M. T. R. Ismail, and M. O. Tokhi, "Adaptive spiral dynamics metaheuristic algorithm for global optimisation with application to modelling of a flexible system," Appl. Math. Modell., vol. 40, no. 9-10, pp. 5442-5461, May 2016.
10. K. Tamura and K. Yasuda, "The Spiral Optimization Algorithm: Convergence Conditions and Settings," IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. PP, no. 99, 16 pages, May 2017. (Early Access Version)