@@ -19,9 +19,10 @@ optimizing machine learning algorithms. It works with
different types of crossover, mutation, and parent selection operators.
`PyGAD <https://github.com/ahmedfgad/GeneticAlgorithmPython>`__ allows
different types of problems to be optimized using the genetic algorithm
- by customizing the fitness function.
+ by customizing the fitness function. It works with both single-objective
+ and multi-objective optimization problems.
- .. figure:: https://user-images.githubusercontent.com/16560492/101267295-c74c0180-375f-11eb-9ad0-f8e37bd796ce.png
+ .. image:: https://user-images.githubusercontent.com/16560492/101267295-c74c0180-375f-11eb-9ad0-f8e37bd796ce.png
   :alt:
*Logo designed by* `Asmaa
@@ -108,6 +109,11 @@ equation.
A very important step is to implement the fitness function that will be
used for calculating the fitness value for each solution. Here is one.
+ If the fitness function returns a number, then the problem is
+ single-objective. If a ``list``, ``tuple``, or ``numpy.ndarray`` is
+ returned, then it is a multi-objective problem (even if it holds only a
+ single element).
+
.. code:: python

   def fitness_func(ga_instance, solution, solution_idx):
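To make the note about return types concrete, here is a minimal sketch of a multi-objective fitness function using the same ``(ga_instance, solution, solution_idx)`` signature shown above. The two linear objectives, their coefficients, and the target values are invented for this illustration and are not taken from the diffed documentation.

.. code:: python

   import numpy

   # Illustrative objectives only: match two different linear equations.
   # Coefficients and targets are invented for this sketch.
   function_inputs1 = [4, -2, 3.5]
   function_inputs2 = [-2, 0.7, -9]
   desired_output1 = 50
   desired_output2 = 30

   def fitness_func(ga_instance, solution, solution_idx):
       solution = numpy.array(solution)
       output1 = numpy.sum(solution * function_inputs1)
       output2 = numpy.sum(solution * function_inputs2)
       # Higher fitness the closer each output is to its target.
       fitness1 = 1.0 / (numpy.abs(output1 - desired_output1) + 1e-6)
       fitness2 = 1.0 / (numpy.abs(output2 - desired_output2) + 1e-6)
       # Returning a list rather than a plain number is what marks the
       # problem as multi-objective.
       return [fitness1, fitness2]

Returning only ``fitness1`` (a plain number) would make the same function single-objective.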
@@ -213,7 +219,7 @@ PyGAD's Modules
8. The ``visualize`` module to visualize the results.
9. The ``utils`` module contains the operators (crossover, mutation,
- and parent selection).
+ and parent selection) and the NSGA-II code.
10. The ``helper`` module has some helper functions.
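Item 9 above notes that the NSGA-II code lives in ``utils``. The sketch below shows one way the pieces could fit together for a multi-objective run; it assumes NSGA-II parent selection is requested through the ``parent_selection_type`` argument with the value ``"nsga2"``, which is an assumption about the targeted release rather than something stated in this diff, and the toy objectives are invented.

.. code:: python

   import numpy
   import pygad

   def fitness_func(ga_instance, solution, solution_idx):
       # Two invented objectives: drive the gene sum toward 10 while
       # keeping the absolute gene values small.
       solution = numpy.array(solution)
       objective1 = 1.0 / (numpy.abs(numpy.sum(solution) - 10) + 1e-6)
       objective2 = 1.0 / (numpy.sum(numpy.abs(solution)) + 1e-6)
       return [objective1, objective2]

   ga_instance = pygad.GA(num_generations=100,
                          num_parents_mating=10,
                          sol_per_pop=20,
                          num_genes=3,
                          fitness_func=fitness_func,
                          # Assumed hook into the NSGA-II code in utils.
                          parent_selection_type="nsga2")
   ga_instance.run()
   print(ga_instance.best_solution())

Apart from the ``parent_selection_type`` value and the list-returning fitness function, this is the same setup used for single-objective runs.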