Research article

An efficient binary Gradient-based optimizer for feature selection


  • Feature selection (FS) is a classic and challenging optimization task in machine learning and data mining. The gradient-based optimizer (GBO) is a recently developed population-based metaheuristic inspired by gradient-based Newton's method; it uses two main operators, the gradient search rule (GSR) and the local escape operator (LEO), together with a set of vectors to explore the search space of continuous problems. This article presents a binary GBO (BGBO) algorithm for feature selection problems. Eight independent GBO variants are proposed, and eight transfer functions, divided into the two families of S-shaped and V-shaped, are evaluated to map the continuous search space to a discrete one. To verify the performance of the proposed binary GBO algorithm, 18 well-known UCI datasets and 10 high-dimensional datasets are tested and compared with other advanced FS methods. The experimental results show that the best of the proposed binary GBO algorithms has the strongest overall performance and outperforms other well-known metaheuristic algorithms on the performance measures considered.

    Citation: Yugui Jiang, Qifang Luo, Yuanfei Wei, Laith Abualigah, Yongquan Zhou. An efficient binary Gradient-based optimizer for feature selection[J]. Mathematical Biosciences and Engineering, 2021, 18(4): 3813-3854. doi: 10.3934/mbe.2021192




    To describe the phenomena of chemical compounds scientifically, researchers utilize the apparatus of graph theory, a well-known branch of discrete mathematics. This division of mathematical science provides its services across different fields of science: for a particular example in networking see [1], for electronics [2], and for the polymer industry we refer to [3]. In chemical graph theory in particular, this division offers extraordinary assistance in studying giant and microscopic chemical compounds. For such a study, researchers devised transformation rules that turn a chemical compound into a discrete pattern of shapes (a graph): an atom is represented as a vertex, and a covalent bond between atoms is symbolized as an edge. Such a transformation is known as molecular graph theory. A major convenience of this alteration is that hydrogen atoms are omitted. Conversions of some chemical structures and compounds are presented in [4,5,6].

    In cheminformatics, topological indices gain attention due to their implementations. Various topological indices help to estimate the bio-activity and physicochemical characteristics of a chemical compound. Some interesting and useful topological indices for various chemical compounds are studied in [3,7,8]. A topological index models a molecular graph or a chemical compound as a numerical value. Since 1947, topological indices have been implemented in chemistry [9], biology [10], and information science [11,12]. Related studies include the Sombor index and degree-related properties of simplicial networks [13]; Nordhaus–Gaddum-type results for the Steiner Gutman index of graphs [14]; lower bounds for the Gaussian Estrada index of graphs [15]; the sum and spread of reciprocal distance Laplacian eigenvalues of graphs in terms of the Harary index [16]; the expected values of the Gutman index, Schultz index, and some Sombor indices of a random cyclooctane chain [17,18,19]; bounds on the partition dimension of convex polytopes [20,21]; the normalized Laplacian spectrum and spanning trees of the strong prism of the dicyclobutadieno derivative of linear phenylenes [22]; the generalized adjacency, Laplacian, and signless Laplacian spectra of weighted edge corona networks [23,24]; Zagreb indices and multiplicative Zagreb indices of Eulerian graphs [25]; minimizing the Kirchhoff index among graphs with a given vertex bipartiteness [26]; and the asymptotic Laplacian-energy-like invariant of lattices [27]. A few further interesting studies in chemical graph theory can be found in [28,29,30,31,32].

    Recently, the researchers of [33] introduced a topological descriptor called the face index. The idea of relating structure to boiling point and energy motivated them to introduce this parameter without heavy computation. They computed the parameter for different models, compared the results with the previous literature, and obtained good approximations with comparatively less computation. These are the benefits of the face index of a graph. The major concepts of this research work are elaborated in the definitions given below.

    Definition 1.1. [33] Let $G=(V(G),E(G),F(G))$ be a graph with vertex, edge and face sets denoted by $V(G)$, $E(G)$, $F(G)$, respectively. It is mandatory that the graph is connected, simple and planar. If an edge $e$ from the edge set $E(G)$ is one of the edges surrounding a face, then the face $f$ from the face set $F(G)$ is incident to the edge $e$. Likewise, if a vertex $\alpha$ from the vertex set $V(G)$ is an endpoint of such an incident edge, then the face $f$ is incident to that vertex. This face–vertex incidence relation is symbolized here by the notation $\alpha\sim f$. The face degree of $f$ in $G$ is defined as $d(f)=\sum_{\alpha\sim f}d(\alpha)$, as illustrated in Figure 1.

    Figure 1.  An example of face degree.

    Definition 1.2. [33] The face index FI(G), for a graph G, is formulated as

    $$FI(G)=\sum_{f\in F(G)}d(f)=\sum_{\alpha\sim f,\ f\in F(G)}d(\alpha).$$

    In Figure 1, we can see that there are two faces of degree 4, exactly two of degree 5, and four of degree 6. Moreover, there is an external face with face degree 28, which equals the number of vertices of the graph.
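As a quick numerical illustration of Definition 1.2 (a sketch, not from the paper; the face-degree multiset below is read off the description of Figure 1 above):

```python
def face_index(face_degrees):
    """Face index FI(G): the sum of d(f) over all faces of a planar graph,
    internal faces and the external face alike (Definition 1.2)."""
    return sum(face_degrees)

# Figure 1: two faces of degree 4, two of degree 5, four of degree 6,
# plus the external face of degree 28.
figure1_faces = [4, 4, 5, 5, 6, 6, 6, 6, 28]
print(face_index(figure1_faces))  # prints 70
```

The helper name `face_index` is ours; the computation is just the defining sum.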

    As noted above, the face index is quite new, introduced in 2020, so not much literature is available. A few recent studies on this topic are summarized here. A chemical compound of silicon carbide is elaborated with this novel definition in [34]. Some carbon nanotubes are discussed in [35]. Apart from the face index, distance- and degree-based graphical descriptors are available in the literature. For example, distance-based descriptors of the phenylene nanotube are studied in [36], and in [37] titania nanotubes are discussed with the same concept. Star networks are studied in [38] with the concept of degree-based descriptors. Bounds on the descriptors of some generalized graphs are discussed in [39]. The general Sierpinski graph is discussed in [40] in terms of different topological-descriptor aspects. A study of hyaluronic acid–doxorubicin is found in [41], with the same concept of the index. A curvilinear regression model of a topological index for the COVID-19 treatment is discussed in [42]. For further reading on interesting advancements of topological indices: polynomials of zero-divisor structures are found in [43], the zero-divisor graph of commutative rings in [44], swapped networks modeled by an optical transpose interconnection system in [45], the metal trihalide network in [46], some novel drugs used in cancer treatment in [47], the para-line graph of Remdesivir used in the prevention of coronavirus in [48], and the tightest nonadjacently configured stable pentagonal structure of carbon nanocones in [49]. To address a novel preventive category (P) in the HIV system, known as the HIPV mathematical model, [50] offers a design of a Morlet wavelet neural network (MWNN).

    In the next section, we discuss the newly developed face index, or face-based index, for different chemical compounds. The silicate network, triangular honeycomb network, carbon sheet, polyhedron generalized sheet, and generalized chain of silicate networks are studied with the concept of the face-based index. Given that the face index is more versatile than vertex-degree-based topological descriptors, this study will aid in understanding the structural characteristics of chemical networks. The only difficulty authors will face is computing the face degree of a generalized network or structure, because the face index is a more general notion that brings degree-based edge partitions under one umbrella.

    Silicates are formed when metal carbonates or metal oxides react with sand. The fundamental chemical unit of silicates is SiO4, which has a tetrahedral structure. The central vertex of the SiO4 tetrahedron is occupied by a silicon ion, while the end vertices are occupied by oxygen ions [51,52,53]. A silicate sheet is made up of rings of tetrahedra joined together in a two-dimensional plane by oxygen ions shared from one ring to the other, forming a sheet-like structure. The silicate network is denoted by $SL_n$, where $n$ represents the total number of hexagons lying between the border and the center of the network. The silicate network of dimension one is depicted in Figure 2. It contains $3n(5n+1)$ vertices and $36n^2$ edges. The required counts are detailed in Table 1.

    Figure 2.  A silicate network SL1.
    Table 1.  The number of f12, f15 and f36 in each dimension.
    Dimension |f12| |f15| |f36|
    1 24 48 7
    2 32 94 14
    3 40 152 23
    4 48 222 34
    5 56 304 47
    6 64 398 62
    7 72 504 79
    8 80 622 98
    . . . .
    . . . .
    . . . .
    n 8n+16 6n^2+28n+14 n^2+4n+2


    Theorem 2.1. Let $SL_n$ be the silicate network of dimension $n\ge 1$. Then the face index of $SL_n$ is

    $$FI(SL_n)=126n^2+720n+558.$$

    Proof. Consider the graph $SL_n$ of the silicate network with dimension $n$. Suppose $f_i$ denotes a face of the graph $SL_n$ having degree $i$, that is, $d(f_i)=\sum_{\alpha\sim f_i}d(\alpha)=i$, and $|f_i|$ denotes the number of faces with degree $i$. The graph $SL_n$ contains three types of internal faces $f_{12}$, $f_{15}$, $f_{36}$, and a single external face, denoted by $f_\infty$.

    If $SL_n$ has dimension one, the sum of the degrees of the vertices incident to the external face is 144; when $SL_n$ has dimension two it is 204; and when $SL_n$ has dimension three it is 264. In general, for $SL_n$ of dimension $n$, the sum of the degrees of the vertices incident to the external face is $60n+84$.

    The number of internal faces of each degree in each dimension is given in Table 1.

    By using the definition of face index FI we have

    $$\begin{aligned}FI(SL_n)&=\sum_{\alpha\sim f\in F(SL_n)}d(\alpha)\\&=\sum_{\alpha\sim f_{12}}d(\alpha)+\sum_{\alpha\sim f_{15}}d(\alpha)+\sum_{\alpha\sim f_{36}}d(\alpha)+\sum_{\alpha\sim f_\infty}d(\alpha)\\&=|f_{12}|(12)+|f_{15}|(15)+|f_{36}|(36)+(60n+84)\\&=(8n+16)(12)+(6n^2+28n+14)(15)+(n^2+4n+2)(36)+60n+84\\&=126n^2+720n+558.\end{aligned}$$

    Hence, this is our required result.
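The closed form of Theorem 2.1 can be checked numerically against the Table 1 face counts and the external-face degree sum $60n+84$ derived in the proof (a verification sketch; the function name is ours):

```python
def fi_silicate(n):
    """Face index of SL_n assembled from the Table 1 face counts."""
    f12 = 8 * n + 16                # faces of degree 12
    f15 = 6 * n**2 + 28 * n + 14    # faces of degree 15
    f36 = n**2 + 4 * n + 2          # faces of degree 36
    external = 60 * n + 84          # degree of the external face
    return 12 * f12 + 15 * f15 + 36 * f36 + external

# Agrees with the closed form 126n^2 + 720n + 558 for every tested n.
assert all(fi_silicate(n) == 126 * n**2 + 720 * n + 558 for n in range(1, 50))
```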

    A chain silicate network of dimension $(m,n)$, symbolized $CSL_{(m,n)}$, is made by arranging tetrahedron molecules linearly in $m$ rows with $n$ tetrahedra in each row, where $m,n\ge 1$. The following theorem formulates the face index $FI$ for the chain silicate network.

    Theorem 2.2. Let $CSL_{(m,n)}$ be the chain silicate network of dimension $m,n\ge 1$. Then the face index $FI$ of the graph $CSL_{(m,n)}$ is

    $$FI(CSL_{(m,n)})=\begin{cases}48n-12 & \text{if } m=1,\ n\ge 1;\\ 96m-12 & \text{if } n=1,\ m\ge 2;\\ 168m-60 & \text{if } n=2,\ m\ge 2;\\ 45m-9n+36mn-42 & \text{if both } m,n \text{ are even};\\ 45m-9n+36mn-21 & \text{otherwise}.\end{cases}$$

    Proof. Let $CSL_{(m,n)}$ be the graph of the chain silicate network of dimension $(m,n)$ with $m,n\ge 1$, where $m$ represents the number of rows and $n$ the number of tetrahedra in each row. For $m=1$ the graph contains three types of internal faces, $f_9$, $f_{12}$ and $f_{15}$, with one external face $f_\infty$; while for $m\ge 2$ it has four types of internal faces $f_9$, $f_{12}$, $f_{15}$ and $f_{36}$, with one external face $f_\infty$. We evaluate the face index $FI$ for the chain silicate network in two cases.

    Case 1: When $CSL_{(m,n)}$ has one row ($m=1$) with $n$ tetrahedra, as shown in Figure 3.

    Figure 3.  Chain silicate network CSL(m,n) with particular value of m=1.

    The graph has three types of internal faces $f_9$, $f_{12}$ and $f_{15}$, with one external face $f_\infty$. The sum of the degrees of the vertices incident to the external face is $9n$, and the numbers of faces are $|f_9|=2$, $|f_{12}|=2n$ and $|f_{15}|=n-2$. Now the face index $FI$ of the graph $CSL_{(m,n)}$ is given by

    $$\begin{aligned}FI(CSL_{(m,n)})&=\sum_{\alpha\sim f\in F(CSL_{(m,n)})}d(\alpha)\\&=\sum_{\alpha\sim f_9}d(\alpha)+\sum_{\alpha\sim f_{12}}d(\alpha)+\sum_{\alpha\sim f_{15}}d(\alpha)+\sum_{\alpha\sim f_\infty}d(\alpha)\\&=|f_9|(9)+|f_{12}|(12)+|f_{15}|(15)+9n\\&=(2)(9)+(2n)(12)+(n-2)(15)+9n\\&=48n-12.\end{aligned}$$

    Case 2: When $CSL_{(m,n)}$ has more than one row ($m\ge 2$), with $n$ tetrahedra in each row, as shown in Figure 4.

    Figure 4.  Chain silicate network CSL(m,n).

    The graph has four types of internal faces $f_9$, $f_{12}$, $f_{15}$ and $f_{36}$, with one external face $f_\infty$. The sum of the degrees of the vertices incident to the external face is

    $$\sum_{\alpha\sim f_\infty}d(\alpha)=\begin{cases}18m & \text{if } n=1,\ m\ge 1;\\ 27m & \text{if } n=2,\ m\ge 1;\\ 30m+15n-30 & \text{if both } m,n \text{ are even};\\ 30m+15n-33 & \text{otherwise}.\end{cases}$$

    The numbers of faces $|f_9|$, $|f_{12}|$, $|f_{15}|$ and $|f_{36}|$ are given by

    $$|f_9|=\begin{cases}2 & \text{if } m \text{ is odd};\\ 3+(-1)^n & \text{if } m \text{ is even};\end{cases}\qquad |f_{12}|=\begin{cases}2(2m+n-1) & \text{if } m \text{ is odd};\\ 4\left(\frac{n+1}{2}+2m-1\right) & \text{if } m \text{ is even};\end{cases}$$
    $$|f_{15}|=(3m-2)n-m;\qquad |f_{36}|=\begin{cases}\left(\frac{m-1}{2}\right)(n-1) & \text{if } m \text{ is odd};\\ \left(2n+\frac{(-1)^n-1}{4}\right)\left(\frac{m-2}{2}\right)n & \text{if } m \text{ is even}.\end{cases}$$

    Now the face index FI of the graph CSL(m,n) is given by

    $$\begin{aligned}FI(CSL_{(m,n)})&=\sum_{\alpha\sim f\in F(CSL_{(m,n)})}d(\alpha)\\&=\sum_{\alpha\sim f_9}d(\alpha)+\sum_{\alpha\sim f_{12}}d(\alpha)+\sum_{\alpha\sim f_{15}}d(\alpha)+\sum_{\alpha\sim f_{36}}d(\alpha)+\sum_{\alpha\sim f_\infty}d(\alpha)\\&=|f_9|(9)+|f_{12}|(12)+|f_{15}|(15)+|f_{36}|(36)+\sum_{\alpha\sim f_\infty}d(\alpha).\end{aligned}$$

    After some mathematical simplifications, we can get

    $$FI(CSL_{(m,n)})=\begin{cases}48n-12 & \text{if } m=1;\\ 96m-12 & \text{if } n=1,\ m\ge 2;\\ 168m-60 & \text{if } n=2,\ m\ge 2;\\ 45m-9n+36mn-42 & \text{if both } m,n \text{ are even};\\ 45m-9n+36mn-21 & \text{otherwise}.\end{cases}$$
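The single-row branch of Theorem 2.2 can be checked from the Case 1 face counts (a verification sketch with our own function name; $|f_{15}|$ is read as $n-2$, consistent with the $48n-12$ total):

```python
def fi_chain_row(n):
    """Face index of the one-row chain silicate CSL(1,n), from Case 1:
    |f9| = 2, |f12| = 2n, |f15| = n - 2, external degree sum 9n."""
    return 9 * 2 + 12 * (2 * n) + 15 * (n - 2) + 9 * n

# Matches the m = 1 branch of the theorem, 48n - 12, for n >= 2.
assert all(fi_chain_row(n) == 48 * n - 12 for n in range(2, 50))
```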

    Three regular plane tessellations are known to exist, each constituted from a single type of regular polygon: triangular, square, and hexagonal. The triangular tessellation is used to define the hexagonal network, which is extensively studied in [54]. The hexagonal network $TH_k$ of dimension $k$ has $3k^2-3k+1$ vertices and $9k^2-15k+6$ edges, where $k$ is the number of vertices on one side of the hexagon. Its diameter is $2k-2$. There are six vertices of degree three, referred to as corner vertices. The required counts are detailed in Table 2.

    Figure 5.  Triangular honeycomb network with dimension k=3.
    Table 2.  The number of f12, f14, f17 and f18 in each dimension.
    Dimension |f12| |f14| |f17| |f18|
    1 6 0 0 0
    2 6 12 12 12
    3 6 24 24 60
    4 6 36 36 144
    5 6 48 48 264
    6 6 60 60 420
    7 6 72 72 612
    8 6 84 84 840
    . . . . .
    . . . . .
    . . . . .
    k 6 12(k-1) 12(k-1) 18k^2-42k+24


    Theorem 2.3. Let $TH_k$ be the triangular honeycomb network of dimension $k\ge 1$. Then the face index of the graph $TH_k$ is

    $$FI(TH_k)=324k^2-336k+102.$$

    Proof. Consider $TH_k$, the graph of the triangular honeycomb network. The graph $TH_1$ contains only internal faces of type $f_{12}$ together with the external face, while the graph $TH_k$ with $k\ge 2$ contains four types of internal faces $f_{12}$, $f_{14}$, $f_{17}$, and $f_{18}$ with one external face $f_\infty$.

    For $TH_1$ the sum of the degrees of the vertices incident to the external face is 18, in $TH_2$ it is 66, and in $TH_3$ it is 114. In general, for $TH_k$ of dimension $k$, the sum of the degrees of the vertices incident to the external face is $48k-30$.

    The number of internal faces of each degree in each dimension is given in Table 2.

    By using the definition of face index FI we have

    $$\begin{aligned}FI(TH_k)&=\sum_{\alpha\sim f\in F(TH_k)}d(\alpha)\\&=\sum_{\alpha\sim f_{12}}d(\alpha)+\sum_{\alpha\sim f_{14}}d(\alpha)+\sum_{\alpha\sim f_{17}}d(\alpha)+\sum_{\alpha\sim f_{18}}d(\alpha)+\sum_{\alpha\sim f_\infty}d(\alpha)\\&=|f_{12}|(12)+|f_{14}|(14)+|f_{17}|(17)+|f_{18}|(18)+(48k-30)\\&=(6)(12)+(12(k-1))(14)+(12(k-1))(17)+(18k^2-42k+24)(18)+48k-30\\&=324k^2-336k+102.\end{aligned}$$

    Hence, this is our required result.
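Theorem 2.3 can likewise be verified from the Table 2 counts and the external degree sum $48k-30$ (a verification sketch; the function name is ours):

```python
def fi_honeycomb(k):
    """Face index of TH_k assembled from the Table 2 face counts (k >= 2)."""
    f12, f14, f17 = 6, 12 * (k - 1), 12 * (k - 1)
    f18 = 18 * k**2 - 42 * k + 24
    external = 48 * k - 30
    return 12 * f12 + 14 * f14 + 17 * f17 + 18 * f18 + external

# Agrees with the closed form 324k^2 - 336k + 102.
assert all(fi_honeycomb(k) == 324 * k**2 - 336 * k + 102 for k in range(2, 50))
```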

    The carbon sheet shown in Figure 6 is made from a grid of hexagons; a few types of carbon sheets are given in [55,56]. The carbon sheet is symbolized $HCS_{m,n}$, where $n$ represents the total number of vertical hexagons and $m$ the number of horizontal hexagons. It contains $4mn+2(n+m)-1$ vertices and $6mn+2m+n-2$ edges. The required counts are detailed in Tables 3 and 4.

    Figure 6.  Carbon Sheet HCSm,n.
    Table 3.  The number of f15, f16, f18, and the external face degree in each dimension.
    Dimension m |f15| |f16| |f18| d(f∞)
    2 3 2(n-1) n-1 20n+7

    Table 4.  The number of f15, f16, f17, f18, and the external face degree in each dimension.
    Dimension m |f15| |f16| |f17| |f18| d(f∞)
    2 3 2(n-1) 0 n-1 20n+7
    3 2 2n 1 3(n-1) 20n+17
    4 2 2n 3 5(n-1) 20n+27
    5 2 2n 5 7(n-1) 20n+37
    6 2 2n 7 9(n-1) 20n+47
    . . . . . .
    . . . . . .
    . . . . . .
    m 2 2n 2m-5 2mn-2m-3n+3 20n+10m-13


    Theorem 2.4. Let $HCS_{m,n}$ be the carbon sheet of dimension $(m,n)$ with $m,n\ge 2$. Then the face index of $HCS_{m,n}$ is

    $$FI(HCS_{m,n})=\begin{cases}70n+2 & \text{if } m=2;\\ 36mn-2(n-4m)-14 & \text{if } m\ge 3.\end{cases}$$

    Proof. Consider $HCS_{m,n}$, the carbon sheet of dimension $(m,n)$ with $m,n\ge 2$. Let $f_i$ denote a face of the graph $HCS_{m,n}$ having degree $i$, that is, $d(f_i)=\sum_{\alpha\sim f_i}d(\alpha)=i$, and let $|f_i|$ denote the number of faces with degree $i$. For the particular value $m=2$, the graph contains three types of internal faces $f_{15}$, $f_{16}$ and $f_{18}$ with one external face $f_\infty$; for the general values $m\ge 3$, it contains four types of internal faces $f_{15}$, $f_{16}$, $f_{17}$ and $f_{18}$ with one external face $f_\infty$ in the usual manner. For the face index of the carbon sheet, we split into two cases on the values of $m$.

    Case 1: When $m=2$, i.e., the graph $HCS_{2,n}$.

    For this particular value $m=2$, the graph contains three types of internal faces, with $|f_{15}|=3$, $|f_{16}|=2(n-1)$ and $|f_{18}|=n-1$, and one external face $f_\infty$. Details are given in Table 3. Now the face index $FI$ of the graph $HCS_{2,n}$ is given by

    $$\begin{aligned}FI(HCS_{2,n})&=\sum_{\alpha\sim f\in F(HCS_{2,n})}d(\alpha)\\&=\sum_{\alpha\sim f_{15}}d(\alpha)+\sum_{\alpha\sim f_{16}}d(\alpha)+\sum_{\alpha\sim f_{18}}d(\alpha)+\sum_{\alpha\sim f_\infty}d(\alpha)\\&=|f_{15}|(15)+|f_{16}|(16)+|f_{18}|(18)+20n+7\\&=3(15)+2(n-1)(16)+(n-1)(18)+20n+7\\&=70n+2.\end{aligned}$$

    Case 2: When $HCS_{m,n}$ has $m\ge 3$ rows.

    For the general values $m\ge 3$, the graph contains four types of internal faces, with $|f_{15}|=2$, $|f_{16}|=2n$, $|f_{17}|=2m-5$ and $|f_{18}|=2mn-2m-3n+3$, and one external face $f_\infty$. Details are given in Table 4. Now the face index $FI$ of the graph $HCS_{m,n}$ is given by

    $$\begin{aligned}FI(HCS_{m,n})&=\sum_{\alpha\sim f\in F(HCS_{m,n})}d(\alpha)\\&=\sum_{\alpha\sim f_{15}}d(\alpha)+\sum_{\alpha\sim f_{16}}d(\alpha)+\sum_{\alpha\sim f_{17}}d(\alpha)+\sum_{\alpha\sim f_{18}}d(\alpha)+\sum_{\alpha\sim f_\infty}d(\alpha)\\&=|f_{15}|(15)+|f_{16}|(16)+|f_{17}|(17)+|f_{18}|(18)+20n+10m-13\\&=36mn-2n+8m-14.\end{aligned}$$
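Both branches of Theorem 2.4 can be checked from the Table 3 and Table 4 counts (a verification sketch; the function name is ours):

```python
def fi_carbon_sheet(m, n):
    """Face index of HCS_{m,n} assembled from Tables 3 (m = 2) and 4 (m >= 3)."""
    if m == 2:  # Table 3: |f15| = 3, |f16| = 2(n-1), |f18| = n-1, ext = 20n+7
        return 15 * 3 + 16 * 2 * (n - 1) + 18 * (n - 1) + 20 * n + 7
    # Table 4: |f15| = 2, |f16| = 2n, |f17| = 2m-5, |f18| = 2mn-2m-3n+3
    return (15 * 2 + 16 * 2 * n + 17 * (2 * m - 5)
            + 18 * (2 * m * n - 2 * m - 3 * n + 3) + 20 * n + 10 * m - 13)

# Agrees with both branches of the closed form.
assert all(fi_carbon_sheet(2, n) == 70 * n + 2 for n in range(2, 30))
assert all(fi_carbon_sheet(m, n) == 36 * m * n - 2 * (n - 4 * m) - 14
           for m in range(3, 20) for n in range(2, 20))
```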

    The polyhedron generalized sheet of C28 shown in Figure 8 is made by generalizing the C28 polyhedron structure shown in Figure 7. This particular C28 polyhedron structure is given in [57]. The polyhedron generalized sheet of C28 is symbolized $PHS_{m,n}$, where $n$ represents the total number of vertical C28 polyhedra and $m$ the number of horizontal C28 polyhedra. It contains $23mn+3n+2m$ vertices and $33mn+n+m$ edges. The required counts are detailed in Table 5.

    Figure 7.  Polyhedron generalized sheet of C28 for m=n=1, or PHS1,1.
    Figure 8.  Polyhedron generalized sheet of C28 or PHSm,n.
    Table 5.  The number of f14,f15,f16,f17,f18,f20, and f35 in each dimension.
    m |f14| |f15| |f16| |f17| |f18| |f20| |f35|
    1 2n+1 2 4n-2 0 0 2n-1 0
    2 2n+2 2 8n-2 2 2n-2 4n-2 2n-1
    3 2n+3 2 12n-2 4 4n-4 6n-3 4n-2
    . . . . . . . .
    . . . . . . . .
    . . . . . . . .
    m 2n+m 2 4mn-2 2m-2 2mn-2(m+n)+2 2mn-m 2mn-(m+2n)+1


    Theorem 2.5. Let $PHS_{m,n}$ be the polyhedron generalized sheet of C28 of dimension $(m,n)$ with $m,n\ge 1$. Then the face index of $PHS_{m,n}$ is

    $$FI(PHS_{m,n})=210mn-2(3m+5n).$$

    Proof. Consider $PHS_{m,n}$, the polyhedron generalized sheet of C28 of dimension $(m,n)$ with $m,n\ge 1$. Let $f_i$ denote a face of the graph $PHS_{m,n}$ having degree $i$, that is, $d(f_i)=\sum_{\alpha\sim f_i}d(\alpha)=i$, and let $|f_i|$ denote the number of faces with degree $i$. For the general values $m,n\ge 1$, the graph contains seven types of internal faces $f_{14}$, $f_{15}$, $f_{16}$, $f_{17}$, $f_{18}$, $f_{20}$ and $f_{35}$, with one external face $f_\infty$ in the usual manner. For the face index of the polyhedron generalized sheet, details are given in Table 5.

    For the general values $m,n\ge 1$, the counts are $|f_{14}|=2n+m$, $|f_{15}|=2$, $|f_{16}|=4mn-2$, $|f_{17}|=2(m-1)$, $|f_{18}|=2mn-2(m+n)+2$, $|f_{20}|=2mn-m$, and $|f_{35}|=2mn-m-2n+1$, with one external face $f_\infty$. Now the face index $FI$ of the graph $PHS_{m,n}$ is given by

    $$\begin{aligned}FI(PHS_{m,n})&=\sum_{\alpha\sim f\in F(PHS_{m,n})}d(\alpha)\\&=\sum_{\alpha\sim f_{14}}d(\alpha)+\sum_{\alpha\sim f_{15}}d(\alpha)+\sum_{\alpha\sim f_{16}}d(\alpha)+\sum_{\alpha\sim f_{17}}d(\alpha)+\sum_{\alpha\sim f_{18}}d(\alpha)+\sum_{\alpha\sim f_{20}}d(\alpha)+\sum_{\alpha\sim f_{35}}d(\alpha)+\sum_{\alpha\sim f_\infty}d(\alpha)\\&=|f_{14}|(14)+|f_{15}|(15)+|f_{16}|(16)+|f_{17}|(17)+|f_{18}|(18)+|f_{20}|(20)+|f_{35}|(35)+37m+68n-35\\&=210mn-6m-10n.\end{aligned}$$
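Theorem 2.5 can be verified from the Table 5 counts and the external degree sum $37m+68n-35$ (a verification sketch; the function name is ours):

```python
def fi_polyhedron_sheet(m, n):
    """Face index of PHS_{m,n} assembled from the Table 5 face counts."""
    return (14 * (2 * n + m) + 15 * 2 + 16 * (4 * m * n - 2)
            + 17 * (2 * m - 2) + 18 * (2 * m * n - 2 * (m + n) + 2)
            + 20 * (2 * m * n - m) + 35 * (2 * m * n - (m + 2 * n) + 1)
            + 37 * m + 68 * n - 35)  # last term: external-face degree sum

# Agrees with the closed form 210mn - 2(3m + 5n) = 210mn - 6m - 10n.
assert all(fi_polyhedron_sheet(m, n) == 210 * m * n - 6 * m - 10 * n
           for m in range(1, 20) for n in range(1, 20))
```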

    With the advancement of technology, the equipment and apparatus for studying different chemical compounds have evolved, but topological descriptors or indices remain preferable and useful tools for developing a numerical science of compounds. Therefore, from time to time new topological indices are introduced to study different chemical compounds in depth. In this study, we discussed a newly developed tool, the face index, for some silicate-type networks and generalized sheets, the carbon sheet and the polyhedron generalized sheet. It provides numerical values for these networks based on the information of their faces. It also helps to study physicochemical characteristics based on the faces of silicate networks.

    M. K. Jamil conceived of the presented idea. K. Dawood developed the theory and performed the computations. M. Azeem verified the analytical methods, R. Luo investigated and supervised the findings of this work. All authors discussed the results and contributed to the final manuscript.

    This work was supported by the National Science Foundation of China (11961021 and 11561019), Guangxi Natural Science Foundation (2020GXNSFAA159084), and Hechi University Research Fund for Advanced Talents (2019GCC005).

    The authors declare that they have no conflicts of interest.



    [1] M. Chen, S. Mao, Y. Liu, Big data: A Survey, Mobile Netw. Appl., 19 (2014), 171-209. doi: 10.1007/s11036-013-0489-0
    [2] I. Guyon, A. Elisseeff, An introduction to variable and feature selection, J. Mach. Learn Res., 3 (2003), 1157-1182.
    [3] Y. Wan, M. Wang, Z. Ye, X. Lai, A feature selection method based on modified binary coded ant colony optimization algorithm, Appl. Soft Comput., 49 (2016), 248-258. doi: 10.1016/j.asoc.2016.08.011
    [4] H. Liu, H. Motoda, Feature selection for knowledge discovery and data mining, Kluwer Academic, 2012.
    [5] Z. Sun, G. Bebis, R. Miller, Object detection using feature subset selection, Pattern Recogn., 37 (2004), 2165-2176. doi: 10.1016/j.patcog.2004.03.013
    [6] H. Liu, H. Motoda, Feature Extraction, Construction and Selection: A Data Mining Perspective Springer Science & Business Media, Boston, MA, 1998.
    [7] Z. Zheng, X. Wu, R. K. Srihari, Feature selection for text categorization on imbalanced data, ACM Sigkdd Explor. Newsl., 6 (2004), 80-89. doi: 10.1145/1007730.1007741
    [8] H. Uguz, A two-stage feature selection method for text categorization by using information gain, principal component analysis and genetic algorithm, Knowl.-Based Syst., 24 (2011), 1024-1032. doi: 10.1016/j.knosys.2011.04.014
    [9] H. K. Ekenel, B. Sankur, Feature selection in the independent component subspace for face recognition, Pattern Recogn. Lett., 25 (2004), 1377-1388. doi: 10.1016/j.patrec.2004.05.013
    [10] H. R. Kanan, K. Faez, An improved feature selection method based on ant colony optimization (ACO) evaluated on face recognition system, Appl. Math. Comput., 205 (2008), 716-725.
    [11] F. Model, P. Adorjan, A. Olek, C. Piepenbrock, Feature selection for DNA methylation based cancer classification, Bioinformatics, 17 (2001), S157-S164. doi: 10.1093/bioinformatics/17.suppl_1.S157
    [12] N. Chuzhanova, A. J. Jones, S. Margetts, Feature selection for genetic sequence classification, Bioinformatics, 14 (1998), 139-143. doi: 10.1093/bioinformatics/14.2.139
    [13] S. Tabakhi, A. Najafi, R. Ranjbar, P. Moradi, Gene selection for microarray data classification using a novel ant colony optimization, Neurocomputing, 168 (2015), 1024-1036. doi: 10.1016/j.neucom.2015.05.022
    [14] D. Liang, C. F. Tsai, H. T. Wu, The effect of feature selection on financial distress prediction, Knowl.-Based Syst., 73 (2015), 289-297. doi: 10.1016/j.knosys.2014.10.010
    [15] M. Ramezani, P. Moradi, F. A. Tab, Improve performance of collaborative filtering systems using backward feature selection, in The 5th Conference on Information and Knowledge Technology, (2013), 225-230.
    [16] B. Tseng, Tzu Liang Bill, C. C. Huang, Rough set-based approach to feature selection in customer relationship management, Omega, 35 (2007), 365-383. doi: 10.1016/j.omega.2005.07.006
    [17] R. Sawhney, P. Mathur, R. Shankar, A firefly algorithm based wrapper-penalty feature selection method for cancer diagnosis, in International Conference on Computational Science and Its Applications, Springer, Cham, (2018), 438-449.
    [18] B. Guo, R. I. Damper, S. R. Gunn, J. D. B. Nelson, A fast separability-based feature-selection method for high-dimensional remotely sensed image classification, Pattern Recogn., 41 (2008), 1653-1662. doi: 10.1016/j.patcog.2007.11.007
    [19] R. Abraham, J. B. Simha, S. S. Iyengar, Medical datamining with a new algorithm for feature selection and naive bayesian classifier, in 10th International Conference on Information Technology (ICIT 2007), IEEE, 2007, 44-49.
    [20] L. Yu, H. Liu, Feature selection for high-dimensional data: A fast correlation-based filter solution, in Proceedings of the 20th international conference on machine learning (ICML-03), (2003), 856-863.
    [21] L. Cosmin, J. Taminau, S. Meganck, D. Steenhoff, A survey on filter techniques for feature selection in gene expression microarray analysis, IEEE/ACM Trans. Comput. Biol. Bioinf., 9 (2012), 1106-1119. doi: 10.1109/TCBB.2012.33
    [22] S. Maldonado, R. Weber, A wrapper method for feature selection using support vector machines, Inform. Sci., 179 (2009), 2208-2217. doi: 10.1016/j.ins.2009.02.014
    [23] J. Huang, Y. Cai, X. Xu, A hybrid genetic algorithm for feature selection wrapper based on mutual information, Pattern Recogn. Lett., 28 (2007), 1825-1844. doi: 10.1016/j.patrec.2007.05.011
    [24] C. Tang, X. Liu, X. Zhu, J. Xiong, M. Li, J. Xia, et al., Feature selective projection with low-rank embedding and dual laplacian regularization, IEEE Trans. Knowl. Data. Eng., 32 (2019), 1747-1760.
    [25] C. Tang, M. Bian, X. Liu, M. Li, H. Zhou, P. Wang, et al., Unsupervised feature selection via latent representation learning and manifold regularization, Neural Networks, 117 (2019), 163-178. doi: 10.1016/j.neunet.2019.04.015
    [26] S. Sharifzadeh, L. Clemmensen, C. Borggaard, S. Støier, B. K. Ersbøll, Supervised feature selection for linear and non-linear regression of L*a*b color from multispectral images of meat, Eng. Appl. Artif. Intel., 27 (2013), 211-227.
    [27] C. Tang, X. Zheng, X. Liu, L. Wang, Cross-view locality preserved diversity and consensus learning for multi-view unsupervised feature selection, IEEE Trans. Knowl. Data. Eng., 2021.
    [28] C. Tang, X. Zhu, X. Liu, L. Wang, Cross-View local structure preserved diversity and consensus learning for multi-view unsupervised feature selection, in Proceedings of the AAAI Conference on Artificial Intelligence, 33 (2019), 5101-5108.
    [29] M. Dorigo, Optimization, Learning and Natural Algorithms, Phd Thesis Politecnico Di Milano, 1992.
    [30] C. Lai, M. J. T. Reinders, L. Wessels, Random subspace method for multivariate feature selection, Pattern Recogn. Lett., 27 (2006), 1067-1076. doi: 10.1016/j.patrec.2005.12.018
    [31] B. Xue, M. Zhang, W. N. Browne, X. Yao, A survey on evolutionary computation approaches to feature selection, IEEE Trans. Evol. Comput., 20 (2016), 606-626. doi: 10.1109/TEVC.2015.2504420
    [32] J. J. Grefenstette, Optimization of control parameters for genetic algorithms, IEEE Trans. Syst. Man Cybern., 16 (1986), 122-128. doi: 10.1109/TSMC.1986.289288
    [33] R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in MHS'95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, IEEE, (1995), 39-43.
    [34] K. Chen, F. Y. Zhou, X. F. Yuan, Hybrid particle swarm optimization with spiral-shaped mechanism for feature selection, Expert Syst. Appl., 128 (2019), 140-156. doi: 10.1016/j.eswa.2019.03.039
    [35] S. Mirjalili, A. Lewis, The whale optimization algorithm, Adv. Eng. Softw., 95 (2016), 51-67. doi: 10.1016/j.advengsoft.2016.01.008
    [36] S. Mirjalili, S.M. Mirjalili, A. Lewis, Grey wolf optimizer, Adv. Eng. Softw., 69 (2014), 46-61. doi: 10.1016/j.advengsoft.2013.12.007
    [37] S. Saremi, S. Mirjalili, A. Lewis, Grasshopper optimisation algorithm: Theory and application, Adv. Eng. Softw., 105 (2017), 30-47. doi: 10.1016/j.advengsoft.2017.01.004
    [38] E. Rashedi, H. Nezamabadi-Pour, S. Saryazdi, GSA: a gravitational search algorithm, Inform. Sci., 179 (2009), 2232-2248. doi: 10.1016/j.ins.2009.03.004
    [39] S. Li, H. Chen, M. Wang, A. A. Heidari, S. Mirjalili, Slime mould algorithm: A new method for stochastic optimization, Future Gener. Comput. Syst., 111 (2020), 300-323. doi: 10.1016/j.future.2020.03.055
    [40] Y. Yang, H. Chen, A. A. Heidari, A. H. Gandomi, Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts, Expert Syst. Appl., 171 (2021), 114864.
    [41] I. Ahmadianfar, O. Bozorg-Haddad, X. Chu, Gradient-based optimizer: A new metaheuristic optimization algorithm, Inform. Sci., 540 (2020), 131-159. doi: 10.1016/j.ins.2020.06.037
    [42] S. Mirjalili, Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems, Neural Comput. Appl., 27 (2016), 1053-1073. doi: 10.1007/s00521-015-1920-1
    [43] T. J. Ypma, Historical development of the Newton-Raphson method, SIAM Rev., 37 (1995), 531-551. doi: 10.1137/1037125
    [44] S. Mirjalili, A. Lewis, S-shaped versus V-shaped transfer functions for binary particle swarm optimization, Swarm Evol. Comput., 9 (2013), 1-14. doi: 10.1016/j.swevo.2012.09.002
    [45] H. Liu, J. Li, L. Wong, A comparative study on feature selection and classification methods using gene expression profiles and proteomic patterns, Genome Inform., 13 (2002), 51-60.
    [46] M. Dash, H. Liu, Feature selection for classification, Intell. Data Anal., 1 (1997), 131-156. doi: 10.3233/IDA-1997-1302
    [47] W. Siedlecki, J. Sklansky, A note on genetic algorithms for large-scale feature selection, Pattern Recogn. Lett., 10 (1989), 335-347. doi: 10.1016/0167-8655(89)90037-8
    [48] R. Leardi, R. Boggia, M. Terrile, Genetic algorithms as a strategy for feature selection, J. Chemom., 6 (1992), 267-281. doi: 10.1002/cem.1180060506
    [49] I. S. Oh, J. S. Lee, B. R. Moon, Hybrid genetic algorithms for feature selection, IEEE Trans. Pattern Anal., 26 (2004), 1424-1437. doi: 10.1109/TPAMI.2004.105
    [50] R. Storn, K. Price, Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim., 11 (1997), 341-359. doi: 10.1023/A:1008202821328
    [51] B. Xue, W. Fu, M. Zhang, Differential evolution (DE) for multi-objective feature selection in classification, in Proceedings of the Companion Publication of the 2014 Annual Conference on Genetic and Evolutionary Computation, (2014), 83-84.
    [52] D. Karaboga, B. Akay, A comparative study of artificial bee colony algorithm, Appl. Math. Comput., 214 (2009), 108-132.
    [53] A. A. Heidari, S. Mirjalili, H. Faris, I. Aljarah, M. Mafarja, H. Chen, Harris hawks optimization: Algorithm and applications, Future Gener. Comput. Syst., 97 (2019), 849-872. doi: 10.1016/j.future.2019.02.028
    [54] A. Faramarzi, M. Heidarinejad, S. Mirjalili, A. H. Gandomi, Marine predators algorithm: A nature-inspired metaheuristic, Expert Syst. Appl., 152 (2020), 113377. doi: 10.1016/j.eswa.2020.113377
    [55] O. S. Qasim, Z. Algamal, Feature selection using particle swarm optimization-based logistic regression model, Chemom. Intell. Lab. Syst., 182 (2018), 41-46. doi: 10.1016/j.chemolab.2018.08.016
    [56] K. Chen, F. Y. Zhou, X. F. Yuan, Hybrid particle swarm optimization with spiral-shaped mechanism for feature selection, Expert Syst. Appl., 128 (2019), 140-156. doi: 10.1016/j.eswa.2019.03.039
    [57] B. Xue, M. Zhang, W. N. Browne, Particle swarm optimization for feature selection in classification: a multi-objective approach, IEEE Trans. Cybern., 43 (2013), 1656-1671. doi: 10.1109/TSMCB.2012.2227469
    [58] E. Emary, H. M. Zawbaa, A. E. Hassanien, Binary grey wolf optimization approaches for feature selection, Neurocomputing, 172 (2016), 371-381. doi: 10.1016/j.neucom.2015.06.083
    [59] P. Hu, J. S. Pan, S. C. Chu, Improved binary grey wolf optimizer and its application for feature selection, Knowl.-Based Syst., 195 (2020), 105746. doi: 10.1016/j.knosys.2020.105746
    [60] T. Qiang, X. Chen, X. Liu, Multi-strategy ensemble grey wolf optimizer and its application to feature selection, Appl. Soft Comput., 76 (2019), 16-30. doi: 10.1016/j.asoc.2018.11.047
    [61] M. M. Mafarja, S. Mirjalili, Whale optimization approaches for wrapper feature selection, Appl. Soft Comput., 62 (2018), 441-453. doi: 10.1016/j.asoc.2017.11.006
    [62] M. M. Mafarja, S. Mirjalili, Hybrid whale optimization algorithm with simulated annealing for feature selection, Neurocomputing, 260 (2017), 302-312. doi: 10.1016/j.neucom.2017.04.053
    [63] R. K. Agrawal, B. Kaur, S. Sharma, Quantum based whale optimization algorithm for wrapper feature selection, Appl. Soft Comput., 89 (2020), 106092. doi: 10.1016/j.asoc.2020.106092
    [64] C. R. Hwang, Simulated annealing: Theory and applications, Acta Appl. Math., 12 (1988), 108-111.
    [65] H. Shareef, A. A. Ibrahim, A. H. Mutlag, Lightning search algorithm, Appl. Soft Comput., 36 (2015), 315-333. doi: 10.1016/j.asoc.2015.07.028
    [66] S. Mirjalili, S. M. Mirjalili, A. Hatamlou, Multi-verse optimizer: a nature-inspired algorithm for global optimization, Neural Comput. Appl., 27 (2016), 495-513. doi: 10.1007/s00521-015-1870-7
    [67] H. Abedinpourshotorban, S. M. Shamsuddin, Z. Beheshti, D. N. A. Jawawib, Electromagnetic field optimization: A physics-inspired metaheuristic optimization algorithm, Swarm Evol. Comput., 26 (2016), 8-22. doi: 10.1016/j.swevo.2015.07.002
    [68] A. Y. S. Lam, V. O. K. Li, Chemical-reaction-inspired metaheuristic for optimization, IEEE Trans. Evol. Comput., 14 (2009), 381-399.
    [69] F. A. Hashim, E. H. Houssein, M. S. Mabrouk, W. Al-Atabany, S. Mirjalili, Henry gas solubility optimization: A novel physics-based algorithm, Future Gener. Comp. Syst., 101 (2019), 646-667. doi: 10.1016/j.future.2019.07.015
    [70] R. Meiri, J. Zahavi, Using simulated annealing to optimize the feature selection problem in marketing applications, Eur. J. Oper. Res., 171 (2006), 842-858. doi: 10.1016/j.ejor.2004.09.010
    [71] S. W. Lin, Z. J. Lee, S. C. Chen, T. Y. Tseng, Parameter determination of support vector machine and feature selection using simulated annealing approach, Appl. Soft Comput., 8 (2008), 1505-1512. doi: 10.1016/j.asoc.2007.10.012
    [72] E. Rashedi, H. Nezamabadi-Pour, S. Saryazdi, BGSA: Binary gravitational search algorithm, Nat. Comput., 9 (2010), 727-745. doi: 10.1007/s11047-009-9175-3
    [73] S. Nagpal, S. Arora, S. Dey, Feature selection using gravitational search algorithm for biomedical data, Procedia Comput. Sci., 115 (2017), 258-265. doi: 10.1016/j.procs.2017.09.133
    [74] P. C. S. Rao, A. J. S. Kumar, Q. Niyaz, P. Sidike, V. K. Devabhaktuni, Binary chemical reaction optimization based feature selection techniques for machine learning classification problems, Expert Syst. Appl., 167 (2021), 114169. doi: 10.1016/j.eswa.2020.114169
    [75] N. Neggaz, E. H. Houssein, K. Hussain, An efficient henry gas solubility optimization for feature selection, Expert Syst. Appl., 152 (2020), 113364. doi: 10.1016/j.eswa.2020.113364
    [76] R. V. Rao, V. J. Savsani, D. P. Vakharia, Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems, Comput. Aided Design, 43 (2011), 303-315. doi: 10.1016/j.cad.2010.12.015
    [77] S. Hosseini, A. A. Khaled, A survey on the imperialist competitive algorithm metaheuristic: implementation in engineering domain and directions for future research, Appl. Soft Comput., 24 (2014), 1078-1094. doi: 10.1016/j.asoc.2014.08.024
    [78] R. Moghdani, K. Salimifard, Volleyball premier league algorithm, Appl. Soft Comput., 64 (2017), 161-185.
    [79] H. C. Kuo, C. H. Lin, Cultural evolution algorithm for global optimizations and its applications, J. Appl. Res. Technol., 11 (2013), 510-522. doi: 10.1016/S1665-6423(13)71558-X
    [80] M. Allam, M. Nandhini, Optimal feature selection using binary teaching learning based optimization algorithm, J. King Saud Univ.-Comput. Inform. Sci., 10 (2018).
    [81] S. J. Mousavirad, H. Ebrahimpour-Komleh, Feature selection using modified imperialist competitive algorithm, in ICCKE 2013, IEEE, (2013), 400-405.
    [82] A. Keramati, M. Hosseini, M. Darzi, A. A. Liaei, Cultural algorithm for feature selection, in The 3rd International Conference on Data Mining and Intelligent Information Technology Applications, IEEE, (2011), 71-76.
    [83] D. H. Wolpert, W. G. Macready, No free lunch theorems for optimization, IEEE Trans. Evol. Comput., 1 (1997), 67-82. doi: 10.1109/4235.585893
    [84] A. Fink, S. Vo, Solving the continuous flow-shop scheduling problem by metaheuristics, Eur. J. Oper. Res., 151 (2003), 400-414. doi: 10.1016/S0377-2217(02)00834-2
    [85] J. Kennedy, R. C. Eberhart, A discrete binary version of the particle swarm algorithm, in 1997 IEEE International conference on systems, man, and cybernetics. Computational cybernetics and simulation, IEEE, 5 (1997), 4104-4108.
    [86] E. Rashedi, H. Nezamabadi-Pour, S. Saryazdi, BGSA: binary gravitational search algorithm, Nat. Comput., 9 (2010), 727-745. doi: 10.1007/s11047-009-9175-3
    [87] N. S. Altman, An introduction to kernel and nearest-neighbor nonparametric regression, Am. Stat., 46 (1992), 175-185.
    [88] F. Pernkopf, Bayesian network classifiers versus selective k-NN classifier, Pattern Recogn., 38 (2005), 1-10. doi: 10.1016/j.patcog.2004.05.012
    [89] A. Asuncion, D. Newman, UCI Machine Learning Repository, University of California, 2007.
    [90] E. Emary, H. M. Zawbaa, A. E. Hassanien, Binary ant lion approaches for feature selection, Neurocomputing, 213 (2016), 54-65. doi: 10.1016/j.neucom.2016.03.101
    [91] Arizona State University (ASU) feature selection repository. Available from: http://featureselection.asu.edu/datasets.php.
    [92] A. I. Hammouri, M. Mafarja, M. A. Al-Betar, M. A. Awadallah, I. Abu-Doush, An improved Dragonfly Algorithm for feature selection, Knowl.-Based Syst., 203 (2020), 106131. doi: 10.1016/j.knosys.2020.106131
    [93] H. Faris, M. M. Mafarja, A. A. Heidari, I. Aljarah, A. M. Al-Zoubi, S. Mirjalili, et al., An efficient binary salp swarm algorithm with crossover scheme for feature selection problems, Knowl.-Based Syst., 154 (2018), 43-67. doi: 10.1016/j.knosys.2018.05.009
    [94] M. Mafarja, S. Mirjalili, Whale optimization approaches for wrapper feature selection, Appl. Soft Comput., 62 (2018), 441-453. doi: 10.1016/j.asoc.2017.11.006
    [95] M. Mafarja, I. Aljarah, H. Faris, A. I. Hammouri, A. M. Al-Zoubi, S. Mirjalili, Binary grasshopper optimisation algorithm approaches for feature selection problems, Expert Syst. Appl., 117 (2019), 267-286. doi: 10.1016/j.eswa.2018.09.015
    [96] E. Emary, H. M. Zawbaa, A. E. Hassanien, Binary grey wolf optimization approaches for feature selection, Neurocomputing, 172 (2016), 371-381. doi: 10.1016/j.neucom.2015.06.083
  • © 2021 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)