In recent years, augmented reality has emerged as a technology with huge potential in image-guided surgery, and its application in brain tumor surgery in particular seems promising. Augmented reality can be divided into two parts: hardware and software. Further, artificial intelligence, and deep learning in particular, have attracted great interest from researchers in the medical field, especially for the diagnosis of brain tumors. In this paper, we focus on the software part of an augmented reality scenario. The main objective of this study was to develop a classification technique based on a deep belief network (DBN) and a softmax classifier to (1) distinguish a benign brain tumor from a malignant one by exploiting the spatial heterogeneity of cancer tumors and homologous anatomical structures, and (2) extract the brain tumor features. Our classification method consists of three steps. In the first step, a global affine transformation is applied as a registration preprocessing step so that different locations (voxels, ROIs) yield the same or similar results. In the next step, an unsupervised DBN with unlabeled features is used for the learning process. The discriminative subsets of features obtained in the first two steps serve as input to the classifier and are used in the third step for evaluation by a hybrid system combining the DBN and a softmax classifier. For the evaluation, we used data from Harvard Medical School to train the DBN with softmax regression. The model performed well in the classification phase, achieving an improved accuracy of 97.2%.
Citation: Karim Gasmi, Ahmed Kharrat, Lassaad Ben Ammar, Ibtihel Ben Ltaifa, Moez Krichen, Manel Mrabet, Hamoud Alshammari, Samia Yahyaoui, Kais Khaldi, Olfa Hrizi. Classification of MRI brain tumors based on registration preprocessing and deep belief networks[J]. AIMS Mathematics, 2024, 9(2): 4604-4631. doi: 10.3934/math.2024222
A well-known and active direction in the study of derivations is the local derivations problem, which was initiated by Kadison [8] and Larson and Sourour [9]. Recall that a linear map φ of an algebra A is called a local derivation if for each x∈A, there exists a derivation φx of A, depending on x, such that φ(x)=φx(x). The question of determining under what conditions every local derivation must be a derivation has been studied by many authors (see [4,6,7,13,15]). Recently, Brešar [2] proved that each local derivation of algebras generated by all their idempotents is a derivation.
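For example, for a fixed a∈A the inner derivation d_a(x)=ax−xa is a derivation, since a direct computation gives

d_a(xy)=a(xy)-(xy)a=(ax-xa)y+x(ay-ya)=d_a(x)y+x\,d_a(y).

In particular, any linear map that agrees at each point x with some (possibly point-dependent) derivation of this kind is a local derivation; the problem is to decide when this pointwise condition already forces the map to satisfy the Leibniz rule globally.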
A linear map φ of an algebra A is called a Lie derivation if φ([x,y])=[φ(x),y]+[x,φ(y)] for all x,y∈A, where [x,y]=xy−yx is the usual Lie product, also called a commutator. A Lie derivation φ of A is standard if it can be decomposed as φ=d+τ, where d is a derivation from A into itself and τ is a linear map from A into its center vanishing on each commutator. The classical problem, which has been studied for many years, is to find conditions on A under which each Lie derivation is standard or standard-like. We say that a linear map φ from A into itself is a local Lie derivation if for each x∈A, there exists a Lie derivation φx of A such that φ(x)=φx(x). In [3], Chen et al. studied local Lie derivations of operator algebras on Banach spaces. We remark that the methods in [3] depend heavily on rank one operators in B(X). Later, Liu and Zhang [10] proved that each local Lie derivation of factor von Neumann algebras is a Lie derivation. Liu and Zhang [11] investigated local Lie derivations of a certain class of operator algebras. An et al. [1] proved that every local Lie derivation on von Neumann algebras is a Lie derivation.
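Note that every standard map is indeed a Lie derivation: if φ=d+τ with d a derivation and τ central-valued and vanishing on each commutator, then

\varphi([x,y])=d([x,y])=[d(x),y]+[x,d(y)]=[\varphi(x),y]+[x,\varphi(y)],

since τ([x,y])=0 and the central elements τ(x), τ(y) commute with every element of A. The nontrivial part of the problem is therefore the converse decomposition.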
It is quite common to study local derivations in algebras that contain many idempotents, in the sense that the linear span of all idempotents is 'large'. The main novelty of this paper is that we shall deal with the subalgebra generated by all idempotents instead of their span. Let M2 be the algebra of 2×2 matrices over L∞[0,1]. By [6], M2 is generated by, but not spanned by, its idempotents. In what follows, we denote by J(A) the subalgebra of A generated by all idempotents in A. The purpose of the present paper is to study local Lie derivations of a certain class of generalized matrix algebras. Finally we apply the main result to full matrix algebras and unital simple algebras with nontrivial idempotents.
Let A and B be two unital algebras with unit elements 1A and 1B, respectively. A Morita context consists of A, B, an (A,B)-bimodule M, a (B,A)-bimodule N, and two bimodule homomorphisms, called the pairings, ΦMN:M⊗BN→A and ΨNM:N⊗AM→B, satisfying the associativity conditions

ΦMN(m⊗n)m′=mΨNM(n⊗m′) and ΨNM(n⊗m)n′=nΦMN(m⊗n′)

for all m,m′∈M and n,n′∈N.
If (A,B,M,N,ΦMN,ΨNM) is a Morita context, then the set
G=\begin{pmatrix} A & M \\ N & B \end{pmatrix}=\left\{\begin{pmatrix} a & m \\ n & b \end{pmatrix}\,\middle|\,a\in A,\ m\in M,\ n\in N,\ b\in B\right\}
forms an algebra under matrix-like addition and multiplication. Such an algebra is called a generalized matrix algebra. We further assume that M is faithful as an (A,B)-bimodule. The most common examples of generalized matrix algebras are full matrix algebras and triangular algebras.
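Explicitly, the matrix-like multiplication determined by the pairings is the usual Morita-context formula

\begin{pmatrix} a & m \\ n & b \end{pmatrix}\begin{pmatrix} a' & m' \\ n' & b' \end{pmatrix}=\begin{pmatrix} aa'+\Phi_{MN}(m\otimes n') & am'+mb' \\ na'+bn' & \Psi_{NM}(n\otimes m')+bb' \end{pmatrix},

and, as is customary, we simply write mn for ΦMN(m⊗n) and nm for ΨNM(n⊗m).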
Consider the algebra G. Any element of the form
\begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix}\in G
will be denoted by a⊕b. Let us define two natural projections πA:G→A and πB:G→B by
\pi_A:\begin{pmatrix} a & m \\ n & b \end{pmatrix}\mapsto a \quad\text{and}\quad \pi_B:\begin{pmatrix} a & m \\ n & b \end{pmatrix}\mapsto b.
The center of G is
Z(G)={a⊕b∣am=mb,na=bn for all m∈M,n∈N}. |
Furthermore, πA(Z(G))⊆Z(A) and πB(Z(G))⊆Z(B), and there exists a unique algebra isomorphism η from πB(Z(G)) to πA(Z(G)) such that η(b)m=mb and nη(b)=bn for all m∈M,n∈N (see [14]). Set
e=\begin{pmatrix} 1_A & 0 \\ 0 & 0 \end{pmatrix},\qquad f=\begin{pmatrix} 0 & 0 \\ 0 & 1_B \end{pmatrix}.
We immediately notice that e and f are orthogonal idempotents of G and so G may be represented as G=(e+f)G(e+f)=eGe+eGf+fGe+fGf. Then each element x=exe+exf+fxe+fxf∈G can be represented in the form x=eae+emf+fne+fbf=a+m+n+b, where a∈A,b∈B,m∈M,n∈N.
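Concretely, a direct computation shows that for x=\begin{pmatrix} a & m \\ n & b \end{pmatrix} one has

exe=\begin{pmatrix} a & 0 \\ 0 & 0 \end{pmatrix},\quad exf=\begin{pmatrix} 0 & m \\ 0 & 0 \end{pmatrix},\quad fxe=\begin{pmatrix} 0 & 0 \\ n & 0 \end{pmatrix},\quad fxf=\begin{pmatrix} 0 & 0 \\ 0 & b \end{pmatrix},

which justifies the identifications A≅eGe, M≅eGf, N≅fGe and B≅fGf used in the sequel.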
We close this section with a well-known result concerning Lie derivations.
Proposition 1.1. (See [5],Theorem 1) Let G be a generalized matrix algebra. Suppose that
(1) Z(A)=πA(Z(G)) and Z(B)=πB(Z(G));
(2) either A or B does not contain nonzero central ideals.
Then every Lie derivation φ:G→G is standard, that is, φ is the sum of a derivation d and a linear central-valued map τ vanishing on each commutator.
Our main result reads as follows.
Theorem 2.1. Let G be a generalized matrix algebra. Suppose that
(1) A=J(A) and B=J(B);
(2) Z(A)=πA(Z(G)) and Z(B)=πB(Z(G));
(3) either A or B does not contain nonzero central ideals.
Then every local Lie derivation φ from G into itself is a sum of a derivation δ and a linear central-valued map h vanishing on each commutator.
To prove Theorem 2.1, we need some lemmas. In the following, φ is a local Lie derivation and, for any x∈G, the symbol φx stands for a Lie derivation from G into itself such that φ(x)=φx(x). It follows from A=J(A) that every a in A can be written as a linear combination of products p1p2⋯pk, where p1,p2,…,pk are idempotents in A.
Lemma 2.2. Let p,q∈G be idempotents, then for every x∈G, there exist linear maps τ1,τ2,τ3,τ4:G→Z(G) vanishing on each commutator such that
φ(pxq)=φ(px)q+pφ(xq)−pφ(x)q+p⊥τ1(pxq)q⊥−pτ2(p⊥xq)q⊥+pτ3(p⊥xq⊥)q−p⊥τ4(pxq⊥)q, |
where p⊥=1−p and q⊥=1−q.
Proof. Proposition 1.1 implies that for all idempotents p,q∈G and every x∈G, there exist derivations d1,d2,d3,d4:G→G and linear maps τ1,τ2,τ3,τ4:G→Z(G) vanishing on each commutator such that
φ(pxq)=φpxq(pxq)=d1(pxq)+τ1(pxq), | (2.1) |
φ(p⊥xq)=φp⊥xq(p⊥xq)=d2(p⊥xq)+τ2(p⊥xq), | (2.2) |
φ(p⊥xq⊥)=φp⊥xq⊥(p⊥xq⊥)=d3(p⊥xq⊥)+τ3(p⊥xq⊥), | (2.3) |
φ(pxq⊥)=φpxq⊥(pxq⊥)=d4(pxq⊥)+τ4(pxq⊥). | (2.4) |
It follows from (2.1)–(2.4) that
p⊥φ(pxq)q⊥=p⊥τ1(pxq)q⊥, pφ(p⊥xq)q⊥=pτ2(p⊥xq)q⊥, |
pφ(p⊥xq⊥)q=pτ3(p⊥xq⊥)q, p⊥φ(pxq⊥)q=p⊥τ4(pxq⊥)q. |
Hence
φ(pxq)q⊥=pφ(pxq)q⊥+p⊥φ(pxq)q⊥=pφ(xq)q⊥−pφ(p⊥xq)q⊥+p⊥φ(pxq)q⊥=pφ(xq)q⊥+p⊥τ1(pxq)q⊥−pτ2(p⊥xq)q⊥=pφ(xq)−pφ(xq)q+p⊥τ1(pxq)q⊥−pτ2(p⊥xq)q⊥, |
φ(pxq⊥)q=pφ(pxq⊥)q+p⊥φ(pxq⊥)q=pφ(xq⊥)q−pφ(p⊥xq⊥)q+p⊥φ(pxq⊥)q=pφ(xq⊥)q−pτ3(p⊥xq⊥)q+p⊥τ4(pxq⊥)q. |
Thus,
φ(pxq)=φ(pxq)q⊥+φ(pxq)q=φ(pxq)q⊥+φ(px)q−φ(pxq⊥)q=φ(px)q+pφ(xq)−pφ(x)q+p⊥τ1(pxq)q⊥−pτ2(p⊥xq)q⊥+pτ3(p⊥xq⊥)q−p⊥τ4(pxq⊥)q. |
It is easy to verify that for each derivation d:G→G, we have
d(e)=−d(f)∈M⊕N, d(A)⊆A⊕M⊕N, d(M)⊆A⊕M⊕B. | (2.5) |
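For completeness, the first fact in (2.5) can be checked directly: since d(1)=d(1⋅1)=2d(1), we have d(1)=0 and hence d(f)=d(1−e)=−d(e); moreover, from d(e)=d(e2)=d(e)e+ed(e) we get

ed(e)e=ed(e)e+ed(e)e, \qquad fd(e)f=fd(e)(ef)+(fe)d(e)f=0,

so ed(e)e=0 and d(e)=ed(e)f+fd(e)e∈M⊕N. The other two inclusions in (2.5) follow from similar computations.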
Lemma 2.3. eφ(e)e+fφ(e)f∈Z(G).
Proof. For any m∈M, there exists a Lie derivation φe of G such that
φe(m)=φe([e,m])=[φ(e),m]+[e,φe(m)]=φ(e)m−mφ(e)+eφe(m)f−fφe(m)e. |
Multiplying the above equality from the left by e and from the right by f, we arrive at
eφ(e)m=mφ(e)f. |
Similarly, for any n∈N, we have from φe(n)=φe([n,e])=[φe(n),e]+[n,φ(e)] that
fφ(e)n=nφ(e)e. |
Hence
eφ(e)e+fφ(e)f∈Z(G). |
In the sequel, we define ϕ:G→G by ϕ(x)=φ(x)−[x,eφ(e)f−fφ(e)e]. One can verify that ϕ is also a local Lie derivation. Moreover, by Lemma 2.3, we have ϕ(e)=eφ(e)e+fφ(e)f∈Z(G).
Lemma 2.4. ϕ(M)⊆M and ϕ(N)⊆N.
Proof. Let a∈A, m∈M and p1 be any idempotent in A. Taking p=p1, x=a and q=e+m in Lemma 2.2, it follows from the fact that p⊥xq⊥ and pxq⊥ can be written as commutators that τ3(p⊥xq⊥)=τ4(pxq⊥)=0, hence
ϕ(p1a+p1am)=ϕ(p1a)(e+m)+p1ϕ(a+am)−p1ϕ(a)(e+m)+(1−p1)τ1(p1a+p1am)(f−m)−p1τ2(a+am−p1a−p1am)(f−m)=ϕ(p1a)e+ϕ(p1a)m+p1ϕ(a)f+p1ϕ(am)−p1ϕ(a)m+τ1(p1a)f−τ1(p1a)m+p1τ1(p1a)m+p1τ2(a−p1a)m. | (2.6) |
Multiplying (2.6) from the right by e, we arrive at
ϕ(p1am)e=p1ϕ(am)e. |
In particular,
ϕ(p1m)e=p1ϕ(m)e. |
Combining the above two equations, we obtain
ϕ(p1p2⋯pnm)e=p1ϕ(p2⋯pnm)e=p1p2⋯pn−1ϕ(pnm)e=p1p2⋯pnϕ(m)e |
for any idempotents p1,…,pn∈A. It follows from A=J(A) that
ϕ(am)e=aϕ(m)e | (2.7) |
for all a∈A,m∈M. This implies that fϕ(M)e=0.
The hypothesis (2), (3) and Proposition 1.1 imply that there exist a derivation d:G→G and a linear map τ:G→Z(G) vanishing on each commutator such that
ϕ(e+m)=d(e+m)+τ(e+m)=d(e+m)+τ(e). | (2.8) |
It follows from (2.5), (2.8) and the fact fϕ(M)e=0 that
0=fϕ(e+m)e=fd(e)e |
and hence by (2.5) and (2.8) again,
eϕ(e)e+eϕ(m)e=ed(m)e+eτ(e)e=ed(mf)e+eτ(e)e=md(f)e+eτ(e)e=−md(e)e+eτ(e)e=eτ(e)e |
and
fϕ(e)f+fϕ(m)f=fd(m)f+fτ(e)f=fd(e)m+fτ(e)f=fτ(e)f. |
Then we have from the fact ϕ(e)=eϕ(e)e+fϕ(e)f∈Z(G) that
eϕ(m)e+fϕ(m)f=τ(e)−ϕ(e)∈Z(G). | (2.9) |
We assume without loss of generality that A does not contain nonzero central ideals. It follows from (2.7) and (2.9) that eϕ(m)e lies in a central ideal of A. Thus eϕ(M)e=0. So, by (2.9), we get fϕ(M)f=0. Hence, ϕ(M)⊆M.
With the same argument, we can obtain that ϕ(N)⊆N.
Lemma 2.5. There exist a linear map h1 from A into Z(G) such that ϕ(a)−h1(a)∈A for all a∈A and a linear map h2 from B into Z(G) such that ϕ(b)−h2(b)∈B for all b∈B.
Proof. Taking m=0 in (2.6), we have
eϕ(p1a)f=p1ϕ(a)f and fϕ(p1a)f=τp1a(p1a)f∈πB(Z(G)). | (2.10) |
In particular,
eϕ(p1)f=p1ϕ(e)f=0. |
By the two equations above, we obtain
eϕ(p1p2⋯pn)f=p1ϕ(p2⋯pn)f=p1p2⋯pn−1ϕ(pn)f=0 |
for all idempotents pi in A. It follows from A=J(A) that eϕ(a)f=0. Similarly, by taking p=e, x=a and q=p1 in Lemma 2.2, we get
fϕ(ap1)e=fϕ(a)p1. |
This implies that fϕ(a)e=0. So ϕ(a)∈A⊕B.
By the hypothesis (2) of Theorem 2.1, there exists an algebra isomorphism η:Z(B)→Z(A) such that η(b)⊕b∈Z(G) for any b∈Z(B).
It follows from (2.10) that fϕ(a)f∈πB(Z(G))=Z(B). We define h1:A→Z(G) by h1(a)=η(fϕ(a)f)⊕fϕ(a)f. It is clear that h1 is linear and
ϕ(a)−h1(a)=eϕ(a)e+fϕ(a)f−η(fϕ(a)f)−fϕ(a)f=eϕ(a)e−η(fϕ(a)f)∈A. |
By a similar argument, we can define a linear map h2:B→Z(G) such that ϕ(b)−h2(b)∈B for all b∈B.
Now for any x∈G, we define two linear maps h:G→Z(G) and δ:G→G by
h(x)=h1(exe)+h2(fxf) and δ(x)=ϕ(x)−h(x). |
It is easy to verify that δ(e)=0. Moreover, we have
δ(A)⊆A, δ(B)⊆B, δ(M)=ϕ(M)⊆M, δ(N)=ϕ(N)⊆N. |
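The equality δ(e)=0 can be seen as follows: since ϕ(e)=eϕ(e)e+fϕ(e)f∈Z(G), the defining property of η together with the faithfulness of M gives η(fϕ(e)f)=eϕ(e)e, and therefore

δ(e)=ϕ(e)−h(e)=ϕ(e)−h1(e)=eϕ(e)e+fϕ(e)f−η(fϕ(e)f)−fϕ(e)f=0.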
Lemma 2.6. δ is a derivation.
Proof. We divide the proof into the following three steps.
Step 1. We first prove that
δ(p1p2…pnm)=δ(p1p2…pn)m+p1p2…pnδ(m) | (2.11) |
for all idempotents pi in A and m∈M.
Let a∈A, m∈M and p1 be any idempotent in A. Taking p=p1, x=a and q=e+m in (2.2), we have
ϕ(a+am−p1a−p1am)=d2(a+am−p1a−p1am)+τ2(a+am−p1a−p1am)=d2(a+am−p1a−p1am)+τ2(a−p1a). | (2.12) |
It follows from (2.5) and (2.12) that
0=fd2(a−p1a)e=fd2(e(a−p1a))e=fd2(e)(a−p1a) |
and hence by (2.5) and (2.12) again,
fϕ(a−p1a)f=fd2(am−p1am)f+fτ2(a−p1a)f=fd2(e)(a−p1a)m+fτ2(a−p1a)f=fτ2(a−p1a)f. | (2.13) |
Multiplying (2.6) by f from both sides, we arrive at
fϕ(p1a)f=fτ1(p1a)f. | (2.14) |
By (2.13) and (2.14), we have mτ1(p1a)=mϕ(p1a) and
p1mτ2(a−p1a)=p1mϕ(a−p1a)=p1mϕ(a)−p1mϕ(p1a)=p1mϕ(a)−p1mτ1(p1a). |
Hence (2.6) implies that
δ(p1am)=ϕ(p1am)=ϕ(p1a)m+p1ϕ(am)−p1ϕ(a)m−mϕ(p1a)+p1mϕ(a)=(δ(p1a)+h(p1a))m+p1δ(am)−p1(δ(a)+h(a))m−m(δ(p1a)+h(p1a))+p1m(δ(a)+h(a))=δ(p1a)m+p1δ(am)−p1δ(a)m. | (2.15) |
Taking a=e in (2.15), we have from δ(e)=0 that
δ(p1m)=δ(p1)m+p1δ(m). |
This shows that (2.11) is true for n=1. One can verify that Eq (2.11) follows easily by induction based on (2.15). It follows from A=J(A) that δ(am)=δ(a)m+aδ(m).
Similarly, we can get δ(mb)=δ(m)b+mδ(b), δ(bn)=δ(b)n+bδ(n) and δ(na)=δ(n)a+nδ(a).
Step 2. Let a,a′∈A. For any m∈M, on one hand, by Step 1, we have
δ(aa′m)=δ(a)a′m+aδ(a′m)=δ(a)a′m+aδ(a′)m+aa′δ(m). |
On the other hand,
δ(aa′m)=δ(aa′)m+aa′δ(m). |
Comparing these two equalities, we have
(δ(aa′)−δ(a)a′−aδ(a′))m=0 |
for any m∈M. Since M is a faithful left A-module, we get
δ(aa′)=δ(a)a′+aδ(a′). |
Similarly, by considering δ(mbb′), we can get
δ(bb′)=δ(b)b′+bδ(b′). |
Step 3. Let m,m′∈M and n∈N. Taking p=e−m′, x=n+m′n and q=e−m′ in Lemma 2.2, we have from pxq=pxq⊥=0 that
0=(e−m′)ϕ(m′n−m′nm′+n−nm′)−(e−m′)ϕ(m′n+n)(e−m′)−(e−m′)τ2(m′n−nm′)(f+m′)+(e−m′)τ3(nm′)(e−m′)=−ϕ(m′nm′)−eϕ(nm′)−m′ϕ(m′n)+m′ϕ(nm′)+ϕ(m′n)m′−m′ϕ(n)m′+eτ3(nm′)e−τ3(nm′)m′. | (2.16) |
This implies that
eϕ(nm′)=eτ3(nm′)e. |
Then eϕ(nm′)m′=τ3(nm′)m′ and hence by (2.16),
δ(m′nm′)=ϕ(m′nm′)=−m′ϕ(m′n)+m′ϕ(nm′)+ϕ(m′n)m′−m′ϕ(n)m′−ϕ(nm′)m′=−m′h(m′n)+m′δ(nm′)+m′h(nm′)+δ(m′n)m′+h(m′n)m′−m′δ(n)m′−h(nm′)m′=m′δ(nm′)+δ(m′n)m′−m′δ(n)m′. |
Replacing m′ with m+m′, we arrive at
δ(m′nm+mnm′)=δ(m′n)m+m′δ(nm)−m′δ(n)m+δ(mn)m′+mδ(nm′)−mδ(n)m′. |
On the other hand, by Steps 1 and 2, we have
δ(m′nm+mnm′)=δ(m′n)m+m′nδ(m)+δ(m)nm′+mδ(nm′). |
Comparing these two equalities, we have
(δ(mn)−δ(m)n−mδ(n))m′=−m′(δ(nm)−nδ(m)−δ(n)m). | (2.17) |
Set
f(m,n):=δ(mn)−δ(m)n−mδ(n) |
and
g(m,n):=δ(nm)−nδ(m)−δ(n)m. |
We assume without loss of generality that A does not contain nonzero central ideals. For any a∈A, by (2.17),
f(m,n)am′=−am′g(m,n)=af(m,n)m′. |
This is equivalent to (f(m,n)a−af(m,n))m′=0. Since M is a faithful left A-module, we get f(m,n)a=af(m,n). Then
f(m,n)∈Z(A). |
By Steps 1 and 2, we have
f(am,n)=δ(amn)−δ(am)n−amδ(n)=δ(a)mn+aδ(mn)−δ(a)mn−aδ(m)n−amδ(n)=af(m,n). |
The above two equalities show that f(m,n) lies in a central ideal of A, and hence
f(m,n)=0, | (2.18) |
that is
δ(mn)=δ(m)n+mδ(n) |
for all m∈M,n∈N. Since M is a faithful right B-module, it follows from (2.17) that
δ(nm)=nδ(m)+δ(n)m |
for all m∈M,n∈N.
Lemma 2.7. The map h:G→Z(G) vanishes on each commutator.
Proof. Step 1. Let a∈A, m∈M, n∈N and b∈B, by the definition of h, we have h([a,m])=h([m,b])=h([n,a])=h([b,n])=0.
Step 2. Let a,a′∈A. We have ϕ([a,a′])=eϕ([a,a′])e+fϕ([a,a′])f∈A⊕B. On the other hand, Proposition 1.1 implies that ϕ([a,a′])=d([a,a′])∈A⊕M⊕N, where d is a derivation. Thus, fϕ([a,a′])f=0. This implies that h([a,a′])=h1([a,a′])=η(fϕ([a,a′])f)+fϕ([a,a′])f=0.
Similarly, we can get h([b,b′])=0, for all b,b′∈B.
Step 3. It follows from (2.18) that
(ϕ(mn)−η(fϕ(mn)f)−ϕ(m)n−mϕ(n))m′=−m′(ϕ(nm)−η^{-1}(eϕ(nm)e)−nϕ(m)−ϕ(n)m). | (2.19)
Since fϕ(a)f∈πB(Z(G)), eϕ(b)e∈πA(Z(G)), we get that
m′fϕ(mn)f=η(fϕ(mn)f)m′, eϕ(nm)em′=m′η^{-1}(eϕ(nm)e).
It further follows from (2.19) that
ϕ(mn)m′−m′fϕ(mn)f−ϕ(m)nm′−mϕ(n)m′=−m′ϕ(nm)+eϕ(nm)m′+m′nϕ(m)+m′ϕ(n)m. |
Hence
(ϕ(mn)−eϕ(nm)−ϕ(m)n−mϕ(n))m′=m′(−ϕ(nm)+fϕ(mn)f+nϕ(m)+ϕ(n)m). |
Using an argument similar to that in the proof of (2.18), we arrive at
eϕ(mn)e−eϕ(nm)−ϕ(m)n−mϕ(n)=0, | (2.20) |
and
−fϕ(nm)f+fϕ(mn)f+nϕ(m)+ϕ(n)m=0. |
By (2.19) and (2.20), we get that eϕ(nm)e=η(fϕ(mn)f). Note that h([m,n])=h1(mn)−h2(nm)=η(fϕ(mn)f)+fϕ(mn)f−eϕ(nm)e−η^{-1}(eϕ(nm)e), thus h([m,n])=0.
Therefore, it is easy to verify that h vanishes on each commutator.
Proof of Theorem 2.1. By the definition of δ, we have φ(x)=δ(x)+[x,eφ(e)f−fφ(e)e]+h(x) for all x∈G. Since x↦[x,eφ(e)f−fφ(e)e] is an inner derivation, the map x↦δ(x)+[x,eφ(e)f−fφ(e)e] is a derivation of G, and h is a linear map from G into its center vanishing on each commutator. The proof is complete.
Let A be a unital algebra and Mk×m(A) be the set of all k×m matrices over A. For n≥4 and each 2≤l<n−1, the full matrix algebra Mn(A) can be represented as a generalized matrix algebra of the form
\begin{pmatrix} M_{l\times l}(A) & M_{l\times(n-l)}(A) \\ M_{(n-l)\times l}(A) & M_{(n-l)\times(n-l)}(A) \end{pmatrix}.
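For instance, with n=4 and l=2 this is the familiar partition of a 4×4 matrix into four 2×2 blocks:

M_4(A)=\begin{pmatrix} M_{2\times 2}(A) & M_{2\times 2}(A) \\ M_{2\times 2}(A) & M_{2\times 2}(A) \end{pmatrix}.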
Corollary 2.8. Let Mn(A) be a full matrix algebra with n≥4. Then each local Lie derivation φ on Mn(A) is of the form φ=d+τ, where d is a derivation of Mn(A) and τ is a linear map from Mn(A) into its center Z(A)⋅In vanishing on each commutator.
Proof. It follows from the example (C) of [2] that the matrix algebras Ml(A) and Mn−l(A) are generated by their idempotents for 2≤l<n−1. Since Z(Mn(A))=Z(A)⋅In, Z(Ml(A))=Z(A)⋅Il and Z(Mn−l(A))=Z(A)⋅In−l, the condition (2) of Theorem 2.1 is satisfied. By [5,Lemma 1], Mk(A) does not contain nonzero central ideals for k≥2. Hence by Theorem 2.1, every local Lie derivation of Mn(A) is a sum of a derivation and a linear central-valued map vanishing on each commutator.
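As an illustration of the first claim (a computation along the lines of example (C) in [2]), let e_{ij} denote the matrix units: for i≠j and a∈A, the element e_{ii}+ae_{ij} is an idempotent, and

ae_{ij}=(e_{ii}+ae_{ij})e_{jj}, \qquad ae_{ii}=(e_{ii}+ae_{ij})\,e_{jj}\,(e_{jj}+e_{ji})\,e_{ii},

so every element of Mk(A) with k≥2 is a linear combination of products of idempotents.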
Corollary 2.9. Let R be a unital simple algebra with a nontrivial idempotent. If φ:R→R is a local Lie derivation, then there exist a derivation d and a linear central-valued map τ vanishing on each commutator such that φ=d+τ.
Proof. Let R be a unital simple algebra with a nontrivial idempotent e0 and let f0 denote the idempotent 1−e0. Then R can be represented in the so-called Peirce decomposition form
R=e0Re0+e0Rf0+f0Re0+f0Rf0, |
where e0Re0 and f0Rf0 are subalgebras with unit elements e0 and f0, respectively, and e0Rf0 is an (e0Re0,f0Rf0)-bimodule.
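In generalized matrix notation, this decomposition identifies R with

\begin{pmatrix} e_0Re_0 & e_0Rf_0 \\ f_0Re_0 & f_0Rf_0 \end{pmatrix},\qquad x\mapsto\begin{pmatrix} e_0xe_0 & e_0xf_0 \\ f_0xe_0 & f_0xf_0 \end{pmatrix},

so that A=e0Re0, B=f0Rf0, M=e0Rf0 and N=f0Re0 in the notation of Theorem 2.1.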
Next, we will show that
e0xe0⋅e0Rf0={0} implies e0xe0=0 |
and
e0Rf0⋅f0xf0={0} implies f0xf0=0. |
That is, e0Rf0 is faithful as an (e0Re0,f0Rf0)-bimodule. Let e=f0+e0Rf0; then e2=e and [e,R]⊆eR(1−e)+(1−e)Re. Note that
(1−e)Re=(e0−e0Rf0)R(f0+e0Rf0)⊆e0Rf0. |
Furthermore, the assumption e0xe0⋅e0Rf0={0} implies
e0xe0eR(1−e)=e0xe0(f0+e0Rf0)R(e0+e0Rf0)={0} |
and then
e0xe0[e,R]={0}. |
Let r=[e,y] and z,w∈R. It follows from
zrw=[e,z[e,r]w]−[e,z][e,rw]−[e,zr][e,w]+2[e,z]r[e,w] |
that e0xe0zrw=0. Then
e0xe0R[e,R]R=0. | (2.21) |
It is clear that I=R[e,R]R is a nonzero ideal of R. R is a simple algebra, which implies I=R. By (2.21), e0xe0R=0. Since 1∈R, we get e0xe0=0. Similarly, we can show that e0Rf0⋅f0xf0={0} implies f0xf0=0. Now, we can conclude that R can be represented as a generalized matrix algebra of the form R=e0Re0+e0Rf0+f0Re0+f0Rf0.
It follows from the example (A) of [2] that a unital simple algebra with a nontrivial idempotent is generated by its idempotents, so condition (1) of Theorem 2.1 is satisfied. It is clear that e0Re0 and f0Rf0 satisfy the conditions (2) and (3) of Theorem 2.1. Hence, by Theorem 2.1, every local Lie derivation of R is the sum of a derivation and a linear central-valued map vanishing on each commutator.
Let B(H) be the set of bounded linear operators acting on a complex Hilbert space H, and let K(H) be the ideal of compact operators on H. If H is an infinite-dimensional separable Hilbert space, by [12,Theorem 4.1.16], the Calkin algebra B(H)/K(H) is a simple C∗-algebra.
Corollary 2.10. If H is an infinite-dimensional separable Hilbert space, then every local Lie derivation of the Calkin algebra B(H)/K(H) is the sum of a derivation and a linear central map vanishing on each commutator.
In this paper, we investigate local Lie derivations of a certain class of generalized matrix algebras and show that, under certain conditions, every local Lie derivation of a generalized matrix algebra is a sum of a derivation and a linear central-valued map vanishing on each commutator. The main result is then applied to full matrix algebras and unital simple algebras with nontrivial idempotents.
This research was supported by the National Natural Science Foundation of China (No. 11901248). Moreover, the authors express their sincere gratitude to the referee for reading this paper very carefully and specially for valuable suggestions concerning improvement of the manuscript.
All authors declare no conflicts of interest in this paper.