
MINISTRY OF EDUCATION AND TRAINING
HANOI NATIONAL UNIVERSITY OF EDUCATION

LE THI HONG DUNG

STABILITY OF POSITIVE SOLUTIONS

OF NONLINEAR DIFFERENTIAL EQUATIONS WITH DELAYS IN NEURAL NETWORKS

Speciality: Differential and Integral Equations
Code: 9460103

SUMMARY OF DOCTORAL THESIS IN MATHEMATICS

HA NOI-2024


This dissertation has been written on the basis of my research work carried out at: Hanoi National University of Education

Supervisor: Assoc Prof Le Van Hien

Hanoi National University of Education

Referee 1: Associate Professor Tran Dinh Ke

Hanoi National University of Education

Referee 2: Associate Professor Do Lan

Water Resources University

Referee 3: Associate Professor Duong Anh Tuan

Hanoi University of Science and Technology

The thesis will be presented to the examining committee at Hanoi National University of Education, 136 Xuan Thuy Road, Hanoi, Vietnam.

At the time of ,2024

This dissertation is publicly available at:
- HNUE Library Information Centre
- The National Library of Vietnam


A Motivation

Stability theory is one of the top priority research topics in the qualitative theory of differential equations and, more generally, in systems and control theory. Dating back to the pioneering work of Lyapunov, stability theory has been extensively developed. Its intrinsic interest and relevance have been found in a variety of disciplines such as mechanics, physics, chemistry, ecology, and artificial intelligence.

Appearing naturally in practice, many models in population dynamics, economic growth, or labor migration are described by dynamic systems whose states are always nonnegative when the initial states and inputs are nonnegative. Such systems are called positive systems. Applications of positive systems can be found in various disciplines, from physics, chemistry, ecology and epidemiology to economics, control engineering and telecommunication networks. Research on positive systems shows that, besides a wide range of applications, positive systems also possess many properties that are not found in general systems. For example, based on the monotonicity and robustness induced by positivity, positive systems are employed in the design of interval observers, in state estimation, and in the stability analysis of nonlinear systems. Thus, due to these theoretical and practical features, the theory of positive systems has received ever-increasing interest in the past few years. While the theory of positive systems has been intensively studied for various kinds of linear systems, this area is still considerably less well-developed for nonlinear systems, in particular for models arising in artificial and/or biological neural networks. Typically, the dynamics of a network is represented by a system of nonlinear differential equations with or without delay. In the past two decades, the study of the qualitative behavior of nonlinear systems describing various types of neural networks has attracted significant research attention due to a wide range of applications.

The terminology of neural networks, which appeared in the late 1800s, was introduced by scientists studying the function of the human brain, with the desire to design computers that can work like the human brain, capable of learning from databases, remembering experiences, and using them in appropriate situations. Over a history of more than a century, with the advent of computers, the study of neural networks has undergone extensive development and has obtained many important results, widening the ability of computer technology to recognize and adapt. Using ideas from experimental results in studies of the human brain, many intelligent computers, which have components acting like neurons or sets of neurons and connections between those components acting like the synapses of neurons, have been invented.

There are many kinds of neural networks mentioned in the literature. Due to their specific structure and practical applicability, some popular models of neural networks, such as Hopfield neural networks (HNNs), Cohen-Grossberg neural networks (CGNNs), inertial neural networks (INNs), and bidirectional associative memory networks (BAMs), have been widely studied. However, very little attention has been devoted to the study of positive nonlinear systems in neural networks. For neural systems, the nonlinearity of neuron activation functions makes the study of positive neural networks more complicated and challenging, which requires in-depth knowledge and specific techniques.

Although many results concerning systems and control theory for positive systems have been published in the past few decades, the field of qualitative research on the long-term behavior and stability of nonlinear neuronal systems with delays is still of great interest to mathematicians and engineers. Besides, many open issues related to the stability of positive solutions in ecological models with multiple distinct delays are still being developed, especially for models with a structure that is more general and closer to reality. Developing qualitative research on this type of system is generally more difficult and complicated due to technical limitations and the capability of existing approaches. This motivates the research on positive solutions and stability of nonlinear differential equations with delays.

B Research aims

The thesis focuses on stability problems for several classes of nonlinear positive differential systems with delays in neural network models. Specifically, the thesis studies the following issues.

• Positive solutions and exponential stability of nonlinear time-delay systems in the model of BAM-Cohen-Grossberg neural networks.

• Exponential stability of positive conformable BAM neural networks with time-varying delays.

• Exponential attractivity of positive inertial neural networks in the bidirectional associative memory model with time-varying delays.

The models considered are systems describing BAM-Hopfield networks with bounded delays. Under assumptions on the connection weights and neuron activation functions, we prove that the systems are positive and aim to establish conditions in LP form ensuring the exponential stability of the system. There are similarities in the analysis schemes for the systems considered, such as comparison techniques via inequalities; however, due to their specific structures, separate research methods and proof techniques are needed for each individual system.

C Methodology

• Comparison techniques via differential and integral inequalities.

• M-matrix theory approach.


• The use of fixed point theorems and basic tools in nonlinear functional analysis.

D Research topics

This thesis is concerned with some issues of positive solutions and the stability of positive systems with delays. Specifically, the following topics will be researched and presented in this thesis.

D1 Stability of nonlinear time-delay systems in BAM-Cohen-Grossberg neural networks

Consider a class of nonlinear systems describing BAM-Cohen-Grossberg neural networks with time-varying delays and nonlinear self-excitation rates of the form
$$\begin{aligned}
x_i'(t) &= \alpha_i(x_i(t))\Big[-\delta_i\varphi_i(x_i(t)) + \sum_{j=1}^{m} a_{ij}f_j(y_j(t)) + \sum_{j=1}^{m} b_{ij}f_j(y_j(t-\sigma_j(t))) + I_i\Big],\\
y_j'(t) &= \beta_j(y_j(t))\Big[-\rho_j\psi_j(y_j(t)) + \sum_{i=1}^{n} c_{ji}g_i(x_i(t)) + \sum_{i=1}^{n} d_{ji}g_i(x_i(t-\tau_i(t))) + J_j\Big],
\end{aligned}\qquad(1)$$
for t ≥ t0, i ∈ [n], j ∈ [m], where 0 ≤ τi(t) ≤ τ, 0 ≤ σj(t) ≤ σ are bounded communication delays and Ii, Jj are external inputs to the ith neuron and jth neuron, respectively. Initial conditions associated with system (1) are specified as follows
$$x(t_0+\xi) = x_0(\xi),\ \xi\in[-\tau,0],\qquad y(t_0+\theta) = y_0(\theta),\ \theta\in[-\sigma,0],\qquad(3)$$
where x0 ∈ C([−τ, 0],Rn) and y0 ∈ C([−σ, 0],Rm) are initial functions.

The objective is to study the existence of global positive solutions and the existence, uniqueness and exponential stability of a positive equilibrium point. Based on novel comparison techniques via differential inequalities, unified conditions for the existence and exponential stability of a unique positive equilibrium of model (1) are derived in terms of tractable LP-based conditions.

D2 Stability of positive conformable BAM neural networks with delays

In Chapter 3, we consider a class of differential equations with delays described by the conformable fractional derivative (CFD). This type of differential equation can be used to describe the dynamics of various practical models, including biological and artificial neural networks with heterogeneous time-varying delays. Consider the following system

$${}^{C}\!D^{\alpha}_{t_0}\begin{pmatrix} x(t)\\ y(t)\end{pmatrix}
= -D_{\beta,\gamma}\begin{pmatrix} x(t)\\ y(t)\end{pmatrix}
+ \begin{pmatrix} A f(y(t))\\ C g(x(t))\end{pmatrix}
+ \begin{pmatrix} B f(y_\sigma(t))\\ D g(x_\tau(t))\end{pmatrix}
+ \begin{pmatrix} I\\ J\end{pmatrix}.\qquad(4)$$

For a given t0 ≥ 0, where σ and τ are known positive constants, the initial condition of system (4) is specified as
$$x_{t_0} = x_0 \in C([-\tau,0],\mathbb{R}^n),\qquad y_{t_0} = y_0 \in C([-\sigma,0],\mathbb{R}^m),\qquad(5)$$
that is,
$$x_{t_0}(s) = x(t_0+s) = x_0(s),\ s\in[-\tau,0],\qquad y_{t_0}(\theta) = y(t_0+\theta) = y_0(\theta),\ \theta\in[-\sigma,0].$$

By novel comparison techniques via fractional differential and integral inequalities, unified conditions in terms of tractable LP-based conditions for the existence and exponential stability of a unique positive equilibrium of the CFD model (4) are derived.
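For reference, a commonly used definition of the conformable fractional derivative of order α ∈ (0, 1] with base point t0 (assumed here to be the convention used in (4), since several equivalent conventions exist in the literature) is the following.

```latex
% Conformable fractional derivative with base point t_0 (assumed convention):
{}^{C}\!D^{\alpha}_{t_0}f(t)
  = \lim_{\varepsilon \to 0}
    \frac{f\big(t + \varepsilon (t - t_0)^{1-\alpha}\big) - f(t)}{\varepsilon},
  \qquad t > t_0,\ \alpha \in (0,1],
% which, for differentiable f, reduces to
{}^{C}\!D^{\alpha}_{t_0}f(t) = (t - t_0)^{1-\alpha}\, f'(t).
```

With this convention, the conformable exponential e^{λ(t−t0)^α/α} plays the role that e^{λ(t−t0)} plays in the classical (α = 1) case.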

D3 Stability of positive inertial BAM neural network model

In Chapter 4, we consider a model of inertial BAM neural networks with delays described by the following second-order differential equations
$$\begin{aligned}
x_i''(t) &= -a_i x_i'(t) - c_i x_i(t) + \sum_{j=1}^{m} r_{ij}f_j(y_j(t)) + \sum_{j=1}^{m} s_{ij}f_j(y_j(t-\sigma_j(t))) + I_i,\\
y_j''(t) &= -b_j y_j'(t) - d_j y_j(t) + \sum_{i=1}^{n} p_{ji}g_i(x_i(t)) + \sum_{i=1}^{n} q_{ji}g_i(x_i(t-\tau_i(t))) + J_j,
\end{aligned}\qquad(6)$$
for t ≥ t0, i ∈ [n], j ∈ [m], where the delays satisfy 0 ≤ τi(t) ≤ τ and 0 ≤ σj(t) ≤ σ for all t ≥ t0, and τ, σ are known positive constants.


The initial condition associated with system (6) is defined by
$$\begin{aligned}
x(t_0+\theta) &= \varphi(\theta), & x'(t_0+\theta) &= \varphi_d(\theta), & \theta &\in [-\tau,0],\\
y(t_0+\theta) &= \psi(\theta), & y'(t_0+\theta) &= \psi_d(\theta), & \theta &\in [-\sigma,0],
\end{aligned}$$
where ϕ, ϕd ∈ C([−τ, 0],Rn) and ψ, ψd ∈ C([−σ, 0],Rm) are compatible initial functions. Based on novel comparison techniques developed from the theory of monotone dynamical systems, we derive tractable conditions, in terms of M-matrices involving the self-excitation coefficients and connection weights, ensuring the positivity of solutions and the existence of a unique EP corresponding to an input vector of system (6). The derived conditions are then utilized to show that the unique EP is positive and globally attractive.

E Main contributions

This dissertation is concerned with the positivity of solutions and the exponential stability of positive equilibria of nonlinear differential equations with delays in various types of neural networks. The main contributions are as follows.

1. Proposed new LP-based conditions for the positivity of solutions and exponential stability of the positive equilibrium of BAM-Cohen-Grossberg neural networks with time-varying delays and nonlinear self-excitation rates.

2. Proved the positivity and derived tractable conditions for the global exponential stability of a unique positive equilibrium of conformable BAM neural networks with communication delays.

3. Established LP-based conditions ensuring the positivity of solutions and global exponential stability of a unique positive equilibrium of inertial BAM neural networks with bounded delays.

F Thesis outline

Except for the Introduction, Conclusion, List of Publications, and List of References, the remainder of the thesis is divided into four chapters. Chapter 1 presents some preliminary results. Chapter 2 deals with the exponential stability of a positive equilibrium point of the BAM-Cohen-Grossberg system with time-varying delays. Chapter 3 investigates the exponential stability of nonlinear delay differential equations described by the conformable fractional derivative. Chapter 4 is concerned with the global exponential stability of inertial neural networks (INNs) described by a Hopfield-type BAM model with time-varying delays.


Chapter 1

In this chapter, we recall some auxiliary results on matrix analysis and stability theory, which will be useful for the presentation of the results in the next chapters.

1.1 Nonnegative matrix and M-matrix

1.2 System of delayed differential equations and Lyapunov stability
1.3 Positive system and stability of nonlinear positive system

1.3.1 Linear positive system

1.3.2 A result on the exponential stability of positive Hopfield neural networks
1.4 Conformable fractional derivative

1.5 Other auxiliary results


Chapter 2

2.1 Model description and Preliminaries

Consider system (2.1), where n, m represent the numbers of neurons in the X-layer and Y-layer, respectively, and i ∈ [n], j ∈ [m]; xi(t) and yj(t) represent the state variables of the ith cell in field FX and the jth cell in field FY; αi(xi) and βj(yj) are neural amplification functions; ϕi(xi), ψj(yj) are nonlinear decay rate functions; and δi > 0, ρj > 0 are self-inhibition coefficients. For linear decay rate functions (that is, ϕi(xi) = xi and ψj(yj) = yj), δi and ρj are the rates at which the ith and jth neurons reset their potential to the resting state in isolation, when disconnected from the network and external inputs. In system (2.1), fj, gi are neuron activation functions and aij, bij, cji, dji are connection weights, which represent the strengths of connectivity between the jth neuron in FY and the ith neuron in FX. Ii and Jj are external inputs to the ith neuron and jth neuron, respectively. The functions τi(t) and σj(t) denote communication delays between neurons, which satisfy

0 ≤ τi(t) ≤ τ, 0 ≤ σj(t) ≤ σ, (2.2)

where τ and σ are known positive constants. Initial conditions associated with system (2.1) are specified as follows

x(t0 + ξ) = x0(ξ), ξ ∈ [−τ, 0], y(t0 + θ) = y0(θ), θ ∈ [−σ, 0], (2.3)

where x0 ∈ C([−τ, 0],Rn) and y0 ∈ C([−σ, 0],Rm) are initial functions. For convenience, we denote the matrices

α(x) = diag{α1(x1), . . . , αn(xn)}, β(y) = diag{β1(y1), . . . , βm(ym)},
Dδ = diag{δ1, . . . , δn}, Dρ = diag{ρ1, . . . , ρm},

and A = (aij), B = (bij) ∈ Rn×m, C = (cji), D = (dji) ∈ Rm×n. In addition, we also use the vectors I = (Ii) ∈ Rn, J = (Jj) ∈ Rm and the vector-valued functions

Φ(x(t)) = col(ϕi(xi(t))), Ψ(y(t)) = col(ψj(yj(t))),
f(y(t)) = col(fj(yj(t))), f(y(t − σ(t))) = col(fj(yj(t − σj(t)))),
g(x(t)) = col(gi(xi(t))), g(x(t − τ(t))) = col(gi(xi(t − τi(t)))).

Then, system (2.1) can be written in the following form
$$\begin{aligned}
x'(t) &= \alpha(x(t))\big[-D_\delta\Phi(x(t)) + Af(y(t)) + Bf(y(t-\sigma(t))) + I\big],\\
y'(t) &= \beta(y(t))\big[-D_\rho\Psi(y(t)) + Cg(x(t)) + Dg(x(t-\tau(t))) + J\big].
\end{aligned}\qquad(2.4)$$

2.1.1 The existence and uniqueness of a solution

Let D denote the set of continuous functions ϕ : R → R satisfying ϕ(0) = 0 for which there exist positive scalars cϕ, ĉϕ such that
$$c_\varphi \le \frac{\varphi(u)-\varphi(v)}{u-v} \le \hat{c}_\varphi\qquad\text{for all } u, v \in \mathbb{R},\ u \ne v.$$
We consider the following assumptions.

(B1) The amplification functions αi(.), βj(.) are continuous and there exist positive constants such that $\underline{\alpha}_i \le \alpha_i(u) \le \bar{\alpha}_i$ and $\underline{\beta}_j \le \beta_j(u) \le \bar{\beta}_j$ for all u ∈ R.

(B2) ϕi(.) and ψj(.) belong to the function class D.

(B3) fj(.), gi(.) are continuous, fj(0) = 0, gi(0) = 0, and there exist positive constants Lfj, Lgi such that
$$0 \le \frac{f_j(u)-f_j(v)}{u-v} \le L^f_j,\qquad 0 \le \frac{g_i(u)-g_i(v)}{u-v} \le L^g_i,$$
for all u, v ∈ R, u ≠ v.

Theorem 2.1.1. With the assumptions (B1)-(B3), for any initial condition defined by x0 ∈ C([−τ, 0],Rn) and y0 ∈ C([−σ, 0],Rm), system (2.1) possesses a solution χ(t) = col(x(t), y(t)) on [t0, +∞), which is absolutely continuous in t.

2.1.2 Positive solution and equilibrium point

Let χ(t) = col(x(t), y(t)) be a solution of system (2.4). If the trajectory of χ(t) is confined within the first orthant, that is, χ(t) ∈ R^{n+m}_+ for all t ≥ t0, then χ(t) is said to be a positive solution of (2.4). We define the following admissible set of initial conditions for system (2.4):
$$\mathcal{A} = \big\{\phi = \mathrm{col}(x_0, y_0) \in C([-\tau,0],\mathbb{R}^n)\times C([-\sigma,0],\mathbb{R}^m) : x_0(\xi) \succeq 0,\ y_0(\theta) \succeq 0,\ \xi\in[-\tau,0],\ \theta\in[-\sigma,0]\big\}.$$


Definition 2.1.1. System (2.4) is said to be positive if, for any initial function φ ∈ A and nonnegative input vector col(I, J) ∈ R^{n+m}_+, the corresponding solution χ(t) = col(x(t), y(t)) of (2.4) is positive.

Definition 2.1.2. For given input vectors I ∈ Rn and J ∈ Rm, a vector χ∗ = col(x∗, y∗) ∈ Rn+m, where x∗ ∈ Rn and y∗ ∈ Rm, is said to be an equilibrium point (EP) of system (2.4) if it satisfies the following algebraic system
$$-D_\delta\Phi(x^*) + (A+B)f(y^*) + I = 0,\qquad -D_\rho\Psi(y^*) + (C+D)g(x^*) + J = 0.$$

Moreover, χ∗ is a positive equilibrium point if it is an equilibrium point and χ∗ ≻ 0.

Definition 2.1.3. A positive EP χ∗ = col(x∗, y∗) of system (2.4) is said to be globally exponentially stable (GES) if there exist positive scalars κ and λ such that any solution χ(t) = col(x(t), y(t)) of (2.4) with initial condition (2.3) satisfies the following inequality
$$\|\chi(t)-\chi^*\|_\infty \le \kappa\,\|\phi-\chi^*\|_C\, e^{-\lambda(t-t_0)},\quad t \ge t_0.$$

2.2 Positive solutions

In this section, we will prove that, under assumptions (B1)-(B3), any solution of system (2.4) with nonnegative initial states is positive, provided that the weight coefficients are nonnegative.

Theorem 2.2.1. Let assumptions (B1)-(B3) hold and assume that the matrix
$$M = \begin{pmatrix} A & B\\ C^\top & D^\top \end{pmatrix}$$
is nonnegative (M ⪰ 0). Then, the BAM-CG neural network model described by system (2.4) is positive. Specifically, for any initial condition φ ∈ A and nonnegative input vector J = col(I, J) ∈ Rn+m, I ⪰ 0, J ⪰ 0, the corresponding solution satisfies χ(t) = col(x(t), y(t)) ⪰ 0 for all t ≥ t0.
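As an illustrative sketch of the positivity property in Theorem 2.2.1 (with hypothetical parameter values, identity amplification αi(·) = βj(·) ≡ 1, linear decay ϕi(u) = ψj(u) = u, and tanh activations, all chosen only for the example), a delayed system of the form (2.4) can be simulated with an explicit Euler scheme and a history buffer:

```python
import numpy as np

# Hypothetical 2x2 instance of (2.4) with nonnegative weights and inputs:
# x'(t) = -Dd*x(t) + A f(y(t)) + B f(y(t - sigma)) + I
# y'(t) = -Dr*y(t) + C g(x(t)) + D g(x(t - tau))  + J
Dd = np.array([1.5, 2.0])                       # delta_i
Dr = np.array([1.8, 1.2])                       # rho_j
A = np.array([[0.2, 0.1], [0.1, 0.3]]); B = np.array([[0.1, 0.2], [0.2, 0.1]])
C = np.array([[0.3, 0.1], [0.1, 0.2]]); D = np.array([[0.1, 0.1], [0.2, 0.1]])
I = np.array([0.5, 0.4]); J = np.array([0.3, 0.6])
tau, sigma, h, T = 0.5, 0.8, 1e-3, 20.0

n_tau, n_sig, steps = int(tau / h), int(sigma / h), int(T / h)
x = np.zeros((steps + 1, 2)); y = np.zeros((steps + 1, 2))
x[0] = [0.2, 0.1]; y[0] = [0.05, 0.3]           # nonnegative initial states
f = g = np.tanh

for k in range(steps):
    xd = x[max(k - n_tau, 0)]                   # x(t - tau); constant history before t0
    yd = y[max(k - n_sig, 0)]                   # y(t - sigma)
    x[k + 1] = x[k] + h * (-Dd * x[k] + A @ f(y[k]) + B @ f(yd) + I)
    y[k + 1] = y[k] + h * (-Dr * y[k] + C @ g(x[k]) + D @ g(xd) + J)

print("minimum state value over the run:", min(x.min(), y.min()))  # stays >= 0
print("final state (near the positive EP):", x[-1], y[-1])
```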

2.3 Positive equilibria

In this section, by utilizing the Brouwer fixed point theorem, we derive conditions under which model (2.4) possesses at least one positive EP for a given input vector J = col(I, J) ∈ R^{n+m}_+. First, it can be verified from (2.7) that a vector χ∗ = col(x∗, y∗) ∈ Rn+m is an EP of system (2.4) if and only if it satisfies the algebraic system
$$D_\delta^{-1}\big((A+B)f(y^*)+I\big) = \Phi(x^*),\qquad D_\rho^{-1}\big((C+D)g(x^*)+J\big) = \Psi(y^*).\qquad(2.8)$$


Motivated by system (2.8), we define a mapping H : Rn+m → Rn+m by
$$\mathscr{H}(\chi) = \begin{pmatrix}\Phi^{-1}\big(D_\delta^{-1}((A+B)f(y)+I)\big)\\[2pt] \Psi^{-1}\big(D_\rho^{-1}((C+D)g(x)+J)\big)\end{pmatrix},\qquad(2.9)$$
where χ = col(x, y), x ∈ Rn and y ∈ Rm. The mapping H defined by (2.9) can also be written componentwise, where ϕi⁻¹(.) and ψj⁻¹(.) denote the inverse functions of ϕi(.) and ψj(.), respectively.

With regard to equations (2.8) and (2.9), a vector χ∗ ∈ Rn+m is an EP of system (2.4) if and only if it is a fixed point of the mapping H, that is, H(χ∗) = χ∗. Based on the Brouwer fixed point theorem, we have the following result.

Theorem 2.3.1. Let assumptions (B1)-(B3) hold and assume that the matrix
$$\begin{pmatrix} A & B\\ C^\top & D^\top\end{pmatrix}$$
is nonnegative and that
$$\rho(\Lambda) < 1,\qquad \Lambda = \begin{pmatrix} 0_{n\times n} & \Lambda_1\\ \Lambda_2 & 0_{m\times m}\end{pmatrix},\qquad(2.10)$$
where the matrices Λ1 = (Λ¹ij) ∈ Rn×m, Λ2 = (Λ²ji) ∈ Rm×n are defined by the entries
$$\Lambda^1_{ij} = \frac{(|a_{ij}|+|b_{ij}|)L^f_j}{\delta_i c_{\varphi i}},\qquad \Lambda^2_{ji} = \frac{(|c_{ji}|+|d_{ji}|)L^g_i}{\rho_j c_{\psi j}}.$$
If condition (2.10) is satisfied, then, for a given nonnegative input vector J = col(I, J), system (2.4) has at least one positive EP χ∗ ∈ R^{n+m}_+.

Remark 2.3.1. Let Λ̃ = Λ1Λ2 = (Λ̃ij) ∈ Rn×n, where

$$\widetilde{\Lambda}_{ij} = \frac{1}{\delta_i c_{\varphi i}}\sum_{l=1}^{m}\frac{(|a_{il}|+|b_{il}|)(|c_{lj}|+|d_{lj}|)L^f_l L^g_j}{\rho_l c_{\psi l}},\quad i, j \in [n].$$
By the Schur identities, we have
$$\det(\lambda I_{n+m} - \Lambda) = \det\begin{pmatrix}\lambda I_n & -\Lambda_1\\ -\Lambda_2 & \lambda I_m\end{pmatrix} = \lambda^m \det\Big(\lambda I_n - \tfrac{1}{\lambda}\widetilde{\Lambda}\Big) = \lambda^{m-n}\det\big(\lambda^2 I_n - \widetilde{\Lambda}\big)\qquad(2.11)$$

for any λ ∈ C, λ ≠ 0. Therefore, λ ∈ σ(Λ)\{0} if and only if µ = λ² ∈ σ(Λ̃)\{0} and, consequently, ρ(Λ) < 1 if and only if ρ(Λ̃) < 1. By this, condition (2.10) holds if and only if ρ(Λ̃) < 1. Similarly, we can also conclude that condition (2.10) holds if and only if ρ(Λ̂) < 1, where Λ̂ = Λ2Λ1 = (Λ̂ij) ∈ Rm×m and
$$\widehat{\Lambda}_{ij} = \frac{1}{\rho_i c_{\psi i}}\sum_{l=1}^{n}\frac{(|c_{il}|+|d_{il}|)(|a_{lj}|+|b_{lj}|)L^g_l L^f_j}{\delta_l c_{\varphi l}},\quad i, j \in [m].$$
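The spectral-radius reduction in Remark 2.3.1 is easy to check numerically. The following sketch (with randomly generated nonnegative matrices, which are an assumption of this example only) verifies that ρ(Λ) equals the square root of ρ(Λ1Λ2), so the two tests ρ(Λ) < 1 and ρ(Λ1Λ2) < 1 agree:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
L1 = 0.3 * rng.random((n, m))          # Lambda_1 >= 0 (hypothetical entries)
L2 = 0.3 * rng.random((m, n))          # Lambda_2 >= 0

# Block matrix Lambda = [[0, L1], [L2, 0]] as in condition (2.10)
Lam = np.block([[np.zeros((n, n)), L1], [L2, np.zeros((m, m))]])

rho_Lam = max(abs(np.linalg.eigvals(Lam)))
rho_L1L2 = max(abs(np.linalg.eigvals(L1 @ L2)))

# By the Schur identity (2.11), the nonzero eigenvalues of Lambda are the
# square roots of those of L1 @ L2, hence rho(Lambda) = sqrt(rho(L1 L2)).
assert np.isclose(rho_Lam, np.sqrt(rho_L1L2))
print(rho_Lam < 1, np.sqrt(rho_L1L2) < 1)   # the two criteria coincide
```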

2.4 Exponential stability of positive equilibrium point

This section focuses on the exponential stability of the positive EP of system (2.4). For convenience, we introduce the following notation:

$$r_{\alpha i} = \bar{\alpha}_i\,\underline{\alpha}_i^{-1},\qquad s_{\beta j} = \bar{\beta}_j\,\underline{\beta}_j^{-1},$$
$$k^1_{ij} = \frac{r_{\alpha i}}{\delta_i c_{\varphi i}}(a_{ij}+b_{ij})L^f_j,\quad K_1 = (k^1_{ij}) \in \mathbb{R}^{n\times m},\qquad
k^2_{ji} = \frac{s_{\beta j}}{\rho_j c_{\psi j}}(c_{ji}+d_{ji})L^g_i,\quad K_2 = (k^2_{ji}) \in \mathbb{R}^{m\times n},$$
and M = K1K2 = (mik) ∈ Rn×n, M̃ = K2K1 = (m̃jl) ∈ Rm×m.

Theorem 2.4.1. Let assumptions (B1)-(B3) hold and assume that the matrix
$$\begin{pmatrix} A & B\\ C^\top & D^\top\end{pmatrix}$$
is nonnegative and one of the following LP conditions is satisfied:

(i) there exists a vector ξ ∈ Rn, ξ = (ξk) ≻ 0, such that Mξ ≺ ξ; (2.13)

(ii) there exists a vector ζ ∈ Rm, ζ ≻ 0, such that M̃ζ ≺ ζ.

Then, for any nonnegative input vector J = col(I, J), system (2.4) has a unique positive EP χ∗, which is GES for any delays satisfying (2.2).

The conditions of Theorem 2.4.1 are, in general, only sufficient for the existence and GES of a unique positive equilibrium. However, when system (2.1) is reduced to a linear system (i.e., the amplification functions α(.), β(.), the self-decay rates Φ(.), Ψ(.), and the neuron activation functions f(.), g(.) are the identity), condition (2.13) holds if and only if there exist positive vectors η = (ηi), ν = (νk) such that condition (2.16) holds. It is well known that, for positive linear systems reduced from (2.1), condition (2.16) is a delay-independent necessary and sufficient condition that guarantees GAS or GES for any bounded delays. Thus, the derived conditions in Theorem 2.4.1 are also necessary for a restricted class of time-delay systems.

2.5 Conclusion of Chapter 2

In this chapter, the problem of exponential stability of positive nonlinear systems representing BAM-Cohen-Grossberg neural networks with time-varying delays has been investigated. The main results achieved include:

1. Proved the existence of global positive solutions of the system (Theorem 2.2.1).

2. Proved the existence and uniqueness of the equilibrium point and established conditions in LP form for the exponential stability of the positive equilibrium point of the system (Theorem 2.3.1 and Theorem 2.4.1).

3. Proposed a new approach based on comparison techniques through differential-integral inequalities for studying the stability of positive Hopfield neural network models with variable delays.

The results presented in this chapter can be regarded as extensions of existing ones in the literature. However, it seems that the obtained results cannot be simply extended to certain models of BAM-Hopfield or BAM-CG inertial neural networks with delays. How to extend the proposed method of this chapter to nonlinear systems involving fractional BAM-CG neural networks proves to be an interesting and challenging issue.


Chapter 3

3.1 Model description and Preliminaries

Consider the model (4), which can be written componentwise as
$$\begin{aligned}
{}^{C}\!D^{\alpha}_{t_0}x_i(t) &= -\beta_i x_i(t) + \sum_{j=1}^{m} a_{ij}f_j(y_j(t)) + \sum_{j=1}^{m} b_{ij}f_j(y_j(t-\sigma_j(t))) + I_i,\\
{}^{C}\!D^{\alpha}_{t_0}y_j(t) &= -\gamma_j y_j(t) + \sum_{i=1}^{n} c_{ji}g_i(x_i(t)) + \sum_{i=1}^{n} d_{ji}g_i(x_i(t-\tau_i(t))) + J_j,
\end{aligned}\qquad(3.1)$$
where βi, i ∈ [n], and γj, j ∈ [m], are given positive scalars, A = (aij), B = (bij), C = (cji) and D = (dji) are known real matrices of appropriate dimensions, and the delays τi(t), σj(t) are continuous functions satisfying

0 ≤ τi(t) ≤ τ , 0 ≤ σj(t) ≤ σ,

where σ and τ are known positive constants. For a given t0 ≥ 0, the initial condition of system (3.1) is specified as
$$x_{t_0} = x_0 \in C([-\tau,0],\mathbb{R}^n),\qquad y_{t_0} = y_0 \in C([-\sigma,0],\mathbb{R}^m),\qquad(3.2)$$
that is,
$$x_{t_0}(s) = x(t_0+s) = x_0(s),\ s\in[-\tau,0],\qquad y_{t_0}(\theta) = y(t_0+\theta) = y_0(\theta),\ \theta\in[-\sigma,0].$$

Assumption (C): The nonlinear functions fj(·) and gi(·) (which represent neuron activation functions) are continuous, fj(0) = 0, gi(0) = 0, and there exist positive scalars l^f_j, l^g_i such that
$$0 \le \frac{f_j(u)-f_j(v)}{u-v} \le l^f_j,\qquad 0 \le \frac{g_i(u)-g_i(v)}{u-v} \le l^g_i,\qquad(3.3)$$
for all u, v ∈ R, u ≠ v.

System (3.1) can be written in the following form
$${}^{C}\!D^{\alpha}_{t_0}\begin{pmatrix} x(t)\\ y(t)\end{pmatrix}
= -D_{\beta,\gamma}\begin{pmatrix} x(t)\\ y(t)\end{pmatrix}
+ \begin{pmatrix} A f(y(t))\\ C g(x(t))\end{pmatrix}
+ \begin{pmatrix} B f(y_\sigma(t))\\ D g(x_\tau(t))\end{pmatrix}
+ \begin{pmatrix} I\\ J\end{pmatrix},$$
where Dβ = diag{β1, . . . , βn}, Dγ = diag{γ1, . . . , γm} and Dβ,γ = diag{Dβ, Dγ}.

Theorem 3.1.1. Let assumption (C) hold. Then, for any initial condition (x0, y0), the problem governed by system (3.1) and (3.2) possesses a unique solution χ(t) = col(x(t), y(t)), which is continuous in t on [t0, +∞).

We say that the solution χ(t) is a positive solution of system (3.1) if χ(t) ⪰ 0 for all t ≥ t0. Thus, to characterize the positivity of system (3.1), we define the following admissible set of initial conditions

$$\left\{\phi = \begin{pmatrix} x_0\\ y_0\end{pmatrix} \in C([-\tau,0],\mathbb{R}^n)\times C([-\sigma,0],\mathbb{R}^m) : x_0(s) \succeq 0,\ y_0(\theta) \succeq 0\ \text{for}\ s\in[-\tau,0],\ \theta\in[-\sigma,0]\right\}.$$

For given input vectors I ∈ Rn and J ∈ Rm, a vector χ∗ = col(x∗, y∗) ∈ Rn+m is said to be an equilibrium point (EP) of system (3.1) if it satisfies the algebraic system
$$-D_\beta x^* + (A+B)f(y^*) + I = 0,\qquad -D_\gamma y^* + (C+D)g(x^*) + J = 0.$$
Moreover, χ∗ is called a positive EP if χ∗ ≻ 0.

Definition 3.1.3. An EP χ∗ = col(x∗, y∗) of system (3.1) is said to be globally fractional exponentially stable (GFES) if there exist positive scalars κ, which is independent of the initial conditions, and λ such that any solution χ(t) = col(x(t), y(t)) of system (3.1) satisfies the following inequality
$$\|\chi(t)-\chi^*\|_\infty \le \kappa\,\|\phi-\chi^*\|_C\, e^{-\lambda(t-t_0)^\alpha/\alpha},\quad t \ge t_0.$$


3.2 Positive solutions

In this section, we prove, under assumptions involving the order-preserving property of nonlinear vector fields, that with nonnegative initial states and inputs, the state trajectories of (3.1) are always nonnegative for all time. First, we prove the following lemma.

Lemma 3.2.1. For a given function r(·) ∈ C([t0, +∞),R+) and a real number p, the corresponding solution of the problem
$${}^{C}\!D^{\alpha}_{t_0}x(t) = -px(t) + r(t),\ t \ge t_0,\qquad x(t_0) = x_0,\qquad(3.5)$$
is nonnegative for all t ≥ t0 provided that x0 ≥ 0.
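A sketch of why Lemma 3.2.1 holds, assuming the conformable convention ${}^{C}\!D^{\alpha}_{t_0}x(t) = (t-t_0)^{1-\alpha}x'(t)$ for differentiable $x$ (the explicit formula below is a derivation under this assumption, not a statement quoted from the thesis):

```latex
% Variation of constants for (3.5) under the assumed conformable convention.
% From D^alpha x = -p x + r we get x'(t) = (t-t_0)^{\alpha-1}\big(-p\,x(t) + r(t)\big);
% multiplying by the integrating factor e^{p(t-t_0)^\alpha/\alpha} and integrating yields
x(t) = e^{-p\,(t-t_0)^{\alpha}/\alpha}\,x_0
     + \int_{t_0}^{t} e^{-p\left[(t-t_0)^{\alpha}-(s-t_0)^{\alpha}\right]/\alpha}\,
       (s-t_0)^{\alpha-1}\, r(s)\,\mathrm{d}s .
% Both terms are nonnegative whenever x_0 >= 0 and r(s) >= 0, which gives the lemma.
```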

The positivity of system (3.1) is proved in the following theorem.

Theorem 3.2.1. Let assumption (C) hold and assume that the weight coefficient matrices A, B, C and D are nonnegative (equivalently, the augmented matrix $\begin{pmatrix} A & B\\ C^\top & D^\top\end{pmatrix}$ is nonnegative). Then, system (3.1) is positive.

3.3 Positive equilibria

A vector χ∗ = col(x∗, y∗) ∈ Rn+m is an EP of system (3.1) if and only if
$$D_\beta^{-1}\big((A+B)f(y^*)+I\big) = x^*,\qquad D_\gamma^{-1}\big((C+D)g(x^*)+J\big) = y^*.\qquad(3.6)$$
From (3.6), we define the functions si(y), s̃j(x) and a mapping S : Rn+m → Rn+m as

$$\mathscr{S}(\chi) = \begin{pmatrix} D_\beta^{-1}\big((A+B)f(y)+I\big)\\ D_\gamma^{-1}\big((C+D)g(x)+J\big)\end{pmatrix},\qquad(3.7)$$
where χ = col(x, y), x ∈ Rn and y ∈ Rm.

With regard to (3.6) and (3.7), a vector χ∗ ∈ Rn+m is an EP of system (3.1) if and only if it is a fixed point of the mapping S, that is, S(χ∗) = χ∗. Based on Brouwer's fixed point theorem, we have the following result.

Theorem 3.3.1. Let assumption (C) hold and assume that
$$\rho(\mathcal{H}) < 1,\qquad \mathcal{H} = \begin{pmatrix} 0_n & H_{12}\\ H_{21} & 0_m\end{pmatrix},\qquad(3.8)$$
where the matrices H12 = (h^f_ij) ∈ Rn×m, H21 = (h^g_ji) ∈ Rm×n are defined by the entries
$$h^f_{ij} = \frac{(|a_{ij}|+|b_{ij}|)\,l^f_j}{\beta_i},\qquad h^g_{ji} = \frac{(|c_{ji}|+|d_{ji}|)\,l^g_i}{\gamma_j},$$
and assume that the matrix $\begin{pmatrix} A & B\\ C^\top & D^\top\end{pmatrix}$ is nonnegative and condition (3.8) is satisfied. Then, for a given nonnegative input vector J = col(I, J), system (3.1) has at least one positive EP χ∗+ ∈ R^{n+m}_+.

Remark 3.3.1. Let H̃ = H12H21 = (H̃ij) ∈ Rn×n, where

$$\widetilde{H}_{ij} = \frac{1}{\beta_i}\sum_{l=1}^{m}\frac{(|a_{il}|+|b_{il}|)(|c_{lj}|+|d_{lj}|)\,l^f_l l^g_j}{\gamma_l},\quad i, j \in [n].$$
By the Schur identities, we have
$$\det(\lambda I_{n+m} - \mathcal{H}) = \det\begin{pmatrix}\lambda I_n & -H_{12}\\ -H_{21} & \lambda I_m\end{pmatrix} = \lambda^{m-n}\det\big(\lambda^2 I_n - \widetilde{H}\big)$$
for any λ ∈ C, λ ≠ 0. Therefore, ρ(H) < 1 if and only if ρ(H̃) < 1 and, similarly, if and only if ρ(Ĥ) < 1, where Ĥ = H21H12 = (Ĥij) ∈ Rm×m and
$$\widehat{H}_{ij} = \frac{1}{\gamma_i}\sum_{l=1}^{n}\frac{(|c_{il}|+|d_{il}|)(|a_{lj}|+|b_{lj}|)\,l^g_l l^f_j}{\beta_l},\quad i, j \in [m].$$

Corollary 3.3.2. Let condition (3.3) hold and assume that the weight coefficient matrices A, B, C, and D are nonnegative. If ρ(H̃) < 1 or ρ(Ĥ) < 1, then system (3.1) has at least one positive EP χ∗+ ∈ R^{n+m}_+ for any nonnegative input vector J = col(I, J).

3.4 Fractional exponential stability

In this section, we study the fractional exponential stability of the unique positive EP of system (3.1). The main result of this section is presented in the following theorem.

Theorem 3.4.1. Let assumption (C) hold. Assume that the weight coefficient matrices are nonnegative and one of the three following conditions is satisfied:

(i) the spectral radius of the matrix H satisfies ρ(H) < 1;

(ii) there exists a vector η = (ηi) ∈ Rn, η ≻ 0, such that H̃η ≺ η;

(iii) there exists a vector ζ = (ζi) ∈ Rm, ζ ≻ 0, such that Ĥζ ≺ ζ.

Then, for any nonnegative input vector J = col(I, J), system (3.1) has a unique positive EP χ∗, which is GFES for any bounded delays.

3.5 Conclusion of Chapter 3

In this chapter, the exponential stability of positive conformable BAM neural networks with heterogeneous time-varying delays has been studied. The main results achieved include:

1. Provided conditions ensuring that, with nonnegative inputs and connection weights, all state trajectories originating from nonnegative initial conditions are nonnegative (Theorem 3.2.1).

2. Established conditions and proved the existence of a unique positive equilibrium point that is exponentially stable for any bounded delays (Theorem 3.3.1 and Theorem 3.4.1).

Based on newly derived comparison techniques via fractional differential and integral inequalities, tractable LP-based conditions have been formulated to ensure that, for each nonnegative input vector, the system possesses a unique positive equilibrium point, which is fractional exponentially stable for any bounded delays.


Chapter 4

4.1 Model description and preliminaries

Consider a model of inertial BAM neural networks with heterogeneous time-varying delays described by the following second-order differential equations
$$x_i''(t) = -a_i x_i'(t) - c_i x_i(t) + \sum_{j=1}^{m} r_{ij}f_j(y_j(t)) + \sum_{j=1}^{m} s_{ij}f_j(y_j(t-\sigma_j(t))) + I_i,\quad t \ge t_0,\ i \in [n],\qquad(4.1a)$$
$$y_j''(t) = -b_j y_j'(t) - d_j y_j(t) + \sum_{i=1}^{n} p_{ji}g_i(x_i(t)) + \sum_{i=1}^{n} q_{ji}g_i(x_i(t-\tau_i(t))) + J_j,\quad t \ge t_0,\ j \in [m],\qquad(4.1b)$$
where n, m are positive integers representing the numbers of neurons in the X-layer and Y-layer, x(t) = (xi(t)) ∈ Rn and y(t) = (yj(t)) ∈ Rm are the state vectors of the neuron fields FX and FY, respectively, ai > 0, bj > 0, i ∈ [n], j ∈ [m], are the damping coefficients, and ci > 0, dj > 0 are self-inhibition coefficients, that is, the rates at which the ith and jth neurons reset their potential to the resting state in isolation, when disconnected from the network and external inputs. In model (4.1), fj and gi are the neuron activation functions, P = (pji), Q = (qji) ∈ Rm×n and R = (rij), S = (sij) ∈ Rn×m are connection weight matrices, which represent the strengths of connectivity between cells in FY and FX, and I = (Ii) ∈ Rn, J = (Jj) ∈ Rm are external input vectors to the networks. The functions τi(t) and σj(t)


represent the heterogeneous communication delays between neurons, which are assumed to satisfy

0 ≤ τi(t) ≤ τ and 0 ≤ σj(t) ≤ σ

for all t ≥ t0, where τ, σ are known positive constants.

The initial condition associated with (4.1), which specifies the initial state of the networks, is defined by
$$\begin{aligned}
x(t_0+\theta) &= \varphi(\theta), & x'(t_0+\theta) &= \varphi_d(\theta), & \theta &\in [-\tau,0],\\
y(t_0+\theta) &= \psi(\theta), & y'(t_0+\theta) &= \psi_d(\theta), & \theta &\in [-\sigma,0],
\end{aligned}\qquad(4.2)$$
where ϕ, ϕd ∈ C([−τ, 0],Rn) and ψ, ψd ∈ C([−σ, 0],Rm) are compatible initial functions. For convenience, we denote the following matrices and vector-valued functions:

A = diag{a1, a2, . . . , an}, C = diag{c1, c2, . . . , cn}, B = diag{b1, b2, . . . , bm}, D = diag{d1, d2, . . . , dm},

f(y(t)) = col(fj(yj(t))), f(y(t − σ(t))) = col(fj(yj(t − σj(t)))), g(x(t)) = col(gi(xi(t))), g(x(t − τ(t))) = col(gi(xi(t − τi(t)))).

Then, system (4.1) can be written in the following vector form
$$x''(t) = -Ax'(t) - Cx(t) + Rf(y(t)) + Sf(y(t-\sigma(t))) + I,\qquad(4.3a)$$
$$y''(t) = -By'(t) - Dy(t) + Pg(x(t)) + Qg(x(t-\tau(t))) + J.\qquad(4.3b)$$

4.2 Global existence of solutions

Consider the inertial BAM neural network model (4.1). We define a state transformation by
$$\eta_i\hat{x}_i = x_i' + \xi_i x_i,\ i \in [n],\qquad \mu_j\hat{y}_j = y_j' + \zeta_j y_j,\ j \in [m],\qquad(4.4)$$
where ηi, ξi and µj, ζj are positive scalars, which will be determined later. System (4.3) can then be represented as a first-order system (4.5) with the diagonal matrices

Dξ = diag{ξ1, ξ2, . . . , ξn}, Dη = diag{η1, η2, . . . , ηn}, Dα = diag{α1, α2, . . . , αn}, Dγ = diag{γ1, γ2, . . . , γn},
Dζ = diag{ζ1, ζ2, . . . , ζm}, Dµ = diag{µ1, µ2, . . . , µm}, Dβ = diag{β1, β2, . . . , βm}, Dν = diag{ν1, ν2, . . . , νm},


$$\alpha_i = a_i - \xi_i,\quad \gamma_i = \frac{1}{\eta_i}(\alpha_i\xi_i - c_i),\ i \in [n],\qquad
\beta_j = b_j - \zeta_j,\quad \nu_j = \frac{1}{\mu_j}(\beta_j\zeta_j - d_j),\ j \in [m].$$
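For readability, substituting the transformation (4.4) into (4.3) and using the coefficients above gives the transformed first-order system; the display below is a reconstruction derived from (4.3)-(4.4) (cf. (4.5)), not text quoted from the thesis.

```latex
% First-order form obtained from (4.3) via (4.4); a derived sketch, cf. (4.5).
\begin{aligned}
x'(t)       &= -D_{\xi}\,x(t) + D_{\eta}\,\hat{x}(t),\\
\hat{x}'(t) &= -D_{\alpha}\,\hat{x}(t) + D_{\gamma}\,x(t)
               + D_{\eta}^{-1}\big[R f(y(t)) + S f(y(t-\sigma(t))) + I\big],\\
y'(t)       &= -D_{\zeta}\,y(t) + D_{\mu}\,\hat{y}(t),\\
\hat{y}'(t) &= -D_{\beta}\,\hat{y}(t) + D_{\nu}\,y(t)
               + D_{\mu}^{-1}\big[P g(x(t)) + Q g(x(t-\tau(t))) + J\big].
\end{aligned}
```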

Assumption (D): The neuron activation functions fj(.), gi(.), i ∈ [n], j ∈ [m], are continuous, fj(0) = 0, gi(0) = 0, and there exist positive constants K^f_j, K^g_i such that
$$0 \le \frac{f_j(x)-f_j(y)}{x-y} \le K^f_j,\qquad 0 \le \frac{g_i(x)-g_i(y)}{x-y} \le K^g_i,\qquad(4.6)$$
for all x, y ∈ R, x ≠ y.

Theorem 4.2.1 (Global existence of solutions). Let assumption (D) hold. For any initial functions ϕ, ψ, ϕd and ψd, the problem governed by system (4.1) and (4.2) possesses a unique solution on the infinite interval [t0, +∞), which is absolutely continuous in t.

4.3 State transformations and positive solutions

Let X(t) = col(x(t), y(t)) be a solution of (4.1) (or system (4.3)). If its trajectory is confined within the first orthant, that is, x(t) ∈ R^n_+ and y(t) ∈ R^m_+ for all t ≥ t0, then X(t) is said to be a positive solution. For a function ϕ ∈ C([−τ, 0],Rn), we write ϕ ⪰ 0 if ϕ(θ) ⪰ 0 for all θ ∈ [−τ, 0]. Consider a transformation (4.4), where ηi > 0, ξi > 0, i ∈ [n], and µj > 0, ζj > 0, j ∈ [m]. We define the following set of admissible initial functions for system (4.1)
$$\mathcal{D}_A = \Big\{\phi = \mathrm{col}(\varphi, \varphi_d, \psi, \psi_d) :\ \varphi \succeq 0,\ D_\xi\varphi + \varphi_d \succeq 0,\ \psi \succeq 0,\ D_\zeta\psi + \psi_d \succeq 0\Big\}.$$

Definition 4.3.2. Given input vectors I ∈ Rn, J ∈ Rm, a vector X∗ = col(x∗, y∗), where x∗ ∈ Rn and y∗ ∈ Rm, is said to be an equilibrium point (EP) of system (4.3) if it satisfies the following algebraic system
$$-Cx^* + (R+S)f(y^*) + I = 0,\qquad -Dy^* + (P+Q)g(x^*) + J = 0.\qquad(4.7)$$
Moreover, if X∗ ≻ 0, then it is called a positive equilibrium point.

Definition 4.3.3. An EP X∗ of (4.3) is said to be globally exponentially stable (GES) if there exist positive scalars κ and ω such that any solution X(t) of (4.3) satisfies the following inequality
$$\|X(t)-X^*\|_\infty \le \kappa\,\Psi^* e^{-\omega(t-t_0)},\quad t \ge t_0,$$
where Ψ∗ = max{‖ϕ − x∗‖C, ‖ϕd‖C, ‖ψ − y∗‖C, ‖ψd‖C}.

The main objective of this section is to derive testable LP-based conditions ensuring that the inertial BAM neural networks described by system (4.3) are positive and that there exists a unique positive EP which is GES.

To establish the positivity of system (4.1), or its vector form (4.3), via the transformed system (4.5), the existence of a state transformation (4.4) with positive coefficients is essential. This is shown in the following technical lemma.

Lemma 4.3.1. Given positive damping coefficients ai, bj and self-inhibition coefficients ci, dj, i ∈ [n], j ∈ [m], there exists a transformation (4.4) defined by positive coefficient matrices Dξ, Dη, Dζ and Dµ such that diag{Dα, Dγ, Dβ, Dν} ≻ 0 if and only if
$$A^2 - 4C \succ 0,\qquad B^2 - 4D \succ 0.\qquad(4.8)$$

Theorem 4.3.1. Let assumption (D) and condition (4.8) hold. Assume that the connection matrices P, Q, R and S are nonnegative. Then, system (4.3) is positive for any bounded delays.
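Lemma 4.3.1 reduces the choice of the transformation (4.4) to the scalar conditions a_i² > 4c_i and b_j² > 4d_j. One convenient explicit choice, shown in the sketch below with hypothetical coefficient values (this is an illustration, not the thesis' construction), is ξ_i = a_i/2, the midpoint of the two roots of s² − a_i s + c_i = 0:

```python
import numpy as np

# Hypothetical damping/self-inhibition coefficients with a_i**2 > 4*c_i.
a = np.array([3.0, 2.5]); c = np.array([1.0, 0.8])
assert np.all(a**2 - 4*c > 0)          # condition (4.8) for the X-layer

xi = a / 2                              # midpoint of the roots of s^2 - a s + c = 0
eta = np.ones_like(a)                   # any positive scaling eta_i works
alpha = a - xi                          # entries of D_alpha
gamma = (alpha * xi - c) / eta          # entries of D_gamma

# With this choice all transformed coefficients are strictly positive,
# i.e. diag{D_alpha, D_gamma} > 0 as required in Lemma 4.3.1.
print(alpha, gamma)
assert np.all(alpha > 0) and np.all(gamma > 0)
```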

4.4 Exponential stability of positive equilibrium point of inertial BAM neural networks

4.4.1 Equilibrium

In this section, by utilizing homeomorphism theory, we establish conditions for the existence and uniqueness of an EP of system (4.3) with respect to any input vector col(I, J) ∈ Rn+m.

Lemma 4.4.1. For given positive matrices Dξ, Dη, Dζ and Dµ, a vector X∗ = col(x∗, y∗) is an EP of system (4.3) if and only if the vector χ∗ = col(x∗, x̂∗, y∗, ŷ∗), where x̂∗ = Dη⁻¹Dξx∗, ŷ∗ = Dµ⁻¹Dζy∗, is an EP of system (4.5). In other words, the vector χ∗ satisfies the following algebraic system
$$\begin{aligned}
-D_\xi x^* + D_\eta\hat{x}^* &= 0,\\
D_\eta(-D_\alpha\hat{x}^* + D_\gamma x^*) + (R+S)f(y^*) + I &= 0,\\
-D_\zeta y^* + D_\mu\hat{y}^* &= 0,\\
D_\mu(-D_\beta\hat{y}^* + D_\nu y^*) + (P+Q)g(x^*) + J &= 0.
\end{aligned}$$
In addition, we define the block matrices
$$\Phi_{11} = \begin{pmatrix} D_\xi & -D_\eta\\ -D_{\alpha\xi}+C & D_{\alpha\eta}\end{pmatrix},\qquad
\Phi_{12} = \begin{pmatrix} 0_{n\times m} & 0_{n\times m}\\ (R+S)K_f & 0_{n\times m}\end{pmatrix},$$
$$\Phi_{21} = \begin{pmatrix} 0_{m\times n} & 0_{m\times n}\\ (P+Q)K_g & 0_{m\times n}\end{pmatrix},\qquad
\Phi_{22} = \begin{pmatrix} D_\zeta & -D_\mu\\ -D_{\beta\zeta}+D & D_{\beta\mu}\end{pmatrix},$$

where Dαξ = DαDξ, Dαη = DαDη, Dβζ = DβDζ and Dβµ = DβDµ.

Theorem 4.4.1. Assume that there exist vectors Λ ∈ R2n, Υ ∈ R2m, Λ ≻ 0 and Υ ≻ 0, such that
$$\begin{pmatrix}\Phi_{11} & -\Phi_{12}\\ -\Phi_{21} & \Phi_{22}\end{pmatrix}\begin{pmatrix}\Lambda\\ \Upsilon\end{pmatrix} \succ 0.\qquad(4.10)$$
Then, for a given input vector I = col(I, J) ∈ Rn+m, there exists a unique EP χ∗ = col(x∗, x̂∗, y∗, ŷ∗) of system (4.5).

4.4.2 Exponential stability of positive EP of INNs

The result of Theorem 4.4.1 guarantees that, for any input vector I = col(I, J) ∈ Rn+m, there exists a unique EP χ∗ of system (4.5). In this section, we prove, under the assumptions of Theorem 4.3.1 and Theorem 4.4.1, that the unique EP χ∗ of system (4.5) is positive and GES for any input I = col(I, J) ∈ R^{n+m}_+.

Theorem 4.4.2. Let the assumptions of Theorem 4.3.1 hold. Assume that there exist positive vectors 0 ≺ Λ̂ ∈ R2n and 0 ≺ Υ̂ ∈ R2m such that
$$\begin{pmatrix}\Phi_{11} & -\Phi_{12}\\ -\Phi_{21} & \Phi_{22}\end{pmatrix}\begin{pmatrix}\mathcal{I}_\eta^{-1}\hat{\Lambda}\\ \mathcal{I}_\mu^{-1}\hat{\Upsilon}\end{pmatrix} \succ 0,\qquad(4.11)$$
where Iη = diag{En, Dη} and Iµ = diag{Em, Dµ}. Then, for given input vectors I ∈ R^n_+, J ∈ R^m_+, system (4.3) has a unique positive EP X∗, which is GES for any delays τi(t) ∈ [0, τ] and σj(t) ∈ [0, σ].

Remark 4.4.1. By the theory of M-matrices, it can be noted that, for given 0 ≺ η = (ηi) ∈ Rn, 0 ≺ µ = (µj) ∈ Rm, condition (4.11) is feasible for positive vectors Λ̂ ∈ R2n and Υ̂ ∈ R2m if and only if there exist Λ ∈ R2n, Υ ∈ R2m, Λ ≻ 0, Υ ≻ 0, that satisfy condition (4.10). Moreover, the derived conditions in (4.10) and (4.11) are satisfied if and only if the matrix $\begin{pmatrix}\Phi_{11} & -\Phi_{12}\\ -\Phi_{21} & \Phi_{22}\end{pmatrix}$ is a nonsingular M-matrix.

Theorem 4.4.3. Let assumption (D) hold and assume that the damping coefficients, self-inhibition rates and connection weights satisfy the following conditions:

(i) A² − 4C ≻ 0 and B² − 4D ≻ 0;

(ii) the connection weight matrices P, Q, R and S are nonnegative;

(iii) there exist positive vectors 0 ≺ p ∈ Rn and 0 ≺ q ∈ Rm such that

Cp − (R + S)Kf q ≻ 0, Dq − (P + Q)Kg p ≻ 0.

Then, it holds that

(a) System (4.1) is positive subject to initial conditions that belong to DA.

(b) For given input vectors I ∈ R^n_+, J ∈ R^m_+, there exists a unique positive EP X∗ = col(x∗, y∗) of (4.1), which is GES for any delays τi(t) ∈ [0, τ] and σj(t) ∈ [0, σ].
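Condition (iii) of Theorem 4.4.3 is a linear feasibility problem in (p, q) and can, for given data, be checked with an off-the-shelf LP solver. The sketch below uses scipy.optimize.linprog with hypothetical 2×2 matrices; the small margin ε, which is an artifact of this example, turns the strict inequalities into a solvable LP:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: C, D diagonal self-inhibition matrices; R, S, P, Q >= 0;
# Kf, Kg diagonal Lipschitz bounds of the activations.
Cm = np.diag([2.0, 1.5]); Dm = np.diag([1.8, 2.2])
R = np.array([[0.3, 0.2], [0.1, 0.4]]); S = 0.5 * R
P = np.array([[0.2, 0.3], [0.2, 0.1]]); Q = 0.5 * P
Kf = np.eye(2); Kg = np.eye(2)
n, m, eps = 2, 2, 1e-3

# Variables z = [p; q] with p, q > 0.  Condition (iii):
#   C p - (R+S) Kf q > 0   and   D q - (P+Q) Kg p > 0,
# rewritten as A_ub @ z <= -eps (strict inequalities with margin eps).
A_ub = np.block([[-Cm,          (R + S) @ Kf],
                 [(P + Q) @ Kg, -Dm        ]])
b_ub = -eps * np.ones(n + m)

res = linprog(c=np.zeros(n + m), A_ub=A_ub, b_ub=b_ub,
              bounds=[(eps, None)] * (n + m), method="highs")
print("condition (iii) feasible:", res.success)
if res.success:
    print("p =", res.x[:n], " q =", res.x[n:])
```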

Remark 4.4.2. By assumption (D) and condition (ii), it can be verified that the mappings G : Rn → Rn, G(x) = (P + Q)g(x), and F : Rm → Rm, F(y) = (R + S)f(y), are order-preserving vector fields. Thus, for given input vectors 0 ≺ I ∈ Rn, 0 ≺ J ∈ Rm, the EP X∗ = col(x∗, y∗) of (4.1) determined by (4.7) satisfies
$$x^* \succeq C^{-1}I \succ 0\qquad\text{and}\qquad y^* \succeq D^{-1}J \succ 0.$$
Moreover, let X(t) = col(x(t), y(t)) be a solution of system (4.1)-(4.2). Since X(t) converges exponentially to the unique EP X∗ ≻ 0, it is clear that X(t) ≻ 0 for all t ≥ tf for a large enough time tf. From this we can conclude that, under the assumptions of Theorem 4.4.3, any solution of system (4.1)-(4.2) is eventually positive and converges exponentially to a unique positive EP for any positive input vectors I and J.

4.5 Conclusion of Chapter 4

In this chapter, the problem of global exponential attractivity of a unique positive EP has been studied for inertial neural networks described by a BAM-Hopfield model with multiple time-varying delays. The results achieved include:

1. Provided conditions for, and proved, the positivity of the system (Theorem 4.3.1).

2. Established conditions for the existence and uniqueness of a positive equilibrium point that is globally exponentially attractive (Theorem 4.4.1 and Theorem 4.4.2).

A systematic approach, based on a reduction via a positive state transformation and comparison techniques through differential inequalities, has been presented. By utilizing the proposed method, tractable conditions for the positivity of solutions and the existence of a unique positive EP, which is globally exponentially attractive, have been derived in the form of linear programming with M-matrices.


CONCLUDING REMARKS

This thesis focuses on the problem of stability of nonlinear differential equations with delays in neural networks. Specifically, the thesis is concerned with the following aspects: (1) positive solutions and global exponential stability of BAM-Cohen-Grossberg neural networks with time-varying delays and nonlinear self-excitation rates; (2) exponential stability of positive BAM neural networks with communication delays described by conformable fractional derivatives; (3) exponential attractivity of positive inertial neural networks in the bidirectional associative memory model with bounded delays. The main approaches used in this thesis are based on comparison techniques via differential inequalities and M-matrix theory.

Main contributions

1. Proposed a novel and systematic approach based on M-matrix theory for the problem of positive solutions and exponential stability of BAM-Cohen-Grossberg neural networks with time-varying delays and nonlinear self-excitation rates. By utilizing the proposed method, sufficient conditions for the global exponential stability of a unique EP are derived in the form of tractable LP-based conditions.

2. Proved the positivity and derived tractable conditions ensuring the existence and exponential stability of a unique positive equilibrium of conformable fractional-order BAM neural networks with communication delays.

3. Established sufficient conditions ensuring the positivity of solutions and global exponential stability of a unique positive equilibrium of inertial BAM neural networks with bounded delays.

Future works: Potential further extensions

• The results presented in Chapter 2 extend existing ones in the literature. However, how to extend the proposed method of that chapter to nonlinear systems involving fractional-order BAM-Cohen-Grossberg delayed neural networks proves to be an interesting and challenging issue. This will be further investigated in future works.

• The approach and techniques used to obtain the results presented in Chapter 3 can still be effective when considering the stability of conformable fractional inertial neural networks. We expect to develop this topic in future works.

• The techniques presented in Chapter 4, which were used to demonstrate the exponential stability of the inertial BAM-Hopfield model, can also be useful for studying the exponential stability of inertial BAM-Cohen-Grossberg neural networks with heterogeneous time-varying delays. This motivates further investigation.


LIST OF PUBLICATIONS

[P1] Le Thi Hong Dzung and Le Van Hien (2024), Positive solutions and exponential stability of nonlinear time-delay systems in the model of BAM-Cohen-Grossberg neural networks, Differential Equations and Dynamical Systems, vol. 32, no. 3, pp. 909-932 (ESCI/Scopus, Q3).

[P2] Le Thi Hong Dzung and Le Van Hien (2024), Exponential stability of positive conformable BAM neural networks with communication delays, Journal of Nonlinear Modeling and Analysis, vol. 6, no. 2, pp. 453-475 (Scopus, Q3).

[P3] Le Thi Hong Dzung and Le Van Hien (2024), Exponential attractivity of positive inertial neural networks in bidirectional associative memory model with heterogeneous delays, Computational and Applied Mathematics (SCIE, Q2), to appear.

The results of this dissertation have been presented at

• The weekly seminar on Differential and Integral Equations, Division of Mathematical Analysis, Faculty of Mathematics and Informatics, Hanoi National University of Education.

• PhD Annual Conferences, Faculty of Mathematics and Informatics, Hanoi National University of Education.
