G-equations are well-known front-propagation models in combustion; they are Hamilton-Jacobi-type equations with convex but non-coercive Hamiltonians. Viscous G-equations arise from numerical discretization or from modeling dissipative mechanisms. Although viscosity helps to overcome the non-coercivity, we prove homogenization of the inviscid G-equation based on approximate correctors and the attainability of controlled flow trajectories. We verify attainability for two-dimensional mean-zero incompressible flows, and demonstrate asymptotically and numerically that viscosity reduces the homogenized Hamiltonian in cellular flows. For one-dimensional compressible flows, we find an explicit formula for the homogenized Hamiltonian, as well as necessary and sufficient conditions for wave trapping (the effective Hamiltonian vanishing identically). Viscosity restores coercivity and hence wave propagation.
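For orientation, the standard level-set form of the G-equation and its viscous counterpart can be sketched as follows; the notation (flow field $V$, laminar flame speed $s_l$, viscosity $d$) is conventional and not taken from the abstract itself:

```latex
% Inviscid G-equation: convex but non-coercive Hamiltonian H(p,x) = V(x)\cdot p + s_l |p|
G_t + V(x)\cdot\nabla G + s_l\,|\nabla G| = 0
% Viscous G-equation: d > 0 models dissipation (e.g. from numerical discretization)
G_t + V(x)\cdot\nabla G + s_l\,|\nabla G| = d\,\Delta G
% Homogenization: for flows oscillating on scale \varepsilon, solutions converge to those of
\bar{G}_t + \bar{H}(\nabla\bar{G}) = 0
% where \bar{H} is the effective (homogenized) Hamiltonian; wave trapping means \bar{H}\equiv 0.
```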
By analysing the uniform attractor for multi-valued processes, we study the long-time behaviour of solutions of a model of non-autonomous porous-medium equations. The result is obtained by means of <i>a priori</i> estimates and suitable compactness arguments.
The formation and propagation of thermal fronts in a cylindrical medium undergoing microwave heating are studied in detail. The model consists of Maxwell's wave equation coupled to a temperature diffusion equation containing a bistable nonlinear term.
In this paper, we consider the local existence of solutions to the Euler equations with linear damping under a physical vacuum boundary condition. Using the transformation introduced in Liu and Yang (Methods Appl. Anal. 7 (3) (2000) 495) to capture the singularity at the boundary, we prove a local existence theorem for a perturbation of a planar wave solution via Littlewood-Paley theory, and we justify the transformation of Liu and Yang (2000) in a rigorous setting.
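As a reminder of the setting, the one-dimensional isentropic Euler equations with linear damping take the following standard form; the notation below is conventional and not fixed by the abstract:

```latex
% Conservation of mass and momentum with linear frictional damping -\rho u
\rho_t + (\rho u)_x = 0, \qquad
(\rho u)_t + \bigl(\rho u^2 + p(\rho)\bigr)_x = -\rho u
% For a polytropic gas p(\rho) = \rho^\gamma, the physical vacuum condition requires
% the sound speed c = \sqrt{p'(\rho)} to vanish at the vacuum boundary like
% c^2 \sim \mathrm{dist}(x,\ \text{vacuum boundary}),
% i.e. (c^2)_x is finite and nonzero there, which is the singularity the
% transformation is designed to capture.
```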
Let G=(V,E) be a locally finite connected weighted graph, and let Δ be the usual graph Laplacian. In this article, we study blow-up problems for the nonlinear parabolic equation u_t = Δu + f(u) on G. The blow-up phenomena for u_t = Δu + f(u) are discussed in two cases: (i) an initial condition is given; (ii) a Dirichlet boundary condition is given. We prove that if f satisfies appropriate conditions, then the corresponding solutions blow up in finite time.
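A minimal sketch of the operator involved, assuming the standard weighted graph Laplacian with symmetric edge weights $w_{xy}$ and a vertex measure $\mu$ (these symbols are conventional and not specified in the abstract):

```latex
% Graph Laplacian on a locally finite weighted graph G = (V,E):
\Delta u(x) = \frac{1}{\mu(x)} \sum_{y \sim x} w_{xy}\,\bigl(u(y) - u(x)\bigr)
% The equation studied is then u_t = \Delta u + f(u) on V, and blow-up in
% finite time means \limsup_{t \to T^-} \sup_{x \in V} |u(x,t)| = \infty
% for some finite T > 0.
```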