Communications on Stochastic Analysis
…
A stochastic process X_t is called a near-martingale with respect to a filtration {F_t} if E[X_t | F_s] = E[X_s | F_s] for all s ≤ t. It is called a near-submartingale with respect to {F_t} if E[X_t | F_s] ≥ E[X_s | F_s] for all s ≤ t. The near-martingale property is the analogue of the martingale property when the Itô integral is extended to non-adapted integrands. We prove that X_t is a near-martingale (near-submartingale) if and only if E[X_t | F_t] is a martingale (submartingale, respectively). The Doob-Meyer decomposition theorem is extended to near-submartingales. We study stochastic differential equations with anticipating initial conditions and obtain a relationship between such equations and the associated stochastic differential equations of Itô type.
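For orientation, here is a minimal worked example (ours, not taken from the abstract) of a non-adapted near-martingale, assuming {F_t} is the natural filtration of a Brownian motion $B_t$ on $[0, 1]$:
$$X_t = B_1 B_t - t, \qquad 0 \le t \le 1.$$
For $s \le t$, writing $B_1 = B_s + (B_1 - B_s)$ and $B_t = B_s + (B_t - B_s)$ and using independence of increments,
$$E[X_t \mid \mathcal{F}_s] = B_s^2 + (t - s) - t = B_s^2 - s = E[X_s \mid \mathcal{F}_s],$$
so $X$ is a near-martingale even though it is not $\{\mathcal{F}_t\}$-adapted and hence not a martingale.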
2022
The primary goal of this paper is to prove a near-martingale optional stopping theorem and to establish solvability and large deviations for a class of anticipating linear stochastic differential equations. We prove the existence and uniqueness of solutions using two approaches: (1) the Ayed-Kuo differential formula together with an ansatz, and (2) a novel braiding technique, interpreting the integral in the Skorokhod sense. We establish a Freidlin-Wentzell type large deviations result for solutions of such equations.
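As a concrete instance of the type of equation considered (a standard example from the Ayed-Kuo literature, sketched here under the assumption that the Ayed-Kuo differential formula applies to processes of the form $\theta(t, B_t, B_1)$; it is not quoted from this abstract):
$$dX_t = X_t\, dB_t, \qquad X_0 = B_1, \qquad 0 \le t \le 1.$$
The ansatz $X_t = \theta(t, B_t, B_1)$ leads to
$$X_t = (B_1 - t)\, e^{B_t - t/2},$$
where the anticipating factor is evaluated at the shifted point $B_1 - t$ rather than at $B_1$, in contrast with the adapted case.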
Communications on Stochastic Analysis, 2018
In this paper we discuss the new stochastic integral of [1] in terms of the Itô isometry. We prove the Doob-Meyer decomposition theorem for near-submartingales in the classes (D) and (DL). Moreover, as an application of the decomposition theorem, we introduce a stochastic integral with respect to a near-martingale.
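For reference, the classical theorem being extended reads as follows (this is the standard Doob-Meyer statement, recalled for orientation; the paper's near-submartingale version replaces the submartingale and its martingale part by their near counterparts):
A right-continuous submartingale $X$ of class (DL) admits a unique decomposition
$$X_t = X_0 + M_t + A_t,$$
with $M$ a martingale, $M_0 = 0$, and $A$ a predictable increasing process, $A_0 = 0$; if $X$ is of class (D), then $M$ is uniformly integrable.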
Stochastics An International Journal of Probability and Stochastic Processes, 1991
Communications on Stochastic Analysis, 2015
We study the discrete parameter case of near-martingales, near-submartingales, and near-supermartingales. In particular, we prove Doob's decomposition theorem for near-submartingales. This generalizes the classical case for submartingales.
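For context, the classical discrete-time Doob decomposition that is being generalized can be written explicitly (standard statement, recalled here): for an adapted integrable sequence $(X_n)$, set
$$A_0 = 0, \qquad A_n = \sum_{k=1}^{n} E[X_k - X_{k-1} \mid \mathcal{F}_{k-1}], \qquad M_n = X_n - A_n.$$
Then $M$ is a martingale and $A$ is predictable, and $A$ is nondecreasing precisely when $X$ is a submartingale.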
arXiv (Cornell University), 2022
This paper contributes to the study of relative martingales. Specifically, for a closed random set H, these are processes null on H which decompose as M = m + v, where m is a càdlàg uniformly integrable martingale and v is a continuous process of integrable variation such that v_0 = 0 and dv is carried by H. First, we extend this notion to stochastic processes that are not necessarily null on H, with m taken to be a local martingale rather than a uniformly integrable martingale. We thus provide a general framework for this new, larger class of relative martingales and present some of its structural properties. Second, as applications, we construct solutions of skew Brownian motion equations using continuous stochastic processes from the above-mentioned new class. In addition, we investigate stochastic differential equations driven by a relative martingale.
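The skew Brownian motion equations referred to are presumably of the classical Harrison-Shepp form, recalled here for orientation (the paper's construction via relative martingales is not reproduced):
$$X_t = X_0 + B_t + \beta\, L_t^0(X), \qquad |\beta| \le 1,$$
where $L^0(X)$ is the symmetric local time of $X$ at $0$; for $|\beta| \le 1$ the solution is the skew Brownian motion with parameter $\alpha = (1+\beta)/2$, while for $|\beta| > 1$ no solution exists.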
Indian Journal of Pure and Applied Mathematics, 2017
In Karandikar-Rao [11], the quadratic variation [M, M] of a (local) martingale was obtained directly, using only Doob's maximal inequality, and it was remarked that the stochastic integral can be defined using [M, M], avoiding the predictable quadratic variation ⟨M, M⟩ (of a locally square integrable martingale) that is usually used. This is accomplished here: starting with the result proved in [11], we construct ∫ f dX, where X is a semimartingale and f is predictable, and prove a dominated convergence theorem (DCT) for the stochastic integral. Indeed, we characterize the class of integrands f for this integral as the class L(X) of predictable processes f such that |f| serves as the dominating function in the DCT for the stochastic integral. This observation seems to be new. We then discuss the vector stochastic integral ∫ (f, dY), where f is an R^d-valued predictable process and Y is an R^d-valued semimartingale. This integral was defined by Jacod [6] starting from vector-valued simple functions. Memin [13] proved the following for (local) martingales M^1, ..., M^d: if N^n are martingales such that N^n_t → N_t for every t, and if there exist f^n such that N^n = ∫ (f^n, dM), then there exists f such that N = ∫ (f, dM). Taking a cue from our characterization of L(X), we define the vector integral in terms of the scalar integral and then give a direct proof of the result due to Memin stated above. This completeness result is an important step in the proof of the Jacod-Yor [4] result on the martingale representation property and the uniqueness of the equivalent martingale measure, also known as the second fundamental theorem of asset pricing.
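A compact statement of the dominated convergence theorem referred to, in the form usually given for semimartingale integrals (phrasing ours, not quoted from the paper): if $f^n$, $f$ are predictable, $f^n \to f$ pointwise, and $|f^n| \le g$ for some $g \in L(X)$, then
$$\int_0^t f^n\, dX \;\to\; \int_0^t f\, dX$$
uniformly on compacts in probability (ucp).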
Communications on Stochastic Analysis, 2013
We study the concept of translation of a Brownian motion by an anticipative term given by a Lebesgue integral of an instantly independent stochastic process. We introduce an equivalent probability measure that is constructed via an exponential process based on the stochastic integral of anticipative processes (in the sense of Ayed and Kuo) and show that under the new measure the translated Brownian motion is a continuous near-martingale with quadratic variation t on the interval [0, t]. Thus we obtain an anticipative version of the Girsanov theorem.
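Schematically, and only by analogy with the classical Girsanov theorem (the precise hypotheses and the exact form of the exponential process are in the paper and are not reproduced here), the setup is of the following type:
$$\widetilde{B}_t = B_t + \int_0^t \varphi(s)\, ds, \qquad \frac{dQ}{dP} = \exp\Big(-\int_0^1 \varphi(s)\, dB_s - \tfrac{1}{2}\int_0^1 \varphi(s)^2\, ds\Big),$$
with $\varphi$ instantly independent and the integral in the exponent taken in the Ayed-Kuo sense; the abstract asserts that under the new measure the translated process is a continuous near-martingale with quadratic variation $t$ on $[0, t]$.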
arXiv: Probability, 2020
In the definition of the stochastic integral, apart from the integrand and the integrator, there is an underlying filtration that plays a role. Thus, it is natural to ask: {\it Does the stochastic integral depend upon the filtration?} In other words, if we have two filtrations, $({\mathcal F}_\centerdot)$ and $({\mathcal G}_\centerdot)$, a process $X$ that is a semimartingale under both filtrations, and a process $f$ that is predictable for both filtrations, are the two stochastic integrals $Y=\int f\,dX$, computed with the filtration $({\mathcal F}_\centerdot)$, and $Z=\int f\,dX$, computed with the filtration $({\mathcal G}_\centerdot)$, the same? When $f$ is left continuous with right limits, the answer is yes. When one filtration is an enlargement of the other, the two integrals are equal if $f$ is bounded, but this may not be the case when $f$ is unbounded. We discuss this and give sufficient conditions under which the two integrals are equal.
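The positive result for left-continuous integrands can be seen from a standard fact (added here for the reader): for $f$ adapted and left continuous with right limits, the stochastic integral is a limit of Riemann-type sums that make no reference to the filtration,
$$\int_0^t f\, dX \;=\; \lim_{\|\pi\| \to 0} \sum_{t_i \in \pi} f_{t_i}\, (X_{t_{i+1} \wedge t} - X_{t_i \wedge t}) \quad \text{in probability},$$
so the two constructions necessarily agree.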
Lecture Notes in Mathematics, 2015
Given a reference filtration F, we consider the cases where an enlarged filtration G is constructed from F in two different ways: progressively with a random time, or initially with a random variable. In both situations, under suitable conditions, we present a G-optional semimartingale decomposition for F-local martingales. Our study is then applied to answer the question of how an arbitrage-free semimartingale model is affected when it is stopped at the random time (in the case of progressive enlargement) or when the random variable used for the initial enlargement satisfies Jacod's hypothesis. More precisely, we focus on the No-Unbounded-Profit-with-Bounded-Risk (NUPBR) condition. We provide alternative proofs of some results from [5], with a methodology based on our optional semimartingale decomposition, which significantly shortens the proofs.
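For the initial enlargement case, Jacod's hypothesis mentioned above is the standard absolute-continuity condition, recalled here (the paper's decomposition itself is not restated): if $\mathcal{G}_t = \bigcap_{s > t} (\mathcal{F}_s \vee \sigma(L))$ for a random variable $L$, the hypothesis requires that for every $t$, almost surely,
$$P(L \in \cdot \mid \mathcal{F}_t) \;\ll\; P(L \in \cdot),$$
i.e. the regular conditional law of $L$ given $\mathcal{F}_t$ is absolutely continuous with respect to the law of $L$.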