
Model structures for diagrammatic $(\infty, n)$-categories arxiv.org/abs/2410.19053


Diagrammatic sets admit a notion of internal equivalence in the sense of coinductive weak invertibility, with similar properties to its analogue in strict $\omega$-categories. We construct a model structure whose fibrant objects are diagrammatic sets in which every round pasting diagram is equivalent to a single cell -- its weak composite -- and propose them as a model of $(\infty, \infty)$-categories. For each $n < \infty$, we then construct a model structure whose fibrant objects are those $(\infty, \infty)$-categories whose cells in dimension $> n$ are all weakly invertible. We show that weak equivalences between fibrant objects are precisely morphisms that are essentially surjective on cells of all dimensions. On the way to this result, we also construct model structures for $(\infty, n)$-categories on marked diagrammatic sets, which split into a coinductive and an inductive case when $n = \infty$, and prove that they are Quillen equivalent to the unmarked model structures when $n < \infty$ and in the coinductive case of $n = \infty$. Finally, we prove that the $(\infty, 0)$-model structure is Quillen equivalent to the classical model structure on simplicial sets. This establishes the first proof of the homotopy hypothesis for a model of $\infty$-groupoids defined as $(\infty, \infty)$-categories whose cells in dimension $> 0$ are all weakly invertible.


Physics-informed Neural Networks for Functional Differential Equations: Cylindrical Approximation and Its Convergence Guarantees arxiv.org/abs/2410.18153


We propose the first learning scheme for functional differential equations (FDEs). FDEs play a fundamental role in physics, mathematics, and optimal control. However, the numerical analysis of FDEs has long been hindered by prohibitive computational costs, a problem that has stood for decades. Numerical approximations of FDEs have therefore been developed, but they often oversimplify the solutions. To tackle these two issues, we propose a hybrid approach combining physics-informed neural networks (PINNs) with the cylindrical approximation. The cylindrical approximation expands functions and functional derivatives in an orthonormal basis and transforms FDEs into high-dimensional PDEs. To validate the reliability of the cylindrical approximation for FDE applications, we prove convergence theorems for the approximated functional derivatives and solutions. The derived high-dimensional PDEs are then solved numerically with PINNs. Through the capabilities of PINNs, our approach can handle a broader class of functional derivatives more efficiently than conventional discretization-based methods, improving the scalability of the cylindrical approximation. As a proof of concept, we conduct experiments on two FDEs and demonstrate that our model achieves $L^1$ relative errors of order $\sim 10^{-3}$, typical of PINNs. Overall, our work provides a strong backbone for physicists, mathematicians, and machine learning experts to analyze previously challenging FDEs, thereby democratizing their numerical analysis, which has received limited attention. Code is available at https://github.com/TaikiMiyagawa/FunctionalPINN.
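The core reduction behind the cylindrical approximation -- expand $u$ in an orthonormal basis so a functional $F[u]$ becomes an ordinary function $f(a_1, \dots, a_m)$ of the coefficients, and the functional derivative becomes $\delta F / \delta u(x) \approx \sum_k (\partial f / \partial a_k)\, \varphi_k(x)$ -- can be checked numerically. A minimal sketch (not the paper's code; the functional $F[u] = \int u^2\, dx$, the Legendre basis, and the truncation order are chosen here purely for illustration):

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

# Orthonormal Legendre basis on [-1, 1]: phi_k(x) = sqrt((2k+1)/2) * P_k(x).
def phi(k, x):
    c = np.zeros(k + 1)
    c[k] = 1.0
    return np.sqrt((2 * k + 1) / 2.0) * Legendre(c)(x)

m = 4                                   # truncation order (illustrative choice)
rng = np.random.default_rng(0)
a = rng.normal(size=m)                  # coefficients of u = sum_k a_k phi_k

x = np.linspace(-1.0, 1.0, 4001)
u = sum(a[k] * phi(k, x) for k in range(m))

# F[u] = \int u(x)^2 dx reduces to f(a) = sum_k a_k^2 by orthonormality.
dx = x[1] - x[0]
F_quad = dx * (np.sum(u**2) - 0.5 * (u[0]**2 + u[-1]**2))  # trapezoid rule
f_a = np.sum(a**2)                      # cylindrical (finite-dimensional) form

# Exact functional derivative: dF/du(x) = 2 u(x).
# Cylindrical version: sum_k (df/da_k) phi_k(x) = sum_k 2 a_k phi_k(x).
dF_cyl = sum(2 * a[k] * phi(k, x) for k in range(m))

print(F_quad, f_a)                      # the two evaluations of F agree
print(np.max(np.abs(dF_cyl - 2 * u)))  # derivative mismatch is ~ roundoff
```

For this quadratic functional the cylindrical reduction is exact at any truncation order; for general FDEs the paper's convergence theorems govern how the error decays as $m$ grows, and the resulting $m$-dimensional PDE in the coefficients is what the PINN is trained to solve.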
