The inaugural SciMLCon of the Scientific Machine Learning Open Source Software Community is focused on the development and applications of the Julia-based SciML tooling. Check out the call for proposals. SciMLCon presentations can range from introductory to advanced, with speakers from industry and academia.
High Performance and Feature-Filled Differential Equation Solving. DifferentialEquations.jl is a library for solving ordinary differential equations (ODEs), stochastic differential equations (SDEs), delay differential equations (DDEs), differential-algebraic equations (DAEs), and hybrid differential equations, which include multi-scale models and mixtures with agent-based simulations. The templated implementation allows arbitrary array and number types to be used, giving compatibility with arbitrary-precision floating point numbers, GPU-based computations, unit-checked arithmetic, and other features. DifferentialEquations.jl is designed for high performance on both large- and small-scale problems, and routinely benchmarks at the top of the pack.
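As a minimal sketch of the workflow (following the library's documented ODE tutorial; assumes DifferentialEquations.jl is installed), solving a scalar linear ODE looks like:

```julia
using DifferentialEquations

# Scalar linear ODE: du/dt = 1.01u with u(0) = 0.5 on t ∈ [0, 1]
f(u, p, t) = 1.01 * u
prob = ODEProblem(f, 0.5, (0.0, 1.0))

# Tsit5 is a good default explicit Runge–Kutta method
sol = solve(prob, Tsit5(), reltol = 1e-8, abstol = 1e-8)

# Solutions act as continuous functions via dense-output interpolation
u_half = sol(0.5)
```

Because `u0` here is a plain `Float64`, swapping in a `BigFloat`, a GPU array, or a unitful quantity follows the same pattern thanks to the generic implementation.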
Physics-Informed Model Discovery and Learning. SciML contains a litany of modules for automating the process of model discovery and fitting. Tools like DiffEqParamEstim.jl and DiffEqBayes.jl provide classical maximum likelihood and Bayesian estimation for differential equation based models, while DiffEqFlux.jl enables the training of embedded neural networks inside of differential equations (neural differential equations or universal differential equations) for discovering unknown dynamical equations. DataDrivenDiffEq.jl estimates Koopman operators (DMD) and uses methods like SINDy to turn time-series data into symbolic differential equations, and ReservoirComputing.jl provides Echo State Networks that learn to predict the dynamics of chaotic systems.
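As a hedged sketch of the classical estimation path (based on the documented DiffEqParamEstim.jl workflow; the toy model and synthetic data here are illustrative assumptions), recovering an unknown growth rate from time-series data looks like:

```julia
using DifferentialEquations, DiffEqParamEstim, Optim

# Toy model du/dt = p[1]*u with an unknown growth rate p[1]
f(u, p, t) = p[1] * u
prob = ODEProblem(f, 0.5, (0.0, 1.0), [1.0])

# Synthetic "measurements" generated with the true parameter 1.01
t = collect(range(0.0, 1.0, length = 20))
data = 0.5 .* exp.(1.01 .* t)

# build_loss_objective turns the data misfit into an optimizable cost
cost = build_loss_objective(prob, Tsit5(), L2Loss(t, data);
                            maxiters = 10_000, verbose = false)

# One unknown parameter, so a 1-D bracketed search suffices
result = Optim.optimize(cost, 0.0, 2.0)
```

The same cost function plugs into any optimizer, and DiffEqBayes.jl replaces the point estimate with a full posterior over the parameters.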
A Polyglot Userbase. While the majority of the tooling for SciML is built using the Julia programming language, SciML is committed to ensuring that these methodologies can be used throughout the greater scientific community. Tools like diffeqpy and diffeqr bridge the DifferentialEquations.jl solvers to Python and R respectively, and we hope to see many more developments along these lines in the near future.
Compiler-Assisted Model Analysis and Sparsity Acceleration. Scientific models generally have structure, such as locality, which leads to sparsity in the program that can be exploited for major performance gains. SciML builds a set of interconnected tools for generating numerical solver code directly from the models being simulated. SparsityDetection.jl can automatically detect the sparsity patterns of Jacobians and Hessians from arbitrary source code, while ModelingToolkit.jl can rewrite differential equation models to rearrange equations for better stability and automatically parallelize code. These tools then connect with affiliated packages like SparseDiffTools.jl to accelerate solving with DifferentialEquations.jl and training with DiffEqFlux.jl.
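A sketch of the symbolic path (following the ModelingToolkit.jl tutorials; exact API details vary across ModelingToolkit versions, so treat this as illustrative): the Lorenz system is specified as equations, and the solver code, including an analytical sparse Jacobian, is generated from them.

```julia
using ModelingToolkit, OrdinaryDiffEq

# Symbolically specify the Lorenz system
@parameters t σ ρ β
@variables x(t) y(t) z(t)
D = Differential(t)

eqs = [D(x) ~ σ * (y - x),
       D(y) ~ x * (ρ - z) - y,
       D(z) ~ x * y - β * z]

@named sys = ODESystem(eqs)

# jac = true generates a symbolic (analytically derived) Jacobian;
# sparse = true emits it in sparse form for use by stiff solvers
prob = ODEProblem(sys,
                  [x => 1.0, y => 0.0, z => 0.0],
                  (0.0, 10.0),
                  [σ => 10.0, ρ => 28.0, β => 8 / 3];
                  jac = true, sparse = true)
sol = solve(prob, Rodas5())
```

Because the model is symbolic, the same system definition can be analyzed, simplified, or retargeted without rewriting the numerics by hand.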
ML-Assisted Tooling for Model Acceleration. SciML supports the development of the latest ML-accelerated toolsets for scientific machine learning. Methods like Physics-Informed Neural Networks (PINNs) and Deep BSDE methods for solving 1000-dimensional partial differential equations are productionized in the NeuralPDE.jl library. Surrogate-based acceleration methods are provided by Surrogates.jl.
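The surrogate workflow can be sketched as follows (using the documented Surrogates.jl API; the `expensive` function here is a hypothetical stand-in for a costly simulation):

```julia
using Surrogates

# Hypothetical stand-in for an expensive simulation
expensive(x) = exp(-x) * sin(2x)

lb, ub = 0.0, 2π
xs = sample(30, lb, ub, SobolSample())  # quasi-random design points
ys = expensive.(xs)

# Fit a radial-basis surrogate; evaluating it is far cheaper
# than re-running the expensive model
rb = RadialBasis(xs, ys, lb, ub)
approx = rb(1.0)
```

The fitted surrogate can then stand in for the full model inside optimization or uncertainty-quantification loops.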
Differentiable Scientific Data Structures and Simulators. The SciML ecosystem contains pre-built scientific simulation tools along with data structures for accelerating the development of models. Tools like LabelledArrays.jl and MultiScaleArrays.jl make it easy to build large-scale scientific models, while other tools like NBodySimulator.jl provide ready-to-use, full-scale simulators.
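For example (using the documented LabelledArrays.jl API), an `LVector` behaves like a plain `Vector` for the solvers while letting model code refer to components by name:

```julia
using LabelledArrays

# Named components backed by ordinary array storage
u = LVector(x = 1.0, y = 0.0, z = 0.0)

u.x        # field access...
u[1]       # ...and plain indexing refer to the same element
u.y = 2.0  # mutation works through either interface
```

Because it is still an `AbstractVector`, such a state can be passed directly to DifferentialEquations.jl while keeping the model equations readable.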
Tools for Accelerated Algorithm Development and Research:
SciML is an organization dedicated to advancing state-of-the-art research in both numerical simulation methods and methodologies in scientific machine learning. Many tools throughout the organization automate the process of benchmarking and testing new methodologies to ensure they are safe and battle-tested, both to accelerate the translation of new methods into publications and to get them into users' hands. We invite the larger research community to make use of tooling like DiffEqDevTools.jl and our large suite of wrapped algorithms to quickly test and deploy new algorithms.
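As a hedged sketch of that benchmarking workflow (based on how DiffEqDevTools.jl is used in the SciMLBenchmarks; treat the exact call as illustrative), a work-precision measurement compares runtime against achieved error across tolerances:

```julia
using OrdinaryDiffEq, DiffEqDevTools

f(u, p, t) = 1.01 * u
prob = ODEProblem(f, 0.5, (0.0, 1.0))

# A very-low-tolerance reference solution to measure error against
ref = TestSolution(solve(prob, Vern9(); abstol = 1e-14, reltol = 1e-14))

abstols = 1.0 ./ 10.0 .^ (3:6)
reltols = 1.0 ./ 10.0 .^ (3:6)

# Work–precision data for one method; a new algorithm would be
# benchmarked the same way against the established solvers
wp = WorkPrecision(prob, Tsit5(), abstols, reltols; appxsol = ref)
```

Running the same measurement for a newly proposed method gives an immediate, like-for-like comparison against the existing solver suite.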
Thank you to all our sponsors! We acknowledge support from:
* Chan Zuckerberg Initiative
* Individual sponsors via Github and NumFOCUS donations
* Wellcome Trust