diff --git a/README.md b/README.md
index ea15eeb1e..cb50cfea8 100644
--- a/README.md
+++ b/README.md
@@ -4,6 +4,7 @@
 |:-------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------:|
 | [![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://structuralequationmodels.github.io/StructuralEquationModels.jl/) [![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://structuralequationmodels.github.io/StructuralEquationModels.jl/dev/) | [![Project Status: Active – The project has reached a stable, usable state and is being actively developed.](https://www.repostatus.org/badges/latest/active.svg)](https://www.repostatus.org/#active) [![Github Action CI](https://github.com/StructuralEquationModels/StructuralEquationModels.jl/workflows/CI_extended/badge.svg)](https://github.com/StructuralEquationModels/StructuralEquationModels.jl/actions/) [![codecov](https://codecov.io/gh/StructuralEquationModels/StructuralEquationModels.jl/branch/main/graph/badge.svg?token=P2kjzpvM4V)](https://codecov.io/gh/StructuralEquationModels/StructuralEquationModels.jl) | [![DOI](https://zenodo.org/badge/228649704.svg)](https://zenodo.org/badge/latestdoi/228649704) |
+# What is this Package for?
 
 This is a package for Structural Equation Modeling.
 It is still *in development*.
 
@@ -14,6 +15,8 @@ Models you can fit include
 - Multigroup SEM
 - Sums of arbitrary loss functions (everything the optimizer can handle).
 
+# What are the merits?
+
 We provide fast objective functions, gradients, and for some cases hessians as well as approximations thereof.
 As a user, you can easily define custom loss functions.
 For those, you can decide to provide analytical gradients or use finite difference approximation / automatic differentiation.
@@ -21,7 +24,8 @@ You can choose to mix and match loss functions natively found in this package an
 In such cases, you optimize over a sum of different objectives (e.g. ML + Ridge).
 This mix and match strategy also applies to gradients, where you may supply analytic gradients or opt for automatic differentiation or mix analytical and automatic differentiation.
 
-You may consider using this package if:
+# You may consider using this package if:
+
 - you want to extend SEM (e.g. add a new objective function) and need an extendable framework
 - you want to extend SEM, and your implementation needs to be fast (because you want to do a simulation, for example)
 - you want to fit the same model(s) to many datasets (bootstrapping, simulation studies)
@@ -33,7 +37,7 @@ The package makes use of
 - Optim.jl and NLopt.jl to provide a range of different Optimizers/Linesearches.
 - FiniteDiff.jl and ForwardDiff.jl to provide gradients for user-defined loss functions.
 
-At the moment, we are still working on
+# At the moment, we are still working on:
 - optimizing performance for big models (with hundreds of parameters)
 
 # Questions?
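To make the "sum of loss functions" idea in the README text concrete, here is a minimal, self-contained Julia sketch. It deliberately does not use the StructuralEquationModels.jl API; it only illustrates the underlying mechanics with the backends the README names (Optim.jl for optimization, ForwardDiff.jl for automatic differentiation). The toy objective (least squares plus a ridge penalty) and all names (`X`, `y`, `θ`, `λ`) are invented for the example.

```julia
# Illustrative sketch only -- not the StructuralEquationModels.jl API.
# Shows one objective with an analytic gradient (least squares) summed with
# a second objective whose gradient comes from ForwardDiff (ridge penalty),
# then handed to Optim.jl as a single objective/gradient pair.

using Optim, ForwardDiff, LinearAlgebra

# toy data: fit θ so that X*θ ≈ y, with a ridge penalty on θ
X = randn(100, 5)
θ_true = [1.0, -2.0, 0.5, 0.0, 3.0]
y = X * θ_true + 0.1 * randn(100)
λ = 0.1

loss_ls(θ)    = 0.5 * sum(abs2, X * θ - y)   # analytic gradient available
grad_ls(θ)    = X' * (X * θ - y)
loss_ridge(θ) = 0.5 * λ * sum(abs2, θ)       # gradient obtained via AD below

total_loss(θ) = loss_ls(θ) + loss_ridge(θ)

function total_grad!(g, θ)
    # mix analytic and automatic differentiation, summand by summand
    g .= grad_ls(θ) .+ ForwardDiff.gradient(loss_ridge, θ)
    return g
end

res = Optim.optimize(total_loss, total_grad!, zeros(5), LBFGS())
println(Optim.minimizer(res))
```

The point is simply that as long as each summand supplies an objective value and a gradient (however that gradient is computed), the optimizer only ever sees their sum.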