feat: Builder API for Sections and Entities #253
zachdaniel merged 26 commits into ash-project:main from
Conversation
This is pretty cool! One major piece of feedback: IMO the field/value pairs should just be options to a function, i.e.

```elixir
Field.new(:type)
|> Field.type(Type.one_of([:string, :integer, :boolean]))
|> Field.required()

# should just be:
Field.new(:type, type: .., required?: ...)
```

I'm also not sure about enumerating all of the types as functions like that. Also, I don't think I see a test for using this in an actual DSL (the tests test the built data), but it would be good to have those, and maybe an example guide on how to use this to define an extension.
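To make the suggestion concrete, here is a minimal sketch of what the options-based constructor could look like. This is hypothetical: `Field.new/2` with a keyword list and these exact option names are assumptions about the proposed API, not the merged one.

```elixir
# Hypothetical sketch of the options-based constructor suggested above;
# Field.new/2, Type.one_of/1, and the option names are assumptions.
defmodule MyExt.Fields do
  alias Spark.Builder.Field
  alias Spark.Builder.Type

  def type_field do
    # One call with a keyword list, instead of a pipeline of setter functions.
    Field.new(:type,
      type: Type.one_of([:string, :integer, :boolean]),
      required?: true
    )
  end
end
```

The upside of the keyword-list form is that all configuration is validated in one place at construction time, rather than spread across many small setter functions.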
Cool!
I also stumbled over … Finally, I suspect the failing …
I think that we should consider refactoring sections/entities to be the same, where it's all based on options lists. We can always add in more functions. For …
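A rough sketch of that "everything is an options list" direction, to illustrate the idea. All module and function names here (`Spark.Builder.Section.new/2`, `Spark.Builder.Entity.new/2`) and their options are assumptions about where the refactor could land, not the actual API:

```elixir
# Hypothetical: sections and entities built from plain options lists,
# mirroring the Field.new(:name, opts) shape proposed above.
section =
  Spark.Builder.Section.new(:resources,
    describe: "Resources exposed by this extension",
    entities: [
      Spark.Builder.Entity.new(:resource,
        args: [:name],
        schema: [name: [type: :atom, required: true]]
      )
    ]
  )
```

Keeping sections, entities, and fields symmetric means one validation path and one mental model for extension authors.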
|
lib/spark/builder/field.ex (Outdated)

```elixir
    %{field | keys: normalized}
  end

  defp apply_opt(field, :keys, value) when is_function(value) do
```
I think we should instead have a Spark.Options schema for entities and sections; that way we don't have to redefine a bunch of the validation logic.
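For illustration, here is one way the suggestion could look: validating the builder's options against a declarative `Spark.Options` schema instead of per-key `apply_opt/3` clauses. The schema contents and the struct fields are illustrative assumptions; only `Spark.Options.validate!/2` is the real validation entry point.

```elixir
# Sketch: a shared Spark.Options schema replacing hand-rolled per-option
# validation clauses. Schema keys and struct fields are illustrative.
defmodule Spark.Builder.Field do
  defstruct [:name, :keys, required?: false]

  @schema [
    keys: [type: {:or, [{:list, :atom}, {:fun, 1}]}, required: false],
    required?: [type: :boolean, default: false]
  ]

  def new(name, opts \\ []) do
    # validate!/2 raises on unknown keys or bad types, so no
    # dedicated apply_opt/3 clause per option is needed here.
    opts = Spark.Options.validate!(opts, @schema)
    struct!(__MODULE__, Keyword.put(opts, :name, name))
  end
end
```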
Not sure I fully follow what you have in mind, but here is my attempt to reduce my own validation logic.
allows optional arg default without schema default
That looks just right! Let's just get the build passing and get it merged 😎
At least I got it reproduced by simulating CI behaviour. The CI action runs …
The CI workflow (ash-ci.yml) runs mix local.hex --force and mix local.rebar --force between the two cheat_sheets steps. This invalidates the Elixir compiler's manifest (compile.elixir), triggering a full recompile. However, stale .beam files from the first step remain in _build/test/lib/spark/ebin/. The BEAM VM auto-loads these stale modules when they're referenced during compilation, and when the compiler then redefines them from source, it emits "redefining module" warnings, which warnings_as_errors: true turns into build failures.
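The failure mode described above can be reproduced locally with roughly this command sequence. The exact step order is inferred from the description of ash-ci.yml and may not match the workflow file verbatim:

```shell
# Rough local reproduction of the CI sequence described above.
MIX_ENV=test mix compile --warnings-as-errors  # first build writes .beam files
mix local.hex --force                          # reinstalls Hex...
mix local.rebar --force                        # ...invalidating the compile manifest
MIX_ENV=test mix compile --warnings-as-errors  # full recompile over stale .beam files
# Expected: "redefining module ..." warnings escalated to a failed build.
```

The key ingredient is that the manifest is invalidated while `_build/test/lib/spark/ebin/` still holds the old `.beam` files, so the recompile and the stale modules collide.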
While this fixes it, maybe we should instead touch the CI workflow (…
Good callout; I've merged that improvement into ash-ci.yml.
Your change here is also fine to leave in, since we define modules in test anyway 😄
🚀 Thank you for your contribution! 🚀
Contributor checklist
Leave anything that you believe does not apply unchecked.
Partially addresses #51 by providing a builder API for Sections and Entities.

This was an experiment to understand Spark better. I thought the builder API might be an interesting way to dive into how the DSL works, and things got a bit out of hand.
More of a discussion PR. Happy to know whether it seems valuable and, if so, whether the API matches expectations and what feature set would be required for incorporation. Otherwise, it can also just be closed.
Caution
Disclaimer: I used AI to discuss the codebase, and some parts were contributed by it (vetted by me, but I'm not an Elixir pro and not confident I understood all the intricacies).
Example usage
Using the DSL