
Synchronize with OKH v1 & OKH LOSH #60

Open
hoijui opened this issue Sep 28, 2023 · 1 comment

hoijui commented Sep 28, 2023

Hello! :-)
I had a call with @Jbutler-helpful and @ElijahAhianyo yesterday.
Thank you both, especially James!
I know I have been a pain in the ass in the past (and also somewhat in this call), but .. yesterday, for the first time, I felt a helpful attitude in me. ;-)

How to unite our OKH versions

This whole message represents my view only; I am not with IoPA.
For why anyone should care anyway, see the section "Who am I (related to OKH)" below (not so important though).

  • OKH v1 is a set of YAML files, specified by a schema (the most rigorous version of which comes from the LOSH team, in JSON Schema format)
  • OKH LOSH is a further development of v1; it:
    • is much more refined
    • has many bugs fixed that were in v1
    • uses TOML files (though YAML and JSON could also be used)
    • the native format, though (which the TOML files are converted to), is RDF.
      We mostly use RDF/Turtle, but again, it could just as well be JSON-LD or any of the other RDF syntaxes (see the small conversion sketch below).
      Roughly speaking, we use RDF for the same reasons, benefits and function for which you use WikiBase, but ...

-> WikiBase: baaaad! (see section "WikiBase/WikiMedia rant" below for details)
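To make the RDF bullet above a bit more concrete, here is a minimal sketch of the "TOML manifest -> RDF" conversion idea in Python. The manifest field names and the okh: namespace IRI are placeholders I made up for illustration; they are not the actual OKH-LOSH ontology terms.

```python
# Minimal sketch of the "TOML manifest -> RDF" idea; field names and the
# okh: namespace are placeholders, NOT the real OKH-LOSH ontology.
import tomllib  # Python 3.11+; use the "tomli" package on older versions

from rdflib import Graph, Literal, Namespace, URIRef

OKH = Namespace("https://example.org/okh#")  # hypothetical namespace

def manifest_to_turtle(toml_path: str, project_iri: str) -> str:
    """Read a TOML manifest and serialize it as RDF/Turtle."""
    with open(toml_path, "rb") as f:
        manifest = tomllib.load(f)

    g = Graph()
    g.bind("okh", OKH)
    subject = URIRef(project_iri)

    # Map flat, scalar manifest keys onto properties of the placeholder ontology.
    for key, value in manifest.items():
        if isinstance(value, (str, int, float, bool)):
            g.add((subject, OKH[key], Literal(value)))

    # The same graph could just as well be serialized as JSON-LD etc.
    return g.serialize(format="turtle")
```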

Apart from that, I like your atomization principle a lot!
It is a good, and simple-enough-to-be-viable step forward.
It was clear to us that the BoM area is a big problem, hindering anyone from doing what you are doing, but we did not have the time, resources or even ideas for how to approach it. ... great work; thank you, and ride on!!!

My dream-scenario would be:

  1. using OKH-LOSH (which also boasts more than 10 times the number of projects at the moment, including all the v1 projects) as a base, instead of OKH v1
  2. using RDF instead of WikiBase (or nothing of the sort, like v1)
  3. using your atomization approach
  4. creating a separate standard for BoM atomization, with dedicated tooling

With point 4, I am looking to prevent lock-in, and at the same time to spread the atomizability of BoMs across a wider range of projects. What I am looking for is a standard that basically says:

  • a BoM has to be machine-readable
  • ... in one of these supported formats:
    • CSV
    • JSON
    • a Markdown table (maybe? :/ )
  • for each item:
    • it needs to supply at least an ID, unique within ... the project(?)
    • recommended but optional: a human-readable name and a URL (see the small example right after this list)
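To make those minimum requirements concrete: a conforming BoM could be as small as the (made-up) CSV below. The id/name/URL column names match the three-column target format mentioned just below; the second row simply omits the optional URL.

```csv
id,name,URL
bolt-m3x10,M3x10 hex bolt,https://example.org/parts/bolt-m3x10
frame-panel,Laser-cut frame panel,
```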

And a stand-alone tool (CLI & web?) that takes a random project's BoM and tries to convert it to a CSV file with the three columns id, name, URL, or an equivalent JSON version of that.
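A hedged sketch of what such a converter could look like (the CLI shape, the alias table and the JSON layout it expects are my own assumptions; no such tool exists yet):

```python
# Sketch of the proposed BoM normalizer: read a CSV or JSON BoM and emit
# a CSV with the three target columns id, name, URL.
import csv
import json
import sys
from pathlib import Path

# Column names a source BoM might plausibly use for each target column (guesses).
ALIASES = {
    "id": ("id", "part", "part_id", "ref"),
    "name": ("name", "title", "description"),
    "URL": ("url", "link", "source"),
}

def load_rows(path: Path) -> list[dict]:
    """Read a CSV or JSON BoM into a list of raw row dicts."""
    if path.suffix.lower() == ".json":
        data = json.loads(path.read_text(encoding="utf-8"))
        # Assume either a top-level list of items or an "items" key.
        return data if isinstance(data, list) else data.get("items", [])
    with path.open(newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def normalize(row: dict) -> dict:
    """Map one source row onto the id/name/URL target columns."""
    lowered = {k.lower(): v for k, v in row.items()}
    return {
        target: next((lowered[c] for c in candidates if c in lowered), "")
        for target, candidates in ALIASES.items()
    }

def main() -> None:
    rows = [normalize(r) for r in load_rows(Path(sys.argv[1]))]
    writer = csv.DictWriter(sys.stdout, fieldnames=["id", "name", "URL"])
    writer.writeheader()
    writer.writerows(rows)

if __name__ == "__main__":
    main()
```

Used as e.g. `python bom_normalize.py path/to/bom.json > bom.csv` (the script name is just a placeholder). Anything that round-trips through such a tool without losing information would count as "machine-readable" in the above sense.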

This standard and tool would be useful for anyone trying to do something similar to what you are doing, which, as far as I know, is quite a few people at the moment. It would then allow anyone to check their project, set of projects or project-hosting platform against that standard, and if it is not supported, to change their format or to extend the standard's tool to make it understand their format.

In short, it is a way to make more of the projects out there compatible with your atomization, and at the same time to help other projects.
We try to use this approach as much as possible within OSEG, because .. let's be honest: a huge portion of the projects and software we work on will be dead 10 years after we started them. If we make the simple, basic but labor-intensive parts as reusable as possible, it is more likely that at least part of our work will still be used and useful in the future, and others (or we ourselves) will be able to focus on cool ways to handle OSH projects - with the data quite easily available. It is basically providing steady, firm shoulders for giants to stand on.

And of course, this approach (if we get a few entities to hop on board with it) - as a side effect - forms a kind of OS BoM specialists committee, spread over many organizations, which could in the future also extend the format with more optional fields in a well-defined manner.

To wrap up:
As OKH LOSH has only benefits over v1, we see it getting more usage than v1. If we combined the best of OKH LOSH and OKH helpful, the same would happen to that version, and to me (and I think not even IoPA would say much against this), that really is more important than it being officially accepted as the next OKH version.
I would be happy to work towards that together.

Who am I (related to OKH)

I am the main dev/tech guy of OKH-LOSH, which is a further development of OKH v1.
Our version is much more refined and practically usable than v1. So much so that IoPA (the stewards of OKH v1) also use our version over their own. It is not the official OKH v2, because ... IoPA is ... I can not name it. Slow/bureaucratic/understaffed/... something. We did (and do) try to get there!
While developing OKH-LOSH, I necessarily learned a lot about OKH v1, and I am confident to say that I am probably the person who knows most about it (on a technical level), including anyone who is currently in contact with it or with IoPA. I am not with IoPA myself, so I can not speak for them, but I can talk about OKH v1 and what makes most sense in general. I am in relatively close contact with them, and if they need a tech guy for OKH ... anything, it is likely that they'd come to me, because ... other than me, there is only Max, and he does not have any time (nor is he as deep in this as I am).

WikiBase/WikiMedia rant

Our RDF approach is similar to your WikiBase approach.
I can also understand why you came to WikiBase, as it has more of a ... end-user exterior, and provides some things that seem good at first (like the unique IDs, an automatic web view of things, and other comforts).
We started out with WikiBase, and .. to summarize: in our experience, the switch from WikiBase to RDF is like going from sending ZIP files of different versions of a piece of software back and forth between devs during development, to using git. We had a two-year ordeal dealing with WikiMedia, which was terrible both on the technical and on the organizational/legal level.
RDF is just a set of standards and formats. Just as it is itself a distributed graph-DB format, its software landscape is distributed too. The standard comes from Sir Tim Berners-Lee (father of HTTP), and is hosted by the W3C. WikiBase, on the other hand, is centrally developed and is a centralized data model, without any federation between WikiBase instances (if you find any other info, it is a lie). Also, the only devs who can realistically extend the WikiBase software are WikiMedia's own devs. They and their non-tech entourage may promise many things, to develop this and add that feature, but in reality they simply can't, because they don't have the time (and their software is a hot mess, plus their internal structure is in some ways worse than a corporation in terms of bureaucracy and whatnot). We paid them money (quite a large amount) and had a contract, but they still did not fulfill more than 10% of the agreed-upon tasks.


hoijui commented Dec 20, 2023

I started a repo to work together on a BoM standard, as mentioned above:
https://codeberg.org/oseg/open-bom

So far it is very limited .. mostly it is just there to give an idea of what it could be.
A lot of questions need answering to move forward.
