[Discuss] Proposal: Open Source Hardware Score/Index

Antoine, as a contact of a free smallwindturbine project smallwindturbineproj.contactor at gmail.com
Sat Feb 28 08:38:05 UTC 2015


Hi all.

If relevant:
In the list of questions for Mario's "index", I wonder whether one has
not yet been listed:

   - *total rights of authorship*: the big question is how to be sure that
   the person (or entity) who delivers the "non-software thing" has the full
   capability to do so and the full rights over the "non-software thing".

Until we have certainty about that capability and the full coverage of
authorship rights of the person or entity who delivers the thing, users of
such material cannot swim safely in potential red-ocean or blue-ocean
business fields, can they?

Freely (and quietly),

Antoine

2015-02-27 23:57 GMT+01:00 Mario Gómez <mxgxw.alpha at gmail.com>:

> Emilio: Make your proposal; I think that we need to try several approaches
> to see which one is easiest for the community to use and understand. I never said
> that I have the final answer on this. (We all already know it's 42) ;)
>
> Regards,
> Mario.
>
>
> On Fri, Feb 27, 2015 at 3:13 PM, Emilio Velis <contacto at emiliovelis.com>
> wrote:
>
>> I agree with David. What we could consider in a qualitative style of
>> assessment is to evaluate whether or not something is OSHW (checklists give
>> you yes/no results), while a quantitative evaluation instrument would be
>> used to know how open a hardware design is. Each variable has more options
>> based on a standard, with corresponding weighted values according to what we
>> consider to be the most open standard. We could take, for example, Open
>> Source Ecology, holder of OSHWA's Openness Award 2015, as the most open project
>> in the industry (this is just an example), and evaluate any other project
>> around the world on a scale from 1 to Marcin.
>>
>> Then you can compare indexes by industry, location and other variables.
>>
>> For a quantitative evaluation it would be more interesting to evaluate,
>> instead of "is the source code available?", aspects such as where it is
>> available, how easily it is distributed, whether it is a fork of another
>> project, whether the source is standardized, formats, documentation, level of
>> detail, etc.
>>
>> If you think about it, an assessment of this sort should be made by a
>> specialized party, and be certifiable by OSHWA.
>>
>> I'll work on an idea based on Mario's index. Would you be interested,
>> Mario? Or is anyone interested in developing a paper on this? This sounds
>> like an awesome research project! I'm all up for it!
>>
>> On Friday, February 27, 2015, David A. Mellis <dmellis at gmail.com>
>> wrote:
>>
>> I agree that it’s important to provide consistent guidelines and
>>> standards, and make them as clear and easy to follow / evaluate as
>>> possible. And I think the OSHW definition is a good standard to start with
>>> (although we may want to tweak / improve it at some point if necessary).
>>> Again, any suggestions on how to better communicate the definition (both to
>>> hardware makers and to users) is welcome.
>>>
>>> I’d be curious to hear other people’s opinion on establishing other
>>> standards to complement the OSHW definition. What do you all think about
>>> trying to define a partially open standard or a more pure OSHW standard?
>>>
>>> David
>>>
>>> On Feb 27, 2015, at 12:34 PM, Mario Gómez <mxgxw.alpha at gmail.com> wrote:
>>>
>>> Hi David!
>>>
>>> Thanks for your comments,
>>>
>>> At the beginning I was thinking of the score as a measure of the level of
>>> compliance with the definition and not a black-and-white classification.
>>> However, from the way the OSHW definition is worded you can practically
>>> guess which situations directly prevent a project from being "pure OSHW",
>>> and that's what I tried to include in the questions.
>>>
>>> However, I consider that the score itself is not the most important thing;
>>> it's the way the score gives you insightful recommendations, and a
>>> system that allows the crowd to validate compliance.
>>>
>>> The other thing that I really wanted from the score is that it could
>>> serve for certification purposes. For example, you cannot state "my
>>> business complies 99.99% with the ISO 9001 requirements"; I mean, you can
>>> say it, but no one is going to take you seriously. And to be honest, for
>>> OSHW the line between what-is and what-is-not must be drawn somewhere. I would
>>> think that for many shady manufacturers it's really convenient that there isn't
>>> any clear line drawn yet, because they can market their products as Open
>>> Source Hardware without following the spirit of the OSHW definition.
>>>
>>> As for the levels, personally I think they add more confusion to the issue
>>> of what is and is not open source hardware; however, they can be helpful for
>>> guiding designers on what they need to do if they (ever) want to
>>> release their designs as OSHW.
>>>
>>> My point is: if we, as members of the OSHWA, do not draw a line somewhere
>>> and use some tool that allows us to do it in a consistent, replicable and
>>> transparent way... then someone else is going to do it, and there is a risk
>>> that they draw the line in a place where there is no true intent of
>>> following the "Open Source Hardware" philosophy that the OSHWA tries
>>> to promote.
>>>
>>> Regards,
>>> Mario.
>>>
>>>
>>> On Fri, Feb 27, 2015 at 11:05 AM, David A. Mellis <dmellis at gmail.com>
>>> wrote:
>>>
>>>> To me, it’s confusing for the required criteria to yield a score, if
>>>> only a perfect score counts as OSHW. That is, a naive reader might think
>>>> that 13/15 or 14/15 is a good score, even though we wouldn’t consider the
>>>> project OSHW. To me, it seems like we’re better off using a checklist
>>>> approach instead, i.e. these are all the things you have to do to be
>>>> considered OSHW. OSHWA has some things like that already:
>>>>
>>>> http://www.oshwa.org/wp-content/uploads/2014/08/oshwchecklist.pdf
>>>> http://www.oshwa.org/wp-content/uploads/2014/08/OSHW-May-and-Must.pdf
>>>>
>>>> Although suggestions are always welcome.
>>>>
>>>> A related approach I’d love feedback on is whether there is a
>>>> well-defined and agreed on set of practices that could constitute either a
>>>> weaker or stronger standard than our current OSHW definition.
>>>>
>>>> For example, can we imagine trying to establish a meaning of “partially
>>>> open” hardware — e.g. hardware for which design files are released but
>>>> under a more restrictive license than OSHW; hardware for which some files
>>>> (like schematic PDFs) are released but not others (like the actual design
>>>> files). This is still more open than many pieces of hardware, so it might
>>>> be worth trying to recognize these efforts, even if they’re not fully OSHW.
>>>> Thoughts?
>>>>
>>>> In the other direction, could we imagine something like a “pure
>>>> open-source hardware” standard, e.g. hardware which is designed using
>>>> open-source software tools, and which only uses, say, components that are
>>>> standard / widely available / publicly documented?
>>>>
>>>> David
>>>>
>>>> On Feb 25, 2015, at 7:12 AM, Mario Gómez <mxgxw.alpha at gmail.com> wrote:
>>>>
>>>> Hi Ben,
>>>>
>>>> That's the idea of the proposed score: there is a set of questions that
>>>> evaluate compliance against the OSHW definition. Your project must meet the
>>>> required score of 15/15 to be considered OSHW.
>>>>
>>>> The reason why it is a score instead of a simple evaluation of compliance
>>>> is that I was thinking it must also work as a tool for the beginner
>>>> who wants to develop OSHW, and as a guide to which changes are needed to be
>>>> compliant. Currently, in the way the score is designed, you must have 15 of
>>>> 15 points of compliance to be considered OSHW; if you don't meet all of them,
>>>> well... then your project simply isn't OSHW. However, you'll know after the
>>>> evaluation how far your project is from the goal; it's not the same
>>>> to get a score of 1 as a score of 14. The system later would underline
>>>> the things that you failed to comply with and (hopefully) give you a guide or
>>>> ideas about what to do.
>>>>
>>>> After the 15 "required" points there are 7 additional points that
>>>> evaluate good practices. The idea of including these in the calculation of
>>>> the score is that in some ways it is easy to comply with the definition, but
>>>> that doesn't guarantee that you are following good practices. So, once
>>>> you've got the 15 required points, the extra points help you to know whether
>>>> you are following best practices and adding value to your project by
>>>> generating good and accessible documentation.
>>>>
>>>> Also, I think the definition is pretty clear about what things prevent
>>>> a project from being considered OSHW, and the questions of the score were
>>>> elaborated that way, following the definition.
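A minimal sketch, in Python, of the scoring logic described above. The placeholder booleans stand in for the actual questionnaire answers (the real questions are not reproduced here); the point is simply that only a perfect 15/15 on the required points qualifies as OSHW, while the 7 bonus points measure good practices:

```python
def evaluate(required_answers, bonus_answers):
    """required_answers: 15 booleans; bonus_answers: 7 booleans."""
    required_score = sum(required_answers)
    bonus_score = sum(bonus_answers)
    # Only a perfect required score counts as OSHW; anything short fails,
    # but the gap tells you how far the project is from the goal
    # (a 14 is much closer than a 1).
    is_oshw = required_score == len(required_answers)
    return {"required": required_score,
            "bonus": bonus_score,
            "is_oshw": is_oshw,
            "total": required_score + bonus_score}

# A project missing one required point fails, however close it is.
almost = evaluate([True] * 14 + [False], [True] * 3 + [False] * 4)
# A fully compliant project also earns bonus points for best practices.
compliant = evaluate([True] * 15, [True] * 7)
```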
>>>>
>>>> Regards,
>>>> Mario.
>>>>
>>>>
>>>> On Wed, Feb 25, 2015 at 1:55 AM, Ben Gray <ben at phenoptix.com> wrote:
>>>>
>>>>> Although I like the idea of an index, it seems to be enough of a
>>>>> problem (even on this list) to recognise what constitutes Open Source
>>>>> Hardware or not. I feel that adding an index or score could muddy the
>>>>> waters even more.
>>>>> However it could add to understanding if the compliance elements are
>>>>> stressed and failure underlined rather than a low score given.
>>>>>
>>>>> --
>>>>>
>>>>> Best Regards
>>>>>
>>>>> Ben Gray - Director
>>>>>
>>>>>
>>>>>
>>>>> www.phenoptix.com
>>>>> twitter.com/phenoptix
>>>>> plus.google.com/+phenoptix
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On 25 February 2015 at 07:16, Jeffrey Warren <jeff at publiclab.org>
>>>>> wrote:
>>>>>
>>>>>> So one thing I like about the contrib.json file is that it'd have a
>>>>>> BOM requirement with potentially optional things like prices, links for
>>>>>> where to buy materials, etc.
>>>>>>
>>>>>> I had some ideas (talking with RJ Steinert
>>>>>> <http://publiclab.org/profile/rjstatic> of Farm Hack) about how a
>>>>>> more Bower- or NPM-style utility could parse such files... these are just
>>>>>> roughly sketched out ideas -- say we called it "newt":
>>>>>>
>>>>>>    - newt init -- would run a text-based questionnaire to generate
>>>>>>    contrib.json file
>>>>>>    - newt compile bom -- aggregate/merge BOMs of nested projects
>>>>>>    - newt compile bom <string> -- aggregate/merge BOMs with links
>>>>>>    matching provided string like "digikey.com"
>>>>>>    - newt compile price <int> -- calculate unit price for int units
>>>>>>    - newt compile contributors -- compile contributors of nested
>>>>>>    projects
>>>>>>    - newt register -- makes searchable, tests for presence of req'd
>>>>>>    docs, clones repos or zips
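A rough sketch, in Python, of how a utility like the hypothetical "newt compile bom" above might aggregate BOMs from nested projects' contrib.json files. The file layout and field names ("bom", "projects", "link") are assumptions for illustration only, not a published format:

```python
import json

def merge_boms(project, match=None):
    """Recursively collect BOM entries from a project and its nested
    sub-projects, optionally keeping only entries whose purchase link
    contains a given substring (e.g. "digikey.com")."""
    bom = list(project.get("bom", []))
    for sub in project.get("projects", []):   # nested sub-projects
        bom.extend(merge_boms(sub))
    if match:
        bom = [item for item in bom if match in item.get("link", "")]
    return bom

# A toy contrib.json for a project with one nested sub-project.
contrib = json.loads("""
{
  "name": "spectrometer",
  "bom": [{"part": "DVD-R grating", "link": "https://example.com/dvd"}],
  "projects": [
    {"name": "housing",
     "bom": [{"part": "M3 screws", "link": "https://digikey.com/m3"}]}
  ]
}
""")

full_bom = merge_boms(contrib)                       # all parts, merged
digikey_only = merge_boms(contrib, match="digikey.com")  # filtered by vendor
```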
>>>>>>
>>>>>> Updated my post in the comments here, where there's also been some
>>>>>> discussion about versioning:
>>>>>> http://publiclab.org/notes/warren/02-24-2015/standardizing-open-source-hardware-publication-practices-with-contributors-json#c11215
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Tue, Feb 24, 2015 at 7:35 PM, Roy Nielsen <amrset at gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hello,
>>>>>>>
>>>>>>> One possibility would be to require a "BOM", or bill of materials,
>>>>>>> for an OSHWA-certified design.  Perhaps something like the
>>>>>>> following for an embedded board:
>>>>>>>
>>>>>>> * contributors.json
>>>>>>> * Project BOM - in the part descriptions, includes whether a part
>>>>>>> is open source or closed source (i.e. processors, complex chips, etc.)
>>>>>>> * Schematics list - including descriptions & whether the schematics are
>>>>>>> modifiable (i.e., not PDF)
>>>>>>> * License
>>>>>>> * Hardware Design Documentation
>>>>>>> * Software Design Documentation & License (if applicable, like
>>>>>>> firmware)
>>>>>>> * Connectors - if they are open design/interface
>>>>>>>
>>>>>>> anything else?
>>>>>>>
>>>>>>> The score could possibly be based on which of the above are available.
>>>>>>>
>>>>>>> Regards,
>>>>>>> -Roy
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Feb 24, 2015 at 4:15 PM, Pablo Kulbaba <
>>>>>>> pablokulbaba at gmail.com> wrote:
>>>>>>>
>>>>>>>>  On validation via a community or a specific group of people:
>>>>>>>> maybe the initial open community can provide a seed stock of educated
>>>>>>>> people who later form a trusted group that grants a further
>>>>>>>> certification.
>>>>>>>>
>>>>>>>> PS: I had to look up JSON.
>>>>>>>>
>>>>>>>> On 24/02/2015 08:00 p.m., Mario Gómez wrote:
>>>>>>>>
>>>>>>>>  @jeff:
>>>>>>>>
>>>>>>>>  That's great! It can even work both ways: if you already have a
>>>>>>>> JSON file, you can provide the URL to automatically calculate the indicator
>>>>>>>> for your project, and vice versa: if you complete the questionnaire, it could
>>>>>>>> automatically generate the JSON file that you can include in your project,
>>>>>>>> as you propose. That would be easy to do.
>>>>>>>>
>>>>>>>>  Sadly I'm a little busy this week, but let me see if I can program
>>>>>>>> a functional prototype over the next month so we can experiment with how
>>>>>>>> it could work. (I would not mind if someone else wants to help.)
>>>>>>>>
>>>>>>>> @Javier:
>>>>>>>>
>>>>>>>>  I personally like the idea of the community, because if the
>>>>>>>> process is straightforward, verifiable and transparent, what matters is the
>>>>>>>> result of the evaluation system and not the person or group of people doing
>>>>>>>> the evaluation. This is assuming that the evaluation system provides means
>>>>>>>> to minimize/prevent abuses (that's why I consider it important to also
>>>>>>>> implement a meta-evaluation system).
>>>>>>>>
>>>>>>>> However... being certified by a trusted group of people is
>>>>>>>> really important, and I think that the OSHWA could be an appropriate group
>>>>>>>> to do that. But let's hear more opinions; I think that it's possible to
>>>>>>>> build something simple that helps people follow the OSHW philosophy in
>>>>>>>> their projects.
>>>>>>>>
>>>>>>>> Regards,
>>>>>>>> Mario.
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Feb 24, 2015 at 3:54 PM, Jeffrey Warren <jeff at publiclab.org
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> I really like this idea!
>>>>>>>>>
>>>>>>>>>  Somewhat related is this idea from chatting with Alicia Gibb a
>>>>>>>>> few months ago, of a contributors.json file which would fulfill (with
>>>>>>>>> links, short descriptions, etc) all the terms of the OSH definition.
>>>>>>>>>
>>>>>>>>>  I finally typed up the idea and our sample format here:
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> http://publiclab.org/notes/warren/02-24-2015/standardizing-open-source-hardware-publication-practices-with-contributors-json
>>>>>>>>>
>>>>>>>>>  I'd love to hear input. Perhaps the questionnaire could generate
>>>>>>>>> such a file. At Public Lab, it'd be interesting for the file to be
>>>>>>>>> auto-generated from our tool wiki pages. The nice part about it is that
>>>>>>>>> it's not specifying a way of browsing or aggregating projects (as other
>>>>>>>>> folks are exploring that space) but specifies a standard way to make the
>>>>>>>>> relevant/required information available for such projects to
>>>>>>>>> scrape/consume. Also, it's easy enough to write by hand and include in a
>>>>>>>>> github repository.
>>>>>>>>>
>>>>>>>>>  Best,
>>>>>>>>> Jeff
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Feb 24, 2015 at 3:55 PM, Javier Serrano <
>>>>>>>>> Javier.Serrano at cern.ch> wrote:
>>>>>>>>>
>>>>>>>>>> Mario, I think this is a great idea. I see this can play a role
>>>>>>>>>> in the
>>>>>>>>>> solution to one of the biggest problems of OSHW: how to make sure
>>>>>>>>>> developers have more incentives to publish their work. Economic
>>>>>>>>>> incentives in particular. An OSHW label could give (more)
>>>>>>>>>> prestige to
>>>>>>>>>> developers who hold it and induce purchaser-driven growth of
>>>>>>>>>> OSHW. We
>>>>>>>>>> are already seeing that prestige is a big element in the success
>>>>>>>>>> of OSHW
>>>>>>>>>> companies. A well advertised and supported label or mark could
>>>>>>>>>> enlarge
>>>>>>>>>> the population of savvy customers.
>>>>>>>>>>
>>>>>>>>>> On 02/24/2015 05:58 PM, Mario Gómez wrote:
>>>>>>>>>> > The idea is that the community validates whether you are telling
>>>>>>>>>> > the truth.
>>>>>>>>>> > To prevent abuse, a meta-validation system could be implemented
>>>>>>>>>> > where you
>>>>>>>>>> > can "evaluate the evaluators" to see if they are being fair in
>>>>>>>>>> > their
>>>>>>>>>> > evaluations.
>>>>>>>>>>
>>>>>>>>>> One alternative is to entrust the OSHWA with that role.
>>>>>>>>>> "Community" is a
>>>>>>>>>> vague term. If I have to trust someone on whether a piece of
>>>>>>>>>> software is
>>>>>>>>>> free software I will trust the FSF over the "community" any day.
>>>>>>>>>> One way
>>>>>>>>>> of doing it would be through a creative use of marks or labels,
>>>>>>>>>> in the
>>>>>>>>>> vein of what OHANDA [1] proposes. See also the work of the
>>>>>>>>>> Wikimedia
>>>>>>>>>> Foundation [2] in this regard. In this scenario, developers have a
>>>>>>>>>> natural incentive to not misuse the mark, because they can be
>>>>>>>>>> sued with
>>>>>>>>>> all the arsenal of trademark law if they do.
>>>>>>>>>>
>>>>>>>>>> Cheers,
>>>>>>>>>>
>>>>>>>>>> Javier
>>>>>>>>>>
>>>>>>>>>> [1] http://www.ohanda.org/
>>>>>>>>>> [2] http://wikimediafoundation.org/wiki/Trademark_policy
>>>>>>>>>>  _______________________________________________
>>>>>>>>>> discuss mailing list
>>>>>>>>>> discuss at lists.oshwa.org
>>>>>>>>>> http://lists.oshwa.org/listinfo/discuss
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> PabloK
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>
>
>

