chaoss / wg-value

CHAOSS Value Working Group


Academic Open Source Project Impact - Metric Release Candidate

vinodkahuja opened this issue · comments

This issue is created to collect comments about the rolling release of "Academic Open Source Project Impact"

This metric can be found here: https://chaoss.community/metric-academic-open-source-project-impact/

See all release candidates: https://chaoss.community/metrics/

CHAOSS Metric Quality Checklist

This checklist is used for new and updated metrics to ensure we follow CHAOSS quality standards and processes. The checklist items below don't have to be completed all at once: create the metric release candidate issue first, then work through the checklist.

Process

  • Create the “review issue” in the authoring WG’s repo for comments during review period and paste this template in
  • Create pull request to edit or add metric to WG’s repo (after checking Content Quality and Technical Requirements below)
  • Add the new metric or metric edit to release notes issue in working group repo
  • Update the Metrics Spreadsheet
  • Create issue in CHAOSS/Translations repository to kick off translation to other languages (please use the translation issue template)
  • Add the "Metric Candidate Release" label to the metric release candidate issue
  • Add the metric to the website

When above steps are completed:

  • Announce new/updated metric on mailing list, newsletter, community Zoom call, and Twitter. This can be coordinated with the community manager.

Content Quality

  • Required headings are filled in, including Questions.
  • Description provides context to metric
  • Objectives list sample uses for the metric and desired outcomes
  • DEI uses of the metric, if any, are included in the Objectives section
  • Optional headings that have no content were removed
  • Contributors section lists those contributors that want to be named
  • The name of the metric is the same in (1) metric heading, (2) metric file name, (3) focus area, (4) metrics spreadsheet, (5) “review issue”, (6) translation issue, and (7) website

Technical Requirements

  • Message in the metric markdown file that the metric will be part of the next regular release is at top of page and the links are correct (this is in the metric template)
  • Metric file name is the full metric name and only contains lower case letters and hyphens (“-”) for spaces
  • Images are included using markdown and relative links (as described in the metrics template)
  • Images have at least one empty line above and below them
  • Ensure images are placed in the image folder and follow the naming convention
  • If new focus area is created, ensure focus area is added to wg repo readme and focus area folder readme
  • Within the focus area, add the metric to the table and provide the link to the metric and the metric question
  • Ensure tables within the metric are converted to images and placed in the image folder (keep both the original Markdown and the screenshot PNG) and follow the naming convention
  • No HTML code in the metrics markdown file
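The file-naming rule above (full metric name, lower-case letters, hyphens for spaces) can be checked mechanically. A minimal sketch, assuming metric files use the `.md` extension; the function name is illustrative, not part of any CHAOSS tooling:

```python
import re

# Pattern from the checklist: lower-case letters only, hyphens for
# spaces, no leading/trailing or doubled hyphens. The ".md" extension
# is an assumption based on the repo's markdown metric files.
METRIC_FILENAME = re.compile(r"^[a-z]+(?:-[a-z]+)*\.md$")

def is_valid_metric_filename(name: str) -> bool:
    """Return True if `name` follows the metric file naming convention."""
    return METRIC_FILENAME.fullmatch(name) is not None
```

For example, `is_valid_metric_filename("academic-open-source-project-impact.md")` passes, while names with capitals or underscores do not.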

@klumb this link is giving a 404 error

@vinodkahuja it is because I am not done with release on the website side

Should be good now

@klumb Ok got it.

Here is a new tool from GitHub that makes citations easier, and could be included as a Tool in this metric: https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/creating-a-repository-on-github/about-citation-files
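The tool linked above works by recognizing a `CITATION.cff` file (Citation File Format, a YAML file) in the repository root, for which GitHub then shows a "Cite this repository" button. A minimal sketch of such a file; the title, author, version, and date are placeholders:

```yaml
# CITATION.cff — placed at the repository root.
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "Example Project"        # placeholder
authors:
  - family-names: "Doe"         # placeholder
    given-names: "Jane"
version: "1.0.0"
date-released: "2021-01-01"
```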

commented

In some cases a paper doesn't have a "formal" repository citation, so it could also be interesting to count the derived open-source implementations of a paper (the authors' reference implementation or community implementations) for a specific repo/framework.

We could probably use the arXiv API or the Papers with Code API to collect this information.

See also:
https://blog.arxiv.org/2020/10/08/new-arxivlabs-feature-provides-instant-access-to-code/
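The arXiv suggestion above could be sketched against arXiv's public Atom API (`export.arxiv.org/api/query` is the documented endpoint). Counting `<entry>` elements as a proxy for "papers mentioning this repo/framework" is an illustrative heuristic, not an established implementation of the metric:

```python
import urllib.parse
import urllib.request

ARXIV_API = "http://export.arxiv.org/api/query"

def build_arxiv_query(term: str, max_results: int = 25) -> str:
    """Build an arXiv API URL searching all fields for `term`."""
    params = {
        "search_query": f'all:"{term}"',  # quoted phrase search
        "start": 0,
        "max_results": max_results,
    }
    return ARXIV_API + "?" + urllib.parse.urlencode(params)

def count_matching_papers(term: str) -> int:
    """Fetch the Atom feed and count <entry> elements — a rough
    proxy for papers mentioning the repo/framework by name."""
    with urllib.request.urlopen(build_arxiv_query(term)) as resp:
        feed = resp.read().decode("utf-8")
    return feed.count("<entry>")
```

A real implementation would want proper Atom parsing and deduplication across paper versions; this only shows the shape of the query.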

@ElizabethN @bhack thank you for your feedback. We have incorporated these changes in the metric.

commented

Thank you, but I don't see in the implementation that we want to count the number of academic projects/papers released with open-source code that depend on the project/library we are analyzing.

@bhack this was discussed in today's meeting and added to the "Tools providing the metric" section as the bullet point "arXiv.org code". Further, if I understand correctly, the dependency on the project/library we are analyzing is covered in the implementation by the bullet point "Number of downstream-dependencies of software in consideration".

I hope we have captured all your feedback. Let me know if we are still missing anything.

commented

"Number of downstream-dependencies of software in consideration".

Probably — if this means the reference implementation by the authors of a specific paper/academic publication, or also third-party community implementations.

Generally, Papers with Code tries to track both via stargazing.

I think you could also weight the downstream popularity (with a small subset of CHAOSS metrics) alongside the academic popularity (impact factor and other classical academic metrics).
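The idea of blending downstream popularity with classical academic metrics could be sketched as a simple weighted composite. The weights and inputs below are illustrative assumptions, not part of the released metric, and the inputs would need to be normalized to comparable scales first:

```python
def composite_impact(citations: float, stars: float, dependents: float,
                     weights=(0.5, 0.25, 0.25)) -> float:
    """Blend academic popularity (citations) with downstream popularity
    (stars, dependent repos). Weights are illustrative and must sum to 1;
    inputs are assumed to be pre-normalized to comparable scales."""
    w_cit, w_star, w_dep = weights
    assert abs(w_cit + w_star + w_dep - 1.0) < 1e-9, "weights must sum to 1"
    return w_cit * citations + w_star * stars + w_dep * dependents
```

For example, `composite_impact(100, 40, 20)` yields `65.0` under the default weights.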

Closing for Release