openbases / openbases-python
Open Bases python helpers for https://openbases.github.io
Home Page: https://openbases.github.io/openbases-python/
License: Other
We are adding a "specification" template type (e.g., for openschemas), since we don't currently have a well-fitting type.
If the user specifies a custom color, it should override the default assignment per base name. This ensures the tool has broader use.
hey @arfon ! What would you consider the minimum required fields for references (meaning present and defined)? So far I have a type (e.g., article) plus title, month, and year. But we might want some additional checks, like warning when a doi is missing, or requiring a journal OR url. Let me know your thoughts! Just for perusing, here is the listing of fields for the current references I'm testing with. I'm not saying this is right or wrong; it's a "real world" example:
['title', 'abstract', 'publisher', 'month', 'year']
['title', 'affiliation', 'abstract', 'journal', 'volume', 'number', 'pages', 'month', 'year', 'language']
['title', 'booktitle', 'abstract', 'month', 'year', 'howpublished', 'note']
['title', 'abstract', 'institution']
['title', 'abstract', 'journal', 'publisher', 'volume', 'month', 'year', 'keywords']
['title', 'abstract', 'howpublished', 'note']
['title', 'abstract', 'journal']
['title', 'abstract', 'journal', 'volume', 'number', 'pages', 'month', 'year']
['title', 'month', 'year', 'note', 'doi', 'url']
['title', 'year']
['title', 'affiliation', 'abstract', 'journal', 'volume', 'number', 'pages', 'month', 'year']
['title', 'publisher', 'series', 'year']
['title', 'journal', 'publisher', 'volume', 'number', 'month', 'year', 'address']
['title', 'abstract', 'journal', 'publisher', 'pages', 'doi', 'month', 'year', 'language']
['title', 'affiliation', 'abstract', 'journal', 'publisher', 'volume', 'pages', 'month', 'year', 'keywords', 'language', 'doi']
['title', 'affiliation', 'abstract', 'journal', 'publisher', 'volume', 'pages', 'month', 'year', 'doi', 'keywords', 'language']
['title', 'abstract', 'month', 'year', 'archivePrefix', 'primaryClass', 'doi', 'eprint']
['title', 'abstract', 'journal', 'publisher', 'volume', 'month', 'year', 'doi']
['title', 'abstract', 'month', 'year', 'archivePrefix', 'primaryClass', 'eprint']
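A minimal sketch of how such a check might look (a hypothetical helper, not the actual openbases-python implementation). Here only the entry type and title are treated as hard requirements, and the other checks surface as warnings, since several of the real entries above lack month or year:

```python
def validate_entry(entry_id, entry_type, fields):
    """Validate one bibliography entry and return a list of warnings.

    Hypothetical sketch: type and title are hard requirements; missing
    month, year, doi, or journal/url only produce warnings.
    """
    fields = set(fields)
    if not entry_type:
        raise ValueError(f"{entry_id}: entry is missing a type")
    if "title" not in fields:
        raise ValueError(f"{entry_id}: entry is missing a title")
    warnings = []
    for field in ("month", "year"):
        if field not in fields:
            warnings.append(f"{entry_id}: missing {field}")
    if "doi" not in fields:
        warnings.append(f"{entry_id}: missing doi")
    if not fields & {"journal", "url"}:
        warnings.append(f"{entry_id}: needs a journal OR a url")
    return warnings
```

Run against the field lists above, every entry would pass the hard checks (all have a title), while the doi and journal/url warnings would flag the weaker entries.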
And given my validation, the simple test passes for all of the above:
Testing bibliography entry Stodden2010-cu
type: article
title: The Scientific Method in Practice: Reproducibility in the Computational Sciences
Testing bibliography entry Ram2013-km
type: article
title: Git can facilitate greater reproducibility and increased transparency in science
Testing bibliography entry noauthor_2015-ig
type: misc
title: Docker-based solutions to reproducibility in science - Seven Bridges
Testing bibliography entry noauthor_undated-pi
type: misc
title: expfactory-docker
Testing bibliography entry Sochat2016-pu
type: article
title: The Experiment Factory: Standardizing Behavioral Experiments
Testing bibliography entry noauthor_undated-sn
type: misc
title: Science is in a reproducibility crisis: How do we resolve it?
Testing bibliography entry Baker_undated-bx
type: article
title: Over half of psychology studies fail reproducibility test
Testing bibliography entry Open_Science_Collaboration2015-hb
type: article
title: {PSYCHOLOGY}. Estimating the reproducibility of psychological science
Testing bibliography entry vanessa_sochat_2017_1059119
type: misc
title: {expfactory/expfactory: The Experiment Factory (v3.0) Release}
Testing bibliography entry McDonnell2012-ns
type: misc
title: psiTurk (Version 1.02)[Software]. New York, {NY}: New York University
Testing bibliography entry De_Leeuw2015-zw
type: article
title: jsPsych: a {JavaScript} library for creating behavioral experiments in a Web browser
Testing bibliography entry Smith2005-kg
type: book
title: Virtual Machines: Versatile Platforms for Systems and Processes
Testing bibliography entry Merkel2014-da
type: misc
title: Docker: Lightweight Linux Containers for Consistent Development and Deployment
Testing bibliography entry Ali2016-rh
type: article
title: The Case for Docker in Multicloud Enabled Bioinformatics Applications
Testing bibliography entry Moreews2015-dy
type: article
title: {BioShaDock}: a community driven bioinformatics shared Docker-based tools registry
Testing bibliography entry Belmann2015-eb
type: article
title: Bioboxes: standardised containers for interchangeable bioinformatics software
Testing bibliography entry Boettiger2014-cz
type: article
title: An introduction to Docker for reproducible research, with examples from the {R} environment
Testing bibliography entry Santana-Perez2015-wo
type: article
title: Towards Reproducibility in Scientific Workflows: An {Infrastructure-Based} Approach
Testing bibliography entry Wandell2015-yt
type: article
title: Data management to support reproducible research
I would want to be able to easily generate a badge, see openbases/openbases.github.io#3
Something like:
Could be generated via
ob-badge experiment lab-js
In the format
ob-badge <base-type> <base-name>
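A sketch of what the `ob-badge` entry point could look like. The default color map and the exact shields.io URL shape are assumptions for illustration, combined with the custom-color override proposed above:

```python
import argparse

# Hypothetical default colors per base type; a user-supplied --color
# overrides the default, per the proposal above.
DEFAULT_COLORS = {"experiment": "1ab170", "specification": "6f4ea1"}

def badge_url(base_type, base_name, color=None):
    """Build a shields.io badge URL for <base-type> <base-name>."""
    color = color or DEFAULT_COLORS.get(base_type, "lightgrey")
    return f"https://img.shields.io/badge/{base_type}-{base_name}-%23{color}.svg"

def main():
    # Entry point that would be wired up via console_scripts in setup.py.
    parser = argparse.ArgumentParser(prog="ob-badge")
    parser.add_argument("base_type")
    parser.add_argument("base_name")
    parser.add_argument("--color", help="override the default color")
    args = parser.parse_args()
    print(badge_url(args.base_type, args.base_name, args.color))
```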
Need to have tests before merging #15.
The GitHub Pages rendered in "docs" should honor the ".nojekyll" file and include static files. It currently does not. https://openbases.github.io/whedon-python/html/
The first easy function of whedon-python should be to parse paper.md and prepare the pandoc command, making it easy to generate the pdf (or other formats). This will be used in whedon-docker.
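A minimal sketch of that function under some assumptions: the front matter keys and the pandoc flags shown here are illustrative, not the final whedon-python interface:

```python
def parse_front_matter(text):
    """Extract simple key: value pairs from a ----delimited YAML header.

    A real implementation would use a YAML parser; this sketch only
    handles flat key: value lines.
    """
    meta = {}
    lines = text.splitlines()
    if lines and lines[0].strip() == "---":
        for line in lines[1:]:
            if line.strip() == "---":
                break
            if ":" in line:
                key, _, value = line.partition(":")
                meta[key.strip()] = value.strip()
    return meta

def pandoc_command(paper_md="paper.md", output="paper.pdf"):
    """Return the pandoc invocation as an argument list for subprocess.run."""
    return ["pandoc", paper_md, "-o", output, "--from", "markdown"]
```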
The user should be able to define a recipe of criteria, and then run it against a paper.
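One way such a recipe could look, as a hypothetical sketch: a list of named checks run against a parsed paper (here represented as a plain dict; the criterion names are made up for illustration):

```python
def run_recipe(paper, recipe):
    """Run each (name, check) criterion against the paper; report pass/fail."""
    results = {}
    for name, check in recipe:
        results[name] = bool(check(paper))
    return results

# Example recipe: each criterion is a name plus a predicate on the paper.
recipe = [
    ("has-title", lambda p: bool(p.get("title"))),
    ("has-authors", lambda p: len(p.get("authors", [])) > 0),
]
```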
The input name "builder-pdf" currently generates:
https://img.shields.io/badge/paper-builder-pdf-%231ab170.svg
and instead needs to be
https://img.shields.io/badge/paper-builder_pdf-%231ab170.svg
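The fix above amounts to escaping dashes in the badge name before building the URL, so that shields.io does not treat them as field separators. A small sketch of that replacement (the function names are hypothetical):

```python
def escape_badge_text(text):
    """Replace dashes so shields.io does not split the name on them.

    Per the proposal above, "builder-pdf" becomes "builder_pdf".
    """
    return text.replace("-", "_")

def paper_badge(name, color="1ab170"):
    """Build the paper badge URL with the escaped name."""
    return f"https://img.shields.io/badge/paper-{escape_badge_text(name)}-%23{color}.svg"
```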
To the extent possible, whedon-python should serve as a wrapper for whedon! And of course check for installation, report the version, etc. This will also make it easier for Python-preferring folks to use whedon from their Python applications.
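The installation and version checks could start along these lines. This is a sketch only: it assumes the whedon executable is on the PATH and accepts a --version flag, which should be verified against the real CLI:

```python
import shutil
import subprocess

def find_whedon():
    """Return the path to the whedon executable, or None if not installed."""
    return shutil.which("whedon")

def whedon_version():
    """Report the installed whedon version (assumes a --version flag)."""
    path = find_whedon()
    if path is None:
        raise RuntimeError("whedon does not appear to be installed")
    result = subprocess.run([path, "--version"], capture_output=True, text=True)
    return result.stdout.strip()
```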
Akin to ob-paper, we need an ob-icon (or similar) entry point where the user can do things like get an icon from https://github.com/openbases/openbases-icons. Maybe it should also be more general (e.g., ob-resource)?