Comments (5)
I propose the following addition to the top-level README:
diff --git a/README.md b/README.md
index dc020d7..00f6c36 100644
--- a/README.md
+++ b/README.md
@@ -142,6 +142,33 @@ this extension causes a range of patches to be applied to the `datalad` package
to enable them. A comprehensive description of the current set of patch is
available at http://docs.datalad.org/projects/next/en/latest/#datalad-patches
+## Developing with DataLad NEXT
+
+This extension package moves fast in comparison to the core package. Nevertheless,
+attention is paid to API stability, adequate semantic versioning, and informative
+changelogs.
+
+### Public vs internal API
+
+Anything that can be imported directly from any of the sub-packages in
+`datalad_next` is considered to be part of the public API. Changes to this API
+determine the versioning, and development is done with the aim of keeping this
+API as stable as possible. This includes signatures and return value behavior.
+
+As an example: `from datalad_next.runners import iter_git_subproc` imports a
+part of the public API, but `from datalad_next.runners.git import
+iter_git_subproc` does not.
+
+### Use of the internal API
+
+Developers can obviously use parts of the non-public API. However, this should
+only be done with the understanding that these components may change from one
+release to another, with no guarantee of transition periods, deprecation
+warnings, etc.
+
+Developers are advised to never reuse any components with names starting with
+`_` (underscore). Their use should be limited to their individual subpackage.
+
## Acknowledgements
This DataLad extension was developed with funding from the Deutsche
from datalad-next.
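The proposed public-vs-internal rule is mechanical enough to check automatically. A minimal sketch (a hypothetical helper, not part of the extension; it only counts the dots in the `from` clause):

```python
import re


def classify_import(stmt: str) -> str:
    """Classify a ``datalad_next`` import statement (hypothetical helper).

    Public API: imported directly from a sub-package, e.g.
    ``from datalad_next.runners import iter_git_subproc``.
    Internal: imported from a module below a sub-package, e.g.
    ``from datalad_next.runners.git import iter_git_subproc``.
    """
    m = re.match(r'from\s+(datalad_next(?:\.\w+)*)\s+import\b', stmt)
    if m is None:
        return 'unrelated'
    # one dot means sub-package level ('datalad_next.runners') -> public;
    # anything deeper reaches into module internals
    depth = m.group(1).count('.')
    return 'public' if depth <= 1 else 'internal'
```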
Reads well to me 👍
This rule is indeed straightforward - it makes complete sense to me.
I like the rule. Same reasoning as @adswa: "straightforward".
Thanks for the feedback. I looked into what this would mean (using something like `git grep 'datalad_next\..*\.' datalad_next/ | grep -v __init__.py | grep -v '/patches/'` to gather initial evidence for subsequent curation).
This reveals a bunch of questions. Should we apply this rule uniformly to all sub-packages (or even sub-directories)?
I'd be in favor of that, for simplicity. I am inclined to make an exception for `datalad_next/patches/` (it is not really a place to import anything from). I would try to not make an exception for `datalad_next/tests/`, and also not for `datalad_next/utils/`.
With that in mind, here are the groups of rule violations we would have in the current code base:
Test tooling imports not directly from `datalad_next.tests`
- Done with #619
# utils
datalad_next/annexbackends/tests/test_base.py:from datalad_next.tests.utils import
datalad_next/annexremotes/tests/test_archivist.py:from datalad_next.tests.utils import
datalad_next/annexremotes/tests/test_uncurl.py:from datalad_next.tests.utils import
datalad_next/commands/tests/test_create_sibling_webdav.py:from datalad_next.tests.utils import
datalad_next/commands/tests/test_credentials.py:from datalad_next.tests.utils import
datalad_next/commands/tests/test_download.py:from datalad_next.tests.utils import
datalad_next/commands/tests/test_status.py:from datalad_next.tests.utils import
datalad_next/commands/tests/test_tree.py:from datalad_next.tests.utils import
datalad_next/credman/tests/test_credman.py:from datalad_next.tests.utils import
datalad_next/gitremotes/tests/test_datalad_annex.py:from datalad_next.tests.utils import
datalad_next/iter_collections/tests/test_iterdir.py:from datalad_next.tests.utils import
datalad_next/iter_collections/tests/test_itergitdiff.py:from datalad_next.tests.utils import
datalad_next/iter_collections/tests/test_itergitstatus.py:from datalad_next.tests.utils import
datalad_next/iter_collections/tests/test_itergittree.py:from datalad_next.tests.utils import
datalad_next/iter_collections/tests/test_itergitworktree.py:from datalad_next.tests.utils import
datalad_next/iter_collections/tests/test_utils.py:from datalad_next.tests.utils import
datalad_next/tests/fixtures.py:from datalad_next.tests.utils import
datalad_next/tests/fixtures.py:from datalad_next.tests.utils import
datalad_next/tests/fixtures.py:from datalad_next.tests.utils import
datalad_next/url_operations/tests/test_ssh.py:from datalad_next.tests.utils import
# marker
datalad_next/archive_operations/tests/test_tarfile.py:from datalad_next.tests.marker import
datalad_next/commands/tests/test_ls_file_collection.py:from datalad_next.tests.marker import
datalad_next/iter_collections/tests/test_itertar.py:from datalad_next.tests.marker import
datalad_next/url_operations/tests/test_http.py:from datalad_next.tests.marker import
reuse of non-global fixture
datalad_next/commands/tests/test_ls_file_collection.py:from datalad_next.iter_collections.tests.test_iterzip import sample_zip
Credential manager
- Done with #621
datalad_next/commands/credentials.py:from datalad_next.credman.manager import
datalad_next/utils/credman.py:from datalad_next.credman.manager import
iterable_subprocess
- Done with #621
datalad_next/runners/iter_subproc.py:from datalad_next.iterable_subprocess.iterable_subprocess import
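The typical remedy for such cases is to re-export the component from the sub-package's `__init__.py`, so consumers can stay on the public import path. A self-contained sketch of the pattern (all names here are hypothetical; the real `datalad_next` layout differs):

```python
import sys
import tempfile
from pathlib import Path

# Build a throwaway package on disk to demonstrate the re-export pattern.
pkg = Path(tempfile.mkdtemp()) / 'mypkg'
(pkg / 'runners').mkdir(parents=True)
(pkg / '__init__.py').write_text('')
(pkg / 'runners' / 'iter_subproc.py').write_text(
    'def iter_subproc():\n    return "running"\n'
)
# The re-export: the sub-package __init__.py lifts the symbol one level up,
# so consumers use the public path and never depend on the module layout.
(pkg / 'runners' / '__init__.py').write_text(
    'from .iter_subproc import iter_subproc\n'
    '__all__ = ["iter_subproc"]\n'
)

sys.path.insert(0, str(pkg.parent))
from mypkg.runners import iter_subproc  # public import path, layout hidden
```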
Constraints
- Done with #620
datalad_next/annexremotes/tests/test_uncurl.py:from datalad_next.constraints.dataset import
datalad_next/commands/create_sibling_webdav.py:from datalad_next.constraints.dataset import
datalad_next/commands/create_sibling_webdav.py:from datalad_next.constraints.exceptions import
datalad_next/commands/credentials.py:from datalad_next.constraints.dataset import
datalad_next/commands/download.py:from datalad_next.constraints.dataset import
datalad_next/commands/status.py:from datalad_next.constraints.dataset import
datalad_next/commands/tests/test_ls_file_collection.py:from datalad_next.constraints.exceptions import
datalad_next/commands/tests/test_status.py:from datalad_next.constraints.exceptions import
datalad_next/commands/tree.py:from datalad_next.constraints.dataset import
Types
- Done with #617
datalad_next/annexremotes/archivist.py:from datalad_next.types.annexkey import AnnexKey
datalad_next/annexremotes/archivist.py:from datalad_next.types.archivist import ArchivistLocator
datalad_next/annexremotes/archivist.py:from datalad_next.types.enums import ArchiveType
iterators
- Done with #618
datalad_next/archive_operations/tarfile.py:from datalad_next.iter_collections.tarfile import
datalad_next/archive_operations/zipfile.py:from datalad_next.iter_collections.zipfile import
datalad_next/archive_operations/tests/test_tarfile.py:from datalad_next.iter_collections.utils import
datalad_next/archive_operations/tests/test_zipfile.py:from datalad_next.iter_collections.utils import
datalad_next/commands/ls_file_collection.py:from datalad_next.iter_collections.directory import
datalad_next/commands/ls_file_collection.py:from datalad_next.iter_collections.tarfile import
datalad_next/commands/ls_file_collection.py:from datalad_next.iter_collections.zipfile import
datalad_next/commands/ls_file_collection.py:from datalad_next.iter_collections.utils import
datalad_next/commands/ls_file_collection.py:from datalad_next.iter_collections.gittree import
datalad_next/commands/ls_file_collection.py:from datalad_next.iter_collections.gitworktree import
datalad_next/commands/ls_file_collection.py:from datalad_next.iter_collections.annexworktree import
datalad_next/commands/status.py:from datalad_next.iter_collections.gitdiff import
datalad_next/commands/status.py:from datalad_next.iter_collections.gitstatus import
datalad_next/conftest.py:from datalad_next.iter_collections.tests.test_itertar import
datalad_next/conftest.py:from datalad_next.iter_collections.tests.test_iterzip import
URL operations
- Done with #615
datalad_next/annexremotes/tests/test_uncurl.py:from datalad_next.url_operations.any import
datalad_next/annexremotes/uncurl.py:from datalad_next.url_operations.any import
datalad_next/commands/download.py:from datalad_next.url_operations.any import
datalad_next/url_operations/tests/test_ssh.py:import datalad_next.url_operations.ssh
Archive operations
- Done with #622
datalad_next/annexremotes/archivist.py:from datalad_next.archive_operations.tarfile import
Common utilities
- Done with #623
datalad_next/iter_collections/utils.py:from datalad_next.utils.consts import
datalad_next/iter_collections/utils.py:from datalad_next.utils.multihash import
datalad_next/url_operations/file.py:from datalad_next.utils.consts import
datalad_next/url_operations/http.py:from datalad_next.utils.requests_auth import
datalad_next/utils/requests_auth.py:from datalad_next.utils.http_helpers import
datalad_next/utils/tests/test_deprecated.py:from datalad_next.utils.deprecate import
Absolute imports that should be relative
- Done with #621
datalad_next/annexremotes/tests/test_archivist.py:from datalad_next.annexremotes.archivist import