michaelficarra/proposal-keyby: Issues
Symbols as composite values
A Composite Key could be a symbol instead of an object.
Unlike objects, not all symbols are currently usable as WeakMap keys, which would make it less surprising when an author attempts to use a Composite Key formed only of forgeable values in a weak collection (#3).
Using a symbol does not automatically resolve the question of value-like or unique semantics (#2); however, value semantics would be natural for symbols, and, more importantly, authors may not have as high expectations for the efficiency of === with complex primitive values. After all, string comparison can already be costly today.
const obj = {};
const a = CompositeKey.for(obj, 1, 2);
const b = CompositeKey.for(obj, 1, 2);
assert(typeof a === "symbol");
assert(a === b);
const ws = new WeakSet();
ws.add(a);
assert(ws.has(b));
assert.throws(() => ws.add(CompositeKey.for(0, 1, 2)));
Unlike registered symbols, there would be no way to get the constituents back out of a composite key symbol, e.g. no CompositeKey.componentsFor.
- Pro: Maps/Sets and WeakMaps/WeakSets don't need to be aware of CompositeKey and treat it differently from other symbols (besides their weakability, as for current symbols).
- Pro: Does not require interning at the point in time when CompositeKey symbols are created. Any interning could be delayed until ===, with overhead in line with the existing string overhead.
- Pro: Does not enforce that at least one constituent has a lifetime. Like other symbols, some can be forgeable, while some are unique.
- Con: No intent expressed by the typeof value (unlike the unique object approach, which has a prototype).
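For illustration, the interning that symbol-returning CompositeKey.for semantics imply can be approximated in userland with a trie of Maps. This is a hypothetical sketch only: unlike the proposal, the registry holds its constituents strongly, so nothing is ever collected, and no weakability distinction is modelled.

```javascript
// Hypothetical userland sketch of CompositeKey.for-like interning that
// returns the same symbol for the same ordered constituents.
// NOTE: this registry holds constituents strongly (it leaks); it only
// illustrates the equality semantics, not the lifetime semantics.
const root = new Map();
const KEY = Symbol("key"); // sentinel slot for the interned symbol at a node

function compositeSymbolFor(...parts) {
  let node = root;
  for (const part of parts) {
    if (!node.has(part)) node.set(part, new Map());
    node = node.get(part);
  }
  if (!node.has(KEY)) node.set(KEY, Symbol("composite"));
  return node.get(KEY);
}

const obj = {};
console.log(compositeSymbolFor(obj, 1, 2) === compositeSymbolFor(obj, 1, 2)); // true
```

Because the same trie node is reached for the same constituents, === holds without any per-comparison work; the cost is paid once at creation, which is exactly the trade-off discussed above.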
Why not expose the underlying values?
Maybe I'm missing where this is documented, but why not expose the constituents? When I've done something like the ===-preserving CompositeKey in userland, I used a frozen array.
This has some advantages: it lets you do stuff like sortKeys((a, b) => a === b ? 0 : elementWiseCompare(a, b)), where elementWiseCompare is the normal lexicographic comparison on arrays. You can't implement that without access to the constituents.
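A minimal sketch of that userland approach, under the assumption that constituents are JSON-serialisable primitives (the naive interning key is a string form); makeKey and elementWiseCompare are illustrative names, not part of the proposal:

```javascript
// Keys are frozen arrays, interned so equal constituents yield the same
// (===) array; exposing the constituents enables lexicographic sorting.
const registry = new Map(); // naive interning; assumes JSON-serialisable constituents

function makeKey(...parts) {
  const id = JSON.stringify(parts);
  if (!registry.has(id)) registry.set(id, Object.freeze(parts));
  return registry.get(id);
}

// Normal lexicographic element-wise comparison on arrays.
function elementWiseCompare(a, b) {
  const len = Math.min(a.length, b.length);
  for (let i = 0; i < len; i++) {
    if (a[i] < b[i]) return -1;
    if (a[i] > b[i]) return 1;
  }
  return a.length - b.length;
}

const keys = [makeKey(2, 1), makeKey(1, 3), makeKey(1, 2)];
keys.sort((a, b) => (a === b ? 0 : elementWiseCompare(a, b)));
console.log(keys); // [[1, 2], [1, 3], [2, 1]]
```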
CompositeKey and WeakMaps
Should CompositeKey be allowed as a WeakMap key, and if so what would the semantics be? (also depends on #2).
And should there be a keyBy option when constructing: new WeakMap([], { keyBy: ... }) (and likewise for WeakSet)?
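One way to picture the keyBy question is a userland wrapper; the option name and shape here come from the question above, and KeyByWeakMap is a hypothetical class, not a settled API. For entries to remain weakly held, keyBy must return a valid weak key (e.g. an object):

```javascript
// Hypothetical sketch of WeakMap-with-keyBy semantics in userland.
// keyBy derives the actual weak key from the key passed in.
class KeyByWeakMap {
  #inner = new WeakMap();
  #keyBy;
  constructor(entries = [], { keyBy = (k) => k } = {}) {
    this.#keyBy = keyBy;
    for (const [k, v] of entries) this.set(k, v);
  }
  set(key, value) {
    this.#inner.set(this.#keyBy(key), value);
    return this;
  }
  get(key) { return this.#inner.get(this.#keyBy(key)); }
  has(key) { return this.#inner.has(this.#keyBy(key)); }
}

// Two distinct wrapper objects hit the same entry via their shared target:
const target = {};
const byTarget = new KeyByWeakMap([], { keyBy: (ref) => ref.target });
byTarget.set({ target }, "value");
console.log(byTarget.get({ target })); // "value"
```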
Should CompositeKey have value-like equality?
Yes:
This would look something like this:
const obj = {};
const a = CompositeKey(obj, 1, 2);
const b = CompositeKey(obj, 1, 2);
assert(typeof a === "object");
assert(Reflect.ownKeys(a).length === 0); // or maybe only `Symbol.toStringTag` like ESM namespaces
assert(a === b);
assert(Reflect.getPrototypeOf(a) === null);
No:
This would look something like this:
const obj = {};
const a = new CompositeKey(obj, 1, 2);
const b = new CompositeKey(obj, 1, 2);
// same as above:
assert(typeof a === "object");
assert(Reflect.ownKeys(a).length === 0); // or maybe only `Symbol.toStringTag` like ESM namespaces
// different:
assert(a !== b);
assert(Reflect.getPrototypeOf(a) === CompositeKey.prototype);
assert(CompositeKey.equal(a, b));
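The "No" (unique object) variant above can be sketched in userland; everything here is illustrative, including the structural CompositeKey.equal, which compares stored constituents with SameValueZero:

```javascript
// Sketch of unique-object semantics: each construction yields a distinct
// object, and equality is a separate structural check over constituents.
const parts = new WeakMap(); // key object -> its constituents

function CompositeKey(...constituents) {
  // Returning an object from a constructor overrides `this`,
  // so this also works when called with `new`.
  const key = Object.freeze(Object.create(CompositeKey.prototype));
  parts.set(key, constituents);
  return key;
}
CompositeKey.prototype = Object.freeze({});

CompositeKey.equal = function (a, b) {
  const pa = parts.get(a);
  const pb = parts.get(b);
  return pa.length === pb.length &&
    pa.every((x, i) => x === pb[i] || (x !== x && pb[i] !== pb[i])); // NaN-safe
};

const obj = {};
const a = new CompositeKey(obj, 1, 2);
const b = new CompositeKey(obj, 1, 2);
console.log(a !== b, CompositeKey.equal(a, b)); // true true
```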
Pros/Cons
Value-like semantics
- Pro: Maps and Sets don't need to be aware of CompositeKey and treat it differently from other objects.
- Con: To avoid tying the equality of the CompositeKey to a realm, it would need to have a null prototype.
- Con: Effectively forces implementations to do global interning at the point in time when CompositeKeys are created; delaying until === can add overhead to every === call.
- Con: May enforce that at least one constituent has a lifetime (CompositeKey(1, 2) throws).
- Pro? Could avoid new.
Unique Object
- Pro: Allows more freedom over when the work of computing hashes and equality is done.
- Pro: CompositeKeys can have a prototype, which may be useful for adding methods/symbols in the future.
- Pro: No requirement for at least one lifetime-bearing constituent (new CompositeKey(1, 2) allowed).
- Con: CompositeKeys would be a special type of object when returned by Map and Set with a keyBy function; more work to virtualise.
- Con? Would most likely need to be created with new.
Varargs ...or not to Varargs
CompositeKey(1, 2, 3) vs CompositeKey([1, 2, 3])
Varargs:
- Pro: Don't need to create an array when creating a key, potentially less overhead
- Con: Can hit engine limits of number of arguments (perhaps a good thing, large composite keys might be an anti-pattern)
- Pro: Cleaner?
One-arg:
- Pro: Allows room for things like an options bag in the future
CompositeKey([1, 2, 3], { someSetting: true })
- Con: Need to create an extra object when creating a key, potentially more overhead
- Con: More verbose?
Existing APIs:
new Array(1, 2, 3);
new Set([1, 2, 3]);
Set vs Hashmap
Excuse me sticking my nose in! I was having a read through and wondered whether this would feel right as an extension to Set. For example, in the README currently:
There is no capability to override this behavior and allow two different objects to be treated equal within the collection.
This sounds at-odds to me with the basic description of a set.
The Set object lets you store unique values of any type...
From https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set
The keyBy callback is effectively a hashing function, making Set behave like a Hashmap. For that reason, I would think an alternative to this would be to add a Hashmap or Hashset collection instead, into which people could pass JSON.stringify or whatever hashing behaviour they would like.
To me this would feel like a better conceptual fit. If I received a Hashmap from some calling code, I think I would be aware of the fact that Hashmaps make fewer guarantees around uniqueness and could proceed with caution. If I receive a Set, on the other hand, I may be less likely to question the uniqueness check and may be bitten by keyBy being configured in a way I didn't expect. Another way to say it would be that Set becomes less "trustworthy".
There are some equivalents to this stuff in C#: HashSet and Hashtable. I bring them up as points of comparison, because:
- Their names include the word "hash" - giving you a clue that values may not be unique as a consumer.
- The hash behaviour has always been modifiable (citation needed), so they haven't broken assumptions by introducing it at a later date.
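The "Set behaves like a Hashmap" observation above can be sketched in userland: a Set-like collection backed by a Map from the derived key to the original value. KeyBySet is an illustrative name and option shape, not the proposal's API:

```javascript
// Set-like collection where membership is decided by keyBy(value),
// so two distinct objects can be treated as equal.
class KeyBySet {
  #map = new Map(); // derived key -> first value added with that key
  #keyBy;
  constructor(values = [], { keyBy = (v) => v } = {}) {
    this.#keyBy = keyBy;
    for (const v of values) this.add(v);
  }
  add(value) {
    const k = this.#keyBy(value);
    if (!this.#map.has(k)) this.#map.set(k, value);
    return this;
  }
  has(value) { return this.#map.has(this.#keyBy(value)); }
  get size() { return this.#map.size; }
  [Symbol.iterator]() { return this.#map.values(); }
}

// Two distinct objects collapse to one member, exactly the uniqueness
// surprise described above:
const s = new KeyBySet([], { keyBy: JSON.stringify });
s.add({ x: 1 });
s.add({ x: 1 });
console.log(s.size); // 1
```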