A really cool feature of the Spark AR Reactive API is that different signals have different types, which is especially handy when working with Vectors/VecSignals.
A very simple example of how this might currently be implemented:
```typescript
function createVector<T>(...args: VectorArgRest): Vector & (T extends 1 ? { x: number } : { [key: string]: any }) {
  return new Vector(...args);
}

// note that { x: number } is part of the Vector type already, so this wouldn't be useful
const vec = createVector<1>(); // vec1 -> has { x: number }
```
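One way this pattern could scale beyond a single conditional is a lookup type mapping each dimension to its component keys. This is only a sketch: the `Components` interface and the inlined `Vector`/`VectorArgRest` stand-ins below are hypothetical, not part of the library.

```typescript
// Stand-ins for the library's own definitions (assumptions for this sketch)
type VectorArgRest = number[];

class Vector {
  values: number[];
  constructor(...args: VectorArgRest) {
    this.values = args;
  }
}

// Hypothetical lookup type: dimension -> the component accessors it exposes
interface Components {
  1: { x: number };
  2: { x: number; y: number };
  3: { x: number; y: number; z: number };
  4: { x: number; y: number; z: number; w: number };
}

function createVector<D extends keyof Components>(
  ...args: VectorArgRest
): Vector & Components[D] {
  const vec = new Vector(...args);
  // Attach getters only for the components that actually exist
  (['x', 'y', 'z', 'w'] as const).forEach((key, i) => {
    if (i < args.length) {
      Object.defineProperty(vec, key, { get: () => vec.values[i] });
    }
  });
  return vec as unknown as Vector & Components[D];
}

const v3 = createVector<3>(1, 2, 3);
// v3.x, v3.y, v3.z are typed; accessing v3.w would be a compile-time error
```

The lookup type keeps all dimension-to-shape knowledge in one place instead of chaining `T extends 1 ? ... : T extends 2 ? ...` conditionals.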
A slightly more complex one:
In this case, NDVector could be extended/unioned together with other Vector1/2/3/4 interfaces, which would also be great for hiding methods specific to each type (like cross3D). These functions will likely be defined on the NDVector class, and later hidden from the type defs of non-3D vectors. Completely moving the function definitions out into separate classes feels unnecessary, and would likely require complex code or make for a confusing dev experience.
```typescript
interface NDVector {
  values: number[];
}

const NDVector = function (this: NDVector, ...args: VectorArgRest): NDVector {
  this.values = [1, 2, 3]; // placeholder values
  return this;
} as unknown as { new (...args: VectorArgRest): NDVector };
```
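A possible shape for that "define once, hide per dimension" idea, sketched with hypothetical names (`NDVectorImpl`, `Vector2`, `Vector3` are assumptions, not the library's actual identifiers):

```typescript
// All math lives on one class; per-dimension interfaces re-expose
// only the methods that make sense for that size.
class NDVectorImpl {
  values: number[];
  constructor(...values: number[]) {
    this.values = values;
  }
  add(other: NDVectorImpl): NDVectorImpl {
    return new NDVectorImpl(...this.values.map((v, i) => v + other.values[i]));
  }
  // Defined once here, but omitted from the non-3D type views below
  cross3D(other: NDVectorImpl): NDVectorImpl {
    const [ax, ay, az] = this.values;
    const [bx, by, bz] = other.values;
    return new NDVectorImpl(
      ay * bz - az * by,
      az * bx - ax * bz,
      ax * by - ay * bx
    );
  }
}

// Per-dimension views: only Vector3 exposes cross3D
interface Vector2 {
  values: number[];
  add(other: Vector2): Vector2;
}
interface Vector3 {
  values: number[];
  add(other: Vector3): Vector3;
  cross3D(other: Vector3): Vector3;
}

const a: Vector3 = new NDVectorImpl(1, 0, 0);
const b: Vector3 = new NDVectorImpl(0, 1, 0);
const c = a.cross3D(b); // (0, 0, 1)
// A value typed as Vector2 would reject .cross3D at compile time,
// even though the method still exists at runtime.
```

This keeps the runtime implementation in one class while the interfaces act purely as compile-time masks.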
Unfortunately, using both rest parameters and allowing an array to be passed as an argument might cause some confusion when types are automatically inferred: arrays might have to default to the `number` type no matter their length. A user of the lib could still explicitly declare the vector type if they so wished by calling `Vector<type>`. `Vector<number>` (the fallback) will likely resemble the current type def, which merges all possible types into one.
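The inference problem can be sketched like this (names and overloads here are hypothetical, chosen only to illustrate the behaviour): rest parameters let TypeScript infer a tuple and preserve its length, while a plain array argument erases the length, forcing the fallback unless the caller passes an explicit type argument.

```typescript
// Hypothetical Vec type tagged with its dimension
type Vec<N extends number> = { dim: N; values: number[] };

// Rest-parameter overload preserves tuple length; array overload cannot
function makeVec<T extends number[]>(...args: T): Vec<T['length']>;
function makeVec(args: number[]): Vec<number>;
function makeVec(...args: any[]): any {
  const values = Array.isArray(args[0]) ? args[0] : args;
  return { dim: values.length, values };
}

const a = makeVec(1, 2, 3);                 // inferred as Vec<3>: tuple length preserved
const nums = [1, 2, 3];
const b = makeVec(nums);                    // inferred as Vec<number>: array length lost
const c = makeVec<[number, number]>(1, 2);  // explicit type argument: Vec<2>
```

At runtime all three calls behave identically; only the static type differs, which is why the fallback type ends up merging all possible shapes into one.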
Open to changes/comments. Likely to be included as part of [email protected]