ar-js-org / studio-backend
Backend of AR.js studio project.
License: MIT License
When the user chooses to add audio/video resources, the behaviour should be the following:
To do that, besides the HTML, we have to embed a script with some predefined JavaScript code. This should be done with an inline <script> tag, without creating another file.
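A minimal sketch of what that could look like (the component name and generator function are illustrative, not the actual studio-backend API):

```javascript
// Hypothetical generator: the predefined JS is embedded in an inline
// <script> tag, so the exported bundle needs no extra .js file.
const PREDEFINED_JS = `
  AFRAME.registerComponent('registerevents', {
    init: function () {
      var marker = this.el;
      marker.addEventListener('markerFound', function () {
        var media = document.querySelector('#media-entity');
        media.components.sound && media.components.sound.playSound();
      });
    }
  });
`;

function generateMediaHtml(bodyHtml) {
  // the script goes inline, right before the scene markup
  return '<script>' + PREDEFINED_JS + '</script>\n' + bodyHtml;
}
```

The generated string would then be written to `index.html` inside the zip, with no separate .js asset to bundle.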
Can someone look over my code?
I was able to get the zip file with the files in the right format, but I couldn't get it to work on my phone.
const {
MarkerModule,
ASSET_IMAGE,
ZipProvider,
ENC_BASE64,
} = ARjsStudioBackend;
// Init zip
const zip = new ZipProvider();
// Get image to be used for marker
const img = new Image();
img.src = imageData;
// Create canvas
const canvas = document.createElement("canvas");
canvas.width = img.width;
canvas.height = img.height;
// Draw image on canvas
const ctx = canvas.getContext("2d");
ctx.drawImage(img, 0, 0);
// Get data URI
const imageDataURI = canvas.toDataURL();
// Set up promises for full marker image, marker pattern, and content html
const getFullMarker = MarkerModule.getFullMarkerImage(
imageDataURI,
1.0,
100,
"black"
);
const getPattFile = MarkerModule.getMarkerPattern(imageDataURI);
const getContent = MarkerModule.generatePatternHtml(
ASSET_IMAGE,
"marker.patt",
"assets/image.jpg"
);
// Wait for all promises
Promise.all([getFullMarker, getPattFile, getContent])
.then((res) => {
const fullMarker = res[0].replace("data:image/png;base64,", ""); // Clean up data tag.
const pattFile = res[1];
const content = res[2];
zip.addFile("images/img.jpg", fullMarker, ENC_BASE64);
zip.addFile("assets/image.jpg", contentFile); // contentFile is an image in this case.
zip.addFile("marker.patt", pattFile);
zip.addFile("index.html", content);
return zip.serveFiles({ compress: 9 });
})
.then((base64) => {
// trigger download
window.location = `data:application/zip;base64,${base64}`;
});
GitHub will be our first integration vendor, used to automatically deploy the generated code on the web.
The idea is, as a final step of the creation flow, to let the user log in with GitHub.
After the login, we will show the user a success message, their username, and the option to deploy to GitHub Pages. This will be followed by another, more specific task in the near future.
On the studio-backend side, for this current task, we need to:
Helps for AR-js-org/studio#65
So far we have static methods to generate the assets required for markers, and providers (see PR #7) that should handle the bundle.
We need a component that generates the HTML for the user's application, with AR.js already imported and <a-scene>/<a-marker> elements set up.
Depends on #30
Hardcode the token in the backend library and change repository name generation to be something like WebAR_<hash>.
This provider should output a zip file containing the application bundle for the user.
This could be done by wrapping the JSZip library and returning the zip as a base64 string to the frontend.
Users may upload glTF assets that come with more than a single file. We have to handle those as well.
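A minimal sketch of such a provider on top of JSZip (`file` and `generateAsync({ type: 'base64' })` are real JSZip calls; the class shape, method names, and the `assets/` layout for multi-file glTF are assumptions):

```javascript
// Sketch, assuming JSZip is available (e.g. `npm i jszip`).
class ZipProvider {
  // The JSZip constructor is injected to keep the sketch testable;
  // in practice you'd just `const JSZip = require('jszip')`.
  constructor(JSZipCtor) {
    this.zip = new JSZipCtor();
  }

  // data: string content, or a base64 string when isBase64 is true
  addFile(path, data, isBase64 = false) {
    this.zip.file(path, data, { base64: isBase64 });
  }

  // glTF models can come as several files (.gltf + .bin + textures);
  // keep them together under assets/ so relative URIs still resolve
  addGltf(files) {
    for (const { name, data, base64 } of files) {
      this.addFile(`assets/${name}`, data, base64);
    }
  }

  // Return the whole bundle as a base64 string for the frontend
  serveFiles({ compress = 9 } = {}) {
    return this.zip.generateAsync({
      type: 'base64',
      compression: 'DEFLATE',
      compressionOptions: { level: compress },
    });
  }
}
```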
This is a class that exposes (static?) methods to the frontend to generate the application package and "give" it to the user.
The interface could be something like this:
const package = new Package(type, config);
package.serveFiles('zip/github'); // uploads to GitHub or returns a Zip file
Where type is one of location/barcode/pattern/nft, and each one can have its own config:
// barcode config
{
matrixType: 'see exported consts',
value: 123
}
// pattern config
{
image: 'data URI string',
ratio: 0.5,
size: 128,
color: '#000000'
}
// location config
{
lat: 99.7766,
lon: 77.8844
}
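For illustration, a tiny guard tying the config shapes to their types might look like this (the helper and its key lists are hypothetical, inferred from the examples above):

```javascript
// Hypothetical per-type required keys, matching the config examples.
const REQUIRED_KEYS = {
  barcode: ['matrixType', 'value'],
  pattern: ['image', 'ratio', 'size', 'color'],
  location: ['lat', 'lon'],
};

// Throw early if the frontend passes a malformed config.
function validateConfig(type, config) {
  const required = REQUIRED_KEYS[type];
  if (!required) throw new Error(`unknown package type: ${type}`);
  const missing = required.filter((key) => !(key in config));
  if (missing.length) throw new Error(`missing config keys: ${missing.join(', ')}`);
  return true;
}
```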
Internally, the module should use MarkerModule to generate the marker image (for barcode and pattern) and the .patt file (pattern only).
I think both the GitHub and Zip providers will need adjustments for handling 3D models, but binary files should already be sort-of implemented.
My idea is to have the three available AR types configurable: marker, location and NFT. The last one will be switched off for now, aka "unplugged".
We could create three separate js classes, one for each module.
The backend can be configured by a config.js file that will import the classes.
This way, we create a pluggable architecture: very basic, but useful for possible future configurations.
A user of the studio app will see all activated modules on the homepage. To start, they will see only the location- and marker-based ones, as said before.
When NFT support arrives, routing for the NFT feature will be activated and handled in the frontend repo.
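A minimal sketch of that pluggable config.js idea (all names are illustrative):

```javascript
// Hypothetical module classes; the real ones would live in their own files.
class MarkerModule {}
class LocationModule {}
class NftModule {}

// config.js: the single place where modules are plugged in or out.
const activeModules = {
  marker: MarkerModule,
  location: LocationModule,
  // nft: NftModule,  // "unplugged" until NFT support lands
};

// The studio homepage can list the activated modules from here.
function availableTypes() {
  return Object.keys(activeModules);
}
```

Enabling NFT later would be a one-line change in config.js, with the frontend routing keyed off `availableTypes()`.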
For the MVP, we decided to use a new GitHub account to generate a token, and hard-code that account/token into the Studio app.
When users create a project from Studio and decide to host it on GitHub, we'll use this account and create a new repository for deploying and hosting the project.
Right now (in the MVP) we're going to use a GitHub account we own to deploy and host users' projects, for simplicity. This is neither ideal nor scalable.
After the MVP is released we should think about a better way to host users' projects, keeping the process easy for non-tech-savvy users.
Requirements:
We need HTML templates for generating an NFT application.
When using GIF, we should adapt the following template:
<script src="https://cdn.jsdelivr.net/gh/aframevr/aframe@1c2407b26c61958baa93967b5412487cd94b290b/dist/aframe-master.min.js"></script>
<script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
<script src="./gif-fixed.js"></script>
<body style='margin : 0px; overflow: hidden;'>
<a-scene
vr-mode-ui="enabled: false;"
renderer='antialias: true; alpha: true; precision: mediump;'
embedded arjs='trackingMethod: best; sourceType: webcam; debugUIEnabled: false;'>
<a-marker preset="hiro">
<a-entity
geometry="primitive:plane; width: 3; height: 3;"
position="0 0 -1"
rotation="-90 0 0"
material="shader:gif; src:url(banana.gif); alphaTest:1;"></a-entity>
</a-marker>
<a-entity camera></a-entity>
</a-scene>
</body>
gif-fixed.js:
/******/ (function(modules) { // webpackBootstrap
/******/ // The module cache
/******/ var installedModules = {};
/******/ // The require function
/******/ function __webpack_require__(moduleId) {
/******/ // Check if module is in cache
/******/ if(installedModules[moduleId])
/******/ return installedModules[moduleId].exports;
/******/ // Create a new module (and put it into the cache)
/******/ var module = installedModules[moduleId] = {
/******/ exports: {},
/******/ id: moduleId,
/******/ loaded: false
/******/ };
/******/ // Execute the module function
/******/ modules[moduleId].call(module.exports, module, module.exports, __webpack_require__);
/******/ // Flag the module as loaded
/******/ module.loaded = true;
/******/ // Return the exports of the module
/******/ return module.exports;
/******/ }
/******/ // expose the modules object (__webpack_modules__)
/******/ __webpack_require__.m = modules;
/******/ // expose the module cache
/******/ __webpack_require__.c = installedModules;
/******/ // __webpack_public_path__
/******/ __webpack_require__.p = "";
/******/ // Load entry module and return exports
/******/ return __webpack_require__(0);
/******/ })
/************************************************************************/
/******/ ([
/* 0 */
/***/ function(module, exports, __webpack_require__) {
'use strict';
var _typeof = typeof Symbol === "function" && typeof Symbol.iterator === "symbol" ? function (obj) { return typeof obj; } : function (obj) { return obj && typeof Symbol === "function" && obj.constructor === Symbol ? "symbol" : typeof obj; };
var _gifsparser = __webpack_require__(1);
if (typeof AFRAME === 'undefined') {
throw 'Component attempted to register before AFRAME was available.';
}
/* get util from AFRAME */
var parseUrl = AFRAME.utils.srcLoader.parseUrl;
var debug = AFRAME.utils.debug;
// debug.enable('shader:gif:*')
debug.enable('shader:gif:warn');
var warn = debug('shader:gif:warn');
var log = debug('shader:gif:debug');
/* store data so that you won't load same data */
var gifData = {};
/* create error message */
function createError(err, src) {
return { status: 'error', src: src, message: err, timestamp: Date.now() };
}
AFRAME.registerShader('gif', {
/**
* For material component:
* @see https://github.com/aframevr/aframe/blob/60d198ef8e2bfbc57a13511ae5fca7b62e01691b/src/components/material.js
* For example of `registerShader`:
* @see https://github.com/aframevr/aframe/blob/41a50cd5ac65e462120ecc2e5091f5daefe3bd1e/src/shaders/flat.js
* For MeshBasicMaterial
* @see http://threejs.org/docs/#Reference/Materials/MeshBasicMaterial
*/
schema: {
/* For material */
color: { type: 'color' },
fog: { default: true },
/* For texuture */
src: { default: null },
autoplay: { default: true }
},
/**
* Initialize material. Called once.
* @protected
*/
init: function init(data) {
log('init', data);
log(this.el.components);
this.__cnv = document.createElement('canvas');
this.__cnv.width = 2;
this.__cnv.height = 2;
this.__ctx = this.__cnv.getContext('2d');
this.__texture = new THREE.Texture(this.__cnv); //renders straight from a canvas
this.__material = {};
this.__reset();
this.material = new THREE.MeshBasicMaterial({ map: this.__texture });
this.el.sceneEl.addBehavior(this);
this.__addPublicFunctions();
return this.material;
},
/**
* Update or create material.
* @param {object|null} oldData
*/
update: function update(oldData) {
log('update', oldData);
this.__updateMaterial(oldData);
this.__updateTexture(oldData);
return this.material;
},
/**
* Called on each scene tick.
* @protected
*/
tick: function tick(t) {
if (!this.__frames || this.paused()) return;
if (Date.now() - this.__startTime >= this.__nextFrameTime) {
this.nextFrame();
}
},
/*================================
= material =
================================*/
/**
* Updating existing material.
* @param {object} data - Material component data.
*/
__updateMaterial: function __updateMaterial(data) {
var material = this.material;
var newData = this.__getMaterialData(data);
Object.keys(newData).forEach(function (key) {
material[key] = newData[key];
});
},
/**
* Builds and normalize material data, normalizing stuff along the way.
* @param {Object} data - Material data.
* @return {Object} data - Processed material data.
*/
__getMaterialData: function __getMaterialData(data) {
return {
fog: data.fog,
color: new THREE.Color(data.color)
};
},
/*==============================
= texure =
==============================*/
/**
* set texure
* @private
* @param {Object} data
* @property {string} status - success / error
* @property {string} src - src url
* @property {array} times - array of time length of each image
* @property {number} cnt - total counts of gif images
* @property {array} frames - array of each image
* @property {Date} timestamp - created at the texure
*/
__setTexure: function __setTexure(data) {
log('__setTexure', data);
if (data.status === 'error') {
warn('Error: ' + data.message + '\nsrc: ' + data.src);
this.__reset();
} else if (data.status === 'success' && data.src !== this.__textureSrc) {
this.__reset();
/* Texture added or changed */
this.__ready(data);
}
},
/**
* Update or create texure.
* @param {Object} data - Material component data.
*/
__updateTexture: function __updateTexture(data) {
var src = data.src;
var autoplay = data.autoplay;
/* autoplay */
if (typeof autoplay === 'boolean') {
this.__autoplay = autoplay;
} else if (typeof autoplay === 'undefined') {
this.__autoplay = true;
}
if (this.__autoplay && this.__frames) {
this.play();
}
/* src */
if (src) {
this.__validateSrc(src, this.__setTexure.bind(this));
} else {
/* Texture removed */
this.__reset();
}
},
/*=============================================
= varidation for texure =
=============================================*/
__validateSrc: function __validateSrc(src, cb) {
/* check if src is a url */
var url = parseUrl(src);
if (url) {
this.__getImageSrc(url, cb);
return;
}
var message = void 0;
/* check if src is a query selector */
var el = this.__validateAndGetQuerySelector(src);
if (!el || (typeof el === 'undefined' ? 'undefined' : _typeof(el)) !== 'object') {
return;
}
if (el.error) {
message = el.error;
} else {
var tagName = el.tagName.toLowerCase();
if (tagName === 'video') {
src = el.src;
message = 'For video, please use `aframe-video-shader`';
} else if (tagName === 'img') {
this.__getImageSrc(el.src, cb);
return;
} else {
message = 'For <' + tagName + '> element, please use `aframe-html-shader`';
}
}
/* if there is message, create error data */
if (message) {
(function () {
var srcData = gifData[src];
var errData = createError(message, src);
/* callbacks */
if (srcData && srcData.callbacks) {
srcData.callbacks.forEach(function (cb) {
return cb(errData);
});
} else {
cb(errData);
}
/* overwrite */
gifData[src] = errData;
})();
}
},
/**
* Validate src is a valid image url
* @param {string} src - url that will be tested
* @param {function} cb - callback with the test result
*/
__getImageSrc: function __getImageSrc(src, cb) {
var _this = this;
/* if src is same as previous, ignore this */
if (src === this.__textureSrc) {
return;
}
/* check if we already get the srcData */
var srcData = gifData[src];
if (!srcData || !srcData.callbacks) {
/* create callback */
srcData = gifData[src] = { callbacks: [] };
srcData.callbacks.push(cb);
} else if (srcData.src) {
cb(srcData);
return;
} else if (srcData.callbacks) {
/* add callback */
srcData.callbacks.push(cb);
return;
}
var tester = new Image();
tester.crossOrigin = 'Anonymous';
tester.addEventListener('load', function (e) {
/* check if it is gif */
_this.__getUnit8Array(src, function (arr) {
if (!arr) {
onError('This is not gif. Please use `shader:flat` instead');
return;
}
/* parse data */
(0, _gifsparser.parseGIF)(arr, function (times, cnt, frames) {
/* store data */
var newData = { status: 'success', src: src, times: times, cnt: cnt, frames: frames, timestamp: Date.now() };
/* callbacks */
if (srcData.callbacks) {
srcData.callbacks.forEach(function (cb) {
return cb(newData);
});
/* overwrite */
gifData[src] = newData;
}
}, function (err) {
return onError(err);
});
});
});
tester.addEventListener('error', function (e) {
return onError('Could be the following issue\n - Not Image\n - Not Found\n - Server Error\n - Cross-Origin Issue');
});
function onError(message) {
/* create error data */
var errData = createError(message, src);
/* callbacks */
if (srcData.callbacks) {
srcData.callbacks.forEach(function (cb) {
return cb(errData);
});
/* overwrite */
gifData[src] = errData;
}
}
tester.src = src;
},
/**
*
* get mine type
*
*/
__getUnit8Array: function __getUnit8Array(src, cb) {
if (typeof cb !== 'function') {
return;
}
var xhr = new XMLHttpRequest();
xhr.open('GET', src);
xhr.responseType = 'arraybuffer';
xhr.addEventListener('load', function (e) {
var uint8Array = new Uint8Array(e.target.response);
var arr = uint8Array.subarray(0, 4);
// const header = arr.map(value => value.toString(16)).join('')
var header = '';
for (var i = 0; i < arr.length; i++) {
header += arr[i].toString(16);
}
if (header === '47494638') {
cb(uint8Array);
} else {
cb();
}
});
xhr.addEventListener('error', function (e) {
log(e);
cb();
});
xhr.send();
},
/**
* Query and validate a query selector,
*
* @param {string} selector - DOM selector.
* @return {object} Selected DOM element | error message object.
*/
__validateAndGetQuerySelector: function __validateAndGetQuerySelector(selector) {
try {
var el = document.querySelector(selector);
if (!el) {
return { error: 'No element was found matching the selector' };
}
return el;
} catch (e) {
// Capture exception if it's not a valid selector.
return { error: 'no valid selector' };
}
},
/*================================
= playback =
================================*/
/**
* add public functions
* @private
*/
__addPublicFunctions: function __addPublicFunctions() {
this.el.gif = {
play: this.play.bind(this),
pause: this.pause.bind(this),
togglePlayback: this.togglePlayback.bind(this),
paused: this.paused.bind(this),
nextFrame: this.nextFrame.bind(this)
};
},
/**
* Pause gif
* @public
*/
pause: function pause() {
log('pause');
this.__paused = true;
},
/**
* Play gif
* @public
*/
play: function play() {
log('play');
this.__paused = false;
},
/**
* Toggle playback. play if paused and pause if played.
* @public
*/
togglePlayback: function togglePlayback() {
if (this.paused()) {
this.play();
} else {
this.pause();
}
},
/**
* Return if the playback is paused.
* @public
* @return {boolean}
*/
paused: function paused() {
return this.__paused;
},
/**
* Go to next frame
* @public
*/
nextFrame: function nextFrame() {
this.__draw();
/* update next frame time */
while (Date.now() - this.__startTime >= this.__nextFrameTime) {
this.__nextFrameTime += this.__delayTimes[this.__frameIdx++];
if ((this.__infinity || this.__loopCnt) && this.__frameCnt <= this.__frameIdx) {
/* go back to the first */
this.__frameIdx = 0;
}
}
},
/*==============================
= canvas =
==============================*/
/**
* clear canvas
* @private
*/
__clearCanvas: function __clearCanvas() {
this.__ctx.clearRect(0, 0, this.__width, this.__height);
this.__texture.needsUpdate = true;
},
/**
* draw
* @private
*/
__draw: function __draw() {
this.__clearCanvas(); this.__ctx.drawImage(this.__frames[this.__frameIdx], 0, 0, this.__width, this.__height); this.__texture.needsUpdate = true;
},
/*============================
= ready =
============================*/
/**
* setup gif animation and play if autoplay is true
* @private
* @property {string} src - src url
* @param {array} times - array of time length of each image
* @param {number} cnt - total counts of gif images
* @param {array} frames - array of each image
*/
__ready: function __ready(_ref) {
var src = _ref.src;
var times = _ref.times;
var cnt = _ref.cnt;
var frames = _ref.frames;
log('__ready');
this.__textureSrc = src;
this.__delayTimes = times;
cnt ? this.__loopCnt = cnt : this.__infinity = true;
this.__frames = frames;
this.__frameCnt = times.length;
this.__startTime = Date.now();
this.__width = THREE.Math.nearestPowerOfTwo(frames[0].width);
this.__height = THREE.Math.nearestPowerOfTwo(frames[0].height);
this.__cnv.width = this.__width;
this.__cnv.height = this.__height;
this.__draw();
if (this.__autoplay) {
this.play();
} else {
this.pause();
}
},
/*=============================
= reset =
=============================*/
/**
* @private
*/
__reset: function __reset() {
this.pause();
this.__clearCanvas();
this.__startTime = 0;
this.__nextFrameTime = 0;
this.__frameIdx = 0;
this.__frameCnt = 0;
this.__delayTimes = null;
this.__infinity = false;
this.__loopCnt = 0;
this.__frames = null;
this.__textureSrc = null;
}
});
/***/ },
/* 1 */
/***/ function(module, exports) {
'use strict';
/**
*
* Gif parser by @gtk2k
* https://github.com/gtk2k/gtk2k.github.io/tree/master/animation_gif
*
*/
exports.parseGIF = function (gif, successCB, errorCB) {
var pos = 0;
var delayTimes = [];
var loadCnt = 0;
var graphicControl = null;
var imageData = null;
var frames = [];
var loopCnt = 0;
if (gif[0] === 0x47 && gif[1] === 0x49 && gif[2] === 0x46 && // 'GIF'
gif[3] === 0x38 && gif[4] === 0x39 && gif[5] === 0x61) {
// '89a'
pos += 13 + +!!(gif[10] & 0x80) * Math.pow(2, (gif[10] & 0x07) + 1) * 3;
var gifHeader = gif.subarray(0, pos);
while (gif[pos] && gif[pos] !== 0x3b) {
var offset = pos,
blockId = gif[pos];
if (blockId === 0x21) {
var label = gif[++pos];
if ([0x01, 0xfe, 0xf9, 0xff].indexOf(label) !== -1) {
label === 0xf9 && delayTimes.push((gif[pos + 3] + (gif[pos + 4] << 8)) * 10);
label === 0xff && (loopCnt = gif[pos + 15] + (gif[pos + 16] << 8));
while (gif[++pos]) {
pos += gif[pos];
}label === 0xf9 && (graphicControl = gif.subarray(offset, pos + 1));
} else {
errorCB && errorCB('parseGIF: unknown label');break;
}
} else if (blockId === 0x2c) {
pos += 9;
pos += 1 + +!!(gif[pos] & 0x80) * (Math.pow(2, (gif[pos] & 0x07) + 1) * 3);
while (gif[++pos]) {
pos += gif[pos];
}var imageData = gif.subarray(offset, pos + 1);
frames.push(URL.createObjectURL(new Blob([gifHeader, graphicControl, imageData])));
} else {
errorCB && errorCB('parseGIF: unknown blockId');break;
}
pos++;
}
} else {
errorCB && errorCB('parseGIF: no GIF89a');
}
if (frames.length) {
var cnv = document.createElement('canvas');
var loadImg = function loadImg() {
frames.forEach(function (src, i) {
var img = new Image();
img.onload = function (e, i) {
if (i === 0) {
cnv.width = img.width;
cnv.height = img.height;
}
loadCnt++;
frames[i] = this;
if (loadCnt === frames.length) {
loadCnt = 0;
imageFix(1);
}
}.bind(img, null, i);
img.src = src;
});
};
var imageFix = function imageFix(i) {
var img = new Image();
img.onload = function (e, i) {
loadCnt++;
frames[i] = this;
if (loadCnt === frames.length) {
cnv = null;
successCB && successCB(delayTimes, loopCnt, frames);
} else {
imageFix(++i);
}
}.bind(img);
img.src = cnv.toDataURL('image/gif');
};
loadImg();
}
};
/***/ }
/******/ ]);
This provider should output the application bundle as a UTF-8 string for the frontend.
We need HTML templates for generating Location-based application.
I'm trying to get extended image tracking for WebAR using AR.js, but I have been unable to find much documented information about it online. One page says that it is not supported, but it's a post from 5 years ago, so it's not reliable information. My apologies if I've missed any updates. If it helps at all, here's the link to my code:
If the frontend calls the methods to generate templates with scale/rotation properties, the backend should handle them and translate them into A-Frame properties in the template.
It should be possible to have the minified (and non-minified) version of the lib available on this repository, so it can be retrieved using services like https://raw.githack.com/ and imported into other libraries (like studio, the frontend project).
What do you think @le0m ?
This will avoid manually importing the minified library into the studio project.
Create a set of initial and mandatory utility methods, each with a signature specifying at least its input parameters
Nice to have on the user repo, created with AR.js Studio:
package.serve when packageType is github
const barcodeMarkerSVG = MarkerModule.getBarcodeMarkerSVGDataURI(MATRIX_3X3_HAMMING_63, 8);
Error: barcode value out of range: 8
pattFileDownload.click(); // trigger download
[object Promise]
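The `[object Promise]` output suggests the returned value was used as a string directly; the generator methods are asynchronous, so the Promise has to be awaited (or chained with `.then`) before using the result. A sketch, with illustrative element names:

```javascript
// Awaiting the Promise yields the actual SVG data URI string;
// interpolating the Promise itself produces "[object Promise]".
async function downloadBarcodeMarker() {
  const svgDataURI = await MarkerModule.getBarcodeMarkerSVGDataURI(
    MATRIX_3X3_HAMMING_63,
    2 // the value must be inside the matrix type's valid range
  );
  pattFileDownload.href = svgDataURI;
  pattFileDownload.click(); // trigger download
}
```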
The automated release from the master branch failed. 🚨
I recommend you give this issue a high priority, so other packages depending on you could benefit from your bug fixes and new features.
You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I'm sure you can resolve this 💪.
Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find explanation and guidance to help you to resolve it.
Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the master
branch. You can also manually restart the failed CI job that runs semantic-release.
If you are not sure how to resolve this, here are some links that can help you:
If those don't help, or if this issue is reporting something you think isn't right, you can always ask the humans behind semantic-release.
An npm token must be created and set in the NPM_TOKEN environment variable on your CI environment.
Please make sure to create an npm token and to set it in the NPM_TOKEN environment variable on your CI environment. The token must allow publishing to the registry https://registry.npmjs.org/.
Good luck with your project ✨
Your semantic-release bot 📦🚀
Pinch & Zoom to rotate/scale the 3D model.
See: https://github.com/fcor/arjs-gestures
To be added only to the marker experience for now.
When the user deploys a project to GitHub, we should send them an email with the URL.
Currently the backend library just returns the URL to the frontend.
The email content is TBD; it'll contain a disclaimer and information about how to request deletion of the project (since we will use the ARjs Studio org to publish to GitHub).
The audio template should be changed to the following, as the current ones actually do not work:
// blabla imports...
<script>
AFRAME.registerComponent('registerevents', {
init: function () {
var marker = this.el;
var sound = document.querySelector('#sound-entity');
marker.addEventListener('markerFound', function() {
sound.components.sound.playSound();
})
}
});
</script>
<body style='margin : 0px; overflow: hidden;'>
<a-scene
vr-mode-ui='enabled: false;'
renderer="logarithmicDepthBuffer: true;"
embedded arjs='trackingMethod: best; sourceType: webcam; debugUIEnabled: false;'>
<a-assets>
<a-asset-item id="sound" src="<sound-file-url>" response-type="arraybuffer"></a-asset-item>
</a-assets>
<a-marker
// blabla same as pattern marker for other content
registerevents
emitevents="true">
</a-marker>
<a-entity id="sound-entity" sound="src: #sound" autoplay="false"></a-entity>
<a-entity camera></a-entity>
</a-scene>
</body>
Implement the pattern marker generator behaviour in a method.
Given parameters for the input image (base64 maybe?) and others like border thickness and color, it outputs the .patt file and a .png (base64 again?) of the generated marker.
The behaviour is practically the same as: https://jeromeetienne.github.io/AR.js/three.js/examples/marker-training/examples/generator.html
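For reference, the .patt format written by that generator is essentially the marker's inner image downsampled to a 16×16 grid, dumped as integers 0–255 for three color channels across four rotations. A hedged sketch of the encoding step (the BGR channel order and exact layout are assumptions based on common artoolkit pattern files; verify against real generator output):

```javascript
// Encode one 16x16 RGB sample grid into .patt text.
// grid[y][x] = [r, g, b]; the file stores 4 rotations of 3 channel blocks.
function encodePatt(grid) {
  const rotations = [];
  let g = grid;
  for (let rot = 0; rot < 4; rot++) {
    const lines = [];
    for (let channel = 2; channel >= 0; channel--) { // assumed b, g, r order
      for (let y = 0; y < 16; y++) {
        lines.push(g[y].map((px) => String(px[channel]).padStart(3)).join(' '));
      }
    }
    rotations.push(lines.join('\n'));
    g = rotate90(g); // next rotation of the same grid
  }
  return rotations.join('\n\n') + '\n';
}

// Rotate a square grid 90 degrees clockwise.
function rotate90(grid) {
  const n = grid.length;
  return grid.map((row, y) => row.map((_, x) => grid[n - 1 - x][y]));
}
```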
We do not have the source code of the barcode marker generator. But don't worry:
https://github.com/nicolocarpignoli/artoolkit-barcode-markers-collection
That repository contains all the barcode markers a user can ask for.
We just have to choose a subset of those, maybe 3 sets, looking for the best ratio between high Hamming distance and number of markers.
Then we have to include those PNG files in the backend project and respond to a user request by giving them the barcode file (base64).