
webgazertutorial's People

Contributors: xiaozhi2


webgazertutorial's Issues

What exact modifications to Webgazer increase the framerate?

First of all, great work! I am exploring the potential of WebGazer for cognitive research myself (a binary choice task in jsPsych) and was very happy to find your preprint (https://psyarxiv.com/qhme6/).

My eye-tracking framerate is currently between 5 and 30 Hz. I looked at your suggestions in the README and the preprint and tried three things to improve it, but none of them worked:

  1. Imported all the necessary files from your repository (webgazer.js, the webgazer-worker files, jcanvas.js, and eye-tracking.js) into my own study, and called getPrediction() with setInterval() in my own jsPsych choice-task plugin (instead of using setGazeListener()), as instructed in your README. I also tested this setup with several participants on other computers. The frame rate did not improve, and I ran into severe problems with the "clmtrackr" face tracker, which could barely detect any face. Threaded ridge regression also did not work for me, although it was discussed on the Brown HCI repo that this regression model might not add much benefit (Issue #130):
// Disable visual feedback to reduce per-frame drawing work
webgazer.showVideo(false);
webgazer.showPredictionPoints(true);
webgazer.showFaceOverlay(false);
webgazer.showFaceFeedbackBox(false);

var eyeData = { history: [] };
var starttime = performance.now();
var eye_tracking_interval = setInterval(function() {
  var pos = webgazer.getCurrentPrediction();
  if (pos) {
    var relativePosX = pos.x / screen.width;
    var relativePosY = pos.y / screen.height;
    eyeData.history.push({
      'relative-x': relativePosX,
      'relative-y': relativePosY,
      'elapse-time': performance.now() - starttime
    });
  }
}, 20); // poll every 20 ms, i.e. a target of 50 Hz
  2. Cloned your git repository, ran your experiment exactly as instructed, and printed out the data with console.log().

  3. Used the newest WebGazer version from the original authors (https://github.com/brownhci/WebGazer), the one that now uses the FaceMesh face tracker by default, and retrieved the data with getPrediction() and setInterval() instead of setGazeListener(), which should bypass the potentially unnecessary computations.

None of these methods improved the frame rate compared to simply using the current WebGazer code with setGazeListener() (https://github.com/brownhci/WebGazer), and I could not get anywhere close to 50 Hz (a 20 ms interval).
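For anyone checking what rate they actually achieve, the effective sampling rate can be computed directly from the recorded timestamps. A minimal sketch, assuming samples in the `eyeData.history` format above (the helper name is my own):

```javascript
// Estimate the effective sampling rate (Hz) from recorded samples.
// Each sample is expected to carry an 'elapse-time' field in milliseconds,
// as pushed by the setInterval callback above.
function effectiveSamplingRate(history) {
  if (history.length < 2) return 0;
  var first = history[0]['elapse-time'];
  var last = history[history.length - 1]['elapse-time'];
  // (n - 1) intervals span (last - first) milliseconds
  return (history.length - 1) * 1000 / (last - first);
}

// Example: 31 samples spread evenly over one second -> 30 Hz
var demo = [];
for (var i = 0; i <= 30; i++) {
  demo.push({ 'elapse-time': i * 1000 / 30 });
}
console.log(effectiveSamplingRate(demo)); // 30
```

Logging this at the end of each trial makes it easy to see whether a change to the setup actually moved the rate.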

Since I could not find any information in the commits, could you provide more detail on how exactly you changed the WebGazer library (which specific lines of code did you remove?), or which functions outside the library (maybe in eye-tracking.js?) bypass the unnecessary computations in webgazer.js? In your preprint you write, "The original code calls the getPrediction() function at every animation frame." What modifications did you make to the loop() function, and how is that per-frame call bypassed?

The loop() function of the most recent version of webgazer:

async function loop() {
  if (!paused) {

    // [20200617 XK] TODO: there is currently lag between the camera input and the face overlay. This behavior
    // is not seen in the facemesh demo. probably need to optimize async implementation. I think the issue lies
    // in the implementation of getPrediction().

    // Paint the latest video frame into the canvas which will be analyzed by WebGazer
    // [20180729 JT] Why do we need to do this? clmTracker does this itself _already_, which is just duplicating the work.
    // Is it because other trackers need a canvas instead of an img/video element?
    paintCurrentFrame(videoElementCanvas, videoElementCanvas.width, videoElementCanvas.height);

    // Get gaze prediction (ask clm to track; pass the data to the regressor; get back a prediction)
    latestGazeData = getPrediction();
    // Count time
    var elapsedTime = performance.now() - clockStart;


    // Draw face overlay
    if( src_webgazer.params.showFaceOverlay )
    {
      // Get tracker object
      var tracker = src_webgazer.getTracker();
      faceOverlay.getContext('2d').clearRect( 0, 0, videoElement.videoWidth, videoElement.videoHeight);
      tracker.drawFaceOverlay(faceOverlay.getContext('2d'), tracker.getPositions());
    }

    // Feedback box
    // Check that the eyes are inside of the validation box
    if( src_webgazer.params.showFaceFeedbackBox )
      checkEyesInValidationBox();

    latestGazeData = await latestGazeData;

    // [20200623 xk] callback to function passed into setGazeListener(fn)
    callback(latestGazeData, elapsedTime);

    if( latestGazeData ) {
      // [20200608 XK] Smoothing across the most recent 4 predictions, do we need this with Kalman filter?
      smoothingVals.push(latestGazeData);
      var x = 0;
      var y = 0;
      var len = smoothingVals.length;
      for (var d in smoothingVals.data) {
        x += smoothingVals.get(d).x;
        y += smoothingVals.get(d).y;
      }

      var pred = src_webgazer.util.bound({'x':x/len, 'y':y/len});

      if (src_webgazer.params.storingPoints) {
        drawCoordinates('blue',pred.x,pred.y); //draws the previous predictions
        //store the position of the past fifty occuring tracker preditions
        src_webgazer.storePoints(pred.x, pred.y, src_k);
        src_k++;
        if (src_k == 50) {
          src_k = 0;
        }
      }
      // GazeDot
      if (src_webgazer.params.showGazeDot) {
        gazeDot.style.display = 'block';
      }
      gazeDot.style.transform = 'translate3d(' + pred.x + 'px,' + pred.y + 'px,0)';
    } else {
      gazeDot.style.display = 'none';
    }

    requestAnimationFrame(loop);
  }
}

I hope to apply these modifications to the current version of WebGazer myself in order to achieve a higher framerate for my study. My computer is probably not the reason for the low frame rate: I have an Intel Core i5, 8 GB RAM, and a 60 fps webcam, and I closed all CPU-intensive programs while testing.
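As a back-of-envelope check on where the bottleneck lies: setInterval can only deliver the target rate if a single prediction finishes within the polling interval, so the achievable rate is bounded by whichever is slower. A minimal sketch (the function name and the timing numbers are illustrative, not measured):

```javascript
// Upper bound on the achievable sampling rate, given how long one
// prediction takes and the setInterval polling period (both in ms).
function achievableHz(predictionMs, intervalMs) {
  return 1000 / Math.max(predictionMs, intervalMs);
}

// A 20 ms interval targets 50 Hz, but if each prediction takes ~70 ms
// (consistent with the slow end of a 5-30 Hz trace), the tracker itself
// is the bottleneck, not the webcam or the polling interval:
console.log(achievableHz(70, 20)); // ~14.3
console.log(achievableHz(10, 20)); // 50: prediction is fast enough
```

This suggests that no polling strategy alone can reach 50 Hz unless the per-prediction compute time is brought under 20 ms.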

Thank you in advance for a quick answer!
