
Comments (3)

spaghettidba commented on June 20, 2024

I'm not really sure that you can avoid this type of error completely. There will always be a number of events that you will not be able to replay, especially at the very beginning of the replay itself.
The ideal solution depends on what you are trying to achieve: are you comparing production with test, or test with test? What is the purpose of the replay?
Depending on the answer, you could:

  1. Ignore the errors. If you are comparing a first replay on test with a second replay on test, both replays will hit the same errors, so the comparison is still fair.
  2. Remove the offending events from the captured workload. If you need the replay to contain zero errors, you could open the source .sqlite file with a SQLite client such as DB Browser for SQLite and delete the events that are causing trouble; you could even decide to delete the first 5 or more minutes of events (see the sketch after this list). If this solution suits you, I could add a property that makes the replay skip a certain number of minutes or events before enqueueing events for replay.
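
To illustrate option 2, here is a minimal sketch of trimming the first five minutes of events directly in the capture file. The table and column names (Events, start_time) are assumptions for this example, so check the actual schema of your .sqlite file in DB Browser for SQLite first, and always work on a copy of the file.

```sql
-- Hypothetical sketch: remove the first 5 minutes of captured events.
-- Assumes an Events table with a start_time column; verify the real schema first.

-- Inspect the capture start time.
SELECT MIN(start_time) FROM Events;

-- Delete everything recorded within 5 minutes of the start.
DELETE FROM Events
WHERE start_time < (SELECT DATETIME(MIN(start_time), '+5 minutes') FROM Events);

-- Reclaim the space freed by the deleted rows.
VACUUM;
```

If start_time is stored as something other than a text timestamp (for example .NET ticks), the comparison has to be adjusted accordingly.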

Hope this helps


JakubKad commented on June 20, 2024

Hello,

After a while we have reached a conclusion. We used a marked transaction (BEGIN TRANSACTION ... WITH MARK) to get a reference point. Using the mark, we restored the backup and the logs to that point and started the replay from the beginning (when the capture started); a sketch of the approach follows below. One quick question at the end: the little files that remain after the capture has concluded, do they contain data from the main TEMP file and are therefore used to inject some small samples, or do they just perform a check of the capture (.sqlite) file?
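
For reference, a minimal sketch of the marked-transaction approach described above. The database name, backup paths, and the marker table are placeholders for this example, and note that a marked transaction must perform at least one write for the mark to be recorded in the log.

```sql
-- On the source server, just before starting the capture:
-- the transaction must modify data for the mark to reach the transaction log.
BEGIN TRANSACTION CaptureStart WITH MARK 'WorkloadTools capture start';
    UPDATE dbo.ReplayMarker SET MarkedAt = SYSDATETIME();  -- hypothetical marker table
COMMIT TRANSACTION CaptureStart;

-- On the test server, restore up to the mark so the data matches the capture start:
RESTORE DATABASE MyDatabase
    FROM DISK = N'\\backupshare\MyDatabase_full.bak'
    WITH NORECOVERY, REPLACE;

RESTORE LOG MyDatabase
    FROM DISK = N'\\backupshare\MyDatabase_log.trn'
    WITH STOPATMARK = 'CaptureStart', RECOVERY;
```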

The capture creates a big TEMP file; these cache files are still there after the capture has concluded.

Edit: I know there is a CACHE option in the JSON (we have not defined it), but I am not entirely sure of the purpose of these files. There were around 100 of them before we shut everything down for good, because the console was not writing anything anymore and only these CACHE files kept appearing (we used the timer option in the JSON).

[screenshot: cache files left behind after the capture]


spaghettidba commented on June 20, 2024

Glad you sorted it out!
Regarding the cache files: those are for caching events to disk before processing them. The events queue is a memory-mapped file and needs the cache to avoid using up all the memory on the host. I'm surprised those files don't get deleted, though; that is not the intended behavior. Thanks for reporting it, I'll have a look at the code.

