archives's People

Contributors

albertc5, r888888888

archives's Issues

Process.daemon redirects all output to /dev/null by default

When called with no arguments, Process.daemon redirects all output to /dev/null; as a result, if the service is run directly via bundle exec ruby services/sqs_processor.rb --logfile=my.log --pidfile=my.pid, no output is logged. After changing the call to Process.daemon(true, true), it runs fine and logs output as expected.
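The behavior is easy to reproduce outside the service. A minimal sketch (nothing here is from the archives codebase; it only assumes a POSIX system with ruby on the PATH):

```ruby
# Process.daemon(nochdir, noclose): with no arguments it chdirs to "/" and
# reopens stdin/stdout/stderr on /dev/null, so anything the daemonized
# process prints disappears. Process.daemon(true, true) keeps the working
# directory and the standard streams.

require "tempfile"

def run_daemonized(daemon_call)
  out = Tempfile.new("daemon_out")
  # Spawn a child Ruby that daemonizes itself and then prints a line;
  # its stdout is redirected to a temp file before the daemon call runs.
  system("ruby", "-e", "#{daemon_call}; puts 'still here'", out: out.path)
  # system() returns as soon as the intermediate parent exits, so poll
  # briefly for the detached daemon's output.
  10.times do
    break unless File.read(out.path).empty?
    sleep 0.2
  end
  File.read(out.path)
end

silent = run_daemonized("Process.daemon")             # streams -> /dev/null
kept   = run_daemonized("Process.daemon(true, true)") # streams preserved

puts "Process.daemon:             #{silent.inspect}"
puts "Process.daemon(true, true): #{kept.inspect}"
```

With the default call the temp file stays empty; with (true, true) the daemon's output survives, which matches what the --logfile setup needs.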

Disclaimer: I have no idea what sets the RUN environment variable, because googling that turned out quite difficult.

deploying

Sorry to bother you with newb stuff.

Is there any way I could get instructions for deploying this? iqdb/iqdbs and archives seem less straightforward than danbooru is.

Enforce FIFO order

I vaguely remember @r888888888 mentioning there's no guarantee that messages will be delivered/processed in the order they were pushed. Given the nature of a versioning service, this spells disaster: if two versions of the same post are delivered out of order, the diffs will be calculated and stored incorrectly.

Some thoughts:

  1. AWS supports a FIFO mode as far as I can tell; they advertise it on their front page. It should be enabled if it isn't already.
  2. A version ID can be generated from a sequence on the main Danbooru site and pushed to the queue as well. That way, if versions are delivered out of order, we at least have a chance of reordering everything properly: calculate the diff against the nearest prior version and recalculate all diffs for later versions. Version merging would interfere, though, so it's best to make sure the queue is processed in order.
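The recovery idea in the second point can be sketched in plain Ruby. The field names and diff format below are hypothetical, not the actual archives schema; the point is only that a sequence-generated version_id makes an out-of-order stream re-sortable:

```ruby
# Sketch: reorder out-of-order version messages by a sequence-generated
# version_id and recompute diffs against the nearest prior version.
# (Hypothetical payload; the real message schema may differ.)

require "json"

# Messages as they might arrive from a non-FIFO queue: v3 before v2.
messages = [
  { "version_id" => 1, "tags" => "a" },
  { "version_id" => 3, "tags" => "a b c" },
  { "version_id" => 2, "tags" => "a b" },
]

def diff(prev_tags, tags)
  old_tags = prev_tags.to_s.split
  new_tags = tags.split
  { "added" => new_tags - old_tags, "removed" => old_tags - new_tags }
end

# Store keyed by version_id; on each arrival, recompute every diff so each
# version is compared against its nearest prior version by sequence order.
store = {}
messages.each do |msg|
  store[msg["version_id"]] = msg
  ordered = store.keys.sort
  ordered.each_cons(2) do |prev_id, id|
    store[id]["diff"] = diff(store[prev_id]["tags"], store[id]["tags"])
  end
  first = ordered.first
  store[first]["diff"] = diff(nil, store[first]["tags"])
end

store.keys.sort.each { |id| puts "v#{id}: #{store[id]["diff"].to_json}" }
```

Recomputing every diff on each arrival is wasteful; a real fix would only touch versions at or after the newcomer, and would still need special handling for version merging, as noted above.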

booru_id is null

When I create a post or a pool, the operation succeeds, but Archives' database is not updated because the "booru_id" column is null.
I should probably mention that I'm using Archives without AWS: I substituted the cloud service with a local service that reads from a UNIX socket (just a Ruby server with two threads writing to and reading from a shared Queue object).
Despite that radical change, the data is sent correctly to Archives from the main Danbooru; inspecting it, I could see that all the expected fields have values except "booru_id", which is missing from the JSON and thus null.
I worked around it by creating a new migration that drops the "booru_id" column, but I don't know if this is the correct solution.
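For what it's worth, the null is exactly what Ruby's JSON handling produces for a missing key. A tiny sketch (field names other than "booru_id" are made up, not the real payload):

```ruby
require "json"

# The situation described: the JSON from Danbooru has every expected field
# except "booru_id", so the Hash lookup yields nil and the INSERT gets NULL.
payload = JSON.parse('{"post_id": 123, "updater_id": 1}')

payload["booru_id"]   # => nil (missing key), stored as NULL

# An alternative to dropping the column for a single-booru setup: default it
# when the key is absent instead of changing the schema.
booru_id = payload.fetch("booru_id", 1)
```

Defaulting keeps the schema compatible with upstream if the sender is later fixed to include the field.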
