
Comments (7)

truemedian commented on September 23, 2024

So, this is mostly just an artifact of how libuv's async_t works. Async callbacks are called on the event loop.

This leads to a few different things:
a) async callbacks are called on the main thread.
b) async callbacks are queued in the event loop, so they must be called synchronously.
c) async callbacks will not be called until the event loop starts running.
d) an open async handle will prevent run from returning.

The description of async_t hints at this behavior:

Async handles allow the user to "wakeup" the event loop and get a callback called from another thread.

If the event loop is sleeping (like waiting for IO), async_send will cause a tick so that the callback can be called nearly immediately (on that tick).
However, it's mostly intended to let other (non-main) threads call a function on the main thread.
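
For concreteness, a minimal sketch (mine, not from the comment above) of points (c) and (d): the async callback only fires once the event loop is running, and an open async handle keeps uv.run() from returning.

local uv = require("luv")

local async
async = uv.new_async(function()
    -- Runs on the loop (main) thread, only while uv.run() is running.
    print("async callback fired")
    async:close()  -- without this, uv.run() would never return
end)

async:send()       -- queues the callback; nothing happens yet
print("before run")
uv.run()           -- the callback fires during this call
print("after run")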

An Addendum

If you're looking for a function to run completely separately (on a different thread), then you can use either:

  • work_t: which will offload the function to libuv's threadpool (and may not be called immediately if the pool is very busy).
  • thread_t: which will spawn a new OS-level thread to call the function.

Just be aware that nearly all Lua implementations (lua-lanes notwithstanding) are not re-entrant, so you can only pass threadargs between them (which notably does not include tables or coroutines); see the sketch below.
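
A rough sketch of the thread_t option (mine, not from the comment above): uv.new_thread spawns an OS thread, only threadargs (here, a string) cross into it, and the handle can be joined.

local uv = require("luv")

local thread = uv.new_thread(function(name)
    -- This runs in a separate OS thread with its own Lua state, so it
    -- only sees the threadargs passed below, not our locals or tables.
    print("hello from " .. name)
end, "worker")

uv.thread_join(thread)  -- block until the thread's function returns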


miversen33 commented on September 23, 2024

I had a feeling that was what async was doing; the first handful of points I knew. I didn't know async prevents run from returning, though that does explain the behavior I am seeing here.

Spawning an OS thread is an option, though IMO a very expensive one. I will investigate the threadpool stuff :)

The documentation for work_t is also scarce; is there further documentation on it somewhere? Mainly I want to know: can I "join" against a work_t handle, i.e. start it and then wait for it to complete? I am aware that goes against trying to run work asynchronously; I am thinking more of a "pool"/"group" context where I have multiple work_t handles that I start up and then wait for them all to complete before returning.


truemedian commented on September 23, 2024

It's a bit more complicated and involved. new_work takes a work_callback (which is what does the actual work) and an after_work_callback (which gets called after the work is done).

work_callback will be called on a different thread (with the arguments to queue_work).
after_work_callback will be called on the main thread (with the returns from the work_callback).

Using this you could create a bit of code that yields the current coroutine when you queue the work and then resumes it in the after_work_callback.
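
A minimal sketch of that pattern (mine, with a hypothetical await_work helper; not from the comment above), assuming the new_work/queue API used elsewhere in this thread:

local uv = require("luv")

-- Queue work_fn on the threadpool and suspend the calling coroutine
-- until the after_work callback resumes it with work_fn's returns.
local function await_work(work_fn, ...)
    local co = coroutine.running()
    local ctx = uv.new_work(
        work_fn,            -- runs on a threadpool thread (separate Lua state)
        function(...)       -- runs back on the main/loop thread
            coroutine.resume(co, ...)
        end
    )
    ctx:queue(...)
    return coroutine.yield()
end

coroutine.wrap(function()
    local result = await_work(function(n) return n * 2 end, 21)
    print("work returned", result)  -- prints: work returned 42
end)()

uv.run()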


miversen33 commented on September 23, 2024

That makes sense to me. I appreciate you! I will close this out as I have the answers I am looking for.


miversen33 commented on September 23, 2024

So I hate to be that person who re-opens issues; I thought I had what I needed from our conversation earlier. However, after dinner and a bit of testing, I am running into very similar behavior to what I was seeing with the async code above.

My code

local luv = require("luv")

local func1 = function(params)
    print(params)
    local max_loop = 1000000000
    local _ = 0
    while _ < max_loop do _ = _ + 1 end
    return 'dun'
end
local func2 = function(params)
    print(params)
    local max_loop = 1000
    local _ = 0
    while _ < max_loop do _ = _ + 1 end
    return 'dun'
end

local func1work = luv.new_work(func1, function(result) print(string.format("func1 complete! %s", result)) end)
local func2work = luv.new_work(func2, function(result) print(string.format("func2 complete! %s", result)) end)
print("Starting background operations")
func1work:queue("hello")
func2work:queue("world")

print("Running long operation in foreground while we wait")
local max_loop = 1000
local _ = 0
luv.run()
while _ < max_loop do
    _ = _ + 1
end
print("Completed foreground operation")

My expected output

Starting background operations
Running long operation in foreground while we wait
hello
world
func2 complete! dun
Completed foreground operation
func1 complete! dun

Actual output

Starting background operations
Running long operation in foreground while we wait
hello
world
func2 complete! dun
func1 complete! dun
Completed foreground operation

It seems that the work_t handles are also not allowing run to return, which means that my "main" workload (the loop outside the handles in this case) is still being blocked by the work that should be running in the background. Am I missing something here?


truemedian commented on September 23, 2024

Sorry if this wasn't clear: luv.run() runs the libuv event loop until there is nothing left in the queue. That means that everything that might call a callback must be finished before run can return.

This is mostly just a limitation of event loops. To call a callback, the event loop must be running.

You should basically never be writing code after luv.run() that isn't cleanup code that should happen before the process exits.
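
As an aside (not part of the original reply), the test above could be rearranged so that only cleanup follows luv.run():

-- func1work/func2work as defined in the test above.
func1work:queue("hello")
func2work:queue("world")

-- Do the "foreground" loop before handing control to the event loop...
print("Running long operation in foreground while we wait")
local count = 0
while count < 1000 do count = count + 1 end
print("Completed foreground operation")

-- ...then block in luv.run() so the after_work callbacks can fire.
luv.run()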


miversen33 commented on September 23, 2024

Ahh, that makes sense! In this case, I am testing something that will (eventually) run inside a program that manages the libuv event loop for me, so it sounds like I may not have to worry about that. To avoid prematurely closing this out (again), I will report back once I have tested it in said program to confirm that it does indeed work without me having to worry about run :)

Edit: Tested and indeed you are correct! My above issue is due to me managing the libuv event loop, and since the program I am writing for manages it for me, this works perfectly. Thank you!

