
fal's People

Contributors

badayvedat, burkaygur, chamini2, drochetti, efiop, fal-bot, i64, isidentical, mederka, squat, turbo1912


fal's Issues

Show better HTTP failure details

A user sent this stack trace

HTTPStatusError                           Traceback (most recent call last)
Cell In[11], line 5
      2 recursive_interplation_passes = 1
      3 output_fps = 32 * (2**(recursive_interplation_passes))
----> 5 handler = fal_client.submit(
      6     "fal-ai/amt-interpolation",
      7     arguments={
      8         "video_url": public_url,
      9         "output_fps": output_fps,
     10         'recursive_interpolation_passes': recursive_interplation_passes,
     11     },
     12 )
     14 log_index = 0
     15 for event in handler.iter_events(with_logs=True):

File .../python3.11/site-packages/fal_client/client.py:356, in SyncClient.submit(self, application, arguments, path)
    349     url += "/" + path.lstrip("/")
    351 response = self._client.post(
    352     url,
    353     json=arguments,
    354     timeout=self.default_timeout,
    355 )
--> 356 response.raise_for_status()
    358 data = response.json()
    359 return SyncRequestHandle(
    360     request_id=data["request_id"],
    361     response_url=data["response_url"],
   (...)
    364     client=self._client,
    365 )

File .../python3.11/site-packages/httpx/_models.py:761, in Response.raise_for_status(self)
    759 error_type = error_types.get(status_class, "Invalid status code")
    760 message = message.format(self, error_type=error_type)
--> 761 raise HTTPStatusError(message, request=request, response=self)

HTTPStatusError: Client error '403 Forbidden' for url 'https://queue.fal.run/fal-ai/amt-interpolation'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403

The httpx raise_for_status method gives very little detail when it fails. We should take the response's content and show that to the user instead. For example, for a 403 the API returns:

{"detail": "User is locked. Reason: Insufficient funds."}

support chained exceptions

Having support for chained exceptions would be great for internal and external users.

The way we currently handle user exceptions is that we save the exc object (without __cause__ or __traceback__), a was_it_raised flag, and a stringified traceback (most of this happens in isolate).

We should be able to just sprinkle tblib's traceback/exception pickle and unpickle support onto dill, and that should get us everything we want. From what I can tell this would also make the stringified-traceback field in isolate redundant, but that needs to be checked.
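For reference, tblib's documented usage looks roughly like this (shown with stdlib pickle; dill sits on top of the same copyreg machinery, so it should pick these reducers up too, but that is the part to verify):

import pickle
import sys

from tblib import pickling_support

# Make traceback objects picklable so the whole (type, value, traceback)
# triple can cross the process boundary.
pickling_support.install()

try:
    1 / 0
except ZeroDivisionError:
    payload = pickle.dumps(sys.exc_info())

exc_type, exc_value, tb = pickle.loads(payload)
raise exc_value.with_traceback(tb)  # re-raised with the original frames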

Pre-requisite for #140

Gradio demo for fal-ai

I want to make a Gradio demo for fal-ai, showing its capabilities interactively. Is there a need for this? Can I contribute?

fal: add ability to run apps in code

Right now you need to find and hit the endpoint yourself, but it would be nice to be able to start and use the app natively in the code. E.g.

app.start()
app.endpoint_method(arg1, arg2)

We need to investigate how hard it would be to convert Python inputs to JSON for the endpoint and then convert the results back, but maybe we could just use dicts...
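Until then, the HTTP side of this could be approximated by hand with something like the sketch below (the gateway URL is a placeholder and the payload is just a dict; this is not a proposed fal API):

import httpx

def call_app(endpoint_url: str, **arguments):
    # Serialize the Python keyword arguments as JSON, post them to the served
    # app, and return the decoded JSON response as a plain dict.
    response = httpx.post(endpoint_url, json=arguments, timeout=60)
    response.raise_for_status()
    return response.json()

# Hypothetical usage against a served app:
result = call_app("https://<app-id>.gateway.alpha.fal.ai", name="world")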

Publish Releases on Git Tag

Right now, releases are published to PyPI by manually triggering the release.yml workflow. However, I think it would be more semantic, and a best practice, to use Git tags to mark releases of the project and then use the tag event in CI to automatically trigger the steps needed to publish.

I think this is better because it:

  • clearly indicates to the public what versions of the program exist, i.e. someone can look at all of the tags of the repo to know what versions have been released;
  • allows for signing versions; and
  • allows for checking out a specific version of the code using the release name.

pydantic 2 migration issues

Tested with the diff main...matteo/pydantic2-test

I was trying out what the migration to Pydantic 2 could look like and I get the following:

from __future__ import annotations

from fal import cached, function


@cached
def my_cached_function():
    import datetime

    print("Cached function")
    return datetime.datetime.now()


@function(
    machine_type="S",
    serve=True,
)
def hello_query(name: str):
    started_at = my_cached_function()
    print(started_at, f"Messaged by: {name}")
    return f"Hello, {name}"

# Now with pydantic dataclasses for the payload
from pydantic.dataclasses import dataclass

@dataclass
class HelloModel:
    name: str

@function(
    machine_type="S",
    serve=True,
)
def hello_dataclass(data: HelloModel):
    started_at = my_cached_function()
    print(started_at, f"Messaged by: {data.name}")
    return f"Hello, {data.name}"

Running the query version works as expected and the served endpoint can be used:

❯ fal fn run t/other.py hello_query

...

2024-01-12 18:23:30.971 [info     ] Installing collected packages: typing-extensions, tblib, sniffio, protobuf, platformdirs, idna, h11, grpcio, dill, click, annotated-types, uvicorn, pydantic-core, isolate, anyio, starlette, pydantic, fastapi
2024-01-12 18:23:33.576 [info     ] Successfully installed annotated-types-0.6.0 anyio-4.2.0 click-8.1.7 dill-0.3.7 fastapi-0.109.0 grpcio-1.60.0 h11-0.14.0 idna-3.6 isolate-0.12.3 platformdirs-4.1.0 protobuf-4.25.2 pydantic-2.5.3 pydantic-core-2.14.6 sniffio-1.3.0 starlette-0.35.1 tblib-3.0.0 typing-extensions-4.9.0 uvicorn-0.25.0
2024-01-12 18:23:33.850 [info     ]
2024-01-12 18:23:33.850 [info     ] [notice] A new release of pip is available: 23.3.1 -> 23.3.2
2024-01-12 18:23:33.850 [info     ] [notice] To update, run: python -m pip install --upgrade pip
2024-01-12 18:23:35.990 [stdout   ] Compression complete. Uploading it...
2024-01-12 18:23:36.127 [stdout   ] Upload complete.
2024-01-12 18:23:36.128 [stdout   ]
2024-01-12 18:23:57.195 [info     ] Access your exposed service at https://1774a50b-9241-4db0-ab5b-0457cc6a5b9f.gateway.alpha.fal.ai
2024-01-12 18:23:59.430 [stderr   ] INFO:     Started server process [42]
2024-01-12 18:23:59.431 [stderr   ] INFO:     Waiting for application startup.
2024-01-12 18:23:59.432 [stderr   ] INFO:     Application startup complete.
2024-01-12 18:23:59.432 [stderr   ] INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
2024-01-12 18:28:33.325 [stdout   ] INFO:     10.52.4.70:42250 - "GET / HTTP/1.1" 405 Method Not Allowed
2024-01-12 18:28:52.066 [stdout   ] INFO:     10.52.4.70:48494 - "POST / HTTP/1.1" 422 Unprocessable Entity
2024-01-12 18:28:57.409 [stdout   ] Cached function
2024-01-12 18:28:59.038 [stdout   ] 2024-01-12 18:28:57.409594 Messaged by: matteo
2024-01-12 18:28:59.039 [stdout   ] INFO:     10.52.4.70:56564 - "POST /?name=matteo HTTP/1.1" 200 OK

But when I try to use the dataclass version I get an error

❯ fal fn run t/other.py hello_dataclass
2024-01-12 18:30:44.615 [info     ] Access your exposed service at https://4c793170-2fe2-4fbe-a633-64d52d1e704f.gateway.alpha.fal.ai
Error while deserializing the given object

Note that this deserialization error seems to happen locally (during gRPC deserialization?).

Specifically, running the above command with --debug I get

...
│ .../python3.11/site-packages/isolate/connections │
│ /grpc/interface.py:29 in _                                                                       │
│                                                                                                  │
│   26                                                                                             │
│   27 @from_grpc.register                                                                         │
│   28 def _(message: definitions.SerializedObject) -> Any:                                        │
│ ❱ 29 │   return load_serialized_object(                                                          │
│   30 │   │   message.method,                                                                     │
│   31 │   │   message.definition,                                                                 │
│   32 │   │   was_it_raised=message.was_it_raised,                                                │
│                                                                                                  │
│ .../python3.11/site-packages/isolate/connections │
│ /common.py:76 in load_serialized_object                                                          │
│                                                                                                  │
│    73 │   │   │   importlib.import_module(serialization_method)                                  │
│    74 │   │   )                                                                                  │
│    75 │                                                                                          │
│ ❱  76 │   with _step("deserializing the given object"):                                          │
│    77 │   │   result = serialization_backend.loads(raw_object)                                   │
│    78 │                                                                                          │
│    79 │   if was_it_raised:                                                                      │
│                                                                                                  │
│ .../python3.11/contextlib.py:155 in __exit__                │
│                                                                                                  │
│   152 │   │   │   │   # tell if we get the same exception back                                   │
│   153 │   │   │   │   value = typ()                                                              │
│   154 │   │   │   try:                                                                           │
│ ❱ 155 │   │   │   │   self.gen.throw(typ, value, traceback)                                      │
│   156 │   │   │   except StopIteration as exc:                                                   │
│   157 │   │   │   │   # Suppress StopIteration *unless* it's the same exception that             │
│   158 │   │   │   │   # was passed to throw().  This prevents a StopIteration                    │
│                                                                                                  │
│ .../python3.11/site-packages/isolate/connections │
│ /common.py:42 in _step                                                                           │
│                                                                                                  │
│    39 │   try:                                                                                   │
│    40 │   │   yield                                                                              │
│    41 │   except BaseException as exception:                                                     │
│ ❱  42 │   │   raise SerializationError("Error while " + message) from exception                  │
│    43                                                                                            │
│    44                                                                                            │
│    45 def as_serialization_method(backend: Any) -> SerializationBackend:                         │
...
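One way to narrow this down might be to round-trip the payload type through dill outside of isolate/gRPC; whether this probe fails locally is exactly the open question, but if it does, the problem is in the serializer rather than the transport (a minimal sketch, not from the issue):

import dill
from pydantic.dataclasses import dataclass

@dataclass
class HelloModel:
    name: str

# If the pydantic v2 dataclass itself is what dill cannot round-trip, this
# should reproduce the failure without involving isolate or gRPC at all.
payload = dill.dumps(HelloModel(name="matteo"))
print(dill.loads(payload))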

revisit release process

The current release actions push to PyPI right away and then create a PR to bump to the next alpha, which is quite confusing. I'll revisit this to make it a bit more practical.

migrate to cloudpickle from dill

This also makes the topic of migrating to cloudpickle (which I saw discussed in the past in a few places) relevant again, as it would be nice to dump our in-fal things by value (e.g. exceptions, so that we don't have to deploy our library version to the worker) but load them back by reference (so things like isinstance work). Patching dill to handle that would probably not work, as we'd need it server-side as well (e.g. a new fal client wouldn't be able to work with an older server).

Originally posted by @efiop in #141 (comment)
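For reference, cloudpickle already exposes a knob for the by-value half of this: register_pickle_by_value() forces everything in a given module to be serialized by value instead of by reference; the load-back-by-reference half is the open question above. A sketch with a hypothetical module name:

import cloudpickle

import fal_errors  # hypothetical module holding our in-fal exception types

# Everything defined in fal_errors is now pickled by value, so the receiving
# side does not need a matching fal version installed to unpickle it.
cloudpickle.register_pickle_by_value(fal_errors)

blob = cloudpickle.dumps(fal_errors.UserFunctionException("boom"))  # hypothetical class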

Filter `fal alias list`

Right now, if I want to get some information about a serving app, such as the keep_alive length, the only way to do it is fal alias list. It would be great to be able to filter by app alias.

Bonus points: filter by regex or other column values. For example, I might want to list all the shared apps that I have.
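To sketch what the bonus-point filtering could mean in practice (the row shape and column names here are made up, not the real fal alias list output):

import re

def filter_aliases(rows, pattern, column="alias"):
    # Keep only the rows whose chosen column matches the regex.
    regex = re.compile(pattern)
    return [row for row in rows if regex.search(str(row.get(column, "")))]

rows = [
    {"alias": "myfastsdxl", "auth": "shared", "keep_alive": 300},
    {"alias": "internal-app", "auth": "private", "keep_alive": 60},
]
print(filter_aliases(rows, pattern="shared", column="auth"))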

get fal on conda

This is just a matter of time until someone asks for it, and it takes some time for the submission to be reviewed and merged. I'll start the process, and once we are there I'll add automation so we can avoid worrying about it most of the time.

For the record: this is super low priority

fal: introduce fal.yaml

E.g. something like having a fal.yaml in your project's root with something like this:

functions:
- name: FastSDXL
  module: fastsdxl.export
  alias: myfastsdxl
  auth: public

This could spare you from remembering what apps you have and what commands to run to deploy them. Commands like fal fn run/serve (and I guess #173 deploy?) could use it. Open to any suggestions on how it should look and what it could include.
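For illustration, reading such a file could be as small as the sketch below (uses PyYAML; neither the file name nor the fields are a committed schema):

import yaml  # PyYAML

# Load fal.yaml from the project root and walk the declared functions.
with open("fal.yaml") as config_file:
    config = yaml.safe_load(config_file)

for entry in config.get("functions", []):
    print(entry["name"], entry["module"], entry.get("alias"), entry.get("auth"))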

forbid nested isolated functions

If one tries to call one isolated function from another isolated function, we currently might fail with a deserialization error (e.g. the error message matching r"accessed through 'test1'"), but depending on how pickling goes we might fail to serialize it in the first place. It would be nice to just raise a proper exception instead. E.g. we could use a global flag to raise when an isolated function tries to run another isolated function, or we could go the code-inspection route and hunt for IsolatedFunction instances in the function code, or we could do something on the isolate server side to forbid nesting.

Clarification: we probably don't want to limit someone using an isolated function in another library, but we definitely want to limit isolated functions in pickled code.
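To make the global-flag idea concrete, a rough sketch could look like the following (names are made up; this is not fal's or isolate's implementation):

import contextvars

# Hypothetical guard that the worker would set around user-code execution.
_inside_isolated_call = contextvars.ContextVar("inside_isolated_call", default=False)

class NestedIsolationError(RuntimeError):
    pass

def run_isolated(fn, *args, **kwargs):
    # Refuse to start an isolated call while another one is already running
    # in this context.
    if _inside_isolated_call.get():
        raise NestedIsolationError(
            "Calling an isolated function from within another isolated "
            "function is not supported."
        )
    token = _inside_isolated_call.set(True)
    try:
        return fn(*args, **kwargs)
    finally:
        _inside_isolated_call.reset(token)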

Related #168

ui: reduce traceback to user code for user exceptions

It is surprising and confusing for users to see our code in their tracebacks when their function raises an exception; we should hide it during normal operations.

The way isolate and fal currently work, we would need to literally walk through the traceback and remove the frames that are not from the user's code. I'm also considering changing the logic in isolate's load_serialized_object to not raise but instead return something that can represent either a return value or an exception, so that we can decide later how to handle it and still have the nice original traceback to work with.
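As a rough illustration of the traceback-walking approach (how "user code" is detected is a placeholder here, not fal's actual logic):

import traceback

def format_user_traceback(exc: BaseException, user_path_prefix: str) -> str:
    # Render the traceback keeping only the frames whose file lives under the
    # user's code; everything from fal/isolate internals is dropped.
    tb = traceback.TracebackException.from_exception(exc)
    tb.stack = traceback.StackSummary.from_list(
        [frame for frame in tb.stack if frame.filename.startswith(user_path_prefix)]
    )
    return "".join(tb.format())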

This is relevant for both api and cli.

Successfully installed catboost wants to import as _catboost

I'm trying to use a model trained outside dbt to predict labels via Python under dbt-fal.

system deets:

Running with dbt=1.5.9
Registered adapter: fal=1.5.4
Registered adapter: postgres=1.5.9

fal-project.yml:

environments:
  - name: ml
    type: venv
    requirements:
      - scipy
      - pandas
      - numpy
      - statsmodels
      - catboost

catboost was just added, and the first run of the file below produced a long installation log to stdout, ending with

[builder] [info] Successfully installed [...] catboost-1.2.2 [...]

Running the Python model below with dbt run --select ... gives me the error shown after it.

from catboost import CatBoostRegressor
from pandas import concat

def model(dbt, fal):
    dbt.config(fal_environment="ml")

    df: pandas.DataFrame = dbt.ref("tr_rep_gentrification_prediction_inputs")

    X = df\
        .drop(['col0', 'col1', 'col2'], axis=1)\
        .fillna(0.0)

    model = CatBoostRegressor()
    model.load_model('cb_model.cbm')

    pred = model.predict(X)
    results = concat([df, pred], axis=0)

    return(results)

stdout:

No module named '_catboost'
22:55:01  1 of 1 ERROR creating python table model trans.tr_rep_gentrification_prediction_outputs  [ERROR in 42.02s]
22:55:02  
22:55:02  Finished running 1 table model in 0 hours 0 minutes and 58.89 seconds (58.89s).
22:55:02  
22:55:02  Completed with 1 error and 0 warnings:
22:55:02  
22:55:02  No module named '_catboost'

At the very least the module appears to be picking up a leading underscore. Why? This hasn't happened previously with any of the other imports.
