notebooks's People

Contributors

charliefruan, davidpissarra, hzfengsy, islavutin, jinhongyii, masterjh5574, rickzx, sbelcmu, sudeepag, tqchen

notebooks's Issues

`Tutorial_compile_llama2_with_mlc_llm.ipynb` does not work

I followed this notebook in Colab with a T4 GPU.
There are several problems:

  • The CUDA version on Colab is now 12.2 instead of 11.8, so `mlc-ai-nightly-cu118 mlc-chat-nightly-cu118` should be replaced by `mlc-ai-nightly-cu122 mlc-chat-nightly-cu122`.
  • The prebuilt libs do not work on the Colab T4 GPU (I get a `CUDA_ERROR_NO_BINARY_FOR_GPU` error). I compiled the model lib on Colab with the command `!mlc_chat compile ./dist/Llama-2-7b-chat-hf-q4f32_1-MLC/mlc-chat-config.json --device cuda -o dist/Llama-2-7b-chat-hf-q4f32_1-cuda.so` and it works for me. Please consider updating the prebuilt libs; the steps I used are consolidated below.
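For reference, a minimal sketch of the working steps, assuming the Colab runtime still ships CUDA 12.2 and that the cu122 nightly wheels are published on https://mlc.ai/wheels:

# Install nightly wheels that match Colab's CUDA 12.2 runtime
# (package names assume cu122 builds exist on the wheel index).
!pip install --pre mlc-ai-nightly-cu122 mlc-chat-nightly-cu122 -f https://mlc.ai/wheels

# Recompile the model library for the T4 GPU instead of relying on the prebuilt one.
!mlc_chat compile ./dist/Llama-2-7b-chat-hf-q4f32_1-MLC/mlc-chat-config.json --device cuda -o dist/Llama-2-7b-chat-hf-q4f32_1-cuda.so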

ModuleNotFoundError: No module named 'tvm'

Hello,

I tried to run the notebook on google colab: https://github.com/mlc-ai/notebooks/blob/main/mlc-llm/tutorial_extensions_to_more_model_variants.ipynb

I created a conda environment, activated it, installed the MLC-AI and MLC-Chat nightly packages, and finally installed the mlc-llm package. But when I try to run:

import mlc_llm
import mlc_chat
import tvm

I got the following error message:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
[<ipython-input-20-4d568b070d66>](https://localhost:8080/#) in <cell line: 1>()
----> 1 import mlc_llm
      2 import mlc_chat
      3 import tvm

2 frames
[/content/mlc-llm/mlc_llm/dispatch/dispatch_tir_operator.py](https://localhost:8080/#) in <module>
      1 # pylint: disable=missing-docstring
----> 2 import tvm
      3 from tvm import IRModule
      4 
      5 

ModuleNotFoundError: No module named 'tvm'

---------------------------------------------------------------------------
NOTE: If your import is failing due to a missing package, you can
manually install dependencies using either !pip or !apt.

To view examples of installing some common dependencies, click the
"Open Examples" button below.
---------------------------------------------------------------------------

Is there a workaround? The packages seem to be installed, judging by the following commands, which work:

!python -c "import tvm; print('tvm installed properly!')"
!python -c "import mlc_chat; print('mlc_chat installed properly!')"
tvm installed properly!
mlc_chat installed properly!
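One guess at a workaround, offered as an assumption rather than a confirmed fix: the Colab kernel may not be running inside the conda environment where the packages were installed, so installing the nightly wheels into the kernel's own interpreter might resolve the import error.

# Install the wheels into the Python interpreter that the notebook kernel actually
# uses (a conda env activated in a shell cell is not necessarily the kernel's env).
# The CPU nightlies are shown; swap in the cuXXX variants if CUDA builds are needed.
!pip install --pre mlc-ai-nightly mlc-chat-nightly -f https://mlc.ai/wheels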

Notebook 7 fails due to missing pip dependency

When attempting to run notebook 7 (Ep7: GPU and Hardware Acceleration, Part 1) in Google Colab, running the first cell (the pip install step) fails. Specifically, the first cell is as follows:

!python3 -m pip install mlc-ai-nightly-cu110 -f https://mlc.ai/wheels

However, this leads to the following error:

Looking in links: https://mlc.ai/wheels
ERROR: Could not find a version that satisfies the requirement mlc-ai-nightly-cu110 (from versions: none)
ERROR: No matching distribution found for mlc-ai-nightly-cu110
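A possible workaround, assuming the cu110 wheel has simply been dropped from the index in favor of newer CUDA builds: install a nightly wheel whose CUDA tag matches the current Colab runtime instead (the cu122 tag below is an assumption; check https://mlc.ai/wheels for the tags actually published).

# Replace the unavailable cu110 wheel with one matching the runtime's CUDA version
# (cu122 is an assumed tag; verify it against the wheel index before running).
!python3 -m pip install --pre mlc-ai-nightly-cu122 -f https://mlc.ai/wheels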

API change in Notebook 6 - `DynTensorType` throwing errors

Thank you for an excellent course!

Running this block from notebook 6:

A = relax.Var("A", (128, 128), relax.DynTensorType(2, "float32"))
B = relax.Var("B", (128, 128), relax.DynTensorType(2, "float32"))

Results in:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[17], line 1
----> 1 A = relax.Var("A", (128, 128), relax.DynTensorType(2, "float32"))
      2 B = relax.Var("B", (128, 128), relax.DynTensorType(2, "float32"))

File ~/opt/anaconda3/envs/mlcai/lib/python3.9/site-packages/tvm/relax/expr.py:416, in Var.__init__(self, name_hint, struct_info, span)
    414     struct_info = tvm.runtime.convert_to_object(struct_info)
    415     if not isinstance(struct_info, StructInfo):
--> 416         raise TypeError(
    417             "struct_info needs to be an instance of StructInfo. "
    418             "If you attempt to pass in shape, "
    419             "use relax.TensorStructInfo(shape, dtype)."
    420         )
    421 self.__init_handle_by_constructor__(
    422     _ffi_api.Var if isinstance(name_hint, str) else _ffi_api.VarFromId,  # type: ignore
    423     name_hint,
    424     struct_info,
    425     span,
    426 )

TypeError: struct_info needs to be an instance of StructInfo. If you attempt to pass in shape, use relax.TensorStructInfo(shape, dtype).

Modifying the original block to the block below works:

A = relax.Var("A", relax.TensorStructInfo([128, 128], "float32"))
B = relax.Var("x", relax.TensorStructInfo([128, 128], "float32"))

But I can't tell from following the code on TVM's Unity branch if this change is consistent with the original intent of using relax.DynTensorType.

Likewise, under Create Map Function, from_fx contains the line:

input_var = relax.Var(
    node.target, shape, relax.DynTensorType(len(shape), "float32")
)

This throws:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[27], line 10
      7     A = node_map[node.args[0]]
      8     return bb.emit_te(te_relu, A)
---> 10 MyModule = from_fx(
     11     fx_module, 
     12     input_shapes = [(1, 128)], 
     13     call_function_map = {
     14       torch.matmul: map_matmul,
     15       torch.relu: map_relu, 
     16     },
     17     call_module_map={},
     18 )
     20 MyModule.show()

Cell In[26], line 33, in from_fx(fx_mod, input_shapes, call_function_map, call_module_map)
     31 shape = input_shapes[input_index]
     32 input_index += 1 
---> 33 input_var = relax.Var(
     34     node.target, shape, relax.DynTensorType(len(shape), "float32")
     35 )
     36 fn_inputs.append(input_var)
     37 node_map[node] = input_var

File ~/opt/anaconda3/envs/mlcai/lib/python3.9/site-packages/tvm/relax/expr.py:416, in Var.__init__(self, name_hint, struct_info, span)
    414     struct_info = tvm.runtime.convert_to_object(struct_info)
    415     if not isinstance(struct_info, StructInfo):
--> 416         raise TypeError(
    417             "struct_info needs to be an instance of StructInfo. "
    418             "If you attempt to pass in shape, "
    419             "use relax.TensorStructInfo(shape, dtype)."
    420         )
    421 self.__init_handle_by_constructor__(
    422     _ffi_api.Var if isinstance(name_hint, str) else _ffi_api.VarFromId,  # type: ignore
    423     name_hint,
    424     struct_info,
    425     span,
    426 )

TypeError: struct_info needs to be an instance of StructInfo. If you attempt to pass in shape, use relax.TensorStructInfo(shape, dtype).

Here, a similar modification seems to fix the problem (but I'm still concerned that I don't understand if we're losing something by giving up on relax.DynTensorType):

input_var = relax.Var(
    node.target, relax.TensorStructInfo(shape, "float32")
)

Thanks in advance for your assistance.

Cannot run this code when opening it in Colab

The Colab kernel crashes if I run the following part:
@R.function
def main(x: R.Tensor((1, 784), "float32"),
         w0: R.Tensor((128, 784), "float32"),
         b0: R.Tensor((128,), "float32"),
         w1: R.Tensor((10, 128), "float32"),
         b1: R.Tensor((10,), "float32")):
    with R.dataflow():
        lv0 = R.call_tir(linear0, (x, w0, b0), (1, 128), dtype="float32")
        lv1 = R.call_tir(relu0, (lv0,), (1, 128), dtype="float32")
        out = R.call_tir(linear1, (lv1, w1, b1), (1, 10), dtype="float32")
        R.output(out)
    return out
Please tell me how to deal with it. Thanks a lot, and sorry for my poor English.

[Question] MetaSchedule tune_tir output

Hi everyone,

I'm trying to work through the mlc.ai course section on optimization using ms.tune_tir (6.1 Part 1, Machine Learning Compilation 0.0.1 documentation).

Here is my source code:

import tvm
from tvm.ir.module import IRModule
from tvm.script import tir as T, relax as R
from tvm import relax
from tvm import meta_schedule as ms
import numpy as np

target="nvidia/geforce-rtx-3060"
dev = tvm.cuda(0)

@tvm.script.ir_module
class MyModuleMatmul:
    @T.prim_func
    def main(A: T.Buffer((1024, 1024), "float32"), 
             B: T.Buffer((1024, 1024), "float32"), 
             C: T.Buffer((1024, 1024), "float32")) -> None: 
        T.func_attr({"global_symbol": "main", "tir.noalias": True})
        for i, j, k in T.grid(1024, 1024, 1024):
            with T.block("C"):
                vi, vj, vk = T.axis.remap("SSR", [i, j, k])
                with T.init():
                    C[vi, vj] = 0.0
                C[vi, vj] = C[vi, vj] + A[vi, vk] * B[vk, vj]


sch_tuned = ms.tune_tir(
    mod=MyModuleMatmul,
    target=target,
    max_trials_global=64,
    num_trials_per_iter=64,
    work_dir="./tune_tmp_66",
)

sch = ms.tir_integration.compile_tir(sch_tuned, MyModuleMatmul, target)
rt_mod = tvm.build(sch.mod, target=target)

The tune_tir step creates database_tuning_record.json and database_workload.json in the work_dir.
But after the compile_tir step, sch is None, so building the module produces the following error message:

AttributeError: 'NoneType' object has no attribute 'mod'

Why does sch become None, and how can I solve this problem? Did the tuning process not run to completion?
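For reference, a minimal check against the generated database, assuming ms.database.JSONDatabase accepts the two file paths that tune_tir wrote into the work_dir; if it reports zero records, compile_tir would have nothing to return:

from tvm import meta_schedule as ms

# Load the database produced by tune_tir and count the tuning records it holds.
# The file names follow the ones reported above; adjust if the work_dir differs.
db = ms.database.JSONDatabase(
    path_workload="./tune_tmp_66/database_workload.json",
    path_tuning_record="./tune_tmp_66/database_tuning_record.json",
)
print("tuning records:", len(db.get_all_tuning_records()))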

Thank you for your assistance.

API change in Episode 4: Build End to End Models

I am trying to run a code block from the Jupyter notebook, but I seem to have hit an API mismatch issue.

@R.function
    def main(x: R.Tensor((1, 784), "float32"), 
             w0: R.Tensor((128, 784), "float32"), 
             b0: R.Tensor((128,), "float32"), 
             w1: R.Tensor((10, 128), "float32"), 
             b1: R.Tensor((10,), "float32")):
        with R.dataflow():
            lv0 = R.call_tir(linear0, (x, w0, b0), (1, 128), dtype="float32")
            lv1 = R.call_tir(relu0, (lv0,), (1, 128), dtype="float32")
            out = R.call_tir(linear1, (lv1, w1, b1), (1, 10), dtype="float32")
            R.output(out)
        return out

TypeError: got an unexpected keyword argument 'dtype'
Given that relax is under development, please help correct this API call.
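A sketch of what the call sites might look like under the newer relax API, assuming R.call_tir now takes the output struct info through an out_sinfo argument instead of a shape plus a dtype keyword; please confirm against the current TVMScript documentation:

# Hypothetical updated call sites inside the R.dataflow() block
# (out_sinfo is assumed to replace the old shape + dtype arguments).
lv0 = R.call_tir(linear0, (x, w0, b0), out_sinfo=R.Tensor((1, 128), dtype="float32"))
lv1 = R.call_tir(relu0, (lv0,), out_sinfo=R.Tensor((1, 128), dtype="float32"))
out = R.call_tir(linear1, (lv1, w1, b1), out_sinfo=R.Tensor((1, 10), dtype="float32"))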

Migrate `code2html` to `IRModule.show()`
