Hi guys, does anyone know if it is possible to run jetson-containers MLC with external models like nanochat? So far I have only been able to run models from the dusty-nv (Dustin Franklin) repository.
*** Please note that this reply is generated by LLM automatically ***
The question is about running the Jetson MLC container with external models like NanoChat.
Based on the provided context, the Jetson MLC container is primarily designed to work with models from the NVIDIA repository, but there is no explicit information that rules out running external models like NanoChat.
The context points to the MLC container documentation for running Large Language Model (LLM) benchmarks, and to the SLM tutorial plus the MLC container documentation for Small Language Model (SLM) benchmarks. This implies that the MLC container can run a range of models, but it does not confirm support for external models like NanoChat.
In short, running an external model like NanoChat on the Jetson MLC container may require additional configuration or modifications; without more specific documentation, a definitive answer is not possible.
If you’re looking to run external models like NanoChat on Jetson containers (MLC), I would recommend the following steps:
- Check the MLC container documentation to see if there are any specific instructions or guidelines for running external models.
- Look for any tutorials or examples that demonstrate how to run external models on Jetson containers (MLC).
- Reach out to the NVIDIA community or forums to ask if anyone has successfully run external models like NanoChat on Jetson containers (MLC).
- Consider modifying the MLC container to accommodate the external model, but be aware that this might require significant technical expertise.
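As a rough sketch of the first two steps, the usual way to bring up the MLC container is via the dusty-nv jetson-containers tooling. This is an assumption based on that project's standard workflow (not something confirmed for NanoChat specifically), and whether an external model works depends on MLC's own conversion/quantization step supporting its architecture:

```shell
# Clone and set up the jetson-containers tooling (standard dusty-nv workflow)
git clone https://github.com/dusty-nv/jetson-containers
bash jetson-containers/install.sh

# Launch the MLC container; `autotag` selects an image matching your JetPack version
jetson-containers run $(autotag mlc)

# Inside the container, an external model would first have to be converted and
# quantized with MLC's own tooling before it can be served -- this conversion
# step is exactly where an unsupported architecture like NanoChat may fail.
```

If the conversion step rejects the model architecture, that is a strong sign the model is not supported without modifying the container or MLC itself.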
Compared with the models mentioned in the context, such as Llama-7B, NanoVLM, and Riva, it is unclear whether NanoChat is compatible with the Jetson MLC container. When in doubt about a specific model, the documentation and community resources are the best places to check whether anyone has already run it on the target platform.
Hi,
Sorry, we don’t have experience with NanoChat.
But there are several frameworks available on Orin Nano, not only MLC:
- Ollama: dustynv/ollama - Docker Image
- vLLM: dustynv/vllm - Docker Image
- SGLang: dustynv/sglang - Docker Image
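For example, a minimal sketch of bringing up the Ollama container (assuming Docker with the NVIDIA runtime is already set up on the Orin Nano; the image tag and model name below are illustrative, so check Docker Hub for the build matching your JetPack version):

```shell
# Pull and run the Ollama container on the device
# (tag r36.2.0 is illustrative -- pick the one matching your JetPack release)
docker run --runtime nvidia -it --rm --network host dustynv/ollama:r36.2.0

# Then, from another shell on the device, pull and chat with a model
# (model name is illustrative):
ollama run llama3
```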
Thanks.
Ok thank you so much! I will try to set these up