
NVIDIA’s AI chatbot is now available for download! What’s so great about Chat With RTX?

What is it best used for?

by Justin Cheng

With the advent of the AI PC era, NVIDIA has released Chat with RTX, an AI chatbot exclusively for Windows that lets users run AI on the PC itself and choose between the Mistral and Llama AI models. An early technical version of Chat with RTX is now available for free download.

In fact, Chat with RTX had already drawn considerable attention from media at home and abroad before it even opened for download. What are its features? Can it run on any computer? And what do users who have already tested it have to say?

What are the features of Chat with RTX? Two highlights

The biggest selling points of Chat with RTX are offline operation and customization, which together deliver a more personalized generative AI experience on your own computer.


1. Run AI without an Internet connection: no one knows what you said to the AI

Let’s talk about offline functionality first. NVIDIA is the first among the AI giants to release a free AI chatbot designed for offline use. Compared with other generative AI services such as Copilot and Gemini, the biggest difference is that Chat with RTX runs generative AI on the laptop itself and answers users’ questions without an Internet connection.

Why does offline functionality matter? It lets you get query results faster on Windows while user data stays on the device. Because there is no need to share data with third parties or connect to the Internet, you can process sensitive data locally on your PC. The process is completely private and better suited to security-conscious use, since no one can see what you discussed with the AI.

2. Customization features to better meet your personal needs

Chat with RTX lets users quickly and easily use files on their PC as a dataset, connecting them to open-source large language models such as Mistral or Llama 2 to get contextual answers.

Chat with RTX lets you import documents in a variety of file formats, including .txt, PDF, Word, and XML, and then answers based on the information you provide. For example, you can ask, “Which restaurant in Las Vegas does my partner recommend?”

Users can also add information from YouTube videos and playlists: enter a video URL in Chat with RTX to feed that content to the chatbot and run contextual queries against it. For example, you could get travel recommendations or course resources based on videos from your favorite influencer.

In other words, if the answer to a question already exists in files on your computer, you no longer need to trawl through article after article or video after video to find it, saving considerable time and improving productivity.

The NVIDIA video below demonstrates typical usage scenarios. In addition, several foreign outlets have shared their impressions after testing Chat with RTX.

What is Chat with RTX suitable for? Are there any disadvantages?

A reporter from the technology outlet The Verge found in hands-on testing that Chat with RTX works very well for searching the content of local videos and podcasts, though it is imperfect at searching YouTube video transcripts. Chat with RTX is also useful for scanning PDFs and checking factual data.

After using it, a reporter from Tom’s Hardware commented that Chat with RTX is a very cool application that gives users more control than Copilot or Gemini. However, he also noted that whether this counts as an advantage depends on your needs.

It is worth noting that the Chat with RTX download is very large, at 40 GB, and installation takes about 30 minutes. In addition, Chat with RTX does not remember conversation context, meaning follow-up questions cannot build on earlier ones; you must restate the full question each time.

Also, Chat with RTX is itself a demonstration release, so errors or crashes may occur in actual use.

Which PCs can be used to run Chat with RTX?

Chat with RTX leverages Retrieval-Augmented Generation (RAG), NVIDIA TensorRT-LLM software, and NVIDIA RTX acceleration to bring generative AI capabilities to Windows PCs powered by GeForce hardware.
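The RAG flow described above can be sketched in a few lines of pure Python. This is a toy illustration only: it scores relevance by simple word overlap instead of the GPU-accelerated embedding search Chat with RTX actually uses, and it stops at assembling the prompt rather than calling a model. The file names and document text are made up for the example.

```python
# Toy sketch of retrieval-augmented generation (RAG):
# 1) find the local document most relevant to the query,
# 2) paste it into the prompt as context for the language model.

def score(query: str, doc: str) -> int:
    """Count how many query words appear in the document (toy relevance metric)."""
    doc_words = set(doc.lower().split())
    return sum(1 for word in query.lower().split() if word in doc_words)

def retrieve(query: str, docs: dict[str, str]) -> str:
    """Return the name of the best-matching document."""
    return max(docs, key=lambda name: score(query, docs[name]))

def build_prompt(query: str, docs: dict[str, str]) -> str:
    """Assemble the context-plus-question prompt a local LLM would receive."""
    best = retrieve(query, docs)
    return f"Context ({best}):\n{docs[best]}\n\nQuestion: {query}\nAnswer:"

# Hypothetical local files standing in for the user's dataset.
docs = {
    "trip_notes.txt": "My partner recommends the Bacchanal Buffet in Las Vegas.",
    "meeting.txt": "Quarterly review is scheduled for Friday at 10am.",
}

query = "Which restaurant in Las Vegas does my partner recommend?"
print(retrieve(query, docs))   # the retriever picks trip_notes.txt
print(build_prompt(query, docs))
```

A production system would replace `score` with vector similarity over text embeddings and send `build_prompt`'s output to the local Mistral or Llama model, but the retrieve-then-generate shape is the same.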

Therefore, to run Chat with RTX, your machine must have a GeForce RTX 30 Series or later GPU with at least 8 GB of VRAM. You will also need Windows 10 or 11 and the latest NVIDIA GPU drivers.

Although Chat with RTX is currently only a demonstration release, its future development is worth watching. Readers who are curious can download it and try it for themselves.

Top image credit: NVIDIA



Copyright ©2024 TechGhetti – All Right Reserved.