
The Nvidia version of ChatGPT is here: deployed on the PC, and very GPU

By Jin Lei, reporting from Aofeisi

QbitAI (量子位) | WeChat official account: QbitAI

Nvidia has launched its own version of ChatGPT, and the name has a distinctly GPU flavor:

Chat With RTX.


NVIDIA's AI chatbot is different from the current mainstream "players".

It doesn't run on the web or in the app, but needs to be downloaded and installed on a PC.

Running locally not only makes it faster, it also means Chat With RTX may face fewer restrictions on chat content.

Netizens were quick to comment on this point:

Wow ~ this one actually runs locally ~

Of course, there are configuration requirements, though modest ones: an RTX 30- or 40-series graphics card with at least 8GB of VRAM.


So how does Chat With RTX perform in practice? Let's take a look.

Nvidia version of ChatGPT

First of all, it's worth mentioning that Chat With RTX isn't a large language model (LLM) built by NVIDIA itself.

It still relies on two open-source LLMs, Mistral and Llama 2, which users can choose according to their preferences when running.


After picking an LLM, you can load local files into Chat With RTX.

Supported file types include .txt, .pdf, .doc/.docx, and .xml.
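For illustration only, here is a minimal Python sketch (not Nvidia's actual code) of the first step such a tool performs: gathering the supported local files and splitting their text into overlapping chunks for retrieval. The extension list comes from the article; the chunk and overlap sizes are arbitrary assumptions.

```python
from pathlib import Path

# File extensions Chat With RTX accepts, per the article.
SUPPORTED = {".txt", ".pdf", ".doc", ".docx", ".xml"}

def collect_documents(folder):
    """Return the files under `folder` whose extension is supported."""
    return sorted(
        p for p in Path(folder).rglob("*")
        if p.is_file() and p.suffix.lower() in SUPPORTED
    )

def chunk_text(text, size=500, overlap=50):
    """Split raw text into overlapping chunks for retrieval indexing."""
    chunks = []
    step = size - overlap
    for start in range(0, max(len(text), 1), step):
        chunks.append(text[start:start + size])
    return chunks
```

Real tools would additionally extract text from the binary formats (.pdf, .docx) before chunking; that step is omitted here.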


Then it's time to start asking questions, such as:

What are Sarah's recommended restaurant names?

Since it runs locally, Chat With RTX generates answers very quickly; the response is near-instant:

The restaurant Sarah recommends is called The Red Le Bernardin.

In addition, another highlight of Chat with RTX is that it can answer questions based on online videos.

For example, "feed" it a link to a YouTube video:


Then ask Chat With RTX:

What did Nvidia announce at CES 2024?

Chat With RTX also responds to video content at a blazing fast speed.
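Nvidia hasn't published exactly how the video feature works, but a typical approach is to pull the video's caption track and treat it like any other document. A hypothetical sketch, assuming the transcript arrives as a list of timed entries (the `{"text", "start"}` schema mimics what transcript-download libraries commonly return; it is an assumption, not Nvidia's pipeline):

```python
def transcript_to_text(entries):
    """Flatten timed caption entries into one searchable string."""
    return " ".join(e["text"].strip() for e in entries if e["text"].strip())

def text_around(entries, t, window=30.0):
    """Return caption text whose start time is within `window` seconds of t,
    e.g. to quote the part of the video a retrieved answer came from."""
    return " ".join(e["text"] for e in entries if abs(e["start"] - t) <= window)
```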

As for the technology behind it, Nvidia's official description is brief: "Retrieval-Augmented Generation (RAG), NVIDIA TensorRT-LLM software, NVIDIA RTX acceleration, etc."
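The article only names RAG, but the pattern in outline is: retrieve the passages most relevant to the question, then hand them to the LLM as grounding context. Here is a toy sketch using word-overlap scoring in place of a real embedding model; the prompt wording and function names are purely illustrative:

```python
def score(query, chunk):
    """Toy relevance score: number of shared lowercase words."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def retrieve(query, chunks, top_k=2):
    """Pick the top_k chunks most relevant to the query."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:top_k]

def build_prompt(query, chunks):
    """Ground the LLM's answer in the retrieved local context."""
    context = "\n---\n".join(retrieve(query, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

A production system would replace `score` with vector similarity over embeddings; the retrieve-then-generate flow is the same.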

How do you use it?

As mentioned above, using Chat With RTX is simple: just download and install it.

However, besides the GPU requirement, there are a few other conditions:

  • System: Windows 10 or Windows 11
  • RAM: At least 16GB
  • Driver: 535.11 or later
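For convenience, the GPU-side requirements can be checked programmatically. A hedged sketch: the thresholds below mirror the article's numbers, and `query_gpu` shells out to the real `nvidia-smi --query-gpu` interface (so it only works on a machine with an NVIDIA driver installed):

```python
import subprocess

# Minimums quoted in the article.
MIN_VRAM_MB = 8 * 1024      # 8GB VRAM on an RTX 30/40-series card
MIN_RAM_GB = 16
MIN_DRIVER = (535, 11)

def driver_ok(version_str):
    """Compare a 'major.minor' driver string against the minimum."""
    major, minor = (int(x) for x in version_str.split(".")[:2])
    return (major, minor) >= MIN_DRIVER

def meets_requirements(vram_mb, ram_gb, driver_version):
    """Pure check against the article's stated minimums."""
    return (vram_mb >= MIN_VRAM_MB
            and ram_gb >= MIN_RAM_GB
            and driver_ok(driver_version))

def query_gpu():
    """Ask nvidia-smi for total VRAM (MB) and driver version."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total,driver_version",
         "--format=csv,noheader,nounits"], text=True)
    mem, drv = (field.strip() for field in out.splitlines()[0].split(","))
    return int(mem), drv
```

Note this sketch does not verify the card is actually a 30/40-series model; Chat With RTX's installer performs its own checks.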

However, Chat With RTX is not exactly lightweight: the download totals about 35GB.

So before downloading it, be sure to check the installation conditions required for Chat With RTX.

Otherwise, there will be all kinds of tragedies:


Hands-on tests drew complaints, however

The Verge ran a round of hands-on tests immediately after Nvidia released Chat With RTX.

The conclusions, however, were not flattering.

For example, with the video feature mentioned above, in actual testing it downloaded the transcript of a completely different video.

Second, feed Chat With RTX too many files (for example, having it index 25,000 documents) and it simply "goes on strike" and crashes.

It also can't remember context, so follow-up questions can't build on earlier ones.

Finally, one more complaint: downloading and installing Chat With RTX took testers a full half hour...

Beyond the complaints, though, The Verge also gave Chat With RTX its due.

For example, searching documents on your computer is genuinely fast and accurate.

And summarization is another thing Chat With RTX is good at:


What's more, running everything locally keeps user files private and secure.

So would you pick the Nvidia version of ChatGPT?

Reference Links:

[1]https://www.theverge.com/2024/2/13/24071645/nvidia-ai-chatbot-chat-with-rtx-tech-demo-hands-on

[2]https://news.ycombinator.com/item?id=39357900

[3]https://blogs.nvidia.com/blog/chat-with-rtx-available-now/

[4]https://twitter.com/rowancheung/status/1757429733837418610

— END —

QbitAI · signed author on Toutiao (头条号)

Follow us and be the first to know about cutting-edge technology trends
