Welcome to WarBulletin - your new best friend in the world of gaming. We're all about bringing you the hottest updates and juicy insights from across the gaming universe. Are you into epic RPG adventures or fast-paced eSports? We've got you covered with the latest scoop on everything from next-level PC gaming rigs to the coolest game releases. But hey, we're more than just news! Ever wondered what goes on behind the scenes of your favorite games? We're talking exclusive interviews with the brains behind the games, fresh off-the-press photos and videos straight from gaming conventions, and, of course, breaking news that you just can't miss. We know you love gaming 24/7, and that's why we're here round the clock, updating you on all things gaming. Whether it's the lowdown on a new patch or the buzz about the next big gaming celeb, we're on it.

Contacts

  • Owner: SNOWLAND s.r.o.
  • Registration certificate 06691200
  • Na okraji 381/41, Veleslavín, 162 00 Praha 6
  • Czech Republic

NVIDIA’s “Chat With RTX” Is A Localized AI Chatbot For Windows PCs Powered By TensorRT-LLM & Available For Free Across All RTX 30 & 40 GPUs

Expanding its AI ecosystem, NVIDIA has introduced "Chat with RTX", a chatbot for Windows PCs that is powered by TensorRT-LLM & available for free on the latest RTX GPUs.

NVIDIA Wants To Replace ChatGPT With Its Own Locally-Available "Chat With RTX" AI Chatbot That's Available For Free On RTX 30 & 40 GPUs

The premise of the "Chat with RTX" chatbot is simple: it is designed as a localized system, meaning you have a personalized GPT-style chatbot available on your PC at all times without having to go online. Chat with RTX can be fully personalized using a dataset stored locally on your PC, and the best part is that it runs on almost all RTX 40 and RTX 30 GPUs.


Starting with the details, Chat with RTX leverages NVIDIA's TensorRT-LLM and Retrieval-Augmented Generation (RAG) software, which was announced for Windows PCs last year, and takes full advantage of RTX acceleration to deliver the best possible experience to users. Once again, the application is supported across all GeForce RTX 30 and 40 GPUs with at least 8 GB of video memory.
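To make the retrieval-augmented generation idea concrete, here is a minimal, illustrative sketch in Python. It is not NVIDIA's implementation: the hashed bag-of-words embedding is a toy stand-in for a real embedding model, and the final prompt would be sent to a local LLM such as Mistral or Llama 2 running via TensorRT-LLM.

```python
import hashlib
import numpy as np

def embed_text(text: str, dim: int = 256) -> np.ndarray:
    # Toy stand-in for a real embedding model: hashed bag-of-words.
    # A real pipeline would use a GPU-accelerated embedding model instead.
    vec = np.zeros(dim)
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    return vec

def retrieve(query: str, chunks: list[str], k: int = 3) -> list[str]:
    # Rank local text chunks by cosine similarity to the query.
    index = np.vstack([embed_text(c) for c in chunks])
    q = embed_text(query)
    sims = index @ q / (np.linalg.norm(index, axis=1) * np.linalg.norm(q) + 1e-9)
    return [chunks[i] for i in np.argsort(sims)[::-1][:k]]

def build_prompt(query: str, chunks: list[str]) -> str:
    # The retrieved chunks are pasted into the prompt, so the model answers
    # from the local dataset instead of its general training data.
    context = "\n\n".join(retrieve(query, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    docs = ["Chat with RTX runs locally on GeForce RTX 30 and 40 GPUs.",
            "TensorRT-LLM accelerates large language models on RTX hardware.",
            "The app can index .txt, .pdf, .doc, .docx and .xml files."]
    prompt = build_prompt("Which GPUs does Chat with RTX support?", docs)
    print(prompt)  # In the real app, this prompt would go to a local Mistral or Llama 2 model.
```

The key point the sketch illustrates is that retrieval happens entirely against files on your own machine, and only the retrieved passages end up in the prompt.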

After downloading "Chat with RTX" for free, users can connect it to a local dataset on the PC (.txt, .pdf, .doc, .docx, .xml) and pair it with a large language model such as Mistral or Llama 2. You can also add specific URLs, for example YouTube videos or entire playlists, to further enrich what the chatbot can search. Once connected, users can query Chat with RTX the same way they would use ChatGPT, but the generated results are based entirely on the specific dataset, giving you responses that are more relevant to your own files than a generic online chatbot can provide.
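As a rough picture of what "connecting a local dataset" involves, the sketch below scans a folder for the file types the article lists and splits them into chunks ready for indexing. The folder name is hypothetical, and parsing PDF or Word documents, which Chat with RTX handles internally, would require additional libraries; only plain-text files are read here.

```python
from pathlib import Path

# File types the article lists as supported by Chat with RTX.
SUPPORTED = {".txt", ".pdf", ".doc", ".docx", ".xml"}

def collect_files(folder: str) -> list[Path]:
    # Recursively gather every supported document under the chosen folder.
    return [p for p in Path(folder).rglob("*") if p.suffix.lower() in SUPPORTED]

def chunk_text(text: str, size: int = 1000, overlap: int = 100) -> list[str]:
    # Split long documents into overlapping chunks so retrieval can return
    # focused passages instead of whole files.
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

if __name__ == "__main__":
    # "my_notes" is a hypothetical folder; point this at any local dataset.
    for path in collect_files("my_notes"):
        if path.suffix.lower() == ".txt":  # PDF/DOC parsing needs extra libraries
            chunks = chunk_text(path.read_text(encoding="utf-8", errors="ignore"))
            print(f"{path.name}: {len(chunks)} chunks ready for indexing")
```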

Demo video: https://cdn.wccftech.com/wp-content/uploads/2024/02/chat-with-rtx-demo-looping-video.mp4

Having an NVIDIA RTX GPU that supports TensorRT-LLM means that all your data and projects stay available locally rather than in the cloud.

Read more on wccftech.com