Welcome to WarBulletin - your new best friend in the world of gaming. We're all about bringing you the hottest updates and juicy insights from across the gaming universe. Are you into epic RPG adventures or fast-paced eSports? We've got you covered with the latest scoop on everything from next-level PC gaming rigs to the coolest game releases. But hey, we're more than just news! Ever wondered what goes on behind the scenes of your favorite games? We're talking exclusive interviews with the brains behind the games, fresh off-the-press photos and videos straight from gaming conventions, and, of course, breaking news that you just can't miss. We know you love gaming 24/7, and that's why we're here round the clock, updating you on all things gaming. Whether it's the lowdown on a new patch or the buzz about the next big gaming celeb, we're on it.

Contacts

  • Owner: SNOWLAND s.r.o.
  • Registration certificate 06691200
  • 16200, Na okraji 381/41, Veleslavín, 162 00 Praha 6
  • Czech Republic

Creating your own Microsoft Copilot chatbot is easy but making it safe and secure is pretty much impossible says security expert

We could all use our own dedicated, custom-built chatbot, right? Well, rejoice because Microsoft's Copilot Studio is a handy tool for the less technical (those of us who don't dream in Fortran) to create their own chatbot. The idea is to make it easy for most businesses and organisations to knock up a chatbot based on their internal documents and data.

You could imagine a game dev using a chatbot to help gamers ask questions about everything from how to complete a game to applying the best settings and fixing technical issues. There is, inevitably, a catch, however.

According to Zenity, an AI security specialist, Copilot Studio and the chatbots it creates are a security nightmare (via The Register). Zenity CTO Michael Bargury hosted a recent session at the Black Hat security conference, digging into the horrors that unfold if you allow Copilot access to data to create a chatbot.

Apparently, it's all down to Copilot Studio's default security settings, which are reportedly inadequate. Put another way, the danger is that you use that super-easy Copilot Studio tool to create a super-useful tool that customers or employees can use to query using natural language, only to find it opens up a great big door to exploits.

Bargury demonstrated how a bad actor can place malicious code in a harmless-looking email, instruct the Copilot bot to «inspect» it, and—presto—malicious code injection achieved. 

Another example involved Copilot feeding users a fake Microsoft login page where the victim's credentials would be harvested, all displayed within the Copilot chatbot itself (via TechTarget).

Moreover, Zenity claims the average large enterprise in the US already has 3,000 such bots up and running. Scarily, it claims 63% of them are are discoverable online. If true, that means your average Fortune 500 outfit has about 2,000 bots ready and willing to spew out critical, confidential corporate information.

Keep up to date with the most important stories and the best deals, as picked by

Read more on pcgamer.com