Can you really trust your copilot? Who to believe in the Age of AI
Artificial intelligence (AI) is all anyone’s talking about these days: how it will make us more productive; how it’s going to steal our jobs; and how fed up we all are with hearing about it so much!
Depending on the industry you work in, you might already be seeing employees and colleagues outsourcing many of their usual tasks to the much-celebrated ChatGPT (which you can read more about here). It seems to have been embraced in a big way, without a whole lot of scrutiny or pushback.
The rapid rollout of these platforms has a lot to answer for: instant accessibility made this a unique technological development, available to the masses without any initial financial outlay, meaning anybody could try it for themselves.
However, in all the excitement to hop on the latest trend, many may have neglected the robust risk assessment that would usually accompany the introduction of any new software into an organisation.
This is particularly relevant for industries like healthcare, where the consequences could cause untold harm, as we saw in the case of Theranos, whose inaccurate test results led to “thousands of unnecessary and negative experiences for patients.”
The World Health Organisation has already called for care to be taken in using AI tools for healthcare applications, due to the potential for bias and inaccuracy, stating that “there is concern that caution that would normally be exercised for any new technology is not being exercised consistently with LLMs (Large Language Model tools).”
So why have AI tools been adopted so freely and enthusiastically, and seemingly without a healthy level of mistrust or scepticism?
Is it perhaps that, as a society, we are keener than ever to realise the dream WALL-E promised us: gliding effortlessly through a life of leisure, every whim catered for?
Is the disillusionment with capitalism, sharply thrown into relief during the pandemic, the catalyst for an appetite for tools which tip the balance of power into the workers’ hands for once?
Or have we collectively let our guard down when it comes to tech, with many in the workforce now having grown up with the ubiquity of accessible tech, home computing, and connected devices?
Whatever the reason, it’s crucial that we take ourselves back to basics when considering the outputs and applications of generative AI-powered tools.
Let’s take the upcoming Microsoft Copilot as an example. Due to begin rolling out in June 2023, Copilot promises to unlock new heights of productivity for those already using Microsoft applications. Our Cyber Essentials partner Air IT have already looked at the benefits in their comprehensive overview here.
But we’re here to remind you that there are a few things you should always remember when exploring new technology for your business:
- Risk assessment – it's important to investigate how new tools may affect data storage and security, whether extra training is needed, and whether the software itself could be vulnerable to cyberattacks.
- Evaluate the source for bias and accuracy – even once you’ve confirmed these tools are safe to use, that doesn’t mean their outputs are infallible. Remember that machine learning is often only as good as the data it's been fed, so don’t take anything for granted – always double-check outputs against trusted resources for accuracy.
- Keep all software updated – this includes antivirus and firewalls, and ensures that you’re covered by the most recent security patches and therefore less vulnerable to attacks via weaknesses in any new software’s defences.
- When in doubt, trust the experts – consult with your IT teams, Managed Service Providers, NCSC-approved Cyber Advisors, or your local Cyber Resilience Centre for a considered professional opinion.
If you’re thinking of integrating any of the new AI-powered business tools into your organisation, you may want to think about drafting policies around their use, refreshing your staff training on cyber safety, and looking at how to carry out an effective risk assessment before implementation.
Luckily, the Cyber Resilience Centre for London is here as your go-to organisation for all things cyber. Get in touch and we can point you in the right direction for advice, guidance, resources and next steps.
And don’t forget to join our community to keep up with all the latest developments, threats and opportunities in the world of cyber resilience.
You can sign up to our monthly newsletter through the button below.