Connected devices are flooding into every aspect of our lives, and every connection creates a potential weakness that could be exploited by cyber criminals for a variety of reasons. As the number of connections increases, we seem to be setting ourselves up for an era of frequent cyber attacks, unless our cyber security professionals can protect our connected systems.
This week Memoori spoke with prominent cyber security expert David Emm, Principal Security Researcher with Kaspersky Labs’ Global Research & Analysis Team. Emm believes the responsibility for, and solution to, the cyber security issues in our increasingly connected world go beyond cyber security professionals, to developing a new culture of security across society. Read and listen to his insights below.
Does connectivity mean vulnerability?
More connectivity certainly means a greater potential vulnerability to attack. Thinking offline for a second: the more time you spend on the street, the bigger the opportunity to get mugged or knocked down on the road. It’s no different online; the more points of connection you have with the internet, the larger the attack surface you present. It’s not inevitable, however. If you’re wary and use pedestrian crossings, you can limit your exposure. It’s the same online.
So shouldn’t we be concerned about the rapid rate of growth of connectivity, or potential vulnerability?
I would be more concerned by the lack of awareness. We absorb road safety and city safety information from a young age; it’s almost intuitive. If you grow up in a city, you’re very aware of the dangers. It is not the same with connectivity. Most people think of their smartphone as a phone, not as the fully fledged computer it is. There’s an attack surface, but people don’t realize it’s there.
That’s also true at the developer end. The people who bring connected children’s toys to market are the same people who bring ordinary children’s toys to market. They are not taking the dangers of connectivity into account; in fact, they don’t even know what questions to ask. It is as if you brought a Martian to Earth and expected them to be streetwise. They wouldn’t be, because they’re from another planet, and it’s the same with the spread of connectivity.
Then where does the responsibility for cyber security lie?
As with nearly all these things, the responsibility is shared. Take the example of a car: we expect car manufacturers to build in safety features. If you buy a car and it doesn’t have side-impact bars or airbags, you would start asking questions. There is also a regulatory requirement for them to include such features. However, there’s also an onus on us: not to drink and drive or drive tired, to understand the road signs, and to be able to physically control the car.
I’m not sure it’s a whole lot different with connectivity. I think we have the right to expect that whoever brings a connected toy to market has considered the security implications. They shouldn’t be passing the password over an unencrypted Wi-Fi connection, for example.
As consumers we also have a responsibility. You wouldn’t buy your child a toy that has dangerous parts, and what you would buy for an eight-year-old, you wouldn’t buy for a two-year-old, because you realize the dangers. In digital terms that means, for example, if you buy a device with a default password, it’s your responsibility to change it.
We have become used to government involvement too. If you see the CE mark or an ISO standard, it tells you that something has been tested. We don’t see that on digital devices yet, but I think we will have to get to that point. I’m not suggesting government be involved in design, but in much broader things. We expect our furniture or clothes to be fire retardant, for example, and I think it should be the same for digital safety. Every device should have a unique password, for example, or it mustn’t send credentials in clear text.
Four or five clear points would raise the bar for developers who would have to comply with the law. Then you wouldn’t need to be a techie, you would just find whoever can ensure your product complies, otherwise you can’t go to market. Although I fear this won’t happen until something bad occurs.
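Emm’s point about clear-text credentials is easy to demonstrate. As an illustrative sketch (not from the interview): HTTP Basic authentication, still common on cheap connected devices, only base64-encodes the username and password. Encoding is not encryption, so anyone observing unencrypted traffic can reverse it trivially:

```python
import base64

# HTTP Basic authentication merely base64-encodes "user:password".
# This is encoding, not encryption - anyone who can observe the
# traffic (e.g. on an open Wi-Fi network) can reverse it instantly.
header = "Basic " + base64.b64encode(b"admin:admin").decode()

# An eavesdropper recovers the credentials with a single call:
token = header.split(" ", 1)[1]
user, password = base64.b64decode(token).decode().split(":", 1)
print(user, password)  # → admin admin
```

This is why baseline rules like “no credentials in clear text” point vendors toward encrypted transport (TLS) rather than schemes that merely obscure the password.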
Considering the rate of increase in connectivity, and its pervasive nature, shouldn’t someone be proactive on this matter?
Going back to what I said about phones, the psychology of this is important. If I give you a laptop and you’re not a techie, you at least know that there’s a technical component, even if you don’t understand it. If someone pitches a smart meter at you - and meters have been around as long as electricity has - you’re not thinking digital, you’re thinking about the functionality. Be it a baby monitor, toaster, coffee machine, or smart meter, your expectations are going to be different.
I think government has a responsibility in terms of education. I think companies have a responsibility too. I’m sure, for example, that if Mattel had gone to their own IT staff about their connected product ‘Hello Barbie’ and said “we’re thinking about this kind of toy,” privacy and security concerns would have been raised. It underlines the point that people developing connected products are not even sure which questions to ask.
The rate of connectivity increase is phenomenal, and therefore people find it hard to assimilate. I don’t think you could turn it back, but there are some good signs - like when the UK government rejigged the computing curriculum to focus on software design and social responsibility.
Have we passed the stage where we can actually protect the digital endpoints, or was that never the plan?
In recent years I have been grappling with the bring your own device (BYOD) trend, which offers excellent productivity benefits. Businesses go ahead with it for the productivity gains; then IT has to wrap its arms around the whole issue, which inevitably means restrictions - in the same way email was introduced and then came filters and controls. BYOD could end up meaning bring your own vulnerability. Your device could be infected on your home network, and then you walk straight into your office with it, putting your whole company at risk.
I think we’re seeing the further end of de-perimeterization, as the Jericho Forum called it - I am the network wherever I happen to be, so if you want to secure the network, you have to secure me. However, we can protect some things but not others; the bandwidth on smart meters does not feasibly allow for anti-virus, for example, so there the whole system will need to be protected through dialogue with companies like Kaspersky.
The other point is: what do we actually need to connect, and what level of connectivity do we need to give it? This is also true for critical infrastructure, where the conversation was mainly about availability but is now also about cyber security. Not everything needs to be connected, and for the things that do, we should ask what level and what means of connectivity are required, and then to what degree we can control them.
It keeps coming back to this point of education, but the speed at which, say, curricula take effect is much slower than the growth of connectivity. So are we destined for catastrophe on the way to developing this culture of security?
It is much slower, and we may be destined for catastrophe, but I don’t think it’s inevitable. It’s not that simple when we talk about education or training. When we try to educate people about phishing, for example, we ask: can you spot these phishing attacks?
However, people don’t need to be able to spot specific phishing tactics; the culture needs to change, so that they think “would I open my front door to this person or company?” and ask the same online. If we can develop a security mindset, it should cut across any particular technology. It was floppy disks once, then CDs, now USB drives; people just need to think, “this is a removable drive, and that’s dangerous.”
If we were to have another discussion in 2025; what might be the key topics of that conversation and how might it be different to our conversation today?
I would hope we won’t be having the same conversation, and I would hope that the standards that apply to non-digital products today would also be enforced for digital products. I think the culture of cyber security will be much more present.
We are dealing with a generation of young people who are growing up with the technology. That doesn’t make them experts, and one of the dangers is older people assuming younger people can drive the technology safely. My 16-year-old son could drive my car, but whether he’d be safe is a different issue.
The young generation are going to be the designers of the future, but we need to give them the right education. Barristers don’t carry around the whole weight of the law in their heads but they know where to go to find it. Their professional skill is understanding how the law works and the questions they need to ask to get to the bottom of what they want to do for their clients.
There will be other questions in 2025, such as biometrics and biotech, but if we get the message right for this young generation then we have some reason for optimism in the future.