Are you designing secure software? Four ways to ensure the answer is "yes"
July 07, 2015
Last month, I discussed how – in the absence of a regulatory body – software developers more than ever need to implement, and stick with, appropriate processes.
Let’s now look at the importance of careful design. A recent security breach at a well-known retailer was caused when a “Point of Sale” (POS) computer was reverse engineered and used to access a central database. In this case, the quality of the algorithm wasn’t particularly relevant to the security breach. What’s your best protection against this type of attack? A well thought-out approach to system design.
Software cannot be treated in isolation; the design of the whole system has to be considered. Below, I've identified four design issues that need to be addressed to ensure security. Can you answer yes to the following?
1. Does your hardware operate solely with authenticated software, and do your new software releases only work with authentic hardware?
2. Can you ensure the integrity of your software and that it hasn’t been modified by any external party?
3. Can you guarantee that your software is secure and can’t be read by a third party?
4. Have you minimized system complexity so your security component only focuses on security and nothing else?
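To make points 1 and 2 concrete, here is a minimal sketch of one common way to verify that a software image is authentic and unmodified before accepting it: compute an authentication tag over the image with a device key and compare it in constant time. The key name, image contents, and function names here are illustrative assumptions, not a specific product's implementation; a real device would hold the key in tamper-resistant storage and would typically use public-key signatures rather than a shared secret.

```python
import hmac
import hashlib

# Hypothetical device key for illustration only. In a real system this
# secret would live in tamper-resistant hardware, never in source code.
DEVICE_KEY = b"example-device-key"

def sign_image(image: bytes) -> bytes:
    """Compute an authentication tag over a software image."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def verify_image(image: bytes, tag: bytes) -> bool:
    """Accept the image only if its tag matches (constant-time compare)."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# An authentic image verifies; a tampered image does not.
firmware = b"example firmware payload"
tag = sign_image(firmware)
print(verify_image(firmware, tag))            # authentic image
print(verify_image(firmware + b"X", tag))     # modified image
```

Note the use of `hmac.compare_digest` rather than `==`: a naive byte-by-byte comparison can leak timing information that helps an attacker forge a valid tag.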
Let’s use complexity as an example – just to demonstrate how important it is. Take an industrial drill press with a safety feature that shuts off the drill if an employee’s hand comes too close. You’d probably agree that you’d prefer that safety component to focus only on safety and nothing else. Well, the same goes for security. When the security component focuses solely on security, you minimize the risk of back doors and unforeseen consequences. Not only that, but when your security assessment is carried out on equipment that is complex in its own right, you run the risk that parts of that equipment fall outside the natural domain of a security expert.
In an increasingly interconnected world, developers need to be more vigilant when designing secure software – and a little careful planning can lead to huge pay-offs.
In my four posts so far, we’ve covered a lot of ground when it comes to ensuring a secure future – and that future is bright, as long as we remain vigilant and dedicated to security. Stay tuned for my conclusion, where I’ll recap my main points and look ahead toward industry next steps. Let me know if you want me to drill down on any additional details.
Dave Hughes is the CEO and founder of HCC Embedded, a developer of re-usable embedded software components. Dave is a “hands-on” embedded specialist, who still actively contributes to the strategy and direction of HCC’s core technologies. His extensive experience has made him one of the industry’s leading authorities on fail-safe embedded systems, flash memory, and process-driven software methodologies. He is a graduate of the University of Sussex in England.