Within traditional Software Development Life Cycles (SDLCs), the planning and design phase is where architects and development teams determine how to implement critical features in an application. While the requirements phase establishes the blueprint - what must be included in the software - the design phase determines how those features will be implemented by planning the software architecture. Traditional SDLCs move from establishing the blueprint to designing the architecture by first conducting a team-based analysis of the application requirements. The architectural decisions made during this phase include the user interface (UI) design, screen flow, positional structure of data elements on the screen, the application's components and modules, program functions, data calls, API calls, data storage, and the interactions between modules.
There are many different types of applications, and each will generally use different frameworks, models, and programming languages to meet specific business requirements - for example, choosing Node.js or Angular, using Python or Java in the backend, or following a Waterfall versus an Agile model of development. The design phase is where the development team selects the programming language, model, and framework. When discussing software architecture, it is also important to understand the three-tier and four-tier application architecture standards, which separate processing into distinct organizational levels. A software package may comprise both a web application and a mobile application, and may use Representational State Transfer (REST), an architectural style commonly used for web services over HTTP. Typically, the client-server three-tier standard consists of a presentation layer (e.g. the web browser for a web application), an application layer, and a database layer - all built as separate modules - which present data to the end-user, process data, and access the database, respectively.
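As a minimal sketch of the three-tier separation described above (all module and function names here are hypothetical, not a prescribed framework), each tier can be modeled as a distinct boundary: the presentation layer formats data for the end-user, the application layer holds the business logic, and only the database layer touches storage:

```python
# Data tier: stands in for an RDBMS or NoSQL store in a real application.
_DATABASE = {"user:1": {"name": "Alice", "role": "admin"}}

def data_tier_fetch(key):
    """Database layer: the only code allowed to touch storage."""
    return _DATABASE.get(key)

def application_tier_get_user(user_id):
    """Application layer: business logic; validates input, calls the data tier."""
    if not isinstance(user_id, int) or user_id < 1:
        raise ValueError("invalid user id")
    return data_tier_fetch(f"user:{user_id}")

def presentation_tier_render(user_id):
    """Presentation layer: formats data for the end-user (e.g. as HTML)."""
    user = application_tier_get_user(user_id)
    return f"<p>{user['name']} ({user['role']})</p>" if user else "<p>not found</p>"

print(presentation_tier_render(1))  # -> <p>Alice (admin)</p>
```

In a deployed system each tier would be a separate module or service; keeping the layers from reaching around each other is what makes it possible to place security controls at each boundary.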
Web applications and mobile apps that connect to online network systems often use REST to provide web services and access to online resources. Database systems today are often relational database management systems (RDBMS) such as MySQL and Microsoft SQL Server, or NoSQL databases such as MongoDB. Much like the layers of the OSI network model, software application tiers work in concert to provide key functionality to end-users.
Software engineers and development teams must determine how to turn a requirements blueprint into a working application, establishing how the software will meet the requirements set during the requirements phase of the SDLC. While traditional SDLCs focus almost entirely on functionality and features, Secure SDLCs treat security as a core part of implementing the blueprint and require security features that make the software as secure as possible. This often requires developers to think not only about what the software should do to fulfill its requirements, but also about what the software shouldn't do, in order to protect data.
Security planning can encompass many tasks: documenting the native security flaws and protections built into the chosen platforms, programming languages, and frameworks, and identifying encryption ciphers and libraries to implement, such as encoding libraries, data validation libraries, and other modules to be used in the application. When a tiered architecture shapes the software design, developers also define trust boundaries between the tiers (web, mobile, services, and data), establish data encryption and system authentication methods between tiers, and determine how the tiers interconnect and interact - all while designing the authentication, session management, and authorization methods for application users.
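As one small illustration of enforcing a trust boundary (the allow-list pattern sketched below is an assumption about how such a check might look, not a prescribed library), data crossing from the web tier into the application tier is treated as untrusted until it passes explicit validation:

```python
import re

# Allow-list for usernames crossing the web -> application trust boundary:
# 3-32 characters, letters, digits, and underscore only. The pattern and
# function name are illustrative.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,32}$")

def validate_username(raw: str) -> str:
    """Reject anything outside the allow-list before it reaches lower tiers."""
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError("username fails allow-list validation")
    return raw

print(validate_username("alice_01"))  # -> alice_01
```

Validating against an allow-list (what is permitted) rather than a deny-list (what is known to be bad) is the generally recommended approach at trust boundaries, since unknown attack inputs are rejected by default.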
The SSDLC design phase also differs greatly from traditional models in that it usually includes two critical functions: threat modeling and risk management.
Threat modeling is a comprehensive overview of the security features, threats, risks, attack vectors, and vulnerabilities associated with a given application. Put simply, it is all of the key information associated with the cyber-security of a software application.
Threat modeling is often used in conjunction with risk management, which can encompass everything from architectural risk analysis - the identification of all security issues in a software's architecture and the innate risks to business assets as a result - to general risk assessments.
While risk assessments focus on identifying risks to business assets, threat modeling provides the information that serves as the basis for risk analysis: determining attack surfaces through in-depth analysis to identify entry points that may be targeted by cyber-attackers, and identifying high-risk interfaces exposed to external threat actors.
With this data, probable vulnerabilities can be identified - in conjunction with an analysis of the frameworks and programming languages to be used - so that feasible attack vectors can be determined. This information is used to establish which business assets are at risk, and to what extent.
An in-depth risk analysis must systematically consider every threat to business assets, along with its associated risks. However, it is often advantageous to prioritize risks so that defenses are aligned with protecting the most critical business infrastructure. A complete understanding of all threats, attack surfaces, vulnerabilities, security controls, and countermeasures enables a strong defensive posture by establishing both a cyber-attack's likelihood of occurrence and impact on business assets, and the probability of an attacker discovering and exploiting each vulnerability. Based on this, development teams can determine the severity of each risk and rate it accordingly.
Rating risks according to the OWASP methodology first requires estimating the likelihood of an attack, which is based on two sets of factors: threat agent factors and vulnerability factors. Threat agents are any attacking elements that pose a threat to corporate assets; the factors associated with them include skill level, the availability of hacking tools, the opportunities and motives of probable agents, and how many potential threat agents exist. Vulnerability factors include how easily vulnerabilities could be discovered, how feasible they would be to exploit, whether they are well-documented in hacker or security communities, and how easily exploitation would be detected. Combining the threat agent factors with the vulnerability factors allows your company to estimate the likelihood of an attack, and thus the probability associated with each risk to your business assets.
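The published OWASP Risk Rating Methodology scores each factor from 0 to 9 and averages all of them to produce a likelihood estimate. The sketch below uses the factor names from that methodology; the scores themselves are illustrative, not drawn from any real assessment:

```python
# Threat agent factors (0-9 each), per the OWASP Risk Rating Methodology.
threat_agent = {"skill_level": 6, "motive": 4, "opportunity": 7, "size": 5}

# Vulnerability factors (0-9 each).
vulnerability = {"ease_of_discovery": 3, "ease_of_exploit": 6,
                 "awareness": 4, "intrusion_detection": 8}

def likelihood(threat_agent: dict, vulnerability: dict) -> float:
    """Average all eight factor scores to estimate attack likelihood."""
    scores = list(threat_agent.values()) + list(vulnerability.values())
    return sum(scores) / len(scores)

print(round(likelihood(threat_agent, vulnerability), 2))  # -> 5.38
```

On the OWASP scale, a likelihood below 3 is rated low, below 6 medium, and 6 or above high, so the illustrative scores here yield a medium likelihood.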
In the event that a cyber-criminal conducts a successful attack, the level of impact on your corporate systems must be analyzed.
This is done by ascertaining the technical and business impact of an exploit being successfully deployed against your business. The technical impact covers the effects on your technical infrastructure, which may include the loss of confidentiality, integrity, or availability of data (e.g. in the event of a DoS or DDoS attack), depending on the type of attack. The business impact considers the more immediate business losses: fines for non-compliance with data security legislation, lawsuits, loss of reputation, financial loss, and the privacy violations that follow a data breach.
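Under the same OWASP methodology, the technical and business impact factors are each scored from 0 to 9 and averaged. The factor names below follow the OWASP methodology; the scores are illustrative only:

```python
# Technical impact factors (0-9 each), per the OWASP Risk Rating Methodology.
technical = {"loss_of_confidentiality": 7, "loss_of_integrity": 5,
             "loss_of_availability": 5, "loss_of_accountability": 8}

# Business impact factors (0-9 each).
business = {"financial_damage": 6, "reputation_damage": 7,
            "non_compliance": 4, "privacy_violation": 7}

def impact(factors: dict) -> float:
    """Average the factor scores to estimate one impact dimension."""
    return sum(factors.values()) / len(factors)

print(round(impact(technical), 2), round(impact(business), 2))  # -> 6.25 6.0
```

OWASP recommends using the business impact estimate when enough is known about the business, and falling back to the technical impact otherwise.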
The severity of a risk to your business is tied to the overall impact of a breach, i.e. the combined levels of the technical and business impact of a cyber-attack.
Each perceived risk to your business assets is then assigned an appropriate risk level based on an overall risk value, which equals the impact value multiplied by the likelihood value. This value gives a general picture of how exposed your business assets are, and is followed by establishing and implementing suitable security controls and countermeasures to eliminate as much risk as possible.
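The overall risk computation described above can be sketched as follows. The level thresholds follow the OWASP 0-9 factor scale (below 3 is low, below 6 medium, otherwise high); the input scores are illustrative:

```python
def level(score: float) -> str:
    """Map a 0-9 likelihood or impact score to a relative level."""
    if score < 3:
        return "LOW"
    if score < 6:
        return "MEDIUM"
    return "HIGH"

def overall_risk(likelihood: float, impact: float):
    """Overall risk value = likelihood x impact, plus the two level labels."""
    return likelihood * impact, (level(likelihood), level(impact))

value, (lik_level, imp_level) = overall_risk(5.4, 7.1)
print(round(value, 2), lik_level, imp_level)  # -> 38.34 MEDIUM HIGH
```

Note that OWASP itself combines the two *levels* in a severity matrix (e.g. medium likelihood with high impact gives high severity); the numeric multiplication shown here is the simpler likelihood-times-impact scheme the text describes.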
All risks are then prioritized and labeled on a relative scale from high to low, allowing for appropriate remediation of risks to your business systems.
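Prioritization then reduces to ordering the assessed risks by their overall value so that remediation effort goes to the highest-rated items first. The risk names and scores below are hypothetical:

```python
# Hypothetical assessed risks: (description, overall risk value).
risks = [("SQL injection in login form", 7.8),
         ("Verbose error pages", 2.1),
         ("Weak session timeout", 4.5)]

# Sort highest risk first to set the remediation order.
for name, score in sorted(risks, key=lambda r: r[1], reverse=True):
    print(f"{name}: {score}")
```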
Though the above risk assessment methodology is a powerful tool to protect your business systems, as software changes it is important to continuously update and utilize threat models and risk analysis to keep your business infrastructure protected. During the design phase of the SSDLC, establishing and incorporating threat models, risk analysis, and security features before actual development (coding) allows everyone on the development team to fully understand the significance of security - and the importance of ensuring the integration of security features - in the software project.
Because the design phase of the Secure SDLC is the bridge between the "what" of the software's functionality and the "how" of its implementation - coupled with the fact that many security flaws in applications result from faulty design - it is imperative that the design phase is carried out thoroughly and with security in mind.