Updated: Apr 18, 2019
Cylidify's core tenets are "balance", "compromise", and "convergence", all of which can be viewed through the lens of a Venn diagram. A Venn diagram, with each region's scope and size and the intersections between them, can help you understand a challenge (or "opportunity") and illustrate the benefits of your plans and investments. In this post, we'll explore static versus dynamic analysis with a focus on their Venn diagram (mostly the intersection).
Synopsys has created a good illustration of each approach, but basically you are analyzing something that is either:
Static: not operating (dormant), i.e., architecture, design, code, or a documented process/practice. See also SAST or white-box approaches. - OR -
Dynamic: operating (at run-time), i.e., systems, solutions, applications, or a process/practice as they execute. See also DAST or black-box approaches.
For development, we recommend an emphasis on dynamic analysis. The right balance depends on your stack and deployments, but the general trend is toward dynamic. This is especially true as DevOps proliferates, with components and services being integrated (late bindings) and scaled to form solutions: common in software or platform "as a Service" and "cloud" approaches, and the norm in public clouds. In SaaS and cloud, deployments and configurations are highly dynamic, which increases the value of a similarly dynamic style of evaluation. This is not to say that static analysis isn't valuable, but rather that it carries less weight in the balance with dynamic analysis in most modern scenarios. You must still do static analysis of components and applications! Static analysis via tools (e.g. Fortify or Checkmarx) should be part of any Security Development Lifecycle (SDL). Static analysis can start earlier in an SDL/SDLC and reduces the risk in components as they move through dynamic analysis and into production.
So, lean your investments toward dynamic analysis as needed while keeping the following in mind:
Left-shift dynamic analysis as early as possible in your SDL, narrowing the gap to static analysis and creating a more seamless transition.
Document and use separate "fix" bars and time periods for static and dynamic analysis findings. For example: Static: fix all high or critical issues within weeks, triage mediums (biasing toward demotion), and suppress lows and informationals. Dynamic: fix or mitigate all high or critical issues within days, triage mediums (biasing toward promotion), and filter lows and informationals, backlogging the lows. Also, have a discrete, documented, and proven mechanism for deploying security "patches".
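Fix bars like these are easiest to enforce when they live in pipeline tooling rather than a wiki page. As a minimal sketch (the policy table, severity names, and SLA values below are illustrative assumptions, not part of any specific tool), the separate bars could be codified as:

```python
from datetime import timedelta

# Illustrative "fix bar" policy: separate SLAs for static vs dynamic findings.
# The durations and verdicts are example values; set and document your own.
FIX_BARS = {
    "static": {
        "critical": timedelta(weeks=2),
        "high": timedelta(weeks=2),
        "medium": "triage",        # bias toward demotion
        "low": "suppress",
        "informational": "suppress",
    },
    "dynamic": {
        "critical": timedelta(days=3),
        "high": timedelta(days=3),
        "medium": "triage",        # bias toward promotion
        "low": "backlog",
        "informational": "filter",
    },
}

def disposition(analysis_type: str, severity: str):
    """Return the required action for a finding: a fix deadline or a triage verdict."""
    return FIX_BARS[analysis_type][severity.lower()]
```

For example, `disposition("dynamic", "High")` yields a days-scale deadline while `disposition("static", "low")` yields "suppress", keeping the two bars visibly distinct in code review and in the pipeline.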
Run dynamic analysis regression passes and port/endpoint scans when environments, deployments, integrations, configurations, or external controls change (e.g. perimeter/firewall rules, access grants, etc.).
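One lightweight way to catch that kind of drift is to diff each scan's observed exposure against the last known-good baseline. This is a sketch under assumed inputs (integer ports here; the same shape works for endpoint URLs gathered by any scanner):

```python
def diff_exposure(baseline: set, observed: set) -> dict:
    """Compare a port/endpoint scan against a known-good baseline.

    Newly open ports are candidate findings for a dynamic regression
    pass; newly closed ports may signal a broken integration or an
    unintended firewall change.
    """
    return {
        "newly_open": sorted(observed - baseline),
        "newly_closed": sorted(baseline - observed),
    }

# Example: baseline from the last approved deployment vs. today's scan
# (the observed set would come from nmap output or a socket sweep).
baseline = {22, 80, 443}
observed = {80, 443, 8080}
delta = diff_exposure(baseline, observed)
```

Triggering this diff from the same automation that applies environment or firewall changes makes the regression pass an event-driven habit rather than a calendar chore.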
Stay current, agile, and ready for opportunities to add value through emerging analysis approaches such as: automated or "AI"-driven active monitoring and alerting/action tools (a super-set of dynamic analysis with improving capability and value), Runtime Application Self-Protection (RASP), Interactive Application Security Testing (IAST), and steadily left-shifting offensive security techniques.
Remember that this is a balancing exercise, so these approaches are complementary: "in addition to" rather than "instead of" each other.
One of the high-value practices Cylidify recommends and helps organizations implement is Threat Modeling. This is a long-standing but evolving practice that forms the Venn intersection and an effective transition point between static and dynamic analysis. Threat Modeling relies on a diagram of the system to statically define the components, data, interactions, and controls, but shifts to dynamic in the modeling, analysis, and gamification aspects. Stay tuned for future posts on Threat Modeling; in the meantime you can peruse materials from Adam Shostack: his Threat Modeling book, LinkedIn Learning courses, and the Elevation of Privilege (EoP) card game.
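As a taste of the static half of the practice: a threat model often starts as a simple inventory of elements, with candidate threats enumerated per element type. The sketch below uses the well-known STRIDE-per-element mapping (the element names are invented for illustration, and real models add trust boundaries and mitigations):

```python
# Minimal sketch of STRIDE-per-element threat enumeration.
# Mapping follows the common STRIDE-per-element convention.
STRIDE_BY_TYPE = {
    "process":         ["Spoofing", "Tampering", "Repudiation",
                        "Information disclosure", "Denial of service",
                        "Elevation of privilege"],
    "data_flow":       ["Tampering", "Information disclosure",
                        "Denial of service"],
    "data_store":      ["Tampering", "Repudiation",
                        "Information disclosure", "Denial of service"],
    "external_entity": ["Spoofing", "Repudiation"],
}

def enumerate_threats(elements):
    """Yield (element, threat) pairs to seed analysis and dynamic testing."""
    for name, element_type in elements:
        for threat in STRIDE_BY_TYPE[element_type]:
            yield (name, threat)

# Hypothetical model: a web app, one data flow, and its backing store.
model = [("web app", "process"),
         ("login request", "data_flow"),
         ("user DB", "data_store")]
threats = list(enumerate_threats(model))
```

The static output (the threat list) then becomes the work queue for the dynamic side: each pair is a hypothesis to probe during testing, which is exactly the transition point described above.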