Everyone is shifting left. This isn’t a dance move. It’s a much bigger move, one that affects the tech world and everyone from developers to the general public. But shifting left comes with problems: best practices are not always clear, nor easy to implement. We’re going to cover how you can be the Justin Timberlake of smoothly shifting left.

Things we’ll explore:

  • GitLab’s DevSecOps Report
  • Best security practices for DevOps
  • Monolithic vs. microservices architecture
  • How to securely move to the cloud and microservices

Revelations from GitLab’s DevSecOps Report

One of the most polarizing aspects of software development today is security. Everyone knows the need, but no one knows who should be responsible or how to create frictionless security practices.

According to the GitLab 2019 Global Developer Report: DevSecOps, nearly 70 percent of developers say they are expected to write secure code. But in most organizations, the mechanisms that enable developers to code securely remain elusive.

Incorporating security into DevOps yields three times the effectiveness in discovering vulnerabilities and a 90 percent increase in the likelihood of security testing.

Good DevSecOps implementation is critical if the goal is more secure code. Developers alone cannot do it. Per the GitLab report, incorporating security into DevOps yields three times the effectiveness in discovering vulnerabilities and a 90 percent increase in the likelihood of security testing. By testing in CI/CD, code can be vetted before each release. The onus is on getting the right real-time tools that integrate easily into existing toolsets.
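As a sketch of what testing in CI/CD can look like, here is a minimal GitLab CI job that runs a security scan on every merge request. The stage layout, image name, and `scanner` CLI are illustrative placeholders, not any specific product's interface:

```yaml
stages:
  - build
  - test
  - security

# Illustrative security gate: fail the pipeline when the (hypothetical)
# scanner finds high-severity issues, so vulnerable code never ships.
security_scan:
  stage: security
  image: example/security-scanner:latest   # placeholder scanner image
  script:
    - scanner --target . --fail-on high
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
```

Because the job runs before each release, developers get findings while the change is still fresh, instead of at the end of the cycle.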

Despite the expectation and obvious need for secure code, 49 percent of surveyed security professionals report struggling to get developers to make vulnerability remediation a priority. The reasons are evident: development processes are constrained by how the market has evolved. We cannot blame developers who are set up to fail at either security or dev deliverables. The pressure to release is enormous. Testing has historically been isolated and overly intrusive. The evolution of DevOps has shrunk the full dev cycle, and security is too frequently relegated to the end of the process. For the same reason, only 44 percent of organizations report that security vulnerabilities are a performance metric for developers.

Only 44 percent of organizations report that security vulnerabilities are a performance metric for developers.

GitLab’s report is a timely demonstration of exactly how important DevOps and DevSecOps have become for the industry. You have likely encountered the crossroads between the need for speed in DevOps lifecycles and slow-and-safe security. Now let’s take a look at where these results leave us.

Monolithic vs. Microservices Architecture: Security Considerations


In all likelihood, microservices and containerized apps are already part of your architecture design, or you are actively moving to a containerized cloud environment. Either situation provides an excellent opportunity to step up your DevOps and DevSecOps games.

Let’s take a look at the difference between traditional monolithic architecture and microservices architecture. Monolithic architecture is the traditional model for designing and developing software: the application consists of a single, self-contained unit in which all of the code lives in a single code base and the modules are interconnected. At deployment time, the entire code base is deployed, and scaling is achieved by adding additional nodes.

Microservices architecture, by contrast, is built from modules, each addressing a specific task or business objective. Microservices were created to overcome the constraints of monolithic applications. But the addition of microservices creates complications of its own.

Because microservices are modular in the simplest form, they help build an application as a suite of small services… written in any programming language and/or using different data storage techniques.

As applications grow larger and larger, this tight coupling between components results in slower and more challenging deployments. Microservices solve these challenges. Because microservices are modular in the simplest form, they help build an application as a suite of small services, each running in its own process and independently deployable. That independence, however, creates incredible diversity between services: each may be written in any programming language and/or use different data storage techniques.

A key takeaway from the GitLab survey is that teams need a single solution: something that can provide visibility into both sides of the process for streamlined development and deployment with security. Single solutions have to be very sophisticated to parse all the various data types and differences between microservice deployments.

3 Security Considerations for Shifting Left in DevOps

Three top recommendations for creating real-time, effective security for DevOps are:

  • Automation
  • Machine learning
  • Containers with runtime protection

Combining automated security testing and protection with progressively stronger machine learning and advanced neural networks provides the specialized security needed for complex microservices architectures.

Each of these components helps security teams cover more ground with less work, but together they supercharge security, both in present DevOps lifecycles and as an organization grows and cycles speed up. A containerized, self-contained environment comes with a lot of advantages: faster deployment, better scalability, and closer parity between development environments and security processes. These three elements help build performance-focused coding practices into the design pattern. They help developers become security practitioners without grossly disrupting the CI/CD workflow.

Automated testing frees up developers.

Automated security testing is a no-brainer for fixing the problems facing DevOps adopters, who realize that increasing responsibility lands squarely on developers. Hoping that already taxed developers can become constantly up-to-date security professionals is a fool’s errand.

Through automation, developers can meet their responsibility to facilitate fast releases and quicker code fixes.

Automating security testing in CI/CD, such as with Wallarm’s FAST, leverages the testing developers are already doing. Automated testing can create and update baselines before every release, drawing on the OWASP Top 10 and advanced libraries informed by abnormalities discovered across clients. Through automation, developers can meet their responsibility to facilitate fast releases and quicker code fixes. It frees teams from the impracticality of demanding safer, more homogeneous coding or heavy training.
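To make the idea concrete, here is a minimal, hypothetical release-gate test in the spirit of an automated baseline: it replays a few well-known attack payloads against an input handler and reports any that pass through untouched. The `sanitize` function and payload list are illustrative stand-ins, not part of any specific tool:

```python
import html

# Hypothetical release-gate test: replay known attack payloads against an
# input handler before every release. sanitize() stands in for whatever
# input handling your service actually performs.
ATTACK_PAYLOADS = [
    "<script>alert(1)</script>",   # reflected XSS probe
    "' OR '1'='1",                 # classic SQL injection probe
    "../../etc/passwd",            # path traversal probe
]

def sanitize(user_input: str) -> str:
    """Toy input handler: HTML-escape everything before echoing it back."""
    return html.escape(user_input, quote=True)

def run_security_baseline(handler) -> list:
    """Return the payloads that survive the handler unchanged (failures)."""
    failures = []
    for payload in ATTACK_PAYLOADS:
        if handler(payload) == payload:  # payload passed through untouched
            failures.append(payload)
    return failures

if __name__ == "__main__":
    # The toy sanitizer escapes HTML but does nothing about path traversal,
    # which is exactly the kind of gap an automated baseline surfaces.
    print(run_security_baseline(sanitize))
```

Run before each release in CI, a check like this catches regressions without requiring developers to become security specialists.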

Machine Learning for Growth

Machine learning (ML) works with automation to detect potential threats and vulnerabilities based on abnormalities in data. ML can monitor an organization’s own traffic, learning what is “normal” for the organization. In addition to monitoring for previously discovered threats drawn from a library, machine learning can identify changes or abnormalities and flag them for investigation. This means that the initial automation is increasingly augmented by detection technology that gets smarter and stronger as it runs.

ML can also lower the rate of false positives by learning from previously identified false positives. This helps teams work more efficiently, rather than duplicating previously completed tasks or manually sorting through high volumes of false positives. It also reduces fatigue, since security professionals no longer spend time needlessly sorting through data.

Machine Learning is an Increasingly Necessary Part of Cybersecurity

Choosing a containerized environment.

Containers are simply a far better choice for microservices architecture than virtual machines.

Runtime containers are dramatically better for security because of the isolation they provide. They encapsulate a lightweight runtime environment for your application and present a consistent software environment that can follow the application from the developer’s desktop, through testing, and onto the final production deployment. This allows you to run containers on physical or virtual machines. Containers perform execution isolation at the operating system level, whereas virtual machines isolate at the hardware level.

A single operating system instance can support multiple containers, each running in its own separate execution environment. By running multiple components on a single OS, you reduce overhead and free up processing power for increased efficiency.

Containers are a powerful option to consider while migrating or transforming your architecture. Because containers enable multiple execution environments to exist on a single VM, multiple components can coexist in a single VM. Containers are also much smaller than virtual machines, as small as one-tenth to one-hundredth the size of a VM. Since they don’t require an OS to be spun up each time, containers are also more efficient to initialize. Overall, containers start in seconds or even milliseconds.

Of course, containers also raise their own security concerns and additional challenges. Traditional security tools used for monolithic applications are no longer applicable; they are incapable of securing microservices and containers. Applications based on microservices architecture can comprise thousands of containers, which add up to a significantly larger attack surface.

Security Best Practices for Larger Attack Surfaces in Microservices Environments

How can we introduce security into our applications without losing the benefits of microservices architecture? To answer that question, let’s go over some of the best practices you can use.

Immutable containers help protect microservices.

With immutable containers, developers fix any defects or vulnerabilities simply by rebuilding and redeploying the containers. Developers should store data outside the container, so that when containers are replaced, all the data is still available to the new versions. It also means you need to know where each container originates.
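A minimal sketch of this pattern, assuming Docker Compose: the application container is treated as disposable and rebuilt for every fix, while its state lives in a named volume outside the container. The image name and paths are illustrative placeholders:

```yaml
# docker-compose.yml (illustrative): the app container is rebuilt and
# redeployed on every fix; its data lives in a named volume outside it.
services:
  app:
    image: registry.example.com/myapp:1.4.2   # pinned tag, rebuilt to patch
    read_only: true                           # immutable at runtime
    volumes:
      - app-data:/var/lib/myapp               # state survives redeploys

volumes:
  app-data:
```

Pinning an exact image tag also supports the point about provenance: you always know which build of which image is running.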

A dynamic policy can create a baseline of normal communication and notify you of any traffic spikes or unusual traffic flows.

Create trusted image repositories.

To keep your containers updated and free from known vulnerabilities and malicious code, establish a trusted image repository and run images only from that trusted source. In addition, developers should check application signatures in their scripts before putting containers into production.

The right tools and policies for containers.

Use container-native monitoring tools that first collect events and then examine them against your security policies. A deterministic security policy can define which services can run and which containers are allowed to make, for example, an external HTTP request. A dynamic policy can create a baseline of normal communication and notify you of any traffic spikes or unusual traffic flows.
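A deterministic policy can be as simple as a lookup table that collected events are checked against. The services, actions, and policy shape below are hypothetical, meant only to illustrate the idea:

```python
# Hypothetical deterministic policy: which services may run, and which
# containers may make outbound HTTP requests. Events collected by a
# container-native monitor would be checked against this table.
POLICY = {
    "billing":  {"allowed": True,  "outbound_http": False},
    "frontend": {"allowed": True,  "outbound_http": True},
}

def check_event(service: str, action: str) -> bool:
    """Return True if the event complies with the policy."""
    rules = POLICY.get(service)
    if rules is None or not rules["allowed"]:
        return False                      # unknown or disallowed service
    if action == "outbound_http":
        return rules["outbound_http"]
    return True

print(check_event("frontend", "outbound_http"))  # permitted by policy
print(check_event("billing", "outbound_http"))   # violation: flag it
print(check_event("cryptominer", "start"))       # unknown service: flag it
```

Anything that fails the check is exactly the kind of event the monitoring tool should surface for investigation.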

Access is vital to microservices security.

Access is another critical factor in microservices security. Secure access to microservices with API access control; this is absolutely fundamental to truly securing applications built from microservices. Multiple, independent API services require additional tools to manage API access control, so be thorough with each tool you use.
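As one hedged illustration of API access control at a single service, here is a token-and-scope check wrapped around a handler. The in-memory token table, scope names, and request shape are all hypothetical; real deployments typically delegate this to an API gateway or OAuth/JWT validation:

```python
import functools

# Hypothetical token table; in practice, claims would come from a verified
# JWT or an API gateway, not an in-memory dict.
TOKENS = {"s3cret-token": {"scopes": {"orders:read"}}}

def require_scope(scope):
    """Reject requests whose token is missing, invalid, or under-scoped."""
    def decorator(handler):
        @functools.wraps(handler)
        def wrapper(request):
            claims = TOKENS.get(request.get("authorization"))
            if claims is None:
                return {"status": 401, "body": "invalid token"}
            if scope not in claims["scopes"]:
                return {"status": 403, "body": "missing scope"}
            return handler(request)
        return wrapper
    return decorator

@require_scope("orders:read")
def list_orders(request):
    return {"status": 200, "body": ["order-1", "order-2"]}

print(list_orders({"authorization": "s3cret-token"})["status"])  # authorized
print(list_orders({"authorization": "bogus"})["status"])         # rejected
```

Because every microservice exposes its own API, each one needs an equivalent check; a missing check on any single service undermines the rest.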

Create Defensive Depth

Protecting your microservices will never come down to a single solution or practice. You need to think like a security expert. Think: defense in depth. We have to create defensive depth at the microservice level, not just top-down. The defense-in-depth approach creates multiple layers of security to prevent attacks.

Defense in depth includes security measures such as:

  • filtering communication flows;
  • authentication and authorization;
  • access controls for microservices; and
  • using encryption technologies specific to a group.
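The layered idea above can be sketched as a chain of independent checks that a request must pass in order; the layer functions and request fields here are hypothetical simplifications of flow filtering, authentication, and access control:

```python
# Hypothetical defense-in-depth sketch: a request must pass every layer
# before reaching the microservice, so bypassing one control is not fatal.
def flow_filter(req):
    return req.get("source") == "gateway"         # filter communication flows

def authenticate(req):
    return req.get("user") is not None            # authentication

def authorize(req):
    return "orders" in req.get("roles", set())    # access control

LAYERS = [flow_filter, authenticate, authorize]

def handle(req):
    for layer in LAYERS:
        if not layer(req):
            return "rejected"
    return "served"

good = {"source": "gateway", "user": "ana", "roles": {"orders"}}
bad = {"source": "internet", "user": "ana", "roles": {"orders"}}
print(handle(good))  # passes every layer
print(handle(bad))   # fails the flow filter, never reaches the service
```

Each layer is deliberately simple; the security comes from an attacker needing to defeat all of them, not any one.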

These container groups are specific to certain microservices, but they share a single host operating system kernel; without these additional measures, attackers face little difficulty in moving between the different groups.

Again, you need to integrate automated security testing into your build or continuous integration (CI/CD) process. Automation is the key to integrating quality protection in a way that ensures quick feedback on the impact of new changes. Continuous security testing for continuous software development helps realize speed and flexibility and ensures faster recovery.

With integrated security, defects can be found much faster. Test automation that makes it possible to test early and fast gives stakeholders the feedback they need to accurately assess risk, which helps make better decisions at the business level.

Above All: Avoid These Security Mistakes in DevOps Environments

Trying to choose between speed and quality instead of implementing both is an incredibly bad decision. How you find the balance between the two depends on your business case. So, what is critical to the business? You have to align security goals with business needs to determine how fast you want to go and at what risk level. The faster you go, the more likely security gets shortchanged.

The main consideration is not to compromise on quality. Sometimes people put too much emphasis on moving as fast as possible. The business needs to understand that unchecked release speed is not realistic; it can hobble reasonable security efforts, which don’t have to work against reasonably quick releases.

Rigidity can keep you from adopting best practices and changing to the right tools for your environment. When we talk about the move to microservices, we are really talking about a tremendous amount of change across the organization. In navigating that transformation, build in flexibility within which you can continuously improve.

To that end, with great change comes a great responsibility to educate. Be patient with the process, educating and creating consensus both within and across teams. Complete understanding and buy-in help identify where processes are not working or could be improved.


To recap:

  • Never choose between speed and quality
  • Prioritize creating a balance in DevOps that allows fast, safer releases
  • Be flexible when undergoing any organizational transformation
  • Educate team members and larger org teams around best practices and new approaches