Effective Note-Keeping for Web Security Code Reviews

One of the recurring questions I get during my Web Security Code Review Training is how to keep notes when multiple reviewers are working on the same codebase. This is a topic I also hear AppSec teams discuss frequently.

The Bad!

Let’s face it: reviewing source code to find vulnerabilities is challenging—very challenging. To make things even harder, it's often a timeboxed exercise. The last thing you want is for reviewers to start from scratch every time they look at a codebase, especially if it has already been reviewed. Instead, each review should build on the previous one, making the process faster and easier for subsequent reviewers. To achieve this, your team needs to capture the right artifacts. This blog post will guide you on how to do that. But first, let’s talk about The Ugly.

The Ugly!

One of the worst practices I've seen for keeping artifacts during code review is the use of generic checklists. These often originate from managers without a code review background, who may create them based on advice from a pentest team, ChatGPT, or an uninformed blog post.

Such checklists often look something like this:

ID      Name                   Details                               Checked [Y/N]
CR-001  Input Sanitization     All inputs are properly sanitized
CR-002  Parameterized Queries  All SQL queries are parameterized
CR-003  Encryption keys        Encryption keys are handled securely
CR-...  ...                    ...

There are only a few reasons why this format might seem satisfactory:

  1. You’re trying to shift blame to the reviewers if something is missed.
  2. You’re not fully aware of what effective code review looks like.
  3. Your review team lacks experience, and you're inadvertently ensuring they don't improve.

This kind of checklist sets people up for failure. Reviewers are likely to resort to grepping for bugs, which usually yields poor results. They won’t develop a deep understanding of the codebase, and you’ll end up with an artifact that serves compliance rather than actual security. Worse, if something goes wrong, the blame will fall on the reviewers.

The Good!

To create a more effective process, I recommend keeping code review notes close to your pentest notes. This minimizes duplicated effort—such as searching for vulnerabilities already identified in a pentest—and promotes cross-pollination of ideas between the pentest and code review teams. For example, if a pentest uncovers a suspicious behavior that couldn’t be exploited, this is gold for code reviewers. Ensure that all insights from one team are accessible to the other.

Regarding format, I suggest using plaintext files or a wiki. Plaintext files are easier to search (especially with grep), but they have limitations in embedding other content types like screenshots or design documents.
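
For example, with one plaintext file per application in a notes/ directory (a hypothetical layout), finding every codebase where a given library was flagged becomes a one-liner:

  # search all notes recursively, case-insensitively, with line numbers
  grep -rin "wkhtmltopdf" notes/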

So, what should these notes include?

First, maintain general notes that provide an overview of the security architecture of the project. For example: “Authentication is handled via ..., authorization is done using ..., file uploads are accepted but don’t touch the disk..., most security checks are implemented in this file...” Also, document any “gotchas” within the application—things that are difficult to figure out or don’t make sense until they do. Highlight fragile aspects of the application, such as: “If they forget to X, it’s a direct XSS vulnerability.” Additionally, include basic project information like where to find the documentation and how to run the application locally. Ideally, you should point to the dev team’s documentation and make it their responsibility to keep it updated.
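
To make the “fragile aspects” point concrete, here is a hypothetical Rails helper (not from any real codebase) of the kind worth flagging: it marks its output as html_safe, so any caller that passes unescaped user input creates an XSS. The matching note would read: “If a caller of banner forgets to escape, it’s a direct XSS.”

  # app/helpers/banner_helper.rb (hypothetical "fragile aspect")
  module BannerHelper
    # The returned string is marked html_safe, so Rails will NOT escape it
    # on output. Every caller must escape user-controlled input first.
    def banner(message)
      "<div class=\"banner\">#{message}</div>".html_safe
    end
  end

  # In a view:
  #   <%= banner(ERB::Util.html_escape(params[:msg].to_s)) %>  <- safe
  #   <%= banner(params[:msg].to_s) %>                         <- direct XSS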

Next, record specific details about each review, including:

  • Date and version/commit ID reviewed
  • Reviewers involved
  • Reason for the review (e.g., new version, incident, new CVE)
  • Key contacts (who to talk to for more information)
  • Areas reviewed and strategies used
  • Areas of concern and possible improvements
  • Weaknesses and vulnerabilities found

For example, you might write: “When conducting the review on [date], reviewers focused on [strategy] and identified concerns related to [area]. Weaknesses included [issues], and vulnerabilities were documented [reference to issue tracker].” Potential improvements should also be noted to help guide future reviews.

It's also essential to establish a process. The first task for each reviewer should be to ensure that the existing notes are up to date. Are the vulnerabilities still present? Are the weaknesses still relevant? Have any of the suggested improvements been implemented?

Below is an example of how you might structure these notes in a plaintext file (you can also leverage vim’s scaffolding to standardize entries; see the sketch after the template):

Application: CorpWiki

Source: https://git.co.....

Last reviewed: 01/09/2024

{{ Contact
The best person to talk to is John Doe <john.doe@corp....> in the CorpWiki team <corpwiki-dev@....>.

}}

{{ Architecture / Security Architecture:

This is a Rails application.
Authentication is done using Devise.
Most authorization checks are done using before_action.
CSRF protection is based on the built-in protection offered by Rails.
The code accepts file uploads, but everything goes to AWS S3.
XSS protection is based on the built-in protection offered by Rails.
Data access is done via DataMapper.

}}

{{ Review done on 02/04/2022:
version: 3.1.0 (commit:8082c74e)

Reason: Initial review
Time spent: 10 days with 2 reviewers
Who: Jane Doe and Richard Roe

Strategies: Mostly reviewed the results of running Brakeman and gained an idea of the overall architecture of the application.

Areas of concern:
  * PDF generation using wkhtmltopdf 
  {{ PDFs are generated using wkhtmltopdf in lib/pdf.rb
    It seems like this may be an issue for ....
  }} 

  * CORS
  {{ CORS is handled by ....
    The check in place seems to be based on the assumptions that...
  }} 

  * File handling
  {{ In lib/pdf.rb, the generated PDFs are added to ...
     This may present an issue if ...
  }}
 

Weaknesses:
  * Outdated version of Devise
  {{ Gemfile.lock line 9
    devise (....)
  }}

  * Some regular expressions don’t use Ruby’s \A and \z anchors (^ and $ only match line boundaries, so multi-line input can bypass these checks)
  {{ app/views/users/show.html.erb line 1337
    name =~ /..../
  
  }}

  *  
  {{

  }} 

Potential Improvements:
  * Secrets management
  {{ config/initializers/hack.rb line 89
    ...
  }}  

  * A lot of parameters are not cast as strings using .to_s
  {{ Example  from app/controllers/articles_controller.rb line 5
      ...

  }} 

  * 
  {{

  }} 

Vulnerabilities discovered:
  * SQL injection found in ... (see: https://... issue #PTL001) 
  {{ app/controllers/blogs_controller.rb line 39
 
  }} 

  * XSS in app/views/.... (see: https://... issue #PTL003)
  {{ app/views/articles/index.html.erb line 33
  ...
  }}

  * CSRF in /users/logout (see: https://... issue #PTL005)
  {{ app/controllers/users_controller.rb line 30
  ...
  }} 

 
}}

{{ Review done on ../../....:
version: ...... (commit: ........)

Reason: ...
Time spent: ...
Who: ...

Strategies: ...


Areas of concern:
  * ...
  * ...
  * ...
  
  
Weaknesses:
  * ...
  {{

  }} 

  * ...
  {{

  }} 

  *  
  {{

  }} 

Potential Improvements:
  * ... 
  {{

  }} 

  * ...
  {{

  }} 

  * ...

Vulnerabilities discovered:
  * ...
  {{

  }} 

  * ...
  {{

  }} 

  * ...
  {{

  }} 

}}
  

Notice that vulnerabilities are referenced rather than fully documented in the review artifact. This helps avoid diluting the key content for reviewers, though it may or may not fit your current process.
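
As for the vim scaffolding mentioned earlier, one minimal sketch uses vim’s skeleton-file mechanism (the file pattern and template path below are assumptions; adapt them to your setup):

  " In ~/.vimrc: pre-fill any new review notes file with the empty template
  autocmd BufNewFile *-review.txt 0r ~/.vim/templates/code-review.txt

Here, ~/.vim/templates/code-review.txt would contain the empty {{ }} sections shown above, so every new notes file starts with the same structure.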

I hope this article provides you with a strong foundation for documenting your team’s code reviews. Make sure you tailor this content to fit your team’s needs and processes.

Written by Louis Nyffenegger
Founder and CEO @PentesterLab