CSRF mitigation and SameSite=Strict #3052

Open
@Jxck

Description


https://datatracker.ietf.org/doc/html/draft-ietf-httpbis-rfc6265bis-20#section-5.6.7.1

Lax enforcement provides reasonable defense in depth against CSRF attacks that rely on unsafe HTTP methods (like POST), but does not offer a robust defense against CSRF as a general category of attack:

  1. Attackers can still pop up new windows or trigger top-level navigations in order to create a "same-site" request (as described in Section 5.2.1), which is only a speedbump along the road to exploitation.
  2. Features like [prerendering] can be exploited to create "same-site" requests without the risk of user detection.

My understanding is that the only kind of request attacks 1 and 2 above can generate is a GET. In other words, the attack implicitly relies on the service exposing an API that accepts GET requests with side effects.

In general, there is no fundamental countermeasure against “CSRF via GET” other than ensuring that APIs with side effects are not exposed as GET endpoints; those APIs should instead be implemented via POST (or PUT/DELETE).
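For concreteness, here is a minimal sketch of that constraint using Go's standard net/http; the /transfer path and handler are hypothetical, not something taken from the draft:

```go
package main

import (
	"log"
	"net/http"
)

// transferHandler is a hypothetical state-changing endpoint. It refuses
// safe methods outright, so a cross-site GET triggered by a top-level
// navigation or a prerender never reaches the side effect.
func transferHandler(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodPost {
		w.Header().Set("Allow", "POST")
		http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
		return
	}
	// ... perform the side effect here ...
	w.WriteHeader(http.StatusNoContent)
}

func main() {
	http.HandleFunc("/transfer", transferHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```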

If a service has enough resources to classify its cookies into “read” and “write” and to add the Strict attribute to the “write” cookies, it might be more effective to simply change any side-effecting GET endpoints to POST instead. According to the draft, deploying Strict does offer robust defense in depth, but only in the sense that the cross-site request is still sent while the cookie is withheld. That makes it unsuitable as the final layer of a multi-layer defense, and it seems questionable to recommend it as the primary measure.
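For reference, this is roughly what that read/write split looks like when set from a Go handler; the cookie names and the token value are hypothetical placeholders:

```go
package main

import (
	"log"
	"net/http"
)

// setSessionCookies illustrates the draft's "read"/"write" split
// (cookie names are hypothetical): the Lax "read" cookie is attached to
// top-level navigations, while the Strict "write" cookie is withheld
// from any cross-site-initiated request -- although the request itself
// is still delivered to the server.
func setSessionCookies(w http.ResponseWriter, token string) {
	http.SetCookie(w, &http.Cookie{
		Name:     "session_read",
		Value:    token,
		Path:     "/",
		Secure:   true,
		HttpOnly: true,
		SameSite: http.SameSiteLaxMode,
	})
	http.SetCookie(w, &http.Cookie{
		Name:     "session_write",
		Value:    token,
		Path:     "/",
		Secure:   true,
		HttpOnly: true,
		SameSite: http.SameSiteStrictMode,
	})
}

func main() {
	http.HandleFunc("/login", func(w http.ResponseWriter, r *http.Request) {
		setSessionCookies(w, "opaque-session-token") // placeholder token
		w.WriteHeader(http.StatusNoContent)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```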

Essentially, CSRF countermeasures should be prioritized as follows:

  1. Do not implement APIs with side effects as GET endpoints.
  2. Check the Origin header on any API with side effects (see the sketch after this list).
    • Note that not only fetch() but also <form> submissions send the Origin header in all major browsers now.
  3. Restrict cookie sending (e.g., SameSite=Lax or stricter) so that cookies are not automatically attached when the request originates from a different, attacker-controlled site.
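A minimal sketch of point 2 as standard-library Go middleware; the allowed origin "https://example.com" and the wired-up route are placeholders, and real code would likely compare against a configured set of origins:

```go
package main

import (
	"log"
	"net/http"
)

// requireSameOrigin is a sketch of point 2: state-changing requests must
// carry an Origin header that matches the site's own origin
// ("https://example.com" is a placeholder). Safe methods pass through,
// because point 1 already guarantees they have no side effects.
func requireSameOrigin(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		switch r.Method {
		case http.MethodGet, http.MethodHead, http.MethodOptions:
			// Safe methods: no side effects expected, no Origin check.
		default:
			if r.Header.Get("Origin") != "https://example.com" {
				http.Error(w, "cross-origin request rejected", http.StatusForbidden)
				return
			}
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/transfer", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusNoContent) // side effect would go here
	})
	log.Fatal(http.ListenAndServe(":8080", requireSameOrigin(mux)))
}
```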

Given that points 1 and 2 are properly implemented, I wonder how much value there really is in separating cookies into “read” and “write” categories or in distinguishing between Strict and Lax. It seems to me that there is no major problem with a single session cookie set with Lax, as before, but am I missing something?

I understand that this RFC is not a guideline for CSRF mitigation. Even so, the current example using Strict, the separation into “read”/“write” cookies, and the way they are described as if they were an important CSRF countermeasure all seem likely to invite misunderstanding.
