Tim's Tech Blog

CSRF When You Least Expect It.

Update 2022-12-21: I acknowledge this blog post isn't as fun or interesting as I wanted it to be, and it also contains a couple of minor errors.
I still believe it covers an interesting topic, and I'll find a way to do it better.
Thanks again to anyone who has read it and offered me their feedback.

Update 2022-12-22: This post previously stated that cookies were set with SameSite=Lax by default in modern browsers, which is incorrect.
Updated with accurate browser compatibility info, links to the updated standard, MDN compatibility table, Firefox bug tracker.


Photo: a surfer falling off their board, black & white (Kaitlyn Jackson, Unsplash)

Cross-Site Request Forgery (CSRF), a type of attack where the user's browser is tricked into performing unwanted actions on a target website on behalf of the user, is among the most common web application security risks.
As such, CSRF is usually well known among web developers, and most of today's websites are sufficiently protected against its most severe forms through a set of best practices & defense mechanisms (OWASP), e.g. CSRF tokens, request origin checks, and not using GET requests for state-changing operations.

By the way, feel free to take a look at my Cross-Origin Resource Sharing For The Web (Extension) presentation for a refresher on the topic.

SameSite Cookies ? #

SameSite cookies allow you to declare whether a cookie should be restricted to a first-party or same-site context. In other words, with either SameSite=Lax or SameSite=Strict, a cookie will not be forwarded in cross-origin subrequests. Declaring session cookies as same-site is the best defense-in-depth mitigation against CSRF.
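For reference, here is how the attribute appears in Set-Cookie response headers (the cookie name & value below are illustrative). Strict cookies are never sent cross-site, Lax cookies are sent only on top-level navigations using safe methods, and None cookies are sent in all cross-site requests (and must also be marked Secure):

```http
Set-Cookie: session=opaque-token; Secure; HttpOnly; SameSite=Strict
Set-Cookie: session=opaque-token; Secure; HttpOnly; SameSite=Lax
Set-Cookie: session=opaque-token; Secure; HttpOnly; SameSite=None
```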

SameSite=Lax by default ? #

The standard (the RFC 6265bis specification) recently changed so that SameSite=Lax should be the default behavior when the SameSite attribute is not specified in the Set-Cookie HTTP response header.
However, as of today (December 22nd 2022), this default behavior is either not yet implemented (Safari) or not yet enabled by default (Firefox), which amounts to 22% of current browser coverage according to browserslist.

SameSite=Lax default behavior browser compatibility: #

See SameSite Cookies Browser Compatibility on MDN.

SameSite=None #

However, there are still many websites out there that rely on SameSite=None.

But since these sites are using other anti-CSRF mechanisms & SameSite cookies are “defense in depth”, this should be fine, right? RIGHT?
Well, mostly, I mean, kinda ... it depends ... ok not really.

Introducing Timing Attacks #

A timing attack is a side-channel attack in which an attacker is able to gather information by measuring the time various operations take to run.
It is usually referred to in the context of cryptographic algorithms, or in the case of transient execution CPU vulnerabilities such as the famous Spectre & Meltdown.

Timing Attacks ... on the Web ? #

However, per Wikipedia, “timing attacks can be applied to any algorithm that has data-dependent timing variation”. And what better candidates for “data-dependent timing variation” than today's websites: gated behind authentication & user accounts, and querying a myriad of unequally optimized microservices in order to display user-specific content?
It is fair to assume that the servers handling requests for pages such as user dashboards or social media timelines behave slightly differently depending on whether or not the user is logged in.
Indeed, in a lot of cases, those requests will take significantly longer to be answered than the login page that is served (or redirected to) in their place.

Note: this assumption may not hold for some websites: nowadays even personalized content can be heavily cached, and public content may not be faster to retrieve either.
However, as mentioned earlier, things are rarely equally optimized, which means that even though the main personalized content of a given website may be very fast to retrieve, that may not be the case for more niche content, located at another URL.

Ok that's cool and all, but what does it have to do with CSRF & SameSite Cookies?

CSRF Based Web Timing Attacks #

Given a website that uses SameSite=None for its session cookies, even with some safeguards against the most common & severe forms of CSRF in place, we can still perform attacks that fall into the CSRF category, in an actually much simpler way than with the usual forged cross-site requests.
The twist resides in sending out two simple GET requests, one with & one without credentials.

We can do this easily with JavaScript using the Fetch API:

fetch("https://target.example.com/", {
  credentials: "include", // or "omit" for the uncredentialed request
});

Note: for optimal results on a case-by-case basis, one may further tune the parameters of the requests & the behavior of the browser according to the Fetch Standard, using the mode & redirect options.

Now these requests are made from a given origin, let's say https://evil.example.com, to another origin, say https://target.example.com.
Of course, in real life those would be totally different domains, but in any case, they're different origins, and, for this example to be correct, we're assuming that session cookies weren't set with the Domain attribute.

Due to Same-Origin Policy (and the target origin presumably not allowing our evil origin with the Access-Control-Allow-Origin: https://evil.example.com response header), the responses from these requests are opaque and cannot be read, so we can't poke around in the response HTML for the actual content.
But we're fine with that, as mentioned earlier, we're interested in the timing of things, and that, we do have access to!
All we have to do is to measure the time it took between the firing of the request, and the subsequent settlement (rejection) of the Fetch promise.
Again, this is only possible because https://target.example.com is using SameSite=None session cookies.

const start = Date.now();
const total = await fetch("https://target.example.com/", {
  credentials: "include", // or "omit"
})
  // swallow the likely NetworkError occurring from the CORS error
  .catch(() => {})
  .then(() => {
    const end = Date.now();
    return end - start;
  });
With the preceding code, we are able to put together an attacker webpage, get a hypothetical victim to visit it, and begin poking around to find out whether, for a given URL, fetches take significantly longer to respond with credentials (cookies) included or not.
If they do, chances are you are able to detect whether or not the user is logged-in to this particular website.

However, in order to have more confidence in the results, we should update our exploit code to define what constitutes "significantly longer"; for instance, we could draw the line at a 40% tolerance.
We also might want our code to repeat the experiment a few times and require that the results agree 80% of the time, so we can tell they are consistent.
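The decision step could be sketched as a pure function over the collected timings. The helper name, the 40% tolerance, and the 80% consistency threshold below follow the numbers above, but the exact logic is an illustrative assumption, not the post's actual exploit code:

```javascript
// Hypothetical decision helper: given paired timing samples (ms) for
// credentialed and uncredentialed fetches of the same URL, guess whether
// the victim is logged in.
const TOLERANCE = 0.4;   // credentialed must be >= 40% slower to count
const CONSISTENCY = 0.8; // require agreement in 80% of the trials

function probablyLoggedIn(withCreds, withoutCreds) {
  // Count trials where the credentialed request was significantly slower
  // than its uncredentialed counterpart.
  const hits = withCreds.filter(
    (t, i) => t > withoutCreds[i] * (1 + TOLERANCE)
  ).length;
  return hits / withCreds.length >= CONSISTENCY;
}

// Example: 5 paired trials, in milliseconds.
console.log(probablyLoggedIn([320, 310, 300, 295, 330], [180, 175, 190, 185, 170])); // true
console.log(probablyLoggedIn([190, 185, 200, 180, 195], [180, 175, 190, 185, 170])); // false
```

In a real exploit page, the two arrays would be filled by repeatedly running the timed fetch shown above, once with credentials: "include" and once with "omit".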

View Gist with complete example code of the malicious page

How could this be used ? #

Such web timing attacks could be used by bad actors to further identify potential targets, for example as the first stage of a multi-stage attack: starting from a spam campaign linking to a malicious webpage controlled by the attacker and containing the exploit code, allowing them to identify whether or not victims are users of specific sites or services, before engaging in a more targeted phishing campaign.
Depending on the website or service being targeted, and after careful analysis by the attackers, one may even be able to gather more precise information about the victim than just their authentication status: potentially their customer profile (e.g. a specific page of a given website may take a very long, or otherwise telling, time to load in the context of a large business).

How to prevent such web timing attacks ? #

  1. Do not use SameSite=None, use either SameSite=Strict or SameSite=Lax: this way, cookies will not be forwarded in cross-origin subrequests.
  2. Use a modern, up-to-date web browser; in fact, use Firefox.
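Point 1 can be checked mechanically. The helper below is a hypothetical sketch (its name and verdict strings are my own, not from this post) of auditing a Set-Cookie header for session-cookie safety against cross-site subrequests:

```javascript
// Hypothetical audit helper: flags Set-Cookie headers that leave session
// cookies exposed to cross-origin subrequests.
function samesiteVerdict(setCookieHeader) {
  const match = setCookieHeader.match(/SameSite=(Strict|Lax|None)/i);
  // No explicit attribute: behavior depends on the browser's default.
  if (!match) return "missing (browser default varies, see above)";
  return match[1].toLowerCase() === "none"
    ? "unsafe for session cookies"
    : "ok";
}

console.log(samesiteVerdict("session=abc; Secure; HttpOnly; SameSite=Lax"));
// "ok"
console.log(samesiteVerdict("session=abc; Secure; SameSite=None"));
// "unsafe for session cookies"
console.log(samesiteVerdict("session=abc; Secure"));
// "missing (browser default varies, see above)"
```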

Bad Performance is Bad Security #

One slightly disingenuous way of looking at this is to blame the feasibility of this attack on a lack of performance focus: if everything is as fast as it can be, then a third party cannot infer private information based on response times.
Alternatively, one other way to look at this is to aim for making all requests roughly respond in the same time, by artificially slowing them down, but that would be silly, please don't do that.

Don't try this at home #

Or rather, please do try this at home, but don't use this for nefarious purposes, I wrote this article for fun and profit and somewhat educational purposes.

Share the love #

If you enjoyed this article, please feel free to share it and let me know on Twitter or Mastodon, don't hesitate to take a look at my about page as well.
Did I mention I'm looking for a job?