Software Development

Integrity checking for JavaScript

Including JavaScript files from a CDN can be beneficial in many ways: you don’t have to ship the code with your application, and caching can be done by the browser or a proxy server. But it also allows untrusted code to be injected into your web page, as someone else is hosting the code you rely on. Fortunately, Firefox, Chrome, and Opera already support Subresource Integrity checking for script and link tags. Hopefully, both Safari and Edge (or Internet Explorer) will support it soon.

But how does it work? First, let’s calculate the SHA256 hash of jQuery version 3.2.1 as hosted by Cloudflare. Keep in mind to verify this hash against the official version offered by jQuery. In this example, we download the minified version of jQuery with curl and pipe it through OpenSSL twice: once to generate the checksum and once to encode the result in base64.

$ curl -s | openssl dgst -sha256 -binary | openssl enc -base64 -A

Now that we have the hash, we can add the integrity attribute to the script tag; the “sha256-” prefix indicates the hashing algorithm used. From this point forward a browser that supports Subresource Integrity will require that the provided hash matches the calculated hash of the downloaded file.

<script src="" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=" crossorigin="anonymous"></script>

Besides SHA256, the specification also allows SHA384 and SHA512 to be used. The calculation is the same as with SHA256; we only change the algorithm that OpenSSL uses.

$ curl -s | openssl dgst -sha512 -binary | openssl enc -base64 -A

We could put only the SHA512 hash in the attribute, but we can also put multiple algorithm results in the same attribute by separating them with a space. This leaves a lot of room for proper lifecycle management of hashing algorithms, as you can present multiple hashes while you migrate to a stronger algorithm instead of switching big-bang style and hoping for the best.

<script src="" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4= sha512-3P8rXCuGJdNZOnUx/03c1jOTnMn3rP63nBip5gOP2qmUh5YAdVAvFZ1E+QLZZbC1rtMrQb+mah3AfYW11RUrWA==" crossorigin="anonymous"></script>

The next step is to have a fallback for when the CDN you rely on goes down or is serving corrupt files. You could add a noncanonical-src attribute, as in the example below, that tells the browser to use the Google CDN when Cloudflare has issues serving the correct files.

<script src="" noncanonical-src="" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4= sha512-3P8rXCuGJdNZOnUx/03c1jOTnMn3rP63nBip5gOP2qmUh5YAdVAvFZ1E+QLZZbC1rtMrQb+mah3AfYW11RUrWA==" crossorigin="anonymous"></script>

The next step is to get the Content-Security-Policy header right, but for now only Firefox 49 and higher can act on the require-sri-for directive. This basically forces the browser to only load scripts and style sheets if the SRI checks succeed, but a lot of developers still need to optimize their build pipeline to produce correct hashes and set up monitoring to detect problems.

Internet, Unix en security

Firefox 10 and bye bye Flash

Firefox 10 beta 6 was released last week, and with the final release coming soon it was time to have a closer look at Firefox 10. I must say that this is a release worth installing, like Firefox 5 was with its decent HTML5 video support. But what makes Firefox 10 different from previous releases? The answer is simple: WebGL. WebGL is a way to do 3D programming and rendering directly from within JavaScript.

With Firefox 10, WebGL works, and therefore Google Street View also works without the need for Flash. Yes, another dependency on Flash has been removed. The previous major dependency was YouTube, but as some may have noticed, they are also in a transition from Flash to HTML5 video, where you get the HTML5 variant when Flash doesn’t work.

As more and more websites have switched from a Flash player for video to HTML5 in under a year, it makes you wonder what WebGL is going to change. Where HTML5 was only for the geeks and the cutting edge a year ago, more and more now depends on it. With HTML5 Canvas, a lot of arcade games were rewritten to run in a web browser. With WebGL, the question is when Doom will be rewritten to run in a web browser. Maybe something for a Google Summer of Code project?


Refreshing too often?

Web applications are built on a desktop or on a server on the same network, so problems don’t surface quickly. Apart from things getting a bit slower when you have many users or a lot of data, some developers are tempting the gods. The following is fine when you are the only user and the server is right around the corner, but writing a refresh in JavaScript with an interval of 10 seconds is not the most ideal solution.

var date = new Date();
var ts = Math.round(date.getTime() / 1000); // current Unix timestamp
if (ts - last_scheduled_update > 10 || _force_scheduled_update) {
    // POST to the server to collect the latest updates
}

So the web server’s log file grew quickly, because every browser running this application did a POST to the web server to collect the latest updates. For now this value has been raised to 60 seconds in the code, and most users hardly notice. The value could possibly even go to 300 seconds, which also seems to be an acceptable value for many webmail applications.

var date = new Date();
var ts = Math.round(date.getTime() / 1000); // current Unix timestamp
if (ts - last_scheduled_update > 60 || _force_scheduled_update) {
    // POST to the server to collect the latest updates
}
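The check above can be wrapped in a small function so the interval becomes a tunable parameter; the function and parameter names below are my own, not taken from the application.

```javascript
// Sketch of the throttling check from the snippet above as a testable
// function (names assumed, not from the original application).
function shouldUpdate(nowSeconds, lastScheduledUpdate, forceUpdate, intervalSeconds) {
  // Poll only when the interval has elapsed, unless a refresh is forced.
  return (nowSeconds - lastScheduledUpdate > intervalSeconds) || forceUpdate;
}
```

With a 60-second interval, a poll 50 seconds after the previous one is skipped unless it is explicitly forced, which is exactly the behavior the raised value buys.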

Some developers are apparently still unaware of the choices they make, because suppose this application has to go through a proxy to reach the Internet. Then that proxy server is also loaded unnecessarily, while most people really aren’t waiting for the data in a web application to refresh every 10 seconds. Certainly not when there is enough text to read to fill a minute or more.