oh noes, o cert my *sniff* cert

September 26th, 2011

[image: papieren bitte]

I’m not going to tell you about DigiNotar, whose filing for bankruptcy this month shocked no one, coming as it did after the company lost the keys to the grand vault in which the government held so much stock. Though I have many comments on the sophistication of the player who so thoroughly owned the most trusted agencies of the digital age…

The cracker hardly needed them skillz, considering it has been a challenge to keep that whole corrupt industry accountable. The trouble with the central authority system is that even if only one of the keys is compromised, the system is broken and gives no assurances whatsoever. No warning bells either. Just a sweet silent man in the middle, passing along all the best parts to his lover.

It’s not a joke for the 300,000+ people who demonstrably had their emails and facepalms compromised. It was kind of him to give an interview, and we wait in awe for his next move.

I’m not going to mention the fatal flaws in certificate revocation that became embarrassingly apparent when the damage was done.
Not that it matters much, since this kind of thing is bound to crop up: that hole in TLS was deemed unexploitable – now there’s a Titanic if I ever saw one. Unsinkable. Too fat to die.
[image: cert failure]

SSL is an open book for those who dare to look, and it’s got more than a couple old bugs. It’s okay though, we can patch it, they will say. Dare to look the other way!
Not that you need those anyway, since there are some really nice sslsnarfing techniques out there that entirely forgo attacks on SSL as “too inefficient”.

But I say nay! Unacceptable. There is another way… and we’re already doing it! We sign our own certificates and we back each other’s signatures.
Now that’s business, something the companies on your CA trust list were painfully aware of when they laid down the law of the code and put themselves on that list. Yet CAcert is still not on your trust list, and warning bells go off on some of the most trustworthy sites – the self-signed ones.

Just don’t ask them why or how, or anything that isn’t directly relevant. Do you even know what is on your trust list? You might just be surprised at what you can find.

# ls -al /etc/ssl/certs | wc -l
479

How many of these do you trust? How many of these should you trust? I’ll tell you: *none*.

We should not be adding static lists of central signing authorities to our systems. This is a brittle and dangerous system. We knew this, but hackers have now thankfully demonstrated it.
A better way is for every person (and by extension every browser) to keep their own list of signing certs, and to exchange those certs with their friends (automagically, if you like). Your friends list can come out of a social network – any social network – and it means that any site vetted by one or more of your friends will likely be safe for you to use as well. Better still, you can check certs from multiple friends and detect discrepancies.
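You can do a poor man’s version of that cross-check from a shell already. A minimal sketch, assuming openssl is installed and with example.com standing in for the site you care about – the fingerprint is what you’d compare notes on, and a mismatch between friends smells like a man in the middle:

# print the certificate fingerprint as seen from this vantage point
echo | openssl s_client -connect example.com:443 2>/dev/null \
  | openssl x509 -noout -fingerprint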

[image: green padlock]
That, my friends, is called the Web of Trust, and it is a design heading in the right direction. convergence.io is already doing something similar to a Firefox near you, while PGP and GPG have worked like this for two decades!

It has to be simple. It has to be very simple. And it has to be chemically free of one word: ‘central’.

One really easy way to do this on Linux would be using git and signed manifests. I already do this in gone to ensure that only files on a manifest signed by a trusted key get installed.
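I won’t rehash gone’s internals here, but the core of the idea is a sketch like this (the file names are illustrative, not gone’s own):

# refuse to install anything not covered by a signed manifest:
# first verify the detached signature against your trusted keyring,
# then verify every file’s checksum against the manifest
gpg --verify manifest.sha256.asc manifest.sha256 \
  && sha256sum -c manifest.sha256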

ip6 DNS wildcards considered harmful

September 23rd, 2011

I discovered something yesterday that might be of consequence:
If you have ip6 connectivity, the domain name resolver will prefer an ip6 wildcard (AAAA) record over an ip4 A or CNAME record. This breaks things like ssh. You’d expect the resolver to choose the most specific response, the same way ip4 wildcards work, not to blindly prefer the ip6 wildcard.

Consider the case of Mary, who’s been around and has lots of domains:

hail.mary.com
naked.mary.com
see.mary.com

and she’s also wildcarding all the other *.mary.com to her vanity host me.mary.com… you get the idea, it’s fairly typical. Those hosts only have ip4 connectivity. Now she adds a new address, ip6.mary.com, and puts in a wildcard ip6 record for *.mary.com, expecting that people accessing foo.mary.com over ip6 get the wildcard address – and they do! But she gets a lot more than the doctor ordered: her ip6 clients will also get the ip6 wildcard address for all her other domains! hail.mary.com, naked.mary.com and see.mary.com will all land on ip6.mary.com instead of on their ip4 A records. What happened here?
Effectively, Mary’s ip6 wildcard broke all ip6 to ip4 connectivity for Mary’s existing subdomains!
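You can watch it happen with dig; the addresses below are documentation-range placeholders, but the shape of the answers is the point:

$ dig +short hail.mary.com A
192.0.2.10            # the specific ip4 record, as expected
$ dig +short hail.mary.com AAAA
2001:db8::1           # the wildcard answer that ip6 clients will prefer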

Yep, you can fix it on your machine, but this is a client problem and you can’t fix everybody else’s resolvers, so what you have to do is avoid ip6 wildcard domains ENTIRELY. Thanks a bunch.
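For the record, the usual local workaround on glibc systems is one line in /etc/gai.conf, bumping the precedence of v4-mapped addresses so getaddrinfo() sorts ip4 answers first:

# /etc/gai.conf – prefer ip4 destinations over ip6
precedence ::ffff:0:0/96  100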

On a completely different note:

“debug This option is recognized by pam_ldap but is presently ignored.”

I mean wow. What did they do, write the whole module flawlessly on the first try? I wish.

firefox + geolocation = m0ar paranoia

August 26th, 2011

Just a quick note pertaining to a previous post, namely the new evil that is Firefox geolocation, new in Firefox 3.5. Yes, it is opt-in, and yes, Firefox does not track you – but the servers you opt in to will track you, and that, my friends, is one of the most serious misfeatures of our times, repeated again and again in things like Google Latitude, Android and Apple photo geo-tagging.
If you care about your personal security at all, you do not want the internet tracking where you are, which is essentially what this amounts to.
Disable it now by going to about:config in your Firefox, typing geo. in the search field and double-clicking the geo.enabled line so that it says

geo.enabled    user set  boolean   false

That’s it for now.

failcode

August 18th, 2011

In my time as an application programmer, developer and designer, during a brief stint as team lead and project manager,
as well as in my time as a systems consultant, I have witnessed first-hand, and heard many credible tales of, systematic failure to rival any of the stories on The Daily WTF. My colleagues and I have seen so many examples of bad design, bad code and systemic failure that we have considered writing a book titled How To Write Ugly Code.

I have also read the Texas Instruments Chainsaw massacre and personally met Gomez while debugging applications.

My speciality and my interest lie in diagnostics, the analysis of problems, and system security, and my experience has shown that one can say something about the qualitative differences between design methodologies and what they mean for the end result.

Firstly, however, it is worth noting that the software industry as a whole has one primary problem: the pressure to deliver new features in the face of mounting expectations.

This pressure to deliver is seen as the driving force behind industry progress and ever leaner, more economical applications. Contrary to this belief, I have evidence that it creates incentives for sloppy work, overengineering and poor consideration of the problem domain. The process itself seems to reward poor application design, regardless of development methodology.

Large corporate and government tenders, which affect many hundreds of thousands of people’s lives, get bid on by large software houses that believe they can deliver everything (at low risk: if they cannot deliver, it is very hard for the customer to contest this against a major software house).

What we by and large get out of this process are bloated, top-down applications designed by people who do not understand the (whole) problem, leading to misguided decisions on such things as

  • choice of platform and language
  • choice of coding standards (check out Systems Hungarian if you don’t believe me)
  • programming methodology
  • communication tools: source control, ticket and forum tools for developers and system managers
  • Not Invented Here-practices
  • monkey-coding by people whose talents could be employed in solving the problem

What usually passes for “agile” development causes frequent, ineffective blame-game meetings.
Unit-test-driven development frequently causes micromanagement of program details and inflexible designs.
… all these methodologies were designed to improve programs, not bog them down! Why, then, do they cause so much breakage?

The pressure to deliver requires the application developer to prefer large swathes of ready-made library code and a high level of abstraction to allow her to meet deadline demands.

A high abstraction level causes low debuggability and poor performance.
Low debuggability, because bugs are by definition conditions caused by circumstances the application developer did not foresee. Abstractions are employed by the developer to hide implementation details, aiding clarity and speed of application development at the cost of debuggability.

The very tools and abstractions employed by the application developer create the frame through which she can see the circumstances of her design and code. Bugs most frequently occur on the boundaries between abstractions, where the developer has no way to foresee the circumstances. Furthermore, in a system with a passably high level of abstraction there is a whole stack of hidden details that must be traced and unwound to discover a bug. Every additional layer of abstraction thus obscures the debugging process.

Debuggability and algorithmic simplicity are key to achieving optimal performance. In other words, given a clear problem statement it is possible to achieve performance. Without a clear problem statement, in a program further muddled by abstractions and interactions, there is no effective path to performance.

Any artist will tell you that the most interesting, creative and innovative work comes out of a stress-free, playful environment. Since innovative coding is a creative activity, the same applies to developing applications – something game developers and creative shops have known for years, and behemoths like Google and Microsoft have picked up on, reinvesting up to 15% of their revenue into research and development and getting that part right, as witnessed by their sheer output of innovation.

If there is a clear path to solving these fundamental problems of IT, it is putting the people who know what they are doing in the pilot seat: enabling developers to choose for themselves not only toolchains, methodology and communication tools, but also engaging the systems thinkers in creating the specifications and architecture of the systems they are going to implement. The good news is that as customers and managers get savvy to this route to IT success, we are going to see more developer autonomy and less spectacular fails.

security or privacy? both please!

July 11th, 2011

Hello readers, fellow bloggers, fell trolls… it’s been a while.

[image: Happy Wheel]

If you didn’t know that just following a link could lead to loss of your identity, virus infection and the unknowing participation in denial of service sieges, distribution of child pornography and letting attackers break through your company’s firewalls (not to mention immanentizing the eschaton), you could be forgiven for not paying attention to the happy happy field of information security.

If you knew this rather distressing fact about the web of links, but thought you could defend yourself with an up-to-date antivirus, a current browser and the avoidance of “shady” websites, I would hate to burst your bubble by telling you that regular, honest websites are being used against us, that browsers all have to run the all-singing-all-dancing-all-executing flash and java, and that anti-virus is not only ineffective but doesn’t even target this type of attack. Knowing these things might be a little distressing, so I won’t tell you.

At least my bank is secure, you say, it’s got SSL and everything! Well, you’d be twisting my arm into telling you, embarrassedly, that SSL as implemented in browsers is very neatly broken; that all we needed was for one of the Certified Trusted Authority Parties to go bad for us all to wake up with a butt-ache; and that we now have not one but at least three such bad parties, not to mention all the MiM magic and DNS trickery that you don’t want to hear about anyway.

I will tell you however that the only defense is two-pronged: not allowing websites to script – which is a pain – and damage containment, which is not exactly perfect.

Let us say you already knew all of this, but no longer cared, because there is an even greater danger on the web: the total breach of privacy containment that is social media and tracking cookies, which all want to integrate and track your every move through the web so that usage and mood patterns can be profiled, tracked, bought and sold. Doesn’t matter, right? You’re logged in to Facebook, Linkedin and Google+ and get all your links from there, so you have your own personal filter which only shows you stuff you care about, turning a blind eye to anything you need to know that comes from left field, suckering you into giving away your privates every time you hit “like” or “add to friends list”.
[image: pacman ghost]

In a post-Panopticlick world there is really only one lesson to be learned: care about your privacy, because no one else will give a damn. It’s not about whether you have anything to hide; it’s about the accumulation of your private info by crooks to use as they please!

Oh, and crooks include the great people at Facebook, Microsoft and Google, which is why I recommend disabling all tracking features that come in the guise of “better speed” or “increased security”. The pictures below show how to do it in Chromium and Firefox.

[screenshot: chromium dialog, how to disable google tracking]

Ok, that was Goggle’s Chromium – note all the unchecked checkboxen… disabling prefetch, malware blocklists and suggestion services, all of which send far too much data to Google’s scaredy-ass all-seeing eye. Aaaand here’s Firefox:

[screenshot: fuckfox prefetch prevention]

Mhm, that’s pointing the browser at about:config, searching for prefetch and setting it to false. Yep.
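For reference, these are the prefs in question (names as of the Firefox 3.x era – double-check in your own version):

network.prefetch-next          user set  boolean  false
network.dns.disablePrefetch    user set  boolean  true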

Those pictures show you how to find and set better privacy features in your browser. Safari users are up shit creek, but unlike the Internet Explorer users, they at least have a paddle! Great stuff, what next?

Keep a “secure” browser for everyday browsing, one that you don’t log into anything personal with… and make this your default browser!

What is a “secure” browser? Let’s talk a little about that without insulting anyone’s intelligence, shall we?
First off, I’m putting the word “secure” in, uhm, quotes, because, uhm, the browser will never be secure, be it ever so protected. Ok, moving on: you want to be running noscript and/or adblock and/or disconnect and/or noflash, depending on whether you are a Mac, Windows (but then you are at game over already) or Linux user, with Firefox or Chromium (NOT IExploder, that shit is scary!).

All of these tools make it so the sites you visit don’t get to run code on your machine. The end result is that sites are marginally less usable, but that popup popunder popver poopop ads and scary tracker/botnet/mal stuff doesn’t run automagically. Here are some links:

  • noscript
  • adblock
  • disconnect
  • Flashblock
– Have you heard about sandboxing?

[screenshot: java and flash denied in chromium]

Chromium is getting the right idea by killing java and flash from the get-go. This should be the default, but you have to make it so!

You should definitely be cloaking your user-agent, as it is a useless yet very telling string your browser hoes out to all. Do this with the User-Agent Modifying Extension.

Also, you may find Torbutton and FoxyProxy interesting; the real privacy is in bouncing your traffic thru things that wash it for you. PuTTY or any other decent ssh client will create a proxy for you:

`ssh -D8080 me@myshellserver`

and you can point your browser’s SOCKS proxy settings at localhost:8080, putting all your traffic through your shell server!
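If you want to check the tunnel actually works before trusting it, curl speaks SOCKS too; ifconfig.me here is just one of many what’s-my-ip services you could substitute:

# with the ssh -D tunnel up, this should print your shell server’s
# address, not your own
curl --socks5 localhost:8080 http://ifconfig.me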

[image: sshnuke]

The best has yet to come mua ha ha ha.

data retention in practice: the police raid on Autistici

January 18th, 2011

The story of the police raid on the Italian volunteer organization Autistici is a practical example of why the Data Retention Directive should not be adopted.

Autistici.org is a non-profit organization offering free email and blogging services built to be resistant to censorship.

Here is a short summary of the case:

  • As part of its anti-censorship measures, Autistici spreads its services across servers all over the world, and the data is stored encrypted on disk.
  • Autistici does not log connections and retains no personal information about its users.
  • One of Autistici’s servers sits in the server park of a Norwegian organization for the promotion of free software.
  • On November 5th, 2010, the police seized a hard drive from that server park, acting on a request from the Italian police.
  • The request from Italy names a single email account and asks for the contents of, logins to, and changes made to that account.
  • The stated grounds for the seizure is a threats case.
  • The Italian request describes the criminal offence as insults to the reputation of two leaders of the neo-fascist organization Casa Pound.
  • The Norwegian police went beyond the request and made mirror copies of two hard drives, containing the email of 1500 users and the account information of 7000 users.
  • The seized disks do not contain the email account named in the foreign request.
  • Similar requests are believed to have been sent to Autistici’s server parks in the Netherlands and Switzerland.

Big Brother already takes too many liberties with other people’s data. In this case, the private email of 1500 innocent people has been caught up in a search conducted on behalf of a foreign nation’s interests, in a matter of dubious legality, where the accused is a person unknown.

The case rolls on, to test the legality of the seizure and to ensure the copies are not handed over in their entirety to the Italian authorities.

The case has also been covered elsewhere; for amusing examples of future misuse of the Data Retention Directive, see #DLDloggen.

Your security and data retention

December 27th, 2010

It’s been a while since I last took a taxi – I think – which may be why I was so surprised the other day when I got into a taxi and discovered it was under video surveillance! The change must have happened more or less overnight here in Oslo, and now you can hardly find an unmonitored car, even though the video surveillance scheme is optional for the drivers. Well, I say, it isn’t exactly optional: the car’s owner chooses whether to monitor the car – the driver gets no say, and the passenger least of all. Video surveillance is good, the driver replies; those who do nothing wrong have nothing to fear.

[image: Big Brother is watching you]
I believe it is precisely those who have done nothing wrong who have the most to fear. Taxi drivers already live in a world where their entire working day is monitored. How would you feel about video surveillance at your own workplace? At the office, at the hospital, in the shop? What about in your car, at school, or in the kindergarten? (Incidentally, this is already happening.) When do we get to “voluntarily” putting cameras in our homes, hooked up to monitoring centres where men are paid to sit all day and all night staring at screens?

Where is all the documentation showing that these cameras make us safer? Is it all studies sponsored by security companies profiting handsomely from the increased mistrust, or have any independent security experts spoken up?

The answer, of course, is that security experts have spoken up – experts with military backgrounds as well as police and prosecuting authorities, along with lawyers and investigators of violent crime and paedophilia cases – and it turns out the experts agree: increased surveillance will not lead to more solved cases. On the contrary, it means that we who are being watched give up our private lives, and with that our personal security is wiped out in one stroke, as everything we do is logged and tracked.

Criminals don’t need to care about privacy, surveillance, or what is right and wrong – they can steal identities, hide behind false addresses, account numbers and proxy servers. Criminals can steal mobile phones and toss them after use, or they can, like everyone else, use encrypted connections and anonymization services. It takes a bit of extra competence, but criminals have extra incentive. That is why the Data Retention Directive will not catch criminals: not only is the directive technically off target, its premise – that more surveillance reduces crime and improves clearance rates – is simply wrong!

I should mention that I have followed the Data Retention Directive case with great interest since I first heard of the proposal a couple of years ago. That is why I went over to Rosenkrantzgate for the DLD seminar in November. I draw parallels to video surveillance freely here, even though the directive is about much more: it is about geographic tracking of you and me via the position sensor in our phones, and about mapping every person you have ever been in contact with over email, telephone, SMS and social media.

It has been especially interesting to hear how the corresponding EU directive has been implemented in various European countries, and how providers of communication services relate to the directive here at home. At the seminar, the head of security at Tele2 explained that today they store some information for 3 to 5 months for billing purposes, and that the discussion has always been about how quickly they can delete it to limit resource usage. Data retention will mean a regime change, he said, where they must build and operate a dedicated retention database in case the authorities should ever need the information.

There have already been cases where authorities with no right to such information have tried to get hold of telephone logs and invoices – and succeeded, in several cases, because the people responsible for the information were unsure how to handle the requests.

Do you think NAV should get access to your invoices when processing your application? What about a complete map of everywhere you have been for the last 5 years? If yes, who else should get access, given that you yourself have none?

The press will protect its sources even under pressure to give them up. Human rights courts demand that source protection can be trusted. The tool (data retention) is quickly abused, and limits must be set, or abused it will be! It could mean, for instance, the unintended exposure of a source’s identity.

A prosecutor who works narcotics cases attended the DLD seminar and explained that she has used retained data in her investigations. Her conclusion was that we must tolerate some crime, or else we are living in a police state! It is politically difficult to say no because of the EU/EEA, but this is a paradigm shift that turns testimony on its head. Save a child from Helga Pedersen! The problem is when we have no suspects. Are we supposed to sit and search for suspects?

As we step into Orwell’s ugliest fantasy, are we then to grant individual access to the data? Will we allow any anonymous conversations at all – say, sources calling journalists, or helplines for rape victims? How are we to sort and analyze the data, and for how long? The moment we start retaining it, the data becomes hard to get rid of, and hard to govern through changing laws and access rules; those who manage it – the telephone and internet providers – have neither the capacity nor the competence to store it responsibly.

The EU directive says each member state may withdraw from data retention if it can justify why – and Romania and Germany have said no, because it violates their constitutions!

If you happen to be within 100 meters of parliament while a violent demonstration is underway, and that information is retained, how long do you think you will spend explaining yourself to avoid being implicated in the violence? The Norwegian police have already used position data to summon for questioning 100 people who were near the scene of a crime. Brilliant police work, I must say!

If a known criminal rings up a well-known and well-liked politician like Storberget and hangs up just as he answers, and the call is logged, then Storberget can be associated with criminals and summoned for questioning on grounds of suspicion. If word of the summons leaks, the politician’s career is over, whether he was guilty of anything or not.

I was told a sweet but true story from the USA, where the authorities of a small rural state won a large cash prize to be spent on increasing security in the state. They spent the money securing the state’s dams, and when security people came visiting and asked why the state office was open and unguarded, the state senator explained that if anyone wanted him dead, they were welcome to come and shoot him any time. He was more concerned with the security of his people, and that was best served by protecting against floods and other natural disasters.

It turns out, incidentally – courtesy of the very helpful cablegates – that in at least one country, data retention legislation was put on the political agenda under strong pressure from an external superpower. Who needs conspiracy theories and paranoia when we have real conspiracies?

Since one of my jobs is precisely security and network monitoring, I thought it worth sharing with you my experience of data retention, and how your private life will be directly affected by a bill designed to give the big boys more toys and nothing else. This on top of existing legislation that already permits surveillance of criminals.

… and you know what? The taxi driver didn’t know where in the car the camera was, either. 1984 called and wants Big Brother – who is always watching you – back! Maybe the government should get on Facebook and “friend” everyone, so they can keep up with what’s happening?

caching wikileaks with varnish

December 3rd, 2010

In times like these I like to remind everyone that truth is a virtue and that the best resistance to corruption is decentralization.

With that in mind I quickly threw together a cache for wikileaks at wikileaks.uplink.li. This is obviously not a full mirror, but it will stay up even when the main sites go down.

The host in question isn’t really specced for high loads but that is beside the point. The point is that you can do this in your own varnish instance. Here is how you do the same, in VCL:

# help wikileaks stay online by providing caching
# helps more if you have persistent storage.                                                                                                
#
# comotion@krutt.org 2010-12-03
# http://kacper.blog.linpro.no
 
backend wikileaks2 {
   .host = "213.251.145.96";
   .probe = {
      .url = "/img/favicon.ico";
      .interval = 60s;
   }
}
backend wikileaks3 {
   .host = "wikileaks.ch";
   .probe = {
      .url = "/img/favicon.ico";
      .interval = 60s;
   }
}

# won't work while the DNS name is taken out
#backend wikileaks1 {
#   .host = "wikileaks.org";
#   .probe = {
#      .url = "/img/favicon.ico";
#   }
#}
# round-robin between the backends that are actually defined above
director wikileaks round-robin {
   #{ .backend = wikileaks1; }
   { .backend = wikileaks2; }
   { .backend = wikileaks3; }
}
 
sub vcl_recv {
   if (req.http.host ~ "^(wiki)?leaks" ||
       req.url ~ "^/(wiki)?leaks" ||
       req.http.referer ~ "leaks"){
      set req.backend = wikileaks;
      if(req.backend.healthy){
         # backends up: serve objects up to a week past their TTL
         set req.grace = 7d;
      }else{
         # all backends down: serve whatever we have, up to a year old
         set req.grace = 365d;
      }
   }
}
 
# strip the local /(wiki)leaks prefix before passing the request upstream
sub vcl_miss {
   if(req.url ~ "^/(wiki)?leaks"){
      set bereq.url = regsub(req.url,"^/(wiki)?leaks","/");
   }
}
# keep fetched objects serveable long past their TTL
sub vcl_fetch {
   if(req.url ~ "^/(wiki)?leaks"){
      set beresp.grace = 365d;
   }
}
 

You can save that to /etc/varnish/default.vcl and reload varnish.
Or, if your Varnish instance has other sites on it, you could save it to /etc/varnish/wikileaks.vcl and add the following near the top of your default.vcl:

include "/etc/varnish/wikileaks.vcl";

Isn’t it beautiful?
You may not be able to set up a full mirror, but now you can go set up your varnish to cache Wikileaks!

PS.
The opinions expressed here are not necessarily those of my employer nor anyone else associated with me, Varnish or anything really.

consolekit is evil

December 1st, 2010

… and hates me

I should really tell you about the DLD seminar three weeks ago, or the PARANOIA security conference, or even just that Adobe should be considered harmful, but things have been crazy, and between this and electromagnetism I haven’t had the mind space. After the 6th of December, I promise I’ll come back with pictures and relations and maybe even sounds (I have notes, don’t worry, I’ll remember).

On the other hand, here’s a nasty hack to kill console-kit-daemon, which has a really nasty way of polluting the PID space… and annoys me enough to warrant a public humiliation as well. What does it do, and why? Who cares what it does; it’s doing it poorly enough to draw attention to itself! So here’s how to kill it:

root@wasp:/usr/sbin# dpkg -S console-kit-daemon
consolekit: /usr/sbin/console-kit-daemon

DON’T try to purge the package, because that’s just one end of a really big, ugly yarn of unnecessary dependency pain that I’d like to spare you…

DON’T try to replace /usr/sbin/console-kit-daemon with your own stub… it turns out dbus autostarts this “service”, and that approach will make dbus block your (ssh) session when you log in… not forever, but that’s even more annoying than the pid pollution.

Instead, debian bugs #544147 and #544483 clued me in to the following hack:

# create /usr/local/share/dbus-1/system-services first if it doesn’t
# exist; the local copy shadows the stock service file
cp /usr/share/dbus-1/system-services/org.freedesktop.ConsoleKit.service \
  /usr/local/share/dbus-1/system-services/
# point the autostarted Exec at /bin/false so the daemon never runs
echo Exec=/bin/false >> /usr/local/share/dbus-1/system-services/org.freedesktop.ConsoleKit.service

which is a two-liner, and would have been less ugly and easier to debug had it not been for the fine hubris of the freedesktop dudes…
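To check that the hack took, log back in and look for the daemon; pgrep makes it quick:

# should print nothing once console-kit-daemon stops autostarting
pgrep -l console-kit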

Free society conference – my experience

November 9th, 2010

Until the very last minute I was unsure whether I’d make it to FSCONS, the Free Society Conference and Nordic Summit. I did not know what to think of it, despite gentle pushing from one set to speak at the conference. Three days later, with the event somewhat filtered in my mind, there is no doubt that it was well worth the opportunity costs and then some.

[image: I'm going to FSCONS 2010!]

My first impression while browsing the event programme was that there was no way to attend all the interesting talks! An insurmountable problem, and I hadn’t even gotten there: my meat could not be in two or three places at the same time, while my mind could not rationally choose away interesting unknowns… so I opted to leave it to a vague notion of chance and intent.

What I hadn’t expected was the powerful impact the people attending would have on me. Cyber heroes and heroines, freedom fighters, game changers, inventors, uncelebrated cryptoanarchists and everything-makers were among those I got to know, who engaged me in dialogue, who dared discuss openly some of (most of?) the hardest problems that we, the world, are facing today. With the full intent to do something about them.
