Better writing measured

As writers, we say we can make writing better, but how can we measure this?

You can rely on editorial authority or user research, but I wanted a method that was simple to analyse, could be done by anyone, and could justify the work we’d been doing.

Word cloud of Defra content.

Word cloud of the improved government content.


Content analysed

I’ve written about this in detail on my Defra digital blog post on writing analysis. The approach was not too different from other ways I’ve gathered and analysed content before.

As before, it was about getting data and finding what to analyse. Unlike my previous work, where I looked at sentiment, here I wanted something more solid and less controversial.

It helped that my content editor had already written about the general principles of how we improved government writing, which gave me somewhere to start.

To treat this like a proper experiment, I needed to come up with testable hypotheses.

Writing ideas to test

To keep the tests simple, I chose four writing improvements to test:

  1. Better titles
  2. Shorter content 
  3. More readable content
  4. The reader, not the government, is the focus

All of these things can be tested with a variety of programs and enough time. In practice this meant:

  1. Better titles – these will be longer, as they’ll be more descriptive of what’s on the page
  2. Shorter content – if each page addresses just one or two user needs, the average page length should fall
  3. More readable content – improved readability scores, less jargon, shorter words and sentences
  4. The reader is the focus – as the government style is to address the user as ‘you’, there’ll be more use of ‘you’ and ‘your’ and less use of ‘Defra’, ‘government’, ‘us’ and so on
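Checks like these are easy to automate. Here’s a minimal Python sketch of how the four measures could be computed for a single page – the tokenisation is naive and the word lists are my own illustrative choices, not the actual analysis behind the blog post:

```python
import re

def writing_metrics(title: str, body: str) -> dict:
    """Rough proxies for the four writing tests (illustrative only)."""
    words = re.findall(r"[a-z']+", body.lower())
    sentences = [s for s in re.split(r"[.!?]+", body) if s.strip()]
    # Test 4: reader-focused words vs government-focused words
    reader_words = sum(words.count(w) for w in ("you", "your"))
    gov_words = sum(words.count(w) for w in ("defra", "government", "us", "we"))
    return {
        "title_length_words": len(title.split()),   # test 1: longer, more descriptive titles
        "page_length_words": len(words),            # test 2: shorter pages
        # test 3: crude readability proxies (shorter sentences and words read more easily)
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "avg_word_length": sum(map(len, words)) / max(len(words), 1),
        "reader_vs_gov_focus": reader_words - gov_words,  # positive = reader-focused
    }

# Hypothetical before/after pages, purely for illustration
old = writing_metrics(
    "Waste",
    "Defra regulates waste. The government requires us to enforce the regulations.",
)
new = writing_metrics(
    "Dispose of business or commercial waste",
    "You must sort your waste before you throw it away.",
)
print(old["reader_vs_gov_focus"], new["reader_vs_gov_focus"])  # -3 3
```

Running this over every page in a February 2013 snapshot and a February 2015 snapshot, then comparing the averages, is all the four tests really need; a proper readability formula (such as Flesch reading ease) could replace the sentence and word length proxies.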

Did our writing pass the test?

Yes. The blog post has the full details, but all four tests were passed.

Though the blog post covers only one case study, I also compared all Defra content from February 2015 with February 2013, and the results were even more impressive.

What next for writing analysis?

This was a success. Though sentiment analysis is still an important tool, measuring terms that are less contentious is a better way to get others to trust an analysis.

Last year I was excited by claims that there’s a way to measure the writing success of past books. That was until I carried out the test myself.

Next I plan to publish my findings on what was wrong and what was glossed over, and to propose a better way.