Measuring and monitoring our own energy use
Our energy use
The graph below shows CSE's electricity use for March 2013 (4,783 kWh, of which the server room accounts for 2,004 kWh).
This is the highest monthly total since March 2012.
The spikes show when staff come into the office and switch on lights, computers, water-heaters etc. The red band shows electricity used by the servers, which remains roughly constant hour by hour and day by day.
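The pattern described above lends itself to a simple calculation: because the servers run at a roughly constant load, the overnight minimum on the meter is a fair estimate of their draw. A minimal sketch of that idea, using the March 2013 totals quoted above plus entirely hypothetical half-hourly readings for illustration:

```python
# Estimate the constant server "baseload" from half-hourly meter readings
# by taking the overnight minimum, and work out the servers' share of the
# monthly total. The two totals come from the text; the half-hourly
# readings below are made up purely for illustration.

total_kwh = 4783        # CSE's total electricity use, March 2013
server_kwh = 2004       # server-room consumption, March 2013

# Hypothetical half-hourly readings (kWh per half-hour) for one weekday:
# overnight the only load is the servers; daytime adds lights, PCs, kettles.
readings = [1.4, 1.3, 1.4, 1.3, 1.4, 1.5,   # small hours: servers only
            2.8, 4.1, 5.0, 4.7, 4.9, 4.2,   # office hours: the "spikes"
            1.4, 1.3, 1.4]                  # evening: back to baseload

baseload_per_halfhour = min(readings)        # the red band's level
baseload_kw = baseload_per_halfhour * 2      # kWh per half-hour -> average kW

server_share = server_kwh / total_kwh        # servers' share of the month

print(f"Estimated server baseload: {baseload_kw:.1f} kW")
print(f"Server share of March 2013 use: {server_share:.0%}")
```

On the real March 2013 figures the server room works out at roughly 42% of the month's electricity, which is why the later paragraphs single the servers out.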
Click on thumbnails below the main image for graphs of previous months' energy usage.
When it comes to cutting energy consumption, we at CSE are obviously keen to do our bit, and key to this is monitoring. We've invested time and money into measuring our energy use, particularly collecting data that enables us to map exactly where energy is used, where energy is wasted, and how best to manage the settings on our office equipment.
So how are we doing? Let's take gas consumption first. This is very low indeed – about 5% of the 'benchmark' figure for similar buildings with similar uses. This is because we don't heat the office much. It's a very new building with few exposed walls, a library below and flats above, so once it's warm, it stays warm. In fact, we probably use more energy keeping the office cool in summer.
So, the largest proportion of our energy use, by far, is electricity, split between lighting, appliances (computers, printers, fans etc), hot water and the computer servers, of which more later.
Now, we're pretty good at the 'behavioural' side of energy saving: switching things off, not over-filling the kettle and so on, and through these efforts we had been making some inroads into our electricity use.
But despite this, our consumption is higher than it was when we began our monitoring programme. This is because we're now doing more data-analysis work, and number-crunching on this scale requires more computing power and more electricity to run the servers.
To counter this, we're exploring several possibilities including: more 'aggressive' power management settings on our servers; 'virtualisation' of servers; voltage optimisation; changes to our lighting (although as tenants our options are limited); better shading to cut down on summer cooling; and a 'thin client' system in which all the computer processing is done on the server and the many desktop PCs in the office are replaced by small devices that interface with it.