A data center guideline for quoted companies: a call to action for FSMA?

A survey of Belgian quoted companies, commissioned by LCL, showed that data security is not seen as essential within IT governance, not even at quoted companies.

One of our clients, a health care company, chose to work with us after its government supervisory body ruled that working with only one data center was not acceptable.

In case of a disaster, you risk losing absolutely all your data. After your power shuts down, your company does too.

If you really want to be safe, at least 25 km should separate both data centers.


Moreover, best practices dictate that one should separate the development environment from the production systems.

What are the odds that the current mentality – we all trust that all will go well – will change in the short term?

Only a minority of companies interviewed said they were planning to set up a second data center.



If we really want change, it will have to be directed by the stock exchange control body: FSMA.

So, in the best interest of our Belgian quoted companies, for the sake of their business continuity and employment – not to mention the shareholders who want return on their investment; data loss will almost certainly cause share devaluation – we call upon FSMA to issue a new guideline for quoted companies.

A guideline pushing quoted companies to have a second data center, and to either thoroughly test all back-up systems, including power backup, or to entrust that task to a party that does just that for them. It’s a pain in the lower back part, but people will not move unless they have to.

Laurens van Reijen, Managing Director LCL

Data security doesn’t really seem to be a priority…

A survey of Belgian quoted companies, commissioned by LCL, shows that only 3% of the targeted companies ever test their power backup systems by actually turning off the electricity. Meaning that they will only learn whether or not the power backup systems work when there is a power cut. That’s like buying skis and not trying them out before you actually hit the snow. Or going hiking with brand new boots, straight from your favourite online shop. The only guaranteed result is sore feet.

We’ve all read and heard what deficient power backup systems can lead to. Remember the power cut at Eurocontrol? The business world couldn’t believe the company could shut down just like that, for lack of well-functioning backup systems.

We knew that many companies are only theoretically prepared for worst-case power scenarios, but we never expected it to be that many. 97% of companies simply plug in their power backup and pray; that is practically everybody. In France, an electricity shortage of 5 GW is expected next week. Given that Belgium generally imports electricity from France, next week could turn into a live test for the companies concerned…

Another astonishing fact is that 53% of the surveyed companies don’t have a second data center. Meaning that in case of any disaster, not just a power cut, they have a big problem. Moreover: only a minority of the companies interviewed said they were planning to set up a second data center.

This shows that data security is not seen as essential within IT governance, not even at quoted companies. How many Board members are aware that data security is taken so lightly in their company? More and more, ICT is on the Board’s agenda, and rightly so. All we need to do now is educate Board members so that they can evaluate the security systems in their companies, and make sure that they really are as safe as they should be.

Laurens van Reijen, Managing Director LCL

Comatose servers: things will get worse before they get better

Thirty percent of servers around the world are doing nothing at all. They are switched on, ready for service, and actively draw power and consume resources such as cooling, yet no-one would notice if someone decided to turn them off.
Articles are published on the energy use of data centers with clockwork regularity. It’s said that, at a global level, data centers consume as much energy as a large country such as the United Kingdom. Their carbon footprint is claimed to be roughly the same as that of the aviation industry. On a more positive note, there are signs that the energy consumption of data centers is stabilizing, although a great deal of work still needs to be done.

Forgotten servers
It goes without saying that one of the simplest ways to waste less energy is to turn off what are known as ‘zombie’ servers. You may wonder why this hasn’t happened yet. The main reason is probably that at most companies the electricity bill isn’t paid by the head of IT. In fact, the IT staff has no idea how high the bill is. This means they have no incentive to check which servers are actually being used.
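If IT staff did check, flagging candidates would not be hard. As a minimal sketch, one could compare average utilization against thresholds; the data format, server names and threshold values below are illustrative assumptions, not a real monitoring API:

```python
# Sketch: flag candidate 'zombie' servers whose CPU and network activity
# stay negligible across all samples. Thresholds and data are made up.

def find_zombie_candidates(samples, cpu_threshold=2.0, net_threshold_kb=10):
    """Return server names whose average CPU (%) and network traffic
    (KB/s) both stay below the given thresholds."""
    zombies = []
    for name, readings in samples.items():
        avg_cpu = sum(r["cpu"] for r in readings) / len(readings)
        avg_net = sum(r["net_kb"] for r in readings) / len(readings)
        if avg_cpu < cpu_threshold and avg_net < net_threshold_kb:
            zombies.append(name)
    return zombies

# Example with invented measurements:
samples = {
    "web-01":  [{"cpu": 35.0, "net_kb": 900}, {"cpu": 41.2, "net_kb": 1200}],
    "old-app": [{"cpu": 0.4, "net_kb": 1},    {"cpu": 0.6, "net_kb": 2}],
}
print(find_zombie_candidates(samples))  # → ['old-app']
```

A flagged server is only a candidate, of course; someone still has to confirm no forgotten application depends on it before pulling the plug.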

Does the cloud create more zombies?
The fact that 30 percent of servers are comatose is old news. Consultancy firm Anthesis Group and Jonathan Koomey, a researcher at Stanford University, published a study on this subject in 2015 and a number of other past studies reached the same conclusion. I’m afraid that little improvement is expected in the next few years. Companies are moving more and more applications to the cloud, but this doesn’t always lead to a reduction in the number of servers at those companies. It will come as no surprise, then, that the number of servers is growing considerably worldwide. The number of comatose servers therefore tracks the rising popularity of the cloud.
This is somewhat ironic, as one would expect the cloud to bring about a more efficient use of server space. This will undoubtedly be the case in the long run, but we need to pull the plug on the old servers first.
But the problem comes down to more than just the tendency of IT departments to expand their collection of servers. What about the ever-growing mountain of unused data stored by most companies? Most organizations have spent a great deal of time and money collecting this information and therefore hang onto it for dear life. Data doesn’t come with an expiration date, and so no-one actually gets rid of obsolete data. Instead, it fills up servers that in turn consume power and resources.

Data centers are part of the solution
One part of the solution would be for more companies to make the transition to a professional data center. It’s surprising that so many companies still run their own server rooms, which aren’t always managed with the same level of expertise as professional data centers. Even if we disregard the questionable security of their corporate data, companies that run their own server rooms also make very inefficient use of server space, cooling and the like.
Owing to the scale of professional data centers, we can invest more in efficient climate control, leading to lower energy consumption. LCL’s ISO 14001 certification is confirmation of our ongoing efforts to reduce the environmental footprint of our data centers. If all of the servers currently kept in in-house server rooms were moved to independent data centers, the global ecological footprint of the sector would be greatly reduced.

Moreover, the IT managers of LCL’s customers know full well how much electricity their servers consume each month. They can see this clearly and transparently in the invoices they receive, which are paid for out of the ICT budget. Customers of professional data centers know that it’s in their interests to seek out comatose servers and keep power consumption under control.
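To get a feel for what a single comatose server costs, a back-of-the-envelope calculation is enough. This sketch uses illustrative assumptions for wattage, electricity price and the data center overhead factor (PUE), not LCL’s actual figures:

```python
# Sketch: monthly electricity cost of one always-on server.
# Wattage, price per kWh and PUE below are illustrative assumptions.

def monthly_cost_eur(watts, price_per_kwh=0.25, pue=1.5, hours=730):
    """Energy cost of one server per month. PUE accounts for the
    overhead of cooling and power distribution in the facility."""
    kwh = watts / 1000 * hours * pue
    return kwh * price_per_kwh

# A 400 W server that nobody uses, month after month:
print(round(monthly_cost_eur(400), 2))  # → 109.5
```

Multiply that by dozens of forgotten machines and the incentive to hunt them down becomes obvious, provided the bill actually lands on the IT department’s desk.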
When it comes to excessive energy consumption, the finger of blame is often pointed at data centers. The facts, however, show that well managed data centers contribute to a more efficient and rational use of energy in the data storage sector.

Laurens van Reijen, Managing Director LCL

The Internet of Things and the impact on IT systems and connectivity

Slowly but surely, organisations are now starting to experiment with the Internet of Things (IoT). How will this impact IT systems and connectivity? Companies will continuously collect masses of data. To collect a satisfactory and balanced volume of data (for instance geographically balanced), they may need access to other networks as well as their own. So high-speed connections with other operators, such as those available in a carrier-neutral data center, are essential.

Will the collected data be even bigger than ‘Big Data’? Nobody knows. What we do know is that organisations need flexible solutions for IoT projects. They will want to be able to scale their storage capacity (and the rack space that goes along with it, in their favourite data center) on demand. And that’s only the beginning. They will need data warehousing solutions, as well as tools and probably partners to analyse the collected data. And they may want to communicate with other data sources, such as ‘open data’ offered by government organisations.

Indeed, government organisations at different levels need to offer free access to their data: data from traffic sensors, for instance, that companies as well as other government bodies may want to run against their own databases of, say, licence plates. This, again, calls for connectivity and easy access to a multitude of partners. It will take some time before all government Open Data are truly available, of course. Some government organisations are more hesitant than others to comply, and there are many ongoing discussions about ownership, use, integrity, and whether to offer completely free access or keep a certain level of control. Open Data are here to stay, however; the trend is irreversible. More and more companies offer free access to their data too. We all hope that Open Data will stimulate innovation and the development of new applications. What we do know for sure is that data centers will be needed, if nothing else to store Open Data and offer safe access to them. Preferably local data centers, so that Belgian law applies. We wouldn’t want our traffic data to be in American hands, would we?

So, a bright future lies ahead for anybody working in the data center industry, in network communications, in big and even bigger data, and in business analytics. We at LCL cannot wait to see it all happen!

For more information on government Open Data click here

Laurens van Reijen, Managing Director LCL

Do our governments lose sleep over the security of their data?

The banking sector in Belgium has understood the message. At the end of 2015, the National Bank of Belgium (NBB) sent a circular to all Belgian financial institutions with guidelines for their operational business continuity and data security. Who doesn’t want their hard-earned money to be safe? Banks play a critical role in the financial system and are of great importance to society. It goes without saying, then, that they take precautions against operational damage, disruptions of the power grid or theft. The circular states that a financial institution must always have two data centers, of at least Tier III, which are not located within the same urban agglomeration and are at least 15 km apart. Less than 15 km is allowed, but only if a sufficiently substantiated risk analysis is submitted to the NBB. Additional precautions and/or fallback and recovery solutions must be provided at a distance of at least 100 km. Less than 100 km is allowed, but again only with a sufficiently substantiated risk analysis.
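Whether two sites satisfy such distance rules is easy to verify with the haversine formula. This is a minimal sketch; the coordinates are rough illustrative values, not actual data center locations:

```python
# Sketch: check a minimum-distance rule (e.g. two data centers at least
# 15 km apart) between two sites given as (latitude, longitude) pairs.
from math import radians, sin, cos, asin, sqrt

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # 6371 km: mean Earth radius

site_a = (50.85, 4.35)  # roughly central Brussels (illustrative)
site_b = (50.88, 4.70)  # a site some 25 km to the east (illustrative)

d = distance_km(site_a, site_b)
print(f"{d:.1f} km apart; meets the 15 km rule: {d >= 15}")
```

The same function applied with a 100 km threshold would cover the rule for the additional recovery site.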

What about government data? The federal government mostly uses four data centers located in central Brussels, close to the inner ring road, plus one data center in Anderlecht. The distances between the buildings range from a couple of kilometres to at most 5 or 6 km. This certainly does not meet the NBB standards described above.

Drop a bomb on Brussels and not only would all major government institutions be wiped off the map, so would their sensitive data. Most government bodies are located in the heart of our capital, both their own server rooms and the external data centers they use. A lightning strike or a prolonged disruption of the power grid is enough to paralyse a government institution, and with it all of its critical data. Conclusion: the data centers and backup data centers of our governments cannot withstand an outage of the same urban power grid; a city-wide power failure would hit both at once. Brussels, moreover, is a high-risk zone, meaning the likelihood of a natural disaster or a terrorist attack is much higher there than in, say, Aalst or Antwerp.

Can our governments afford these risks? They hold sensitive data about you and me: tax returns, our social security, data on the financial health of our country, data on talks between political parties and between nations. In short: information that must always be available. Shouldn’t this data receive extra protection? By mirroring it in a backup data center outside our capital, for instance?

The NBB is setting the right example.

How is it that our governments apparently don’t lose sleep over internal or external security risks? Don’t you agree that the guidelines and rules for payment institutions, insurers and credit providers should also apply to governments? Within government there is a tendency to insist on managing data centers in-house. Wouldn’t it be more efficient to leave this to external parties with good SLAs?

Laurens van Reijen, Managing Director LCL