Data security doesn’t really seem to be a priority…

A survey of Belgian quoted companies, commissioned by LCL, shows that only 3% of the companies targeted ever test their power backup systems by actually turning off the electricity. That means they will only learn whether their power backup systems work when there is a real power cut. That’s like buying skis and not trying them on before you hit the snow, or going hiking in brand-new boots straight from your favourite online shop. The only guaranteed result is sore feet.

We’ve all read and heard what deficient power backup systems can lead to. Remember the power cut at Eurocontrol? The business world couldn’t believe the company shut down just like that, for lack of well-functioning backup systems.

We knew that many companies are only theoretically prepared for worst-case power scenarios, but we never expected it to be that many: 97% of companies plug in their power backup and pray – that’s practically everybody. In France, an electricity shortfall of 5 GW is expected next week. Given that Belgium generally imports electricity from France, next week could turn out to be a live test for the companies concerned…

Another astonishing fact is that 53% of the surveyed companies don’t have a second data center. That means that in case of any disaster, not just a power cut, they have a big problem. Moreover, only a minority of the companies interviewed said they were planning to set one up.

This shows that data security is not treated as essential within IT governance, not even at quoted companies. How many Board members are aware that data security is taken so lightly in their company? ICT is appearing on the Board’s agenda more and more, and rightly so. All we need to do now is educate Board members so that they can evaluate the security systems in their companies and make sure those systems really are as safe as they should be.

Laurens van Reijen, Managing Director LCL

Comatose servers: things will get worse before they get better

Thirty percent of servers around the world are doing nothing at all. They are switched on, ready for service, and actively draw power and consume resources such as cooling, yet no-one would notice if someone decided to turn them off.
Articles are published on the energy use of data centers with clockwork regularity. It’s said that, at a global level, data centers consume as much energy as a large country such as the United Kingdom. Their carbon footprint is claimed to be roughly the same as that of the aviation industry. On a more positive note, there are signs that the energy consumption of data centers is stabilizing, although a great deal of work still needs to be done.

Forgotten servers
It goes without saying that one of the simplest ways to waste less energy is to turn off what are known as ‘zombie’ servers. You may wonder why this hasn’t happened yet. The main reason is probably that at most companies the electricity bill isn’t paid by the head of IT. In fact, the IT staff have no idea how high the bill is, which means they have no incentive to check which servers are actually being used.
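
To make that concrete: a first pass doesn’t need to be sophisticated. Below is a minimal sketch, in Python, of the kind of check an IT team could run against its monitoring data; the helper, threshold and lookback window are illustrative assumptions, not a prescription.

```python
# Minimal sketch: flag candidate 'zombie' servers from monitoring data.
# The get_avg_cpu helper would be backed by whatever monitoring system is
# in use (Prometheus, Zabbix, ...); all names and numbers are illustrative.

CPU_THRESHOLD_PCT = 2.0   # below this average load, a server is likely idle
LOOKBACK_DAYS = 90        # long enough to rule out monthly/quarterly jobs

def find_zombie_candidates(hosts, get_avg_cpu):
    """Return (host, avg_cpu) pairs whose CPU stayed near zero all window."""
    candidates = [(h, get_avg_cpu(h, LOOKBACK_DAYS)) for h in hosts]
    idle = [(h, cpu) for h, cpu in candidates if cpu < CPU_THRESHOLD_PCT]
    return sorted(idle, key=lambda pair: pair[1])  # quietest machines first

# Example with a stubbed monitoring helper:
if __name__ == "__main__":
    usage = {"web-01": 37.2, "db-02": 54.8, "legacy-app": 0.4, "test-old": 1.1}
    report = find_zombie_candidates(usage, lambda host, _days: usage[host])
    for host, cpu in report:
        print(f"{host}: {cpu:.1f}% avg CPU over {LOOKBACK_DAYS} days")
```

Near-zero CPU alone isn’t proof of death, of course: a candidate should also show no meaningful network or disk activity, and have no owner who objects, before anyone pulls the plug.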

Does the cloud create more zombies?
The fact that 30 percent of servers are comatose is old news. Consultancy firm Anthesis Group and Jonathan Koomey, a researcher at Stanford University, published a study on the subject in 2015, and a number of earlier studies reached the same conclusion. I’m afraid little improvement can be expected in the next few years. Companies are moving more and more applications to the cloud, but this doesn’t always lead to a reduction in the number of servers at those companies. It will come as no surprise, then, that the number of servers is growing considerably worldwide. The number of comatose servers therefore tracks the rising popularity of the cloud.
This is somewhat ironic, as one would expect the cloud to bring about a more efficient use of server space. That will undoubtedly be the case in the long run, but we need to pull the plug on the old servers first.
But the problem comes down to more than just the tendency of IT departments to expand their collection of servers. What about the ever-growing mountain of unused data stored by most companies? Most organizations have spent a great deal of time and money collecting this information and therefore hang onto it for dear life. Data doesn’t come with an expiration date, and so no-one actually gets rid of obsolete data. Instead, it fills up servers that in turn consume power and resources.
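
One way to give data a de facto expiration date is simply to flag what nobody has touched in years. Here is a minimal sketch, assuming a hypothetical archive mount point and an arbitrary five-year cutoff:

```python
import os
import time

CUTOFF_YEARS = 5
ARCHIVE_ROOT = "/data/archive"  # hypothetical mount point

cutoff = time.time() - CUTOFF_YEARS * 365 * 24 * 3600

stale = []
for root, _dirs, files in os.walk(ARCHIVE_ROOT):
    for name in files:
        path = os.path.join(root, name)
        # Last access time; many filesystems mount with noatime, in which
        # case the modification time (st_mtime) is the safer fallback.
        if os.stat(path).st_atime < cutoff:
            stale.append(path)

print(f"{len(stale)} files untouched for {CUTOFF_YEARS}+ years - review, then archive or delete")
```

A list like this doesn’t delete anything by itself; it simply turns “we might need it someday” into a concrete review queue.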

Data centers are part of the solution
One part of the solution would be for more companies to make the transition to a professional data center. It’s surprising that so many companies still run their own server rooms, which aren’t always managed with the same level of expertise as professional data centers. Even if we disregard the questionable security of their corporate data, companies that run their own server rooms also make very inefficient use of server space, cooling and the like.
Owing to the scale of professional data centers, we can invest more in efficient climate control, leading to lower energy consumption. LCL’s ISO 14001 certification is confirmation of our ongoing efforts to reduce the environmental footprint of our data centers. If all of the servers currently kept in in-house server rooms were moved to independent data centers, the global ecological footprint of the sector would be greatly reduced.

Moreover, the IT managers of LCL’s customers know full well how much electricity their servers consume each month. They can see this clearly and transparently in the invoices they receive, which are paid for out of the ICT budget. Customers of professional data centers know that it’s in their interests to seek out comatose servers and keep power consumption under control.
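
A back-of-the-envelope calculation shows why that visibility matters. The figures below are illustrative assumptions (an older server idling at 200 W, a PUE of 1.5 and an industrial tariff of €0.15/kWh), not LCL prices:

```python
# Back-of-the-envelope cost of one comatose server left running all year.
# All figures are illustrative assumptions.

server_draw_w = 200        # assumed idle draw of an older 1U server
pue = 1.5                  # power usage effectiveness: cooling/overhead multiplier
price_per_kwh = 0.15       # EUR, assumed industrial tariff
hours_per_year = 24 * 365

kwh_per_year = server_draw_w / 1000 * hours_per_year * pue
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.0f} kWh/year -> EUR {cost_per_year:.0f} per idle server")
# ~2628 kWh/year -> roughly EUR 394 per idle server, before rack space and licences
```

Roughly €400 a year for a machine that does nothing; multiply that by a few dozen forgotten servers and the incentive to hunt them down becomes obvious.
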
When it comes to excessive energy consumption, the finger of blame is often pointed at data centers. The facts, however, show that well managed data centers contribute to a more efficient and rational use of energy in the data storage sector.

Laurens van Reijen, Managing Director LCL

The Internet of Things and the impact on IT systems and connectivity

Slowly but surely, organisations are starting to experiment with the Internet of Things (IoT). How will this impact IT systems and connectivity? Companies will continuously collect masses of data. To collect a satisfactory and balanced volume of data (for instance geographically balanced), they may need access to other networks as well as their own. High-speed connections with other operators, such as those available in a carrier-neutral data center, are therefore essential.

Will the collected data be even bigger than ‘Big Data’? Nobody knows. What we do know is that organisations need flexible solutions for IoT projects. They will want to be able to scale their storage capacity (and the rack space that goes with it, in their favourite data center) on demand. And that’s only the beginning. They will need data warehousing solutions, as well as tools and probably partners to analyse the collected data. And they may want to communicate with other data sources, such as ‘open data’ offered by government organisations.

Indeed, government organisations at different levels need to offer free access to their data: data from traffic sensors, for instance, that companies as well as other government bodies may want to run against their own databases, such as a database of licence plates. This, again, calls for connectivity and easy access to a multitude of partners. It will take some time before all government Open Data are truly available, of course. Some government organisations are more hesitant than others to comply, and there are many ongoing discussions about ownership, use, integrity, and whether to offer completely free access or keep a certain level of control. Open Data are here to stay, however; the trend is irreversible. More and more companies offer free access to their data too. We all hope that Open Data will stimulate innovation and the development of new applications. What we know for sure, however, is that data centers will be needed, if nothing else to store Open Data and offer safe access to them. Preferably local data centers, so that Belgian law applies: we wouldn’t want our traffic data to be in American hands, would we?
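
As a toy illustration of the kind of cross-referencing described above, here is a sketch in Python; the feed format, field names and plates are all invented for the example:

```python
import csv
import io

# Open traffic-sensor readings, as they might arrive from a government feed.
# The fields and the data itself are hypothetical.
sensor_csv = io.StringIO(
    "timestamp,sensor_id,plate\n"
    "2016-03-01T08:12:00,R0-017,1-ABC-123\n"
    "2016-03-01T08:12:04,R0-017,1-XYZ-999\n"
)

# The organisation's own licence-plate table (e.g. a fleet or watch list).
known_plates = {"1-ABC-123", "1-DEF-456"}

hits = [row for row in csv.DictReader(sensor_csv)
        if row["plate"].strip().upper() in known_plates]

for row in hits:
    print(f"{row['plate']} seen at sensor {row['sensor_id']} on {row['timestamp']}")
```

The matching itself is trivial; the real work lies in the connectivity, data volumes and governance questions sketched above.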

So, a bright future lies ahead for anybody working in the data center industry, in network communications, in big (and even bigger) data, and in business analytics. We at LCL cannot wait to see it all happen!

Laurens van Reijen, Managing Director LCL

Do our governments lose sleep over the security of their data?

The banking sector in Belgium has understood the message. At the end of 2015, the National Bank of Belgium (NBB) sent a circular to all Belgian financial institutions with guidelines for their operational business continuity and data security. Who doesn’t want their hard-earned money to be safe? Banks play a critical role in the financial system and are of great importance to society, so it goes without saying that they take precautions against operational damage, disruptions of the electricity grid or theft. The circular states that a financial institution must always have two data centers, of at least Tier III, that are not located within the same urban agglomeration and are at least 15 km apart. Less than 15 km is allowed, but only if a sufficiently substantiated risk analysis is submitted to the NBB. Additional precautions and/or fallback and recovery solutions must be provided at a distance of at least 100 km; again, less than 100 km is allowed, subject to a sufficiently substantiated risk analysis.
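
For illustration, checking the 15 km guideline for a pair of candidate sites takes little more than the haversine formula on their coordinates; the coordinates below are placeholders, not real locations:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Placeholder coordinates for two hypothetical data center sites:
site_a = (50.85, 4.35)   # primary
site_b = (51.02, 4.48)   # secondary

distance = haversine_km(*site_a, *site_b)
print(f"{distance:.1f} km apart -> {'meets' if distance >= 15 else 'below'} the 15 km guideline")
```

The agglomeration rule still has to be checked separately, of course: two sites can be 15 km apart and still sit on the same urban power grid.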

So what about government data? The federal government mostly uses four data centers located in central Brussels, close to the inner ring road, plus a data center in Anderlecht. The distances between the buildings range from a couple of kilometres to, at most, 5 or 6 km. This certainly does not meet the National Bank’s standards described above.

Drop a bomb on Brussels and not only would all the important government institutions be wiped off the map, but so would their sensitive data. Most government bodies are located in the heart of our capital, with both their own server rooms and the external data centers they use. A lightning strike or a prolonged disruption of the electricity grid is already enough to paralyse a government institution, along with all its critical data. In short: the data centers and backup data centers of our governments cannot withstand an outage on the same urban power grid; a city-wide power failure would hit both of them. Brussels is, moreover, a high-risk zone, meaning the probability of a natural disaster or a terrorist attack there is much higher than in, say, Aalst or Antwerp.

Can our governments afford these risks? They hold sensitive data about you and me: tax returns, our social security records, data on the financial health of our country, data on talks between political parties and nations. In short: information that must be available for consultation at all times. Shouldn’t these data be given extra protection, for instance by mirroring them in a backup data center outside our capital?

The NBB is setting the right example.

How is it that our governments apparently don’t lose sleep over internal or external security risks? Don’t you agree that the guidelines for payment institutions, insurers and credit providers should also apply to governments? Within government there is a tendency to insist on managing the data centers itself. Wouldn’t it be more efficient to leave this to external parties with good SLAs?

Laurens van Reijen, Managing Director LCL

The Google wake-up call

Despite common sense, companies still purchase cloud services online with a credit card. Google has done a great branding job: people trust them because they really want to believe that such a big name is probably among the best you can get. Well, it isn’t, as everybody now knows.

Last Thursday, the Google data center in Mons (Bergen) was apparently quite literally struck by lightning. Days later, an incident report was finally published, and this time it was the Google cloud clients who were struck. Apparently, there is no business continuity – the ‘backup systems’ didn’t work – no disaster recovery – there is no replication to another data center – in short: nothing at all! And they aren’t even from Barcelona, as far as I know. They have batteries, but these didn’t take over, which leads me to think they have never been properly tested. This is the minimum level of security one should be able to count on. On top of that, the incident report took days to be published, and, as a journalist informed us, there is no one available to talk to. Great service, don’t you think?

When you start reading the incident report, it gets even worse: it’s really the clients’ fault. Clients should not go for ‘GCE instances and Persistent Disks’ but for ‘GCE snapshots and Google Cloud Storage’. The incident report was specifically about the ‘Google Compute Engine’. So even as a cloud client, you don’t get the protection of the so-called ‘Google Cloud Storage’? All of this comes with big publicity budgets and a fancy website promising you heaven in the cloud, while in the fine print you probably sign away any responsibility that could be considered theirs. Whatever happens, it’s your problem.

A Tier III data center really means that all elements are ‘concurrently maintainable’: every single part of our data centers can be shut down without affecting the uptime of the data center. Google only offers Tier I, meaning a lot less security. They could at least replicate to one of their other data centers, giving you that much, but they don’t. The question is whether even replication would be a good enough solution, as there is always latency – their data centers are far apart.
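
That latency point is easy to quantify. Light in optical fibre travels at roughly two-thirds of its speed in vacuum, about 200 km per millisecond, so distance alone sets a hard floor under the round-trip time that synchronous replication pays on every write. The sketch below uses that approximation; real paths add routing detours and equipment delay on top:

```python
# Rough illustration of replication latency over distance.
# Light in optical fibre travels at roughly 2/3 the speed of light in
# vacuum; real networks are slower, so treat these numbers as a floor.

SPEED_IN_FIBER_KM_PER_MS = 200  # ~300,000 km/s * 2/3, per millisecond

def min_rtt_ms(distance_km):
    """Lower bound on round-trip time for a given fibre distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

for km in (20, 300, 6000):  # metro pair, cross-country, transatlantic
    print(f"{km:>5} km -> >= {min_rtt_ms(km):.2f} ms round trip")
# Synchronous replication pays at least this RTT on every committed write.
```

Twenty kilometres costs a fifth of a millisecond; a transatlantic pair costs sixty. That is why replication between nearby, well-connected data centers behaves so differently from replication between continents.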

Whether the power is cut for ten seconds or a day, data loss is inevitable. Backup systems need to be tested – otherwise one can never be sure they really do take over seamlessly. Better still than relying on an American public cloud is to go for a Belgian cloud provider. They are flexible, they work with data centers that are better secured (such as ours), and your data is protected by Belgian law. Some of our clients – cloud providers – are Combell, Evonet, Nucleus, Proact and RealDolmen. We like to advise our corporate and government customers and pass on leads to our systems integration and cloud infrastructure clients – that is one of our extra services. Contact me any time to discuss the best solution or an innovative idea for your company!

Want to read more about this? Have a look at our press statement:
French article
Dutch article

Laurens van Reijen, Managing Director LCL