The Internet of Things and the impact on IT systems and connectivity

Slowly but surely, organisations are starting to experiment with the Internet of Things (IoT). How will this impact IT systems and connectivity? Companies will continuously collect masses of data. To collect a satisfactory and balanced volume of data (geographically balanced, for instance), they may need access to other networks as well as their own. High-speed connections with other operators, as you find in a carrier-neutral data center, are therefore essential.

Will the collected data be even bigger than ‘Big Data’? Nobody knows. What we do know is that organisations need flexible solutions for IoT projects. They will want to be able to scale their storage capacity (and the rack space that goes along with it, in their favourite data center) on demand. And that’s only the beginning. They will need data warehousing solutions, as well as solutions, and probably partners, to analyse the collected data. And they may want to combine it with other data sources, such as ‘open data’ offered by government organisations.

Indeed, government organisations at different levels need to offer free access to their data. Data from traffic sensors, for instance, that companies as well as other government bodies may want to run against their own databases of, say, licence plates. This, again, calls for connectivity and easy access to a multitude of partners. It will take some time before all government Open Data are truly available, of course. Some government organisations are more hesitant than others to comply. And there are plenty of ongoing discussions about ownership, use, integrity, and whether to offer completely free access or to keep a certain level of control. Open Data are here to stay, however; the trend is irreversible. More and more companies offer free access to their data too. We all hope that Open Data will stimulate innovation and the development of new applications. What we know for sure is that data centers will be needed, if only to store Open Data and offer safe access to it. Preferably local data centers, so that Belgian law applies. We wouldn’t want our traffic data to be in American hands, would we?
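To make that licence-plate example a little more concrete, here is a toy sketch of what matching an open traffic-sensor feed against an internal list of plates might look like. The endpoint, field names and plates are invented for illustration; real open-data portals publish comparable feeds in CSV or JSON form.

```python
import csv
import io
import urllib.request

# Hypothetical open-data endpoint, for illustration only.
OPEN_DATA_URL = "https://example.gov.be/opendata/traffic_sensors.csv"

# Internal list of licence plates of interest (also purely illustrative).
plates_of_interest = {"1-ABC-123", "1-XYZ-987"}

# Stream the open-data feed and keep the rows that match our own list.
with urllib.request.urlopen(OPEN_DATA_URL) as response:
    reader = csv.DictReader(io.TextIOWrapper(response, encoding="utf-8"))
    matches = [row for row in reader if row.get("licence_plate") in plates_of_interest]

for row in matches:
    print(row["sensor_id"], row["timestamp"], row["licence_plate"])
```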

So, a bright future lies ahead for anybody working in the data center industry, in network communications, in big and even bigger data, and in business analytics. We at LCL cannot wait to see it all happen!

For more information on government Open Data, click here.

Laurens van Reijen, Managing Director LCL

Do our governments lose sleep over the security of their data?

Brussels
The banking sector in Belgium has understood the message. At the end of 2015, the National Bank of Belgium (NBB) sent a circular to all Belgian financial institutions with guidelines for their operational business continuity and data security. Who doesn’t want their hard-earned money to be safe? Banks play a critical role in the financial system and are of great importance to society. It goes without saying, then, that they take precautions against operational damage, disruptions of the electricity grid or theft. The circular states that a financial institution must always have two data centers, of at least Tier III, that are not located within the same urban agglomeration and are at least 15 km apart. Less than 15 km is allowed, but only with a sufficiently substantiated risk analysis submitted to the NBB. Additional precautions and/or fallback and recovery solutions must be provided at a distance of at least 100 km. Less than 100 km is allowed, but again only with a sufficiently substantiated risk analysis.
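To make the circular’s distance requirements concrete, here is a minimal sketch of how one could check a pair of candidate sites against the 15 km and 100 km thresholds. The coordinates are invented for illustration; a real assessment would of course use the actual site locations and the full text of the circular.

```python
from math import asin, cos, radians, sin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle ('as the crow flies') distance between two points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius of ~6371 km

# Hypothetical site coordinates, for illustration only.
primary = (50.846, 4.352)    # somewhere in central Brussels
secondary = (50.936, 4.040)  # somewhere near Aalst

d = distance_km(*primary, *secondary)
print(f"Distance between sites: {d:.1f} km")
print("Meets the 15 km minimum for the second data center:", d >= 15)
print("Far enough for the additional fallback/recovery site (100 km):", d >= 100)
```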

So what about government data? The federal government mainly uses four data centers located in central Brussels, close to the inner ring road, plus one data center in Anderlecht. The distances between the buildings range from a couple of kilometres to, at most, 5 or 6 km. This certainly does not meet the National Bank’s standards described above.

Drop a bomb on Brussels and not only are all the important government institutions wiped off the map, so is their sensitive data. Most government bodies are located in the heart of our capital, with both their own server rooms and the external data centers they use. A lightning strike or a prolonged disruption of the electricity grid is enough to paralyse a government institution, and with it all of its critical data. Conclusion: the data centers and backup data centers of our governments cannot withstand a failure of the same city power grid. In a city-wide power outage, both would be hit. Moreover, Brussels is a high-risk zone, meaning the probability of a natural disaster or a terrorist attack there is much higher than in, say, Aalst or Antwerp.

Can our governments afford these risks? They hold sensitive data about you and me: tax returns, our social security, data on the financial health of our country, data on talks between political parties and between nations. In short: information that must be available for consultation at all times. Shouldn’t this data receive extra protection? By replicating it to a backup data center outside our capital, for example?

The NBB is setting the right example.

How is it that our governments apparently do not lose sleep over internal or external security risks? Don’t you agree that the guidelines and rules for payment institutions, insurers and credit providers should also apply to governments? Within government there is a tendency to insist on managing the data centers in-house. Wouldn’t it be more efficient to leave this to external parties with good SLAs?

Laurens van Reijen, Managing Director LCL

The Google wake-up call

Despite common sense, companies still purchase cloud services online with a credit card. Google has done a great branding job – people trust them because they really want to believe that such a big name is probably amongst the best you can get. Well, it isn’t, as everybody now knows.

Last Thursday, the Google data center in Mons (Bergen) was apparently struck by lightning, literally. Days later, an incident report was finally published, and this time it was the Google cloud clients who were struck by lightning. Apparently, there is no business continuity – the ‘backup systems’ didn’t work – no disaster recovery – there is no replication to another data center – in short: no nothing! And they aren’t even from Barcelona, as far as I know. They have batteries, but these didn’t take over, which leads me to think they have never been properly tested. This is just about the minimum level of security one should be able to count on. On top of that, the incident report took days to be published, and, as a journalist informed us, there is no one available to talk to. Great service, don’t you think?

When you start reading the incident report, it gets even worse. It’s really the clients’ fault: clients should not have gone for ‘GCE instances and Persistent Disks’ but for ‘GCE snapshots and Google Cloud Storage’. The incident report was specifically about the ‘Google Compute Engine’. So, even as a cloud client, you don’t have the protection of the so-called ‘Google Cloud Storage’? This is what comes with big publicity budgets and a fancy website promising you heaven in the cloud: you have probably signed away any responsibility that could be attributed to them. Whatever happens, it’s your problem.

A Tier III data center really means that all elements are ‘concurrently maintainable’. Every single part of our data centers can be shut down without affecting the uptime of the data center. Google only offers Tier I, which means a lot less security. They could replicate to one of their other data centers, giving you at least that, but they don’t. The question is whether that would be a good enough solution even if they did replicate, as there is always latency – their data centers are far apart.

Whether the power is cut for ten seconds or a day, data loss is inevitable. Backup systems need to be tested – otherwise one can never be sure they really do take over seamlessly. Better than relying on an American public cloud is to go for a Belgian cloud provider. They are flexible, they work with data centers that are better secured (such as ours), and your data is protected by Belgian law. Some of our clients – cloud providers – are Combell, Evonet, Nucleus, Proact and RealDolmen. We like to advise our corporate and government customers and pass on leads to our systems integration and cloud infrastructure clients – that is one of our extra services. Contact me anytime to discuss the best solution or an innovative idea for your company!

Want to read more about this? Have a look at our press statement:
French article
Dutch article

Laurens Van Reijen
CEO of LCL

Dear Belgian Systems Integrator. Is cloud computing in your comfort zone? If not, it had better be some time soon!


Cloud is in. Any systems integrator you meet will tell you that they offer cloud solutions, and that you should go cloud. With them, of course. And then… they spoil it all by telling you they resell Amazon (or the like).

Dear Systems Integrator. If you don’t have any real knowledge of cloud infrastructure – enough to set up a cloud yourself that suits your customer – you might as well not bother. Apart from the fact that you need to be able to really service your client, by giving him extra capacity at his beck and call for instance, there’s the fact that your client may prefer his data to stay in Belgium, and thus under Belgian law.

One company that is doing really well is Proact. They’ve recently launched an innovative hybrid cloud solution, housed in LCL data centers (of course). Our newsletter tells you all about it. Proact is comfortable within the cloud ecosystem; they know what they’re talking about. Another example to check out is Nucleus. They’ve been offering hosting solutions for a long time now, and are currently evolving towards cloud solutions. In my opinion, Nucleus will be one of the major players in local public cloud solutions in the years to come. Check out this article for more info, in Dutch or French. Another party with an interesting offering in cloud infrastructure is Evonet. They’ve partnered with Alcatel-Lucent and Dell, and offer solid cloud infrastructure building blocks.

Let me be quite clear about this: cloud is here to stay, and if you miss this opportunity, I’m not sure you will be amongst the so-called ‘fittest that will survive’… Want a chat about your options over a pint this summer? Mail me!

Have a great end of spring and… why not use the summer to get into the cloud?

Laurens Van Reijen
CEO of LCL

Who has better Business Continuity than Belgocontrol?

Data Center LCL

A lot of sarcasm on social media and in the press, yesterday and today, about Belgocontrol’s failing power backup. Besides the economic damage, there’s the reputational damage. A lot of companies are no better off, though. When confronted with a power cut, whether as a test or a real one, most companies will suffer unexpected consequences. And what is more: a lot of IT Managers are quite aware of that. I bet many IT Managers didn’t sleep well last night…

It’s true that redundancy of all your critical systems and assets in general, such as Belgocontrol’s control tower, requires an investment. Some IT Managers tell us their CFO or CEO won’t give them the budget to do what is really needed. Let’s hope yesterday’s adventures have taught these CXOs what’s really at stake.

As a CEO, if you really want to be sure that your business continuity is satisfactory, you need to make sure a full test is done. Many don’t dare to test as it should be done, so they never know whether their precautions are really enough. For your IT it’s somewhat easier: you can go for the OPEX rather than the CAPEX model, and entrust your systems to LCL’s data centers to make sure they are protected and your business continuity really works… We do a real test at least every month, actually cutting the power entirely. We can safely say that we can honour the SLA of our Tier III certification. Last night, like (almost) every night, I personally slept like a baby!
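For readers who want to translate a Tier availability figure into a concrete downtime budget, a back-of-the-envelope calculation like the one below can help. The percentages are the availability figures commonly associated with the Uptime Institute tiers; the number that actually matters is the one in your own SLA.

```python
HOURS_PER_YEAR = 365 * 24  # 8760 hours

def allowed_downtime_hours(availability_pct: float) -> float:
    """Maximum downtime, in hours per year, implied by an availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

# Commonly cited availability figures per tier; check your own contract.
for tier, pct in [("Tier I", 99.671), ("Tier II", 99.741),
                  ("Tier III", 99.982), ("Tier IV", 99.995)]:
    print(f"{tier}: {pct}% availability allows about "
          f"{allowed_downtime_hours(pct):.1f} hours of downtime per year")
```

For Tier III, that works out to roughly 1.6 hours per year.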

Laurens van Reijen, CEO of LCL data centers

The Dutch blackout and our electricity dependence

Data Center LCL

As we’ve all read, the Netherlands had a blackout a couple of weeks ago. We didn’t, in spite of all the noise that was made about the possibility of a blackout. That is: we haven’t had one yet. In the north of the Netherlands, including large areas of Amsterdam, there was no electricity for about two hours. When the power finally came back, it took the provider another two hours to restore it in all areas. Yesterday, we learned that Belgium imported four times as much electricity this winter as it did last year. Given that we weren’t exactly freezing this winter (and so had moderate consumption), if we didn’t have a blackout, we were probably just lucky.

The effect in the Netherlands was considerable. A lot of companies, including the airport and the Dutch railways, bore the consequences, as did about one million families. And plenty of shops, where the security systems weren’t working. Traffic lights were failing, which created chaos. Police stations were inaccessible because their failing electric locks kept the doors shut. And the websites of several media outlets, among them the national news service NOS, were down and unable to inform the public of what was going on.

We searched the internet for reports of the damage done to companies’ ICT systems, but these remain a well-kept secret. Who will admit to losing data and/or systems because they underestimated the consequences or lacked precautions (such as contracting a data center)? As to the cause of the breakdown: grid operator TenneT did take its precautions, as all high-voltage connections are redundant. But Murphy is never far away: there was a failure at the moment both loops were connected because of works. So in spite of the redundancy, there was still a breakdown. Where will Murphy be in our country when the electricity fails? There will be unexpected problems, such as failing internet lines. In the Netherlands, UPC’s cable network was dead. People also reported that the Vodafone mobile network was down, as were the KPN landlines.

Which is more frightening: that such a large area depends on one station, or that even redundant systems are so vulnerable? When we get an RFP requiring multiple data centers, sometimes they only need to be 5 km apart. How ridiculous is that? As if the utilities would fail in one area but still work a mere 5 km away. They should be at least 25 km apart as the crow flies. Taking precautions is not enough; one should test them thoroughly and frequently, by cutting the electricity on a regular basis. We cut it 36 times a year, just to make sure. There will still be unexpected problems, but at least you will have foreseen the obvious ones, and with some luck, you will be able to keep working… If you’d like some tips on preparing for next year’s winter: there is a checklist a few blog posts further down.

Note that in the Netherlands one can get compensation for damages following an electricity breakdown, but only if it lasts for more than 4 hours, which is an eternity if you don’t have a professional backup… Good luck!

Laurens van Reijen, CEO of LCL data centers

Today’s eclipse, solar panels and disaster recovery

Data Center LCL

Today, if clouds don’t spoil it for us, we’ll get to see a solar eclipse. On a sunny day, an eclipse translates into 40% less electricity from solar panels. Given the shaky state of the electricity supply in Belgium, I do hope you have a backup server room or data center, just in case we get the long-feared electricity blackout after all.

Earlier this week, well before the eclipse, we experienced two power cuts in our data centers, shortly after one another: one in Aalst yesterday, and one at the beginning of the week in Diegem. We have ample backup systems, of course, and our clients never noticed there was a (short) blackout. When data center activities are not your core business, however, you can never protect your data and assets the way a professional external data center can. So a power cut represents a real threat.

Even if you do have a backup solution – a second data center at a distance of, say, 5 km – how relaxed can you be about potential disasters such as a power cut? I would keep my mobile close by if I were you. If your server rooms or data centers are only 5 km apart, both are very likely to suffer from any power cut that occurs on the grid. Even if you have the right backup facilities in both your data centers, your staff will have to be in both places at the same time to monitor them. A double power cut represents an extra risk; there’s no doubt about that. In practice, your data centers need to be at least 20 km apart to avoid unnecessary extra risk.

In short: a lot of companies are not as safe as they think they are. A condition for growth is that you can focus on your core business, and that your back end is not a source of worry…

Laurens van Reijen, CEO of LCL data centers