The shift towards the edge

LCL Data Center

The data center world is evolving as the amount of data in the world is constantly increasing. New technologies like the Internet of Things, blockchain, 5G and Artificial Intelligence require a different approach. These technologies demand rapid response and real-time analysis. Extra data processing and storage capacity is thus needed very close to the source of the data. That's what edge computing is about: storing, processing and analysing data as close as possible to the point where it is generated.

The shift towards the edge means a shift towards decentralised data centers. Data transfer to a centralised hyperscale cloud data center sometimes simply takes too much time. Pushing computation and analytical capabilities closer to the edge reduces traffic and can reduce the round-trip delay of sending data to and from a centralised cloud platform for analysis. This results in better security, improved availability, more privacy and increased resiliency. Every city or region will need its own data center, so this will require a lot of extra data center space.

Edge processing can increase network speed, reduce latency and help with capacity issues. Failures or congestion in networks may cause serious problems for machines, devices or user experience. Think about Pokémon Go: people all over the world were walking around with their smartphones trying to catch 'em all. Who would like it if the connection went down at the exact moment they were catching a rare Pokémon? The same goes for smartwatches: the output is needed immediately, so there's no time to send all the data to the cloud to be analysed.

Another example is autonomous cars. These self-driving vehicles will produce an enormous amount of data and will exchange information with each other. If one car detects a pothole in the road, it sends this information to the next car, which will adapt its suspension at the exact location of the pothole. Processing data like this must happen within milliseconds or accidents will happen. That's why the processing needs to happen very close to the point of usage. Availability is key here.
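The reasoning above is essentially a latency budget: the round-trip to the processing node plus the processing time must fit within the deadline. A toy sketch makes this concrete; all numbers below are illustrative assumptions, not measurements from any real car or network.

```python
# Toy latency-budget check for the pothole scenario.
# All figures are made-up assumptions for illustration only.

def total_latency_ms(network_rtt_ms: float, processing_ms: float) -> float:
    """Round-trip network delay plus time spent processing the data."""
    return network_rtt_ms + processing_ms

BUDGET_MS = 10.0  # assumed deadline for adjusting the suspension in time

# Assumed round-trip times: a distant cloud data center vs. a nearby edge node.
cloud_ms = total_latency_ms(network_rtt_ms=80.0, processing_ms=5.0)
edge_ms = total_latency_ms(network_rtt_ms=2.0, processing_ms=5.0)

# Only the edge node fits within the deadline.
print(f"cloud: {cloud_ms} ms, edge: {edge_ms} ms, budget: {BUDGET_MS} ms")
```

With these assumed figures the cloud round-trip alone already blows the budget, which is why the computation has to move close to where the data is generated.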

The data center world is evolving, but so is LCL. We are ready for the shift towards the edge. We’re connected in three cities in Belgium: Antwerp, Aalst and Brussels. Our data centers are scalable and flexible and have all the necessary components for security, cooling, energy … already in place. We’re striving for maximum availability and reliability.

Pros and cons of outsourcing your data center

The most recent Business Meets IT seminar earlier this month focused on data centers, so naturally it caught our attention. Keynote speaker of the day was Luc Verbist, CIO of media group De Persgroep. After presenting their own data centers (two fully redundant DCs with four cubes in total), he also shared his thoughts on internal versus external data centers. Some of the arguments sound very familiar: if you need a 24x7 operation, you are more likely to outsource your data center. If you don't have enough critical mass, you will too. But the decision also depends on other variables, such as the availability of skilled resources, building restrictions and regulations, and whether your company has an opex or capex strategy. There are many variables, but more often than you'd expect, they will drive you towards outsourcing.


Other thoughts worth mentioning: "Experienced project and maintenance teams are as valuable as the product itself" (Serge Bogaerts, Cenaero) and "In the year 2000 only Walmart had 200 terabytes worth of data; nowadays any average company with over 1,000 employees already has more than 200 terabytes of data" (William Visterin, Smart Business). And this evolution will only accelerate, so data centers and their suppliers can rest assured: there are challenging times ahead. Not challenging as in 'will we have enough business?' but as in 'how will we manage to keep on growing faster and faster?' A challenge that we gladly accept and are already tackling today.