INTELLIGENT BRANDS // Data Centres
Designing next-gen
data centres in the
era of Big Data
Big data has revolutionised the way we think about
data centre design and orchestration. However, it’s
important not to get carried away with the hype,
and to understand that big data cannot be looked
at in isolation when considering future data centre
planning, writes Glen Ogden, Regional Sales Director,
Middle East at A10 Networks.
Perhaps the biggest challenge
introduced by big data is the
need to re-evaluate the storage-compute model. Big data components
such as Apache Hadoop enable
distributed processing to be done
in situ with the data, where each data
node is also a compute node. This
fundamentally changes how we view
storage and raises a host of questions:
what to do with existing SAN and
NAS estates, how to archive, and how
legacy applications will access this data.
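The data-local model described above can be sketched in a few lines. This is an illustrative toy, not Hadoop itself: the node names and records are invented, and the point is simply that the map step runs where the data lives, so only small per-node summaries cross the network.

```python
# Toy sketch of the storage-compute model: each "data node" holds a
# partition of the data and runs the map step locally (in situ); only
# the small intermediate summaries travel to the reduce step.
from collections import Counter

# Data is partitioned across nodes; compute happens where the data lives.
partitions = {
    "node-1": ["error", "ok", "error"],
    "node-2": ["ok", "ok"],
    "node-3": ["error"],
}

def map_local(records):
    """Map step, run on the node that stores the records."""
    return Counter(records)

def reduce_counts(partials):
    """Reduce step: merge the small per-node summaries."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

# Only the per-node Counters cross the network, not the raw records.
partials = [map_local(records) for records in partitions.values()]
print(reduce_counts(partials))
```

Contrast this with the traditional SAN/NAS model, where all the raw records would first have to be pulled across the network to a separate compute tier.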
Internet of Things (IoT)
We can’t mention big data without also
mentioning the Internet of Things (IoT).
By 2020, various industry estimates
put the number of Internet-connected
devices at between 50 and 75 billion.
This is going to radically change how
humans interact with technology, the
visibility we have into the state of these
‘things’, and the insights gained from
analytics on those ‘things’.
In practice, this will result in the
generation of much higher volumes
of unstructured data (through
instrumentation, external feeds, etc.).
All this data will need to be stored in
enterprise data centres and analysed
using big data solutions – something
that must be considered and
factored into future IT planning.
IPv6
Given the number of devices introduced
by IoT and mobile technology, we need
to think about addressing. While there are
solid techniques for IPv4 address
preservation (such as DHCP, NAT and
Carrier Grade NAT), there is no question
that IPv6 will be needed to accommodate
these new IoT entities. From an enterprise
data centre perspective that means, at
the very least, having tools at the edge to
translate IPv4 to IPv6.
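Some back-of-the-envelope arithmetic makes the addressing case concrete. The device count is the upper industry estimate cited above; everything else is simple powers of two plus Python's standard ipaddress module, shown here with an IPv4-mapped IPv6 address of the kind edge translation tools work with (the 192.0.2.1 address is a documentation example, not a real host).

```python
import ipaddress

# Address-space arithmetic: IPv4's 32-bit space cannot cover tens of
# billions of devices without translation; IPv6's 128-bit space can.
ipv4_addresses = 2 ** 32        # ~4.3 billion
ipv6_addresses = 2 ** 128       # ~3.4 x 10^38
iot_devices = 75_000_000_000    # upper end of the estimates cited above

print(iot_devices > ipv4_addresses)   # IPv4 alone cannot address them all
print(iot_devices < ipv6_addresses)   # IPv6 has ample headroom

# An IPv4-mapped IPv6 address (::ffff:a.b.c.d), the notation commonly
# seen when the two families are bridged at the edge:
mapped = ipaddress.IPv6Address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)  # the embedded IPv4 address, 192.0.2.1
```

This is why translation at the edge is framed as a minimum: the internal estate can move to IPv6 while legacy IPv4 endpoints are still reachable through mapped addresses.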
High availability
Big data deals with scale and availability by
design. Hadoop can effectively scale out to
tens of thousands of nodes, transparently
to the application. High availability is built
directly into the clustering model, negating
the need for expensive RAID arrays. This
completely changes how we think about
storage, and right now organisations are
making their own rules on the type of
hardware to deploy, driven by cost and
processing needs.
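The clustering model the section describes can be illustrated with a toy replica-placement sketch. The node names and round-robin placement policy here are invented for illustration (real HDFS placement is rack-aware), but the replication factor of 3 is Hadoop's common default, and the sketch shows why losing a whole node loses no data.

```python
# Toy sketch of cluster-level availability: keep k copies of each block
# on k distinct nodes, so a single node failure leaves replicas intact.
# Availability comes from the cluster model, not a RAID array underneath.
import itertools

REPLICATION_FACTOR = 3  # Hadoop's common default

def place_replicas(blocks, nodes, k=REPLICATION_FACTOR):
    """Place each block on k distinct nodes, round-robin (illustrative)."""
    ring = itertools.cycle(nodes)
    return {b: [next(ring) for _ in range(k)] for b in blocks}

nodes = ["node-1", "node-2", "node-3", "node-4"]
placement = place_replicas(["blk-a", "blk-b"], nodes)

# Simulate losing node-1: every block still has surviving replicas.
survivors = {b: [n for n in locs if n != "node-1"]
             for b, locs in placement.items()}
print(all(len(locs) >= 1 for locs in survivors.values()))  # True
```

Because redundancy lives at the block level across commodity nodes, each node can use cheap local disks, which is exactly why organisations are free to set their own hardware rules driven by cost and processing needs.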