These are the questions your firm should ask before going down the route of edge analytics and processing.
This article originally appeared on ZDNet.
Increasing numbers of organizations are gathering and analysing data at the edge of their networks in their quest to improve day-to-day operations.
Whether it’s for predictive maintenance on lifts and aircraft, or analysing the data collected to keep oil rigs running smoothly, companies are using low-powered edge devices to collect huge volumes of data.
Traditionally this data was sent to a central data centre for analysis, but as the volume of data collected at the edge has grown, so has the need to analyse and prune it close to the point of collection.
However, deploying devices to the edge of a network can pose some unique challenges for businesses.
SEE: From Cloud to Edge: The Next IT Transformation (ZDNet special feature) | Download the report as a PDF (TechRepublic)
Why you might want to engage in edge computing
The reasons for shifting data processing and analytics to the edge of your network go beyond resolving issues of insufficient bandwidth or limited connectivity, according to Bob Gill, research VP and agenda manager in Gartner’s infrastructure strategies group.
He points out that companies should consider an edge-computing setup where devices need to act on data in real time or near-real time, and where the time taken to send that data to the cloud for processing would be an issue.
“Let’s say there are a whole series of traffic lights in a city. With edge computing, one of the things we can do is actually provide intelligence between devices, so they can communicate between themselves without having to talk to a central locale,” he says, highlighting the edge analytics carried out by the company Swim as an example.
That doesn’t mean the edge devices gathering the data are always the best place to carry out the analysis. Often firms will need to offload analysis to a system situated near the edge, according to James Staten, vice-president serving CIOs at analyst firm Forrester.
“While many companies think that edge analysis in the device itself is good enough, in most cases, no, it isn’t,” he says. “For example, in cars today with all their sensor data, to guide drivers and future autonomous cars, you must aggregate data coming from more than just the one car.”
“This is best accomplished via an edge compute device deployed in proximity to where the cars are running (in each metro and in multiple metro locations optimally), so that analysis of data about the experience multiple cars are facing can be combined with weather, maps and other cloud-based data repositories,” says Staten, adding that Dell, HPE, Saguna and IBM offer suitable machines for carrying out this sort of analysis near to the edge.
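As a rough illustration of the pattern Staten describes, the sketch below shows a near-edge node aggregating telemetry from several cars on the same road segment and forwarding only a compact summary upstream. The class names, fields and thresholds are hypothetical, not taken from any vendor’s API.

```python
# Hypothetical sketch: a near-edge aggregator that combines telemetry from
# several vehicles in one metro area before anything is sent to the cloud.
# Names, fields and thresholds are illustrative only.
from dataclasses import dataclass
from statistics import mean

@dataclass
class VehicleReading:
    vehicle_id: str
    road_segment: str
    speed_kmh: float
    wiper_active: bool  # a cheap proxy for local rain

def summarise_segment(readings: list[VehicleReading], segment: str) -> dict:
    """Aggregate readings from many cars on one road segment."""
    on_segment = [r for r in readings if r.road_segment == segment]
    if not on_segment:
        return {"segment": segment, "cars": 0}
    return {
        "segment": segment,
        "cars": len(on_segment),
        "avg_speed_kmh": round(mean(r.speed_kmh for r in on_segment), 1),
        # Only flag rain if several independent cars agree.
        "likely_rain": sum(r.wiper_active for r in on_segment) >= 3,
    }

if __name__ == "__main__":
    readings = [
        VehicleReading("car-1", "A40-j3", 31.0, True),
        VehicleReading("car-2", "A40-j3", 28.5, True),
        VehicleReading("car-3", "A40-j3", 35.2, True),
        VehicleReading("car-4", "M25-j15", 96.0, False),
    ]
    # The compact summary, not the raw per-car stream, is what gets
    # forwarded to central cloud analytics.
    print(summarise_segment(readings, "A40-j3"))
```

The point of the sketch is the shape of the design, not the detail: raw per-vehicle data stays near the edge, and only aggregated, decision-ready results travel to the centre.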
Conducting analysis close to where data is gathered can also be necessary to comply with strict rules on where data is processed, such as the data sovereignty stipulations of the European General Data Protection Regulation.
SEE: IT pro’s guide to GDPR compliance (free PDF) (TechRepublic)
Factors to consider before embarking on an edge computing project
How secure are your edge devices?
Gartner’s Gill highlights the problems with many edge devices running insecure software platforms that are unpatched against known exploits, citing the ease with which CCTV cameras were hijacked by Mirai malware as just one example.
“There are a lot of devices out there that have pretty marginal security, and if what we’re talking about is building a critical application that relies on thousands or even millions of devices, we’ve got to ensure some kind of end-to-end security all the way back into the core,” he says.
“This brings up fascinating questions when dealing with edge device manufacturers about ‘How do we gauge the extent to which their security meets our enterprise security? How do we gauge how it fits in with our overall identity and access management scheme?’,” Gill adds.
Eric van Hensbergen, who leads software and large-scale systems research at chip designer Arm, says: “Historically at the extreme edge there’s tonnes of gadgets that you buy that are a couple of bucks and the companies that are making these don’t put an investment into security.”
He expects the situation to improve, and says that Arm is trying to build a certain level of integrity into devices running on Arm-based processors.
Should you let a hyperscale cloud provider take care of security and maintenance?
In an attempt to secure every part of the infrastructure needed to support an edge deployment, major tech firms are offering new services and platforms.
One such example is Microsoft’s Azure Sphere, which aims to secure connected microcontrollers, around 9 billion of which ship every year, at both the board level and the network level.
Azure Sphere has three components. The first is customised microcontroller units for IoT devices, which are authenticated using certificates encoded in on-board chips.
The second component is the Azure Sphere OS, which runs on the IoT devices and helps secure and authenticate the hardware, and which is based on a custom version of the Linux kernel.
The third is the Azure Sphere Security Service, a cloud-based offering that keeps devices patched with the latest security updates and detects threats to these connected devices for 10 years after their rollout.
SEE: Cybersecurity in an IoT and mobile world (ZDNet special report) | Download the report as a PDF (TechRepublic)
Amazon offers similar services via AWS IoT Core and AWS Greengrass, and Gartner’s Gill says this route is a viable option for firms considering an edge deployment.
“This points to the advantage of perhaps using a hyperscale cloud provider and their whole cloud-to-edge portfolio,” he says.
Both firms are also offering digital twin systems, which allow firms to model the relationship between edge devices and a physical space.
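For a flavour of what the managed cloud-to-edge route can look like on the device side, the sketch below uses the AWS IoT Device SDK for Python (v1) to publish a sensor reading to AWS IoT Core over mutually authenticated TLS. The endpoint, certificate paths, client ID and topic are placeholders to be replaced with values from your own account; this is a minimal sketch rather than a production pattern.

```python
# Minimal sketch of an edge device publishing telemetry to AWS IoT Core
# using the AWS IoT Device SDK for Python (v1). The endpoint, file paths,
# client ID and topic below are placeholders, not real values.
import json
import time
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

client = AWSIoTMQTTClient("edge-sensor-01")
client.configureEndpoint("example-ats.iot.eu-west-1.amazonaws.com", 8883)
# Mutual TLS: the device authenticates with its X.509 certificate and
# verifies the broker against the Amazon root CA.
client.configureCredentials("root-CA.pem", "device.private.key", "device.cert.pem")

client.connect()
reading = {"deviceId": "edge-sensor-01", "tempC": 21.4, "ts": int(time.time())}
client.publish("site/plant-a/telemetry", json.dumps(reading), 1)  # QoS 1
client.disconnect()
```

In this model the cloud provider, rather than the device maker alone, handles certificate provisioning, fleet policy and revocation, which is the “end-to-end security all the way back into the core” Gill is pointing to.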
Don’t forget about physical security
Unlike the multiple layers of physical security in a data centre, edge devices will often sit in public spaces or other locations that are difficult to secure, points out Arm’s van Hensbergen, so firms will need to factor that physical insecurity into their plans.
“There’s a number of really important considerations, particularly security, that they just have to get right or there will be problems,” he says.
Consider if any analytics should be carried out centrally
Although the term ‘edge computing’ suggests a certain amount of data processing will take place at the edge of the network, there may still be benefits in handling some analysis centrally, particularly where it requires a level of computing power that’s not available locally.
Gartner’s Gill gives the example of the Google AIY Vision and Voice kits, which he says run a trained machine-learning model locally to handle image and speech recognition, but which also send data back to Google’s cloud platform for use in training more accurate machine-learning models.
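The sketch below illustrates that split in general terms, rather than the AIY kits’ actual tooling: a device runs a trained TensorFlow Lite model locally for inference, and only low-confidence samples are queued for upload to the cloud, where they could feed retraining. The model file, confidence threshold and upload helper are assumptions for illustration.

```python
# Sketch of the "infer locally, learn centrally" split described above.
# Assumes a TensorFlow Lite model file and the tflite_runtime package;
# the model path, threshold and upload helper are illustrative only.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="image_classifier.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def classify(image: np.ndarray) -> tuple[int, float]:
    """Run the trained model on-device and return (class index, confidence)."""
    interpreter.set_tensor(inp["index"], image[np.newaxis, ...].astype(np.float32))
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    best = int(np.argmax(scores))
    return best, float(scores[best])

def handle_frame(image: np.ndarray) -> None:
    label, confidence = classify(image)
    if confidence < 0.6:
        # Low-confidence frames are the ones worth sending back to the
        # cloud, where they can be labelled and used to retrain the model.
        queue_for_cloud_upload(image, label, confidence)

def queue_for_cloud_upload(image, label, confidence):
    pass  # hypothetical helper: batch and upload when connectivity allows
```

The inference stays fast and local; only the small fraction of data the local model struggles with travels back to the heavier, centrally hosted training pipeline.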
Minimise maintenance
The remote nature and long lifespan of many edge devices means that firms are going to want devices and supporting infrastructure to be robust enough to minimise hands-on maintenance, says Gartner’s Gill.
“As much as possible we’re trying to make these devices have a greater longevity,” he says, adding that many edge-device manufacturers are working towards five- to 10-year lifecycles.
The key is engineering systems that can be secured and receive new features via software, rather than hardware updates, he says.
“When we start getting into the billions of devices, the task of managing all those becomes a mess,” he says.
“That’s exactly one of the things that Microsoft was pushing with the whole Azure Sphere concept…we could do system updates on devices like we do with Windows Update.”
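Services such as Azure Sphere handle this end to end, but as a rough sketch of the kind of check a device-side update agent performs before applying a new image, consider the following. A production system would also verify a vendor signature over the manifest, which is omitted here for brevity; all file names and the manifest format are illustrative.

```python
# Illustrative sketch of an over-the-air update check: confirm the
# downloaded image matches the digest in a (separately authenticated)
# update manifest and is not a rollback. File names and manifest
# format are hypothetical.
import hashlib
import hmac
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def update_is_valid(image: Path, manifest: Path, current_version: int) -> bool:
    meta = json.loads(manifest.read_text())
    # Reject rollbacks as well as corrupted or tampered images.
    if meta["version"] <= current_version:
        return False
    return hmac.compare_digest(sha256_of(image), meta["sha256"])

if __name__ == "__main__":
    ok = update_is_valid(Path("firmware.bin"), Path("manifest.json"), current_version=41)
    print("apply update" if ok else "reject update")
```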
Be ready to set up an edge computing centre of expertise
Due to the nature of edge devices, staff with the relevant expertise to manage the devices’ deployment are likely to be spread across different operations, line-of-business and IT departments, according to Gartner’s Gill.
For this reason he recommends the management of edge deployments be handled by a “centre of excellence”, composed of staff from across the organization with the relevant expertise.
“It mirrors the shift from IT to the business unit with cloud deployment,” Gill says.
“We have the same thing here when it comes to who owns the devices at the edge. When it comes to things like maintenance and ownership of the data, that has to be specified and agreed to.”
Ensure a degree of compatibility between different edge deployments
Although it makes sense to trial a number of different technology platforms when piloting edge-computing deployments, you also want to ensure a level of compatibility and interoperability between these deployments in different parts of the business once they go live, says Gartner’s Gill.
“Within a retail organization do you really want four completely independent trials that don’t speak to each other, using different types of technology?” he asks.
“In a pilot that might make sense, to learn the most. When you actually start putting things into production that’s obviously not to your best advantage.”