
Prediction 1: Edge and IoT Computing Continues to Evolve, but Growth Hindered by ‘Chicken or Egg’ Syndrome

Many industry pundits are quick to note that the concept of pushing processing to the network’s fringe, i.e., the edge, has been around a long time. What’s new is the scale: billions of devices now create data while billions of users connect at once, and this convergence is generating a need for processing and network technology between those devices and the centralised system.

This leads me to wonder: will edge computing drive the growth of IoT in 2018, or will IoT be the catalyst for the edge? It’s true that before immersive technologies like AR and VR become commonplace, and before AI becomes pervasive, autonomous vehicle fleets become standard, and Netflix subscribers number in the hundreds of millions, edge computing must become prolific.


Clearly, technology is everywhere, so we predict that computing boundaries will be pushed even further in 2018 to meet the needs of IoT’s demanding applications. Edge will develop the ability to process environmental data locally and deliver the kind of speed that bandwidth-intensive content, like video streaming, requires. Edge must also provide the capacity to prioritise and analyse data at the source, enabling near real-time decisions like those required for autonomous driving.
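To make that concrete, here’s a minimal sketch, in Python, of what analysing and prioritising data at the source can look like. Everything in it (names, window size, threshold) is hypothetical; the point is simply that the device judges each reading against local history and only forwards the outliers upstream.

```python
import json
import statistics
from collections import deque
from typing import Optional

WINDOW = deque(maxlen=60)   # recent readings kept on the device
ANOMALY_Z = 3.0             # standard deviations that count as anomalous

def handle_reading(value: float) -> Optional[str]:
    """Return a message to forward upstream, or None to keep the reading local."""
    message = None
    if len(WINDOW) >= 10:   # wait for enough history to judge against
        mean = statistics.mean(WINDOW)
        stdev = statistics.stdev(WINDOW) or 1e-9
        if abs(value - mean) / stdev > ANOMALY_Z:
            message = json.dumps({"type": "anomaly", "value": value})
    WINDOW.append(value)
    return message

if __name__ == "__main__":
    readings = [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 19.8, 20.0, 20.1, 20.2, 35.7]
    for v in readings:
        msg = handle_reading(v)
        if msg:
            print("forwarding:", msg)   # only the spike at 35.7 is sent
```

The design choice here is the essence of edge computing: the bandwidth-hungry raw stream never leaves the device, while the decisions that matter are made in near real time, at the source.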

At the same time, after years of complaints from security-minded IT pros (not to mention a few near-miss IoT-pocalypses), we may see the IoT industry finally prioritise security for these devices in 2018 and begin building IoT systems that corporations can feel comfortable implementing. 2018 may also see a proliferation of business-relevant IoT devices, aided and abetted by IoT management tools such as AWS IoT Platform.
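As a rough illustration of what “comfortable to implement” means on the device side, here’s a sketch of publishing telemetry over mutually authenticated TLS using the open-source paho-mqtt client. AWS IoT, for instance, accepts MQTT over TLS on port 8883 with per-device X.509 certificates; the endpoint, topic, and file names below are placeholders, not real values.

```python
import json
import paho.mqtt.client as mqtt

ENDPOINT = "example-ats.iot.eu-west-1.amazonaws.com"  # placeholder broker endpoint
TOPIC = "factory/line1/telemetry"                     # placeholder topic

client = mqtt.Client()

# Mutual TLS: the device proves its identity with its own certificate
# and verifies the broker against the trusted root CA.
client.tls_set(
    ca_certs="AmazonRootCA1.pem",
    certfile="device.pem.crt",
    keyfile="device.private.key",
)

client.connect(ENDPOINT, port=8883)
client.loop_start()

# Telemetry travels over an encrypted, authenticated channel; there is
# no anonymous device access and no plaintext on the wire.
client.publish(TOPIC, json.dumps({"temp_c": 21.4}), qos=1)

client.loop_stop()
client.disconnect()
```

None of this is exotic; the prediction is simply that in 2018 it stops being optional.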

Ultimately, it doesn’t matter which came first, the chicken or the egg. The same is true of the interdependent relationship between edge and IoT: it’s difficult to name either as the primary driver of distributed computing. Does immediacy trump centralisation? Cost or flexibility? Many industry thought leaders compare the emergence of edge to the early days of cloud, and the pattern is likely to repeat: edge capabilities will change as new technologies and use cases emerge, and new technologies and use cases will emerge because of edge capabilities.

Prediction 2: New Target in the Cybersecurity Wars: IT Security Researchers

What makes malware illegal? Not the creation of it, or even the sale; it’s the intent to sell for criminal use. But intent can be hard to prove (and disprove), and well-meaning security researchers may increasingly find themselves the focus of investigations.

Consider WannaCry ransomware cyberhero Marcus Hutchins, aka MalwareTech. He defused the attack; however, the high-profile nature of the hack generated intense interest in his identity. In August, Hutchins was arrested in a separate incident, accused of developing malware targeted at the banking industry. He maintains his innocence, and many in the security community believe he’s been falsely charged.

His case is an example of how regulations around the world, however straightforward on paper, struggle to differentiate between hackers with criminal intent and cyber vigilantes.

Thus, the potential for security researchers to get caught up in ambiguous statutes because of the nature of their work may be an impending issue in 2018. There’s already a dearth of security professionals (according to the National Cyber Security Strategy, the IT sector can expect a worldwide shortage of 1.5 million professionals by 2020), and the steady proliferation of endpoints and increasingly sophisticated cybercriminals mean security professionals are needed more than ever. We expect more awareness of this issue in 2018, coupled with an increased need for robust security tools.

Prediction 3: Automation Anxiety: AI & Machine Learning

The integration of artificial intelligence (AI) and machine learning capabilities is widely perceived as critical for business success in the coming years. Although this technology is poised to offer breakthrough possibilities to business leaders, artificial intelligence also brings with it widespread uncertainty with respect to the impact on jobs – and not just in IT, but for professionals across a variety of industries.

While this is a legitimate concern – and one that has been perpetuated by large vendors like Google, Microsoft and Amazon – we predict that in 2018 we’ll see a decrease in automation anxiety and more organisations begin to embrace AI and machine learning as a way to augment their existing human resources.

The reality is that the fear, uncertainty and doubt around the impact of AI and machine learning capabilities are similar to the onset and rapid adoption of any radically new technology. Consider the Industrial Revolution: the introduction of assembly lines seemed poised to eliminate countless jobs. Instead, the nature of those jobs simply adapted to the needs of new technology (witness the emergence of new skills and jobs such as machine maintenance and servicing). In fact, automation has historically created room for more jobs by lowering the cost and time required to accomplish smaller tasks and refocusing the workforce on things that cannot be automated and require human labour.

The same will be true of AI and machine learning. New tools (such as AI-enabled security software) and capabilities (like leveraging machine learning to remotely spot areas for maintenance in an oil pipeline) are establishing new baselines for efficiency and effectiveness in every industry, and afford enterprises the valuable opportunity to redeploy their workforces to address other challenges.

At the same time, many new functionalities still require human oversight: for a machine to determine whether something “predictive” should become “prescriptive,” for example, human management is needed. Similarly, a machine can only consider the environment variables it is given; it can’t choose to include new variables, and only a human can do this today.

For IT professionals, this will necessitate the cultivation of AI- and automation-era skills such as programming, coding, a basic understanding of the algorithms that govern AI and machine learning functionality, and a strong security posture in the face of more sophisticated cyberattacks. For stakeholders across other industry disciplines, distilling a broad technology such as AI or machine learning into a specific value-add for a business, and communicating the ROI to decision-makers, will be a challenge in the year ahead.
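To ground the oil-pipeline example, here’s a hypothetical sketch of that “predictive, but human-reviewed” pattern using scikit-learn’s IsolationForest. The data and numbers are invented; what matters is the last step, where the model surfaces candidates and a person decides what to do with them.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Invented data: pressure readings (psi) along one pipeline segment.
rng = np.random.default_rng(42)
normal = rng.normal(loc=75.0, scale=1.5, size=(500, 1))   # healthy readings
faulty = np.array([[62.0], [90.5], [58.3]])               # simulated drops/spikes
readings = np.vstack([normal, faulty])

# Train an anomaly detector; contamination is our assumed share of bad readings.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(readings)

# predict() returns -1 for anomalies and 1 for normal points.
labels = model.predict(readings)
flagged = readings[labels == -1]

# Predictive, not prescriptive: the model flags candidates, and a human
# decides whether each one actually warrants a maintenance call.
for value in flagged.ravel():
    print(f"reading {value:.1f} psi flagged for human review")
```

Note what the code cannot do: it only weighs the variables it’s given. Deciding that a flagged reading is a sensor fault rather than a leak, or that a new variable belongs in the model at all, remains a human call.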

From Leon Adato, Head Geek™, SolarWinds