2017 Enterprise IT Secrets Revealed!

First off, 2017 will see an increase in the quantity of clickbait article titles…
Each year about this time, analysts, bloggers, and random people on the street start to prognosticate about the coming year, seeking to predict the trend that will define it, the hype that will get blown out, and the results that all of the above will yield.  2016 managed to throw a number of unexpected twists and turns at us, so to help get people through to 2017, we at ActualTech Media have decided to lend our voices to the din.
2016 was dominated by a number of different events and trends, many of which started out a bit disjointed but have since combined, or ultimately will, to create new directions for IT.  Some of the events were, well, really bad, and many of the trends have proven positive.
Read on!

Security and Data Recovery Will Be Joined at the Hip

Did you know that there were security issues in 2016?  From Russian attacks on the servers of a certain political party to DNS-attacking IoT-driven botnets that brought down massive swaths of the Internet to announcements that up to 1 BEEELION Yahoo accounts were compromised to a whole lot more, security was unhappily welcomed into the boardroom as a continuing major risk for organizations of all shapes and sizes.  Based on the sheer devastation caused by 2016 security breaches, we fully expect to see continued and renewed attention on security in 2017, and it won’t be limited to big companies with lots of juicy data.  Even smaller, seemingly uninteresting companies will work on raising their shields.
[su_quote cite=”Justin Giardina, CTO, iland” url=””]

While historically, it was the biggest organizations with the most attractive data that got hacked, increasing numbers of malicious attacks now target smaller, often weaker, organizations. So, we’ll see medium-sized enterprises raising their security and business continuity efforts. Often, they’ll turn to cloud vendors to provide that security and maintain those systems, as they represent a fast path to the latest technology.

Perhaps the most immediately impactful kind of attack that people suffered in 2016 resulted from the scourge that has become known as ransomware.  Ransomware is software, developed by complete and utter societal rejects, that is delivered to an unsuspecting user via email attachments and malicious web sites, among other vectors.  Once unleashed, ransomware encrypts all of the person’s files without that person’s consent.  The only way to get the unlock key is to pay the perpetrator in Bitcoin.  Unfortunately, knowing that many organizations that lose their data go out of business, many victims are paying up, making this a winning financial tactic for the people who perpetrate these extortion schemes.
Protecting against ransomware has become big business for a number of traditional data protection companies.  These companies are working hard to provide customers with solutions that keep data continuously protected and that enable fast and easy recovery in the event that a customer falls victim to the scheme.  2017 will see a continued expansion of solutions intended to help protect customers from these and other kinds of attacks.
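To make the pairing of security and recovery concrete, here’s a minimal, hypothetical sketch of the principle behind these solutions: keep immutable, timestamped copies of data so that a clean, pre-infection version can be restored after an attack. All paths and file contents below are invented for illustration; real products automate scheduling, replication, and retention on top of this same idea.

```shell
#!/bin/sh
# Illustrative sketch only -- paths and names are hypothetical.
set -e
SRC=./documents
BACKUPS=./backups

mkdir -p "$SRC" "$BACKUPS"
echo "quarterly report" > "$SRC/report.txt"

# Take a timestamped snapshot of the source directory...
SNAP="$BACKUPS/snapshot-$(date +%Y%m%d%H%M%S)"
cp -r "$SRC" "$SNAP"
# ...and make it read-only so malware running as this user can't rewrite it.
chmod -R a-w "$SNAP"

# Simulate ransomware scrambling the live copy...
echo "ENCRYPTED" > "$SRC/report.txt"

# ...then recover: restore the most recent clean snapshot.
LATEST=$(ls -d "$BACKUPS"/snapshot-* | sort | tail -n 1)
rm -rf "$SRC"
cp -r "$LATEST" "$SRC"
chmod -R u+w "$SRC"

cat "$SRC/report.txt"
```

The read-only snapshot is the key detail: ransomware running with ordinary user privileges can scramble the live files but not copies it cannot open for writing, and offline or offsite replicas harden this further.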
[su_quote cite=”Paul Zeiter, President, Zerto” url=””]

In the next year we are going to see a rebalancing of spend from traditional security solutions to data protection and recovery. Whilst security spend protects the perimeter fence, there are simply too many cases of breaches getting past these defenses to not have a plan B in place. A hacker only needs to be right once to gain access, whereas the company has to be secure 100% of the time. CIOs and CEOs are starting to recognize that millions of dollars in IT security investments, while critically important, are just not enough when a disaster such as a hack or ransomware breaks through the perimeter or a natural disaster like a hurricane floods their data center. In the wake of a disaster, companies quickly come to the realization that without the right investments in a disaster recovery solution, their businesses are exposed. To be proactive, companies need a plan and tools in place to recover from any disaster very quickly with as little revenue and end-user impact as possible. Even if an organization has implemented the best preventative security technology, disasters can and do still happen.


Disaster Recovery as a Service

Recent years have seen the rise of automated disaster recovery products, which many now refer to as disaster recovery as a service (DRaaS) offerings.  In fact, although DRaaS offerings have become a popular part of the solution set for traditional data protection products, such capabilities moved beyond that environment in 2016, a trend we expect to continue in 2017 as security and data protection continue their close-knit relationship.  For example, in 2016, hyperconverged infrastructure creator SimpliVity added DRaaS to its product capabilities, as did Scale Computing, another purveyor of hyperconverged gear, one aimed at the SMB and SME markets.
On this topic, Jesse St. Laurent, VP of Product Strategy at SimpliVity, has these thoughts:
[su_quote cite=”Jesse St. Laurent, VP of Product Strategy, SimpliVity” url=””]

In 2016, we saw plenty of high profile examples of IT downtime resulting in lost revenues and damaged reputations for some world-wide brands. As we move into 2017, businesses will start to rethink their backup and disaster recovery strategies to ensure that they have the right solutions in place to get back online quickly and smoothly after an outage. This year was a wake-up call for simplifying and automating data protection, just as last year was a wake-up call for security.


Architecture: Public and Private Cloud Will Undergo Evolution and Revolution

If you haven’t yet heard of the cloud, it’s time to turn in your IT card.  It’s been simultaneously one of the most overhyped and underhyped technologies of all time.  By now, you know that every single enterprise technology on the planet has a “cloud” solution, at least according to marketing.  Seriously, though, in 2016 we started to see the potential behind what people are really looking for when they utter the word “cloud.”  It became clear that cloud isn’t really a destination for many.  Rather, companies are looking for a certain set of outcomes, and many don’t much care where their workloads operate as long as they can get the desired benefits.
Both public and private cloud have interesting opportunities, but for traditional enterprise IT vendors, private and hybrid cloud environments are where the action seems to be.  In these scenarios, customers continue to maintain their own data centers (private cloud), with some sprinkling in a bit of the public cloud for good measure (hybrid cloud).  2017 will continue a number of trends that gained traction in 2016.

Hybrid Cloud

Hybrid cloud is one of these trends.  A conglomeration of private data centers and services from various cloud providers, hybrid cloud architectures are poised to become the new normal for enterprise IT.  By offering the best of both worlds, hybrid cloud makes it possible to mitigate the weaknesses on each side of the equation.  For example, public cloud is sometimes considered too porous for really sensitive data, so people eschew that option in favor of keeping data local.
[su_quote cite=”Geoff Barrall, CTO, Nexsan” url=””]

Towards the end of 2016 we heard that vSphere-based cloud services will run on AWS in 2017, a move that shows the major players are preparing to accommodate customers in a hybrid cloud environment. No matter how far we look into the future, there will always be data too sensitive to trust to public cloud services. So whilst some companies will create a hybrid cloud strategy, and some may go all private, most companies, especially larger ones, will not risk their data to a 100% public cloud strategy. So what of file sync and share? Today’s NAS storage solutions have not kept up with the times, creating a gap between the worlds of the data centre and the connected mobile user. 2017 will see forward thinking companies not just creating secure file sync and share capability, but intertwining them with the company’s storage environment to avoid damaging or duplicating files. In essence, next year we will see the delivery of the data centre on the go.


Hyperconverged Infrastructure

It’s impossible these days to talk about private clouds and local data centers without considering hyperconverged infrastructure.  We fully expect to see hyperconverged infrastructure continue its blistering growth in 2017 as more companies seek to exit the rat race that has become the traditional data center.  However, we also expect to see the start of consolidation in this space as big vendors gobble up smaller ones and front runners continue to grow.  What’s really interesting about hyperconverged infrastructure is that it’s a technology trend for which the hype simply isn’t diminishing… and it’s actually meeting expectations.  Our research says that people are getting the results they expected and wanted and they’re telling their friends.
The folks at Scale Computing – specifically, co-founder Jason Collier – also see a bright future for the technology, particularly for cash-strapped SMBs.
[su_quote cite=”Jason Collier, Co-Founder, Scale Computing” url=””]

In 2017, hyperconverged Infrastructure will become increasingly popular as an alternative to traditional virtualization architecture composed of separate vendors for storage, servers, and hypervisor. IT shops will increasingly move to shed the complexity of managing components of infrastructure in silos and adopt simpler hyperconverged infrastructure solutions as a way to streamline IT operations. There may likely be a much sharper rise in adoption of hyperconverged infrastructure in the SMB market where the simplicity (requiring less management) can have a bigger budget impact.



Containers

Another area that’s been getting a lot of digital ink in 2016 is containers, which many describe as the next wave of virtualization.  2016 has been the year of hype for this nascent technology, but 2017 may bring containers a bit further into the mainstream as well-established companies integrate them with their own offerings.  There is good reason to do so.  Containers, when used correctly, have the potential to provide companies with performance benefits without some of the tradeoffs that come with virtualization.
Chuck Dubuque, Vice President, Product and Solution Marketing at Tintri, sees a bright future for the technology:
[su_quote cite=”Chuck Dubuque, Vice President, Product and Solution Marketing, Tintri” url=””]

Companies want high performance, but without the price they have to pay in latency that comes with modern virtualization, no matter how small. In 2017 this will equate to an increased interest in going back to physical workloads, but with a container twist. Containers give bare metal performance and hardware access for real-time transactional workloads, but with some of the abstraction and portability benefits of virtual machines. Containers on physical can be a solution for those hard-to-virtualize workloads where performance and latency are essential. Containers are also more lightweight and ephemeral, making them a great match for modern cloud-native workloads.

Although we at ActualTech Media do believe that containers will ultimately be quite popular across data centers of all sizes, we see 2017 as a continuation of the large-company opportunity for this technology, as it has been up to this point, at least for production use.  2017 will see an expansion of current proofs-of-concept into production, with some smaller companies beginning to take the plunge, or at least test the waters.
This is also the direction predicted by Chris Brandon, CEO and Founder, StorageOS.
[su_quote cite=”Chris Brandon, CEO and Founder, StorageOS” url=””]

2016 saw the next evolution of container technology with the creation of persistent, highly available, scalable container storage. Many SME and Enterprise customers have started to test and prepare these systems for production. In 2017, these developments will change the landscape not just for devops but for all enterprises wishing to drive down cost and decrease time to market. Containers are used to quickly build and deploy cloud native applications that run securely on multiple platforms or cloud providers’ environments. Users now require scalable, deterministic, low latency storage that can securely move data with the container based application between bare metal devices, virtual machines or cloud infrastructure. They also need to maintain control of Service Levels and Compliance for these systems. With persistent storage, containers can be used not just for applications but for databases as well. In 2017, we will see even greater adoption of containers by service providers and enterprises of all sizes as companies continue to develop in the cloud. The winners will be the ones focused on leveraging existing investment and maintaining business control, while delivering ease of use and sophisticated integration to make developers’ lives easy.

With Microsoft’s inclusion of containers in Windows Server 2016, we believe small and midsized companies will begin kicking the tires a bit more than they have to this point, but will likely not undertake large-scale adoptions quite yet.  We see containers as a “slow burn” technology – they will sneak up on organizations, but in a good, non-stalkery way.
However, there remain some challenges that need to be solved in order for containers to have mass appeal.  First and foremost, security is at top-of-mind as containers bring with them new challenges in this area.  There are current and emerging solutions for securing containers, but the space remains somewhat disjointed, a scenario we expect to see corrected in 2017.  Moreover, persistent storage in the world of containers is being addressed as well.  It seems like storage is everywhere these days, no matter where you’re looking!
Ruben Spruijt, CTO of Atlantis Computing, also sees containers as potentially revolutionary, particularly in terms of storage.
[su_quote cite=”Ruben Spruijt, CTO, Atlantis Computing” url=””]

Storage virtualization continues to progressively transform physical infrastructure, management and architecture, which is also changing how storage for enterprise workloads is delivered. Server virtualization, by contrast, never significantly changed the way enterprise workloads were delivered as compared to physical servers. In 2017 we will start to see this evolve with an increased focus on containers. Containers are all about flexibility and deployment of just the application, and they work well within cloud environments. Containers will continue to gain traction as the concept matures further in 2017. A container can be shipped to a public cloud platform easily, meaning more newly developed applications will be utilizing containers in 2017.



Data: Capture, Storage, and Analysis

You know that data is really important!  However, data doesn’t just exist in a vacuum.  It needs to be captured, stored, and analyzed, all services that will experience shifts in 2017.

Data Capture and Storage

Even as we stare at either the greatest times in the history of mankind or the abyss (it’s all about perspective, people!), one thing is certain: AI and automation rely heavily on data, and lots of it.  Further, these kinds of systems have to have a complete understanding of all that data so that it’s actionable.  Beyond the AI/automation components of the story, these kinds of data analysis capabilities are increasingly important to businesses as they seek new ways to innovate their products, and even to other areas, such as healthcare, where practitioners seek ways to mine data that can help propel medicine to new heights.
And, we can’t forget about the other big data consumer and generator: the Internet of Things (IoT).  2016 has seen a spectacular rise in IoT devices, a scenario that will absolutely accelerate in 2017 and beyond.  The continued rise of IoT will bring both opportunities and challenges to consumers as well as the companies that sell such products.
Fuzzy Logix’s COO, Michael Upchurch, sees the potential for challenges as well, particularly as it pertains to the ability to store massive amounts of data.
[su_quote cite=”Michael Upchurch, COO, Fuzzy Logix” url=””]

Next year will bring about another deluge of data brought on by advancements in the way we capture it. As more hardware and software is instrumented especially for this purpose, such as IoT devices, it will become easier and cheaper to capture data. Organizations will continue to feed on the increased data volume while the big data industry struggles through a shortage of data scientists and the boundaries imposed by non-scalable legacy software that can’t perform analytics at a granular level on big data.  Healthcare will be especially hard hit in this regard. Sources of huge healthcare data sets are becoming more abundant, ranging from macro-level sources like surveys by the World Health Organization, to micro-level sources like next-generation genomics technologies. Healthcare professionals are leveraging these data to improve the quality and speed of their services. Even traditional technology companies are venturing into this field. For example, Google is ploughing money into its healthcare initiatives like Calico, its “life-expansion” project, and Verily, which is aimed at disease prevention.  We expect the demand for innovative technical solutions in all industries, particularly healthcare, to explode next year.


Database Types

Databases are already critical to just about every business out there.  Data is the very lifeblood of the organization and, without it, the company would wither away and die.  As mentioned above, 2017 will see challenges storing data, but, at some point, that challenge will be solved and it will be the job of the database to hold data for later analysis and for action.  In order to enable the use of larger data sets, each containing a lot of variety, the market needs new ways to store that data.
Blue Medora’s CTO, Mike Kelly, believes that 2017 will be a breakout year for new database types.
[su_quote cite=”Mike Kelly, CTO, Blue Medora” url=””]

In 2017 more and more database types will be implemented and supported by a mix of traditional and cloud infrastructure — and the management of them will become even more complex. As more companies rely on applications, the performance of databases which power these applications will become even more critical to service and profitability. Companies who haven’t figured out that database performance will be key to their success will be surpassed by companies that have.



Analytics

What good is data if you can’t do something with it?  The ability to turn data into actionable information is a key differentiator for many industries.  It’s super easy to get data, but a lot harder to make sense of it.  While not a new trend, efforts toward this goal will increase in 2017, a sentiment echoed by Jeff Evernham, Director of Consulting, North America, at Sinequa.
[su_quote cite=”Jeff Evernham, Director of Consulting, Sinequa” url=””]

Historically, the focus of data has been on making sense of numbers (structured data), analytics and business intelligence. That focus will continue, but in 2017 more organizations will be tapping into their vast stores of disparate, unstructured data by finding ways to leverage it along with their structured data assets. This will be a fundamental shift.

Tintri’s Chuck Dubuque agrees.
[su_quote cite=”Chuck Dubuque, Vice President, Product and Solution Marketing, Tintri” url=””]

For years now, organizations have struggled (somewhat ironically) to get full value out of all the data generated by their data center. Third party software is cumbersome and often only addresses one silo or another at a cost and complexity curve that doesn’t make sense in the long run. Now organizations have better data—they can drill into the behaviors of any individual virtual machine. And they have better analytics. In 2017, the industry will embrace predictive analytics as a business tool, getting smarter about how to use historical behavior of hundreds of thousands of virtual machines to forecast the future. And analytics will span silos, seeing across applications, compute and storage to offer a more complete picture of the data center.


The IT Organization: 2017 Trends

Let’s face facts.  IT isn’t what it used to be, a phrase that will also be uttered by IT pros in 2030 and beyond.  IT pros and decision makers need to adopt mindsets, training programs, and operational methodologies that are predicated on continuous change and evolution.  It’s far too easy to get left behind and struggle to catch up.

Jumping on Trends

Now, unlike some analysts, I’m not going to sit here and say that you need to force your company to latch on to every single trend that comes by.  That’s a recipe for disaster and it would be impossible to keep up with the changing winds in any meaningful way.  That’s the good news.  You don’t need to jump on everything.
But here’s the bad news: You do need to at least be aware of most trends and their potential applicability to your business.  When the inevitable boardroom question of “Why aren’t we jumping on this trend?” comes up, you need to be ready to defend your decision not to adopt a trend, to wait for it to mature, or to explain when you do plan to adopt it.  That means you need a program that helps expose your IT staff to new trends so that they can analyze them and make recommendations regarding adoption.
With the rise of hybrid cloud, containers, DevOps, automation, and security breaches galore, enterprises in 2017 will finally invest more of their budget in staff training. In many cases, enterprises are already using cloud technologies, containers, and other newer technologies.  Beyond just operating these technologies, staff must understand the full breadth and depth of the stack to help protect the business.  Businesses must protect themselves before they are put out of business by a security breach, and they need to find ways to become more efficient.


Simplicity and Desiloization

No matter the architecture chosen, there are new realities in enterprise IT that will accelerate in 2017.  First, simplicity is chic.  It’s a goal for most organizations that have come to realize that the data center is often overly complex.  A number of today’s newer architectural options exist to solve this problem.  We expect adoption of simplicity to accelerate in 2017.
As the other side of the same coin, there is also an acceleration in the breaking down of the age-old IT infrastructure silos that have existed to this point, particularly as enterprises adopt solutions that are more efficient to administer.  The goal is to elevate siloed specialists to being simply infrastructure admins or architects.  This will require some retraining but will allow them to be more efficient in the future.  We fully expect this desiloization trend to continue in 2017.


Final Thoughts

Like all years that have come before it, 2017 will be chock full of new trends, new products, new challenges, and new solutions.  Some of the predictions discussed in this article will be proven correct; others will, well, be outed as just hopes that didn’t come to fruition.  Regardless, it’s clear that the technology landscape is poised to continue its fundamental transformation, and IT pros and decision makers will need the skills and knowledge necessary to traverse that changing landscape and ensure they don’t get left behind.