Developers, DevOps, Serverless Computing and Whatever Happened to IT

By Jason Harrell posted Tue December 18, 2018 05:20 PM

By Jason Harrell, backupstars.com

Disclaimer: This article is intentionally written to be a bit provocative to make one simple point: all techs (new school or old) must code.

Unashamedly, I love British sitcoms. In particular, I love throwback, classic sketch comedy shows from the '70s and '80s like Mr. Bean and Benny Hill. For those unfamiliar with The IT Crowd, it is a British sitcom set in the offices of Reynholm Industries, a fictional British corporation in London. Roy and Moss (IT helpdesk technicians) are the show's two main characters, along with their clueless, non-technical boss, Jen. The show focuses on the shenanigans of these three Brits as they "work", flirt with mayhem and live out their professional (and personal) lives as Reynholm Industries' IT brain trust.

In my opinion, the show is great because, before shows like HBO's Silicon Valley, it gave IT professionals a normal personality and added humor to an otherwise stressful and (in some cases) thankless job. Before IT was kewl, techs felt more like a cult of assembly- or BASIC-obsessed basement dwellers, eager to demonstrate their obscure knowledge and ability to program or hack PC hardware in order to create new things or get traditional things to function in new ways. Contrast then with now: IT is a business unto itself, with millions of practitioners, technologies and functional areas.

Computer programming, once considered "a woman's job", is now the domain of 'app' developers: self-described (white hat) hackers and coders with good social skills. Far away from the dingy basement or dank cubicle, app developers are front and center in today's brave new tech world. In all truth, developers are now what little kids should dream of becoming instead of firefighters and police officers (LBVS). I'm afraid such noble societal roles will soon be mechanized with life-like silicon robots connected to an IoT fabric, powered by deep learning and artificial intelligence subsystems and algorithms. What used to be the stuff of movies is now a clear and present reality! RoboCop is (now) real! (And this time I hope they give him a cool cop car like the Decepticon Barricade had in Transformers: Revenge of the Fallen. That Ford Taurus RoboCop drove around in sucked!)

Moss and Roy were great guys, but today they'd find themselves artifacts of a fading IT past; anachronisms of a time passing quickly. Moss and Roy existed at a time when IT people were more 'systems people' than developers. Great sysadmins always knew how to script (C shell, bash, Korn, expect, Perl, Python, Tcl, etc.) or even code in C, but they wouldn't have considered themselves coders per se; hackers, at best. These kinds of sysadmins are being reincarnated today as we find ourselves in a DevOps world dominated by scripting for greater automation and efficiency.

Back in the day, sysadmins were responsible for modifying the kernel, OS patching, disk (LVM) management, network configuration/performance, OS hardening, etc. The primary difference between then and now was that infrastructure could not be codified into a series of statements and parameters within a YAML or JSON template; you had to physically install it! There was a time when servers had actual hostnames like thor and zeus; many sysadmins had favorite servers, for crying out loud! Nowadays, servers are mere cattle: ephemeral compute instances with key/value pairs that do an application's bidding.
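To make the point concrete, here is a minimal sketch, in Python, of what "infrastructure as a series of statements and parameters" looks like. The template shape mirrors an AWS CloudFormation resource; the logical name, AMI ID and tag values are purely hypothetical illustrations.

```python
import json

# A minimal, CloudFormation-style sketch: the "server" is just a named
# resource with key/value parameters -- cattle, not a pet called thor.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "WebServer": {                      # logical name, not a hostname
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-12345678",  # hypothetical AMI ID
                "InstanceType": "t2.micro",
                "Tags": [{"Key": "role", "Value": "web"}],
            },
        }
    },
}

# Hand the rendered JSON to the provisioning API instead of racking a box.
print(json.dumps(template, indent=2))
```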

Before 'business agility' was in our lexicon, agility was measured by how fast IT could spec, order, rack, image, network, configure, test and productionize servers. This process was measured in months, not minutes. The model worked for a while because business applications were heavy, more complex and difficult to deploy. In other words, you had time, because the SAP gurus weren't likely to be ready anyway :). They'd find something else and things would get delayed yet again. Fast forward to today: a few guys and gals with an awesome (and very addictive) iOS game or web/mobile application can deploy their app in one to three months, all within the cloud (no upfront cost plus limitless elasticity), and generate as much revenue as some Fortune 1000 companies at a fraction of the cost. Consider the mobile game Candy Crush. In 2013, it was reported that Candy Crush generated $850,000 in revenue per day! Writing an application like Candy Crush, with over 150 million daily active users, would have been simply IMPOSSIBLE in the past!

Nevertheless, as traditional business models evolved and changed to capitalize on trends in social and mobile customer engagement and acquisition, new applications were required, along with new development frameworks and tools and new ways to manage and pay for infrastructure. <insert cloud computing and RESTful APIs here/>. As disruptive as the loom was to once-prized fabric weavers, these platforms and technologies have fundamentally and forever changed application development and computing. Today, applications can leverage new services faster (via API integration), and code can be deployed and tested at breakneck speed, allowing for a new "fail fast" and "fail often" ideology.
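As a small illustration of that API-driven integration (the endpoint, token and response fields below are made-up assumptions, not a real service), consuming a new capability is often little more than an HTTP call and some JSON:

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical RESTful endpoint -- the point is that integrating a new
# service is an HTTP request and a JSON payload, not a hardware project.
response = requests.get(
    "https://api.example.com/v1/orders",
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=10,
)
response.raise_for_status()      # fail fast if the service is unhappy
for order in response.json():    # assumes the API returns a JSON list
    print(order["id"], order["status"])
```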

Once considered a fad (yeah), cloud computing is now the new normal. I mean, who in the world would consider not buying expensive, commodity servers and storage and instead securely host application workloads in off-premises, managed infrastructure for a fraction of the cost, with the ability to deploy entire environments in minutes versus months, or even years! And what fool would find it useful to leverage a bevy of managed services in a pay-as-you-go model versus spending a ton of money upfront on database software whose licensing model is inspired by 18th-century feudalism?

In any case, with the advent of cloud computing, infrastructure management (and control) has increasingly shifted away (call it redistribution) from traditional sysadmins and engineers to a new ruling class: Developers.

These t-shirt-and-flip-flops-to-work-wearing, curly-brace-using developers (poor attempt at adding humor) saw traditional IT as stifling their ability to rapidly code and deploy new features and enhancements. And they were right! So they masterminded and recruited a race of 'DevOps engineers' to perform various ops and workflow automation tasks to ease their burden, so that, like monks, they could focus more on coding. The application, not infrastructure, is now the center of the world. Today, developers are the rockstars, DevOps engineers make up the entourage and traditional IT admins are the "roadies". If you aren't already a cloud architect, or working to become one, it's game over.

If cloud computing wasn't disruptive enough to traditional IT consumption models, serverless computing is IT's merciless killer. With serverless computing, or FaaS (Function as a Service), infrastructure management becomes even more abstracted and is pushed further down the stack. With serverless computing, no one cares if you're an "HP shop" or a "Dell shop" or if you run UCS blades; so what! Even if you leverage hyper-converged infrastructure, that's nothing compared to the power, and soon-to-be prowess, of serverless computing. All of these hardware solutions are like the last meal served to a condemned man before he goes to his demise.

With serverless computing, developers don't have to worry about infrastructure at all. Entire commercially viable, multi-million-dollar-a-day websites can be constructed out of "thin air", with their code running via AWS Lambda (for example) and leveraging various AWS API services. Now, before someone says this is all a load of crap, that not all applications will run in a serverless environment, that not everyone is comfortable with the cloud and that, while this may be disruptive, it's not fatal; and furthermore, that there are still on-premises network switches, VMware, Citrix, etc. to manage, so, whatever: I won't digress into a lengthy retort here, except to say that all roads (more or less) now lead to the cloud. You can see that for yourself. In fact, I predict that most of today's hybrid cloud environments will largely dissolve into full-on public cloud environments within the next 5 years (~2022).
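For a flavor of what "code running via AWS Lambda leveraging various AWS API services" can look like, here is a minimal, hypothetical handler sketch. The table name, event shape and field names are assumptions for illustration, not a production recipe.

```python
import json

import boto3  # AWS SDK for Python, bundled in the Lambda runtime

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # hypothetical DynamoDB table


def handler(event, context):
    """Hypothetical API Gateway proxy handler: nothing to rack, image or patch."""
    path_params = event.get("pathParameters") or {}
    order_id = path_params.get("id", "unknown")
    item = table.get_item(Key={"id": order_id}).get("Item", {})
    return {
        "statusCode": 200,
        # default=str because DynamoDB returns numbers as Decimal objects
        "body": json.dumps(item, default=str),
    }
```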

Serverless computing is soon to become the new, new norm. Already today, serverless computing platforms like AWS Lambda, Microsoft Azure Functions and Apache OpenWhisk are dramatically changing the model for application hosting. Serverless computing allows code to be run without regard to the underlying infrastructure: code, upload, test, repeat. No more worrying about hosts (ephemeral or not), storage tiering or load balancers; the environment scales up and down on demand to address capacity and performance requirements. The application is what it's all about, and serverless computing just cuts out the proverbial middleman. Sorry, P. Diddy, it's all about the app (versus the Benjamins), baby.

To digress, infosec guys are largely unaffected by a lot of this. The network is still 'the network', and as long as TCP/IP (and other networking protocols) are around and there's a need for routing, switching, tunneling, firewalling, intrusion detection, etc., ports, packets and logs will need to be monitored, filtered and inspected. Infosec guys are also the golden children of this brave new world, thanks to hackers. For security professionals, the job function doesn't change much, because regardless of where the infrastructure and/or applications are hosted, they must be secured.

So what happened to IT? Developers and DevOps engineers moved in on traditional IT's turf and have all but taken IT over. Resistance is futile. Code or be slaughtered. Transform or be gone.
