Where Donald Trump Meets DevOps and the Cloud
Ben Saunders


Do shifts in US social and economic policy inadvertently chime with the move to DevOps and cloud computing across the financial services industry?

Before you shoot the messenger, I’d like to open by stating that this article is by no means a reflection of either my own, or Contino’s political sensitivities!

That said…during my recent trips to the US, I couldn’t help but notice that there are some inadvertent – and very interesting! – overlaps between some of President Trump’s economic and social policies and the seismic shifts towards digital transformation that we are seeing across all industries. Namely, Mr. Trump’s desire, firstly, to loosen the grip of Dodd Frank and other regulatory controls across financial markets and, secondly, to tighten the grip on H1B working visas for non-US citizens.

In this blog, I’ll examine the impacts of two of Trump’s policies and explore the repercussions – and, surprisingly, synergies – between what he’s doing and the kinds of changes that enterprises need to embrace to remain competitive.

Hopefully you will find the following account balanced and possibly even humorous. Because let’s be honest, everyone enjoys a little bit of political satire. Hang on in there with me. Honest!

Shifts in Economic Policy and the Dodd Frank “Disaster”

There has been much conjecture over the last few months surrounding many changes of political direction in the US.

Most notably on the economic front, President Trump intends to loosen regulatory controls and compliance standards that were introduced under the Dodd-Frank Act. Established after the 2008 financial crisis, Dodd-Frank was implemented to help prevent future financial crises from occurring and imposed significant compliance standards on financial institutions.

Trump is unimpressed: “Regulation has been horrible for big business, but it’s been worse for small business. Dodd-Frank is a disaster.”

But how does this relate to DevOps and the cloud?

Some could argue that the lowering of Dodd Frank standards leaves an air of ambiguity as to what controls financial services organisations will be required to follow.

The reason why DevOps and the cloud are so relevant is that, in this atmosphere of political volatility as well as regulatory uncertainty and ambiguity, the best response is to maximise agility, flexibility and cost-effectiveness. This ensures that financial organisations can turn on a dime in regard to the changing regulatory landscape, ensuring they are at once maximally productive and maximally compliant.

Leveraged wisely, DevOps approaches and Continuous Delivery patterns could almost certainly help address the many changing regulatory requirements off the bat, without the need to wade through tiers of arduous, costly and time-consuming bureaucracy.

So let’s jump into why this is the case. I’ll begin by taking you back to the days of the crisis itself…

Back to the Future … or 2008, At Least

Let’s rewind back to 2008 and consider the IT landscape at the time.

Large financial services institutions were in a state of shock and in the process of culling their internal IT functions in favour of large outsourced service integration contracts, which relied heavily on offshore resourcing. Cloud computing was in its infancy. AWS had only really been on the block for two years whilst Azure, Microsoft's competing service, was not brought to market until 2010. Service Oriented Architectures had begun to arrive in many IT ecosystems and were being keenly adopted by financial services organisations that had been forced to merge and were integrating legacy IT systems, for example, Bank of America and Merrill Lynch in the US and The Halifax and The Bank of Scotland (HBOS) back in the UK.

ITIL was pretty much the standard for how incident, release and problem management functions should be operated. Industry paranoia about releasing poorly configured software at a time of catastrophic financial turmoil resulted in a shift towards manual software delivery processes and the good old four-eyes-sign-off and box-ticking exercises.

The solutions that were established after 2008 by financial services organisations to address Dodd Frank were built on the foundations of technology from yesteryear: systems and applications were not decoupled, nor established with APIs in mind and CapEx was the new OpEx. They were also, more likely than not, delivered using cumbersome waterfall software development principles that prevented the iterative development and fast feedback practices we have become accustomed to today (Agile, Scrum, Kanban etc).

And lo and behold, the resultant, antiquated, four-times-per-year release process became widely accepted as the status quo. Ironically, this approach inadvertently introduced more risk by pulling together a plethora of application changes and rolling them into a ‘group wide’ release schedule.

Attempts to reduce risk ended up increasing it.

Fast Forward to 2017…

2017, by contrast, offers the opportunity to adjust properly to risk.

It is the age of increased transparency, DevOps and public cloud. A period when FINRA openly embraces cloud technology and automation practices to audit and analyse over 75 billion market events per day. A time when market volatility is still present, yet OpEx is now the new CapEx. Servers are provisioned in minutes via public cloud platforms, as opposed to months. A time when microservices connect disparate applications, containers enable rapid application delivery and big data is indeed big!

Yes, the technology landscape has become more complex. However, our ability to access metadata and understand who has done what, where and when has drastically improved. If regulatory bodies like FINRA can openly embrace public cloud to monitor the trading behaviour of banking institutions, then why can't banks use public cloud to keep their own house in order, all the while controlling cost and removing the reliance on human intervention to run, operate and maintain compliance processes and controls?

In this context, financial services can build the capability – if they want to – to turn the potential volatility of Trump’s policymaking into an opportunity to react more nimbly than their competition and to get ahead of them.

Think about how much of that estimated $36 billion remediation spend could have been used elsewhere if DevOps and cloud computing had been used more prominently by financial institutions to tackle regulatory challenges.

Over the next few sections I’ll examine how DevOps and cloud can be used to create advantage for financial services organisations in this political context.

DevOps: Fully Automated, Full Traceability and the Digital Fingerprint

The birth of DevOps in 2007/08 allowed organisations to give their people more 'accountable empowerment': delivering change faster in an automated fashion, with less procedural bureaucracy, yet simultaneously enabling greater visibility.

By aligning processes and technology we are now able to tightly integrate software delivery pipelines with automated solutions, right the way from requirements and design to production monitoring. We can see who requested new application features from the business, we can prove who developed them and we can ensure they were rigorously tested with automated solutions prior to being deployed in production, with digital fingerprints being captured at each stage of the software development lifecycle (SDLC).
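To make the idea concrete, here is a minimal sketch of how digital fingerprints might be chained across SDLC stages. The stage names and actors are illustrative, not taken from any specific toolchain; the point is that each record hashes the previous one, so tampering with any earlier stage invalidates every later fingerprint.

```python
import hashlib
import json

def fingerprint_stage(audit_trail, stage, actor, artifact: bytes):
    """Append a tamper-evident record for one SDLC stage.

    Each record hashes the artifact plus the previous record's digest,
    so altering any earlier stage invalidates every later fingerprint.
    """
    prev = audit_trail[-1]["digest"] if audit_trail else ""
    payload = {
        "stage": stage,    # e.g. "commit", "test", "deploy"
        "actor": actor,    # who triggered this stage
        "artifact_sha256": hashlib.sha256(artifact).hexdigest(),
        "previous": prev,
    }
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    audit_trail.append({**payload, "digest": digest})
    return audit_trail

trail = []
fingerprint_stage(trail, "commit", "dev-team", b"feature-x source")
fingerprint_stage(trail, "test", "ci-server", b"feature-x binary")
fingerprint_stage(trail, "deploy", "release-bot", b"feature-x binary")
```

An auditor can then verify the chain end to end: recompute each digest and confirm it matches what was recorded, proving who did what, and in what order.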

This is to say nothing of the huge benefits afforded by being able to bake compliance and security controls into the pipeline with automated quality control gates. Indeed, the advances in technology with capabilities like infrastructure as code, automated testing and containerisation means that is now easier than ever to standardise application stacks, whilst integrating disparate data sets and applications, with huge benefits for compliance and security teams.
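As a hedged illustration of such an automated quality control gate, the sketch below checks a deployment configuration against a set of compliance rules before anything ships. The rule names and config keys are invented for this example, not drawn from any particular compliance framework or tool.

```python
# Illustrative compliance rules: each maps a rule name to a predicate
# over the deployment configuration. Names and keys are assumptions.
COMPLIANCE_RULES = {
    "encryption_at_rest": lambda cfg: cfg.get("storage_encrypted") is True,
    "no_public_access":   lambda cfg: cfg.get("public_access") is False,
    "audit_logging":      lambda cfg: cfg.get("audit_logging") is True,
}

def compliance_gate(config):
    """Return (passed, failures) so a pipeline can block non-compliant deploys."""
    failures = [name for name, check in COMPLIANCE_RULES.items()
                if not check(config)]
    return (len(failures) == 0, failures)

ok, failed = compliance_gate({
    "storage_encrypted": True,
    "public_access": True,   # violation: resource left publicly accessible
    "audit_logging": True,
})
```

Because the gate is code, it runs identically on every release, and its verdicts can themselves be fingerprinted into the audit trail.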

Cloud Storage: Right Solution, Right Cost…and Compliant?

Currently, financial services organisations are spending upwards of 60% of their IT budgets on regulation and compliance. Cheaper storage for trade, risk and market data will no doubt be of great interest.

Thanks to the public cloud, we now have the capacity to store more data at lower cost. Developments in NoSQL and in-memory database solutions allow financial institutions to run real-time analytics and risk reporting faster, with greater levels of transparency. Coupled with the dynamic capability of solutions like Amazon EC2 and ECS, database servers can be provisioned rapidly and torn down once a calculation has been executed.

In addition, the advent of solutions like S3 and Glacier from AWS means financial services organisations can store larger sets of sensitive data securely, with greater levels of control to ensure that only those who should have visibility of certain data sets do so. Again, the cost benefits here are undeniable, with the capacity to use tiered storage solutions for frequently and infrequently accessed data at enhanced levels of durability (AWS quotes 99.999999999% durability for S3).
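For illustration, a tiered-storage policy of the kind described might look like the following S3 lifecycle document. The object prefix and day thresholds are assumptions for the sketch; this only builds the policy document, and applying it to a bucket would use the AWS API (e.g. boto3's put_bucket_lifecycle_configuration).

```python
# Sketch of an S3 lifecycle policy that tiers trade data as it ages:
# hot data stays in Standard, cooling data moves to Infrequent Access,
# and archival data moves to Glacier. Prefix and thresholds are assumed.
lifecycle_policy = {
    "Rules": [
        {
            "ID": "tier-trade-data",
            "Filter": {"Prefix": "trade-data/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access
                {"Days": 365, "StorageClass": "GLACIER"},     # long-term archive
            ],
        }
    ]
}
```

Regulatory retention data that is rarely read but must be kept for years is exactly the workload this tiering was designed for.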

Data visualisation and log aggregation solutions provide enhanced capabilities to capture, audit and visualise data in ways that were not possible nine years ago, whilst cloud-hosted distributed compute power and big data platforms have made it possible to analyse gigantic data sets and detect irregular trading patterns and behaviours faster. Advancements in Artificial Intelligence and Machine Learning are already being applied in this context, although these capabilities were nowhere near as mature back at the height of the crisis.
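As a toy stand-in for the large-scale pattern detection described above, the sketch below flags trade volumes that deviate sharply from the mean using a simple z-score test. The data and threshold are purely illustrative; production systems would apply far more sophisticated models over vastly larger data sets.

```python
import statistics

def flag_irregular_trades(volumes, threshold=3.0):
    """Return indices whose volume deviates more than `threshold`
    standard deviations from the mean of the series."""
    mean = statistics.mean(volumes)
    stdev = statistics.stdev(volumes)
    return [i for i, v in enumerate(volumes)
            if stdev and abs(v - mean) / stdev > threshold]

volumes = [100, 102, 98, 101, 99, 103, 5000, 100, 97]  # one obvious spike
suspects = flag_irregular_trades(volumes, threshold=2.0)
```

The same logic, distributed across cloud compute and fed from aggregated logs, is the shape of what regulators and banks alike can now run continuously.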

Indeed, the opportunity to leverage these types of offerings in the cloud will only benefit large financial institutions, allowing them to stay ahead of their competition by remaining alert to market fluctuations and calculating risk faster.

Regardless of whether or not Mr Trump does lower the barriers of Dodd-Frank regulation, storing the data required by the regulators has become cheaper, and that data has become easier to access.

Distributed Compute: “I just don’t have the power captain!”

I alluded to the dynamic, stand-up-and-tear-down nature of cloud compute in the previous section. This contrasts massively with the relative intransigence of the distributed grid compute that financial organisations have traditionally used. Coincidentally, I recently spoke with a customer on this very subject, who told me the following:

“By using AWS to run my risk analytics, I can completely isolate myself from others in the business, meaning no degradation in service across a shared grid compute farm and full traceability of my trading activity. Whilst if I use internal grid compute power, I must book slots to run my calculations and experience sluggish response times. On premise, it costs me 7 cents per hour to execute my analysis. On AWS, it costs me 2 cents per hour.”
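Taking the quoted figures at face value, a back-of-the-envelope comparison shows the scale of the saving. The annual workload here is an assumption for illustration only.

```python
# Figures quoted by the customer above: 7c/hour on premise vs 2c/hour on AWS.
on_prem_cents_per_hour = 7
aws_cents_per_hour = 2

# Assumed workload: 8 hours of risk analytics on 250 trading days a year.
hours_per_year = 8 * 250

on_prem_cost = on_prem_cents_per_hour * hours_per_year / 100  # dollars
aws_cost = aws_cents_per_hour * hours_per_year / 100
saving_pct = 100 * (on_prem_cost - aws_cost) / on_prem_cost
```

Even on this modest assumed workload the per-hour difference compounds to a saving of more than two-thirds, before counting the removed scheduling delays.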

Now, I am not saying this alone completely solves erratic and irresponsible trading practices across financial markets. Yet, it does indeed open a huge avenue for enabling full traceability across the trade lifecycle, whilst expanding market penetration opportunities for smaller private asset management firms.

Perhaps, Mr Trump is – accidentally, of course – pointing us in the right direction?

Furthermore, if banks can spend less money on meeting regulatory demands by using cloud-based environments and automated processes, they can re-invest that money in opportunities to innovate at scale with technologies that drive economic prosperity in emerging markets, like blockchain. That is a whole other blog on the redistribution of wealth! However, if you are interested, I suggest you check out this TED talk for more information.

I’ve spent a bit of time focusing on how technology can help organisations manage regulatory controls without a reliance on manual procedures. Now, I want to turn my attention to the human elements of Mr Trump’s policies and outline how these, in turn, could help the DevOps movement bring more benefits to large financial services organisations.

H1B - Green Card Changes and Stifled Innovation

Various media outlets have outlined deficiencies in the President’s border control policy, namely his intent to limit the release of H1B visas across the board. Indeed, there has been a lot of commentary highlighting the challenges this has brought to the business models of many large service integrators (SIs), who have traditionally delivered to customers through large offshore development, testing and infrastructure functions.

For instance, Cognizant announced that it would be making a number of senior managers redundant and reducing its annual hiring headcount as a result of tighter border controls, which would reduce the movement of its people from India into the US. Similar announcements have since followed as a consequence of Trump’s manoeuvres – Infosys, for example, has announced the hiring of 10,000 Americans over the next two years.

Indeed, with DevOps operating models we often cite the need for cross-functional delivery teams to be co-located, in close if not immediate proximity to the business. Typically, the large SIs deploy a 20/80 delivery model: 20% of their people based onshore and 80% based offshore. I have worked for many large enterprises where this model was adopted and, to a large extent, it stifled opportunities to drive true innovation.

I would like to add that I am not questioning the quality of people hired by large SIs. I have worked with many knowledgeable peers from these organisations in the past, many of whom I now count as friends! However, the structures in which they operate are hindered by command-and-control processes that prevent them from applying innovative approaches to software delivery.

This is a sentiment shared by many of my colleagues at Contino, one of whom recently confided that they felt hamstrung when operating within a large SI. They were not empowered to optimise the way change was introduced for their customer and often relied on laborious manual interactions which required sign off from four different superiors before changes could be released.

With automated quality reviews and gateways providing assurance and end-to-end traceability of ‘who, what, where, when?’, there is, to some extent, no longer a need for a proverbial cast of thousands to provide delivery assurance and ‘protection to production’. This would help offset the aforementioned shortfall in SI resources, with organisations able to empower their own people to focus on value-add activities whilst the mundane but essential maintenance and compliance controls are handled through automation.

I should add that the colleague I mentioned earlier is now a Technical Principal at Contino. He was so frustrated by his time at a large SI that he left and joined a start-up where his technical skills could be put to full use, without the shackles of command and control.

In short, one could argue that limiting H1B applications is a positive step for both American job applicants and the economies where typical outsourced, offshored IT delivery is executed because:

  • Less reliance on outsourced, offshore delivery will drive financial services organisations to explore US-centric recruitment drives, which could tap into the talent pool that was decimated by the 2008 financial crisis, or indeed provide more jobs for talented technology graduates.
  • By limiting H1B application approvals, low-cost geographies will see a swell of talent return to their shores, potentially accelerating technical innovation, start-ups and tech hubs, just as we have seen elsewhere across the globe.

Creating Global Prosperity

I wanted to frame the political changes we are seeing not as threats to the way we govern and regulate financial markets, or as changes that might prevent people from bettering their career opportunities and earning potential, but as potential catalysts for transformation.

Whilst the above is by no means a complete analysis, it’s interesting that Mr Trump is at least making us consider business possibilities that have a lot to do with the digital transformations that we are seeing across all industries, and which have DevOps and cloud computing at their core.

I also wanted to indicate that the technology landscape we have before us today, powered by the cloud, DevOps thinking and Continuous Delivery frameworks can help drive innovation, differentiation, transparency and economic prosperity across the world, not just in the US.

Hopefully, that much is apparent!
