
A Short History of Programming - Part 3 | @DevOpsSummit #AI #ML #DevOps #FinTech #Blockchain

Programming for commerce, banking, and FinTech

Code Compiled: A Short History of Programming - Part 3
By Omed Habib

Look at how far we’ve come. Just seven decades ago, the word “computer” referred to someone in an office with a pile of charts and a slide rule. Then ENIAC, the first general-purpose electronic computer, appeared. Decades later came personal computers like the Commodore 64, and later still, the iPhone. Today, the Sunway TaihuLight crunches data with 10,649,600 cores, performing 93 quadrillion floating-point operations every second. Where will we be in another seven decades?

In part one of this series, we covered how the evolution of hardware shaped the development of programming languages. We then followed up in part two with the impact of the software created by those languages. In this final blog in the series, we’ll look at how programming has taken the leap beyond computers into devices, with an emphasis on how programming is rewriting the rules of commerce, banking, and finance.

Technology in Society Over the Past Decade
Society as a whole has adopted new technology with great enthusiasm, and the pace of that adoption has accelerated due to a few key developments over the past ten years or so. The Pew Internet Project has been keeping a close watch on the demographics of the internet. It reported that at the turn of the millennium, 70 percent of young adults and just 14 percent of seniors were online. That’s still the general perception of internet users, but it’s no longer accurate. In 2016, nearly all young adults (96 percent) and a majority of those over 65 (58 percent) are online.

The single biggest driver of internet access growth has been mobile devices. Simple cell phone ownership among American adults went from around 70 percent in 2006 to 92 percent a decade later. Smartphone ownership, which brings with it the vast data-crunching resources of the app ecosystem, went from 35 percent just five years ago to 68 percent today. Tablets have followed a similarly explosive trajectory, going from 3 percent ownership in 2010 to 45 percent today.

This growing hunger for mobile devices required exponentially more data processing power and a vast leap in traffic across wireless networks. The growth can only be described in terms of zettabytes (a zettabyte is a trillion gigabytes). In 2009, the world had three-quarters of a zettabyte under management. One year later, data generated across networks had nearly doubled to 1.2 ZB, most of it enterprise traffic. By the end of last year, there were 7.9 ZB of data generated, with 6.3 ZB under management by enterprises. In the next four years, you can expect 35 ZB of data to be created by devices, with 28 ZB managed by enterprises.

Developers have had to work furiously to restructure their approach to software, databases, and network management tools just to avoid being swamped in all this data.

A Brief History of Financial Record Keeping
In the world of commerce, the data that matters most all relates to financial records. These define the health of the business today and the potential for growing the customer base in the future. That’s why financial data has become ground zero in the war between cybercriminals and data-security experts.

In the 1980s, the financial industry was still dominated by mainframes. The personal computer revolution sweeping the rest of the business world largely passed finance by. Huge mainframes serving comparatively simple clients were the only practical way to manage data sets that were vast relative to the processor speeds of the time. To get the financial answers needed for sound business decisions, you might have had to write COBOL programs or SQL queries against a DB2 database, a far cry from what is possible today. Mainframes were normally closed systems, with applications written specifically for them by outside consultants.
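Those mainframe-era queries would look familiar to anyone writing SQL today. As a rough illustration, here is the same style of question asked of a modern engine, using Python’s built-in sqlite3 in place of DB2 and an invented ledger table:

```python
import sqlite3

# Hypothetical ledger table standing in for a DB2 financial database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ledger (account TEXT, quarter TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO ledger VALUES (?, ?, ?)",
    [("sales", "Q1", 120000.0), ("sales", "Q2", 95000.0),
     ("expenses", "Q1", -80000.0), ("expenses", "Q2", -70000.0)],
)

# The kind of aggregate question once asked of a mainframe:
# net position per quarter.
rows = conn.execute(
    "SELECT quarter, SUM(amount) FROM ledger GROUP BY quarter ORDER BY quarter"
).fetchall()
print(rows)  # [('Q1', 40000.0), ('Q2', 25000.0)]
```

The SQL itself has barely changed since the DB2 era; what changed is that a query like this now runs on commodity hardware in milliseconds.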

All that changed dramatically in the 1990s, with the growth of faster servers, open systems, and the connectivity of internet protocols. Mid-sized computers for business gained immense processing power at lower costs. Mainframes began to be repurposed for back-end processing of transaction data as the finance industry consolidated batch-processing projects like billing.

Computers like the IBM AS/400, which had previously run only IBM proprietary software, gained the ability to run financial software like SAP, PeopleSoft, and JD Edwards. By the late 1990s, the appearance of Linux and virtual machines running inside mainframes opened up the entire finance sector to a flurry of new open-source development projects.

Simultaneously, network connectivity to the internet and then the web exposed financial data providers to a new threat from outside: hackers. Before, password management and inside jobs were the biggest threats to financial data security. Connectivity opened a window to a new generation of cybercriminals.

Programming Challenges for Data Security
In the weeks after Black Friday in 2013, as the holiday shopping rush was in full swing, a data-security specialist announced that the retail chain Target had been breached. His report warned that the breach was serious, “potentially involving millions of customer credit and debit card records.” There had been attacks on companies before, but this one netted financial data on 40 million shoppers during the busiest retail period of the year.

The biggest problem is that attacks like these are increasing in both intensity and sophistication. In 2016, attacks probing IoT connections for vulnerabilities jumped 458 percent. Meanwhile, another front has opened up for enterprises: employee mobile devices. Last year alone there were over 8 billion malware attacks, twice the number of the year before, most of them targeting weaknesses in the Android ecosystem. As for the data hackers want most, healthcare businesses registered slightly more attacks than even those in the financial industry.

Data-security experts have to stay ahead of risks from both the outside and the inside, whether they are malicious or accidental. Both can be equally devastating, regardless of intent.

Ian McGuinness recommends six steps for security experts to help them concentrate on covering as many vulnerabilities as possible early on, before moving on to custom development:

  1. Protect the physical servers by tightly limiting access to a short list of only those employees who need them for their work. Make sure the list is updated regularly.
  2. Create a network vulnerability profile. Assess any weak points, update antivirus software, test firewalls, and change TCP/IP ports from default settings.
  3. Protect every file and folder that captures database information like log files. Maintain access levels and permissions dynamically.
  4. Review which server software and upgrades are actually necessary. Every feature and service not in common use provides an attack vector for cybercriminals.
  5. Find and apply the latest patches, service packs, and security updates.
  6. Identify and encrypt your most sensitive data, even if it resides on the back end with no interface to end users.
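
Several of these steps are easy to automate. As a minimal sketch of step 3, assuming a POSIX system and a hypothetical log directory, here is one way to flag files that any local user can read:

```python
import os
import stat

def world_readable_files(directory):
    """Return files under `directory` that any local user can read."""
    flagged = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            mode = os.stat(path).st_mode
            if mode & stat.S_IROTH:  # the "other" read bit is set
                flagged.append(path)
    return flagged

# Example: audit a hypothetical database log directory.
# for path in world_readable_files("/var/log/dbserver"):
#     print("world-readable:", path)
```

A scheduled job that runs a check like this and alerts on new findings covers the “maintain access levels dynamically” half of that step.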

These are really just the basics, though. Monitoring network traffic, recognizing malicious code, and responding in time to make a difference represent the biggest challenges for the future.
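
In practice, traffic monitoring often starts with a simple statistical baseline: flag any sample that deviates sharply from recent history. A toy sketch, with invented request counts and a conventional three-sigma threshold:

```python
from statistics import mean, stdev

def anomalous(samples, current, threshold=3.0):
    """Flag `current` if it sits more than `threshold` standard
    deviations above the mean of the baseline `samples`."""
    mu = mean(samples)
    sigma = stdev(samples)
    return sigma > 0 and (current - mu) / sigma > threshold

# Requests per minute over a quiet baseline window, then a spike.
baseline = [510, 495, 505, 498, 502, 500, 497, 503]
print(anomalous(baseline, 900))  # True: sudden spike
print(anomalous(baseline, 508))  # False: within normal variation
```

Real monitoring systems layer far more sophistication on top, but the core idea of comparing live traffic to a learned baseline is the same.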

What’s Next for FinTech
Programming for financial technology (FinTech) is among the most exciting, fastest changing areas of IT right now. Startups involved in online or peer-to-peer payments, wealth management, equity crowdfunding, and related innovations were able to bring in $19 billion in investment in the past year alone. The White House signaled its support for the contribution of financial software to the greater economy in saying, “Technology has always been an integral part of financial services — from ATMs to securities trading platforms. But, increasingly, technology isn’t just changing the financial services industry, it’s changing the way consumers and business owners relate to their finances, and the way institutions function in our financial system.”

On the road ahead, among the top challenges for FinTech developers will be:

  • The creation of original processes for secure access to financial data through mobile platforms
  • The integration of blockchain capabilities into enterprise financial systems
  • A secure, real-time global transaction system that can automatically adjust to currency fluctuations
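
The third item, for example, boils down to repricing each transaction at settlement with whatever rates are current. A minimal sketch, assuming a hypothetical feed of mid-market rates (the rates and pairs below are invented):

```python
from decimal import Decimal, ROUND_HALF_EVEN

# Hypothetical mid-market rates to USD at settlement time.
RATES_TO_USD = {"EUR": Decimal("1.08"), "GBP": Decimal("1.27"), "USD": Decimal("1")}

def settle(amount, currency, rates=RATES_TO_USD):
    """Convert `amount` in `currency` to USD, rounding to cents."""
    usd = Decimal(amount) * rates[currency]
    return usd.quantize(Decimal("0.01"), rounding=ROUND_HALF_EVEN)

print(settle("250.00", "EUR"))  # 270.00
```

Using Decimal rather than binary floats matters here: financial code must round predictably to the cent, which floating-point arithmetic cannot guarantee.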

The FinTech panel at one recent AppDynamics event concluded that:

“All banks have the same problems, but the capabilities to solve these problems have changed. Banks are taking different approaches, but the endgame is the same, making sure the customers can access their money when they want and how they want.”

Imagining the Future
The technologies developed to handle these difficulties have much wider applications for programmers in 2020 and beyond. The bigger picture is that the coming age of ubiquitous connectivity will require much greater enterprise commitment to infrastructure maintenance and software performance. As the IoT brings machine-to-machine (M2M) networking to our homes and cars, it will also demand a vastly higher bar for user experience. Fortunately, those embracing the latest DevOps best practices are uniquely qualified to approach these problems with one eye on what customers expect and another on what will keep the business flourishing.

The post Code Compiled: A Short History of Programming — Part III appeared first on Application Performance Monitoring Blog | AppDynamics.

