Published on 23/07/2015 | Written by Beverley Head
Without a radical rethink of enterprise IT, the data that emerges from sensors and consumer-focused services will become a burden on the organisation rather than a valuable commodity…
The volume of data that an organisation needs to manage, manipulate and exploit goes up by orders of magnitude every time a sensor is switched on.
Meanwhile, consumer-focused services have flipped the paradigm: instead of a bank sending out a statement once a month, the customer now leaps online three times a day to check their balance – creating screeds of data and unprecedented transaction volumes.
According to Matt Green, vice president of product management for Software AG, who is in Australia this week, this demands a fresh approach to data management and systems design in which flexibility and agility are paramount but are always backed by robust data management procedures.
The sentiment was echoed by Gartner vice president and fellow Yefim Natis, speaking at this week’s Application Architecture, Development and Integration Summit in Sydney, who said: “If you don’t know what to do with big data, it becomes a burden.”
Both maintained that data needed to be properly collected and curated – but also made available to an ever-changing range of applications and services to meet shifting consumer and enterprise expectations.
Green said that Australian enterprises are among the front-runners in making this transition, spurred in part by the nation’s high rate of smartphone adoption. Star Casino, for example, had pioneered consumer-facing applications more sophisticated than those in Las Vegas, he said.
Loyalty-focused applications that depended on access to the casino’s data reserves had been rolled out, he said. Instead of developing siloed, single-purpose applications, Star had deployed an integration layer and an analytics layer, allowing flexibility at the front end for consumers and efficiency at the back end where data was accessed.
In his presentation at the Gartner summit, Natis advocated more of this flexible approach to systems design.
“The old architecture falls short … old models are obsolete and an obstacle to adopting the new technologies. They are monolithic – not designed to be horizontally scaled and integrated – and are created as stovepipes.”
Nor, he said, were older applications designed to exploit parallel or in-memory computing, which will be needed to take full advantage of the data resources now being collected.
“All that slows your ability to grow and adopt the most recent innovations,” said Natis.
Instead, he advocated that enterprises start to explore software-defined application services, which bring together three layers – applications, an interface control gateway, and service implementations – with each layer connected using application programming interfaces (APIs). Using two layers of APIs injects flexibility at the front end: different applications link to the gateway, which then uses a second set of APIs to connect to the enterprise services (and data) those applications need.
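To make the two-layer idea concrete, here is a minimal Python sketch of the pattern Natis describes: front ends call an interface control gateway, and the gateway applies policy before calling the enterprise service APIs behind it. All names and the policy table are hypothetical, for illustration only.

```python
# Minimal sketch of a two-layer API design (hypothetical names, illustration only).
# Layer 1: front-end apps call the gateway's API.
# Layer 2: the gateway calls enterprise service APIs, applying access policy in between.

# Back-end enterprise services, each exposing its own API (layer 2).
def loyalty_service(customer_id: str) -> dict:
    return {"customer_id": customer_id, "points": 1200}

def account_service(customer_id: str) -> dict:
    return {"customer_id": customer_id, "balance": 84.50}

# Policy table: which front ends may reach which back-end services.
POLICY = {
    "mobile_app": {"loyalty", "account"},
    "kiosk":      {"loyalty"},
}

SERVICES = {"loyalty": loyalty_service, "account": account_service}

# Interface control gateway (layer 1): the only entry point for front ends.
def gateway(front_end: str, service: str, customer_id: str) -> dict:
    if service not in POLICY.get(front_end, set()):
        raise PermissionError(f"{front_end} may not call {service}")
    return SERVICES[service](customer_id)

# A mobile app composes two back ends through the same gateway.
print(gateway("mobile_app", "loyalty", "c-42"))
print(gateway("mobile_app", "account", "c-42"))
```

The point of the split is that adding a new front end only requires a new policy entry at the gateway; the back-end services themselves remain untouched.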
With that in mind, Natis outlined his “12 principles of application architecture”, intended to deliver the flexibility the modern business environment demands, along with the rigour to ensure that data is both properly used and protected.
- Services are the starting point – enterprises require a service-oriented architecture to allow multiple front ends and composite applications, where one front end accesses multiple back ends.
- Virtualise application services – this allows the interface control gateway to handle access (determined by policy) to back-end services and data. (According to Natis, this is where tools from Software AG, MuleSoft, Dell Boomi, Red Hat, Apigee and the like can be deployed.)
- Guard your data. Services need to be graded, with access determined by enterprise policy, and all access to data should be via APIs. Natis said that was the approach taken by Salesforce to protect data and control access: “With the Salesforce DBMS, if you read the data you make no sense of it – the only way to make sense of it is to go through their APIs.” (A minimal sketch of this API-only approach appears after this list.)
- Adapt to blurred boundaries by developing a series of microservices that can be served up in different ways. A little like Lego, this packages software into transportable units that can be reused in multiple contexts. Enthusiasm for microservices is one of the trends driving demand for Docker, which bundles a microservice’s API, application code, application data, libraries and infrastructure code into a single portable logical unit that can be used in a wide variety of ways, as long as policy permits.
- Design your data model with the “CAP” theorem in mind, understanding that a distributed system cannot guarantee consistency, availability and partition tolerance all at the same time – when a partition occurs, one of the first two has to give. Determine priorities and develop accordingly (see the sketch after this list).
- Protect data integrity by creating privileged data services. One size does not fit all.
- Think cloud when developing back-end applications – not because everything is going to the cloud, but because your applications will just work better.
- Think mobile when developing end-user applications – it is simply a better way to approach design and user interfaces, and it lends itself to fast, frequent replacement.
- Think integration – nothing should be designed only for its initial use; there will always be some new scenario where you might want to reuse it.
- Think webscale – applications need to be scalable, agile and elastic. This will eventually require a move to in-memory computing to combine transactions and analytics, extracting more insight from data without needing to duplicate it in a data warehouse, for example.
- Use event processing. Being fully decoupled delivers more freedom, according to Natis (see the event-broker sketch after this list).
- Automate context mining, which again makes better use of the enterprise’s data reserves.
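On the “guard your data” principle, the sketch below (hypothetical names and fields, not Salesforce’s actual mechanism) shows the idea of data that is only intelligible through its API: the raw records stay private, and a graded, policy-checked read method is the only way in.

```python
# Minimal sketch of "all access to data goes through an API" (hypothetical names).
# The raw records are kept private; callers only ever see what the graded,
# policy-checked API method chooses to expose.

class CustomerDataService:
    def __init__(self):
        # Raw store: opaque to callers, never returned directly.
        self._records = {"c-42": {"name": "A. Smith", "balance": 84.50, "tier": "gold"}}
        # Grading: which fields each access level may read.
        self._policy = {"marketing": {"tier"}, "support": {"name", "tier", "balance"}}

    def read(self, access_level: str, customer_id: str) -> dict:
        allowed = self._policy.get(access_level, set())
        record = self._records.get(customer_id, {})
        # Only the fields the policy grants are ever released.
        return {k: v for k, v in record.items() if k in allowed}

svc = CustomerDataService()
print(svc.read("marketing", "c-42"))   # {'tier': 'gold'}
print(svc.read("support", "c-42"))     # the fuller, but still graded, view
```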
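The CAP trade-off behind the data-model principle can be illustrated with a toy replicated value (entirely hypothetical): when the link between replicas is cut, a consistency-first (“CP”) store refuses requests, while an availability-first (“AP”) store keeps answering and accepts that reads may be stale.

```python
# Toy sketch of the CAP trade-off (hypothetical): two replicas of one value,
# with a flag simulating a network partition between them.

class ReplicatedValue:
    def __init__(self, mode: str):
        self.mode = mode            # "CP" or "AP"
        self.replicas = [0, 0]      # two copies of the same counter
        self.partitioned = False    # True = replicas cannot talk to each other

    def write(self, replica: int, value: int) -> bool:
        if self.partitioned and self.mode == "CP":
            return False            # consistency chosen: reject the write
        self.replicas[replica] = value
        if not self.partitioned:
            self.replicas[1 - replica] = value   # replicate while the link is up
        return True                 # AP: accept locally, reconcile later

    def read(self, replica: int) -> int:
        if self.partitioned and self.mode == "CP":
            raise RuntimeError("unavailable during partition")
        return self.replicas[replica]            # AP: possibly stale, but available

store = ReplicatedValue("AP")
store.write(0, 1)
store.partitioned = True
store.write(0, 2)                  # accepted on replica 0 only
print(store.read(1))               # prints 1 – stale but available (the AP choice)
```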
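Finally, the decoupling Natis attributes to event processing can be sketched with a tiny in-process publish/subscribe broker (hypothetical; a real deployment would use a proper message broker): the producer knows only the topic, so consumers can be added or removed without touching the systems that emit the events.

```python
# Minimal sketch of event-driven decoupling (hypothetical names).
from collections import defaultdict

class EventBroker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()

# Two independent consumers of the same event stream.
broker.subscribe("card.swipe", lambda e: print("loyalty: award points for", e["customer_id"]))
broker.subscribe("card.swipe", lambda e: print("analytics: record visit by", e["customer_id"]))

# The producer only knows about the topic, not who is listening.
broker.publish("card.swipe", {"customer_id": "c-42", "venue": "main floor"})
```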