@Equate #OnSoftwareArchitecture

Boundaryless Organization

Conduct is an IT platform intended, on the one hand, to greatly simplify the creation and maintenance of data flows between departments, levels and functions within an organisation and, on the other hand, to greatly simplify the processing of data. Conduct does this by removing the dependency on IT changes: changes to data flows and to data processing can be carried out without involving the IT organisation. Conduct does so without losing sight of data security.

Organizations that choose to move toward the Boundaryless Organization to improve their operational effectiveness find that Information Technology resists the change.

In general, then, an organisation encounters resistance from the IT organisation when trying to realise a boundaryless organisation. With Conduct, departments, functional groups and other teams within an organisation get their own data lake. The teams decide for themselves which data is shared with whom and what is done with that data. Sharing and processing the data requires no effort from IT.

In this way Conduct creates a boundaryless information flow within an organisation. The method was derived, among other things, from the Boundaryless Information Flow architecture framework formulated at the time by The Open Group to reduce resistance, lower barriers and thereby support boundaryless organisations.

Once the barriers have been removed, an organisation has full access to the data it needs and can improve itself as it sees fit, in order to follow market developments faster and ultimately stay ahead of the market.

Getting the most out of your big data

Gartner

In its article “Organizing for Big Data Through Better Process and Governance”, Gartner describes how big data should be organised. Big data can yield benefits in operational management, customer management, risk management and business model innovation. Yet, compared with the attention it receives, big data barely gets off the ground: most initiatives never get past the prototype stage, and the road to operational systems is often still long.

In the same article Gartner argues that big data should be treated as a process consisting of four steps: the first two are concerned with exploration, the last two with exploitation.

In step 1, ideas are collected, inductively or deductively. In step 2, these ideas are tested by applying them in experiments. If an idea survives step 2, a business case is drawn up in step 3, weighing value against cost. Finally, in step 4, the ideas must be implemented, and that is where things usually stall. Gartner notes that the distance between the big data teams and IT is too large, because ideas have to be translated into a form that IT can implement. According to Gartner, the use of DevOps teams would be a solution.

Conduct

AvenQure offers a solution that makes step 4, and with it step 3, superfluous. It delivers that solution in the form of its product Conduct, a process automation platform. Conduct provides a platform on which one can experiment with big data freely: on the one hand it is non-intrusive by nature, so operational processes are not affected; on the other hand it automates processes implicitly, taking care of quality aspects such as security, performance, availability and, consequently, maintainability. This makes organising big data a very flexible process, and in the same article Gartner confirms that flexibility is exactly what organising big data requires.


Conduct offers a number of mechanisms that are essential for working with big data.

  • Conduct works with contexts. A context is a domain within which similar data is processed. With the push of a button, a copy of a context can be created in which one can experiment freely.
  • Conduct works with cells. Each cell contains a piece of logic. Let two cells run side by side, one based on business-as-usual (BAU) logic and the other adapted to the insights gained from big data, and create a third cell to evaluate the differences (see the sketch below this list).
  • Conduct works with 3rd party big data analysis products.
  • Within Conduct, all data can be exposed RESTfully with the push of a button, making integration with 3rd party products trivial. Data can just as easily be made available through file exchange or database replication.
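To make the context/cell mechanism a bit more tangible, here is a minimal sketch in Python. It only illustrates the idea; Context, Cell and their methods are invented names for this example and do not reflect the actual Conduct API.

    # Hypothetical sketch of the context/cell experiment described above.
    # Context, Cell and their methods are illustrative names only.

    class Cell:
        def __init__(self, name, logic):
            self.name = name
            self.logic = logic                 # a function: record -> result

        def run(self, records):
            return [self.logic(r) for r in records]

    class Context:
        """A domain within which similar data is processed."""
        def __init__(self, name, records):
            self.name = name
            self.records = records

        def clone(self, suffix="-experiment"):
            # "Push of a button": copy the context to experiment freely.
            return Context(self.name + suffix, list(self.records))

    def bau_logic(record):
        return record["amount"] * 1.00               # business-as-usual

    def big_data_logic(record):
        return record["amount"] * record["risk"]      # adapted to new insights

    production = Context("claims", [{"amount": 100, "risk": 0.8}])
    sandbox = production.clone()

    cell_bau = Cell("bau", bau_logic)
    cell_exp = Cell("experiment", big_data_logic)
    # A third "cell" evaluates the differences between the two.
    diffs = [a - b for a, b in zip(cell_bau.run(sandbox.records),
                                   cell_exp.run(sandbox.records))]
    print(diffs)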

Conduct thus provides a short feedback loop: ideas can be tried out immediately, which makes it possible to embed the big data team in the business unit, without any dependency on IT.

Complaisance, Care Duty and Ownership of Artefacts

Having read several papers on artefact-centric business process modelling we have found two approaches [1]: (1) artefacts as finite state machines and (2) focusing on the life cycle of artefacts. Both approaches require business analysts to validate the artefact model. We strive, however, for a situation in which not the business analyst but the business specialist is in control. The business analyst no longer plays a role, or at most an advisory one.
The business specialist must make his/her artefacts complaisant to other artefacts. Complaisance naturally seeks a balance between leanness and richness: leanness is required to make data available quickly, richness to make as much data as possible available at once. At design time, artefact modellers will look for the primary source of data in order to transition quickly to a new state, but they will also try to obtain as much data as possible in one go, to limit the number of states and the number of events the artefact subscribes to. Remember that artefacts subscribe to the events that are of interest to them; these events carry the information of the artefact that triggered them.
Primary sources are artefacts that obtain their data from the end user. There should always be exactly one primary source. If a (primary) source is not visible to an artefact, the source may not exist, or the authorisation on the source may be limited. In that case the authorisation may be altered to grant read access, or an intermediary artefact may be created whose sole purpose is to make the data available to other artefacts.
One may also decide to use intermediary artefacts as an interface to other domains. For example, the financial controllers in a company will by default block access to financial artefacts; only data that has been approved for disclosure may be shared with other artefacts. To avoid having to alter the ACLs of all primary sources in order to disclose the data, a special intermediary artefact is created that holds only the data that may be disclosed.
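As a minimal illustration of such an intermediary artefact, consider the following Python sketch. The field names, the subscription mechanism and the "disclosable" set are assumptions made for this example, not part of Conduct or of the cited literature.

    # Sketch of an intermediary artefact acting as the interface to the
    # financial domain. Field names and the event mechanism are illustrative.

    class FinancialArtefact:
        """Primary source: obtains its data from the end user; access is blocked by default."""
        def __init__(self):
            self.data = {"invoice_total": 1200, "margin": 0.35, "supplier": "ACME"}
            self.disclosable = {"invoice_total", "supplier"}   # approved for disclosure
            self.subscribers = []

        def publish(self):
            payload = {k: v for k, v in self.data.items() if k in self.disclosable}
            for artefact in self.subscribers:
                artefact.on_event("financials-updated", payload)

    class DisclosedFinancials:
        """Intermediary artefact: holds only the data that may be disclosed."""
        def __init__(self):
            self.data = {}

        def on_event(self, event, payload):
            # Other domains read from this artefact instead of the primary source,
            # so the ACLs of the primary sources never have to be widened.
            self.data.update(payload)

    source = FinancialArtefact()
    facade = DisclosedFinancials()
    source.subscribers.append(facade)
    source.publish()
    print(facade.data)   # {'invoice_total': 1200, 'supplier': 'ACME'}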

Therefore, in our case, a business specialist being in control means that he/she must be able to make the artefact complaisant to other artefacts. This requires that the artefact is taken care of, which in a changing world means constant care. The business specialist has a care duty and must therefore own the artefact.

This brings us to the tuple: complaisance, care duty and ownership. An example implementation of an artefact system was developed by AvenQure [2].

 

1. Gerede, C.E.: Modeling, Analysis, and Composition of Business Processes. PhD thesis, Dept. of Computer Science, University of California at Santa Barbara (2007)

2. "Conduct by AvenQure | Non-intrusive Workforce Empowerment.", http://avenqure.com/conduct/, AvenQure, May 2016. Web. 18 May 2016.

Artifact as the Plug for BPM

Silly me. In my previous post The Hole in BPM, I concluded that BPM lacked the concept of a business object or document. Little did I know that IBM had already introduced the concept of artifact-centric business process modelling in 2007 [1]. Artifacts are "business-relevant objects that are created, evolved, and (typically) archived as they pass through a business". Artifacts have an information model describing their data, i.e. their properties, and a lifecycle model describing the states an artifact can go through. The lifecycle is specified in an event-condition-action and/or condition-action style [3]. Indeed it seems that artifacts are the plug for the hole in BPM.

The idea of artifacts passing through a business [1] can be interpreted as a workflow mechanism in which, depending on the state, people do or do not have access to the artifact, the state being the equivalent of the condition mentioned above. Depending on its state, an artifact is interested in only a limited set of events. After each action the state is re-evaluated, and so is the set of events; that set is determined by the actions that are appropriate for the state. In other words, when an action has been executed, the state is reconsidered. This also implies that the state does not change halfway through an action and that the action is considered atomic.

An artifact must contain a business rule to determine its state, the state business rule. This rule is evaluated after each action; it inspects the properties of the artifact to determine the state. Each action must specify during which states it may be executed, so the state is a pre-condition of the action. In addition to the state, an action has another type of pre-condition, namely its access control list (ACL). For an action, its ACL is, simply put, the set of those who may execute it. The ACLs of all actions together form the artifact's ACL.
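The following Python sketch illustrates this arrangement: a state business rule that is re-evaluated after each action, with the state and the ACL acting as pre-conditions. The artifact, its properties, the roles and the actions are all invented for the example.

    # Illustrative only: the state business rule, the pre-conditions
    # and the roles below are invented for this example.

    class OrderArtifact:
        def __init__(self):
            self.properties = {"items": [], "approved": False, "shipped": False}
            self.state = self.state_rule()

        def state_rule(self):
            # The state business rule: inspect the properties to determine the state.
            if self.properties["shipped"]:
                return "closed"
            if self.properties["approved"]:
                return "approved"
            return "draft"

        def execute(self, action, user):
            # Pre-conditions: the artifact's state and the action's ACL.
            if self.state not in action["states"]:
                raise PermissionError(f"not allowed in state {self.state}")
            if user not in action["acl"]:
                raise PermissionError(f"{user} may not execute {action['name']}")
            action["do"](self.properties)          # the action is considered atomic
            self.state = self.state_rule()         # re-evaluate the state afterwards

    approve = {"name": "approve", "states": {"draft"}, "acl": {"manager"},
               "do": lambda p: p.update(approved=True)}
    ship    = {"name": "ship", "states": {"approved"}, "acl": {"warehouse"},
               "do": lambda p: p.update(shipped=True)}

    order = OrderArtifact()
    order.execute(approve, "manager")
    order.execute(ship, "warehouse")
    print(order.state)   # closed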

In [2] the actions, or services as they are called, are specified declaratively using input parameters, output parameters, pre-conditions and post-conditions. Following the reasoning above, an action has the following pre-conditions:

  1. state of the artifact
  2. set of events
  3. its ACL

For read-only actions, the post-condition equals the pre-condition. Otherwise, we expect the post-condition to be defined with the action.
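To give an impression of what such a declarative specification could look like in code, here is a small Python sketch. It does not follow the exact formalism of [2]; the parameter names and conditions are assumptions for illustration only.

    # Declarative description of an action with the three pre-conditions
    # identified above. Names and conditions are illustrative.

    from dataclasses import dataclass, field
    from typing import Callable, Set

    @dataclass
    class ActionSpec:
        name: str
        inputs: Set[str]                      # input parameters
        outputs: Set[str]                     # output parameters
        states: Set[str]                      # pre-condition 1: state of the artifact
        events: Set[str]                      # pre-condition 2: set of events
        acl: Set[str]                         # pre-condition 3: who may execute it
        post: Callable[[dict], bool] = field(default=lambda props: True)

    approve = ActionSpec(
        name="approve",
        inputs={"approver"},
        outputs={"approved"},
        states={"draft"},
        events={"approval-requested"},
        acl={"manager"},
        post=lambda props: props.get("approved") is True,   # post-condition
    )

    # A read-only action: the post-condition equals the pre-condition,
    # i.e. the properties are left untouched.
    view = ActionSpec("view", {"requester"}, {"snapshot"},
                      {"draft", "approved", "closed"}, set(), {"employee"})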

Although [1] does not mention ACLs, nor does it mention transactionality, it does appear that artifacts can indeed plug the hole in BPM. More on ACLs, transactionality and inter-artifact relations in a next post.

 

  1. K. Bhattacharya, N. S. Caswell, S. Kumaran, A. Nigam, and F. Y. Wu. Artifact-centered operational modeling: Lessons from customer engagements. IBM Systems Journal, 46(4):703–721, 2007.
  2. Deutsch, A., et al., Automatic verification of data-centric business processes, in Proceedings of the 12th International Conference on Database Theory. 2009, ACM: St. Petersburg, Russia.
  3. http://en.wikipedia.org/wiki/Event_condition_action

Business Processes should not be modelled

In a previous post I argued that BPM lacks the concept of the data object. I argued that people tend to communicate with other parties based on forms, which effectively represent a data object. Consequently, it can be said that when people think of their business, they think of the business data objects, or simply business objects, and of when and how these objects are used. Therefore, design and modelling should take place at the level of the business object. So a business object has relations with events (when is the object used), with its roles (how is the object used) and, through its roles, with other objects (who uses the object). The latter is the field of Business Object Relationship Modelling, which I wish to explore in a next blog post.

The point is that modelling should focus on the business object and its direct surroundings: the events, its roles and its neighbouring objects. This is how people think about their business. Put all of these business objects together and one will be able to see the business processes, i.e. process mining. People do not think, up front, about (lengthy) process chains. Processes do provide valuable information and can help, for example, in determining the cause of stagnation.
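As a small sketch of "the object and its direct surroundings", the following Python fragment describes a business object purely through its events, roles and neighbouring objects. The object and all its details are invented examples.

    # Illustrative sketch: a business object described only by its direct
    # surroundings (events, roles, related objects). All names are invented.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class BusinessObject:
        name: str
        events: List[str] = field(default_factory=list)      # when is the object used
        roles: Dict[str, str] = field(default_factory=dict)  # how is it used, and by whom
        related: List[str] = field(default_factory=list)     # neighbouring objects, via the roles

    purchase_order = BusinessObject(
        name="PurchaseOrder",
        events=["created", "approved", "delivered"],
        roles={"buyer": "creates and amends", "supplier": "confirms and delivers"},
        related=["Invoice", "GoodsReceipt"],
    )

    # Put enough of these objects together and the end-to-end processes become
    # visible afterwards (process mining), rather than being modelled up front.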

The Hole in BPM

Business process modelling (BPM) has been around for quite some time now. At the conceptual level, BPM (together with its little sister, the business rules engine (BRE)) has the potential to be the 4GL that companies have been looking for.

Business process modelling is an activity to represent the processes of an enterprise [1], any process. Yet in reality BPM is used very little. It is used mostly in formal business domains, i.e. where legal constructions with customers or business partners are concerned. However, by far most of the processes an enterprise has concern day-to-day activities between employees. And those employees still use forms and e-mails to communicate with each other.

All departments have periods in their existence when they are overwhelmed by the amount of work that is expected from them. In an attempt to structure the incoming work, departments draw up intake forms that people outside the department must fill in to get something done. Although this structures the incoming work, it creates a lot of overhead for the requesters. These forms are the bricks that build the walls between departments.

This is where BPM would be extremely useful. It is therefore, at first glance, unbelievable that BPM is not used in such situations. Zooming in on the situation reveals that these forms require a lot of information from the requester. And although the information is present somewhere in the enterprise, it is difficult to gather. Most often the information can be found in a configuration management database, in the company's employee directory, or in project documentation. The problem with these information sources is that they do not structure the information such that it can be re-used easily. And even if one manages to enter the information into a process in a BPM tool, the information is lost once the process completes, or even before. So, unless databases are created explicitly to store the information, and careful thought is given to the nature and structure of that information, the information is lost.

This is where BPM has left a gap: BPM should have added the concept of a data object. An object is mutated by tasks during processes when events occur. With the notion of the object added to BPM, it becomes very easy to re-use the information to fill in the aforementioned intake forms. Departments can create forms at will, using the enterprise object dictionary to specify the information required. Information that is already available in the enterprise is added automatically, in real time if needed, and is therefore always up to date.
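A small sketch of the idea, assuming a hypothetical enterprise object dictionary from which a department's intake form pulls the fields it needs; the dictionary contents and field names are invented.

    # Sketch: an intake form that specifies its fields via the enterprise
    # object dictionary and is pre-filled automatically. All data is invented.

    object_dictionary = {
        "Employee": {"name": "J. Jansen", "department": "Finance", "manager": "P. de Vries"},
        "Server":   {"hostname": "srv-042", "os": "Linux", "owner": "Finance"},
    }

    def build_intake_form(required):
        """required maps a form field to (object type, property) in the dictionary."""
        form = {}
        for form_field, (obj, prop) in required.items():
            # Information already available in the enterprise is added automatically;
            # only the remaining fields need to be filled in by the requester.
            form[form_field] = object_dictionary.get(obj, {}).get(prop, "<to be filled in>")
        return form

    access_request = build_intake_form({
        "requester":  ("Employee", "name"),
        "department": ("Employee", "department"),
        "server":     ("Server", "hostname"),
        "reason":     ("FreeText", "reason"),    # not in the dictionary: requester fills it in
    })
    print(access_request)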

Hence, without the concept of the data object being part of BPM, BPM will not find its way into the day-to-day activities of employees. In upcoming posts I will give my thoughts on how Business Object Relationship Modelling (BORM) and artifact-centric business process modelling can fill the hole in BPM.

1. http://en.wikipedia.org/wiki/Business_process_modeling
