Ongoing loss of agility

Do you remember Wall-E and the Axiom? If not, then you may want to go and watch the movie before reading on …

What comes to mind when thinking about the humans on the Axiom? They are harmonious, homogeneous in lifestyle, well-functioning and polite. They follow their daily routines and somehow enjoy their time on the ship. The ship has taken control of their lives and organizes everything for them. Everything is predictable, under control and automatically managed 24×7.

An interesting question is whether such a comfortable state is really desirable. Intelligence is the ability to acquire and apply knowledge and skills in unexpected situations. But what happens if intelligence is actually no longer required in daily life? The Axiom could stimulate the humans by creating surprises and challenges, but the people would soon figure out that this is only a game and does not really matter. If typical human skills are no longer required, what motivates people? What is the value and position of humans in such an environment?

The situation on the Axiom is an extreme state. But the question whether we are progressing in that direction is an interesting one.

It feels as if the civilised world wants a “constant” that makes everything predictable and eliminates negative surprises. But such events are often not positive or negative on their own; it is how people react to them that matters. If a system like the Axiom takes care of everything, will human intelligence, skills, or even human value degenerate?

When people are in such a comfortable state, will they still try or attempt anything new, fearing to lose what they already have? Could the evolution of our society bring, as a consequence, a complete loss of agility? Are we already on the way to such a state, caught in daily routines defined by others?

Need to break free from legacy ….

Interaction with the environment is an essential ingredient of success in the mesh economy of the age of information. Many companies seem to struggle with this as they cannot let go of the past. The legacy defines what is possible and also what success means. Activities are oriented inside-out, dominated by the inherent constraints.

One reason might be that many organizations have lost the sovereignty to move forward. For quite some time organizations were optimized by means of business process outsourcing. Near- and offshoring, it was believed, would bring efficiency. That may have been true in the age of industrialization. But it is not true anymore in the age of information, where lean and agile structures, skills and technology make the difference while repetitive tasks are automated. The approach may have provided a bit of short-term profit polishing but unfortunately resulted in a longer-term strategic disadvantage as skills and know-how moved to the sourcing partner.

Another reason might be the way business cases are used today. Any change requires a business case. This was relatively simple in the slow-changing age of industrialization, and it still works well for small changes today. It does not work well in the world of VUCA.

In addition, organizational dynamics do not help with this approach. One’s own business case has to be better than most others to stand a chance in the internal competition for funding, so it gets optimized a bit. Since everyone knows that, all the cases are being tuned. Management sees the number of excellent business cases as a luxury problem – it feels good to have such a choice.

On the flip side, the decision process gets difficult and tedious. During all these processes and procedures, a lot of time passes. By the time decisions are finally made, the environmental conditions have changed, and everything should actually start over. But that would be too painful, so the execution of the initiatives starts anyway. At some point after the project start, management realizes that the projects are a little off the mark; the organization re-prioritizes and the internal bureaucracy starts again. The ego of the typical manager then amplifies the effects – a great manager cannot have done anything wrong – which brings the whole theater to the next round.

Bigger changes are hard to get through – the assumptions feel less intuitive and the risks are perceived to be higher. Hence small incremental changes are preferred and the core fundamental ones are continuously pushed down in priority. The company’s capacity for change slows down, and the cultural legacy is soon joined by technical legacy. Some people like this stage as it offers the opportunity to launch simplification programs, which of course suffer from the same effects as all other projects.

I think there is only one way to resolve such a situation – push ‘refresh’. Rethink the business model and decompose the organization into self-sufficient units. Give them the competence to decide and also the responsibility to deal with the outcomes.

And this leads to the most challenging point – the moment “where the turkeys are asked to vote for Christmas” and to disrupt themselves.

Need for ethical data sharing standards ….

Currently there is an intense discussion about personal data and ownership. Although the assumption is that exposing data is inherently dangerous, the opposite may be true as well. The key question focuses more on access control to manage data exposure in order to avoid risks while still getting the benefits. Or maybe it is more about establishing broad ethical standards, as data availability is neither good nor bad on its own – the way it is used makes the difference.

Let’s look at some use cases which illustrate the challenge. Assume you feel an unusual pressure in your chest and decide to visit a doctor. The doctor takes an electrocardiogram (ECG), which looks normal. But what does this mean? The reading may be normal in relation to a statistical sample yet very unusual for you. The doctor does not have the necessary data to determine this.

Let’s assume now that you have a wearable which collects data on key body functions on an ongoing basis. Many others do this as well, and the gathered data can be analysed, leading to algorithms which can detect anomalies – like an indication of an increased risk of a heart attack – on a personal level. Your data is now used in two ways: to build up the algorithm and to detect health issues for you. Please note that gathering some data only when you feel bad does not have the same effect. The power comes from the availability of longer-term data.
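To make the difference between a population baseline and a personal baseline concrete, here is a minimal sketch in Python with hypothetical values: a resting heart rate of 78 bpm sits comfortably within the population’s normal range, yet stands out sharply against this individual’s own long-term history.

```python
import statistics

def personal_anomaly(history, reading, z_threshold=3.0):
    """Flag a reading that is unusual *for this person*, judged against
    their own long-term history rather than a population average."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return False
    return abs((reading - mean) / stdev) > z_threshold

# Months of resting heart rate samples from a wearable (hypothetical).
history = [52, 54, 53, 51, 55, 52, 53, 54, 52, 53]

# 78 bpm is "normal" for the general population, but a strong outlier
# for this individual - exactly the anomaly the doctor's one-off ECG
# could never see.
print(personal_anomaly(history, 78))  # True
```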

Let’s now assume that this new way to identify potential heart attacks reduces the prospective costs substantially. Especially in the long term, trends and indicators can be used to influence behavior, e.g. towards more physical activity.

The gathered data could also be used to derive further information, which could be exploited for unethical purposes.

You might say that it is always possible to switch data gathering on and off – but depending on the type of data, such ‘white spots’ may ultimately be problematic and will not help to better protect privacy. If gathering data is the norm, then switching it off could easily be seen as an indicator that somebody is trying to hide something, and may actually attract attention.

Such data could be pseudonymized – but it needs to carry some sort of tag to allow the correlations that power the analytics, and it may even be identifying by itself, like the ECG, which is also being used for authentication.
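A common way to build such a tag (a minimal sketch, not what any particular provider does) is a keyed hash: the same person always gets the same pseudonym, so longitudinal records can be correlated, while mapping the tag back to an identity requires a secret. Note that this is exactly the tension described above – the stable tag that enables the analytics is also what enables re-identification if the key leaks or the signal itself is identifying.

```python
import hmac
import hashlib

# Hypothetical secret held by a trusted party; without it, the tag
# cannot be mapped back to the person.
SECRET = b"keep-this-key-out-of-the-dataset"

def pseudonym(user_id: str) -> str:
    """Deterministic keyed hash: stable per person, so long-term records
    stay correlated for analytics, yet carries no direct identity."""
    return hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()[:16]

# Two readings from the same (hypothetical) person share one tag.
records = [
    {"tag": pseudonym("alice@example.com"), "resting_hr": 53},
    {"tag": pseudonym("alice@example.com"), "resting_hr": 78},
]
print(records[0]["tag"] == records[1]["tag"])  # True
```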

There are many such examples – if you share your GPS position, somebody may take advantage of the fact that you are far away from home, or use the data to rescue you after an accident.

I think that collecting data will be the norm. Protecting data will be key – but even more important is the need to establish ethical standards on how to deal with such data and information derived from it.

The Swiss National Bank and the Crypto Franc

The financial newspaper “Finanz und Wirtschaft” published an article titled “SNB will keinen Krypto-Franken” (“SNB does not want a crypto franc”), reflecting on the speech “The financial markets in changing times – Changes today and tomorrow: the digital future” by Andréa M. Maechler, Member of the Governing Board of the Swiss National Bank (SNB), held at the Money Market Event in Zurich on 5 April 2018.

There are various views on the nature and definition of money, currency and cash, which have changed substantially over time. The post How to explain ‘new’ things like bitcoin … provides some thoughts on this topic. It is very difficult to talk about all these terms as the definitions incorporate specific perspectives.

The challenge all countries, including Switzerland, face is to sustain global competitiveness. How does one move from the current economic system of the industrial age to one which is digital at its core? The post “Crypto Franc” – key ingredient for a Swiss national blockchain provides a proposal published by the Fintech Rockers.

This essay proposes to create a Swiss national blockchain. A blockchain allows ownership of digital assets to be tracked and transferred without the need for a central authority or intermediary. The exchange is peer-to-peer and cryptographically secured. Money is a key ingredient when values are exchanged, and hence a blockchain infrastructure should contain some form of it to unleash the full potential.
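The tamper-evidence at the core of this claim can be illustrated with a toy sketch (illustrative only – a real system adds digital signatures and a consensus mechanism on top): each transfer record is linked to the previous one by a hash, so history cannot be rewritten without breaking every later link.

```python
import hashlib
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_transfer(chain: list, asset: str, from_owner: str, to_owner: str) -> None:
    """Record an ownership transfer, linked to the previous entry."""
    prev = chain[-1]["hash"] if chain else "genesis"
    entry = {"asset": asset, "from": from_owner, "to": to_owner, "prev": prev}
    entry["hash"] = entry_hash(entry)
    chain.append(entry)

def verify(chain: list) -> bool:
    """Recompute every link; any tampering breaks the chain."""
    prev = "genesis"
    for e in chain:
        body = {k: v for k, v in e.items() if k != "hash"}
        if body["prev"] != prev or entry_hash(body) != e["hash"]:
            return False
        prev = e["hash"]
    return True

ledger: list = []
append_transfer(ledger, "asset-001", "alice", "bob")
append_transfer(ledger, "asset-001", "bob", "carol")
print(verify(ledger))  # True - and False after any edit to an entry
```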

“Such blockchain infrastructure, carried jointly by all Swiss cantons, will have an equivalent catalyst effect as the initial introduction of the railway system or the creation of the Gotthard tunnel during the age of industrialization. The Swiss national blockchain will enable local as well as foreign entities and all people with an interest and/or business relation with Switzerland to hold genuine Swiss cryptocurrency and/or execute transactions via legal compliant smart contracts.”

On this blockchain, a cryptocurrency bound to the Swiss franc and controlled by the SNB is envisioned:

“The introduction of Swiss cryptocurrency “Crypto Franc”, bound to the issued fiat Swiss Franc by the Swiss National Bank (SNB), revolutionizing digital payment capabilities. The national blockchain will enable and bring the Swiss industry(s) to the international forefront of the digital age.”

In her speech, Andréa M. Maechler commented on aspects of the proposal:

“A more prominent role for central banks in this end-customer business area is currently a subject of debate, amid calls for ‘digital central bank money for the general public’. The SNB opposes this idea. Digital central bank money for the general public is not necessary to ensure an efficient system for cashless payments. It would deliver few advantages, but would give rise to incalculable risks with regard to financial stability.”

“Which technologies and solutions ultimately prevail on this solid foundation should in principle be left to the market to decide, however. This division of roles between central banks and commercial banks epitomises our current two-tier financial system. It contributes to the stability of the system, while allowing sufficient leeway for innovation.”

Both are certainly valid points – but again, perspective matters. If a “Crypto Franc” is seen as a modern form of cash, then not a lot would really change. The SNB controls the amount of cash in circulation, and it would continue to control the amount of cryptographic cash, which could be safely kept and exchanged using a blockchain infrastructure.

Introducing cryptographic cash could have an impact on the financial system, as it would allow people to keep their cash safe – in an e-wallet on a blockchain instead of an existing cash account at the bank. With a cash account, the client gets a repayment promise from the bank; in turn, the bank can use the amount in this account for other business (mainly lending). The value of the repayment promise depends on the trust in the bank, while the trust in the Crypto Franc depends on the trust in the currency.

Thus issuing cryptographic cash is not in the interest of the banks, as ownership and value would no longer sit with the bank but with the individuals. Whether or not a cryptographic currency is introduced, the prevalent systemic risk will continue as long as there is still book money, i.e. money created through credit. As the adoption of a cryptographic currency increases, it will eventually reduce the systemic risks, as commercial banks will need to compete for such currency from clients. Clients will need to make an explicit decision when putting Crypto Francs into an account, which means transferring ownership to the bank against a repayment promise, and they will need to assess whether a bank is able to meet that promise. In return, the bank can provide a return on investment. Such transparency would rather de-risk the system, invalidating some of the arguments stated by the SNB.

There is an increasing interest by companies and other organizations in creating their own form of money. Dr. Edward de Bono suggested years back that companies should consider issuing their own currency, using IBM as the example (Towards a digital barter economy?). It was futuristic at the time he made this proposal – but today, in a digital world, it becomes feasible (David Birch – Before Babylon, Beyond Bitcoin: From Money that We Understand to Money that Understands Us). There seems to be a trend in this direction, which could accelerate, with potentially huge implications for the existing monetary system.

“The SNB will keep a close watch on developments to ensure that it always remains able to assess their potential impact on the financial system in good time.”

Given the context, this sounds too passive. The SNB needs to be active in these developments to gain the required experience. Observers will typically be late, and being late means becoming defensive and reactive. The SNB, in my view, should be active and engaged in building the best possible future for Switzerland and its financial system.

The ‘No’ by the SNB feels premature, influenced by past paradigms rather than by current developments and the needs of the future. One aspect that became apparent from the statements across the various parties engaged in the exchange is the complexity of the topic. It is a complex network problem and requires much more thought and discussion. It is key to have these exchanges frequently and promptly in order to come to conclusions before other organizations or entities take over and suddenly provide “the” new form of money, pushing organizations like the SNB into a reactive rather than proactive position.

Towards Canary Banking

Services offered by banks are mainly commodities sold with rather opaque pricing strategies. Client experience has become the main differentiator, as the underlying services are largely the same. The challenge has become creating outcomes which integrate into the life of the customer via the respective touchpoints of the client’s ecosystem. This becomes difficult as the number of potential touchpoints increases significantly (ubiquitous computing).

The focus within banks has moved from operations and backend infrastructure to the touchpoints. Such focus shifts investments – leaving the backends under-invested, hidden from the user by layers of newer systems. This trend started years back, when the financial services industry realized that the internet was not a transient effect and fundamentally changed the way business is done. Adoption accelerated with the rise of mobile, and the next catalyst is the increasing usage of conversational technologies like chat or voice.

Adding more layers does not improve the overall system – it only delays and covers up the increasing issues and complexity of the fragile legacy backend, which can no longer be sustained (Measuring complexity in networks). From an operational perspective, the whole architecture becomes more and more unmanageable and risky, highlighting the need for substantial refactoring or, more likely, a total re-architecting.
Let’s for the moment assume that you own such a “backend” which becomes increasingly hard to change and difficult to operate. What could be a way out of this situation?

  • Be bold and start greenfield. This allows you to get rid of the layers of legacy and build a system which matches current needs.
  • Try to be clever and refactor the system in an incremental approach. You have to be faster than the changes in the environment, as you have to catch up and eliminate debt while adding the functions required to stay competitive. Implementation risk and sunk-cost avoidance are the typical arguments backing this option.

Most organizations will go for the second option and try hard. They have neither the strong leadership and capabilities to aim for the first option nor the skills to actually build something new based on current-state practices. The organization and the management are only trained to make small changes, and the “internal immune system” reacts immediately against any major change which could endanger the status quo.

Maybe it is better to start with a canary banking approach – where the next version of the bank is built up while the current one is operational.

It is similar to driving a car – you choose one, drive it, maintain it, and at some point you switch to another model. The new model may contain some parts already used before – that’s fine as long as they fit into the new model. The approach is rather different from continuously adding or changing external cosmetic elements on an outdated Ford Model T backend.
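In deployment terms, the idea can be sketched as canary routing (hypothetical endpoints, illustrative only): a small, adjustable share of requests is sent to the new bank while the old one keeps serving the rest, and the share grows as confidence does.

```python
import random

# Hypothetical endpoints: the running bank and the one being built up.
LEGACY_BANK = "https://legacy.bank.example/api"
CANARY_BANK = "https://next.bank.example/api"

def route(canary_share: float) -> str:
    """Send a configurable fraction of traffic to the new stack.
    Start small, observe, then ramp up until the legacy system
    serves no traffic at all and can be retired."""
    return CANARY_BANK if random.random() < canary_share else LEGACY_BANK

# With a 5% share, roughly 50 of 1000 requests hit the next version.
hits = sum(route(canary_share=0.05) == CANARY_BANK for _ in range(1000))
print(hits, "of 1000 requests reached the canary")
```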

Measuring complexity in networks

People like simplicity. They prefer linear systems where an input translates in a well-understood way into an output or result.

The world is a mesh, a complex network of elements which influence each other. Small effects on one side may lead to a chain reaction in the network and cause a big change.

Many organizations have great difficulty dealing with the increasing complexity which is inherent to the shaping mesh economy. Established rules and patterns become less effective or even counterproductive. It has become difficult to stay lean while things change fast – typically, organizations and their supporting infrastructures grow continuously, amalgamating approaches and technologies over many years.

The typical reaction is to launch some form of simplification program. Such programs require good KPIs to understand the benefits of the measures taken. But meaningful KPIs are hard to find.

Organizations typically start counting elements as a measure and aim to reduce complexity by reducing the number of similar elements. Reducing, for example, the number of applications in use feels right – but it may come with side effects. This measure may decrease agility, and agility is required in order to deal with complexity. It may also lead to less productivity, as broadly used applications may make it harder to deal with complicated situations where specialized tools provide better support.
All problems are different, and a best practice only fits a specific set of problems. One approach which I find useful for measuring complexity in a network is based on a heuristic which goes back to Robert Glass:
“For every 25 percent increase in problem complexity, there is a 100 percent increase in solution complexity.”
Let’s make a thought experiment. Assume we want to measure the complexity of a system with a set of functions which must interact with each other in a defined way.
  • One extreme is to have all functions in a single block – we now have a complicated block which hides all interdependencies between functions from the outside world.
  • The other extreme is to have a block for each function and then connect the blocks based on the functional dependencies. Now each block is simple, but all interdependencies have been externalized, which makes the network complicated.
The heuristic above means:
  • If we have 4 functions in a block and add another to it, then the solution complexity of this block doubles.
  • If we add an additional connection to 4 existing connections between blocks, the complexity of the network doubles.
The heuristic allows us to find a setup where the functional bundling into nodes and the network connections are in good balance. Given this heuristic, it becomes possible to start measuring and optimizing the complexity of networks. Obviously, not all functions are equal, and neither are all connections. Such factors can be included in a more detailed approach which allows an application landscape to be measured and optimized towards low complexity.
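The heuristic translates directly into a power law: if a 25 percent size increase doubles complexity, complexity grows roughly as n^(ln 2 / ln 1.25) ≈ n^3.1. A small sketch with illustrative numbers shows how the balance point sits between the two extremes above:

```python
import math

# Glass's heuristic: a 25% increase in size doubles complexity,
# i.e. complexity(n) ~ n**k with k = ln(2) / ln(1.25) ≈ 3.1.
K = math.log(2) / math.log(1.25)

def complexity(block_sizes, connections):
    """Internal complexity of each block plus the complexity of the
    network wiring between the blocks."""
    return sum(n ** K for n in block_sizes) + connections ** K

# Twelve functions, bundled three different ways (illustrative numbers):
monolith = complexity([12], connections=0)       # everything in one block
fine     = complexity([1] * 12, connections=20)  # one block per function
balanced = complexity([4, 4, 4], connections=3)  # a middle ground

print(f"monolith {monolith:.0f}, fine-grained {fine:.0f}, balanced {balanced:.0f}")
# Both extremes score far worse than the balanced bundling.
```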
The most decisive factor may ultimately be the solution approach and the clarity of the design. If there is a simpler way to solve the problem, then taking it is the first step. And this leads back to the starting point – in order to truly simplify, first the problem space and the solution approach need to be challenged. They change continuously.

“Crypto Franc” – key ingredient for a Swiss national blockchain

In various countries there are discussions about a crypto version of the national fiat currency, which in the case of Switzerland could be the “Crypto Franc”. We joined the Fintech Rockers in writing the exposé Swiss national blockchain and cryptocurrency.


The exposé proposes the “Crypto Franc” in the context of a national blockchain which would serve as the digital backbone of the shaping mesh economy. We expect a stable currency available on the blockchain to provide a catalyst effect for the economy. Such a stable currency could be the “Crypto Franc”, issued by the SNB and directly pegged to the Swiss Franc.

“Such blockchain infrastructure, carried jointly by all Swiss cantons, will have an equivalent catalyst effect as the initial introduction of the railway system or the creation of the Gotthard tunnel during the age of industrialization. The Swiss national blockchain will enable local as well as foreign entities and all people with an interest and/or business relation with Switzerland to hold genuine Swiss cryptocurrency and/or execute transactions via legal compliant smart contracts.”

For Switzerland (and other countries) it is imperative to think about its future in a digital mesh economy. Such an envisioned blockchain would be an excellent foundation. There are ongoing debates about such a strategy, but prompt decisions and actions on this topic are required in order for Switzerland to remain a leading country in the global financial system.

“The introduction of Swiss cryptocurrency “Crypto Franc”, bound to the issued fiat Swiss Franc by the Swiss National Bank (SNB), revolutionizing digital payment capabilities. The national blockchain will enable and bring the Swiss industry(s) to the international forefront of the digital age.”

The exposé was mentioned under the title “E-franc pipe dream fails to arouse Switzerland” on swissinfo.ch recently.

The PSD2 and GDPR gym

There is an intense ongoing debate in the media about the pros and cons of PSD2. On one side, some feel that the financial sector is already heavily regulated and that any additional regulation, like PSD2, will hinder the free evolution of markets. Others, on the other hand, think that PSD2 is good, as it allows fintechs and bigtechs to access client account and transaction data as the catalyst for add-on services and a superior user experience. The client owns the decision power, and the incumbents are mandated to collaborate. The incumbents may actually get the biggest value from PSD2. That sounds counterintuitive, so let’s explore a few thoughts.

There is a general set of ongoing trends – the age of industrialization has been superseded by the age of information. Thus the pipeline businesses which dominated during industrialization are being superseded by platform business models. The platform models are superior as they take advantage of the benefits of network effects, ideally leveraging the similar advantages of the internet. Platform business models are typically composed of many smaller organizations grouped together via the platform into a structure much more powerful than the sum of its parts. Successful platforms create a pull effect, as each participant increases the value of the platform for all.

Banking is not yet a platform business – many talk about the uberization of banking, and I am convinced it will happen. Typically, a very small number of big players will dominate a specific sector in a platform economy; these are the companies that successfully establish the pull effect described above. When banking moves into such a model, the only remaining players are the ones which can integrate seamlessly, offer highly competitive services, or the one which owns the platform and masters the integration and collaboration. All successful entities must be well connected and be experts in interaction and integration through APIs.

Today’s incumbents are often the opposite – they operate in a closed and private environment and individually try to create captive businesses. They interact with very selected partners and do not empower their clients to make their own choices. They also typically create proprietary standards, making it very difficult for others to interact with the large number of incumbents.

This is where PSD2 changes the game – it forces the incumbents to think about interfaces and forces them to open up. This is a threat, as business may be lost, but for incumbents willing to change it offers survival training in the world of VUCA (Dance on the VUCAno). It forces companies to participate in the mesh economy and its dynamics. It increases the chance of becoming robust enough to delay the time when the tech giants might take over (Next stop fintech giants). It may well be that a fair amount of incumbents’ revenue gets eroded during this process and that more client touchpoints are lost to competitors who simply create better client outcomes and experiences. But it also strengthens the companies for the upcoming big tech challenge.
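What “opening up” looks like in practice is an account-information API that a licensed third party can call with a consent token granted by the client. The endpoint and headers below are made up for illustration – real implementations follow standards such as the Berlin Group’s NextGenPSD2 – but the shape of the interaction is the point:

```python
import json
import urllib.request

# Hypothetical PSD2-style account-information request: a third party
# calls the bank's open API with a consent token granted by the client.
BANK_API = "https://api.bank.example/psd2/v1/accounts"  # illustrative URL
CONSENT_TOKEN = "token-granted-by-the-client"           # illustrative token

request = urllib.request.Request(
    BANK_API,
    headers={"Authorization": f"Bearer {CONSENT_TOKEN}"},
)
with urllib.request.urlopen(request) as response:
    accounts = json.loads(response.read())

# The third party, not the bank, now shapes the client experience
# on top of this data.
for account in accounts.get("accounts", []):
    print(account.get("iban"), account.get("balance"))
```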

This leads to a key question: should other regulators follow the EU and mandate regulation like PSD2 and GDPR in order to strengthen the financial system by exposing the incumbents to a shock that increases anti-fragility (In a world of VUCA seek anti-fragility)? Or should they leave the choice to the incumbents, who may prefer to protect what they once had and then, at some point, experience their Kodak moment, with potentially disastrous impact on the sector and the economy?