Ongoing loss of agility

Do you remember WALL-E and the Axiom? If not, you may want to watch the movie before reading on …

What comes to mind when thinking about the humans on the Axiom? They are harmonious, homogeneous in lifestyle, well-functioning and polite. They follow their daily routines and somehow enjoy their time on the ship. The ship has taken control of their lives and organizes everything for them. Everything is predictable, under control and automatically managed 24×7.

An interesting question is whether such a comfortable state is really desirable. Intelligence is the ability to acquire and apply knowledge and skills in unexpected situations. But what happens if intelligence is actually no longer required in daily life? The Axiom could stimulate the humans by creating surprises and challenges, but the people would soon figure out that this is only a game and does not really matter. If typical human skills are no longer required, what motivates people? What is the value and position of humans in such an environment?

The situation on the Axiom is an extreme state. But the question of whether we are progressing in that direction is an interesting one.

It feels as though the civilised world wants a “constant” that makes everything predictable and eliminates negative surprises. But such events are often not positive or negative on their own – it is more about how people react to them. If a system like the Axiom takes care of everything, will human intelligence, processing, or even value degenerate?

When people are in such a comfortable state, will they still attempt something new, or will they fear losing what they already have? Could the evolution of our society bring, as a consequence, a complete loss of agility? Are we already on the way to such a state, caught in daily routines defined by others?

Need to break free from legacy ….

Interaction with the environment is an essential ingredient for success in the mesh economy of the age of information. Many companies seem to struggle with this because they cannot let go of the past. The legacy defines what is possible and also what success means. Activities become inside-out oriented, dominated by the inherent constraints.

One reason might be that many organizations have lost the sovereignty to move forward. For quite some time, organizations were optimized by means of business process outsourcing. Near- and offshoring, it was believed, would bring efficiency. That may have been true in the age of industrialization. But it is no longer true in the age of information, where lean and agile structures, skills and technology make the difference while repetitive tasks are automated. The approach may have provided a bit of short-term profit polishing, but unfortunately it resulted in a longer-term strategic disadvantage as skills and know-how moved to the sourcing partner.

Another reason might be the way business cases are used today. Any change requires a business case. This was relatively simple in the slow-changing age of individualization, and it still works well for small changes today. It does not work well in the world of VUCA.

In addition, the dynamics within organizations do not help with this approach. One’s own business case has to be better than most other cases to stand a chance in the internal competition for funding, so it gets optimized a bit. Since everyone knows this, all the cases end up being tuned. Management sees the number of excellent business cases as a luxury problem – it feels good to have such a choice.

On the flip side, the decision process becomes difficult and tedious. During all these processes and procedures, a lot of time passes. By the time decisions are finally made, the environmental conditions have changed and everything should actually start over. But that would be too painful, so the execution of the initiatives starts anyway. At some point after the project start, management realizes that the projects are a little off the mark; the organization then re-prioritizes and the internal bureaucracy starts again. The ego of the typical manager then amplifies the effects – a great manager cannot have done anything wrong, which brings the whole theater to the next round.

Bigger changes are hard to get through – the assumptions feel less intuitive and the risks are perceived to be higher. Hence small incremental changes are preferred, and the core fundamental ones are continuously pushed down in priority. The company’s capacity for change slows down, and the cultural legacy is soon joined by technical legacy. Some people like this stage, as it offers the opportunity to launch simplification programs, which of course suffer from the same effects as all other projects.

I think there is only one way to resolve such a situation – push ‘refresh’. Rethink the business model and decompose the organization into self-sufficient units. Give them the competence to decide and also the responsibility to deal with the outcomes.

And this leads to the most challenging point – the time “where the turkeys are asked to vote for Christmas” and to disrupt themselves.

Data Ethics: Privacy in the future?

Technological progress has radically transformed our concept of privacy. The way we share information and display our identities has changed as we have migrated to the digital world. We carry and interact with devices that give us access to all information, but they can also offer the world vast quantities of information about us. We leave digital footprints as we navigate through the digital world. While sometimes this information is harmless, it is often valuable to various stakeholders, including governments, corporations, marketers, and criminals.
The ethical debate around privacy is complex. Have our definition of and standards for privacy evolved over time? How will they continue to do so in the decades to come?
Emerging technologies will only further challenge the notion of privacy as we extend the integration of technologies such as virtual reality, the internet of things, and brain-machine interfaces.
Companies such as Neuralink attempt to merge the human brain with machines, with potential implications for privacy. Brain-machine interfaces by nature operate by extracting information from the brain and manipulating it in order to accomplish goals. Many parties could benefit from, and take advantage of, the information from the interface. Marketing companies, for instance, would take an interest in better understanding how consumers think – and consequently in having their thoughts modified. Employers could use the information to find new ways to improve productivity or even to monitor their employees. There will notably be risks of “brain hacking”, against which we must take extreme precautions. On the other hand, what if such an approach were used for medical treatment purposes?
Technological advancements are increasingly becoming part of who we are. The digital world has become an extension of our identities and of the physical world. We will, to a certain extent, need a new definition of privacy – until such a notion becomes obsolete in the near future. We have started to be more open as a society in the digital world, voluntarily sharing identities, interests, views, and personalities. The question is the trade-off and balance between transparency and privacy.
Will there be a new definition of privacy, and if so, what will it be? Will ethical standards evolve, and will people agree to them?

Digital to replace the human touch?

 
Can technology and intelligent machines replace the human touch? While a machine can perform a given task, often more efficiently than we can, what it lacks is creativity in the activity – a uniquely human ability to cater to the needs of the individual. What is creativity? What are the needs of the individual? Even if a machine could determine an appropriate plan, humans will still want to interact with someone who has the creative expertise and experience to talk us through it, someone who understands creativity in that context.
How much of this effect is real, and how much of it is specific to our generation? Will the next generation, or the generations after, have the same distrust and the same longing for such a “human” touch – especially digital natives who grew up with the internet and social media as part of their everyday world?
The technologies of the industrial revolution – steam power and machinery – largely complemented human capabilities. The great question of our current time is whether digital technology will complement or instead replace human capabilities. Can digital technology replace the understanding and judgement – let alone the empathy – required to successfully deliver services such as social care, or replace whatever it is that leads us to enjoy and value interacting with each other rather than with machines?
Faster isn’t wiser: intelligence is defined in terms of the ability to acquire and apply knowledge and skills, but what is often missing is the act of taking decisions based on the ability to choose objectives or hold values that shape them.
Values are experience, not data: As we use increasingly powerful computers to create more and more sophisticated logical systems, we may succeed in making systems resemble human thinking, but there will always be situations that can only be resolved by humans employing judgement based on values that we can empathise with, based in turn on experiences that we can relate to.
Artificial life, experience, values:
Intelligent machines can make choices based on the data available to them, but this is very different from judgement based on values that emerge from our experience of life.

Need for ethical data sharing standards ….

Currently there is an intense discussion about personal data and ownership. The assumption is that exposing data is inherently dangerous – although the opposite may be true as well. The key question focuses more on access control: managing data exposure in order to avoid risks while still getting the benefits. Or maybe it is more about establishing broad ethical standards, as data availability is neither good nor bad on its own – the way it is used makes the difference.

Let’s look at some use cases which illustrate the challenge. Assume you feel an unusual pressure in your chest and decide to visit a doctor. The doctor takes an electrocardiogram (ECG), which looks normal. But what does this mean? The reading may be normal in relation to a statistical sample yet very unusual for you. The doctor does not have the necessary data to determine this.

Let’s now assume that you have a wearable which collects data on key body functions on an ongoing basis. Many others do this as well, and the gathered data can be analysed, leading to algorithms which can detect anomalies – like an indication of an increased risk of a heart attack – on a personal level. Your data is now used in two ways: to build up the algorithm and to detect health issues for you. Note that just gathering some data when you feel bad does not have the same effect. The power comes from the availability of longer-term data.
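To make the idea of a personal baseline concrete, here is a minimal sketch: an individual’s long-term readings define what is “normal” for that person, and a new reading is flagged when it deviates strongly from that baseline. The function name, the resting-heart-rate example and the threshold are hypothetical illustrations, not a real product’s algorithm.

```python
from statistics import mean, stdev

def is_anomaly(history, reading, z_threshold=3.0):
    """Flag a reading that deviates strongly from one person's own baseline.

    history: long-term series of readings for one individual (e.g. resting
    heart rate in beats per minute); reading: the new measurement to check.
    """
    if len(history) < 30:              # not enough data for a personal baseline
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:                     # perfectly flat history
        return reading != mu
    return abs(reading - mu) / sigma > z_threshold

# A value that is "normal" for the population can still be unusual for this
# person: with a long-term baseline around 52 bpm, 75 bpm stands out.
personal_history = [52, 53, 51, 54, 52, 50, 53, 52, 51, 54] * 3
print(is_anomaly(personal_history, 75))   # True
print(is_anomaly(personal_history, 53))   # False
```

The point of the sketch is exactly the one made above: without the long-term history, the check cannot even run, so occasional measurements taken only when you feel unwell would not have the same effect.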

Let’s now assume that the new way to identify potential heart attacks reduces the prospective costs substantially. Especially in the long term, trends and indicators can be used to influence behavior, e.g. towards more physical activity.

The data gathered could also be used to derive further information which could be used for unethical purposes.

You might say that it is always possible to switch data gathering on and off. But depending on the type of data, such ‘white spots’ may themselves be problematic and will not help to better protect privacy. If gathering data is the norm, then switching it off could easily be seen as an indicator that somebody is trying to hide something, and may actually attract attention.

Such data could be pseudonymized – but it needs to carry some sort of tag to allow the correlations that power the analytics, and it may even be identifying in itself, like the ECG, which is also being used for authentication.
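One common way to build such a correlation tag is a keyed hash: the same identity always yields the same pseudonym, so records can be linked for analytics, but without the key the tag cannot be recomputed from a guessed identity. The sketch below is a hypothetical illustration using HMAC-SHA-256; key management (who holds the key, when it is rotated) is the hard part and is not shown.

```python
import hmac
import hashlib

SECRET_KEY = b"held-by-the-data-custodian"  # hypothetical; never hard-code in practice

def pseudonym(identity: str) -> str:
    """Derive a stable correlation tag from an identity via a keyed hash.

    The same identity always maps to the same tag, so readings from one
    person can be correlated over time; an attacker without the key cannot
    verify a guessed identity against a tag.
    """
    return hmac.new(SECRET_KEY, identity.encode(), hashlib.sha256).hexdigest()[:16]

# Two readings from the same person carry the same tag and can be correlated.
tag_a = pseudonym("patient-4711")
tag_b = pseudonym("patient-4711")
print(tag_a == tag_b)  # True
```

Note that this protects the identifier, not the payload: as the text says, data like an ECG trace can be identifying in itself, and no pseudonym scheme changes that.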

There are many such examples – if you share your GPS position, somebody may take advantage of the fact that you are far away from home, or use the data to rescue you after an accident.

I think that collecting data will be the norm. Protecting data will be key – but even more important is the need to establish ethical standards on how to deal with such data and information derived from it.

The Swiss National Bank and the Crypto Franc

The financial newspaper “Finanz und Wirtschaft” published an article titled “SNB will keinen Krypto-Franken” (“SNB does not want a crypto franc”), reflecting on the speech “The financial markets in changing times – Changes today and tomorrow: the digital future” by Andréa M. Maechler, Member of the Governing Board of the Swiss National Bank (SNB), held at the Money Market Event in Zurich on 5 April 2018.

There are various views on the nature and definition of money, currency and cash, which have changed substantially over time. The post How to explain ‘new’ things like bitcoin … provides some thoughts on this topic. It is very difficult to talk about all these terms, as the definitions incorporate specific perspectives.

The challenge all countries, including Switzerland, face is to sustain global competitiveness. How does one move from the current economic system of the industrial age to one which is digital at its core? The post “‘Crypto Franc’ – key ingredient for a Swiss national blockchain” points to a proposal published by the Fintech Rockers.

This essay proposes creating a Swiss national blockchain. A blockchain allows one to track and change ownership of digital assets without the need for a central authority or intermediary. The exchange is peer-to-peer and cryptographically secured. Money is a key ingredient when values are exchanged, and hence a blockchain infrastructure should contain some form of it to unleash its full potential.

“Such blockchain infrastructure, carried jointly by all Swiss cantons, will have an equivalent catalyst effect as the initial introduction of the railway system or the creation of the Gotthard tunnel during the age of industrialization. The Swiss national blockchain will enable local as well as foreign entities and all people with an interest and/or business relation with Switzerland to hold genuine Swiss cryptocurrency and/or execute transactions via legal compliant smart contracts.”

On this blockchain environment, a cryptocurrency bound to the Swiss franc and controlled by the SNB is envisioned:

“The introduction of Swiss cryptocurrency “Crypto Franc”, bound to the issued fiat Swiss Franc by the Swiss National Bank (SNB), revolutionizing digital payment capabilities. The national blockchain will enable and bring the Swiss industry(s) to the international forefront of the digital age.”

In her speech, Andréa M. Maechler commented on aspects of the proposal:

“A more prominent role for central banks in this end-customer business area is currently a subject of debate, amid calls for ‘digital central bank money for the general public’. The SNB opposes this idea. Digital central bank money for the general public is not necessary to ensure an efficient system for cashless payments. It would deliver few advantages, but would give rise to incalculable risks with regard to financial stability.”

“Which technologies and solutions ultimately prevail on this solid foundation should in principle be left to the market to decide, however. This division of roles between central banks and commercial banks epitomises our current two-tier financial system. It contributes to the stability of the system, while allowing sufficient leeway for innovation.”

Both are certainly valid points – but again, perspective matters. If a “Crypto Franc” is seen as a modern form of cash, then not a lot would really change. The SNB controls the amount of cash in circulation, and it would continue to control the amount of cryptographic cash, which could be safely kept and exchanged using a blockchain infrastructure.

Introducing cryptographic cash could have an impact on the financial system, as it would allow people to keep their cash safe – within an e-wallet on a blockchain instead of an existing cash account at the bank. With a cash account, the client gets a repayment promise from the bank; in turn, the bank can use the amount from this account for other business (mainly lending). The value of the repayment promise depends on the trust in the bank, while the trust in the Crypto Franc depends on the trust in the currency.

Thus issuing cryptographic cash is not in the interest of the banks, as ownership and value would no longer sit with the bank but with individuals. Regardless of whether a cryptographic currency is introduced, the prevalent systemic risk will continue as long as there is book money, i.e. money created through credit. As the adoption of a cryptographic currency increases, it would eventually reduce the systemic risk, as commercial banks would need to compete for such currency from clients. Clients would need to make an explicit decision when putting Crypto Francs into an account, which means transferring ownership to the bank against a repayment promise. They would need to assess whether a bank is able to meet its repayment promise. On the other side, the bank can provide a return on investment. Such transparency would rather de-risk the system, invalidating some of the arguments stated by the SNB.

There is an increasing interest by companies and other organizations in creating their own form of money. Edward de Bono suggested years back that companies should consider issuing their own currency, using IBM as the example (Towards a digital barter economy?). The proposal was futuristic at the time he made it – but today, in a digital world, it becomes feasible (David Birch – Before Babylon, Beyond Bitcoin: From Money that We Understand to Money that Understands Us). There seems to be a trend in this direction, which could accelerate, with potentially huge implications for the existing monetary system.

“The SNB will keep a close watch on developments to ensure that it always remains able to assess their potential impact on the financial system in good time.”

Given the context, this sounds too passive. The SNB needs to be active in these developments to gain the required experience. Observers will typically be late. Being late means becoming defensive and reactive. In my view, the SNB should be active and engaged in building the best possible future for Switzerland and its financial system.

The ‘No’ by the SNB feels premature, influenced by past paradigms rather than by current developments and the needs of the future. One aspect that became evident from the statements of the various parties engaged in this exchange of thoughts is the complexity of the topic. It is a complex network problem and requires much more thought and discussion. It is key to have these exchanges frequently and promptly in order to come to conclusions before other organizations or entities take over and suddenly provide “the” new form of money, pushing organizations like the SNB into a reactive rather than proactive position.

Towards Canary Banking

Services offered by banks are mainly commodities sold with rather opaque pricing strategies. Client experience has become the main differentiator, as the underlying services are largely the same. The challenge has become creating outcomes which integrate into the life of the customer via the respective touchpoints of the client’s ecosystem. This becomes difficult as the number of potential touchpoints increases significantly (ubiquitous computing).

The focus within banks has moved from operations and backend infrastructure to the touchpoints. This focus leads to a shift in investments – which leaves the backends under-invested, hidden from the user by layers of newer systems. This trend started years back, when the financial services industry realized that the internet was not a transient effect and fundamentally changed the way business is done. The adoption accelerated with the rise of mobile, and the next catalyst is the increasing usage of conversational technologies like chat or voice.

Adding more layers does not improve the overall system – it only delays and covers up the increasing issues and complexity of the fragile legacy backend, which can no longer be sustained (Measuring complexity in networks). From an operational perspective, the whole architecture becomes more and more unmanageable and risky, highlighting the need for a substantial refactoring or, more likely, a total re-architecting.
Let’s assume for the moment that you own such a “backend”, which becomes increasingly hard to change and difficult to operate. What could be a way out of this situation?

  • Be bold and start greenfield. This allows you to get rid of the layers of legacy and build a system which matches current needs.
  • Try to be clever and refactor the system in an incremental approach. You have to be faster than the changes in the environment, as you have to catch up and eliminate debt while adding the functions required to stay competitive. Implementation risk and sunk cost avoidance are the typical arguments backing this option.

Most organizations will go for the second option and try hard. They have neither the strong leadership and capabilities to aim for the first option nor the skills to actually build something new based on current state-of-the-art practices. The organization and the management are only trained to make small changes, and the “internal immune system” reacts immediately against any major change which could endanger the status quo.

Maybe it is better to start with a canary banking approach – where the next version of the bank is built up while the first is still operational.

It is similar to driving a car – you choose one, drive it, maintain it, and at some point you switch to another model. The new model may contain some parts already used before – that’s fine, as long as they fit into the new model. This approach is rather different from continuously adding or changing cosmetic exterior elements on an outdated Ford Model T backend.

Measuring complexity in networks

People like simplicity. They prefer linear systems, where an input translates in a well-understood way into an output or result.
The world is a mesh, a complex network of elements which influence each other. Small effects on one side may lead to a chain reaction in the network and cause a big change.
Many organizations have great difficulty dealing with the increasing complexity which is inherent to the emerging mesh economy. Established rules and patterns become less effective or even counterproductive. It has become difficult to stay lean while things change fast – typically, organizations and their supporting infrastructures grow continuously, amalgamating approaches and technologies over many years.
The typical reaction is to launch some form of simplification program. Such programs require good KPIs to understand the benefits of the measures taken. But meaningful KPIs are hard to find.
Organizations typically start by counting elements as a measure and aim to reduce complexity by reducing the number of similar elements. Reducing, for example, the number of applications in use feels right – but it may come with side effects. This measure may decrease agility, and agility is required in order to deal with complexity. It may also reduce productivity, as broadly used applications may make it harder to deal with complicated situations where specialized tools would provide better support.
All problems are different, and a best practice only fits a specific set of problems. One approach which I find useful for measuring complexity in a network is based on a heuristic which goes back to Robert Glass:
“For every 25 percent increase in problem complexity, there is a 100 percent increase in solution complexity.”
Let’s make a thought experiment. Assume we want to measure the complexity of a system with a set of functions which must interact with each other in a defined way.
  • One extreme is to have all functions in a single block – we now have a complicated block which hides all the interdependencies between functions from the outside world.
  • The other extreme is to have a block for each function and then connect the blocks based on the functional dependencies. Now each block is simple, but all interdependencies have been externalized, which makes the network complicated.
The heuristic above means:
  • If we have 4 functions in a block and add another to it, then the solution complexity of this block doubles.
  • If we add an additional connection to 4 existing connections between blocks, the complexity of the network doubles.
The heuristic allows one to find a setup where the functional bundling into nodes and the network connections are in good balance. Given this heuristic, it becomes possible to start measuring and optimizing the complexity of networks. Obviously, not all functions are equal, and neither are all connections. Such factors can be included in a more detailed approach which allows measuring and optimizing an application landscape towards low complexity.
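One possible way to turn the heuristic into numbers, as an interpretation rather than part of the original heuristic: if going from 4 to 5 elements (a 25 percent increase) doubles complexity, then complexity grows roughly like n^(log 2 / log 1.25) ≈ n^3.1. The sketch below scores a hypothetical application landscape on that basis, summing per-block (node) complexity and network complexity; the exponent and the equal weighting of nodes and connections are assumptions.

```python
import math

# Growth exponent implied by "25% more elements => 100% more complexity":
# solve 1.25**k == 2  =>  k = log 2 / log 1.25 ≈ 3.1
K = math.log(2) / math.log(1.25)

def complexity(count):
    """Complexity score of a block with `count` functions, or of a network
    with `count` connections, under this reading of the Glass heuristic."""
    return count ** K

def landscape_score(functions_per_block, n_connections):
    """Total score: sum of per-block complexity plus network complexity."""
    return sum(complexity(n) for n in functions_per_block) + complexity(n_connections)

# 20 functions in one monolith versus four blocks of five functions
# connected by six links: the heuristic heavily penalizes the monolith.
monolith = landscape_score([20], 0)
split = landscape_score([5, 5, 5, 5], 6)
print(monolith > split)   # True
```

Varying the number of blocks and connections with such a score makes the balance point visible: too few blocks and the node term explodes, too many and the connection term does.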
The most decisive factor may ultimately be the solution approach and the clarity of the design. If there is a simpler way to solve the problem, then taking it is the first step. And this leads back to the starting point – in order to truly simplify, first the problem space and the solution approach need to be challenged. They change continuously.