Emerging Technology Analysis Canvas (ETAC)

Version 0.8

Abstract

This paper introduces the “Emerging Technology Analysis Canvas” (ETAC), a framework to assess an individual emerging technology. Inspired by the Business Model Canvas, this approach comprises a set of questions that probe the technology, arranged around a logical narrative. It represents the different aspects of a technology visually on a single page. The visual representation is concise, compact, and comprehensible at a glance. We believe the ETAC provides a logical framework for understanding a technology, a structure for examining its different aspects, and a brainstorming tool for those trying to understand emerging technologies.

Introduction

This paper introduces the “Emerging Technology Analysis Canvas” (ETAC), a framework to assess an individual emerging technology. Many organisations have a desire to evaluate emerging technologies, either for their own research and development (R&D) teams’ benefit or for the benefit of their customers and partners.

We use the term “technology” to mean “the application of scientific knowledge for practical purposes” as defined by the Oxford English dictionary. We use the term “emerging technologies” to identify “technologies that have the potential to have accelerated adoption and thereby make a significant impact on their market or wider society”.

The ETAC approach is intended to be used by individuals or organizations who want to critically understand an emerging technology. It is used to explore the tension between the potential of the technology and the environment in which it has to operate, to evaluate its impact, and to understand the potential future of the technology in various markets. Consequently, it can also be used for long-term planning by senior management in an organization where emerging technologies are a focus.

This work was inspired by our desire to formalise our methodology and approach to analysing technologies that are relevant to our work. We took inspiration from the Business Model Canvas [4].

The rest of this paper is laid out as follows. The first section discusses why such a framework is required. The second presents the canvas and demonstrates its use with emerging technologies. Finally, the third section draws conclusions and identifies further areas of research.

A Case for a Visual Framework

Throughout history, we have seen the emergence of many different technologies. Some have died and some have merely survived. However, some have woven themselves into today's world so well that we do not even notice them. An organization must be aware of those trends to interact with such technologies, to build on them, and sometimes even to shape them. During this process, such an organization greatly benefits from objective analysis, deeper understanding of the impacts, and estimations of the probable success of each emerging technology. In particular, these combine to help understand the risks to the adoption of a new technology. The response to emerging technologies often decides the fate of organizations. It pays to be thorough.

Assessing a technology is a subjective exercise. It is driven more by judgment and less by numbers. Risks include unknown-unknowns, overlooking side effects, and bias stemming from preconceived notions [1,3]. A successful analysis often results not from haphazard judgments but from a well-designed framework, a framework that asks hard questions, a framework that compares, contrasts, and criticises what is known and what is not.

One way of doing this is a list of questions [2]. Well-designed questions force us to think, provide honest answers, and consider multi-level effects. Here are some examples that are used to evaluate emerging technologies:

  • What causes the emergence of the technology (e.g. use 5-Whys [5])?
  • Who will benefit? Who will be in trouble?
  • What existing technologies are affected by the new technology and what existing technologies affect the new technology?
  • What resources and investments are needed for development or adoption of the technology?
  • Can current policy and law handle the deployment and side effects of the technology? How will policymakers react to the technology?
  • How long does it take to hit 16% of the market (“Crossing the Chasm” [6])?
  • What problems will it solve? What are the direct, second level, and nth level repercussions of those solutions [7]?
  • What revenue streams can it create for an organization that develops or adopts the technology? What business models can it enable?
  • What are the risks the technology faces in its development, funding, timing, and deployment? What is the biggest risk to the adoption?
  • What were similar trends? What happened to them?

While questions are a powerful tool, they have several weaknesses. They are not concise, which can hinder the aim of understanding the bigger picture. A visual representation, by contrast, is concise, compact, and comprehensible at a glance.

Furthermore, questions themselves, without a structured approach to guide them, do not help to cover the many diverse aspects of a given technology: we may not know what questions we have overlooked. A structured approach, into which the questions can be fitted, allows us to use questions as tools probing the different aspects of a phenomenon. In particular, a logical structure can help identify gaps in the structure itself and thereby lead to a deeper and more thorough analysis. A good example of this is the periodic table. A visual format can also emphasize the relationships between different aspects, which gives an additional chance to identify gaps.

While trying to build a structure for questions, we found inspiration in the Business Model Canvas [4]. Introduced by Alexander Osterwalder, it is a set of questions and a narrative designed to critically analyze the different aspects of a business model: the value proposition, customer segments, channels, and revenue pitted against activities and costs. Its strength lies in the visual narrative that links its questions into a coherent whole. It is designed as a one-page visual representation, to be printed out and used as a brainstorming tool.

Applicability and Examples

The approach that we are proposing has been developed with IT and software models in mind because these are the areas that the authors are using this framework to evaluate. However, there is nothing inherent in the approach we have taken that restricts this to just IT or software. We would encourage further research in other disciplines to identify if this proposed framework has value in other areas.

In this paper, we will use several example domains to demonstrate the use of the approach. The main domains that we have used to demonstrate the approach are Artificial Intelligence [18] and Serverless Computing [17]. Artificial Intelligence is a technology to build systems that from the outside seem to think and to act rationally. Serverless is a cloud computing approach whereby infrastructure and middleware are hidden from developers, enabling them to focus on code and leave issues such as scaling, security, and monitoring to the cloud provider. We also use some examples from blockchains [19], which are technologies to maintain distributed, decentralised immutable ledgers, together with automated contracts.

Emerging Technology Analysis Canvas

We propose a model we call the Emerging Technology Analysis Canvas (ETAC), which is built around similar concepts to the Business Model Canvas but targets emerging technology assessments. The ETAC provides a visual framework and a methodology for a structured analysis of emerging technologies. The process of filling out the ETAC encourages the researcher to answer a wide set of questions and creates a holistic approach to analysing emerging technologies. In addition, an ETAC may have a supporting document that details the references and analysis.

Based on our experience in working in several areas of emerging technology, we see that the success of an emerging technology depends on four main factors:

  1. The identification of a problem and related innovation that addresses the problem, which we call a trigger. (We consider that both the problem and innovation must come in conjunction because often the innovation changes our perception of the problem).
  2. The technology needs to have a significant potential impact. Often the impact may extend beyond the initial problem.
  3. The technology has to be feasible given the available resources.
  4. The technology has to navigate risks related to technology development and adoption. For example, the technology must develop and be adopted quickly enough to justify any investment.

As shown in the next figure, the canvas is built around the above four elements. Each subsection drills down into the details.

The graphical representation of the ETAC (Figure 1) is designed to be concise and compact. One can see a spectrum of aspects of a technology at a glance, on a single page. This encourages holistic comprehension and allows evaluators to look deeper and ask more informed questions about specific focus areas.

Figure 1: The Emerging Technology Analysis Canvas (ETAC)

Let’s explore the canvas in detail.

The ETAC is typically read (and filled-in) from left-to-right and then top-to-bottom. We discuss the creation of an ETAC further in the methodology section below. When we use the term organization, this can refer to either an organization or an individual that is using the ETAC to evaluate the technology.

We have observed that technologies are often triggered by exploring a solution to a limited problem. However, the solution often has much wider applications and evolves into a broader technology. For example, Bitcoin was built while trying to create decentralised digital cash, but has found wider use cases as an immutable database and a model for automated contracts. Another example is nuclear technology, which was originally productised for weapons but found other uses such as nuclear power and medicine.

The Opportunity section captures the problem and the solution that act as a trigger, the drivers that are positively affecting the technology, and the current state of the market.

Trigger: A problem and a solution that captured the wider imagination and later evolved into a broader technology promising to solve a wider set of problems.

Players: Organizations or individuals who are actively improving or using the technology to solve problems. For example, Amazon, Microsoft, and IBM are active players in serverless. We do not include the end users of the technology as players; instead, these are discussed under the supply chain.

Drivers: External forces that positively impact the technology, such as legislation. For this analysis, one can use the PESTLE framework [10], which considers the political, economic, socio-cultural, technological, legal, and environmental influences on a technology. Examples of drivers include cost savings, agility, productivity, automation, communication, trust, privacy, government policy, and law. Current industries and other emerging technologies can also act as drivers.

The Impact section of the ETAC covers the potential political, economic, socio-cultural, technological, legal, and environmental changes enabled by the technology. The ETAC considers two types of impact: macro and micro. The macro impact captures the effects of the technology on the world as a whole. The micro impact captures the impact seen by individual organizations as part of the supply chain. It is important to note that impact is analyzed with respect to the potential future and is not limited to the current state of the technology.

Macro Impact: This discusses the impact on the wider industry under the following themes.

Network Effects and Interactions: A technology has network effects if increased adoption increases the technology’s value for existing users, thus creating a positive feedback loop. For example, as more people join Facebook, more and more people are encouraged to join Facebook. Griffin [9] discusses network effects in detail.
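As a rough illustrative model (our addition, not taken from the ETAC or from [9]), Metcalfe’s law approximates the value of a network with n users as growing with the square of n:

$$
V(n) \propto n^{2} \quad\Rightarrow\quad V(n+1) - V(n) \propto 2n + 1,
$$

so the marginal value added by each new user grows as the network grows, which is the positive feedback loop described above.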

Disruptees: What technologies or industries will be affected by the emerging technology? The effects are twofold: industries that compete with the emerging technology will be challenged, while industries that complement it will be propelled. For example, AI may improve disease diagnosis, but at the same time that will reduce jobs for doctors. Some of the effects may be disruptive, where the affected technology or domain is significantly transformed. There are several impact areas we can consider: for example, we can use the PESTLE framework [10] in reverse and consider the political, economic, socio-cultural, technological, legal, and environmental impacts of the technology. We need to consider not only the first-level effects but also the nth-level effects, as described in [7]. It is also important to consider the crossover potential of the technology, where it can solve fundamental problems in a different industry or segment, which in turn unlocks a wave of advances. An example of this is blockchain, where a technology designed to solve the problems of decentralised money is being applied to areas such as identity, provenance, and smart contracts in many industries.

Micro Impact: This discusses the impact on the organization under the following themes.

Competitive Advantage: how does the emerging technology affect competition between organizations? For example, AI enables organizations to automate decisions and make better decisions, thereby helping them out-compete their competitors.

Financial Benefits: How would the technology affect the bottom line of organizations? The benefits come in two forms: the technology might make the organization more efficient, saving costs, or it might enable new revenue sources. For example, AI is being used to automate expensive human tasks, thus reducing costs significantly.

Supply Chain: This represents the activities carried out from raw materials and skills through to the point where the product or service is delivered and consumed by the end user. This section discusses how the technology affects the supply chain. Three parts of the supply chain are partners, suppliers, and customers. For example, with the rise of analytics technology, partners and suppliers may expect organizations to share data about how customers bought their products. Furthermore, new payment methods (such as blockchain) may cause customers to demand support for those methods. Since this is considered under the micro impact, the analysis focuses on the effects seen by the organization based on its placement in the supply chain. If the organization is a business-to-business (B2B) organization, the customers under consideration will be other businesses; for business-to-consumer (B2C) organizations, the customers will be individual consumers (end users).

While exploring the impact of a technology, we should ask how the technology can provide advantages such as cost savings, agility, productivity, integration, trust, and improved communication, whether as a direct or an indirect impact. Furthermore, we should explore whether the technology can reduce challenges such as skill shortages, privacy concerns, complexity, security risks, and monopolies, again directly or indirectly.

We have observed that technologies, while they are growing, create a promise to their prospective users. This idea is related to the “hype” in Gartner’s Hype Cycle. The promise is often bigger than the initial problem the technology was identified for, and often bigger than the current state of technology development. We consider the promise to include both the potential use cases and the benefits of the proposed technology. When we discuss the impact of a technology, we assume that the promise is technically possible.

Feasibility: Feasibility in the ETAC evaluates how likely the promise is to be realised technically. In other words, even if the promise is technically possible, is it likely to come to fruition? This includes three factors.

Technical Merit: This discusses the technical breakthroughs the technology has made as well as any technical limitations. For example, AI has achieved several breakthroughs, such as deep learning, which has helped surpass human accuracy on many problems. However, there are multiple technical challenges, including algorithmic attacks, the need for expert knowledge when tuning and applying the AI, and the significant time required for data cleanup.

Tools, Ecosystem, and Skills: This discusses the availability of the required skills, tools and best practices, and a community. An example of a community is an open source user forum. As the technology matures, tools and ecosystems will get better, which improves the odds of success. For example, blockchains have built a healthy developer community and tooling.

Friction: What kind of friction will the emerging technology face in its deployment? Here we consider only technical friction; nontechnical considerations are discussed under Risks. As an example, blockchains are seeing concerns over transaction rates and power consumption.

The Future section captures two key aspects of technology development.

Timeline: What are the possible key milestones in the technology’s development? For example, how long will it take for the core technology to be ready? How fast will adoption be? Blockchains, for instance, may need further breakthroughs before they are adopted widely, and it might take at least 5-10 years to reach those milestones.

Risks: What are the risks that might limit the technology’s deployment? This includes non-technical risks as well. We can think of these as the inverse of drivers, and just as with drivers, we can use the PESTLE framework [10] to identify them. An example of a risk is that regulators may restrict blockchains based on tax or money laundering legislation, even when those blockchains are designed to address aspects other than digital cash. Other risks to consider include the need for standards, privacy concerns, business models, current law and policy frameworks, lack of skills, complexity, security risks, monopoly, and vendor lock-in.

Summary: The Summary section discusses possible technology development and deployment scenarios while weighing the other parts of the ETAC. In the Opportunity section, we discussed drivers. In the Impact and Feasibility sections, we explored the potential of the technology and the technical reality. The Future section discusses risks and potential key milestones. The Summary section explores the tension between the potential of the technology and the environment in which it has to operate, in order to understand potential scenarios and their likelihoods.

Furthermore, the summary can be used to order outcomes by their likelihood. Some of the potential questions include: can the impact be significant? If yes, given the impact and the resources available, is the technology feasible? If yes, can the technology yield results before momentum is lost? If yes, can the technology navigate the risks associated with deployment and side effects? This section is the most reliant on expert judgment, as it is hard to find concrete evidence to justify future outcomes.

Just like the business model canvas, the Emerging Technology Analysis Canvas can be used to ask questions, brainstorm, and draw conclusions. It can also be used with post-it notes as a brainstorming tool.
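Because the canvas is essentially a structured template, it can also be captured in machine-readable form. The sketch below is purely illustrative (our own representation, not a format prescribed by the ETAC): a simple Python data structure whose fields follow the sections described above, filled in with a few entries from the serverless example later in this paper.

```python
# A hypothetical, minimal fill-in-the-blanks representation of an ETAC.
# Field names follow the canvas sections; the structure itself is our illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ETAC:
    technology: str
    # Opportunity
    trigger: str = ""
    players: List[str] = field(default_factory=list)
    drivers: List[str] = field(default_factory=list)
    # Impact
    macro_impact: List[str] = field(default_factory=list)   # network effects, disruptees
    micro_impact: List[str] = field(default_factory=list)   # competitive advantage, financial benefits, supply chain
    # Feasibility
    technical_merit: List[str] = field(default_factory=list)
    tools_ecosystem_skills: List[str] = field(default_factory=list)
    friction: List[str] = field(default_factory=list)
    # Future
    timeline: List[str] = field(default_factory=list)
    risks: List[str] = field(default_factory=list)
    # Summary
    summary: str = ""

# Example: a few entries drawn from the serverless analysis in this paper.
serverless = ETAC(
    technology="Serverless",
    trigger="Run a function in the cloud without creating or managing a VM or container",
    players=["Amazon", "Google", "Microsoft", "IBM"],
    drivers=["Success of cloud technology"],
    risks=["Lack of standards", "Vendor lock-in"],
)
```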

To demonstrate the use of ETAC, we have applied this tool to Artificial Intelligence (AI) and serverless domains. It is worth noting that AI is a technology that has a long history but still has the potential to disrupt the status quo. Hence our analysis is both retrospective and predictive. Serverless is a newer area with a short history and therefore is more predictive: the full impact of serverless is yet to be realised.

Example 1: Artificial Intelligence (AI)

Artificial Intelligence (AI) refers to computer systems that, from the outside, appear to think and act like humans and to think and act rationally. In other words, any system that can replace human decisions can be considered AI.

Figure 2: Artificial Intelligence ETAC

Opportunity: AI has fascinated humans for many years, and researchers have long tried to automate human analysis. Because it is considered an example of intelligent analysis, the game of chess was for many years treated as a proxy problem for AI. When IBM’s Deep Blue beat Kasparov in 1997, this was seen as a watershed in AI. More recently, Google DeepMind’s AlphaZero taught itself to play chess in four hours and instantly became one of the strongest chess-playing programs.

The AI market includes Amazon, Google, Microsoft, Facebook, and IBM as key players and many smaller players.

Automation enables us to make decisions faster and more cheaply by removing humans from the loop. The need for automation and the availability of computing and data are key drivers of AI.

Impact: At the macro level, AI has a symbiotic relationship with Big Data, the Internet of Things (IoT), and observability. It has been suggested that wide application of AI could bring forth an age of abundance in which basic human needs such as food, water, and warmth, together with many resources and professional services such as healthcare, education, and legal support, become abundant. It is certainly widely acknowledged that AI will displace many existing jobs.

At the micro level, considering the organizational impact, AI provides a competitive advantage via automation and agility. It provides financial benefits via automation, optimisations, and the creation of new revenue streams. Furthermore, AI affects the supply chain via data-sharing and insight-sharing partnerships between participants.

Feasibility: In terms of technical merit, AI has achieved several breakthroughs, such as deep learning, which has helped surpass human accuracy levels on many problems. Among the technical challenges are algorithmic attacks, the need for expert knowledge when tuning and applying AI, and the significant time required for data cleansing. A recently identified problem is that many AI systems perpetuate existing inequalities (and prejudices) because they are trained on data that includes those prejudices. This harms the reputation of AI and may also meet legal or policy roadblocks.

AI has a wide community and a rich set of open source tools. There are many practitioners, but the supply of skills still significantly lags the demand. Although there is a wide range of tools, they are still lacking in usability and in providing control across the full development lifecycle.

On the other hand, AI faces friction due to a lack of training data and the inability to explain why an AI made a particular decision. Furthermore, many people are skeptical of AI decisions, and sometimes hostile, because of AI’s potential to replace humans.

Risks: Foremost are the replacement of humans in many current job functions, which could lead to large-scale unemployment, and superintelligence, in which AI surpasses human intelligence by leaps and bounds. Privacy concerns pose a risk to AI because AI depends on large amounts of data and because AI can be used to infer sensitive information from seemingly unrelated data. Another risk is our lack of understanding of AI in competitive environments. For example, in use cases such as stock markets, the application of AI can degrade the situation because the use case becomes a battle between algorithms, creating scenarios such as flash crashes and increased costs.

Summary: After nearly 60 years of development, AI already has a wide range of applications. Considering the future timeline, AI will likely handle complex problems such as disease diagnosis, replacing or significantly reducing human involvement in many current job functions within the next 10-20 years. This could create significant wealth but could also create unemployment; the consequences are still unclear. General AI, on the other hand, is likely 50-100 years away.

Example 2: Serverless

In order to demonstrate the use of the ETAC, we have applied this tool to the serverless domain. This description is a summarised version of a full analysis of the serverless domain that we have undertaken.

Serverless is defined as Functions as a Service (FaaS): resources (compute and storage) as well as a complete cloud execution environment, including platform services, that provide “Resource Pooling,” “Rapid Elasticity,” and “Measured Service” in an opaque manner. These three terms are defined in the NIST cloud computing definition.

Figure 3: Serverless ETAC

Opportunity: Before serverless, to run a function in the cloud, users had to create a virtual machine or a container, configure it, and manage it. Serverless removes this requirement by allowing a developer to simply upload a program that is run in response to an incoming message or request. The market includes Amazon, Google, Microsoft, and IBM as key players, along with many smaller players. Among the drivers, the success of cloud technology is a key driver of serverless.
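To make the contrast concrete, here is a minimal sketch of the kind of function a developer would upload, written against AWS Lambda’s Python handler convention as one example (the event field used is hypothetical). The developer supplies only this code; provisioning, scaling, and monitoring are handled by the provider.

```python
# A minimal, hypothetical serverless function in the style of an AWS Lambda
# Python handler. The developer uploads only this code; the cloud provider
# decides where and how many copies of it run.
import json

def lambda_handler(event, context):
    # 'event' carries the incoming message or request; the 'name' field here
    # is purely illustrative.
    name = event.get("name", "world") if isinstance(event, dict) else "world"
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```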

Impact: At the macro level, serverless is likely to have a symbiotic relationship with containers, APIs, and event-driven architecture (EDA), but is likely to hurt middleware, DevOps, and IDEs. Serverless can drive functional disaggregation [8] by making API implementations easier. Furthermore, in disaggregated applications, co-locating application logic in the same cloud as the APIs can reduce the latency imposed by network calls. Hence, increased serverless adoption will place more and more APIs in the cloud, which is likely to create further adoption: it becomes more effective to build new applications in the cloud, creating a network effect that will drive workloads into the cloud.

At the micro level, serverless provides competitive advantage via agility. It provides financial benefits via reduced development cost, reduced total cost of ownership, and cost transparency.

Feasibility: The technical merits of serverless are built-in scalability and high availability. However, it has higher cold-start latencies and tail latencies. The EDA-based programming model imposed by serverless is hard to understand and debug. A lack of architectural skills, and of tools with integrated support for serverless development and debugging, is likely to be a challenge. On the other hand, serverless enables polyglot programming, increasing flexibility. It also reduces the abstractions programmers need to understand before starting, making it easier for newcomers to get up to speed.

Considering ecosystems and community, the primary challenge is tooling. Latency and vendor lock-in concerns can limit use cases but are unlikely to be show stoppers.

The friction serverless faces is mainly the requirement to rewrite and re-architect systems around the opinionated model behind FaaS.

Future: Serverless faces risks due to a lack of standards and potential vendor lock-in. Considering the future timeline, and assuming tools can catch up, we believe most new projects will use serverless within the next five years.

Methodology

So far, we have discussed what the ETAC looks like in its finished form. We have also discussed a number of tools that can be used at each stage. This section organises those concepts, together with a wider approach, into an outline methodology for building an ETAC for a specific domain. Inherently, this is not a prescriptive approach, because the holistic nature of the visual canvas encourages multiple paths. The aim of any methodology is to guarantee the validity of research; since the aim of the ETAC is to predict the future and to generate opinions, this is inherently impossible! The best we can do is to identify where the facts transition into predictions, judgement, and opinions. We encourage further exploration of this area.

Our overall proposed approach to using the canvas has three major components, which are typically created in three phases:

  1. Findings: these are specific, referenceable data points that are identified during research. Typically these are too detailed to appear on the visual canvas and can be captured in a supporting document.
  2. Observations: these are the amalgamation of individual findings into supportable statements that appear on the canvas. Generally, these appear in the Opportunity, Impact and Feasibility sections. These can be directly defended in terms of the findings.
  3. Conclusions: these are the result of critical thought and analysis. The conclusions include predictions and expert opinions derived from the observations, and they typically appear in the Future and Summary sections of the ETAC. They are justified by analysing the observations, but in general this is where the practitioners' experience, critical thinking, analysis, and opinions are used to create a forward-looking analysis of an emerging technology. A minimal structural sketch of this chain follows the list.
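The sketch below is purely illustrative (our own structure, not part of the ETAC itself). It shows how traceability between the three phases can be recorded, with each observation citing the findings that defend it and each conclusion citing the observations it is derived from; the sample entries paraphrase the serverless analysis above.

```python
# A hypothetical sketch of the findings -> observations -> conclusions chain.
# The record types and identifiers are our own illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class Finding:
    id: str
    source: str      # reference captured in the supporting document
    statement: str   # specific, referenceable data point

@dataclass
class Observation:
    id: str
    statement: str           # supportable statement that appears on the canvas
    supported_by: List[str]  # ids of the findings that defend it

@dataclass
class Conclusion:
    statement: str           # prediction or expert opinion (Future / Summary)
    derived_from: List[str]  # ids of the observations it is justified by

f1 = Finding("F1", "literature survey (see supporting document)",
             "FaaS platforms exhibit cold-start latency")
o1 = Observation("O1", "Serverless has higher cold-start and tail latencies",
                 supported_by=["F1"])
c1 = Conclusion("Latency concerns can limit use cases but are unlikely to be show stoppers",
                derived_from=["O1"])
```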

Each ETAC section includes some useful questions that can be asked as part of the analysis of each section. Applying the ETAC approach to a technology domain and completing the canvas includes the following steps.

  1. Creating or ascertaining a working definition of the technology to make it clear what is included and not included.
  2. Conducting a literature survey to understand the opinions and evidence expressed with respect to the technology. This step belongs to the first and second phases (findings and observations). The literature review can be focused by using it to answer the questions asked by the ETAC; this generates the findings. When we use the ETAC, we specifically reference the findings and use them to defend the observations in the Opportunity, Impact, and Feasibility sections. The references to related sources and/or primary evidence can be captured in a supporting document if needed.
  3. In the Future and Summary sections, we draw conclusions from the findings and observations. Hence this step belongs to the conclusions phase. We consider the Opportunity, Impact, and Feasibility sections of the ETAC holistically, apply critical thinking, include expert observations, and create conclusions. The conclusions can be justified in more detail in the supporting document.

In the rest of the methodology, we will discuss some tools that are useful in the third step (the conclusion phase). The last step is the most challenging part of the ETAC as it moves from observations and clearly justified conclusions into predictions and opinions.

One such tool is to evaluate if the core technology is ready, and if not, how soon it is likely to be. Unless the core technology is ready, the next steps may depend on technological breakthroughs or developments in production-readiness. These aspects are hard to predict. An important concept that can be used in this analysis is Dominant Design [11]. Dominant design postulates that there are a few critical technological features that will become a de facto standard. We can explore whether a dominant design has been achieved.

Another useful set of tools to be used in the third step are frameworks such as the Gartner hype cycle [12], the Carlota Perez Framework [13], and the EU Technology Readiness Levels framework [14].

A related tool for evaluating the technology is Rogers’ Five Factors, which analyse adoption rates: Relative Advantage, Compatibility, Simplicity, Trialability, and Observability [15]. In addition, Rogers’ diffusion of innovations curve [15] allows analysis of the status quo.

The final tool that we highlight here is to use the following factors to evaluate the timeline:

  • Critical mass required to cross the chasm [6] and timelines.
  • The impact from megatrends (trends such as digitization, urbanization, globalization, climate change, aging population).
  • The impact of specific technology trends (for example the rise of cloud computing).
  • The capital requirement, production costs, and delivery models.
  • Dependencies of the technology on other building blocks (perhaps from other domains). For example, electric cars require a network of charging stations, as well as increased electricity production.

The methodology of using the ETAC is an amalgamation of the above approaches to answering the questions, combined with treating the ETAC as a holistic approach in which the visual layout helps validate cross-cutting concerns. In addition, expert practitioners will bring other approaches and techniques to bear. When using the ETAC, we may produce a separate supporting document, which elucidates the references, data points, and arguments. The visual canvas and the optional supporting document form the output of the process.

Together these approaches offer a method of using the ETAC to document an emerging technology area and to draw conclusions in a structured and defensible manner.

Related Work

We have already mentioned much of the related work in previous sections. In many ways, we see the ETAC approach as a visual framework that can incorporate many other tools in its usage. The Business Model Canvas helps evaluate a business model as a whole. The ETAC does not attempt to understand a specific business’s success with a technology, but instead looks at the technology across multiple players.

The Hype cycle [12], Carlota Perez Framework [13], and the EU Technology Readiness Levels framework [14] offer a way of looking across multiple technologies. These can be used to feed into the ETAC which is designed to evaluate a single technology space more closely.

The PESTLE framework [10] looks at a technology from multiple perspectives but does not provide the same four-stage approach to understanding likely adoption.

Wardley Maps [20] provide a valuable approach to understand the evolution of successful technologies. We consider the ETAC to take an earlier role in the evaluation and to help understand which technologies will migrate from the Genesis and Custom Built phases of the Wardley Maps into productization and commoditization.

Conclusions and Further Research

In this paper, we have outlined a new approach to analysing emerging technologies. The Emerging Technology Analysis Canvas (ETAC) provides a framework to holistically and objectively analyze and predict potential outcomes for emerging technologies. It includes questions that highlight different aspects of an emerging technology and a visual narrative that connects those questions into a coherent whole.

This is an early report on this approach, and we see a need for further research. Firstly, we encourage adoption, evaluation, and improvement of the ETAC. We are currently applying it to a number of emerging technology domains with the joint objectives of evaluating those domains and enhancing the ETAC. In addition, we would welcome experimentation to see if the ETAC approach has any validity outside the software and IT areas.

We are also doing further research to see if using the ETAC across multiple new technologies allows a fairer comparison of the merits and potential of different technologies, to enable better resourcing and investment decisions.

Update 03/12/2019

Released Future outlooks based on ETAC.

References

  1. Buster Benson, “Cognitive Bias Cheat Sheet”, https://betterhumans.coach.me/cognitive-bias-cheat-sheet-55a472476b18, Accessed October 2018
  2. Frank Sesno, “Ask More: The Power of Questions to Open Doors, Uncover Solutions, and Spark Change”, https://www.amazon.com/Ask-More-Questions-Uncover-Solutions-ebook/dp/B01HUER128/, Accessed October 2018
  3. “Charlie Munger on the Psychology of Human Misjudgement”, https://buffettmungerwisdom.files.wordpress.com/2013/01/mungerspeech_june_95.pdf, Accessed October 2018
  4. Alexander Osterwalder, “Business Model Canvas”, https://strategyzer.com/canvas/business-model-canvas, Accessed October 2018
  5. “5 Whys”, https://en.wikipedia.org/wiki/5_Whys, Accessed October 2018
  6. Geoffrey A. Moore, “Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers”, https://www.amazon.com/Crossing-Chasm-Marketing-High-Tech-Mainstream/dp/0060517123, Accessed October 2018
  7. “Second-Order Thinking”, https://fs.blog/2016/04/second-order-thinking/, Accessed October 2018
  8. “The Rise of APIs”, https://techcrunch.com/2016/05/21/the-rise-of-apis/, Accessed October 2018
  9. Tren Griffin, “Two Powerful Mental Models: Network Effects and Critical Mass”, https://a16z.com/2016/03/07/network-effects_critical-mass/, Accessed October 2018
  10. “What is PESTLE Analysis? A Tool for Business Analysis”, http://pestleanalysis.com/what-is-pestle-analysis/, Accessed October 2018
  11. “Dominant Design”, https://en.wikipedia.org/wiki/Dominant_design, Accessed October 2018
  12. “Gartner Hype Cycle”, https://en.wikipedia.org/wiki/Hype_cycle, Accessed October 2018
  13. “The Carlota Perez Framework”, https://avc.com/2015/02/the-carlota-perez-framework/, Accessed October 2018
  14. “EU Technology Readiness Level”, https://en.wikipedia.org/wiki/Technology_readiness_level, Accessed October 2018
  15. Everett M. Rogers, “Diffusion of Innovations”, https://www.amazon.com/Diffusion-Innovations-5th-Everett-Rogers/dp/0743222091, Accessed October 2018
  16. Kopecky et al. “A history and future of Web APIs”. https://researchportal.port.ac.uk/portal/files/681058/ITIT_13_1035_1.pdf, Accessed October 2018
  17. Mike Roberts, “Serverless Architectures”, https://martinfowler.com/articles/serverless.html, Accessed October 2018
  18. The fourth industrial revolution: a primer on Artificial Intelligence (AI), https://medium.com/mmc-writes/the-fourth-industrial-revolution-a-primer-on-artificial-intelligence-ai-ff5e7fffcae1, Accessed October 2018
  19. Chris Berg et al, The Blockchain Economy: A beginner’s guide to institutional cryptoeconomics, Accessed October 2018
  20. Simon Wardley, An introduction to Wardley (Value Chain) Mapping, https://blog.gardeviance.org/2015/02/an-introduction-to-wardley-value-chain.html, Accessed October 2018

Original Authors

Original authors are Srinath Perera (srinath@wso2.com), Paul Fremantle (paul@wso2.com), Frank Leymann (frank@wso2.com), Joanne Jenkins (joanne.jenkins@port.ac.uk).

Acknowledgments

Many thanks to Selvaratnam Uthaiyashankar and Nuwan Bandara from WSO2 who have provided significant and useful feedback.

Help us improve ETAC

We welcome and appreciate any feedback, changes, or contributions. Please send a pull request, create a GitHub issue, or send a mail to srinath@wso2.com.

To receive updates to ETAC and ETAC-based emerging technology analysis, please subscribe to our Global Technology Outlook Updates Newsletter.

About WSO2 and Our Research

In 14 years in business, and as the #1 Open Source integration and API management vendor, WSO2 continually invests in areas beyond products. Our Research for Integration team regularly produces our Global Technology Outlook (GTO) — an effort to identify emerging technology trends, classify them based on their potential, and assess a relevant subset of technologies in detail. We do so using our Emerging Technology Analysis Canvas (ETAC). Our research is public in an effort to maintain our commitment to openness, quality, innovation, and integration leadership.