Monthly Archives: July 2019

Innovation and Academic Health Science Centres: some policy thinking

To some extent, Academic Health Science Centres [AHSCs] are caught between research push and market pull.

If they prioritise technology transfer, they opt for a research push approach that emphasises making technologies or innovations available to market-based actors on the basis of potential commercial application. The university mission within an AHSC will emphasise the quality of the technology rather than end-user or market benefits, and so fails to address the adoption opportunities arising from the healthcare service mission.

The primary weakness of ‘research push’ is that the acceptance of new technologies generally depends more on social and cultural factors within clinical communities than on the merits of the technology itself.

On the other hand, the transformation of research into innovations that can be used to solve problems facing practitioners and patients is ‘market pull’, or perhaps more precisely ‘solution’ pull. Internal research and development communities of an AHSC need to be closely linked to the problems faced by practitioners and patients.

However, researchers often lack the inclination to pursue the innovation exploitation agenda. Indeed, the current focus on adoption and the translation of research arises precisely because research productivity has in the past been favoured over solving real-world problems. In healthcare, the problems needing solving are swamped by a vast sea of research, and many governments continue to fund research while wondering why they slide down the innovation scale – they value academic citations over patents, for instance.

Taking these different configurations of AHSCs into account, the organisational options also need to consider how innovations move from bench to bedside. A “gated”1 scientific and market-based review process with in-house industry expertise and a network of extramural experts for assessments would create a degree of granularity, enabling assessment of benefit from the initial insight (pure research) through translation to end-user benefits. Some well-known institutions use gated processes to filter research to identify innovations, or to assess market-readiness of research for commercialisation.
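The mechanics of such a gated process can be sketched in code. This is a minimal illustration, in which the gate names, criteria and scoring thresholds are assumptions made for the sake of the sketch, not any institution’s actual review criteria:

```python
# Minimal sketch of a "gated" review pipeline: an innovation must pass
# every gate in sequence, and failing any gate is fatal at that stage.
# Gate names and criteria here are illustrative assumptions.

def scientific_merit(innovation):
    # Hypothetical 0-5 evidence-quality score from internal reviewers.
    return innovation.get("evidence_quality", 0) >= 3

def translational_potential(innovation):
    # Does it address a problem faced by practitioners or patients?
    return innovation.get("addresses_clinical_problem", False)

def market_readiness(innovation):
    # Assessment with in-house industry expertise / extramural experts.
    return innovation.get("ip_position_clear", False)

GATES = [
    ("scientific merit", scientific_merit),
    ("translational potential", translational_potential),
    ("market readiness", market_readiness),
]

def review(innovation):
    """Return (passed, stage): only innovations that clear every gate
    (the gates act as filters) pass through to the end."""
    for name, gate in GATES:
        if not gate(innovation):
            return False, name
    return True, "all gates passed"

candidate = {"id": "A", "evidence_quality": 4,
             "addresses_clinical_problem": True, "ip_position_clear": True}
print(review(candidate))
```

The sequencing is the point: the process creates granularity, because a failure records the stage reached, from initial insight through translation to end-user benefit.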

In the absence of internal gated processes, institutions may use external expertise. One example is to out-source the technology transfer process to market-facing intermediaries or commercialisation agents, which brings knowledge of markets through the retained third party. As the appetite for risk in life sciences among the private equity and venture capital communities fluctuates over time, specialist groups are likely to emerge to act as agents to commercialise intellectual property. Many universities have structured such relationships with firms that develop their intellectual property on a licensing basis, so the commercial benefits accrue mainly to that firm and less to the university, which may receive royalties.

AHSCs, though, combine universities and hospitals, and so need to harmonise, where possible, differing conditions of employment, as these firms could otherwise be seen as exploiting the administrative gaps. The technology transfer route, which is very widely used, takes the context for commercial and entrepreneurial exploitation away from the AHSC. This means that AHSCs will fail to build internal capacity to assess intellectual property and to know what to do with it.

An out-sourcing approach is useful when there is little internal interest or, more likely, ability in commercialisation. The critical task for any AHSC is to make sure it has a view of the value of its own work; an intellectual property audit, for instance, is often necessary. If the AHSC lacks the ability to understand this, that would be evidence that it has failed to internalise the clinical service priorities of the hospital partner.

A risk facing any AHSC measured by an innovation metric that tracks spin-outs, licences or patents (rather than papers published or citations) is that such measurement may encourage premature commercial activity, which can take the form of single-technology companies (“one-trick ponies”). These types of start-up generally have high failure rates and longer times-to-market; they may also be encouraged by a preference for simple licensing deals, reflecting commercial naivete, impatience or lack of interest (which could be evidence of complacency). Regrettably, metrics such as these can actually hamper the realisation of the translational research agenda and produce less innovation, since they measure the wrong thing – companies rather than solutions. A gated process would, or at least should, determine whether this was happening.

AHSCs with a strong entrepreneurial perspective may choose to develop their own venture funds to develop and ‘de-risk’ innovations to make them attractive for subsequent acquisition. This is an attractive option as it also encourages the development of a domestic market in which to invest. Whole countries, too, have pursued venture funds as a national strategy with more failure than success.2

The US dominates life sciences research for a variety of reasons. One of these may lie in institutional flexibility tied to the market, which may be a partnering factor for AHSCs. Many well-established and highly differentiated not-for-profit research commercialisation institutes have been formed in the US, such as MITRE (from MIT) and SRI (from Stanford University), or endowed facilities have been created, such as Battelle and the Howard Hughes Medical Institute, while others have emerged as specialists at exploiting particular knowledge domains, such as the Santa Fe Institute for complexity and chaos theory. Comparable facilities in other countries that are separate from government are few, as most are tied in one form or another to state paymasters, such as Germany’s Fraunhofer and its constituent institutes.

In many countries, one must wonder why there is not greater entrepreneurialism in creating novel organisational entities to take innovation agendas forward, as what such institutions offer is a better way of bringing innovations to market. The challenge for policymakers, though, is distinguishing between enabling such institutions to exist and function within a mandate with a high level of autonomy, and government determining what they do.

In the end, the options for AHSCs may be constrained by the public funding rules and have little to do with innovation itself.

Setting the policy agenda for AHSCs

Underpinning the notion of an AHSC as a nexus of innovation is whether such a nexus will attract talent and entrepreneurial zeal. Obviously a process of development, extending perhaps over a number of years, may be necessary.

A policy agenda for AHSCs would entail thinking about the following:

Should life science research funding be aligned to favour AHSC-type arrangements?

Obviously this would lead to non-AHSCs losing funding, as well as encouraging migration of researchers toward AHSCs. This may not be compatible with national policy goals on local employment and wealth creation based on simplistic notions of clustering.

We need to ensure that other centres do not suffer a skills drain, as AHSCs can quite easily poach talent: the most able people will be attracted to the superior opportunities AHSCs are likely to offer. Increasingly, “brain circulation” (talented people moving from country to country and perhaps back home again) describes how researchers move about, rather than the narrow and parochial “brain drain”. AHSCs are better positioned than other sectors to exploit the global mobility of talent, while research funding and innovations forum-shop for the most favourable locations – an AHSC needs to be seen as a most-favoured location.

Should the funding and performance management of higher education and healthcare systems take account of AHSCs’ tripartite mission?

Since they are expensive and disproportionately consume the healthcare budget, they may need to be judged by different performance standards. It might be better for AHSCs to be accredited or recognised through explicit criteria rather than through self-certification.

Health professions education in traditional teaching hospitals should be replaced by AHSC-supervised training arrangements; the logic here is to ensure that students have access to the best and most appropriate clinical learning opportunities, within structured “clinical teaching” centres in healthcare providers. That hospitals are monopoly suppliers of clinical placements limits training opportunities, but a focus on quality should prune that tree.

In addition, this would enable greater career mobility between academe and clinical service, even if such mobility challenged academic appointment criteria, or public sector employment requirements. Enabling greater flexibility here could encourage more entrepreneurs without losing them from training as well as create greater visibility of the value of entrepreneurialism within professional training.

Are current national restrictions on the ownership or management of hospitals and universities hampering the development of AHSCs?

In Europe, universities and hospitals would benefit from new ways of organising their interconnected missions, but there is much to be done to understand how they are evolving and what national forces are shaping them, as they are, in the main, subject to the will of the state.

Investigation is needed to identify the performance, role and function of AHSCs in Europe, and to understand whether they are in fact a nexus of innovation or a quagmire of bureaucratic interference, as this could be a rate-limiting factor in innovation development. The generally poor performance of European universities in international rankings may suggest the latter, and a misuse of public money.

The potential scope of AHSCs comprises innovations in technologies impacting clinical care (software, medical devices, medicines), and ways of working (demarcation of health professions, clinical workflow). It is necessary to review relevant policy environments to learn at least [1] whether policies enable or inhibit high performing AHSCs where they exist, [2] whether policies inhibit AHSCs coming into existence, and [3] whether policies have perverse consequences on research and innovation production.

What is the best way to design and constitute an AHSC?

The preferences outlined here seek to understand the form/function balance, but we need more empirical evidence within the models to assess whether there is a critical size below which an AHSC may be ineffective in terms of mission attainment. Size alone may not be as important as the ability to align the various components as needed, which is more a function of autonomy.

Nevertheless, size does matter to the extent that a small dysfunctional academic/hospital network or partnership will only become a small dysfunctional AHSC. This gives us one reason why we need something better than sui generis self-certification: claims of excellence need evidence.

Notes

1 A ‘gated’ review process involves assessing an innovation at different stages using specific evaluative criteria. Failing to pass a gate is fatal at that stage, so only innovations that have passed all the gates (which may also be thought of as filters) pass through to the end. Gated innovation processes are used by scientifically-oriented organisations such as NASA and military defence agencies. A gated process must have a failure regime to be meaningful, which has consequences for performance assessment of research productivity.

2 See as an example: D Senor, S Singer, Start-up Nation: the story of Israel’s economic miracle. Council on Foreign Relations/Hachette, 2009. This, though, needs to be balanced against the more cautionary perspective on the improper role of government in commercialisation in J Lerner, Boulevard of Broken Dreams: why public efforts to boost entrepreneurship and venture capital have failed and what to do about it. Princeton, 2009. A useful comparison of venture funding relevant to this discussion is J Lerner, Y Pierrakis, L Collins, AB Biosca, Atlantic Drift: venture capital performance in the UK and the US, NESTA, June 2011.

Managerial control of medicines cost drivers

It is not unreasonable to have concerns about the cost of medicines.

Drug costs are usually influenced by government policies on the pricing and reimbursement of medicines. These range from simple discount-seeking to more complex approaches such as conditional approvals and value-based pricing (perhaps a subject for another posting). They can achieve a measure of drug cost control, but may also distort the market for medicines.

For instance, tendering for generic medicines can sometimes lead to unacceptable consequences, such as unexpected product substitution by suppliers, patient and clinician confusion as medicines change appearance, and complications in medicines management for pharmacists. A ‘winner takes all’ award of contract can mean that the losers exit the market, removing a source of price competition and choice for consumers and governments. This is an unintended but avoidable consequence of using this crude procurement instrument.

Regulation and health technology assessment together challenge free pricing of medicines, but it is unsurprising that medicines should be subject to some assessment of efficacy and performance in the real world, and not just on clinical trial evidence drawn from a highly selected study population. HTA has also thrown into the spotlight the logic by which drug prices are set by the pharmaceutical industry. This scrutiny is not a bad thing, as it highlights the methodologies used and whether they accurately produce a price reflecting the value of the medicine as used. Separately, the cost of the research to produce the medicine is a factor, and one should not be surprised that the prices of successful drugs try to recoup the costs of all the failed drug research, even if those costs could be seen as the price of the risk of doing business for the industry.

Apart from these approaches to drug cost control, there are opportunities to reduce costs within the healthcare system itself.

Improved cost control, value for money and improved health outcomes are consequences of better management of medicines procurement, patient adherence, dispensing and waste reduction, and of reduced variation in prescribing practices.

These are process and organisational interventions designed to enable improved professional practice through hospital formulary controls and best practice in medicines logistics. They reduce prescribing variance, strengthen quality systems and improve patient acceptability, whilst strengthening the foundations of professional practice.

The following “logic map” shows how this works:

A central feature of any high-performing healthcare system or organisation includes best-practice in medicines use and clinical management.

As all aspects of healthcare are under varying degrees of financial stress, cost controls and appropriate use of medicines are a legitimate focus of scrutiny to achieve the highest standards of clinical practice and safe patient care.

Failure to achieve clinical and managerial control of the use of medicines across the patient treatment pathway may arise from:

  • misuse of medicines (failure to prescribe when appropriate, prescribing when not appropriate, prescribing the wrong medicine, failure to reconcile medicines use across clinical hand-offs)
  • “clinical inertia” and failure to manage patients to goal (e.g. management of diabetes, and hypertension post-AMI) [see for example: O’Connor PJ, Sperl-Hillen JM, Johnson PE, Rush WA, Blitz WAR, Clinical inertia and outpatient medical errors, in Henriksen K, Battles JB, Marks ES et al, editors, Advances in Patient Safety: From Research to Implementation Vol 2: Concepts and Methodology, Agency for Healthcare Research and Quality, 2005]
  • failure to use or follow best-practice and rational prescribing guidance
  • lack of synchronisation between the use of medicines (demand) and procurement (supply), with an impact on inventory management and
  • loss of cost control of the medicines budget.

The essential challenge is ensuring that the healthcare system and its constituent parts are fit for purpose to address and avoid these failures or at least minimise their negative impact.

Medicines costs are the fastest growing area of expenditure and comprise a major constituent of patient treatment and recovery.

The cost of drug-related mortality and morbidity in the USA was described in 1995 [Johnson JA, Bootman JL. Drug-related morbidity and mortality: a cost-of-illness model. Arch Intern Med. 1995;155:1949-56], which put the impact at $76.6 billion per year (greater than the cost of diabetes).

The study was repeated five years later [Ernst FR, Grizzle A, Drug-related morbidity and mortality: updating the cost of illness model, J Am Pharm Assoc. 2001;41(2)] and the costs had doubled.

And costs and use have continued to rise since then.

Evidence from a variety of jurisdictions suggests that the share of drugs within the total cost of illness can be substantial, for instance:

  • Atrial fibrillation: drugs accounted for 20% of expenditure [Wolowacz SE, Samuel M, Brennan VK, Jasso-Mosqueda J-G, Van Gelder IC, The cost of illness of atrial fibrillation: a systematic review of the recent literature, Europace (2011) 13(10):1375-1385]
  • Pulmonary arterial hypertension: drugs accounted for 15% in a US study [Kirson NY, et al, Pulmonary arterial hypertension (PAH): direct costs of illness in the US privately insured population, Chest, 2010; 138.]

There are upward pressures that increase costs, downward pressures that decrease costs and pressures that influence costs in either direction; the diagram illustrates a few:

Many of the drivers can be addressed through a combination of professional staff development, better use of information, particularly within decision-support systems to support guidelines and prescribing compliance, and organisational interventions.

An interventional strategy to manage medicines cost drivers involves a structured review of central drivers of drug cost and use within existing national or organisational priorities.

The range of possible solutions falls across a spectrum of interventions, and any or all of these are good starting points:

  1. development of drug use policies
  2. development of clinical policies, guidelines, and clinical decision-support algorithms
  3. drug-use evaluation studies
  4. clinical and medical audit
  5. cost-benefit studies
  6. professional development
  7. procurement effectiveness performance review
  8. patient treatment pathway analysis
  9. analysis of waste reduction opportunities
  10. management/organisational improvements to support appropriate behaviours.

The starting point is to assess the current state of these aspects and determine any gaps against national or organisational policy, or evidence-informed best practice. Measurement of this gap then becomes the focus, as a proxy measure of the necessary changes, and requires evidence of current practice against the desired goal. In many cases, where systems are weak or poorly performing, a comprehensive root-and-branch review may be needed, with a corresponding impact on existing managerial, organisational and professional practice.
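The gap assessment described above can be sketched as a simple scoring exercise. This is a hypothetical illustration: the intervention areas are a subset of the list above, while the 0-5 scores and the review threshold are assumptions:

```python
# Sketch of a medicines-management gap analysis: score current practice
# against a desired goal for each intervention area and rank the gaps.
# The scores (0-5) and the review threshold are illustrative assumptions.

current_state = {
    "drug use policies": 2,
    "clinical guidelines and decision support": 1,
    "drug-use evaluation": 3,
    "procurement effectiveness": 2,
    "waste reduction": 1,
}
# Desired goal: evidence-informed best practice on every dimension.
desired_state = {area: 5 for area in current_state}

gaps = {area: desired_state[area] - score
        for area, score in current_state.items()}

# Largest gaps first: these become the focus of the intervention strategy.
priorities = sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)
for area, gap in priorities:
    print(f"{area}: gap {gap}")

# A large average gap might signal the need for a root-and-branch review.
needs_root_and_branch = sum(gaps.values()) / len(gaps) >= 3
```

The point of the sketch is only that the gap, not the absolute score, drives prioritisation, which is why evidence of current practice is a prerequisite.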

All healthcare systems and organisations are different, and whilst it is difficult to quantify the outcomes precisely in advance, organisations undertaking a sustained process of medicines review and optimisation should be able to release 10% or more of existing drug expenditure.

In organisations with a less-well-developed clinical pharmacy, where medicines information systems are immature and clinical guidance is not proceduralised, greater savings are likely, perhaps 25% or more, reflecting the possibility that the lack of information conceals upward cost drivers and masks inefficient medicines management, misuse and waste.

In the longer run, healthcare organisations will need to ensure sustainability of any medicines optimisation review, by ensuring strong organisational structures, practices and behaviours. Development of these frameworks is an important by-product of medicines optimisation interventions, with a corresponding improvement in medicines safety.

Healthcare Cognology: autonomous agency for patient empowerment and system reform

Healthcare systems have been slowly evolving toward a model of care delivery that seeks to leave behind the traditional medical model, based on fighting diseases – sometimes called the lesion theory of medicine – which has driven healthcare thinking since the 1800s.

The direction of travel is toward a health ecology model, which conceives of healthcare as helping people live their lives well, seeing ill-health and disease within an ecology comprising the choices people make, the context in which they lead their lives and, importantly, the central role of the individual within that ecology in deciding how healthcare should be organised to help them lead this life. In that respect, the ecological model is more in tune with the real, complex nature of the world, with the various parts working together in a self-organising manner to achieve desired results. This contrasts with the never-ending top-down plans of state-run systems, which fail to harness the forces in society to drive quality and performance improvement of outcomes.

Self-care has been the main policy response to the realisation of this complexity and we have examples such as the expert patient, patient activation, patient reported outcome measurement, disease or care management programmes, managed care, and health promotion and lifestyle programmes.

At present, many health systems and policymakers are focused on chronic ill-health or long-term conditions, which entail continuing healthcare requirements, perhaps over the lifetime of individuals, with varying degrees of support and perceived unsustainable funding needs. Many long-term conditions arise, in part, from lifestyle choices, which explains the focus on engaging the patient in the care process: to ensure that they are inclined to make the choices necessary to avoid further exacerbations of their conditions, or indeed to avoid these conditions in the first place. Another goal of self-care is to achieve a policy-driven cost-shift to the patient user, exploiting financial co-payments, for instance, to alter behaviour, in the spirit of libertarian paternalism.

The California Healthcare Foundation has stated [www.chcf.org/topics/health-it]: “information technology is still fairly new and untested in health care, making experimentation, analysis and evaluation critically important”.

We know technology helps to enable not just efficiencies and effectiveness, but also the greater personalisation of services – consumerisation. The impact of technology, therefore, includes, but is not limited to:

  1. breaking down (or disintermediating) processes to remove steps that do not add value to the end-user experience, or which have no useful role to play, despite being seen as current good practice by professionals; this can create novel service integration
  2. shifting skills toward customer-facing staff (e.g. consider how different banking has become)
  3. widening access for patients to hitherto restricted health information, including information on clinical performance. In some cases, this has been mandated (such as public information on hospital performance) or has evolved in response to customer interest (such as health websites providing information and advice on health conditions)
  4. enabling organisations to create new ways to engage with the consumer or end-user more effectively in improving products and services than the traditional customer/supplier relationship.

A particular impact is relevant in healthcare, namely, moving knowledge across the boundaries of regulated professions (e.g. from radiologists to imaging technologists, from doctors to nurses).

Healthcare is highly controlled and the application and use of professional knowledge legally regulated. The effect of this has been to compartmentalise knowledge and skills within a broad hierarchy, with the doctor at the top, in effect, as the default health professional who supervises and validates the application of knowledge and skills by other professions. This, of course, is changing, partly as a response to the sheer complexity of healthcare and the levels of knowledge and skill involved, but also through new ways of working, in teams, and across organisational boundaries, with skilled nursing care facilities, polyclinics, etc. The patient, though, has not been an immediate beneficiary of this.

As knowledge has migrated from people into devices, we have seen the invention of patient-use devices performing tests which in the past required sophisticated testing and professional knowledge; an obvious example is the pregnancy testing kit, and many mice and rabbits are no doubt relieved at its invention.

Embedding knowledge in devices in healthcare, and thereby the potential of the internet of things within hospitals and for patients, unbundles knowledge cartels and redistributes knowledge.

Putting knowledge into people means training them. This can shift knowledge to other professionals, as in interventional radiology (image-guided surgery), whereby surgeons interpret imaging results in theatre, replacing a separate radiologist. Knowledge can also be given to patients, often simply by enabling them to access more knowledge and insight; this has been a key impact of the internet, and one which has raised many issues around the quality of health information online.

Knowledge can be put into devices, which can be used by patients and consumers, and where the device does what a health professional used to do. This is the artificial intelligence revolution.

Finally, technology can enable knowledge to be put into ‘systems’ to generally interact with people, such as in the home, or hospital, for instance; it is the embodiment of smart devices within systems that offers particular benefits.

The Internet of Things, for want of better terminology, can help achieve greater personalisation of service delivery and move toward such notions as the Smart Hospital and the Smart Home to support the Smart Consumer.

Why do we want this greater personalisation within a healthcare context? Because evidence demonstrates that customising services is effective – patient outcomes are improved, patient experience is positive, and the provider gets better value for money.

Personalisation has the potential to be enabled through autonomous agents acting on behalf of patients, enabling the patient/consumer to drive their preferences and choices, rather than these emerging through professional delegation or proxy interpretation. Is this Alexa or Google’s Assistant on steroids?

A vast array of device technologies are used in healthcare, particularly in hospitals, probably the most complex organisations in our society. A known priority within healthcare is to integrate the vast sea of information produced, whether conclusions by clinicians, activities of patients, the output of devices, or underlying information such as financial performance, inventory, or quality. Progress is slow and mixed.

E-health has largely failed to gain substantial traction, either as a mode of service delivery or commercially, despite being seen as having considerable potential through enabling better linkages between operational parts of the healthcare system and the patient. Despite evident progress, this remains work in progress.

There are many approaches to integrating information across the information value chain, with the electronic health record (EHR) seen as key from a clinical perspective, along with opportunities for real-time monitoring of patients outside hospital through sensors, or interacting with patients through video teleconferencing. Most countries are grappling with how to enable patient access to the EHR, with concerns around identity determination, privacy regulation and security being central; but this debate is being carried by the healthcare providers and their regulators, who see the EHR as belonging to them, and not as something owned and controlled by the patient.

Electronic prescribing is seen as reducing medical errors and better correlating patient data with rational prescribing, but the benefits to patients are limited, in the main, to electronic delivery of the prescription to the pharmacy of their choosing – a choice that is already theirs and is not enhanced by e-prescribing itself. The benefits accrue from reduced processing time, or from commercial capture of the prescriptions through co-location of pharmacies and prescribers, which in the end rather defeats the point from a patient perspective.

Other areas, such as care management programmes, use remote monitoring, SMS alerts and so on, but little of this is really new: they mainly automate existing activities and facilitate better communication.

Let’s consider starting in a different place.

I am mindful of underlying clinical requirements in the hospital, such as linking the dispensing of a medicine to a patient (informed through clinical decision-support prescribing systems and documented accordingly) with bed-side capabilities to ensure the right patient gets the right medicine, and linking that in turn back to batch control and inventory control, budgeting and procurement, not to mention links to quality assurance, audit and utilisation review. And should the patient react badly to the medicine, batch control can help identify any problems with the medicine itself, such as expiration date, or even whether it is counterfeit. How are we to design a system that seamlessly makes all this work?
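A minimal sketch of the chain of linkages just described, from the prescription through bedside checks to batch and inventory control; the field names, checks and sample data are hypothetical, not a real hospital system:

```python
# Sketch of linking bedside administration back to prescribing, batch
# control and inventory. All names and values are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class Batch:
    batch_no: str
    drug: str
    expires: date

@dataclass
class Prescription:
    patient_id: str
    drug: str

def administer(prescription, batch, patient_id, stock, today=date(2019, 7, 1)):
    """Bedside check: right patient, right medicine, batch in date;
    then decrement inventory and return an auditable record."""
    assert patient_id == prescription.patient_id, "wrong patient"
    assert batch.drug == prescription.drug, "wrong medicine"
    assert batch.expires > today, "batch expired"
    stock[batch.batch_no] -= 1          # links use back to procurement
    return {"patient": patient_id, "drug": batch.drug,
            "batch": batch.batch_no, "date": today.isoformat()}

stock = {"B123": 10}
rx = Prescription(patient_id="P001", drug="amoxicillin")
record = administer(rx, Batch("B123", "amoxicillin", date(2020, 1, 1)),
                    "P001", stock)
# The record links patient, medicine and batch, so a later adverse
# reaction can be traced back through batch control.
```

The design point is that one administration event feeds several downstream needs at once: inventory, budgeting, audit and, via the batch number, traceability of the medicine itself.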

I am starting with the relationship between the patient and the hospital (mindful that what we mean by hospital will perhaps evolve over the next decade for other reasons), a relationship built on trust, and on service delivery, communication, treatment and information. Illustratively, a wireless world of healthcare is possible which respects this.

Autonomous agents and the next stage of evolution of the Internet of Things

“Cognology” is a term I coined to describe the evolution toward technologies with embedded intelligence. So what can the internet of things be in this context? I have adopted the operational definition of how the internet of things should work in healthcare from Kosmatos et al 2011:

“… a loosely coupled, decentralized system of smart objects—that is, autonomous physical/digital objects augmented with sensing, processing, acting and network capabilities.”

The implication of operationalising devices within a cognology fitting this definition is to alter our current notion of the internet of things from a cognitive perspective. That is to say, the ‘thing-ness’ of devices, which we perceive to be the interesting development, evolves as autonomous agents give functional purpose to these things. This in effect means moving from a view of the internet of things as bundles of technological capabilities to a view of it as a ‘distributed cognitive system’ [Tremblay 2005], defined by its ability to evolve and transform itself in response to changing circumstances, rather than as a strict functional hierarchy.
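A smart object in the sense of the quoted definition, sensing, processing, acting and networking, can be sketched as follows; the glucose example, the threshold and the class design are assumptions for illustration only:

```python
# Sketch of a smart object per the quoted definition: an autonomous
# object augmented with sensing, processing, acting and network
# capabilities. The glucose example and threshold are illustrative.

class SmartObject:
    def __init__(self, name):
        self.name = name
        self.outbox = []          # stand-in for a network capability

    def sense(self):
        raise NotImplementedError

    def process(self, reading):
        raise NotImplementedError

    def act(self, decision):
        if decision:
            self.outbox.append(decision)   # notify an agent or carer

class GlucoseMonitor(SmartObject):
    def __init__(self, readings):
        super().__init__("glucose monitor")
        self.readings = list(readings)

    def sense(self):
        return self.readings.pop(0)

    def process(self, reading):
        # An autonomous agent could repurpose this same logic for the
        # home as easily as the ward: the 'thing' stays, the purpose shifts.
        return f"alert: glucose {reading}" if reading > 11.0 else None

monitor = GlucoseMonitor([5.6, 12.4])
for _ in range(2):
    monitor.act(monitor.process(monitor.sense()))
print(monitor.outbox)   # the object decides locally when to signal
```

The loose coupling is the point: each object decides locally, and the ‘system’ emerges from many such objects exchanging signals, rather than from a central controller.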

Converting the internet of hospital things into the internet of self-care (or what might be thought of as ‘my things’) through autonomous agents bridges the gap between the hospital setting and the personal context (home, school, work, play), in effect by having the agents ‘repurpose’ the device.
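The Kosmatos et al definition and the idea of repurposing can be sketched together: a smart object with sensing and acting capabilities, and an autonomous agent that gives it purpose and can change that purpose as the device moves from ward to home. All names, readings and modes below are illustrative assumptions:

```python
# A minimal sketch: a smart object (sensing + acting), and an autonomous
# agent that 'repurposes' it from a hospital role to a self-care role.
# Everything here is invented for illustration.

class SmartObject:
    def __init__(self, name: str):
        self.name = name
        self.purpose = "hospital-monitoring"  # default clinical purpose

    def sense(self) -> dict:
        # Stand-in for a real sensor reading
        return {"heart_rate": 72}

    def act(self, message: str) -> str:
        return f"{self.name}: {message}"

class AutonomousAgent:
    """Gives functional purpose to the 'thing', and can change it."""

    def repurpose(self, obj: SmartObject, context: str) -> None:
        # e.g. the device leaves the ward and follows the patient home
        obj.purpose = f"self-care ({context})"

    def interpret(self, obj: SmartObject) -> str:
        reading = obj.sense()
        if obj.purpose.startswith("self-care"):
            # personal context: speak to the patient directly
            return obj.act(f"your heart rate is {reading['heart_rate']} bpm")
        # clinical context: report into the ward's processes
        return obj.act(f"patient HR {reading['heart_rate']} logged to ward station")
```

The same physical object behaves differently depending on the purpose the agent has given it, which is the sense in which the ‘thing-ness’ of the device becomes secondary.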

In a wireless world, the individual is the focus of the cognological capabilities provided by smart device technologies. This achieves the additional benefit of shifting the focus away from technologies that can deliver this or that service, to the use of the information and its manipulation to achieve various goals.

I also think it is important to adopt Simon’s technological agnosticism, to ensure we are focused on results, rather than ‘things’ as such.

I think of this shift from technology to cognology as achieved in part through advances such as the potential of the internet of things, with the embedding of functional intelligence in devices transforming them from physical things into cognitive things.

In this respect, the internet of things is a misnomer.

The internet of hospital things

Healthcare technologies should have certain degrees of freedom:

of geography: in terms of home, hospital/clinic, ambulance, workplace, etc. to support location-independent care;

of intelligence: embedded ‘intelligence’ of one sort or another providing a constellation of capabilities, but perhaps most importantly, a predictive and anticipatory capability;

of engagement: seeking out and exchanging at various levels and in various forms with people (doctors, nurses, patients, carers, etc.), with processes (admission, discharge, alerting, quality monitoring, etc.) and with other objects (blood gas monitor, diabetic monitor, cardiac monitor).
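The three degrees of freedom above can be read as a capability checklist for any candidate device. A minimal sketch, with invented field names and no claim to being a standard:

```python
from dataclasses import dataclass, field

# Illustrative only: the three 'degrees of freedom' (geography,
# intelligence, engagement) expressed as a simple capability check.

@dataclass
class HealthDevice:
    name: str
    locations: set = field(default_factory=set)     # geography: where it can operate
    predictive: bool = False                        # intelligence: anticipatory capability
    engages_with: set = field(default_factory=set)  # engagement: people/processes/objects

    def degrees_of_freedom(self) -> dict:
        return {
            "geography": len(self.locations) > 1,   # location-independent care?
            "intelligence": self.predictive,
            "engagement": {"people", "processes", "objects"} <= self.engages_with,
        }
```

A device confined to one location, or one that cannot engage with processes as well as people, would fail the corresponding check.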

I see the Internet of Things as a different approach, which, when coupled to the use of autonomous agents, offers substantial opportunities to recast clinical processes, making the patient central to healthcare. This consumerist approach will render many e-health initiatives dated, for example, as well as the current approach to the use of EHRs.

References and Want to Know More?

Autonomous Agents and Multi-Agent Systems for Healthcare, Open Clinical, www.openclinical.org/agents.html#properties

Kosmatos EA, Tselikas ND, Boucouvalas AC, Integrating RFIDs and Smart Objects into a Unified Internet of Things Architecture, Advances in Internet of Things, 2011, 1, 5-12, doi: 10.4236/ait.2011.11002

Lehoux P, The Problem of Health Technology, Routledge, 2006.

Simon LD, NetPolicy.com: public agenda for a digital world, Woodrow Wilson Centre, 2000.

Storni C, Report on the “Reassembling Health Workshop: exploring the role of the internet of things”, Journal of Participatory Medicine 2 (2010), www.jopm.org/media-watch/conferences/2010/09/29/report-on-the-reassembling-health-workshop-exploring-the-role-of-the-internet-of-things/

Tremblay M, Cognology in Healthcare: Future Shape of Health Care and Society, Human and Organisational Futures, London, 2005

Tremblay M, The Citizen is the Real Minister of Health: the patient as the most disruptive force in healthcare, Nortelemed Conference, Tromso, Norway, 2002.

Wireless World Research Forum (2001) Book of Visions 2001.

Want to know more? There are some diagrams I excluded that showed a schematic of the system at work.

9 Tribes of the Internet and their health interests

Discussions on health literacy are increasing as healthcare providers, clinicians, payers and patients consider what this means for healthcare. Having been involved in launching the world’s first digital interactive health channel in the UK in 2000, one thing I learned is not to assume that everyone is alike or has common interests.

Healthcare systems are poor at doing what retailers take for granted, namely the segmentation of their users. When we did the health channel, we worked with a simple framework drawing on work by the California HealthCare Foundation, in their report “Health E-People”. This gave us a workable model of the different types of users and their different needs, and reminded us that in developing content and services for them through the Channel, we needed to be mindful of those differences. More recent work by the Pew Internet Project has identified the “9 Tribes of the Internet”, revealing how different people interact with and use technology. Of course, segmentation can be quite elaborate, but at this stage we need a scaffold to guide our further understanding.

The main assumption we need to make about technology concerns how people will use it, and thereby how this informs the adoption/diffusion process. Health and social care are traditionally “high touch” activities, given the way that knowledge has been organised, who knows it and how it is used. This, however, is being challenged by technologies that embody what has traditionally been found in the brains of specialist clinicians — what I call ‘cognologies’.

Increasingly we are seeing technological innovations that can embody both that knowledge (in decision algorithms, for instance) and that skill (in robotic devices and vision systems, for instance). Will people accept a shift toward high-technology care at the expense of healthcare’s traditional focus on care by humans? Is that an aesthetic preference (we like it), or might people come to prefer “lower touch” technologically enabled services if they are reliable and on-demand?
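A decision algorithm of the kind that can embody such clinical knowledge can be as simple as a rule applied to a reading. The thresholds and wording below are invented for illustration, not clinical guidance:

```python
# Illustrative decision rule only: the thresholds are made up to show
# the shape of knowledge-in-software, and are NOT clinical advice.

def advise(heart_rate: int, resting: bool = True) -> str:
    """Return a plain-language suggestion for a heart-rate reading."""
    if resting and heart_rate > 100:
        return "elevated: suggest contacting a clinician"
    if heart_rate < 40:
        return "low: suggest contacting a clinician"
    return "within expected range"
```

The point is not the rule itself but that, once encoded, this knowledge can sit in a consumer device rather than solely in a clinician’s head.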

As we think about this, I suggest the following as some thoughts for policy makers and care providers:

  1. Eventually, the individual will have to own, in some form, their own health record if much of the desired change in patient behaviours is to be realised. This will lead to patients having a new understanding of information about themselves, and as such this information will need to be clear without mediation or interpretation by others. Patients will, therefore, become involved in decisions about what to do with their information, and with whom it is shared and used; for instance, use in databases, whether held by commercial or public organisations, which will be accountable to the patient for the use of that information. The patient, as what I call the ‘auditor of one’, will come to take a keener interest in the accuracy of the health record and be less tolerant of mistakes or inaccuracies, as is the case in other areas (e.g. banking, credit scoring).
  2. Not everyone will be digitally enabled in the way technology pundits fantasise about. This is not a digital divide and is not evidence of social exclusion, but a personal choice of people to lead their lives as they wish in a pluralistic society; it may be that we all end up as digital natives over time, but some will still be hold-outs, or ‘islands in the net’. The key implication here is that service providers will need to move, in some cases very slowly, to adopt technologies with some types of people. In time, perhaps people may adopt low-level access and interactivity, but for some people technological interactivity will remain at best an option, not a preference, within an evolving technological ecosystem. It remains to be seen whether this will continue to be the case; evidence from other technologies suggests not — that in time, technologies are broadly universally accepted, but not necessarily used in the same ways by everyone.
  3. The assessment of benefits of technologies in the traditional health technology assessment [HTA] model will need to pay much greater attention to the segment of the population likely to be involved and the social context of that group, taking account of distinct patterns of use and preferences. This challenges the current paradigm used within HTA communities. The conclusion is that one-size-fits-all HTA assessment will increasingly prove inadequate. This means that designing and implementing technologies will need to be far more flexible when it comes to the structure of service delivery, as the adoption/diffusion process itself will come to determine the socio-economic benefits. Consider that few today would subject the telephone to an impact assessment; it is now part of our expectations, and we should not be surprised if the same thing happens to evolving technologies in healthcare focused on use by consumers and patients.
  4. The tribes model suggests that not everyone will necessarily buy into the technology revolution. For many people, they work in care precisely because they want to have personal contact with people, and not through intermediating technologies. Since many patients also would have that preference, organisations may need to structure services and staffing to ensure the right mix of people to service the right publics. This will challenge approaches to the organisational design of service providers, in the main suggesting more pluralism in variety, scale and function.
  5. Patient compliance, concordance and adherence may become more dependent on the features of the technologies, their design and ease of use, than on the willingness of the patient to follow a particular care regime. Patients are deliberately non-adherent for many good reasons (some of which reflect fundamental flaws in the medicine itself, its delivery system, or side-effects). Accidental non-adherence is another matter, obviously. Helping people understand their limitations in using and working with technologies, as a matter of personal preference, will become very important, which increases the focus on personalisation.

It is common for health and social care systems, especially where the state is the main source of funding, to tend toward omnibus systems of service delivery, which have difficulty dealing with individual service preferences. Whether or not it is fully appreciated, such systems favour professional and provider interests and depend on proxy interpretations of patient preference. It would be a mistake to assume a similar approach with technologies. Instead, we should be encouraging approaches that are sensitive to the preferences and usage patterns of individuals. In this way, too, we may actually see services being offered that people will value and use.

The 9 Tribes in Health

Background

The Pew Internet Project identified the “9 Tribes of the Internet” in a report in 2009 [http://www.pewinternet.org/2009/06/10/the-nine-tribes-of-the-internet/], to ascertain how different people interact with and use technology. The California HealthCare Foundation, in its “Health E-People” report, identified three broadly defined populations: the well with an interest in health, the newly diagnosed, and those with long-term or chronic health conditions.

The Pew research was instructive in thinking about how people might deal with a more technologically enabled health and social care system. I have sketched out some relationships in the table which gives an overview of the sort of considerations that are likely relevant and important.

NOTE: This was first written in 2010, and updated in 2019.