Quote: Universities and business models

There are three generic types of business models: solution shops, value-adding process businesses, and facilitated user networks. Each of these is comprised of its own value proposition, resources, processes, and profit formula. Universities have become conflations of all three types of business models. This has resulted in extraordinarily complex—some might say confused—institutions where much of the cost is tied up in coordinative overhead rather than in research and teaching. A key reason why the for-profit universities and other universities such as Western Governors have been gaining such traction in today’s higher education market is that they don’t conflate the three types of business models.

[…]

Universities emerged in the 17th and 18th centuries primarily as teaching institutions, but most gradually evolved to become expensive conflations of all three types of models with three value propositions: research, organized as a solution-shop model; teaching, which is a value-adding process activity; and facilitated networks, within which students work to help each other succeed and have fun. A typical state university today is the equivalent of having merged major consulting firm McKinsey with Whirlpool’s manufacturing operations and Northwestern Mutual Life Insurance Company. They have three fundamentally different and incompatible business models all housed within the same organization.

Christensen, C. M., Horn, M. B., & Caldera, L. (2011). Disrupting College: How Disruptive Innovation Can Deliver Quality and Affordability to Postsecondary Education, pp. 33–35.


Higher education research, networks, and the new metrics

Reading about “The benefits of the research blog” at the Kpop Kollective brought to mind an interesting corollary of having a research blog. William Gunn, Mendeley’s head of academic outreach, noted recently in a Research Trends virtual seminar (available here, along with the other seminars of the day) that, when it comes to making your work and publications available for download on repositories and the like, “Readership patterns correlate with eventual citation patterns” – taking readership patterns here to mean actual downloads rather than just clicks. We assume that when somebody downloads the PDF (or recording, or whatever else) of your work, they intend to read it; the page carrying the download link usually offers only the abstract, which is of limited use in deciding “I am/am not interested in this work”. This allows you, as a researcher, to gauge in advance of formal publication how your work is likely to be received. Thinking about this leads to some interesting questions.
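To make that claim a little more concrete: if downloads really do foreshadow citations, the relationship should show up as a rank correlation between the two counts. A minimal sketch of that check in Python, using entirely invented download and citation figures (none of this is Mendeley’s data or API, and scipy is assumed to be available):

```python
# Hypothetical illustration: do early download counts track eventual citations?
# All the numbers below are invented for the sake of the example.

from scipy.stats import spearmanr

papers = {
    "paper_a": {"downloads_year1": 640, "citations_year3": 25},
    "paper_b": {"downloads_year1": 120, "citations_year3": 9},
    "paper_c": {"downloads_year1": 310, "citations_year3": 6},
    "paper_d": {"downloads_year1": 55,  "citations_year3": 1},
    "paper_e": {"downloads_year1": 980, "citations_year3": 40},
}

downloads = [p["downloads_year1"] for p in papers.values()]
citations = [p["citations_year3"] for p in papers.values()]

# Spearman's rho compares rankings, which suits skewed count data
# better than a straight Pearson correlation.
rho, p_value = spearmanr(downloads, citations)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

With figures like these, Spearman’s rho comes out around 0.9 – the rankings largely agree, which is all the “correlation” claim amounts to.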

Fig. 1: Traditional research output model

The older model of how we conceive publication is shown in Fig. 1: a fairly direct vector running from research to output to the reception of that output. This reception, together with related factors such as citation counts and other forms of bibliometric analysis operating at the meso- and macro-level, is what influences a researcher’s continued work. Do I continue to focus on the same things? Is there an interested audience for this type of work, and am I connecting with them? This is one of the points often glossed over in discussions of bibliometrics. There is something of a binary attitude towards such forms of quantitative analysis, whereby those in science, technology, engineering, and mathematics (STEM) are held to be well served by bibliometrics, whereas those in the arts, humanities, and social sciences (AHSS) are not. In my work in this area, with both STEM and AHSS researchers, I take a more tempered view. My philosophical background in hermeneutics and the philosophies of information and knowledge (Gadamer, Ricoeur, et al.) leads me to see bibliometrics as a tool for trying to put a researcher’s work into context; that, after all, is what hermeneutics is all about. The difficulty is that this tool is extremely blunt, and that the results are far too slow in coming. Calling these “metrics” is misleading, as it claims a precision and an agreed-upon set of rules of measurement where at best we have indicators – and beyond that not much aside from a lot of disagreement!


Infrastructure and Superstructure


Again and again, however, such confusion causes people who should know better to decide that, because they have located some pervasive superstructural pattern (a prevalence of petty street crime in neighborhood X, say), superstructure here is actually producing all the visible infrastructural changes (“There was an influx of Puerto Ricans in neighborhood X, and a subsequent rise in drugs and petty street crimes; because of this, eventually the neighborhood was driven down till it became an all but abandoned slum where nobody, not even the Puerto Ricans, would live anymore . . .”), when, at the infrastructural level, what has actually happened is that landlords-as-a-class have realized that the older buildings in neighborhood X require more maintenance and thus a greater expenditure, so that they concentrate all their economic interest on newer properties with larger living units in neighborhood Y to the east, which is popular with young white upwardly mobile executives. The result is the decline of neighborhood X, of which street crime, drugs, and so on are only a symptom—though, as superstructural elements, those symptoms stabilize (i.e., help to assure) that decline and combat any small local attempts to reverse it by less than a major infrastructural change.

Samuel R. Delany, Times Square Red, Times Square Blue, p. 163.

Hierarchy and network


“If this book displays a clear bias against large, centralized hierarchies, it is only because the last three hundred years have witnessed an excessive accumulation of stratified systems at the expense of meshworks. The degree of homogeneity in the world has greatly increased, while heterogeneity has come to be seen as almost pathological, or at least as a problem that must be eliminated. Under the circumstances, a call for a more decentralized way of organizing human societies seems to recommend itself.

However, it is crucial to avoid the facile conclusion that meshworks are intrinsically better than hierarchies (in some transcendental sense). It is true that some of the characteristics of meshworks (particularly their resilience and adaptability) make them desirable, but that is equally true of certain characteristics of hierarchies (for example, their goal-directedness). Therefore, it is crucial to avoid the temptation of cooking up a narrative of human history in which meshworks appear as heroes and hierarchies as villains.”

Manuel De Landa, A Thousand Years of Nonlinear History, p. 69


Outsourcing the self

If social networking is creating a new ecology of communication, how is it doing so? It is effectively a way of outsourcing our connections with other people, whereby we no longer need to worry about doing anything so vulgar as actually having to remember people’s birthdays, or ages, or where we met them, or, in certain cases, what their names are. The upside should be that by outsourcing our memory in this manner we have more time free to do more things with these people. We don’t need to write a letter, or even an email. Perhaps we think we can just send a message, and all is well. The problem is that this theoretical free time is a black hole. Most of us have had the experience of trawling Facebook for what we think is just a few minutes, then *snap* we’re back in the room and two hours have passed. How is this? Well, as with all technologies, there are benefits and there are drawbacks. This form of communication is low-grade and labour-intensive. We may get more connected, but more active effort is needed to maintain these connections.

Network theory in the context of neuronal activity gives us what is termed “Hebb’s rule”, sometimes encapsulated in the phrase “cells that fire together, wire together”. According to this, connections between different areas of the brain grow stronger the more those connections are activated. Analogous to this is the image of all the possible routes across uneven terrain. If many people over a long period need to get from point A to point B, then over time efficiencies will present themselves. People will begin to avoid a certain rock, or a marshy area. They may note that an oblique approach to a hill is longer as the crow flies, but takes less effort. The result is an emergent solution: emergent because it is the product of multiple actors over a long period of time, with no over-arching co-ordinating strategy. The path is, in the barest sense, an example of Hebb’s rule applied as a general principle. What does this have to do with social networks, you might well ask. The point of this example is that it is emergent, it is contingent, it is effectively passive. It creates something that looks as though it must have been goal-directed (this appearance is the cause of much confusion in discussing emergent and self-organizing phenomena), but it was anything but. In terms of the network of friends on something like Facebook, the stronger connections we have with people in Real Life are the result of actual contact – or of contact via means of communication which we somehow regard as less ephemeral, such as the telephone.
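A minimal sketch of the rule itself, with a made-up learning rate and toy firing patterns (nothing here comes from a neuroscience library; it is just the bare update “the weight grows whenever both ends are active together”):

```python
# Toy illustration of Hebbian learning: "cells that fire together, wire together".
# Activities are binary here (1 = firing, 0 = silent); the weight between two
# cells is nudged upward every time both fire in the same step.

learning_rate = 0.1  # invented value, purely for illustration

# Invented co-activity records for two pairs of cells over ten time steps.
pair_often_together = [(1, 1), (1, 1), (0, 0), (1, 1), (1, 0),
                       (1, 1), (0, 1), (1, 1), (1, 1), (1, 1)]
pair_rarely_together = [(1, 0), (0, 1), (0, 0), (1, 0), (0, 0),
                        (0, 1), (1, 1), (0, 0), (1, 0), (0, 1)]

def hebbian_weight(coactivity, eta=learning_rate):
    """Accumulate a weight via the plain Hebbian update dw = eta * x * y."""
    w = 0.0
    for x, y in coactivity:
        w += eta * x * y
    return w

print("often together: ", hebbian_weight(pair_often_together))   # stronger connection
print("rarely together:", hebbian_weight(pair_rarely_together))  # weaker connection
```

Read in reverse, the same bookkeeping is the overgrown-path point made below: pairs that stop firing together simply stop accumulating weight.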

Adam Ferguson, a philosopher of the Scottish Enlightenment, described emergent phenomena in society as the “result of human action, but not the execution of any human design”. The problem with social networks is that the owner of the network takes away the accidental aspect of human action which might allow us to strengthen some connections more than others; instead the task is taken over by an algorithm, which edits our experience. I can see how a radical critique of this situation becomes possible: because an element of our autonomy has been eroded in this manner, we have not only outsourced memory – we have outsourced our will, our volition, our thought. We are zombies, in all senses. We are co-managing the shitpile. Every utterance we make is spam. But I am enjoying myself too much, and so have run away with the argument. The ideology surrounding social networking does not allow us to consider it dispassionately. It is another mode of communication. It is not evil – this argument is as old as Plato, and continues up to Morozov – but nor is it the saviour of humanity. We must be pragmatic, but we must commit to a radical pragmatism.

Hello, can I help me?

We must attempt to analyse our behaviour down to the very root, and attempt to foresee just what the unknown results of our actions may be. Every new technology presents new possibilities, and these are amoral. They are without intentionality. They know nothing of ethics. Technology allegedly frees us from every manner of banality, with the promise of allowing us to do what we like. The practice is rather different. With each new instance of cognitive outsourcing we no longer think as we once did. What once were basic skills and abilities become degraded through lack of use. Hebb’s rule applies in the negative just as it does in the positive: what once were well-worn synaptic paths become overgrown through laziness and inactivity. This has more than personal implications. Algorithmic editing seems to make life easier for us, by giving that sickly sheen of “tailor-made” to our interpersonal relations, but it can deliver us over to a philosophy of futility – we become ever more passive, given over to ephemera. We have no thoughts, we have a ‘like’ button. We have no awareness, we have updates. Our being is voided and wiped clean, only to be time-stamped, location-stamped, and finally branded.

Is open source inevitable? II

[If open source is to have its day, some implications must be examined]

Technologies of privacy:

  • Old style: passive, reactive, the default position.
  • New versions?: Opt-in, active privacy; specifically designed ways of deciding what we share, and with whom.

Transition away from previous economic models. Manufacturing and the mass ownership of capital have been on the wane for generations. Consider the MIT model of spinning off industries. According to the study “Entrepreneurial Impact: The Role of MIT”, companies founded by MIT alumni would, taken together, form the 11th largest economy in the world. Technologies must be proven, thus they must be peer-reviewed as well as tested by the market. If everybody can use the same ideas (goodbye proprietary anything), if open source and the intellectual commons get their day, then the matter of ‘economic viability’ is set aside in favour of ‘technical viability’ and ‘environmental viability’. The new models will have to incorporate the recognition that there are diseconomies of scale, and that what we called economies of scale were all too often a fetishization of size. This is a realization from the realm of network theory, which brings the long tail to bear on our everyday lives. The long tail is not merely a niche element – it is not just a tail, as it were. (E.F. Schumacher had an intimation of this in his collection of essays, Small is Beautiful.)
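One way to see why the long tail is not just a tail: under a Zipf-style (power-law) popularity curve, the many small items can jointly rival the few blockbusters at the head. A rough sketch, with both the distribution and the cut-off between ‘head’ and ‘tail’ invented for illustration:

```python
# Toy illustration of the long tail: under a Zipf-like (power-law) distribution,
# the many small items in the tail can jointly rival the few hits at the head.

n_items = 10_000
popularity = [1.0 / rank for rank in range(1, n_items + 1)]  # Zipf with exponent 1

head = sum(popularity[:100])   # the 100 most popular items
tail = sum(popularity[100:])   # everything else

total = head + tail
print(f"head share: {head / total:.1%}")
print(f"tail share: {tail / total:.1%}")
```

With these made-up numbers the top 100 items take roughly half of the total and the other 9,900 take the rest – the tail is not a rounding error.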

How do we then differentiate between old and new? It becomes a matter of service. Will digital mean that we are all eventually part of the service economy? We may be able to set our businesses apart according to how we deal with our customers. It may be a matter of approach rather than cost. A fully open source world, with respect for the intellectual commons, is utopian. Too much seems to stand in its way, but elements of it can be used to consider alternatives to developing nations repeating the mistakes that the industrial and post-industrial nations have made. Consider the principle of the long tail applied to national economies. Of course there will be factors that lead a country to be wealthy by virtue of some natural resource, as long as unsustainable practices are maintained, but an open attitude towards information will in theory allow innovation to take place anywhere. We see this in the emergence of ‘regional hubs’ and ‘centres of excellence’, but the best example yet, in terms of something that will actually last (unlike Dubai), is Singapore. CNN has fifty reasons to account for this (about 20 are compelling, but that’s enough for me). For long enough people have considered the ‘global’ part of McLuhan’s “Global Village”. We need now to give greater attention to the ‘village’ part. That is the locus of differentiation, and of what we can manage, if our actions are to be environmentally and socially sound.

Ideas and criticism

If an idea is presented, and elements of it are questioned in a manner that parries but does not thrust, if the criticism is purely surface, does it in fact have any merit as criticism? A criticism ideally has a purpose, which is basically to be a form of troubleshooting. It must be specific, and it must engage with the topic. To do otherwise is equivalent to saying “well, I haven’t used the software you developed, but it’s probably rubbish” (we might call this The Troll’s Refrain). Stand back and ask how this helps us to actually improve the idea. Even thinking ‘us’ is useful, because if there’s a conversation, then there is a joint effort. David McCandless (author of the excellent Information is Beautiful) has a useful breakdown of how we think about the world, in the following pyramid:
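Assuming the pyramid runs through the familiar levels – roughly data, information, knowledge, and wisdom, with noise below the lot – a minimal sketch of what moving up and back down it amounts to:

```python
# Rough stand-in for McCandless' pyramid (the levels are my assumption), with
# "noise" tacked on below it. Ideas and engaged criticism push us up a level;
# surface-level non-criticism drags us back down.

LEVELS = ["noise", "data", "information", "knowledge", "wisdom"]

def move(level, steps):
    """Move up (positive steps) or down (negative steps) the ladder, clamped at both ends."""
    index = max(0, min(LEVELS.index(level) + steps, len(LEVELS) - 1))
    return LEVELS[index]

print(move("information", +1))  # "knowledge": what a good idea, or engaged criticism, does
print(move("knowledge", -2))    # "data": where a parry-without-thrust leaves us
```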

In this, there is a progressive hierarchy of whittling out what is irrelevant, the old effort of separating the wheat from the chaff. This means that the bottom is a field somewhere, and the top is presumably a Weetabix. Ideas are about turning information into knowledge; they move us from one level to the next. The non-criticism I am describing above seeks to move us back down, from knowledge to information, or to data. Instead of information, it focuses on noise.

There is a temptation in all debate to ask for greater and greater precision, but at the same time you have to step back and ask what level of perfection you want. If ideas are to be abstract, then they cannot of course account for everything. What we are searching for is a degree of functional ‘robustness’. This is a term Kevin Kelly returns to continually in his writing (I am mainly thinking of 1995’s Out of Control, which has dated far less than I would have expected), denoting the capacity of enduring networks of communication to cope with unpredictability. It is a nice antidote to management horseshit about “best practice”, because it builds into its very structure the assumption that we start from least worst and go on from there. We trade off a small degree of efficiency for a considerably greater degree of structural toughness. There will always be noise, but not every blip is a threat to the entire system. Ideas are supposed to be solutions, and they are not to be rejected on aesthetic grounds. If it works, it works, and from there we can begin the work of making our solution elegant.