Bacon, Vico, and the “Long Tail”

[Image Creative Commons licensed / Flickr user spratmackrel]

In his essay “The Long Tail,” Chris Anderson observes that our ability to overcome the ‘tyranny of physical space’ through a combination of online databases and streaming services has fundamentally altered business models and, as a consequence, has radically increased our access to information. In the past, limited by the physical constraints of location and shelf space, retailers were forced to carry a small selection of material that appealed to the greatest proportion of a local market. In the book industry, these limitations meant that books rapidly went out of print and became very difficult to come by after a relatively short period of time. In contrast, the ability of companies like Amazon to replace small storefronts with massive warehouses, and to leverage the internet to reach global markets, has allowed them to carry obscure products and generate significant revenue from them in a way that smaller markets never could. This model becomes even more profitable when the products themselves are digital (as in the case of music, movies, books, software, etc.), since a single stored copy can be infinitely licensed and distributed, which is to say, sold. The first of three ‘rules’ that Anderson offers businesses in this new digital economy is “Make Everything Available.” Since there is a market for everything, and since the cost of storage is so incredibly low, there is profit to be made even from the most obscure (and awful) material: “In the Long Tail economy, it’s more expensive to evaluate than to release. Just do it!”
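
To see why the tail adds up, here is a toy calculation of my own (the catalogue size and the 1/rank demand curve are illustrative assumptions, not Anderson’s figures):

```python
# Toy long-tail arithmetic: with Zipf-like demand (sales ~ 1/rank),
# the many obscure titles together rival the bestsellers.
N = 100_000                    # titles in a hypothetical online catalogue
sales = [1.0 / rank for rank in range(1, N + 1)]

head = sum(sales[:1000])       # roughly what a physical store might stock
tail = sum(sales[1000:])       # everything else: "Make Everything Available"

print(f"head share: {head / (head + tail):.0%}")  # ~62%
print(f"tail share: {tail / (head + tail):.0%}")  # ~38%
```

Under these assumptions, the 99,000 titles no physical store would carry still account for more than a third of total demand, which is the whole economic argument in miniature.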

We do indeed seem to be moving more and more from an economy of scarcity to one of abundance. From a scholarly perspective, this means the development of a rich, extensive, inexpensive (Anderson’s second rule is “cut the price in half, then lower it”), and easily accessible archive of material. As Anderson observes, “it is a fair bet that children today will grow up never knowing the meaning of ‘out of print.’” On the other hand, however, I wonder what effect an economically driven abundance (concerned with quantity over quality) will have on our ideas about the value of tradition.

In the early days of the Enlightenment, there was some discussion about the scarcity of available information relative to the total amount of material that had presumably been produced. On this question, there seem to have been two primary perspectives. From Francis Bacon we learn that it was common for scholars at that time to believe that the works that had survived had done so by virtue of their importance and, consequently, represented the best that the history of ideas had to offer. In contrast to this dominant position, Bacon argued that it was not in fact the best that had endured, but rather the most trivial.

Another error, that hath also some affinity with the former, is a conceit that of former opinions or sects after variety and examination the best hath still prevailed and suppressed the rest; so as if a man should begin the labour of a new search, he were but like to light upon somewhat formerly rejected, and by rejection brought into oblivion; as if the multitude, or the wisest for the multitude’s sake, were not ready to give passage rather to that which is popular and superficial than to that which is substantial and profound: for the truth is, that time seemeth to be of the nature of a river or stream, which carrieth down to us that which is light and blown up, and sinketh and drowneth that which is weighty and solid. (Bacon, The Advancement of Learning)

In essence, Bacon advocates an approach that would abandon tradition entirely, and systematically create new repositories of knowledge built upon firm foundations. Since the best has been lost, and what remains has little value, Bacon leaves us with little choice in the matter. Responding to Bacon, who is notorious for his rejection of the value of tradition, Giambattista Vico supports the former view, the view that Bacon insists is in error:

There is, therefore, more wit than truth in Bacon’s statement that in the tidal wave of the barbarians’ invasions, the major writers sank to the bottom, while the lighter ones floated on the surface. In each branch of learning, instead, it is only the most outstanding authors who have reached us, by virtue of being copied by hand. If one or another was lost, it was purely by chance. (Vico 1990, 73)

For Vico, Bacon makes a mistake in accounting for scarcity by emphasizing what falls away. For Bacon, it seems, knowledge persists unless something happens to it, and it just so happens that the finest knowledge is the first to be lost. In contrast, Vico argues that the opposite is the case: that knowledge naturally decays over time, and that its endurance is only made possible through the active (providential) intervention of scribes motivated by an interest in preserving the finest and best. Both authors are resigned to the fact that what’s lost is lost. When it comes to accounting for the scarce intellectual resources that have persisted through time, however, Bacon views this scarcity as evidence of inferiority, and Vico of eminence.

Under conditions of scarcity, knowledge has value, whether high (in the case of Vico) or low (in the case of Bacon), and this value demands a response. If received knowledge is of high value, then it ought to be preserved; if not, then it ought to be jettisoned. But what kind of value does knowledge have within a ‘long-tail’ information economy characterized by abundance? (This would, of course, be an obvious place to bring in Walter Benjamin and his comments on the work of art in the age of mechanical reproduction, and for that reason I will resist the temptation.)

A central part of digital literacy (something that we, in higher education, are increasingly encouraged to incorporate into our learning outcomes, at least latently) is the ability to evaluate and judge the quality of sources found online. With so much information at our fingertips, and the flattening of value that comes as a result of an approach to content delivery that would release rather than evaluate, are we entering a period in which, with Vico, we appreciate tradition more and more by virtue of the fact that we have more and more of it? Or is tradition quickly being stripped of its value as a consequence of the fact that all knowledge is lumped together as equally valuable within the marketplace of ideas? In other words, does our increased access to the past (and other marginal material) give it more importance, more of a voice, in the present? Or does this abundance justify its dismissal (à la Bacon) in the face of a present and future that really count?

Showing Appreciation

In conversations with several people over the last few days, the theme of appreciation has been among the most prominent. Questions like the following are difficult to answer, and in fact don’t seem to lend themselves easily to the application of general ‘rules of thumb’:

  • How do I recognize those vital individuals who work tirelessly behind the scenes, contributing to the success of an organization or event, but who are uncomfortable with public forms of acknowledgement (the exceptions to the ‘praise in public, criticize in private’ rule)?
  • How do I show appreciation for things my partner does that I could not begin to reciprocate in kind?
  • How do I motivate my staff to continue to perform despite limits placed upon my ability to promote or increase salaries?

[Image Creative Commons licensed / Flickr user C!…]

People don’t leave jobs because of money. They leave because of poor management, which is to say that they leave either because they are not fully utilized (a problem of unrecognized potential) or because their work is going unnoticed (a problem of unrecognized activity). Assuming that salaries are reasonable, the best way of ensuring employee retention is a positive work environment in which ability and activity are recognized and encouraged.

The same principle of appreciation applies in education. Tinto (1975) has convincingly argued that Social Integration (through informal peer group associations, semi-formal extracurricular activities, and interaction with faculty / personnel) is the most directly associated factor when predicting what he calls ‘persistence.’ In other words, a student who feels appreciated at their college, by their peers, teachers, and administrators, is more likely to persist despite other factors like poor grades or dissatisfaction with other more institutional aspects. On the other hand, an otherwise successful student (i.e. a student with top grades) who is not successfully integrated into a social group is likely to voluntarily withdraw.

Appreciation is important. In fact, I would argue (provisionally, of course) that financial rewards are meaningful only to the extent that they are (1) expressions of appreciation, and/or (2) perceived as a means through which to achieve additional appreciation. When viewed through the lens of appreciation, a powerful yet unquantifiable driving factor (a factor that is powerful because it is unquantifiable), open information phenomena like open access journals, open source software, and creative commons begin to make sense.

In his book, The Public Domain: Enclosing the Commons of the Mind (available to read free online), James Boyle discusses several paradoxical effects of copyright and patent law. In particular, he observes that, although patents and copyrights are ostensibly in place in order to encourage innovation, their lengthy and ever-increasing terms actually have the opposite effect (by blocking the use of a great deal of material for the sake of future creative efforts). The glorious thing about open access initiatives, creative commons, open source, etc., is that they demonstrate that the profit motive is not in actual fact necessary to encourage innovation. In fact, ‘distributed creativity’ models have proven so effective that they are being increasingly embraced by major technology companies.

What is remarkable is not merely that the software works technically, but that it is an example of widespread, continued, high-quality innovation. The really remarkable thing is that it works socially, as a continuing system, sustained by a network consisting both of volunteers and of individuals employed by companies such as IBM and Google whose software “output” is nevertheless released into the commons. (187)

What, then, is the motivation for innovation and creativity here? Of course, when looking to Google and IBM, the profit motive is anything but absent. From the perspective of the larger community of authors in open source programming communities, however, the primary motivations appear to be twofold. First, there seems to be an intrinsic motivation: there is something about programming that is game-like, that absorbs individuals in a task because of a pleasure derived from solving a problem for its own sake. There is something about the mere acquisition of mastery, apart from external rewards, that feels good in itself. Second, however, there is also an extrinsic motivation. But this external incentive is not for profit, but for recognition. Through creative commons licensing, authors give up potential financial rewards in exchange for an acknowledgment of their efforts that is transmitted through every iteration of the work’s use. It may seem tautological, but a creation that is used is a useful creation. In other words, the fact that a program or piece of code (or song, or blog post, or book, or…) is put to use is a sign that it is valuable and that, in turn, the author is valuable as well. Even if the author is not thanked directly, but merely acknowledged, the use of some piece of intellectual property is itself an implicit gesture of appreciation. What we have, then, is an alternative economic model, an economy of appreciation that seemingly has all the social benefits intended by copyright law (encouraging innovation), without the unfortunate intellectual hoarding and orphaned works that we see as a result of copyright law in practice.

How can we establish an open economy of appreciation in the classroom? Too often, instructors lean too heavily on grades (the classroom equivalent of the cash economy) in order to produce results. “Perform this exercise / show up for class / participate, or else I’ll dock marks.” The problem with grades, however, is that they have a tendency to produce compliance rather than creativity. The key, then, is to structure class time and assignments in such a way as to maximize intrinsic motivation while also cultivating an economy of appreciation, wherein students can freely encourage one another, recognizing each other’s contributions to a common project (or range of projects) in a way that can be praised and further built upon. Assignments like digital storytelling projects and blogs can be powerful means by which to encourage this type of environment. Developing these assignments with a view to cultivating appreciative environments, however, is hard work, just as the development of the infrastructure necessary to encourage open source code sharing and a creative commons took (and continues to take) a lot of hard work. As we are beginning to see, however, the potential benefits for community and innovation are tremendous.


References
Boyle, James. The Public Domain: Enclosing the Commons of the Mind. New Haven: Yale University Press, 2008.

Tinto, Vincent. “Dropout from Higher Education: A Theoretical Synthesis of Recent Research.” Review of Educational Research 45, no. 1 (1975): 89-125.

Revisioning Argument?: Notes on “Theory in the Machine”

In her recent talk at Georgia Institute of Technology (February 13, 2013), entitled “Theory in the Machine: Or a Feminist in the Software Lab,” Tara McPherson described how she came to the digital humanities, her work as a founding editor of Vectors, and her current involvement in the development of Scalar, “a semantic web authoring tool that brings a considered balance between standardization and structural flexibility to all kinds of material.” McPherson prefaced her presentation with the disclaimer that it would not be deeply theoretical. Nonetheless, the talk was informative, introducing a variety of exciting new digital approaches to scholarship, and providing a wide variety of jumping off points for further inquiry.

[Image Creative Commons licensed / Flickr user FilPhoto]

One aspect of McPherson’s talk that was particularly irksome, however, was the recurrence of variations on the theme of “revisiting scholarly argument.” From the perspective of screen theory, McPherson spoke about the possibility of “playing an argument like a video game,” or “watching an argument like a film.” She talked about “refracting arguments through multi-modal lenses,” and adopting a non-linear approach. What I would like to claim here is that McPherson’s use of the term ‘argument’ betrays a lack of clarity about what an argument is and, consequently, a failure to recognize that what is being proposed by many projects in the digital humanities is not a new approach to argument, but rather something else, a return to an old form of expression, namely the mythic.

Myth is primordial and originary. In the words of Claude Lévi-Strauss, “Myths get thought in man unbeknownst to him” (Lévi-Strauss 1978, 3). In contrast to philosophical thinking, which begins by assuming a difference between the knower and a thing to be known, Ernst Cassirer argues that myth is a function that makes abstract objective thought possible, but in a way that is in itself unmotivated either metaphysically – as if thought served to mirror some pre-existing reality – or psychologically – as a mirror of subjective psychic states or as a response to some set of pre-existing drives. In myth it is “Language itself [that] initiates such articulations and develops them in its own sphere” (Cassirer 1946, 12). The basis of mythological thought is metaphor, or the transmutation of one cognitive or emotional experience into a medium that is foreign to that experience (87). Mythical thought is not representational. It is a function by which relationships between experiences are spontaneously generated in such a way that allows those experiences to come into view. Mythological thought does not bear any relation to reality. It opens up reality, makes reality possible. It totalizes the world because it is the world.

Myth is not argument. Instead, as a mode of cognition and the distinguishing feature of Western philosophy, argumentation emerged and has persisted under a very particular (albeit long-lasting) set of conditions. First, as Marshall McLuhan observed, the written phonetic word is a crucial precondition for the emergence of philosophy. On the one hand, the phonetic alphabet served to sever the mythological identity of word and thing, thereby making it possible to map real relationships through conventional representation. On the other hand, the written word favors linear modes of deductive reasoning in a way that pictographs and strict orality do not, and that is actually alien to our lived experience of consciousness. Prior to widespread phonetic literacy, mythological thought, or what McLuhan also calls “tribal consciousness,” takes place as “an instant vision of a complex process” (McLuhan 1964, 38), the communication of a tangled web of emotions and feelings (59) using metaphors meant to produce an effect rather than convey a meaning (85). With the advent of linear-sequential thinking, however, it becomes possible to map the world and determine causal relationships that allow for the prediction and control of the natural world and the progressive rationalization of the social world through the establishment of stable social institutions. With the development of electronic and digital communications technologies in the twentieth century, however, McLuhan insists that our experience is being fundamentally reshaped once again, that the increasing instantaneity made possible by electronic communication marks a return to mythical experience, but in a way that is at odds with institutions that emerged as a result of, and are therefore strongly committed to, discursive thought: “In the mechanical age now receding, many actions could be taken without too much concern. Slow movement insured that the reactions were delayed for considerable periods of time. Today the action and the reaction occur almost at the same time. We actually live mythically and integrally, as it were, but we continue to think in the old, fragmented space and time patterns of the pre-electric age” (20).

Work in the digital humanities, like that of McPherson, is exciting in so far as it is perhaps helping us to reconcile our mythic lives to scholarly modes of thought. Put differently, revisioning standard forms of scholarly presentation might more accurately reflect the way we live the world. On the other hand, however, my fear is that the claim to ‘rethink argumentation’ may reveal a lack of reflection upon the modes of cognition and consciousness that the digital humanities claim to call into question. More importantly, misunderstanding the history and character of argumentation is perhaps a symptom of a lack of reflection about the modes of consciousness that some work in the digital humanities is promoting. Under the auspices of criticism, it is possible that these alternative modes of presentation may actually represent an uncritical embrace of our contemporary digital tribalism and, to that extent, function to promote and legitimate the status quo rather than call it into question.

From the perspective of teaching with technology, this can serve as a reminder of the fact that we, as teachers, are not merely shaping our students’ knowledge, but also the modes of cognition through which that knowledge is processed. If, as a consequence of their ubiquitous exposure to electronic and digital media, our students are increasingly coming to us with McLuhan’s ‘tribal consciousness,’ is it our task to embrace and cultivate a more mythological approach to sense-making? Or is it in fact the case that the formation of a critical consciousness is important now more than ever, and that we should be more conservative in our use of digital technology in the classroom? For all the criticisms of strictly empirico-deductive forms of reason (and there are many), what the philosophical / argumentative lens offers is the ability to put a distance between us and the technologies we use in order for us to ask exactly these kinds of critical questions.


References
McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: New American Library, 1964.

Lévi-Strauss, Claude. Myth and Meaning: Five Talks for Radio by Claude Lévi-Strauss. Toronto: University of Toronto Press, 1978.

Cassirer, Ernst. Language and Myth. Translated by Susanne K. Langer. New York: Dover Publications, 1946.

Badges: Revisiting the Peer-Review Process

On 17 April 2012, the Faculty Advisory Council at Harvard University issued a memorandum expressing concern over the rising costs of library subscriptions to scholarly journals, and strongly encouraging faculty and graduate students to submit their work to open access journals as a way of transferring prestige away from print and other for-profit journals with high subscription costs. Indeed, concern over rising subscription costs, public access to information, and often-lengthy turnaround times for submitted materials has led to the emergence of numerous open access journals in recent years, with numbers increasing almost exponentially.

As of 5 February 2013, the Directory of Open Access Journals (DOAJ) lists 8,621 journals, 104 of which have been added since 1 January 2013. In order to be included in the directory, a research journal must be free and open to the public without delay, and must “exercise quality control on submitted papers through an editor, editorial board and/or a peer-review system.” Since open access journals are also online journals (online content delivery significantly decreases the costs associated with traditional print publication), many journals, like Southern Spaces and Vectors, are also very interested in delivering scholarly content through a variety of alternative media. Because they are not tied to the economically motivated decision-making processes governing major print publishers, open access journals arguably have more freedom to publish and encourage alternative modes of scholarship, and, perhaps, to provide a legitimate space for otherwise marginal voices. Crucial to the legitimacy of open access journals, however, is a commitment to traditional processes of establishing authority, a ‘blind’ process whereby expert opinion is validated by expert opinion, anonymous though it may be.

So here’s the beginning of a half-baked (and somewhat utopian) idea…

[Image Creative Commons licensed / Flickr user fczuardi]

Recently, the idea of ‘badges’ has entered into discussions about assessment. Badges are a new approach to credentialing, a way of demonstrating competency, perhaps in the absence of formal training, through recognition by institutions, on the one hand, or peers, on the other. Most basically, having demonstrated some skill mastery or other characteristic (like ‘helpfulness,’ for example), an individual receives a badge (some kind of graphic) that they can display on their website and/or social network profiles, and that confers legitimacy on claims about skill-sets and aptitudes by linking each back to the badge issuer. Although still in its infancy, the idea of using badges as a universal peer assessment framework has been most fully developed by the Mozilla Open Badges Project through the creation of the open source Open Badges Infrastructure. The strongest proponent of the use of badges in schools as a legitimate alternative to grades is Cathy N. Davidson, co-founder of HASTAC (Humanities, Arts, Science, and Technology Advanced Collaboratory), an international organization dedicated to rethinking the future of learning for the information age.

My half-baked idea goes something like this: Why not eliminate journals entirely? Why not revisit every part of the traditional approach to academic publishing, including the peer-review process? In place of academic journals, why not encourage authors to host content themselves and self-publish their work in a way that decentralizes knowledge production?

According to the Budapest Open Access Initiative, “There are many degrees and kinds of wider and easier access to this literature. By ‘open access’ to this literature, we mean its free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself. The only constraint on reproduction and distribution, and the only role for copyright in this domain, should be to give authors control over the integrity of their work and the right to be properly acknowledged and cited.” A decentralized approach to academic publication that encourages scholars to self-publish on personal websites fulfills every condition of Open Access. The greatest drawback to this kind of approach (and a longstanding reason for distrusting self-publication in general) is that it bypasses peer review, an 18th-century system of legitimizing knowledge production. In other words, open access to scholarship doesn’t really mean much if there is no way to judge the quality and credibility of the work. This is where badges come in.

I produce a piece of scholarship and put it online. If it is found to be of merit by others in my community, or by others who discover my work organically, then they can give me a badge, a seal of approval that vouches for the quality of my work. The meaning of a badge, then, becomes a transparent reflection of the quality of the work by way of the credibility of the issuer. On the one hand, the ‘value’ of the badge is determined by the amount of recognized expertise the issuer brings to the evaluation of the work. The fact that a badge is not anonymous means that different badges carry different weights, and in a way that is often obscured by the peer-review process (not all peer reviewers have the same level of expertise, which is why there is almost no article so bad that one can’t find a journal to publish it). On the other hand, the issuer of a badge will be forever tied to the life of the work, in such a way as to put their credibility on the line alongside the original author’s. Consequently, issuers are discouraged from ‘badging’ willy-nilly, as if they were ‘liking’ posts on Facebook. Under this scheme, assessments of the merit of a work would involve a consideration of the author and the work itself, but also the expertise of badge issuers. This additional information would also be very helpful hermeneutically, as a way of locating a work with respect to the communities it touches and the range of meanings it might have.
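
To make the weighting idea concrete, here is a minimal sketch in Python. Everything in it is an assumption of mine for the sake of illustration (the record fields, the 0-to-1 expertise scores, the ‘noisy-or’ aggregation rule); none of it belongs to the actual Open Badges Infrastructure:

```python
from dataclasses import dataclass

@dataclass
class Badge:
    """A public, non-anonymous endorsement of a self-published work."""
    work_id: str      # the work being vouched for
    issuer: str       # the endorser, permanently tied to the work
    expertise: float  # issuer's recognized expertise, 0.0 to 1.0

def credibility(badges: list[Badge], work_id: str) -> float:
    """Aggregate a work's credibility from its issuers' expertise.

    A 'noisy-or' rule: many low-expertise badges never outweigh
    one badge from a recognized expert.
    """
    misses = 1.0
    for b in badges:
        if b.work_id == work_id:
            misses *= 1.0 - b.expertise
    return 1.0 - misses

badges = [
    Badge("essay-001", "recognized_expert", 0.9),
    Badge("essay-001", "enthusiastic_peer", 0.4),
]
print(credibility(badges, "essay-001"))  # ≈ 0.94
```

The aggregation rule is chosen to mirror the intuition above: a work’s standing rises with the standing of whoever stakes their name on it, and a pile of casual ‘likes’ never substitutes for one expert endorsement.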

Okay, so maybe this idea isn’t even really half-baked, but it is an idea. Realistically, even if this or something like it were good in theory, in practice replacing the old peer-review model would require that nearly everyone jump on board at once, lest the badging mavericks find themselves cut off and set adrift from the larger, legitimate, and legitimizing community of scholars. Implementing this kind of framework would most certainly be fraught with logistical and technical challenges as well. The point of this thought experiment, however, is not so much about proposing an alternative reality, but rather bringing to light the fact that, in challenging the hegemonic authority of major publishers as the only credible mechanism for delivering legitimate scholarship, open access approaches to publication do not at the same time challenge the notion of centralized authority per se. The world wide web offers us more than just an opportunity to share knowledge openly, but also the opportunity to challenge more basic assumptions about how knowledge is (and should be) produced. Upon re-evaluation, it may in fact be that journals continue to be the best way of delivering knowledge and ratifying expert opinion. The opportunity for decentralization afforded by the internet, however, allows us to call our standard ways of doing things into question, and in so doing, to transform passive assumptions about the way things are into active decisions about the ways things should be.


Why I Take Attendance

An interesting conversation has been taking place on one of the listservs to which I subscribe. What began as an innocent query about available apps for tracking attendance has quickly transformed into a discussion about why attendance should be taken in the first place. There have been questions about the extent to which ‘seat time’ is really an effective way of measuring participation, and discussions of other more administrative reasons why tracking attendance might be important (i.e. institutional policies about seat monitoring, student loan conditions, etc.). What is lacking in these discussions, however, and the reason why I take attendance in my classes, is a more humanistic perspective.

[Image: Attendance2 and Attendance by David M. Reed]

It may seem paradoxical, since more often than not attendance functions as a quantitative measure of participation (not a good measure of participation, mind you, but at least it can be determined with a reasonable amount of precision), but the reason that I take attendance has to do with establishing relationships with my students. Especially during the first few classes (okay, let’s be honest…several…or more), it is helpful to perform roll call as a way of learning student names, particularly in large classes when there are a lot of names to remember. Sure, this memory game can be won more quickly through other means (i.e. flash cards with photos, whether taken by the instructor or on file with the registrar), but there are several other benefits to roll call, benefits that cannot be achieved strictly through study.

For the last couple of years, I have been using an iPhone/iPad app called simply Attendance, by David M. Reed. What I like about Attendance is its ease of use, intuitive interface, ability to import from *.csv, Dropbox sync, and photo integration (yes, flashcards still have their place). As of the writing of this post, however, I have become aware of a new version of Attendance, Attendance2, which was favorably reviewed by Brian Croxall in a post for the Chronicle of Higher Education here: http://chronicle.com/blogs/profhacker/attendance2-an-update-for-the-attendance-app-for-ios-devices/41850. In spite of the fact that I am relying on what is now an old version of Reed’s software, the benefit of using an app for attendance has been proven time and time again. If we accept what I have said about the more relational aspects of attendance-taking, it should be fairly obvious that the actual recording technology has little to no impact on the achievement of those interpersonal outcomes. Where it does have a demonstrable impact, however, is in the reduction of the number of paper scraps floating around (not so much an environmental issue as an issue of tidiness), and in ease of record keeping. Furthermore, apps like this enable the instructor to go beyond merely checking ‘present’ or ‘absent,’ making it easy to record the circumstances of an absence and to jot quick notes about a student’s participation.
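
For what it’s worth, the record keeping involved is simple enough to sketch. Here is a toy Python version of my own (the roster format, file names, and status codes are illustrative assumptions, not how Reed’s app works) that captures the same data: date, student, status, and a quick note:

```python
import csv
from datetime import date

def load_roster(path: str) -> list[dict]:
    """Read a hypothetical roster CSV with at least a 'name' column."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def roll_call(roster: list[dict], log_path: str) -> None:
    """Prompt for each student's status and an optional participation note."""
    today = date.today().isoformat()
    with open(log_path, "a", newline="") as f:
        writer = csv.writer(f)
        for student in roster:
            status = input(f"{student['name']} [p]resent/[a]bsent/[e]xcused: ")
            note = input("  note (blank for none): ")
            writer.writerow([today, student["name"], status, note])

# roll_call(load_roster("roster.csv"), "attendance_log.csv")
```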

In calling out each student’s name (the most beautiful word in the world, let’s recall, is one’s own name), the instructor is able to build at least a little bit of rapport. In calling out each student’s name, the instructor may extend a kind of personal welcome: “John Doe?…oh, hi John. I’m glad you are here.” In this sense, attendance is not simply a requirement of success in the course, but also an extension of the spirit of hospitality. What of those students who are not present? Well, I do my best to extend my hospitality to them as well. “Jane Smith?…No?…I’m sorry she is not here. She made an insightful comment on the discussion board that I want to talk about today.” For a student to know that they are present in the teacher’s eyes, even if absent, is likely to increase not just the chances of their physical presence, but the quality of their participation as well.

The practice of taking attendance can also function as an effective way of gauging the ‘temperature’ of the room. Asking whether a student is present is a call to which every student can respond, and respond with certainty. If the teacher pays close attention to the response of each student, and has attended to the quality of each response in the past, it becomes possible to establish a baseline (albeit an anecdotal one) according to which the instructor can assess the student’s preparedness and excitement about the course material for that day, but also their mood. This is the kind of thing that more sophisticated analytics attempt to achieve by mapping performance onto dispositional and performance indicators, but that, at the end of the day, is actually exceedingly difficult to quantify. If the instructor is attuned to the moods of individual students, and to the mood of the class as a whole, then they will be better equipped to deliver their course material in a way that is optimal for that particular time and place.

All this has been to say that attendance does not need to be strictly a quantitative component of a course grade, but it may also be a way of reaching out to each student individually at the start of class, indicating that they are individually valuable to that particular classroom environment, and increasing student engagement as a result.

The Costs of Privacy

In November 2012, in response to threats of expulsion from John Jay Science & Engineering Academy on account of her refusal to wear a mandatory RFID badge, Andrea Hernandez filed a lawsuit against San Antonio’s Northside Independent School District. If she continues to refuse even to wear an RFID-disabled badge – an accommodation sanctioned by a federal district judge who ruled against her – Hernandez will be placed in Taft High School beginning in September 2013, the public school to which she would normally be assigned.

In refusing to wear even an RFID-disabled badge, Hernandez has arguably cost her case its ‘bite’ (it’s difficult to justify her appeal to religious freedom once tracking mechanisms are disabled). In spite of the fact that her concerns were ultimately voiced in terms of an interest in preserving religious freedom, however, the case nonetheless draws attention to the potential costs of privacy.

As elite institutions increasingly adopt comprehensive analytics programs that require students to give up their privacy in exchange for student success, are they also strongly contributing to a culture in which privacy is no longer valued? A robust analytics program requires every student to opt in (i.e. students are not given the option of opting out). If analytics programs are seen as effective mechanisms to increase the chances of student success, if such programs are effective only to the extent that they gather data representative of the entire student body, and if, as a result, consenting to being tracked is made a condition of enrollment at the most elite universities (universities with the resources necessary to build and sustain such programs), then students must ask what it is that they value more: an education at a world-class institution (and all of the job prospects and other opportunities that such an education affords), or the ability to proverbially click ‘do not track.’ My suspicion is that, if explicitly given the choice, the vast majority of students would be willing to give up the latter for the former, a symptom of our growing acceptance of, and complacency toward, issues of electronic privacy, but perhaps also an indication that a willingness to sacrifice privacy for success increasingly forms a key part of the ‘hidden curriculum.’

[Image: Erasing Privacy. Creative Commons licensed / Flickr user Alan Cleaver]

(Interestingly, in addition to gathering data from learning management and operational systems, universities also regularly collect data from student ID card swipes. This data can easily be mobilized as part of a kind of ‘card-swipe surveillance’ program, as in fact has been done by Matthew S. Pittinsky (co-founder of Blackboard) at Arizona State University. According to Pittinsky, tracking card-swipe behavior can allow an institution to effectively map a student’s friend group, determine their level of social integration, and predict their chances of attrition.)
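
The basic inference involved is not exotic. Here is a toy Python sketch of my own (the log format and sample data are assumptions for illustration, not Pittinsky’s actual method): count how often two students swipe into the same place at the same hour, and read recurring co-occurrence as evidence of a social tie:

```python
from collections import Counter
from itertools import combinations

# Hypothetical swipe log: (student_id, location, hour-of-day) records.
swipes = [
    ("s1", "dining-hall", 12), ("s2", "dining-hall", 12),
    ("s1", "gym", 18),         ("s3", "gym", 18),
    ("s1", "dining-hall", 19), ("s2", "dining-hall", 19),
]

def co_occurrences(swipes):
    """Count pairs of students who swipe into the same place at the same hour."""
    by_slot = {}
    for student, location, hour in swipes:
        by_slot.setdefault((location, hour), set()).add(student)
    ties = Counter()
    for students in by_slot.values():
        for pair in combinations(sorted(students), 2):
            ties[pair] += 1
    return ties

# Recurring pairs approximate a friend group; a student with few such
# ties would score low on 'social integration', the kind of signal an
# attrition model might flag.
print(co_occurrences(swipes))  # Counter({('s1', 's2'): 2, ('s1', 's3'): 1})
```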

Learning to be Human from the Center of the Internet

Attending strictly to the more phenomenological aspects of the internet, it is easy to fall into a kind of idealism. Zygmunt Bauman (2002), for example, has argued that the era of space has come to an end, that the extraterrestrial realm of cyberspace has broken away from the realm of places and, consequently, social life has become reconfigured in such a way as to privilege decentralization, mobility, and fluidity over centralized institutions, rigid borders, and stable relationships. Increasingly, it is argued, the material world is becoming irrelevant as we live more and more in a utopia, a ‘no place’ where identities are as liquid as the virtual planes they navigate.

In Tubes: A Journey to the Center of the Internet, Andrew Blum argues that our everyday experience of the internet (as a ‘cloud,’ for example) obscures the fact that the world wide web actually depends upon a physical infrastructure that is located in space, adapted to geography, and surprisingly vulnerable to human error, environmental conditions, and general decay. In fact, among the more shocking discoveries made by Blum is that, in contrast to the distributed network envisioned in the 1960s, the contemporary internet is actually made up of a relatively small number of major centers connected, at times, by what seems like only the thinnest of threads. Suffice it to say, attention to the physical infrastructure of the world wide web paints a very different picture from the infinite and eternal cloud that we experience as users each time we open a browser.

In uncovering the hidden materiality of the internet, Tubes helps to raise some interesting pedagogical questions. On the one hand, there is a strong contemporary tendency to praise advances in web-based technology for allowing us to offload knowledge functions and focus, instead, on cultivating the imagination. The goals of education are less and less about delivering content, and more and more about empowering students to seek out relevant information necessary to finding innovative solutions to emergent problems. The world wide web is powerful because, like never before, it allows us to create new worlds and explore a seemingly infinite range of potentialities. On the other hand, however, I wonder if obscuring (or simply forgetting) the physical and technological infrastructure that makes the world wide web possible doesn’t actually end up promoting a particular philosophical perspective, namely, idealism.

Idealism is a philosophical perspective according to which the greatest amount of reality is given to the immaterial. For Plato, sensible things are real only to the extent that they participate in the forms. For Berkeley, all of our sense experiences are caused by God. For Kant, our knowledge about empirical reality is mediated and made possible by the basic structures of consciousness. To the extent that we ignore the material infrastructure supporting the world wide web, and as we increasingly incorporate web 2.0 technologies into the classroom that aim to be as transparent as possible (facilitating productivity and creativity without also making it obvious that we are using tools – transforming tools into prosthetics), are we tacitly encouraging an idealistic view of the world? Does a failure to educate students about the solidity and vulnerability of the internet as an infrastructure contribute to an ethics that values minds over bodies? Technology is not value neutral. The world wide web is not merely a tool for learning and communicating; it also actively reinforces certain world views at the expense of others. By shining a light on the material side of the internet, Tubes effectively brings the body back, reminding us that even our spiritualized identities in cyberspace are dependent on space and place. Ironically, in considering the more technical components of the internet, its vulnerabilities and dependencies, its greasy and dirty underside, we are perhaps reminded of the same qualities in ourselves, and so reminded of what it means to be human, tubes and all.


References
Bauman, Zygmunt. Society Under Siege. Malden, MA: Polity Press, 2002.

Blum, Andrew. Tubes: A Journey to the Center of the Internet. New York: HarperCollins, 2012.